    Main » Discussion » I have yet to have never seen it all.
    Posted on 20-01-13, 01:20

    Post: #42 of 105
    Since: 11-13-19

    Last post: 1223 days
    Last view: 1223 days
    Also, no, fuck you, 1 pixel = whatever goddamn unit I want it to be. I have a 2x scale monitor, and I have that paired with a 1x scale monitor. Both are roughly the same physical size, but one's 3840x2160 and the other is 1920x1080. I expect this to cause problems, but I still expect every app I launch to start zoomed to the configured 200% level; and it doesn't matter on the other monitor, since a single force-100% app is running there, gigantic title bar and all.

    I expect GUI libraries to gracefully support arbitrary resolution scaling, and to render vectors, lines, and fonts relative to that scale factor. I expect them to handle bitmaps semi-gracefully. I know Windows now has a thing where it supports HQ2x or similar upscaling of bitmaps when rendering GDI apps on higher-DPI displays.
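
    The core idea isn't even hard. Here's a toy sketch of "render relative to the scale factor" (plain Python, with a made-up Monitor type; this is not any real toolkit's API):

    [code]
    # Toy per-monitor scaling: apps draw in logical units, the toolkit
    # converts to device pixels at the edge. Monitor is a made-up type.
    from dataclasses import dataclass

    @dataclass
    class Monitor:
        width_px: int    # physical pixels
        height_px: int
        scale: float     # user-configured factor, e.g. 2.0 on the 4K panel

        def to_device(self, x: float, y: float) -> tuple[int, int]:
            # The same logical coordinates cover the same physical area
            # no matter which panel the window lands on.
            return round(x * self.scale), round(y * self.scale)

    uhd = Monitor(3840, 2160, scale=2.0)
    fhd = Monitor(1920, 1080, scale=1.0)
    print(uhd.to_device(100, 50))  # (200, 100)
    print(fhd.to_device(100, 50))  # (100, 50)
    [/code]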

    I don't care if this is painful, you can't expect every single user to have identical hardware.
    Posted on 20-01-13, 02:30
    Stirrer of Shit
    Post: #704 of 717
    Since: 01-26-19

    Last post: 1525 days
    Last view: 1523 days
    Posted by tomman
    :facepalm:

    You're neither an end user nor a coder that actually cares about doing things the right way in the real world.

    I'm an end user concerned with aesthetic sensibilities and a programmer wishing things would be done the right way in the real world. Alas, they are not, which is why we have 4-digit dates and React "Native" instead.

    Reminds me of the whole currency reconversion fiasco over here, where there are oh-so-many ways to lop off all those extra zeroes, each one with its unique set of pitfalls (mostly rounding errors). It hurt me badly because the VEF->VES transition was done MID-MONTH, instead of on the following January 1st (as was done with VEB->VEF).

    If you are in a hurry then perhaps it is not ideal to wait until the new year.

    Do you store all your amounts in VEF and convert them on the fly, by checking transaction dates and moving the decimal point accordingly? (BigDecimal is a godsend here!) Or do you store each amount in whatever currency it was done in, then enjoy hell when you have to perform operations with mixed currencies? (Because you now have to check transaction dates at EVERY CALCULATION STEP!) Oh, and when you get to SQL stored procs, things certainly become quite painful. I survived it, but I had to deal with the beancounters' fallout until a few months ago...

    They're two separate currencies, you can't mix and match data like that.
    On the date of the switchover, you debit your VEF accounts with their balance in VEF and credit your VES accounts with their balance in VES.

    It's an icky problem because it is an icky problem. It's like handling Germany adopting the Euro by dividing all balances by 1.95583 and calling it a day. An actual transaction took place, even if it was imaginary, and you can't just handwave it away.
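
    A sketch of what I mean, assuming the official 100 000:1 VEF->VES redenomination rate, with Python's decimal standing in for BigDecimal:

    [code]
    # The switchover booked as one explicit conversion on the cutover date;
    # afterwards everything lives in VES and no per-row date checks remain.
    # Rate assumed: the official 100 000 VEF = 1 VES.
    from decimal import Decimal, ROUND_HALF_UP

    RATE = Decimal("100000")  # VEF per VES

    def convert_balance(vef: Decimal) -> Decimal:
        return (vef / RATE).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

    old_balance = Decimal("123456789.00")  # VEF
    print(convert_balance(old_balance))    # 1234.57 VES
    [/code]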

    Posted by CaptainJistuce
    The problem is obvious. You're saying what they said in 2000. And, well, what they said when the original programmers wrote this stuff in the 1960s.

    And was there a problem in 2000? No, because they fixed it. And they did so without making everything ugly.

    It's disgusting. Instead of YYMMDD, a nice compact number, you get yyYY-MM-DD, where the first 2 digits are just pointless decoration.



    7-bit isn't even enough to encode a full set of punctuation alongside a full English alphabet and Roman numeral set.

    It's missing one quote sign, and nobody uses the rich single quotes anyway. What's the problem here?
    Roman numerals are encoded thusly: I, II, III, IV, V, VI, &c. Even in a Unicode world, they are.

    2-digit years were a memory-saving hack that was relevant when we measured memory available in bytes, and has no place in the modern world.

    They have a place in my mind: 20200113 is hard to parse, which is why you need dashes, 2020-01-13, and suddenly you've almost doubled the size compared to a nice, clean 200113.

    Here's how serious and respectable countries with well-established traditions of record-keeping handle dates:
    Iceland: The number is composed of 10 digits, of which the first six are the individual's birth date or corporation's founding date in the format DDMMYY. The next two digits are chosen at random when the identification number is allocated, the ninth digit is a check digit, and the last digit indicates the century in which the individual was born (for instance, '9' for the period 1900–1999, or '0' for the period 2000–2099). An example would be 120174-3399, the person being born on the twelfth day of January 1974.
    Norway: Historically, the number has been composed of the date of birth (DDMMYY), a three digit individual number, and two check digits. The individual number and the check digits are collectively known as the Personal Number. In 2017, the Norwegian Ministry of Finance approved changes to the numbering system. After the changes, the number will no longer indicate gender, and the first check digit will be 'released' to become part of the individual number.
    Sweden: The number uses ten digits, YYMMDD-NNGC. The first six give the birth date in [b]YYMMDD[/b] format. Digits seven to nine (NNG) are used to make the number unique, where digit nine (G) is odd for men and even for women. For numbers issued before 1990, the seventh and eighth digits identified the county of birth, or marked foreign-born people, but privacy-related criticism caused this system to be abandoned for new numbers. The tenth digit (C) is created using the Luhn, or "mod 10", checksum algorithm. The year a person turns 100, the hyphen is replaced with a plus sign. A common misunderstanding is that this happens on the person's hundredth birthday, but by definition it happens on the first of January of that year.
    Finland: It consists of eleven characters of the form DDMMYYCZZZQ, where DDMMYY is the day, month and year of birth, C the century sign, ZZZ the individual number and Q the control character (checksum). The sign for the century is either + (1800–1899), - (1900–1999), or A (2000–2099).
    Denmark: The CPR number is a ten-digit number with the format DDMMYY-SSSS, where DDMMYY is the date of birth and SSSS is a sequence number. The first digit of the sequence number encodes the century of birth (so that centenarians are distinguished from infants, 0-4 in odd centuries, 5-9 in even centuries), and the last digit of the sequence number is odd for males and even for females.

    As you can see, a lot of different solutions can be used, none of which involve appending extra junk. (The Swedish check digit, incidentally, is plain Luhn; sketch below.)
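
    An untested sketch of that Luhn computation, using the worked example number from the Wikipedia article on Swedish personal identity numbers:

    [code]
    # Luhn ("mod 10") check digit over the nine digits YYMMDDNNG.
    def luhn_check_digit(digits: str) -> int:
        total = 0
        for i, ch in enumerate(digits):
            d = int(ch) * (2 if i % 2 == 0 else 1)  # weights 2,1,2,1,... from the left
            total += d - 9 if d > 9 else d          # same as summing the digits of d
        return (10 - total % 10) % 10

    print(luhn_check_digit("811218987"))  # 6, giving 811218-9876
    [/code]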

    The thinking man, I think, would in legacy systems use the excess space in the day field to encode the century: day values 01-31 mean 1950-2049, 33-63 mean 2050-2149, 65-95 mean 2150-2249. After that, the months field can be used.
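
    An untested sketch of that scheme (the two-digit year stays put; the century offset rides in the day field):

    [code]
    # Day field keeps its real value 1-31, plus 32 per century past the
    # 1950-2049 window. Decode recovers both without touching the year field.
    def encode(year: int, month: int, day: int) -> tuple[int, int, int]:
        assert 1950 <= year <= 2249
        offset = (year - 1950) // 100           # 0, 1, or 2
        return year % 100, month, day + 32 * offset

    def decode(yy: int, month: int, stored_day: int) -> tuple[int, int, int]:
        offset, day = divmod(stored_day, 32)    # real day survives intact
        base = 1900 + yy if yy >= 50 else 2000 + yy   # window 1950-2049
        return base + 100 * offset, month, day

    print(encode(2077, 1, 13))  # (77, 1, 45)
    print(decode(77, 1, 45))    # (2077, 1, 13)
    [/code]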

    Nope. These databases still contain data reaching back to 1960, sometimes for regulatory purposes.

    That does seem to be a problem, but presenting stuff with 4-digit years is still unacceptable. A hidden single-bit field "centuryLSB" set to 1 for the third millennium seems like a more sensible solution.

    Isn't anything serious stored as Unix time anyway? Handling raw dates is just unpleasant, and at the very least they ought to be using Excel-style day numbers (Unix time / 86400). Otherwise you might as well be storing your dates as "The thirteenth of January in the year of our Lord twenty hundred and twenty". Complete clown style.

    On the topic of Unix time, leap seconds are disgusting. They should have gone with GPS-style time and made the timestamp a plain count of fixed-length seconds since the epoch. It's not like they subtract 3600 each time DST hits, right?
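
    And day numbers really are pleasant to work with; a quick sketch, stdlib only:

    [code]
    # Excel-style day numbers: dates as a flat count of days since the epoch.
    import datetime as dt

    EPOCH = dt.date(1970, 1, 1)

    def to_daynum(d: dt.date) -> int:
        return (d - EPOCH).days

    def from_daynum(n: int) -> dt.date:
        return EPOCH + dt.timedelta(days=n)

    print(to_daynum(dt.date(2020, 1, 13)))  # 18274
    print(from_daynum(18274))               # 2020-01-13
    [/code]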

    The past will never go away,

    I implore you to read my signature.
    But sure. After all, we need to compute taxes on it.
    and every computer program going forward needs to be able to handle dates from any point in computing history AT A MINIMUM.

    No. The applications here are games, they certainly don't. Mission-critical software shouldn't be relying on string representations of dates.

    Also, two-digit years will break again in 2100, which is a single lifetime away. Three-digit years is a minimum requirement for any new software development.

    Three-digit years is the worst of both worlds. Then I would rather have four, which at least encodes a legitimate concept. This is an absurd conversation anyway. What about the year 10K problem? What then, huh? Meanwhile, 253402300800 is a perfectly legitimate 64-bit integer.
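
    Quick sanity check on that number:

    [code]
    # 253402300800 is the first second of year 10000 in Unix time,
    # comfortably inside a signed 64-bit integer.
    import datetime as dt

    last = dt.datetime(9999, 12, 31, 23, 59, 59, tzinfo=dt.timezone.utc)
    print(int(last.timestamp()) + 1)  # 253402300800
    print(253402300800 < 2**63 - 1)   # True, with room to spare
    [/code]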

    Posted by kode54
    I don't care if this is painful, you can't expect every single user to have identical hardware.

    You can, you should, and it's regrettable they don't.
    Terry Davis had the right idea, but he was too extreme. I think 24-bit 1920x1080 / Windows 7 / Stereo / 100 Mbit would be a good default. Enthusiasts can use IPS panels or homemade scaling solutions.

    There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
    Posted on 20-01-13, 03:24

    Post: #43 of 105
    Since: 11-13-19

    Last post: 1223 days
    Last view: 1223 days
    You know what? Fuck it, there's no point even conversing with you. I'll just apply a little magic, and not worry about anything you have to say ever again.
    Posted on 20-01-13, 04:51 (revision 1)
    Custom title here

    Post: #811 of 1150
    Since: 10-30-18

    Last post: 6 days
    Last view: 1 day
    Posted by funkyass
    Windowing is a good solution, because it means you don't need to touch the data in the archives, but it would need the fix re-assessed yearly - which no one ever did. Fixing it now means either moving the window, or maintaining three systems: one for pre-Y2K data, one for the windowed dates, and one that uses a 64-bit integer timestamp and handles time zones and leap seconds.

    the WWE2K20 game is a code base from the PS1 era, if anyone was wondering why it happened.

    It is a good SHORT-TERM fix that keeps things working while you create a real fix. But the real fixes were never created, and the hack just became an ongoing upkeep cost as the "fix" is reworked over and over and over.
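
    For anyone who hasn't seen one, the usual shape of a windowing "fix", sketched in Python (pivot value made up):

    [code]
    # Fixed-pivot windowing: two-digit years below the pivot are 20xx,
    # the rest 19xx. Cheap, but it silently rots as the window drifts.
    PIVOT = 30

    def window(yy: int) -> int:
        return 2000 + yy if yy < PIVOT else 1900 + yy

    print(window(99))  # 1999 - fine in 2000
    print(window(20))  # 2020 - still fine
    print(window(31))  # 1931 - wrong for anything dated 2031
    [/code]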

    Also, the WWE2K20 date code isn't from the PS1 era, because the PS1 didn't have a clock or network connectivity. Regardless, it contained date code that HAD been hacked to keep working through Y2K, but not truly FIXED.




    Posted by sureanem

    Posted by CaptainJistuce
    The problem is obvious. You're saying what they said in 2000. And, well, what they said when the original programmers wrote this stuff in the 1960s.

    And was there a problem in 2000? No, because they fixed it. And they did so without making everything ugly.

    It's disgusting. Instead of YYMMDD, a nice compact number, you get yyYY-MM-DD, where the first 2 digits are just pointless decoration.

    They fixed it in a fragile, non-extensible way that was guaranteed to break again, at the cost of millions of dollars and countless man-hours. Because no one ever thought "Hey, maybe computers will still exist in thirty years and we should write more robust code".

    Also, I know of one major corporation still operating systems flagged as non-Y2K-compliant after Y2K. They had special update and reboot procedures to keep them working in spite of everyone, because management refused to authorize expenditure for replacements.

    2-digit years were a memory-saving hack that was relevant when we measured memory available in bytes, and has no place in the modern world.

    They have a place in my mind: 20200113 is hard to parse, which is why you need dashes, 2020-01-13, and suddenly you've almost doubled the size compared to a nice, clean 200113.

    Two-digit years are fine for a presentation format, but not a data format. Hyphens shouldn't be stored in the date; they're part of the presentation, just as the choice of slashes or hyphens is presentation, or the year appearing at the beginning or end of the date, or even being suppressed entirely.
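
    In other words (sketch; the stored value is the date, everything else is formatting):

    [code]
    # One stored fact, many presentations.
    import datetime as dt

    d = dt.date(2020, 1, 13)       # what the database holds
    print(d.strftime("%y%m%d"))    # 200113     - compact two-digit display
    print(d.strftime("%Y-%m-%d"))  # 2020-01-13 - ISO display
    print(d.strftime("%m/%d/%Y"))  # 01/13/2020 - US display
    [/code]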

    Here's how serious and respectable countries with well-established traditions of record-keeping handle dates:
    Iceland: The number is composed of 10 digits, of which the first six are the individual's birth date or corporation's founding date in the format DDMMYY. The next two digits are chosen at random when the identification number is allocated, the ninth digit is a check digit, and the last digit indicates the century in which the individual was born (for instance, '9' for the period 1900–1999, or '0' for the period 2000–2099). An example would be 120174-3399, the person being born on the twelfth day of January 1974.

    So you're saying that they solved the century rollover by extending the date in a backwards-compatible manner? That's fantastic!


    As you can see, a lot of different solutions can be used, none of which involve appending on extra junk.

    As you can see, Iceland created a robust solution that will serve them for centuries to come without risk of overlap or confusion.
    Also, birth dates on state-issued IDs are a limited case of date handling, as it is almost always obvious from other parts of the ID what century the birth date is.

    Incidentally, my state issues an 8-digit ID number that does not encode any of that data. Birth date is presented as a separate field, in the form MM/DD/YYYY. And it did so in the late twentieth century as well (I've seen my parents' old driver's licenses, and they had 4-digit years).


    The thinking man, I think, would in legacy systems use the excess space in the day field to encode the century: day values 01-31 mean 1950-2049, 33-63 mean 2050-2149, 65-95 mean 2150-2249. After that, the months field can be used.

    That is a super-ugly hack. It is so revolting that I want to inflict bodily harm on you for suggesting it.
    It is also incompatible with databases storing the date fields in BCD, or using ugly five-bit fields for binary storage of M and D. Or those that simply store D as 0-365 and leave the month to the presentation layer.

    Nope. These databases still contain data reaching back to 1960, sometimes for regulatory purposes.

    That does seem to be a problem, but presenting stuff with 4-digit years is still unacceptable. A hidden single-bit field "centuryLSB" set to 1 for the third millennium seems like a more sensible solution.

    That's a good fix, and it kicks the can a century down the road instead of twenty years. Presentation and storage format do not have to match, and never have.

    But since most every piece of hardware made in the last fifty years is byte-oriented, you may as well allocate a whole byte to the century field as an unsigned integer. If centuries are back-allocated for "legacy dates" then that'll get us to AD 25599 with no issues in the date format. And quite bluntly, if that code is in use long enough for there to be a Y256K bug, then whatever society descended from us deserves it.
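
    A sketch of that layout (Python standing in for whatever the records system actually speaks):

    [code]
    # One unsigned byte for the century reaches AD 25599 in four bytes total.
    import struct

    def pack_date(year: int, month: int, day: int) -> bytes:
        century, yy = divmod(year, 100)
        assert century <= 255
        return struct.pack("4B", century, yy, month, day)

    def unpack_date(blob: bytes) -> tuple[int, int, int]:
        century, yy, month, day = struct.unpack("4B", blob)
        return century * 100 + yy, month, day

    print(unpack_date(pack_date(25599, 12, 31)))  # (25599, 12, 31)
    [/code]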

    On the topic of Unix time, leap seconds are disgusting. They should have gone with GPS-style time and made the timestamp a plain count of fixed-length seconds since the epoch. It's not like they subtract 3600 each time DST hits, right?

    On this I agree with you. Unix time should be a continuous count forward, and adjusting for leap seconds should be an issue for presentation formatting (which makes even more sense when you consider that the Unix time format already abandons the concept of years, months, days, hours, and minutes).

    and every computer program going forward needs to be able to handle dates from any point in computing history AT A MINIMUM.

    No. The applications here are games, they certainly don't. Mission-critical software shouldn't be relying on string representations of dates.

    No, the applications here contained a game. While we can argue the transience of entertainment software, the article said four out of five of ALL Y2K fixes were windowing, which is a cheap hack that had to be reworked to make it past the end of the window.

    Dates definitely shouldn't be stored as strings. String dates are a sign of inept database management. If we're storing Gregorian dates (or Julian - those are still in use some places!), I'd expect three fields in BCD or unsigned integer: a field for year, another for day, and one more for month. In any order you please, because that is a problem for the presentation layer.

    Not that an 8-bit unsigned integer for the year meant software was Y2K-compliant, as plenty of software dutifully advanced the year to "19100" due to built-in assumptions.
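
    The classic failure mode, sketched in Python (C's struct tm counts years since 1900):

    [code]
    # The "19100" bug: gluing a literal "19" onto a years-since-1900 count.
    tm_year = 100                # what struct tm held on 2000-01-01
    print("19" + str(tm_year))   # 19100 - the infamous broken output
    print(1900 + tm_year)        # 2000  - what it should have computed
    [/code]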


    Also, two-digit years will break again in 2100, which is a single lifetime away. Three-digit years is a minimum requirement for any new software development.

    Three-digit years is the worst of both worlds. Then I would rather have four, which at least encodes a legitimate concept. This is an absurd conversation anyway. What about the year 10K problem? What then, huh? Meanwhile, 253402300800 is a perfectly legitimate 64-bit integer.
    I'd rather store a four-digit year too.
    But a three-digit year is used by Iceland's ID system, and they clearly know what they are doing.


    Posted by kode54
    I don't care if this is painful, you can't expect every single user to have identical hardware.

    You can, you should, and it's regrettable they don't.
    Terry Davis had the right idea, but he was too extreme. I think 24-bit 1920x1080 / Windows 7 / Stereo / 100 Mbit would be a good default. Enthusiasts can use IPS panels or homemade scaling solutions.

    We clearly should've all stuck with Commodore 64s as the one true computer.
    ZX Spectrums for Europe, it's what they deserve for allowing the British to popularize the ZX Spectrum.

    No, mandatory fixed hardware standards are a terrible idea. I'm still mad at 1920x1080 becoming one. I'm glad that displays have started to advance meaningfully beyond what I was running in 1999. (1280p@70Hz > 1080p@60Hz, since we only do one-number resolutions these days.)

    --- In UTF-16, where available. ---
    Posted on 20-01-13, 05:28

    Post: #44 of 105
    Since: 11-13-19

    Last post: 1223 days
    Last view: 1223 days
    Also, since Jistuce made me read it, I thought I'd respond to it.

    Fuck you, Windows 7 dies in less than two days. Good riddance, it was already replaced years ago. There are many choices for alternatives that continue to receive updates. Not the least of which is Windows 10.

    If you still insist on using Windows 7, enjoy your final Update Tuesday, then indefinite silence on the updates front. Maybe Anonymous will come along and help you continue backporting patches from Windows 10 to make it a safe choice, if you trust Anonymous more than you trust, say, Microsoft with Windows 10, or your favorite Linux distribution, or your favorite BSD, since I heard all the chuds migrated to that after the EEEVIL Code of Conduct appeared.
    Posted on 20-01-13, 05:57
    Custom title here

    Post: #812 of 1150
    Since: 10-30-18

    Last post: 6 days
    Last view: 1 day
    Posted by kode54
    Also, since Jistuce made me read it...

    PH33R M4 1337 M1ND C0NTR0L P0\/\/44!


    Fuck you, Windows 7 dies in less than two days. Good riddance, it was already replaced years ago. There are many choices for alternatives that continue to receive updates. Not the least of which is Windows 10.

    And if you don't want patches, Windows 2000 is better than Windows 7... or really, any other Windows. It is objectively the best Windows.

    --- In UTF-16, where available. ---
    Posted on 20-01-13, 06:52

    Post: #232 of 449
    Since: 10-29-18

    Last post: 9 days
    Last view: 13 hours
    Posted by sureanem
    Isn't anything serious stored with Unix time anyway?

    Yeah, about that...

    Posted by CaptainJistuce
    And if you don't want patches, Windows 2000 is better than Windows 7... or really, any other Windows. It is objectively the best Windows.

    But what about my DOS games?! Also, Windows 95 had that sweet extra 3D shading.

    My current setup: Super Famicom ("2/1/3" SNS-CPU-1CHIP-02) → SCART → OSSC → StarTech USB3HDCAP → AmaRecTV 3.10
    Posted on 20-01-13, 06:54 (revision 1)
    Custom title here

    Post: #813 of 1150
    Since: 10-30-18

    Last post: 6 days
    Last view: 1 day
    Posted by creaothceann
    Posted by sureanem
    Isn't anything serious stored with Unix time anyway?

    Yeah, about that...

    Posted by CaptainJistuce
    And if you don't want patches, Windows 2000 is better than Windows 7... or really, any other Windows. It is objectively the best Windows.

    But what about my DOS games?! Also, Windows 95 had that sweet extra 3D shading.

    Didn't 2000 have the same 3D shading and title bar gradients? I admit it has been a while.

    For DOS games you use DOS. Boot floppies for the win. :P

    --- In UTF-16, where available. ---
    Posted on 20-01-13, 09:51

    Post: #233 of 449
    Since: 10-29-18

    Last post: 9 days
    Last view: 13 hours
    Apparently Windows 98 also used that look, but in either case you needed a graphics mode with more than 16 colors.

    Posted by CaptainJistuce
    Didn't 2000 have the same 3D shading and title bar gradients? I admit it has been a while.

    It has the gradient but not the shading:

    98
    2k

    My current setup: Super Famicom ("2/1/3" SNS-CPU-1CHIP-02) → SCART → OSSC → StarTech USB3HDCAP → AmaRecTV 3.10
    Posted on 20-01-13, 10:20
    Custom title here

    Post: #814 of 1150
    Since: 10-30-18

    Last post: 6 days
    Last view: 1 day
    Ahhhhh. I see it. Subtle difference, but it is there.

    --- In UTF-16, where available. ---
    Posted on 20-01-13, 12:03
    You read my title. That's enough social interaction for one day.

    Post: #457 of 598
    Since: 10-29-18

    Last post: 86 days
    Last view: 9 hours
    So. Stirfry hasn't learned a thing still, Y2K fixes are hacky, and the 9x-style Windows UI is a subtle fucker. What else've we got?
    Posted on 20-01-13, 12:06
    Stirrer of Shit
    Post: #705 of 717
    Since: 01-26-19

    Last post: 1525 days
    Last view: 1523 days
    Posted by CaptainJistuce
    They fixed it in a fragile, non-extensible way that was guaranteed to break again, and the cost of millions of dollars and countless manhours. Because no one ever thought "Hey, maybe computers will still exist in thirty years and we should make more robust code".

    Millions of dollars? I doubt it. Y2K wasn't a big problem.

    Two-digit years is fine for a presentation format, but not a data format. Hyphens shouldn't be stored in the date, they're a part of the presentation, just as a choice of slashes or hyphens is presentation, or year presented at the beginning or end of the date, or even suppressed entirely.

    Yeah, but you rarely have one without the other. When people argue for "four-digit dates", they want them everywhere. In theory, you're right that they could be separated. In practice? See: Unix timestamps, UTC, GPS coordinates, &c.

    So you're saying that [Iceland] solved the century rollover by extending the date in a backwards-compatible manner? That's fantastic!

    As you can see, Iceland created a robust solution that will serve them for centuries to come without risk of overlap or confusion.

    Iceland does stupid things which don't scale because they're a tiny island with large sums of money and social trust. It is heavily overrated and not a serious country. It is like arguing Dubai has a good economy. It's not strictly wrong, but that doesn't mean they have useful ideas to follow.

    I would rank the systems thusly:

    1. Denmark. Clean and simple system. However, the lack of a checksum is not ideal.
    2. Iceland. See above.
    3. Finland. 130120A123X is just not as clean as 130120-123X.
    4. Sweden. Mutable identifiers are not nice. They should have changed the checksum algorithm in 2000, and not removed useful information in the dashing '90s.
    5. Norway. Eleven digits isn't a good look, nor is removing rather useful information from the system which will certainly cause problems in the future.

    Also, birth dates on state-issued IDs are a limited case of date handling, as it is almost always obvious from other parts of the ID what century the birth date is.

    Not just ID cards - all government records, basically. There was a case not long ago where they accidentally sterilized someone after getting the age wrong by 100 years: they eyeballed it from the ID number without taking a closer look. So it definitely is a problem they should be looking at solving, I think. The trouble is that people born after the millennium changeover have been entering the system for a while now, so whatever changes you make will not be nice at all. The best approach would be to switch over to a common system, although it would require some quite serious engineering all around.

    Incidentally, my state issues an 8-digit ID number that does not encode any of that data. Birth date is presented as a separate field, in the form of MM/DD/YYYY. And did so in late twentieth century as well(I've seen my parents' old drivers licenses, and they had 4-digit years).

    ID numbers are different from all this, being actually very secret information and specific to the card.
    Or are you saying you have cards with DOB, ID#, and DL#?

    That is a super-ugly hack. It is so revolting that I want to inflict bodily harm on you for suggesting it.
    It is also incompatible with databases storing the date fields in BCD, or using ugly five-bit fields for binary storage of M and D. Or that simply store D as 0-356 and leave the month to the presentation layer.

    Many of the aforementioned countries apply such systems to immigrants. It mostly works fine.
    As for the objection, this is my point entirely - serious databases don't use BCD or five-bit fields. They use Unix time. Using BCD or whatever is a quasi-string date; it's reliant on the human presentation. Why should whether your dates are Julian or Gregorian be a problem for the database layer? Just use damn Unix time, which is good until 2038 unchanged, until 2106 if you reinterpret it as unsigned (the negative range isn't really used at the moment), and basically forever with 64 bits.
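
    Where those limits actually land (sketch):

    [code]
    # The 32-bit Unix time horizons, all UTC.
    import datetime as dt

    epoch = dt.datetime(1970, 1, 1, tzinfo=dt.timezone.utc)
    print(epoch + dt.timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07 - signed
    print(epoch + dt.timedelta(seconds=2**32 - 1))  # 2106-02-07 06:28:15 - unsigned
    [/code]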

    No, mandatory fixed hardware standards are a terrible idea. I'm still mad at 1920x1080 becoming one. I'm glad that displays have started to advance meaningfully beyond what I was running in 1999. (1280p@70Hz > 1080p@60Hz, since we only do one-number resolutions these days.)

    I'm sad 1080p is dying. All of this new stuff is just pointless. It's not a meaningful advance at all. Wow, phones now have X times more computing power. So what? It will just be used to compensate for the web developers who are getting stupider and stupider. In ten years' time, we will have smartphones which are say thirty times faster, and web developers will thus have gotten approximately thirty times stupider to compensate for it. The overall gain to society from this enterprise is limited, but the loss is tangible. In other words, it is why we can't have nice things.

    I do not understand it. You put some of the smartest PhDs to have ever lived to work optimizing web browsers' JS engines so you can hire the dumbest fucking idiots you can find to make websites. Why? Couldn't they just hire normal people to be web developers instead? People rant about PHP being dangerous and MySQL being slow, but it's 100x better than the mess we are in now.

    If they still had to develop for users in Asia on IE6, sure, they'd complain about standards and whatnot, but they would make useful websites. I use a laptop with 4 GB RAM for my daily use, and it's already hitting swap on some websites with JS enabled. It's sickening and repulsive. Having experience with jQuery, Node.js, NoSQL databases, or Rust ought to be disqualifying for serious web development work.

    Like, seriously. How much innovation in software have we had for the last decade? Absolutely squat of value. Tor and Bitcoin got slightly better, and Monero came out. That's it. Now what did the 2000's bring? What did the 1990's bring? What did the 1980's bring?

    Some observers point to this being due to the lack of a good financial crisis in the past decade, so I'm keeping my hopes up.

    It's clear it's not the hardware that is the problem, and that people don't deserve their new toys. With each passing day, you start to wonder whether Ted really didn't have a point.

    Posted by kode54
    Also, since Jistuce made me read it, I thought I'd respond to it.

    Fuck you, Windows 7 dies in less than two days. Good riddance, it was already replaced years ago. There are many choices for alternatives that continue to receive updates. Not the least of which is Windows 10.

    If you still insist on using Windows 7, enjoy your final Update Tuesday, then indefinite silence on the updates front. Maybe Anonymous will come along and help you continue back porting patches from Windows 10 to make it a safe choice, if you trust Anonymous more than you trust, say, Microsoft with Windows 10, or your favorite Linux distribution, or your favorite BSD, since I heard all the chuds migrated to that after the EEEVIL Code of Conduct appeared.

    Come on now. Windows 10? On this board?

    Personally I use Debian so I'm not affected. I still maintain that Windows 7 ought to be the standard operating system, however. In Windows 8 they just broke everything for no good reason and in Windows 10 they pulled the age-old trick of "selling a solution to their problem".

    On this topic, I wonder what they got him for. Was the anonymous source right? Or maybe they just made him an offer he couldn't refuse?

    There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
    Posted on 20-01-13, 14:45
    Off-Model

    Post: #458 of 598
    Since: 10-29-18

    Last post: 86 days
    Last view: 9 hours
    Posted by stirfry
    I still maintain that Windows 7 ought to be the standard operating system, however. In Windows 8 they just broke everything for no good reason and in Windows 10 they pulled the age-old trick of "selling a solution to their problem".
    At least in this, we can agree to some non-zero extent.
    Posted on 20-01-13, 16:22
    Custom title here

    Post: #815 of 1150
    Since: 10-30-18

    Last post: 6 days
    Last view: 1 day
    Windows 10 is nice. And doesn't require a 3D accelerator to display multiple 2D windows properly.

    I don't even think the Win8 Start Screen was a fundamentally bad idea, though it definitely needed polish.
    The Start Menu was garbage when it was new and isn't improving with age. And most complaints about the Start Screen can be summed up as "it isn't the same Start Menu we've been kludging fixes onto since 1995."

    --- In UTF-16, where available. ---
    Posted on 20-01-13, 19:54

    Post: #147 of 175
    Since: 10-30-18

    Last post: 1212 days
    Last view: 1212 days
    Posted by sureanem

    I'm sad 1080p is dying. All of this new stuff is just pointless. It's not a meaningful advance at all.

    Are you blind? 1080p is blurry as hell, and even worse at over 22”. Yeah, let’s just stay at the arbitrary 96 dpi because some nerds got into computers when that was the norm.

    Sorry, I’ll take my ultra-sharp text and polygon edges over fucking with font antialiasing and filtering trying to make my eyes not hurt.
    Posted on 20-01-13, 21:23
    Dinosaur

    Post: #614 of 1282
    Since: 10-30-18

    Last post: 4 days
    Last view: 22 hours
    I'm already ready for the Win7ocalypse: I'll stay on Debian for profit, and keep Win7 for fun (which basically limits itself to whatever games I bought on Steam but can't be bothered to test with Proton/Wine). And since I don't even game that much nowadays, it means my Win7 installs will remain idle for quite some time in the future.

    (I do have my legit W10 licenses for both of my Win7 laptops, and a test setup on a spare HDD, but that's all)

    People still run XP over here. Hell, I just noticed one of my banks still uses an Acrobat Reader version from the late '90s (IIRC it was 5.0?) on their i5 ThinkCentres running shoehorned XP (bare metal - not even a VM!). I've been warning people on public/workplace computers with dubious digital hygiene practices to get the hell away from XP, and now the advice extends to W7 (and considering the pathetic state of our hardware specs in the field, it also means W10 is not an option at all for most, so it means "Linux or GTFO"), but the reality is "nobody that matters cares".

    But at MY house? W8+ is permabanned, XP/7 is booted only when absolutely required, and for new software acquisitions (free or otherwise), if your system requirements list doesn't include "a reasonably recent Linux distro", I'm out. Thankfully I don't Adobe/Autodesk/big audio at all :)

    Kinda relevant: "the PC is dead, the PC is dead, the PC is dead"
    https://tech.slashdot.org/story/20/01/13/0244242/the-end-of-windows-7-marks-the-end-of-the-pc-era-too

    Licensed Pirate® since 2006, 100% Buttcoin™-free, enemy of All Things JavaScript™
    Posted on 20-01-14, 14:47 (revision 1)
    Stirrer of Shit
    Post: #706 of 717
    Since: 01-26-19

    Last post: 1525 days
    Last view: 1523 days
    Posted by tomman
    (and considering the pathetic state of our hardware specs in the field, it also means W10 is not an option at all for most, so it means "Linux or GTFO"), but the reality is "nobody that matters care".

    Is properly configured W10 significantly more demanding than W7? Can't you run Windows Server? That's what all the "almost privacy enthusiasts but not enthusiastic enough to use Linux" people do.

    Kinda relevant: "the PC is dead, the PC is dead, the PC is dead"
    https://tech.slashdot.org/story/20/01/13/0244242/the-end-of-windows-7-marks-the-end-of-the-pc-era-too

    It is dying. The CIA want to ban compilers, and all that.

    EDIT: what? Do I have three names now?

    There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
    Posted on 20-01-16, 08:14

    Post: #235 of 449
    Since: 10-29-18

    Last post: 9 days
    Last view: 13 hours
    Sutro Baths Cinemagraph - San Francisco

    My current setup: Super Famicom ("2/1/3" SNS-CPU-1CHIP-02) → SCART → OSSC → StarTech USB3HDCAP → AmaRecTV 3.10
    Posted on 20-01-18, 21:42 (revision 1)
    Dinosaur

    Post: #615 of 1282
    Since: 10-30-18

    Last post: 4 days
    Last view: 22 hours
    Toshiba claims they've found code that is faster than computers you can't buy (including computers that don't exist yet):
    https://tech.slashdot.org/story/20/01/17/1431207/toshiba-touts-algorithm-thats-faster-than-a-supercomputer
    They even claim you can run it at "room temperature", whatever that means...

    Banks already called first dibs on it, naturally.
    Wonder if it can also help Toshiba [s]better cover their accounting scandals[/s] help with their day-to-day cutting-edge financial operations.

    Licensed Pirate® since 2006, 100% Buttcoin™-free, enemy of All Things JavaScript™
    Posted on 20-01-18, 23:41

    Post: #236 of 449
    Since: 10-29-18

    Last post: 9 days
    Last view: 13 hours
    Posted by tomman
    They even claim you can run it at "room temperature", whatever that means...

    Quantum computers require near absolute-zero temperatures, unlike a normal CPU.

    My current setup: Super Famicom ("2/1/3" SNS-CPU-1CHIP-02) → SCART → OSSC → StarTech USB3HDCAP → AmaRecTV 3.10