    New reply
    Post help

    Presentation

    [b]…[/b] — bold type
    [i]…[/i] — italic
    [u]…[/u] — underlined
    [s]…[/s] — strikethrough
    [code]…[/code] — code block
    [spoiler]…[/spoiler] — spoiler block
    [spoiler=…]…[/spoiler]
    [source]…[/source] — color-coded block, assuming C#
    [source=…]…[/source] — color-coded block, specific language[which?]
    [abbr=…]…[/abbr] — abbreviation
    [color=…]…[/color] — set text color
    [jest]…[/jest] — you're kidding
    [sarcasm]…[/sarcasm] — you're not kidding

    Links

    [img]http://…[/img] — insert image
    [url]http://…[/url]
    [url=http://…]…[/url]
    >>… — link to post by ID
    [user=##] — link to user's profile by ID

    Quotations

    [quote]…[/quote] — untitled quote
    [quote=…]…[/quote] — "Posted by …"
    [quote="…" id="…"]…[/quote] — ""Post by …" with link by post ID

    Embeds

    [youtube]…[/youtube] — video ID only please
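    Since the [youtube] tag wants the bare video ID rather than a full watch URL, here is a small sketch of pulling the ID out of a typical link; the helper name and the URL handling below are purely illustrative, not part of the board software:

[code]
# Minimal sketch: extract the bare video ID that the [youtube] tag expects
# from a full watch URL. The function name is illustrative only.
from urllib.parse import urlparse, parse_qs

def youtube_video_id(url):
    parsed = urlparse(url)
    if parsed.hostname in ("www.youtube.com", "youtube.com", "m.youtube.com"):
        # Standard watch URLs carry the ID in the "v" query parameter.
        return parse_qs(parsed.query).get("v", [None])[0]
    if parsed.hostname == "youtu.be":
        # Short links carry the ID as the path component.
        return parsed.path.lstrip("/") or None
    return None

print(youtube_video_id("https://www.youtube.com/watch?v=yE9n_llJOVQ"))  # -> yE9n_llJOVQ
[/code]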
    Thread review
    creaothceann Absolutely disgusting.

    https://www.youtube.com/watch?v=yE9n_llJOVQ
    tomman
    Posted by Screwtape
    Is NVIDIA's announcement enough?

    I believe Apple, Qualcomm and maybe Samsung have perpetual licences, so it doesn't matter what NVIDIA does, they can noodle around and do their own thing. However, one of the big advantages of ARM is its popularity - if NVIDIA screws up and the low end of the market moves over to RISC-V or OpenSPARC or something, those companies might want to shift over as well.

    The official NVDA press release

    I see too much "AI" buzzword bingo BS, but aside from that... welp, it's official. Masayoshi Son, thanks for ruining computers and fucking over the UK's pride! Yeah, gotta grab some popcorn to watch the regulatory shitshow unfold, particularly in the EU and China. Fun fact: a lot of people believe nVidia is a Chinese corporation, à la Huawei. But nope, it's American - its founder is half-Taiwanese (y'know, from the original China, fuck PRC's "One China" BS), but yeah, people can't even read these days.

    > RISC-V
    And now I can see the huge grins on the faces of Western Digital's SoC designers when they told Marvell that they were going all-in with this newfangled RISC-V thing. Now I guess there must be a huge party going on there celebrating "the death of ARM" or something :D
    Can't wait for those $50 RISC-V Android phones if nVidia screws up this acquisition.

    There IS a positive side to all this mess: the faster nVidia kills Mali, the better. Or maybe they'll finally ship a half-decent binary blob driver, at least?
    CaptainJistuce I will be REALLY interested in seeing what the regulatory agencies have to say about this.

    And FUCK YEAH! OPENSPARC MASTER RACE!
    Screwtape Is NVIDIA's announcement enough?

    I believe Apple, Qualcomm and maybe Samsung have perpetual licences, so it doesn't matter what NVIDIA does, they can noodle around and do their own thing. However, one of the big advantages of ARM is its popularity - if NVIDIA screws up and the low end of the market moves over to RISC-V or OpenSPARC or something, those companies might want to shift over as well.
    tomman Apparently nVidia's checkbook is the only one willing to go wide open for that ARM buy: Masayoshi Son is set to sell ARM to greedy greens for $40 BILLION. Turns out that if this goes through, Son would end up being a lucky moron, as he'd be making an $8B profit on the sale (SoftBank paid about $32 billion for ARM back in 2016). Too bad he will keep blowing those monies on [s]hookers and blackjack[/s] unicorns :/

    (Does anyone have a non-paywalled link? All sources I can find are the WSJ and Financial Times, which as expected want your money and personal info to read beyond the headline)

    In the meantime, this is gonna become yet another regulatory disaster (divestment hell, or nVidia getting very busy buying lawmakers) - Apple, Qualcomm, Google, and Samsung all have very good reasons to be against the sale. Where is that "Save ARM" consortium / crowdfunding initiative?!
    CaptainJistuce I consider a black background preferable to stretched wallpaper, so file this one as "feature, not bug"
    tomman Didn't notice this, as I'm finally catching up with the final batch of Win7 updates (I haven't even booted into W7 at home THIS YEAR): as a final "fuck you and thanks for all the fish", MS broke your wallpaper on the final update, leaving you with a solid black background (the punishment pirates get when their Windows is deemed "not genuine" by the system)... but only under specific conditions (you have to use the "stretch" mode).

    Originally there was no fix, except to [s]downgrade to Wintendo Diez[/s] roll back the patch, but I guess that after the hordes of angry tweets, MS had to push an exceptional ~50MB patch to undo their silly fuckup. You won't be getting this one from WU, though - you have to source it from the Catalog instead. Install, reboot, enjoy [s]XP4LYFE[/s] the best Windows~

    Sidenote: Windows releases past XP really dislike being booted only a couple of times a year. They get all wonky (missing icons, activation failures, excessive slowness), but all you need to restore things to normal is another couple of reboots.
    kode54 See, I like that too. But the tiny appliance that you can't manipulate in any way, and that may as well have software limiting what you can install on it, is what people above are saying the future is. And I maintain several Mac apps anyway. I'll probably need to go down that path someday if I want to continue producing Mac apps for people to use.
    tomman So people prefer the "disposable, portless, soldered everything" future to the "we're in total control of our lives" future because... smaller is better?

    And people ask me why I think computers are doomed :/
    I suggest we stop calling such crippled, simple devices "computers" and instead refer to them as appliances, because that's what they are in the end: turn it on, press buttons, get results.

    No, I don't want a freakin' IBM mainframe in my room (although I wouldn't mind re-purposing one as a storage locker :D). But I don't want a computer I can fit inside a book either! Not as my main system / daily driver, at least - I can have fun with appliances, but I can't depend solely on them to be a fully functional person. I want ports. I want slots. I want expandability. I want to be able to get my shit working again by replacing whatever part failed, instead of going "ooooh, time to buy a new appliance"!

    But then, that's just me. If you're happy with your appliances and are more productive with them, more power to you.
    Just leave our big fat ports-a-galore boxes alone and we good.
    CaptainJistuce Judging you.
    kode54 No, see, my next PC purchase will be an ARM Mac Mini. See? Solves those pesky size problems, heat problems, and I don't have to worry about upgrades either, because what it comes with will be more than sufficient for the next generation or two.
    CaptainJistuce
    Posted by wertigon


    There will still be room for large cases - but just like we moved from the MicroVAX form factor to full towers to mid-towers, I think future desktops will be small enough to mount on the back of our monitors.
    You left out the part where we went from mini-towers back to mid-towers. My AT machine is smaller than my ATX machine, in large part because the drive cage and power supply overhang the motherboard. Like an ITX machine, only with less origami and no special cooling considerations.

    Modern processor and graphics adapter power demands make this a very challenging form factor to work with instead of just "how computers are built". Yes, even smaller systems exist, but they wind up being very restricted by heat unless a LOT of care is put into the cooling design (as we continue to see with laptops every year).
    tomman Either spinning rust goes full native USB on both 2.5" and 3.5" (the latter won't be happening at least until USB can supply 12V - there will always be a SATA bridgeboard somewhere), or we leave SATA ports alone.

    If HDD performance eventually bottlenecks on SATA/6Gb (which may or may not happen in our lifetimes, considering the tech is currently about as cutting-edge as physically possible), why not just raise the clock speeds again? It worked fine for the SAS guys (they do have SAS/12Gb, although by then the 10/15K screamers were already passé, NVMe had happened, and for slow, bottomless tanks they were just fine with cheap SATA drives, since RAID is still a thing).
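    For a rough sense of how much headroom SATA/6Gb still has over spinning rust, here is a back-of-the-envelope sketch; the 280 MB/s drive figure below is an assumed ballpark for a current 7200rpm disk, not a measurement:

[code]
# Back-of-the-envelope: how close do current HDDs get to saturating SATA 6Gb/s?
# The 280 MB/s sequential figure is an assumed ballpark, not a measurement.
sata3_line_rate_gbps = 6.0
encoding_efficiency = 8 / 10  # SATA uses 8b/10b encoding on the wire
sata3_payload_mbps = sata3_line_rate_gbps * 1e9 * encoding_efficiency / 8 / 1e6

hdd_sequential_mbps = 280  # assumed best-case sequential rate, outer tracks

print(f"SATA 6Gb/s usable payload: ~{sata3_payload_mbps:.0f} MB/s")
print(f"Headroom over the HDD:     ~{sata3_payload_mbps / hdd_sequential_mbps:.1f}x")
[/code]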

    Posted by wertigon
    Posted by kode54
    Fine, you buy your new shit, I'll keep using something that still works for another decade.


    You are aware single-core computers will not exist anywhere on the market in 2030, right? Maybe in small, low-end embedded systems, but otherwise most of the world will have moved on. Today's high-end CPUs start at 16c/32t beasts and go all the way to 64c/128t; by 2030 this will probably be at *least* 256c/1024t machines running at 4-5 GHz boost clocks. The lowest end will probably be 16 cores by then.

    At that point, lending one core to focus 100% on a single data transfer op will be perfectly acceptable even for a low-end NAS. Software RAID will probably also be a thing.


    By then even those will not be enough for your Facebooks, Slacks, and Discords, because Chinese Silly Valley will still be too busy churning out JavaScript (or hipster-language-of-the-year) bloatfests. We don't need faster CPUs - what we need is for us coders (yes, that includes me!) to stop assuming that "hardware is cheap, developer time is not". Hardware is NOT cheap when it's your users who have to foot the needless upgrade bills!

    I'm all for faster video encoding boxes and sub-second Linux kernel build times, but if the main driving force for upgrading is being able to run the latest $SOCIALFAD whose core functionality is no different from '90s IRC (but with emojis!), then I'll stick to my museum-grade hardware, thanks. And don't get me started on videogames (remind me why we play videogames, pretty please? It's not for the specs, dummy!)
    wertigon
    Posted by kode54
    Fine, you buy your new shit, I'll keep using something that still works for another decade.


    You are aware single-core computers will not exist anywhere on the market in 2030, right? Maybe in small, low-end embedded systems, but otherwise most of the world will have moved on. Today's high-end CPUs start at 16c/32t beasts and go all the way to 64c/128t; by 2030 this will probably be at *least* 256c/1024t machines running at 4-5 GHz boost clocks. The lowest end will probably be 16 cores by then.

    At that point, lending one core to focus 100% on a single data transfer op will be perfectly acceptable even for a low-end NAS. Software RAID will probably also be a thing.
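    For what it's worth, the jump from 64 cores today to at least 256 by 2030 works out to roughly one doubling of top-end core counts every five years; a quick sketch of that extrapolation (the cadence is just what those numbers imply, not any vendor roadmap):

[code]
# Sketch of the extrapolation implied above: top-end core counts going from
# 64 (today) to at least 256 by 2030. The cadence is an assumption, not a roadmap.
import math

cores_now, year_now = 64, 2020
cores_target, year_target = 256, 2030

doublings = math.log2(cores_target / cores_now)            # 2 doublings
years_per_doubling = (year_target - year_now) / doublings  # ~5 years

print(f"{doublings:.0f} doublings in {year_target - year_now} years "
      f"-> one doubling every {years_per_doubling:.0f} years")
[/code]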

    Posted by kode54
    Also, what the fuck, are you going to route external cables for internal drives, or is everyone supposed to be moving to tiny ass micro PCs by 2022 too?


    There will still be room for large cases - but just like we moved from the MicroVAX form factor to full towers to mid-towers, I think future desktops will be small enough to mount on the back of our monitors. Instead of having an iMac, we simply combine a screen with an upgradable PC unit, which is actually an awesome setup to have. :)

    2TB is not enough for everyone, but it's a lot of storage for most people. Especially given how ubiquitous cloud storage is (where you let another company back up your data).

    But 'dem warez and pr0n are a thing of course...
    funkyass
    Posted by kode54
    Fine, you buy your new shit, I'll keep using something that still works for another decade.

    Also, what the fuck, are you going to route external cables for internal drives, or is everyone supposed to be moving to tiny ass micro PCs by 2022 too?


    Future's gonna be wireless - seriously, USB-C before then.
    kode54 Fine, you buy your new shit, I'll keep using something that still works for another decade.

    Also, what the fuck, are you going to route external cables for internal drives, or is everyone supposed to be moving to tiny ass micro PCs by 2022 too?
    funkyass Copying two different ISOs from two different USB 3.0 sticks barely moved the CPU utilization past 7% on my 3700X, with both sticks supplying 128 MB/s each to my NVMe drive.

    FireWire was also less of a CPU hog than USB 2.0, but it's not 2005.
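    If anyone wants to reproduce that kind of measurement, here is a rough sketch that copies a file and samples overall CPU load while it runs; psutil is a third-party package, the paths are placeholders, and the numbers will obviously vary by machine:

[code]
# Rough sketch: copy a large file and sample total CPU utilization while the
# copy runs. Needs the third-party "psutil" package; paths are placeholders.
import os
import shutil
import threading
import time

import psutil  # pip install psutil

SRC = "/mnt/usb_stick/image.iso"  # placeholder source (USB stick)
DST = "/mnt/nvme/image.iso"       # placeholder destination (NVMe drive)

samples = []
done = threading.Event()

def sample_cpu():
    # Sample system-wide CPU utilization twice a second until the copy finishes.
    while not done.is_set():
        samples.append(psutil.cpu_percent(interval=0.5))

watcher = threading.Thread(target=sample_cpu)
watcher.start()

start = time.time()
shutil.copyfile(SRC, DST)
elapsed = time.time() - start

done.set()
watcher.join()

size_mb = os.path.getsize(SRC) / 1e6
print(f"{size_mb:.0f} MB in {elapsed:.1f} s -> {size_mb / elapsed:.0f} MB/s, "
      f"average CPU {sum(samples) / max(len(samples), 1):.1f}%")
[/code]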
    CaptainJistuce Hook 'em up through USB-C in thunderbolt mode.
    kode54 It doesn't notice when you're churning 200MB/s across it? Ok, shove a RAID on USB ports. That's perfectly fine.
    funkyass
    Posted by kode54
    Also, you do know that USB is CPU bound, right? And that SATA isn't?


    my CPU doesn't seem to notice.