Main » Programming » Your daily dose of processor unit vulnerabilities » New reply
    New reply
    Post help

    Presentation

    [b]…[/b] — bold type
    [i]…[/i] — italic
    [u]…[/u] — underlined
    [s]…[/s] — strikethrough
    [code]…[/code] — code block
    [spoiler]…[/spoiler] — spoiler block
    [spoiler=…]…[/spoiler]
    [source]…[/source] — color-coded block, assuming C#
    [source=…]…[/source] — color-coded block, specific language[which?]
    [abbr=…]…[/abbr] — abbreviation
    [color=…]…[/color] — set text color
    [jest]…[/jest] — you're kidding
    [sarcasm]…[/sarcasm] — you're not kidding

    Links

    [img]http://…[/img] — insert image
    [url]http://…[/url]
    [url=http://…]…[/url]
    >>… — link to post by ID
    [user=##] — link to user's profile by ID

    Quotations

    [quote]…[/quote] — untitled quote
    [quote=…]…[/quote] — "Posted by …"
    [quote="…" id="…"]…[/quote] — ""Post by …" with link by post ID

    Embeds

    [youtube]…[/youtube] — video ID only please
    Thread review
    tomman The logo-and-website vulnerability of the day: Plundervolt

    Some researchers just figured out that if you tamper with a power supply circuit, Bad Things™ can happen.
    Also, root required, but then if you've got root, why bother going deeper?

    "Buy AMD", they say.
    ‮strfry("emanresu")
    I'm not crying over spilt milk, I'm raging over the fact that no one's standing the jug up before the rest of it dumps out on the floor. Things CAN be done.

    But whatever. You're actually the enemy. This "take it and be happy because you are powerless" attitude is why things have gotten as bad as they have. I'm wasting time I could spend on literally anything else arguing with either a corporate shill or a consumer whore.

    Hey, calm it with the personal attacks. I am neither a corporate shill nor a consumer whore. I have never set foot inside an Intel office (or any chipmaker's office, for that matter), nor have I ever purchased an Intel CPU in my life, except for when it was inside a laptop.
    I don't own any stock in Intel either, except for whatever exposure I get through retirement funds and whatnot. My closest connection to them is probably that I think the AVX intrinsics are fun to program with and the documentation site is kind of neat, but that doesn't mean I like them.

    And it isn't that I think what Intel is doing is particularly good or that I feel enthusiasm about it, just apathy. You can rage all you want, but nothing can be done about the matter. The "solution" essentially boils down to "go buy AMD or something". Even if this were a good solution, there's no point in doing so - if their CPUs are good enough, they'll sell anyway, and if they're not, they won't. In no way do my actions ever enter the equation, so I might as well take it easy and save my money.

    Anyway, the main concern isn't that chipmaker A would best chipmaker B by reducing CPU costs by X%; that's rearranging the deck chairs on the Titanic. Soon the desktop will be an utterly irrelevant platform, and we'll all be stuck with Chinese locked-down ARM smartdevices with hardware DRM, with nobody even bothering to translate the SDK (which would require a license, of course, and depend on access to The Cloud™) into English.

    Nothing can be done about this either. And lest you accuse me of changing the subject: I'm just saying that chipmaker profits are a fairly irrelevant concern in the grand scheme of things.
    Posted by CaptainJistuce
    If only profit margins, processor performance, and feature sets were publicly available data. Oh, wait...
    It is rather easy to confirm that having segmented markets for full-featured enterprise processors and kneecapped personal-use processors is a modern business practice that didn't even START until the Core line, and resulted in wildly larger profit margins(because businesses will buy what they need no matter the cost IF there isn't a lower-cost alternative).

    I wasn't around for the early era of microprocessors, but that should still make sense. VIA and Cyrix were making CPUs, because the cost of entry back then was lower. There's even a law, Moore's second law, stating that "the cost of a semiconductor chip fabrication plant doubles every four years". The 486 came out in 1989; according to the article, a plant cost $14B in 2015, so that gives around $80M for a plant back in 1989. About the same as in this article. Also, the R&D you'd have to catch up on hadn't gotten as far yet.
    Businesses would pay money out the ass even if there were a lower-cost alternative, as long as paying out the ass is the so-called "industry best practice". The same goes for universities (in the first world), or at least a good chunk of them.


    So every processor from the 486 onward has made a mistake by integrating the FPU? At the time, it was a niche feature that normal users didn't need. There was some annoyance that people who were only going to need integer performance were being forced to pay for an FP coprocessor.
    ONCE IT WAS AVAILABLE, it started seeing widespread usage because, hey, it turns out that it wasn't that useless after all.

    I suppose the FPU is a different case, but that's a good point. However, stuff like AVX is a fair bit more niche. Compilers can't vectorize well, so all you get is optimizations for programs specially written to take advantage of it (and yes, they do get far faster, even though a GPU is often better still) and a ~5% speedup for everything else.
    Joe Q. Public won't get much out of AVX. It's mostly used in video en/decoding, but H.264, the only codec of value to him, is hardware-accelerated anyway.
    https://dxr.mozilla.org/mozilla-central/search?q=m256&redirect=true
    As you can see, it's mostly used in libvpx. That doesn't make the browser go faster. Also, Mr. Public would only use Chrome anyway, which isn't any better.
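    (To make "specially written to take advantage of it" concrete, here's a minimal sketch of my own, not anything from the linked search results: summing an array eight floats at a time with AVX intrinsics. Build with something like gcc -mavx.)
    [source=c]
    #include <immintrin.h>
    #include <stdio.h>

    /* Sum a float array 8 lanes at a time; assumes n is a multiple of 8. */
    static float sum_avx(const float *x, int n)
    {
        __m256 acc = _mm256_setzero_ps();
        for (int i = 0; i < n; i += 8)
            acc = _mm256_add_ps(acc, _mm256_loadu_ps(x + i));

        /* fold the 8 lanes back down to a scalar */
        float lanes[8];
        _mm256_storeu_ps(lanes, acc);
        float s = 0.0f;
        for (int i = 0; i < 8; i++)
            s += lanes[i];
        return s;
    }

    int main(void)
    {
        float x[16];
        for (int i = 0; i < 16; i++)
            x[i] = (float)i;
        printf("%f\n", sum_avx(x, 16)); /* prints 120.000000 */
        return 0;
    }
    [/source]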

    It is market segregation that largely did not exist before the Core series, and didn't exist AT ALL before the Pentium Pro. The PPro had the excuse of being genuinely expensive to manufacture. The Pentium 2 started the trend of ARTIFICIAL market segmentation, where you disable features on most of your parts so that you can charge a premium for the ones that still allow access to those features.

    Actually, you traditionally gain much better warranty terms and support in exchange for the Quadro prices. Though nVidia started feature-gating once businesses started becoming more concerned with the price than the support.
    And now they flat-out include usage restrictions in the driver license so you legally CAN'T use cheaper enthusiast cards in a business or research setting REGARDLESS of the performance and feature set.

    Yeah, but wasn't that just because enthusiasts and professionals had about the same needs back then, and then they diverged?
    Market segmentation is a good thing, in theory. If a company makes $X more on the expensive "industry-grade" products, it only needs to earn $X less on the consumer-grade ones to keep the same level of profit. This is what Adobe does: for personal use, Photoshop is effectively free (yes, this is a strategy they encourage), but for businesses that's obviously not an option outside of China.
    The Quadro example proves my point: they get warranty and support, yes, but is it really worth paying thousands of dollars for? And the license, same thing. It's not a crime to violate an EULA, and the probability of them getting sued is infinitesimal.
    Stuff like the PS3 supercomputer does exist, you know. But nobody wants to save money, because they'd rather waste it on a status symbol that brings in the tuition money.
    Journal subscriptions (an utter fraud) are another great example of this. Cut them, let the students teach each other about Sci-Hub, and wait until the Elsevier problem solves itself.
    But I digress. Anyway, the fact that big corporations need to pay big $$$ for status symbols isn't a big problem to me.

    But not the way it always was, or the way it has to be.

    It wasn't that way before because the barriers to entry were lower. At least now things are okay - you can still purchase loose CPUs. I don't think they can get much better, since there's no market for it.

    Individuals just get less for their dollar than they used to(and businesses get MUCH less).

    To be blunt, fuck businesses. They waste money, that's on them.
    I get more for my dollar now than at any other time in the past. Especially if I buy used CPUs. The most expensive CPU in, say, 1990 probably cost around the same as the most expensive now, but it sure as heck wasn't as fast.

    I'm paying Intel to manufacture a processor, and I'm willing to pay more for a part that is actually harder to make. But when Intel tells me I should pay more not because it is harder, but because they want a wider profit margin, I get upset.

    Well, what should be done about it? They can justify this profit margin with the higher cost of entry, à la might makes right, and unless you have $billions lying around in your couch cushions you sadly have to agree they're right on this.
    (Also, the profit margins are not that high. A P/E of 10 is high, but neither stellar nor an all-time high.)


    Even when they were ruling the roost, they binned processors honestly(easy enough to test given that their processors were easy to unlock during their dominance), and didn't gate features.

    Don't they do the exact same stuff, like disabling broken/unstable cores with lasers? I know you could reflash 270 to 270X, but I don't think you could do that for 370 -> 370X, for instance.

    If AMD ever gets close to ahead of Intel, Intel will ease up on their bullshit. We've already seen that as Zen has proven damaging to their sales and they've responded by offering better products at lower prices. Hooray for some fucking competition.

    Yeah, but this only works up to a certain point, because if AMD gets far enough ahead they'll start pulling the same moves Intel did.
    CaptainJistuce
    Posted by sureanem
    Posted by CaptainJistuce

    It is a shift in business model, it gets worse every year, and that R&D budget was CLEARLY not going toward actually improving their product if the raft of Intel-exclusive vulnerabilities is any indication.

    They've boosted profits wildly over the last several years without actually offering improved products, and in fact with the pace of computer upgrades slowing greatly in part due to a lack of meaningful updates.

    Things haven't always been this way? I doubt it.

    If only profit margins, processor performance, and feature sets were publicly available data. Oh, wait...
    It is rather easy to confirm that having segmented markets for full-featured enterprise processors and kneecapped personal-use processors is a modern business practice that didn't even START until the Core line, and resulted in wildly larger profit margins(because businesses will buy what they need no matter the cost IF there isn't a lower-cost alternative).

    They tried to set up separate processors entirely, but the Itanium was a flop, and the Pentium 4 was a hilarious trainwreck(in part due to poor engineering, in part due to Intel proclaiming that enthusiasts didn't need 64-bit while AMD was offering 64-bit processors to enthusiasts). So Intel licensed AMD64 from AMD. And once they recovered their footing(mostly because Bulldozer was a mistake), they started feature-gating the Core line.



    I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.

    What do you mean, a shift? Presumably, they're in it for the money like everyone else, so they'd always have done that. Most people nowadays have no need for "the best product" like they did in, say, the 90's, because what makes a CPU the best is often niche features that far from all applications can leverage.

    So every processor from the 486 onward has made a mistake by integrating the FPU? At the time, it was a niche feature that normal users didn't need. There was some annoyance that people who were only going to need integer performance were being forced to pay for an FP coprocessor.
    ONCE IT WAS AVAILABLE, it started seeing widespread usage because, hey, it turns out that it wasn't that useless after all.



    Xeon CPUs and server-grade chipsets are kind of a special case. The people who buy them are quite price-insensitive, they're the kind of people who throw out the whole machine whenever a part breaks. So it's just regular market segregation.

    It is market segregation that largely did not exist before the Core series, and didn't exist AT ALL before the Pentium Pro. The PPro had the excuse of being genuinely expensive to manufacture. The Pentium 2 started the trend of ARTIFICIAL market segmentation, where you disable features on most of your parts so that you can charge a premium for the ones that still allow access to those features.

    This is not much different from Nvidia and their Quadro cards. The Quadro cards are just more expensive for no real reason, because the people who buy them have infinity money, and wouldn't ever dare hack something together out of reflashed, warranty-voided cards.

    Actually, you traditionally gain much better warranty terms and support in exchange for the Quadro prices. Though nVidia started feature-gating once businesses started becoming more concerned with the price than the support.
    And now they flat-out include usage restrictions in the driver license so you legally CAN'T use cheaper enthusiast cards in a business or research setting REGARDLESS of the performance and feature set.


    That's just the way it is.
    But not the way it always was, or the way it has to be.

    For regular consumer processors, prices are still normal.
    Individuals just get less for their dollar than they used to(and businesses get MUCH less).

    I don't agree with you on that. You can buy a silicon wafer for next to nothing, what you're paying for is Intel's monopoly on the IP needed to manufacture one of whatever processor it is you have.
    I'm paying Intel to manufacture a processor, and I'm willing to pay more for a part that is actually harder to make. But when Intel tells me I should pay more not because it is harder, but because they want a wider profit margin, I get upset.



    AMD would if they could, I'd reckon. AMD did sell locked CPUs in the past, they just didn't do it for Ryzen.

    Even when they were ruling the roost, they binned processors honestly(easy enough to test given that their processors were easy to unlock during their dominance), and didn't gate features.

    If AMD ever gets ahead of Intel, presumably the sides would switch and Intel would be selling cheaper unlocked processors. And then we would be talking about how AMD is greedy and how we should all buy Intel instead, or something like that.

    If AMD ever gets close to ahead of Intel, Intel will ease up on their bullshit. We've already seen that as Zen has proven damaging to their sales and they've responded by offering better products at lower prices. Hooray for some fucking competition.



    "Everything is terrible, and it is only going to get worse, so take it with a smile."
    Ummm, how about no? That's a terrible fucking attitude.

    Whether you smile or not makes no difference, but that's the way it is, terrible or not. There's no use in crying over spilled milk.
    I agree that it's regrettable, but nothing can be done about it, so why care?

    I'm not crying over spilt milk, I'm raging over the fact that no one's standing the jug up before the rest of it dumps out on the floor. Things CAN be done.


    But whatever. You're actually the enemy. This "take it and be happy because you are powerless" attitude is why things have gotten as bad as they have. I'm wasting time I could spend on literally anything else arguing with either a corporate shill or a consumer whore.
    ‮strfry("emanresu")
    Posted by BearOso
    most kernel programmers work for virtualization providers.

    Ah, so that's why.
    Always thought it seemed overblown.
    BearOso
    Posted by JieFK
    Coincidentally, a mail was posted today on Dng that gives a few flags to pass to your kernel command line to disable those patches:
    https://lists.dyne.org/lurker/message/20190516.104800.1d3cc002.en.html
    Almost the same as tomman's post above.

    We just need the last one now, mitigations=off. Thankfully, someone had some sense and realized these aren't exploitable without already giving a person significant access to a system. The only reason we're seeing such panic is because most kernel programmers work for virtualization providers.
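    (For reference, and purely as an illustrative sketch rather than advice: on a typical Debian-ish setup those flags end up in /etc/default/grub, roughly as below. On recent kernels the single mitigations=off switch rolls up the individual flags like nopti, nospectre_v2, l1tf=off, and mds=off; nosmt is the related knob that turns hyper-threading off instead.)
    [code]
    # /etc/default/grub -- illustrative sketch, not a recommendation
    # Recent kernels: one switch disables the whole pile of Spectre/Meltdown/MDS
    # mitigations. Older kernels need the individual flags
    # (nopti nospectre_v1 nospectre_v2 l1tf=off mds=off ...).
    GRUB_CMDLINE_LINUX_DEFAULT="quiet mitigations=off"

    # then regenerate the config and reboot:
    #   sudo update-grub
    [/code]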
    ‮strfry("emanresu")
    Posted by CaptainJistuce

    It is a shift in business model, it gets worse every year, and that R&D budget was CLEARLY not going toward actually improving their product if the raft of Intel-exclusive vulnerabilities is any indication.

    They've boosted profits wildly over the last several years without actually offering improved products, and in fact with the pace of computer upgrades slowing greatly in part due to a lack of meaningful updates.

    Things haven't always been this way? I doubt it.
    As for the R&D budget, those are two disparate things. If I had to choose between two processors, one with an obscure theoretical vulnerability and the other one 20% faster, I'd sure as heck go with the faster one. Even at just 5% or 10% or whatever. I mean, sure, fix Spectre if it's been disclosed, but don't go around sacrificing performance for "security".

    I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.

    What do you mean, a shift? Presumably, they're in it for the money like everyone else, so they'd always have done that. Most people nowadays have no need for "the best product" like they did in, say, the 90's, because what makes a CPU the best is often niche features that far from all applications can leverage.

    ECC RAM disabling, of course, goes back to before the DRAM controller was even ON the processor. Intel worked hard to get everyone else out of the motherboard chipset business, then immediately turned around and started removing features from their chipsets and telling people it was an improvement because they were removing features people didn't need(except on the new server-grade chipsets that were suspiciously much more expensive despite being the same thing aside from those removed features still being present).

    ...
    Umm, yes? Have you SEEN the markup on Xeon processors?

    Xeon CPUs and server-grade chipsets are kind of a special case. The people who buy them are quite price-insensitive, they're the kind of people who throw out the whole machine whenever a part breaks. So it's just regular market segregation.
    Case in point: For any consumer-grade CPU, the MSRP is pretty much what you get if you're not buying used. For server-grade CPUs, you can get all sorts of deals on Engineering Samples and whatnot. Just go on Taobao and search for Xeon, and you'll see what I mean.
    This is not much different from Nvidia and their Quadro cards. The Quadro cards are just more expensive for no real reason, because the people who buy them have infinity money, and wouldn't ever dare hack something together out of reflashed, warranty-voided cards.
    That's just the way it is. For regular consumer processors, prices are still normal.

    Your analogy is also flawed, because I may be paying for software rather than media, but in the case of the processor, I am actually paying for the physical object.

    If you bought a car and it had a V8 engine under the hood, but the ECU only fired four cylinders... you'd be mad, right? That's what Intel is doing.

    I don't agree with you on that. You can buy a silicon wafer for next to nothing, what you're paying for is Intel's monopoly on the IP needed to manufacture one of whatever processor it is you have.

    The car analogy is a good one. But no. I mean, sure, it's aggravating that parts are locked away. But not much can be done about it. It's like with injection molding: the marginal cost is almost zero, but there's a fat starting cost. And it's indifferent to me whether I get a CPU that has speed X, or one that has speed Y but is hardware-locked to speed X in a way that I can't ever change. So they might as well save their money. Sure, if they didn't do that I'd get an unlocked CPU, but the only real difference would be slightly higher expenses on their end.

    They aren't offsetting R&D costs, they are boosting profit margins.
    And I'm not going to thank them for throwing shit at my face just because they didn't actively force it down my throat.

    What's the difference? Both are just return on investment. I don't mean that they have the right to exact payment because they have R&D costs; I mean that they have the ability to do it because anyone who wanted to compete would face the same R&D costs.
    As for the desktop still being a viable platform, there's nothing to thank Intel for there. Not Microsoft either. Really, I don't know who to thank for it. PC gamers, maybe?


    Then they shouldn't have any R&D budget because there's nothing to develop.
    (also buttcoin is done on graphics cards or ASICs these days. No one digs for buttnuggets on CPU.)

    That's a reasonable point, and I suppose a race to the bottom might commence soon. Diminishing returns, I guess. They did make some strides even if Moore's law is dead. AVX-512 arrived in 2017 (Skylake-X), and still isn't fully implemented. We'll only get _mm512_aesdec_epi128 in Ice Lake, for instance. And then they might work on making AVX-512 work well; as it is now it has quite bad thermal throttling, and AMD doesn't have it. There are always going to be more features to add, at least for a few years. But after that, sure, you might get your $10 ARM processors.
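    (Side note, my own illustration rather than anything from this thread: that incomplete rollout is exactly why code using these extensions has to dispatch at runtime, e.g. with the GCC/Clang CPU-detection builtins, instead of just assuming the instructions exist.)
    [source=c]
    #include <stdio.h>

    int main(void)
    {
        __builtin_cpu_init();   /* GCC/Clang builtin: populate CPU feature info */

        if (__builtin_cpu_supports("avx512f"))
            printf("AVX-512F present: take the 512-bit path\n");
        else if (__builtin_cpu_supports("avx2"))
            printf("AVX2 only: fall back to the 256-bit path\n");
        else
            printf("scalar fallback\n");

        return 0;
    }
    [/source]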

    AMD AREN'T feature-gating, their processors AREN'T frequency-locked(and binned much more realistically in any case), and their profit margins are lower. I don't think that AMD actually respects its customers, but I think they aren't actively raping them in the ass with a railroad spike. I'm not sure why you would paint AMD and Intel with the same brush.

    AMD would if they could, I'd reckon. AMD did sell locked CPUs in the past, they just didn't do it for Ryzen. If AMD ever gets ahead of Intel, presumably the sides would switch and Intel would be selling cheaper unlocked processors. And then we would be talking about how AMD is greedy and how we should all buy Intel instead, or something like that.

    Just imagine this political cartoon, but with the political parties swapped for chipmakers.
    Ow!! Next time I'll buy AMD!


    "Everything is terrible, and it is only going to get worse, so take it with a smile."
    Ummm, how about no? That's a terrible fucking attitude.

    Whether you smile or not makes no difference, but that's the way it is, terrible or not. There's no use in crying over spilled milk.
    I agree that it's regrettable, but nothing can be done about it, so why care?
    CaptainJistuce
    Posted by sureanem
    Posted by CaptainJistuce

    It really sounds like that WAS what you're saying, but...

    No, my point is that the manufacturing process doesn't matter. It's like complaining about how Adobe would just burn one kind of DVD-ROM and then laser out a part of it on the Photoshop Elements disks. The disk costs mere cents and you're not paying for the piece of plastic, so why care about how they make their disks?

    It is a shift in business model, it gets worse every year, and that R&D budget was CLEARLY not going toward actually improving their product if the raft of Intel-exclusive vulnerabilities is any indication.

    They've boosted profits wildly over the last several years without actually offering improved products, and in fact with the pace of computer upgrades slowing greatly in part due to a lack of meaningful updates.

    ...

    I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.


    ECC RAM disabling, of course, goes back to before the DRAM controller was even ON the processor. Intel worked hard to get everyone else out of the motherboard chipset business, then immediately turned around and started removing features from their chipsets and telling people it was an improvement because they were removing features people didn't need(except on the new server-grade chipsets that were suspiciously much more expensive despite being the same thing aside from those removed features still being present).



    Your analogy is also flawed, because I may be paying for software rather than media, but in the case of the processor, I am actually paying for the physical object.

    If you bought a car and it had a V8 engine under the hood, but the ECU only fired four cylinders... you'd be mad, right? That's what Intel is doing.


    Is that greedy, then?

    Umm, yes? Have you SEEN the markup on Xeon processors?


    I mean, I know the memes about Intel, but what are they to do? They do have R&D costs, so unless the regulator forces them to sell at marginal cost or they get nationalized not much will happen. Be happy you can still buy a CPU, even if they're not gracious enough to give you complete control over it.

    They aren't offsetting R&D costs, they are boosting profit margins.
    And I'm not going to thank them for throwing shit at my face just because they didn't actively force it down my throat.


    There's not any legitimate use for more computing power anyway, so people would just squander it on electron/mining crypto/making HTML5 even more retarded.

    Then they shouldn't have any R&D budget because there's nothing to develop.
    (also buttcoin is done on graphics cards or ASICs these days. No one digs for buttnuggets on CPU.)

    If this offends you, you can do absolutely nothing about it, because it's not feasible (unless you're Chinese) to catch up to Intel/AMD

    AMD AREN'T feature-gating, their processors AREN'T frequency-locked(and binned much more realistically in any case), and their profit margins are lower. I don't think that AMD actually respects its customers, but I think they aren't actively raping them in the ass with a railroad spike. I'm not sure why you would paint AMD and Intel with the same brush.

    In other words, it's absurd to complain about things being bad when you can tell from a mile away they'll get far, far worse.

    "Everything is terrible, and it is only going to get worse, so take it with a smile."
    Ummm, how about no? That's a terrible fucking attitude.
    Screwtape https://www.youtube.com/watch?v=rKncAFAShkQ <-- a video of the ZombieLoad vulnerability in action, recovering the first few characters of root's hashed password from /etc/shadow in about a minute and a half.

    This goes out to all those people who hear about timing side-channel attacks and say "I know, let's add a random, small delay to each operation!"
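    (A quick sketch of why that doesn't help — a toy simulation of mine, not anything from the video: a random delay adds noise, but the attacker simply takes more samples, the noise averages out, and the secret-dependent difference survives.)
    [source=c]
    #include <stdio.h>
    #include <stdlib.h>

    /* Toy "operation" whose running time leaks one secret bit (10 units),
       padded with a random delay of up to 1000 units as a "defence". */
    static double op_time(int secret_bit)
    {
        double base = secret_bit ? 110.0 : 100.0;
        double jitter = (double)(rand() % 1000);
        return base + jitter;
    }

    int main(void)
    {
        const int samples = 1000000;
        double mean0 = 0.0, mean1 = 0.0;

        for (int i = 0; i < samples; i++) {
            mean0 += op_time(0);
            mean1 += op_time(1);
        }
        mean0 /= samples;
        mean1 /= samples;

        /* The random padding averages to the same constant for both cases,
           so the ~10-unit secret-dependent gap is still clearly visible. */
        printf("mean(bit=0)=%.2f  mean(bit=1)=%.2f  delta=%.2f\n",
               mean0, mean1, mean1 - mean0);
        return 0;
    }
    [/source]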
    ‮strfry("emanresu")
    Posted by CaptainJistuce

    It really sounds like that WAS what you're saying, but...

    No, my point is that the manufacturing process doesn't matter. It's like complaining about how Adobe would just burn one kind of DVD-ROM and then laser out a part of it on the Photoshop Elements disks. The disk costs mere cents and you're not paying for the piece of plastic, so why care about how they make their disks?

    God help us all if this kind of rapacious and abusive behavior is considered a good thing now.

    Is that greedy, then? I mean, I know the memes about Intel, but what are they to do? They do have R&D costs, so unless the regulator forces them to sell at marginal cost or they get nationalized not much will happen. Be happy you can still buy a CPU, even if they're not gracious enough to give you complete control over it.
    There's not any legitimate use for more computing power anyway, so people would just squander it on electron/mining crypto/making HTML5 even more retarded.
    If this offends you, you can do absolutely nothing about it, because it's not feasible (unless you're Chinese) to catch up to Intel/AMD, and for them it's cheaper/easier to license it than to obtain it via industrial espionage and leapfrog it. I suppose you could buy yourself a Hygon or Zhaoxin CPU, but you wouldn't be accomplishing very much.
    In other words, it's absurd to complain about things being bad when you can tell from a mile away they'll get far, far worse.
    CaptainJistuce
    Posted by sureanem

    No, I don't mean they're restricted by yields anymore; R&D is their main cost. And chip binning is just a simple optimization to do.


    "So it makes more sense to only manufacture a few types of CPUs, disable the parts that don't turn out so well, and then market them as different processor models based on what clock frequency, core count, etc they could sustain.

    Sure, you could argue this is immoral, but it's more efficient than trying to make all the different kinds of CPUs, throwing away some, and wasting enormous overclocking potential in some."

    It really sounds like that WAS what you're saying, but...



    But R&D is still a cost. You could argue that they're holding back the computer industry, which I suppose is true, but I think this is for the best. We don't deserve any better. Say clock speeds were to jump tomorrow, what would happen?
    A) people keep writing software like usual, but it now goes 2x as fast
    B) people make their software use 2x as much resources and claim the compiler will optimize it

    As much as I hate to say it, Intel did nothing wrong.

    God help us all if this kind of rapacious and abusive behavior is considered a good thing now.
    JieFK Coincidentally, a mail was posted today on Dng that gives a few flags to pass to your kernel command line to disable those patches:
    https://lists.dyne.org/lurker/message/20190516.104800.1d3cc002.en.html
    Almost the same as tomman's post above.
    ‮strfry("emanresu")
    Posted by CaptainJistuce
    If this were like the 486 SX, I'd agree.

    Intel moved several years ago to gating off features and setting clockspeeds based on what they WANTED to sell rather than supply being restricted by yields. It is why you occasionally see a processor model that reliably overclocks by 100%. The reason it isn't being sold as a faster part is because Intel doesn't want there to be a larger supply of faster parts.

    Especially obvious with things like ECC RAM and hyper-threading. There's no hardware failure I can imagine that renders them unusable that doesn't also kill the rest of the processor.

    No, I don't mean they're restricted by yields anymore; R&D is their main cost. And chip binning is just a simple optimization to do.
    But R&D is still a cost. You could argue that they're holding back the computer industry, which I suppose is true, but I think this is for the best. We don't deserve any better. Say clock speeds were to jump tomorrow, what would happen?
    A) people keep writing software like usual, but it now goes 2x as fast
    B) people make their software use 2x as much resources and claim the compiler will optimize it

    As much as I hate to say it, Intel did nothing wrong.

    Posted by Screwtape
    Security issues tend to be subtle, and understanding them usually requires a thorough understanding of the product in question, such as the product's designer might have. If you announce a vulnerability alongside patches and mitigation guides written by the vendor, you've probably found something cool and worth paying attention to. If you announce a vulnerability out of the blue, the best that's going to happen is that people will ask the vendor what's up, and the vendor will say "we dunno, give us a month or two to figure out whether this has any merit whatsoever". Much less dramatic, and hence much less likely to win you the respect of the security industry, or tenure, or whatever.

    Depends on how big the vulnerability is. The Israelis messed up by posting a paper that was as shady as can be, with no proof, and by an unknown company. If a (relatively) well-renowned university had posted it, with proof, and with some PoC code (ready for use) included, then the stock probably wouldn't have done so well.

    Do you really think Intel would help them exploit their own CPU? They'd do research, sure, but I don't think they'd share it with the public until they'd fixed it.

    Of course, the interesting thing with *this* vulnerability is that basically the same thing was found by many individuals and research groups, and Intel made all of them swear to secrecy individually, and strung them all along for up to a year, never letting them know about each other. If they'd been allowed to talk to each other, they might have been able to properly explore the security implications of today's CPUs, maybe even discover the *next* vulnerability-with-a-logo. Instead, they sat and twiddled their thumbs waiting for the responsible disclosure period to elapse.

    EDIT: Also, https://make-linux-fast-again.com/

    Now THAT's impressive. Say they had 10 people at each "discoverer" who knew about it, that's 110 people, and none of them took out short positions.

    Or maybe they did. I suppose we wouldn't know about it unless they told us. It's perfectly legal, so the bank wouldn't tattle on them, and if they want to be on good terms with the uni they wouldn't tell them about it either. So the only one who would hear about it would be their friends, if any, and even if they in turn would tattle on them, there'd be no conclusive proof either way. Especially not if they were smart and got someone else to do it.
    wareya The Mill was right.
    Screwtape
    Posted by sureanem
    Security people are infamous for having lax morals and a tenuous grasp on reality. And yet they felt the need to do the whole "responsible disclosure" schtick?

    Security issues tend to be subtle, and understanding them usually requires a thorough understanding of the product in question, such as the product's designer might have. If you announce a vulnerability alongside patches and mitigation guides written by the vendor, you've probably found something cool and worth paying attention to. If you announce a vulnerability out of the blue, the best that's going to happen is that people will ask the vendor what's up, and the vendor will say "we dunno, give us a month or two to figure out whether this has any merit whatsoever". Much less dramatic, and hence much less likely to win you the respect of the security industry, or tenure, or whatever.

    Of course, the interesting thing with *this* vulnerability is that basically the same thing was found by many individuals and research groups, and Intel made all of them swear to secrecy individually, and strung them all along for up to a year, never letting them know about each other. If they'd been allowed to talk to each other, they might have been able to properly explore the security implications of today's CPUs, maybe even discover the *next* vulnerability-with-a-logo. Instead, they sat and twiddled their thumbs waiting for the responsible disclosure period to elapse.

    EDIT: Also, https://make-linux-fast-again.com/
    CaptainJistuce
    Posted by sureanem

    More cores and hyper-threading are complementary, rather than contradictory. And it is frankly embarrassing that they disable hyper-threading on any of their products(just like so many other things they disable so they can charge a premium for something their entire product line is capable of).

    That's actually reasonable, though. The marginal cost of manufacturing a chip is negligible, R&D (and masks) is the expensive part. So it makes more sense to only manufacture a few types of CPUs, disable the parts that don't turn out so well, and then market them as different processor models based on what clock frequency, core count, etc they could sustain.

    Sure, you could argue this is immoral, but it's more efficient than trying to make all the different kinds of CPUs, throwing away some, and wasting enormous overclocking potential in some.

    I don't get it.

    If this were like the 486 SX, I'd agree.

    Intel moved several years ago to gating off features and setting clockspeeds based on what they WANTED to sell rather than supply being restricted by yields. It is why you occasionally see a processor model that reliably overclocks by 100%. The reason it isn't being sold as a faster part is because Intel doesn't want there to be a larger supply of faster parts.

    Especially obvious with things like ECC RAM and hyper-threading. There's no hardware failure I can imagine that renders them unusable that doesn't also kill the rest of the processor.
    ‮strfry("emanresu")
    Posted by CaptainJistuce
    Intel stopped caring too. That's the problem.

    I don't see any problem. I've never once been affected by these Spectre or Meltdown vulnerabilities. If I were, then whatever exploit kit managed to use it would affect many other people too, and then presumably someone (who still isn't me, mind you) would do something. I don't really have a dog in this fight, so why would I care?

    More cores and hyper-threading are complementary, rather than contradictory. And it is frankly embarrassing that they disable hyper-threading on any of their products(just like so many other things they disable so they can charge a premium for something their entire product line is capable of).

    That's actually reasonable, though. The marginal cost of manufacturing a chip is negligible, R&D (and masks) is the expensive part. So it makes more sense to only manufacture a few types of CPUs, disable the parts that don't turn out so well, and then market them as different processor models based on what clock frequency, core count, etc they could sustain.

    Sure, you could argue this is immoral, but it's more efficient than trying to make all the different kinds of CPUs, throwing away some, and wasting enormous overclocking potential in some.

    I don't get it.
    On 14 May 2019 ... coordinated with Intel, disclosed their discovery


    Security people are infamous for having lax morals and a tenuous grasp on reality. And yet they felt the need to do the whole "responsible disclosure" schtick?
    Fine if you can actually fix it. But in this case, we're all fucked anyway. So why not release the paper, and then just go all-in on puts? It's not even illegal.
    Say they're 10 people who can pool $10k each. That'd be big enough to get a good deal on the transaction costs. Buy puts to get like 50x leverage. Say it goes down by 2%, and they've doubled their money. Spectre/Meltdown had it down by over 5%, and that was with responsible disclosure too.

    Probably they could get more leverage or money too. The odds of it going up are infinitesimal, so they only really have to fear random fluctuations before the news hit the market. And I'm sure the smart money'd be happy to back them.

    There was a guy who had an idea for a hedge fund like this, but he got into some legal trouble (for other stuff, later exonerated) and then later on some political trouble. Not sure if I'm allowed to speak positively about him here. But it seems like a workable idea, and one that would be good for society.
    CaptainJistuce
    Posted by tomman
    Damn Intel, where is my new, hardware-fixed i9?!

    I stopped caring, just like most people out there.

    Intel stopped caring too. That's the problem.


    Also, I've always known HT is a bad idea. Never liked the concept (I'll take full cores rather than "not leaving resources idle", please).
    Hyper-threading is about getting more from what you have, in the same way that out-of-order execution, pipelining, and branch prediction are.

    More cores and hyper-threading are complementary, rather than contradictory. And it is frankly embarrassing that they disable hyper-threading on any of their products(just like so many other things they disable so they can charge a premium for something their entire product line is capable of).
    tomman *yawn*

    Another website-and-logo processor vulnerability. Damn Intel, where is my new, hardware-fixed i9?!

    I stopped caring, just like most people out there. It's quite amusing to read that "Disable JavaScript" is a common mitigation strategy nowadays.

    Also, I've always known HT is a bad idea. Never liked the concept (I'll take full cores rather than "not leaving resources idle", please).
    Screwtape Another bunch of attacks for the pile!

    - Intel-only
    - Affecting processors built since 2011
    - Another kind of speculative execution attack, but not prevented by the Spectre/Meltdown workarounds

    One mitigation is disabling hyper-threading in the BIOS, which can be a 40% performance hit if you're running thread-heavy code (i.e. not higan).
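    (As an aside, and purely illustrative, assuming a reasonably recent kernel: on Linux you don't even need the BIOS trip, since the kernel exposes an SMT switch in sysfs.)
    [code]
    # check whether SMT/hyper-threading is currently active
    cat /sys/devices/system/cpu/smt/active

    # turn it off at runtime (kernels >= 4.19; "forceoff" also locks it off)
    echo off | sudo tee /sys/devices/system/cpu/smt/control
    [/code]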
    tomman
    Posted by tomman
    On Windows, run the GRC tool (InSpectre) as admin, click the button, reboot, done.
    On Linux, add the magic enchants to your GRUB cmdline, update-grub, reboot, done.

    On Mac, well, Macs have no flaws, Apple knows what's best for you, and you're due for a new port-less MacBook anyway. Yes, you can't disable the patches on macOS.