JieFK |
Posted on 19-05-16, 20:24
|
Post: #10 of 10
Since: 10-29-18 Last post: 2018 days Last view: 1778 days |
Coincidentally, a mail was posted today on Dng that gives a few flags to pass to your kernel command line to disable those patches: https://lists.dyne.org/lurker/message/20190516.104800.1d3cc002.en.html It's almost the same as tomman's post above.
CaptainJistuce |
Posted on 19-05-17, 06:07
|
Custom title here
Post: #453 of 1164 Since: 10-30-18 Last post: 65 days Last view: 2 days |
Posted by sureanem:
> So it makes more sense to only manufacture a few types of CPUs, disable the parts that don't turn out so well, and then market them as different processor models based on what clock frequency, core count, etc they could sustain. Sure, you could argue this is immoral, but it's more efficient than trying to make all the different kinds of CPUs, throwing away some, and wasting enormous overclocking potential in some.

It really sounds like that WAS what you're saying, but...

God help us all if this kind of rapacious and abusive behavior is considered a good thing now.

--- In UTF-16, where available. ---
Duck Penis |
Posted on 19-05-17, 08:51
|
Stirrer of Shit
Post: #291 of 717 Since: 01-26-19 Last post: 1766 days Last view: 1764 days |
Posted by CaptainJistuce:
> It really sounds like that WAS what you're saying, but...

No, my point is that the manufacturing process doesn't matter. It's like complaining about how Adobe would just burn one kind of DVD-ROM and then laser out a part of it on the Photoshop Elements disks. The disk costs mere cents and you're not paying for the piece of plastic, so why care about how they make their disks?

> God help us all if this kind of rapacious and abusive behavior is considered a good thing now.

Is that greedy, then? I mean, I know the memes about Intel, but what are they to do? They do have R&D costs, so unless the regulator forces them to sell at marginal cost or they get nationalized, not much will happen. Be happy you can still buy a CPU, even if they're not gracious enough to give you complete control over it. There's not any legitimate use for more computing power anyway, so people would just squander it on electron/mining crypto/making HTML5 even more retarded. If this offends you, you can do absolutely nothing about it, because it's not feasible (unless you're Chinese) to catch up to Intel/AMD, and for them it's cheaper/easier to license it than to...

In other words, it's absurd to complain about things being bad when you can tell from a mile away they'll get far, far worse.

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
Screwtape |
Posted on 19-05-17, 09:08
|
Full mod
Post: #246 of 443 Since: 10-30-18 Last post: 1103 days Last view: 175 days |
https://www.youtube.com/watch?v=rKncAFAShkQ <-- a video of the ZombieLoad vulnerability in action, recovering the first few characters of root's hashed password from /etc/shadow in about a minute and a half.

This goes out to all those people who hear about timing side-channel attacks and say "I know, let's add a random, small delay to each operation!"

The ending of the words is ALMSIVI.
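(For the curious: the reason a random per-operation delay doesn't help is that noise with a constant mean averages out over repeated measurements, while the secret-dependent timing difference doesn't. A toy Python simulation of that idea, with made-up numbers, not a real attack:)

```python
import random

random.seed(42)

# Toy model: an operation whose duration leaks one secret bit.
# Bit 0 takes ~100 time units, bit 1 takes ~110. The "mitigation"
# adds a uniform random delay of up to 1000 units to every call.
def timed_op(secret_bit):
    base = 100 if secret_bit == 0 else 110
    jitter = random.uniform(0, 1000)  # the random-delay "mitigation"
    return base + jitter

def recover_bit(secret_bit, samples=200_000):
    # The jitter averages to ~500 regardless of the secret, so the mean
    # over many samples still lands near 600 (bit 0) or 610 (bit 1).
    mean = sum(timed_op(secret_bit) for _ in range(samples)) / samples
    return 1 if mean > 605 else 0

print(recover_bit(0), recover_bit(1))  # the attacker recovers both bits
```

Averaging n samples shrinks the noise by a factor of sqrt(n), so a constant-mean delay only slows the attack down. A real fix has to remove the secret-dependent difference itself (constant-time code) or deny attackers a usable timer.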
CaptainJistuce |
Posted on 19-05-17, 10:53
|
Custom title here
Post: #454 of 1164 Since: 10-30-18 Last post: 65 days Last view: 2 days |
Posted by sureanem:

It is a shift in business model, it gets worse every year, and that R&D budget was CLEARLY not going toward actually improving their product, if the raft of Intel-exclusive vulnerabilities is any indication. They've boosted profits wildly over the last several years without actually offering improved products; in fact, the pace of computer upgrades has slowed greatly, in part due to a lack of meaningful updates.

I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.

ECC RAM disabling, of course, goes back to before the DRAM controller was even ON the processor. Intel worked hard to get everyone else out of the motherboard chipset business, then immediately turned around and started removing features from their chipsets and telling people it was an improvement because they were removing features people didn't need (except on the new server-grade chipsets that were suspiciously much more expensive despite being the same thing aside from those removed features still being present).

Your analogy is also flawed, because I may be paying for software rather than media, but in the case of the processor, I am actually paying for the physical object. If you bought a car and it had a V8 engine under the hood, but the ECU only fired four cylinders... you'd be mad, right? That's what Intel is doing.

> Is that greedy, then?

Umm, yes? Have you SEEN the markup on Xeon processors?

> I mean, I know the memes about Intel, but what are they to do? They do have R&D costs, so unless the regulator forces them to sell at marginal cost or they get nationalized, not much will happen. Be happy you can still buy a CPU, even if they're not gracious enough to give you complete control over it.

They aren't offsetting R&D costs, they are boosting profit margins. And I'm not going to thank them for throwing shit at my face just because they didn't actively force it down my throat.

> There's not any legitimate use for more computing power anyway, so people would just squander it on electron/mining crypto/making HTML5 even more retarded.

Then they shouldn't have any R&D budget because there's nothing to develop. (Also, buttcoin is done on graphics cards or ASICs these days. No one digs for buttnuggets on CPU.)

> If this offends you, you can do absolutely nothing about it, because it's not feasible (unless you're Chinese) to catch up to Intel/AMD

AMD AREN'T feature-gating, their processors AREN'T frequency-locked (and binned much more realistically in any case), and their profit margins are lower. I don't think that AMD actually respects its customers, but I think they aren't actively raping them in the ass with a railroad spike. I'm not sure why you would paint AMD and Intel with the same brush.

> In other words, it's absurd to complain about things being bad when you can tell from a mile away they'll get far, far worse.

"Everything is terrible, and it is only going to get worse, so take it with a smile." Ummm, how about no? That's a terrible fucking attitude.

--- In UTF-16, where available. ---
Duck Penis |
Posted on 19-05-17, 14:34
|
Stirrer of Shit
Post: #293 of 717 Since: 01-26-19 Last post: 1766 days Last view: 1764 days |
Posted by CaptainJistuce:

Things haven't always been this way? I doubt it. As for the R&D budget, those are two disparate things. If I had to choose between two processors, where one has an obscure theoretical vulnerability and the other is 20% faster, I'd sure as heck go with the faster one. Even at just 5% or 10% or whatever. I mean, sure, fix Spectre if it's been disclosed, but don't go around sacrificing performance for "security".

> I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.

What do you mean, a shift? Presumably, they're in it for the money like everyone else, so they'd always have done that. Most people nowadays have no need for "the best product" like they did in, say, the 90s, because what makes a CPU the best is often niche features that far from all applications can leverage.

> ECC RAM disabling, of course, goes back to before the DRAM controller was even ON the processor. Intel worked hard to get everyone else out of the motherboard chipset business, then immediately turned around and started removing features from their chipsets and telling people it was an improvement because they were removing features people didn't need (except on the new server-grade chipsets that were suspiciously much more expensive despite being the same thing aside from those removed features still being present).

Xeon CPUs and server-grade chipsets are kind of a special case. The people who buy them are quite price-insensitive; they're the kind of people who throw out the whole machine whenever a part breaks. So it's just regular market segregation. Case in point: for any consumer-grade CPU, the MSRP is pretty much what you pay if you're not buying used. For server-grade CPUs, you can get all sorts of deals on Engineering Samples and whatnot. Just go on Taobao and search for Xeon, and you'll see what I mean. This is not much different to Nvidia and their Quadro cards. The Quadro cards are just more expensive for no real reason, because the people who buy them have infinity money, and wouldn't ever dare hack something together out of reflashed warranty voided cards. That's just the way it is. For regular consumer processors, prices are still normal.

> Your analogy is also flawed, because I may be paying for software rather than media, but in the case of the processor, I am actually paying for the physical object.

I don't agree with you on that. You can buy a silicon wafer for next to nothing, what you're paying for is Intel's monopoly on the IP needed to manufacture one of whatever processor it is you have. It's a good analogy with the car. But no. I mean, sure, it's aggravating that parts are locked away. But not much can be done about it. It's like with injection molding: the marginal cost is almost zero, but there's a fat starting cost. And it's indifferent to me whether I get a CPU that has speed X, or one that has speed Y but is hardware-locked to speed X in a way that I can't ever change. So they might as well save their money. It's not like I would get an unlocked CPU if they didn't do that; they'd just have slightly higher expenses.

> They aren't offsetting R&D costs, they are boosting profit margins.

What's the difference? Both are just return on investment. I don't mean that they have the right to exact payment because they have R&D costs; I mean that they have the ability to do it because everyone else has R&D costs too. That the desktop is a viable platform is nothing to thank Intel for. Not Microsoft either. Really, I don't know who to thank for it. PC gamers, maybe?

> Then they shouldn't have any R&D budget because there's nothing to develop.

That's a reasonable point, and I suppose a race to the bottom might commence soon. Diminishing returns, I guess. They did make some strides even if Moore's law is dead. AVX-512 became commonplace in 2017 (Skylake-X), and still isn't fully implemented. We'll only get _mm512_aesdec_epi128 in Ice Lake, for instance. And then they might work on making AVX-512 work well; as it is now it has quite bad thermal throttling, and AMD doesn't have it. There's always going to be more features to add, at least for a few years. But after that, sure, you might get your $10 ARM processors.

> AMD AREN'T feature-gating, their processors AREN'T frequency-locked (and binned much more realistically in any case), and their profit margins are lower. I don't think that AMD actually respects its customers, but I think they aren't actively raping them in the ass with a railroad spike. I'm not sure why you would paint AMD and Intel with the same brush.

AMD would if they could, I'd reckon. AMD did sell locked CPUs in the past, they just didn't do it for Ryzen. If AMD ever gets ahead of Intel, presumably the sides would switch and Intel would be selling cheaper unlocked processors. And then we would be talking about how AMD is greedy and how we should all buy Intel instead, or something like that. Just imagine this political cartoon, but with the political parties swapped for chipmakers. Ow!! Next time I'll buy AMD!

> "Everything is terrible, and it is only going to get worse, so take it with a smile."

Whether you smile or not makes no difference, but that's the way it is, terrible or not. There's no use in crying over spilled milk. I agree that it's regrettable, but nothing can be done about it, so why care?

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
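(An aside on AVX-512's piecemeal rollout: on Linux, each subset a CPU supports shows up as its own flag in /proc/cpuinfo (avx512f, avx512bw, avx512vl, and so on), so you can see from userland how complete a given chip's implementation is. A small sketch; it returns an empty set on non-x86 or non-Linux machines:)

```python
def avx512_subsets(cpuinfo_path="/proc/cpuinfo"):
    """Return the set of avx512* feature flags the kernel reports."""
    try:
        with open(cpuinfo_path) as f:
            text = f.read()
    except OSError:
        return set()  # not Linux, or /proc unavailable
    for line in text.splitlines():
        # The flags line is repeated once per core; the first is enough.
        if line.startswith("flags"):
            return {w for w in line.split() if w.startswith("avx512")}
    return set()

print(sorted(avx512_subsets()))
```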
BearOso |
Posted on 19-05-17, 17:54
|
Post: #82 of 175 Since: 10-30-18 Last post: 1453 days Last view: 1453 days |
Posted by JieFK:
> Coincidentally, a mail was posted today on Dng that gives a few flags to pass to your kernel command line to disable those patches

We just need the last one now: mitigations=off. Thankfully, someone had some sense and realized these aren't exploitable without already giving a person significant access to a system. The only reason we're seeing such panic is because most kernel programmers work for virtualization providers.
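(Incidentally, you can check which mitigations a running kernel actually has active without rebooting: kernels of this era expose one status file per vulnerability under /sys/devices/system/cpu/vulnerabilities. A quick sketch; it returns an empty dict on systems without that directory:)

```python
from pathlib import Path

def vuln_status(base="/sys/devices/system/cpu/vulnerabilities"):
    """Map each CPU vulnerability the kernel knows about to its status line,
    e.g. 'mds' -> 'Mitigation: Clear CPU buffers ...' or 'Vulnerable'."""
    d = Path(base)
    if not d.is_dir():
        return {}
    out = {}
    for f in sorted(d.iterdir()):
        try:
            out[f.name] = f.read_text().strip()
        except OSError:
            pass  # unreadable entry; skip it
    return out

for name, status in vuln_status().items():
    print(f"{name}: {status}")
```

Booting with mitigations=off flips these entries to "Vulnerable", which is an easy way to confirm the flag took effect.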
Duck Penis |
Posted on 19-05-17, 18:45
|
Stirrer of Shit
Post: #296 of 717 Since: 01-26-19 Last post: 1766 days Last view: 1764 days |
Posted by BearOso:
> The only reason we're seeing such panic is because most kernel programmers work for virtualization providers.

Ah, so that's why. Always thought it seemed overblown.

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
CaptainJistuce |
Posted on 19-05-18, 04:44 (revision 1)
|
Custom title here
Post: #456 of 1164 Since: 10-30-18 Last post: 65 days Last view: 2 days |
Posted by sureanem:
> Things haven't always been this way? I doubt it.

If only profit margins, processor performance, and feature sets were publicly available data. Oh, wait...

It is rather easy to confirm that having segmented markets for full-featured enterprise processors and kneecapped personal-use processors is a modern business practice that didn't even START until the Core line, and resulted in wildly larger profit margins (because businesses will buy what they need no matter the cost IF there isn't a lower-cost alternative). They tried to set up separate processors entirely, but the Itanium was a flop, and the Pentium 4 was a hilarious trainwreck (in part due to poor engineering, in part due to Intel proclaiming that enthusiasts didn't need 64-bit while AMD was offering 64-bit processors to enthusiasts). So Intel licensed AMD64 from AMD. And once they recovered their footing (mostly because Bulldozer was a mistake), they started feature-gating the Core line.

> I mean, if you want to defend a shift from "manufacturer shipping the best product we can at a competitive price" to "manufacturer shipping the worst product we think people will actually buy at the highest price we think people will tolerate", then okay. It isn't a stance I will ever agree with.

So every processor from the 486 onward has made a mistake by integrating the FPU? At the time, it was a niche feature that normal users didn't need. There was some annoyance that people who were only going to need integer performance were being forced to pay for a FP coprocessor. ONCE IT WAS AVAILABLE, it started seeing widespread usage because, hey, it turns out that it wasn't that useless after all.

It is market segregation that largely did not exist before the Core series, and didn't exist AT ALL before the Pentium Pro. The PPro had the excuse of being genuinely expensive to manufacture. The Pentium 2 started the trend of ARTIFICIAL market segmentation, where you disable features on most of your parts so that you can charge a premium for the ones that allow access to those parts.

> This is not much different to Nvidia and their Quadro cards. The Quadro cards are just more expensive for no real reason, because the people who buy them have infinity money, and wouldn't ever dare hack something together out of reflashed warranty voided cards.

Actually, you traditionally gain much better warranty terms and support in exchange for the Quadro prices. Though nVidia started feature-gating once businesses started becoming more concerned with the price than the support. And now they flat-out include usage restrictions in the driver license so you legally CAN'T use cheaper enthusiast cards in a business or research setting REGARDLESS of the performance and feature set.

> That's just the way it is.

But not the way it always was, or the way it has to be.

> For regular consumer processors, prices are still normal.

Individuals just get less for their dollar than they used to (and businesses get MUCH less).

> I don't agree with you on that. You can buy a silicon wafer for next to nothing, what you're paying for is Intel's monopoly on the IP needed to manufacture one of whatever processor it is you have.

I'm paying Intel to manufacture a processor, and I'm willing to pay more for a part that is actually harder to make. But when Intel tells me I should pay more not because it is harder, but because they want a wider profit margin, I get upset.

> AMD would if they could, I'd reckon. AMD did sell locked CPUs in the past, they just didn't do it for Ryzen.

Even when they were ruling the roost, they binned processors honestly (easy enough to test given that their processors were easy to unlock during their dominance), and didn't gate features.

> If AMD ever gets ahead of Intel, presumably the sides would switch and Intel would be selling cheaper unlocked processors. And then we would be talking about how AMD is greedy and how we should all buy Intel instead, or something like that.

If AMD ever gets close to ahead of Intel, Intel will ease up on their bullshit. We've already seen that as Zen has proven damaging to their sales and they've responded by offering better products at lower prices. Hooray for some fucking competition.

I'm not crying over spilt milk, I'm raging over the fact that no one's standing the jug up before the rest of it dumps out on the floor. Things CAN be done. But whatever. You're actually the enemy. This "take it and be happy because you are powerless" attitude is why things have gotten as bad as they have. I'm wasting time I could spend on literally anything else arguing with either a corporate shill or a consumer whore.

--- In UTF-16, where available. ---
Duck Penis |
Posted on 19-05-18, 13:48
|
Stirrer of Shit
Post: #297 of 717 Since: 01-26-19 Last post: 1766 days Last view: 1764 days |
Posted by CaptainJistuce:
> I'm not crying over spilt milk, I'm raging over the fact that no one's standing the jug up before the rest of it dumps out on the floor. Things CAN be done.

Hey, calm it with the personal attacks. I am neither a corporate shill nor a consumer whore. I have never set foot inside an Intel office (or any chipmaker's office, for that matter), nor have I ever purchased an Intel CPU in my life, except for when it was inside a laptop. I don't own any stock in Intel either, except for whatever exposure I get through retirement funds and whatnot. My closest connection to them is probably that I think the AVX intrinsics are fun to program with and the documentation site is kind of neat, but that doesn't mean I like them.

And it isn't that I think what Intel is doing is particularly good or that I feel enthusiasm about it, just apathy. You can rage all you want, but nothing can be done about the matter. The "solution" essentially boils down to "go buy AMD or something". Even if this were a good solution, there's no point in doing so: if their CPUs are good enough, they'll get sold anyway, and if they're not, they won't. In no way do my actions ever enter the equation, so I might as well take it easy and save my money.

Anyway, the main concern isn't that chipmaker A would best chipmaker B, reducing CPU costs by X%; that's rearranging the deck chairs on the Titanic. Soon the desktop will be an utterly irrelevant platform, and we'll all be stuck with Chinese locked-down ARM smartdevices with hardware DRM, with nobody even bothering to translate the SDK (which would require a license, of course, and depend on access to The Cloud™) into English. Nothing can be done about this either, lest you accuse me of changing the subject; I'm just saying that chipmaker profits are a fairly irrelevant concern in the grand scheme of things.

Posted by CaptainJistuce:

I wasn't around for the early era of microprocessors, but that should still make sense. VIA and Cyrix were making CPUs, because the cost of entry back then was lower. There's even a law, Moore's second law, stating that "the cost of a semiconductor chip fabrication plant doubles every four years". The 486 came out in 1989; according to the article a plant cost $14B in 2015, so that gives around $80M for a plant back in 1989. About the same as in this article. Also, the R&D you'd have to catch up to hadn't gotten so far yet.

> because businesses will buy what they need no matter the cost IF there isn't a lower-cost alternative

Businesses would pay money out the ass even if there is a lower-cost alternative, as long as paying out the ass is the so-called "industry best practice". The same goes for universities (in the first world), or at least a good chunk of them.

> So every processor from the 486 onward has made a mistake by integrating the FPU? At the time, it was a niche feature that normal users didn't need. There was some annoyance that people who were only going to need integer performance were being forced to pay for a FP coprocessor.

I suppose the FPU is different, but that's a good point. However, stuff like AVX is a fair bit more niche. Compilers can't vectorize well, so all you get is optimizations for programs specially written to take advantage of it (and yes, they do get far faster, even though a GPU is often better still) and a ~5% speedup for other stuff. Joe Q. Public won't get anything much out of AVX. These instructions are mostly used in video en/decoding, but for H264, the only codec of value to him, that's hardware-accelerated anyway. https://dxr.mozilla.org/mozilla-central/search?q=m256&redirect=true As you can see, mostly used in libvpx. Doesn't make the browser go faster. Also, Mr. Public would only use Chrome, which isn't any better.

> It is market segregation that largely did not exist before the Core series, and didn't exist AT ALL before the Pentium Pro. The PPro had the excuse of being genuinely expensive to manufacture. The Pentium 2 started the trend of ARTIFICIAL market segmentation, where you disable features on most of your parts so that you can charge a premium for the ones that allow access to those parts.

> Actually, you traditionally gain much better warranty terms and support in exchange for the Quadro prices. Though nVidia started feature-gating once businesses started becoming more concerned with the price than the support.

Yeah, but wasn't that just because the enthusiasts and professionals had about the same needs back then, but then they diverged? Market segmentation is a good thing, in theory. If a company makes $X more on the expensive "industry-grade" products, they would need to earn $X less on the consumer-grade ones to keep the same level of profit. This is what Adobe does: for personal use, Photoshop is effectively free (yes, this is a strategy they encourage), but for businesses that's obviously not an option outside of China. The Quadro example proves my point: they get warranty and support, yes, but is it really worth paying thousands of dollars for? And the license, same thing. It's not a crime to violate an EULA, and the probability of them getting sued is infinitesimal. Stuff like the PS3 supercomputer does exist, you know. But nobody wants to save money, because they'd prefer wasting it for use as a status symbol so they get the tuitions. Journal subscriptions (an utter fraud) are another great example of this. Cut them, the students will teach each other about sci-hub, and wait until the Elsevier problem solves itself. But I digress. Anyway, that big corporations need to pay big $$$ for status symbols isn't a big problem to me.

> But not the way it always was, or the way it has to be.

It wasn't that way before because the barriers to entry were lower. At least now things are okay - you can still purchase loose CPUs. I don't think they can get much better, since there's no market for it.

> Individuals just get less for their dollar than they used to (and businesses get MUCH less).

To be blunt, fuck businesses. They waste money; that's on them. I get more for my dollar now than at any other time in the past, especially if I buy used CPUs. The most expensive CPU in, say, 1990 probably cost around the same as the most expensive now, but it sure as heck isn't as fast.

> I'm paying Intel to manufacture a processor, and I'm willing to pay more for a part that is actually harder to make. But when Intel tells me I should pay more not because it is harder, but because they want a wider profit margin, I get upset.

Well, what should be done about it? They can justify this profit margin with the higher costs to entry, a la might makes right, and unless you have $billions lying around in your couch you sadly have to agree they're right on this. (Also, the profit margins are not so high. A P/E of 10 is high, but not stellar nor an all-time high.)

> Even when they were ruling the roost, they binned processors honestly (easy enough to test given that their processors were easy to unlock during their dominance), and didn't gate features.

Don't they do the exact same stuff, like disabling broken/unstable cores with lasers? I know you could reflash a 270 to a 270X, but I don't think you could do that for the 370 -> 370X, for instance.

> If AMD ever gets close to ahead of Intel, Intel will ease up on their bullshit. We've already seen that as Zen has proven damaging to their sales and they've responded by offering better products at lower prices. Hooray for some fucking competition.

Yeah, but this only works past a certain point, because if AMD gets far ahead enough they'll start pulling the same moves Intel did.

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
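(For what it's worth, the Moore's-second-law extrapolation above, $14B in 2015, doubling every four years, back to the 486's 1989, can be checked in a few lines; the rule of thumb actually lands nearer $150M than $80M for 1989, though the order of magnitude roughly holds:)

```python
cost_2015 = 14e9                  # quoted cost of a fab in 2015
doublings = (2015 - 1989) / 4     # Moore's second law: cost doubles every 4 years
cost_1989 = cost_2015 / 2 ** doublings
print(f"${cost_1989 / 1e6:.0f}M")  # on the order of $150M
```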
tomman |
Posted on 19-12-11, 18:20
|
Dinosaur
Post: #604 of 1317 Since: 10-30-18 Last post: 2 days Last view: 5 hours |
The logo-and-website vulnerability of the day: Plundervolt

Some researchers just figured out that if you tamper with a power supply circuit, Bad Things™ can happen. Also, root is required, but then, if you've got root, why bother going deeper? "Buy AMD", they say.

Licensed Pirate® since 2006, 100% Buttcoin™-free, enemy of All Things JavaScript™