JoshBent · 27944 · 10d
Intel shouldn't have cheated their way through, simple as that: squeezing the same four cores with ever more aggressive speculation tricks instead of investing in properly implementing more cores and pushing the technology forward like AMD now finally did.
There wouldn't be such a performance penalty if Intel didn't have a flaw in their product.
unsignedint · 571 · 10d
Security is always going to get in the way of Performance and Usability.
Most of the time it comes down to what your risk appetite is and how much you need to compromise Security for the sake of Performance and Usability.
If you're running PCs in an isolated/controlled environment and need the raw computing power, who gives a fuck about speculative-execution vulnerabilities if "fixing" them requires you to buy 10% more hardware?
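For anyone weighing that trade-off on Linux: the kernel reports its view of each known CPU vulnerability and the active mitigation via sysfs (these paths exist on mainline kernels since roughly 4.15; output varies by CPU and kernel version). A minimal check:

```shell
# Print each known CPU vulnerability and the kernel's chosen mitigation,
# e.g. "spectre_v2:Mitigation: Retpolines". Directory may be absent on
# very old kernels, hence the fallback.
grep -r . /sys/devices/system/cpu/vulnerabilities/ 2>/dev/null || true

# In a trusted, isolated environment the mitigations can be disabled at
# boot via the kernel command line (trading security for speed):
#   mitigations=off
```

`mitigations=off` is a documented kernel command-line parameter; whether that trade is sane depends entirely on who can run code on the box.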
sayaws · 98 · 10d
B-but I like to search for vulnerabilities, it's fun...
For real though, as @JoshBent said, Intel cared more about having more models on the market than about researching proper architecture designs like AMD did. Before the Zen line, AMD was almost out of the picture and Intel had a near-total monopoly on desktop and server CPUs. That's a huge part of why monopolies suck.
phorkyas · 1216 · 10d
While I can share part of the feeling, I think it is wrong on several levels.
If you take seriously the anti-security pope Torvalds' line that security issues are just bugs, then go ahead and write a bug-free system! Of course there is none, not in an unmanaged language like C (not even qmail), and probably none beyond that either, once enough business logic gets involved.
Performance-wise I miss Borland Turbo Pascal. What a compilation speed. Imagine a TDD loop on that old 80s machinery; you'd still outdo today's bloat on an i386. But friggin' users demand GUIs? And somehow today an 'app' is some slow shit that runs in a browser and needs a JIT just to not be worse in responsiveness and usability than Win 3.11.
You cannot shoot the messenger: the brave souls who explored the vast uncharted lands of the CPU found these side channels. The real problem is that we as devs should not accept our software or hardware stacks becoming stinking piles of entropy!
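Side channels aren't limited to CPU internals; the same principle shows up in plain software. A minimal illustration (Python, standard library only; the function names here are my own): a naive string comparison returns as soon as a byte differs, so its running time leaks how long the matching prefix is, while `hmac.compare_digest` takes time independent of where the inputs differ.

```python
import hmac

def naive_compare(a: bytes, b: bytes) -> bool:
    # Leaky: bails out at the first mismatching byte, so an attacker
    # timing many guesses can recover the secret prefix by prefix.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_compare(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest compares in time independent of the position
    # of the first difference, closing the timing side channel.
    return hmac.compare_digest(a, b)
```

Both functions return the same boolean answers; only the timing behavior differs, which is exactly what makes side channels so easy to overlook.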
ddephor · 4565 · 10d
Thing is, the security topic has become such a big deal only in the IT world.
Maybe it's because IT systems are accessible from anywhere around the globe today, but have you ever seen a lock manufacturer develop new locks because new lock-picking tools were invented, and send replacement locks to all users? Or anyone change their locks just because someone successfully opened a lock of the same type with a screwdriver and a wire?
sayaws · 98 · 10d
@ddephor actually, yes. Not the whole "send new locks to everyone" thing, but people changing their locks because their neighbors or friends actually got lockpicked is something I've heard pretty often. And as you mentioned, the fact that this is much more widespread in the IT world is partly because attacks are easier on a computer:
1. People don't know how to secure their stuff; most users don't even know how a computer actually works.
2. It's easier to attack remotely, and to automate and reproduce the attack on multiple machines at once, whereas unless you have an army of robots you can't pick several door locks simultaneously.
3. The whole "hacker who steals your credit card info" myth is still scary to people who aren't really into computers, just like thieves were when locks were invented.
swablu · 143 · 9d
Humans are the worst. I hate them.