Comments
-
Voxera (2y): It could be that modern chips are so small that the IBM solution does not really work.
-
stop (2y): They are too small and too hot for air-based cooling. That's why GPUs by now require big heat pipes to distribute the heat.
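A rough back-of-the-envelope sketch of why: dividing power by die area gives the heat flux the cooler has to pull out. The 700 W figure is from the rant; the ~600 mm² die area and the ~100 W/cm² comfort limit for air cooling are assumptions for illustration, not measured values.

```python
# Rough heat flux estimate: why a big GPU is hard to cool with air alone.
# Assumptions (not from the rant): die area ~600 mm^2, and ~100 W/cm^2 as a
# ballpark upper limit for what plain air cooling handles comfortably.

power_w = 700.0          # chip power figure from the rant
die_area_mm2 = 600.0     # assumed die size, roughly a large modern GPU die
die_area_cm2 = die_area_mm2 / 100.0

heat_flux = power_w / die_area_cm2   # W/cm^2 at the die surface
print(f"Heat flux: {heat_flux:.0f} W/cm^2")  # ~117 W/cm^2

air_cooling_limit = 100.0  # assumed rule-of-thumb limit for air cooling, W/cm^2
print("Exceeds comfortable air-cooling range:", heat_flux > air_cooling_limit)
```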
-
The problem starts with the layers...
My knowledge is a bit rusty, but I think mainboards alone can have more than 8 different layers.
I somehow don't want to know how many layers the morbidly obese GPUs have...
I remember NVIDIA using 12 PCB layers with the 3080 series - which made many manufacturers pretty unhappy, especially as the circuit design has become extremely complex.
Not sure how it looks for AMD, though I think it's better, as AMD is less of a nutcase than NVIDIA; Nvidia has always had a tendency to go completely crazy with its designs.
The layer design, certain hotspots where e.g. memory is stacked, etc. make cooling a lot more complicated than it was 50 years ago...
Theregister.com is wrestling with GPUs that need 700 watts of juice and how to cool them. 50 years ago I was reading an excellent magazine called "Electronics", and I remember that IBM came up with a scheme to absorb enormous amounts of heat from chips: you simply score the underside of the chip in a grid pattern and pump water through it. Hundreds of watts per degree Celsius can be removed. Problem solved.
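To put "hundreds of watts per degree Celsius" in perspective, a minimal sketch: model the cooler as a thermal conductance G in W/K, so the junction-to-coolant rise is delta T = P / G. The conductance values below are illustrative assumptions, not figures from the IBM work.

```python
# Temperature rise across a cooler modeled as a simple thermal conductance.
# delta_T = P / G, with P in watts and G in W/K (watts per degree Celsius).
# The conductance values below are illustrative assumptions, not IBM's numbers.

def temp_rise(power_w: float, conductance_w_per_k: float) -> float:
    """Junction-to-coolant temperature rise for a given power and conductance."""
    return power_w / conductance_w_per_k

power_w = 700.0  # the GPU power figure from the rant
for g in (100.0, 300.0):  # "hundreds of watts per degree Celsius"
    print(f"G = {g:.0f} W/K -> delta T = {temp_rise(power_w, g):.1f} C")
# G = 100 W/K gives a 7.0 C rise; G = 300 W/K gives about a 2.3 C rise.
```

At conductances like those, even a 700 W chip would only sit a few degrees above the coolant, which is the whole point of the microchannel idea.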