Theregister.com is wrestling with GPUs that need 700 watts of juice and how to cool them. Fifty years ago I was reading an excellent magazine called "Electronics", and I remember that IBM came up with a scheme to absorb enormous amounts of heat from chips: you simply score the underside of the chip in a grid pattern and pump water through the channels. Hundreds of watts per degree Celsius can be removed. Problem solved.
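
As a rough sanity check of the numbers (my own back-of-the-envelope sketch using textbook water properties, not IBM's actual figures), it takes surprisingly little water flow to carry away 700 W:

    # Heat carried away by a water flow: Q = m_dot * c_p * dT
    RHO_WATER = 1000.0  # kg/m^3, density of water
    CP_WATER = 4186.0   # J/(kg*K), specific heat of water

    def heat_removed_watts(flow_l_per_min, delta_t_c):
        mass_flow_kg_s = flow_l_per_min / 60.0 * (RHO_WATER / 1000.0)
        return mass_flow_kg_s * CP_WATER * delta_t_c

    # 1 litre per minute with a 10 degC coolant temperature rise:
    print(f"{heat_removed_watts(1.0, 10.0):.0f} W")  # ~698 W, about one modern GPU

So moving the heat away in bulk is not the hard part; the challenge is getting it out of a few square centimetres of silicon and into the water in the first place.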

Comments
  • 0
    It could be that modern chips are so small that the IBM solution does not really work.
  • 0
They are too small and run too hot for air-based cooling. That's why GPUs now require big heat pipes to distribute the heat.
  • 0
    @Voxera
My guess is that modern chips are larger than they were, gulp, 40 years ago. I'd guess the thickness is similar as well.
  • 1
    The problem starts with the layers...

My knowledge is a bit rusty, but I think mainboards alone can have more than 8 different layers.

I somehow don't want to know how many layers the morbidly obese GPUs of today have...

I remember NVIDIA requiring 12 PCB layers for the 3080 series, which made many manufacturers pretty unhappy, especially as the circuit design had become extremely complex.

Not sure how it looks for AMD, though I think it's better, as AMD is less of a nutcase than NVIDIA. NVIDIA has always had a tendency to go completely crazy with its designs.

The layer design, certain hotspots where e.g. memory is stacked, etc. make cooling a lot more complicated than it was 50 years ago...
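
    For intuition, here is a toy series-resistance model (my own sketch; all resistance values are invented for illustration, not measured from any real card) of why every extra layer between die and cooler costs degrees:

        # Series thermal resistances between die and air simply add up,
        # and junction temperature rises linearly with power.
        # All R values below are invented for illustration only.
        POWER_W = 700.0      # GPU dissipation from the article
        T_AMBIENT_C = 25.0
        layers = {           # hypothetical thermal resistances in degC/W
            "die to heat spreader": 0.02,
            "thermal paste":        0.05,
            "heatsink to air":      0.08,
        }
        r_total = sum(layers.values())
        t_junction = T_AMBIENT_C + POWER_W * r_total
        print(f"R_total = {r_total:.2f} degC/W -> T_junction = {t_junction:.0f} degC")
        # ~130 degC at 700 W: even 0.15 degC/W in total is already too hot, which
        # is why "hundreds of watts per degree" (~0.005 degC/W) would matter.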