AI Needs So Much Power, It's Making Yours Worse

  • It's behind a paywall, but the gist of it is that so many bad power supplies are being connected to the grid that the sine waves are getting distorted. It's based on this research from Whisker Labs[1].

    Better power supplies can reduce the distortion, but it really shouldn't matter much, except in extreme cases where parts of the grid are already running at 99% of their limits on a hot day.

    By the way, you can learn interesting things from analysis of the power grid. Long ago I remember a Slashdot comment about using it, together with covertly gathered recordings, to estimate the effective yield of nuclear enrichment operations in the Middle East. I've been trying to find that thread, but Google isn't what it used to be.

    [1] https://www.whiskerlabs.com/analysis-of-total-harmonic-disto...

  • It's better than using it on crypto mining, which is a huge waste of resources for something so useless to humanity.

  • https://archive.today/f707o

    Previous:

    https://news.ycombinator.com/item?id=42523401 - Dec 2024 (3 comments)

  • Is the problem essentially that too much of the load on the grid is AC->DC converters that are drawing power only at the peak of each cycle?

    Maybe there's an alternative design for AC->DC conversion that can use the full cycle?
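
    That's roughly it, and the usual fix is active power-factor correction (PFC), which shapes the input current into a sinusoid in phase with the voltage. A toy numerical sketch (not a circuit simulation; the conduction window and waveforms are illustrative assumptions) shows why narrow peak-only current pulses are so rich in harmonics compared to a full-cycle sinusoidal draw:

    ```python
    # Toy comparison: capacitor-input rectifier current (short pulses at the
    # voltage peaks) vs. a PFC-shaped sinusoidal current, and their total
    # harmonic distortion (THD). Waveforms are idealized assumptions.
    import numpy as np

    N = 4096                                     # samples per mains cycle
    t = np.linspace(0, 1, N, endpoint=False)     # one cycle, normalized time

    # Capacitor-input rectifier: current flows only in short windows around
    # each voltage peak (t = 0.25 and 0.75 for a unit sine), with the
    # polarity of the voltage at that peak. ~6% conduction per half-cycle.
    rectifier = np.zeros(N)
    for peak in (0.25, 0.75):
        window = np.abs(t - peak) < 0.03
        rectifier[window] = np.sign(np.sin(2 * np.pi * peak))

    # Active-PFC supply: draws current sinusoidally over the whole cycle.
    pfc = np.sin(2 * np.pi * t)

    def thd(x):
        """Harmonic energy relative to the fundamental, from the FFT."""
        spectrum = np.abs(np.fft.rfft(x))
        fundamental = spectrum[1]                # bin 1 = one cycle per window
        harmonics = spectrum[2:]
        return np.sqrt(np.sum(harmonics ** 2)) / fundamental

    print(f"rectifier THD ~ {thd(rectifier):.0%}")   # well over 100%
    print(f"PFC THD       ~ {thd(pfc):.2%}")         # essentially zero
    ```

    The pulse train has most of its energy in odd harmonics, which is exactly the distortion the Whisker Labs data is measuring in aggregate; the sinusoidal draw has virtually none.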

  • Require that new data centers be powered by their own renewables and batteries instead of being attached to the grid. Problem solved. I'm surprised anyone building a data center isn't doing this already.