Is there any way to make it use less power as it gets more advanced, or will there be huge power plants dedicated to AI all over the world soon?
Your answer is intuitively correct, but unfortunately it has a couple of flaws.
They didn’t, not that much anyway: a Cray-1 used 115 kW to produce 160 MFLOPS of compute. And while 115 kW is a LOT, it’s not in the “needs its own power plant to operate” category, since even a small coal power plant (the least efficient common method of electricity generation) produces several orders of magnitude more than that.
Indeed, our phones are in the teraflops range for just a couple of watts.
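For a sense of scale, here’s a quick back-of-the-envelope sketch of that efficiency gap. The Cray-1 numbers are the ones quoted above; the phone figures (~1 TFLOPS at ~5 W) are rough assumptions for a modern SoC, not measured values:

```python
# Rough FLOPS-per-watt comparison: Cray-1 vs. a modern phone SoC.
# Cray-1 figures are from the comment above; phone figures are assumed.

cray1_flops = 160e6   # 160 MFLOPS
cray1_watts = 115e3   # 115 kW

phone_flops = 1e12    # ~1 TFLOPS (assumed)
phone_watts = 5       # ~5 W (assumed)

cray1_eff = cray1_flops / cray1_watts   # ~1.4e3 FLOPS/W
phone_eff = phone_flops / phone_watts   # ~2e11 FLOPS/W

print(f"Cray-1: {cray1_eff:.1e} FLOPS/W")
print(f"Phone:  {phone_eff:.1e} FLOPS/W")
print(f"Efficiency gain: ~{phone_eff / cray1_eff:.1e}x")
```

Under those assumptions that’s roughly an eight-orders-of-magnitude improvement in FLOPS per watt.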
Unfortunately there isn’t. We’ve reached the end of Moore’s law: transistors can’t get much smaller, because they work by blocking electrons from passing under certain conditions, and if we built transistors smaller than the current ones, electrons would be able to quantum-tunnel across the barrier, making them useless as switches.
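To get a rough feel for why thinner barriers leak, here’s a minimal sketch using the textbook formula for tunneling through a rectangular barrier, T ≈ exp(−2κL). The 1 eV barrier height is an assumed, illustrative value, not a real transistor model:

```python
import math

# Tunneling probability through a rectangular barrier: T ~ exp(-2*kappa*L),
# where kappa = sqrt(2*m*(V - E)) / hbar and L is the barrier width.
# The 1 eV barrier height below is an assumed, illustrative value.

hbar = 1.055e-34   # reduced Planck constant, J*s
m_e  = 9.109e-31   # electron mass, kg
eV   = 1.602e-19   # joules per electron-volt

barrier_height = 1.0 * eV                            # assumed barrier, J
kappa = math.sqrt(2 * m_e * barrier_height) / hbar   # decay constant, 1/m

for width_nm in (2.0, 1.0, 0.5):
    L = width_nm * 1e-9
    T = math.exp(-2 * kappa * L)
    print(f"{width_nm} nm barrier: tunneling probability ~ {T:.1e}")
```

The probability grows exponentially as the barrier shrinks (roughly 1e-9 at 2 nm but around 1e-2 at 0.5 nm in this toy model), which is why a barrier that works as a switch at one size leaks hopelessly at half that size.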
There might be a revolution in computing by using light instead of electricity (which would completely and utterly revolutionize computers as we know them), but until that happens, computers are as small as they’re going to get. More precisely, they’re as space-efficient as they’re going to get: to have more processing power, you will need more space.