Fjodor2001
Diamond Member
I read that elsewhere too. The manufacturers are all shifting production lines over to HBM for AI accelerators, since it's a far more lucrative market than standard DRAM.
But I wonder if that really explains all of it. The total amount of HBM needed for AI should still be far less than the amount of DRAM going into laptops/mobile/PCs/etc., right?
Just mobile phones alone sell ~1.5B units per year, and they have ~8 GB RAM each. How many GPUs are sold per year for AI?
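The rough arithmetic behind that question can be sketched like this. The phone-side numbers are the ones above; the accelerator-side numbers (units shipped, HBM per unit) are purely illustrative assumptions, since that's exactly the figure being asked about:

```python
# Back-of-envelope: annual DRAM demand from phones vs. HBM demand from AI accelerators.
# Phone figures are from the post; accelerator figures are illustrative guesses only.

phones_per_year = 1.5e9          # ~1.5B handsets/year (from the post)
ram_per_phone_gb = 8             # ~8 GB each (from the post)
phone_dram_gb = phones_per_year * ram_per_phone_gb

accelerators_per_year = 5e6      # ASSUMPTION: a few million AI accelerators/year
hbm_per_accelerator_gb = 100     # ASSUMPTION: ~100 GB of HBM per accelerator
ai_hbm_gb = accelerators_per_year * hbm_per_accelerator_gb

print(f"Phone DRAM: {phone_dram_gb / 1e9:.1f} billion GB")  # 12.0 billion GB
print(f"AI HBM:     {ai_hbm_gb / 1e9:.1f} billion GB")      # 0.5 billion GB
print(f"Ratio:      {phone_dram_gb / ai_hbm_gb:.0f}x")      # 24x
```

Even with generous guesses on the accelerator side, phones alone would dwarf AI HBM by raw gigabytes. Of course, this ignores that an HBM wafer may displace several standard-DRAM wafers' worth of capacity, which could be where the real squeeze comes from.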
