May 30, 2024 7:45 AM EDT
As AI systems proliferate, the demand for computing power to crunch large data sets has become monstrous. Cerebras, founded in 2015, has responded with the largest computer chip ever built. Its Wafer-Scale Engine 3 measures roughly 8.5 in. on a side and packs 4 trillion transistors. Putting everything on one wafer rather than networking many chips reduces data-transfer times and energy use for the most compute-intensive AI jobs, says CEO Andrew Feldman. “We didn't repurpose a graphics-processing device. We said, ‘What would we do if this was the only problem, the full purpose of our existence?’” In Cerebras’ multimillion-dollar CS-2 supercomputer, the company’s chips have been put to work on jobs like building AI medical assistants for the Mayo Clinic. While planning an IPO, the company is building the third of nine $100 million supercomputers that Emirati AI firm G42 will interconnect to form “the world’s largest supercomputer for AI training.”