Cerebras added to its previously announced CS-2 AI computer a new switch product, SwarmX, that does routing but also calculations, and a memory computer called MemoryX that contains 2.4 petabytes of DRAM and NAND. Cerebras Systems

Artificial intelligence in its deep learning form is producing neural networks with trillions of neural weights, or parameters, and that increasing scale presents special problems for the hardware and software used to develop such neural networks.

"In two years, models got a thousand times bigger and they required a thousand times more compute," says Andrew Feldman, co-founder and CEO of AI system maker Cerebras Systems, summing up the recent history of neural nets in an interview with ZDNet via Zoom.

"That is a tough trajectory," says Feldman.

Feldman's company this week is unveiling new computers at Hot Chips[1], the annual conference on advanced computer chips, which is being held virtually this year. Cerebras issued a press release announcing the new machines.[2]

Cerebras, which competes with the AI leader, Nvidia, and with other AI startups, such as Graphcore and SambaNova Systems, aims to lead in performance when training those increasingly large networks. Training is the phase where a neural net program is developed by subjecting it to large amounts of data and tuning the neural net weights until they produce the highest accuracy possible.
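
That training loop can be made concrete with a minimal sketch in plain Python; the toy one-weight model, data, and learning rate below are purely illustrative, not anything from Cerebras's stack:

```python
# Minimal sketch of neural-net training: repeatedly show the model data,
# measure its error, and nudge the weights to reduce that error.
# Toy one-weight "model": predict y = w * x, where the true relation is y = 3x.

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # (input, target) pairs
w = 0.0          # the single "neural weight," starting untrained
lr = 0.01        # learning rate: how far to nudge w each step

for epoch in range(200):
    for x, y in data:
        pred = w * x                  # forward pass: the model's guess
        grad = 2 * (pred - y) * x     # gradient of squared error w.r.t. w
        w -= lr * grad                # update: tune the weight to cut error

print(f"learned weight: {w:.3f}")     # approaches 3.0 as accuracy improves
```

The structure of the loop is the same at any scale; the hardware challenge Cerebras is addressing is running it when w is trillions of weights rather than one.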

Also: 'We can solve this problem in an amount of time that no number of GPUs or CPUs can achieve,' startup Cerebras tells supercomputing conference[3]

It's no secret that neural networks have been steadily growing in size. In the past year, what had been the world's largest neural net as measured by neural weights, OpenAI's 175-billion-parameter GPT-3, was surpassed by trillion-parameter models such as Google's Switch Transformer.
