Cerebras develops chip to power supercomputer
Image: Cerebras

Progress in the computer chip industry is usually measured by size: the smaller, the better. The chips that run an iPhone or iPad are about the size of a fingernail. Even hefty cloud servers use chips about the size of a postage stamp. But a startup called Cerebras has developed an iPad-sized chip to power artificial intelligence (AI), as Wired reports.

But the big chip isn’t necessarily a step backward. AI gains its intelligence from a process called deep learning, in which a model “learns” by “training”: an algorithm repeatedly adjusts its own parameters to fit a task by mining data.
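To make that concrete, here is a minimal sketch of “training” in Python using PyTorch: a toy model starts with arbitrary parameters and repeatedly nudges them to better fit the data. All names and data here are illustrative, not drawn from the article.

```python
import torch

# Stand-in for "mined data": inputs plus noisy targets from a hidden rule.
x = torch.randn(1000, 3)
true_w = torch.tensor([2.0, -1.0, 0.5])
y = x @ true_w + 0.1 * torch.randn(1000)

# The model's parameters, starting from scratch.
w = torch.zeros(3, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

for _ in range(200):
    loss = ((x @ w - y) ** 2).mean()  # how wrong the model currently is
    opt.zero_grad()
    loss.backward()                   # compute gradients of the error
    opt.step()                        # adjust parameters toward the data

print(w.detach())  # approaches true_w: the model has "learned" from data
```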

Go Deep

For instance, a language-processing AI could learn Spanish quickly because there’s a lot of Spanish data out there, but it might struggle with a language for which little data exists. In other words, the bigger the data pool, the better the AI will be at its job.


But sometimes data alone isn’t enough. AI can also hit a wall when it comes to computing power, which hinges largely on energy consumption and, in turn, cost. A recent study found that the energy consumed training a single piece of language-processing software could cost around $350,000. Furthermore, the AI lab OpenAI estimated that the computing power used by the largest AI projects doubled every three and a half months between 2012 and 2018.
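Taking that doubling time at face value, a quick back-of-envelope calculation shows how explosive such growth is; the numbers below simply follow from the rate quoted above.

```python
# If compute on the largest projects doubles every 3.5 months,
# how much does it grow over the 2012-2018 window? (Illustrative
# arithmetic only, taking the article's figure at face value.)
months = 6 * 12          # 2012 to 2018
doubling_time = 3.5      # months, per the OpenAI estimate cited above
doublings = months / doubling_time
growth = 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly {growth:,.0f}x more compute")
# ~20.6 doublings, i.e. growth on the order of a million-fold
```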

Therefore, AI programmers looking for more power have turned to graphics processing units (GPUs). The deep learning revolution stemmed from the discovery that GPUs excel at the parallel matrix arithmetic that training relies on. This fortuitous finding boosted the stock of leading GPU manufacturer Nvidia eight-fold in the past half-decade.
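To see why, here is a rough, illustrative comparison (a sketch, not a rigorous benchmark): the same large matrix multiplication, the core operation of deep learning, timed on a CPU and then on a GPU if one is available.

```python
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

t0 = time.time()
a @ b                          # CPU matrix multiply
print(f"CPU: {time.time() - t0:.3f}s")

if torch.cuda.is_available():
    ag, bg = a.cuda(), b.cuda()
    ag @ bg                    # warm-up: the first call pays startup costs
    torch.cuda.synchronize()
    t0 = time.time()
    ag @ bg                    # same multiply on the GPU's parallel cores
    torch.cuda.synchronize()   # wait for the GPU to actually finish
    print(f"GPU: {time.time() - t0:.3f}s")
```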

But deep learning requires more than just one GPU. For example, OpenAI created a bot that took on the video game “Dota 2” using hundreds of GPUs wired together, and even then, training took weeks. This is where a big chip comes in handy.
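For a sense of what “wired together” means in practice, below is a minimal, hypothetical sketch of data-parallel training, the standard way frameworks such as PyTorch spread one training job across many GPUs. It is a generic illustration launched with something like `torchrun --nproc_per_node=8 train.py`, not OpenAI’s actual Dota 2 setup.

```python
import os
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP

dist.init_process_group("nccl")             # one process per GPU
local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun
torch.cuda.set_device(local_rank)

# Toy model; each GPU holds a replica and sees a different slice of data.
model = DDP(torch.nn.Linear(512, 10).cuda(), device_ids=[local_rank])
opt = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(100):
    x = torch.randn(64, 512).cuda()         # stand-in for real batches
    y = torch.randint(0, 10, (64,)).cuda()
    loss = F.cross_entropy(model(x), y)
    opt.zero_grad()
    loss.backward()   # gradients are averaged across all GPUs here
    opt.step()

dist.destroy_process_group()
```

Every `backward()` call forces the GPUs to exchange gradients over the wires connecting them, which is exactly the communication overhead a single large chip is meant to avoid.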

Big Chip, Big Brain, Big Business

Cerebras founder and CEO Andrew Feldman says that his big chip can do the work of clusters of hundreds of GPUs while taking up less space and using less energy. Feldman claims that the chip will allow AI technology to move faster. “You can ask more questions,” he told Wired. “There are things we simply haven’t been able to try.”

Feldman’s claims carry weight because of the big chip’s enhanced onboard memory. In addition, data can move 1,000 times faster around one large chip than it can between hundreds of small chips linked together (the back-of-envelope sketch below shows what that gap means in practice). But there are downsides. A big chip gives off a lot of heat, and while most computers keep cool by blowing air around, Cerebras’ chip has to use water pipes to prevent overheating. It’s also expensive.
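The sketch below puts toy numbers on that claim. The bandwidth figures are purely hypothetical placeholders, not Cerebras or Nvidia specifications; they are chosen only to reflect the 1,000-fold ratio cited above.

```python
# Illustrative only: time to move the same data at on-chip speed
# versus chip-to-chip speed, assuming a 1,000x bandwidth gap.
data_gb = 10.0              # data shuffled per training step (hypothetical)
on_chip_gb_s = 9000.0       # assumed on-chip fabric bandwidth
between_chips_gb_s = 9.0    # assumed inter-chip links, 1,000x slower

print(f"on one big chip: {data_gb / on_chip_gb_s * 1e3:8.2f} ms")
print(f"across chips:    {data_gb / between_chips_gb_s * 1e3:8.2f} ms")
# ~1 ms versus ~1,100 ms: communication, not arithmetic, becomes the bottleneck
```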

That expense, however, is not an issue for big companies like Facebook and Amazon that want to stay competitive in the AI field. Feldman also claims that a “handful” of other companies are giving the big chip a whirl in areas like drug design. So, for Big Pharma or Big Tech, taking a bite out of a big chip might be the way to go.
