Amazon now powering Alexa, Rekognition with self-made machine learning chips


Amazon revealed it uses self-designed systems-on-a-chip (SoCs) to handle some of its Alexa and Rekognition cloud computing workloads. In the past, the e-commerce giant depended upon Nvidia-made chips to fulfill part of those backend processing needs.

However, the Big Tech giant changed tack by developing its own silicon, called Inferentia, through its Annapurna Labs subsidiary. The firm’s self-made components reduced Alexa’s processing latency by 25 percent and cut its cost by 30 percent compared to third-party chips.

Why Amazon Started Making its Own SoCs

Amazon started making its own SoCs to maximize its earnings and foster growth for its online services revenue.

As the world’s largest e-commerce company, the corporation makes most of its money selling goods online. But in recent years, the firm has become a significant player in the cloud computing sector. In the third quarter of this year, its Amazon Web Services (AWS) division brought in $11.6 billion in revenue. It has also been working to dominate the smart speaker market with its Alexa-powered Echo products.

To sustain and expand its existing digital portfolio, Amazon needs high-performance hardware to quickly and affordably process vast quantities of data.

Previously, the corporation depended on established providers to supply it with integrated circuits (ICs). Over time, it gained a greater understanding of its backend requirements and decided to bring its component design in-house. In 2015, it purchased custom chipmaker Annapurna Labs for $350 million to optimize its remote machine learning calculations.

Two years ago, Annapurna Labs unveiled Inferentia, a chip designed to accelerate artificial intelligence (AI) inference workloads. The company’s tests determined its new ICs could outperform Nvidia’s T4 graphics processing units on both performance and cost.

Last week, Amazon confirmed its AWS data centers utilize Inferentia chips to handle most of its Alexa queries. It also uses its self-made SoCs to complete some tasks for Rekognition, its cloud-based facial recognition platform.

Amazon’s Next Steps as a Chipmaker

The Motley Fool reports Amazon is not just using Inferentia components to make its own products more efficient. The corporation also uses the ICs to accelerate AI instances used by social network Snapchat, health insurance giant Anthem, and publisher Condé Nast. Given the effectiveness of its SoCs, it will likely transition more AWS clients to instances running on its own hardware.

Amazon has already developed chips to complete some of its other cloud computing tasks.

At one point, the retailer purchased central processing units (CPUs) from Intel to power its data centers. The firm subsequently transitioned to using Advanced Micro Devices (AMD) chipsets because they proved more cost-effective. In 2018, it started making a series of Arm-based processors called Graviton to tackle its hyperscale computing workloads.

In March, AnandTech reported AWS’ Graviton2 CPUs outperformed comparable Intel and AMD products on thermal design power and cache configuration.

That means Amazon has found it can make custom hardware that addresses its needs better than established providers can in two crucial areas. In the future, the firm may replace all of its backend SoCs with self-designed chips. With Apple also handling its CPU development in-house and producing eye-popping results, a new trend may be emerging.

To stay competitive, the world’s leading electronic components makers may need to start viewing their biggest customers as rivals.
