The big picture: Facebook has joined a growing list of companies rushing to develop their own chips. Apple, Microsoft, Google, Amazon, Tesla and Baidu are all looking to reduce their reliance on silicon giants such as Intel, AMD, Nvidia and Qualcomm. The main reason is that custom silicon designed for specific workloads draws less power and can scale better than general-purpose devices. The latter have become commodities that any competitor can buy, and they can't be tightly integrated with a company's own software.
Facebook is one of several companies riding the dedicated-silicon trend. According to a report by The Information, the social giant is developing a family of in-house chips to accelerate machine learning. One of them would be used to train the AI models that power content recommendations.
This effort dates back to 2018, when it became apparent that Facebook was looking to hire experienced engineers to design FPGAs and ASICs. A year later, the company revealed plans to build an AI training pipeline for its data centers with the help of partners such as Intel, Qualcomm, Marvell, Esperanto and Habana — the latter now owned by Intel.
Image: Facebook pipeline for AI training
That has since changed, and Facebook is now developing entirely new chips in-house. A company spokesperson explained to us that "Facebook is constantly exploring ways to increase performance and energy efficiency with its silicon partners and through our internal efforts." This suggests the company intends to move incrementally over the coming years, with the new chips not fully replacing third-party solutions for some time.
The company is also developing a video transcoding chip to improve the infrastructure that serves video and live broadcasting in its apps. This is similar to what Google did with its "Argos" video coding units (VCUs) to speed up the encoding of videos uploaded to YouTube.
Facebook is said to be working on dedicated server silicon