
Revolutionizing the Digital Landscape: Meta's Groundbreaking Leap into Custom Silicon Chips

By WOM

May 18, 2023

SUMMARY

  • Meta invests heavily in AI technology, developing custom silicon chips that power video processing and various AI-specific tasks.
  • The innovative chips are part of Meta's commitment to increasing energy efficiency and reducing excess heat in its data centers.
  • In addition to serving the company's immediate needs, the new technology will power future projects, including metaverse-related tasks and the burgeoning field of generative AI.

Meta Platforms, formerly Facebook, recently pulled back the veil on its previously private effort to develop custom silicon chips. These chips, designed to accelerate artificial intelligence (AI) workloads and video processing, reflect the company's proactive push into AI technology and infrastructure. Despite the significant cost of designing and manufacturing tailor-made chips, Meta views the work as a strategic investment in future efficiencies.

In the pursuit of performance and productivity, the social networking colossus unveiled these internal silicon projects to the media earlier this week, ahead of a digital event focused on the company's AI technical infrastructure. As Meta navigates a "year of efficiency", marked by roughly 21,000 layoffs and deep cost reductions, stakeholders are paying close attention to its investments in AI and data center hardware.

Undoubtedly, the financial outlay involved in creating proprietary computer chips is considerable. However, Alexis Bjorlin, vice president of infrastructure at Meta, told CNBC that the performance boost the chips deliver justifies their cost. The company is also making a concerted effort to redesign its data centers around energy-efficient solutions such as liquid cooling, aiming to reduce excess heat.

The new silicon chips include the Meta Scalable Video Processor (MSVP), which processes and transmits video to users while keeping energy consumption in check. Bjorlin said the chip fills a significant market gap: no commercially available option could process and deliver 4 billion videos per day as efficiently as Meta intended.

Meta's chip portfolio also includes the first in its Meta Training and Inference Accelerator (MTIA) series, designed to facilitate AI tasks. This specific MTIA chip is built to handle "inference" – the process where a trained AI model predicts or makes decisions.
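To make the distinction concrete, the sketch below shows what inference looks like in PyTorch, the framework Meta mentions later in this article. The toy model and feature tensor are purely illustrative assumptions, not Meta's actual recommendation model or code:

```python
import torch
import torch.nn as nn

# Hypothetical toy scoring model, for illustration only (not Meta's architecture).
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 1),
    nn.Sigmoid(),
)

model.eval()  # switch to inference mode (disables dropout/batch-norm updates)

# A batch of 8 hypothetical feature vectors standing in for user/content signals.
features = torch.randn(8, 64)

with torch.no_grad():          # no gradients needed when only predicting
    scores = model(features)   # predicted relevance scores in [0, 1]

print(scores.squeeze())
```

Training, by contrast, would run the same model with gradients enabled and repeatedly update its weights; that heavier workload is what Meta's future training-oriented chips, discussed below, are reportedly aimed at.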

The new AI inference chip powers Meta's recommendation algorithms, which select the content and ads shown in users' news feeds. Although Bjorlin didn't reveal who is manufacturing the chip, a blog post noted that it is "fabricated in TSMC 7nm process", indicating that Taiwan Semiconductor Manufacturing Company is producing it.

Bjorlin also indicated that Meta has long-term plans for its AI chips, including processors for training AI models, though she declined to share details beyond the new inference chip. Reuters had earlier reported that Meta canceled one AI inference chip project and started another expected to debut around 2025; Bjorlin didn't comment on that report.

Unlike tech firms such as Alphabet and Microsoft, Meta doesn't sell cloud computing services, and consequently it has not felt obliged to share its internal data center chip projects with the public. "What we're sharing—our first two chips—is definitely a glimpse into what we're doing internally," Bjorlin noted.

Aparna Ramani, Meta's vice president of engineering, said the new hardware was developed in tandem with the company's in-house PyTorch software, a popular tool among third-party developers for building AI apps. The hardware is also slated to power future metaverse-related tasks, such as virtual and augmented reality, and the emerging field of generative AI, which refers to AI software that creates compelling text, images, and videos.

Ramani further mentioned that Meta has created an AI-powered coding assistant to help its developers write and maintain software more easily. The tool mirrors Microsoft's GitHub Copilot, released in 2021 with assistance from AI startup OpenAI.

Finally, Meta announced the completion of the final phase of its supercomputer, known as Research SuperCluster (RSC), which was detailed last year. The supercomputer, equipped with 16,000 Nvidia A100 GPUs, has been used to train the company’s LLaMA language model, among other applications.

Ramani emphasized Meta's ongoing commitment to open-source technologies and AI research that advance the field. The company revealed that its largest LLaMA language model, LLaMA 65B, contains 65 billion parameters and was trained on 1.4 trillion tokens, the units of text data used for AI training. Although the LLaMA model has leaked to the public, Meta says it remains committed to open science and continues to explore open-source collaborations.

