Meta debuts new generation of AI chip

Meta Platforms unveiled details on Wednesday about the next generation of the company’s in-house artificial intelligence accelerator chip. Reuters reported earlier this year that Meta planned to deploy a new version of a custom data center chip to address the growing amount of computing power needed to run AI products across Facebook, Instagram, and WhatsApp. The chip, referred to internally as “Artemis,” will help Meta reduce its reliance on Nvidia’s AI chips and lower its overall energy costs.

“This chip’s architecture is fundamentally focused on providing the right balance of compute, memory bandwidth, and memory capacity for serving ranking and recommendation models,” the company wrote in a blog post. The new Meta Training and Inference Accelerator (MTIA) chip is part of a broader custom silicon effort at the company that also encompasses other hardware systems. Beyond building the chips and hardware, Meta has invested heavily in the software needed to harness its infrastructure as efficiently as possible.

The company is also spending billions of dollars on Nvidia and other AI chips. Earlier this year, CEO Mark Zuckerberg said Meta planned to acquire roughly 350,000 of Nvidia’s flagship H100 chips; combined with other suppliers, the company expects to accumulate the equivalent of 600,000 H100s this year. Taiwan Semiconductor Manufacturing Co will produce the new chip on its “5nm” process, and Meta said it delivers three times the performance of the company’s first-generation processor.

The chip has already been deployed in Meta’s data centers and is serving AI applications. The company said it has several programs underway “aimed at expanding the scope of MTIA, including support of (generative AI) workloads.”

Meta Platforms, formerly known as Facebook, has unveiled its new in-house artificial intelligence accelerator chip, the Meta Training and Inference Accelerator (MTIA), internally known as “Artemis.” The chip is designed to run AI products on Meta’s platforms more efficiently, reducing reliance on Nvidia’s AI chips and cutting energy costs. Meta still plans to acquire roughly 350,000 flagship H100 chips from Nvidia this year, for the equivalent of 600,000 H100s across all suppliers, and has invested in software to maximize the capabilities of its custom silicon. The new chip, manufactured by Taiwan Semiconductor Manufacturing Co, delivers three times the performance of its predecessor and is already serving AI applications in Meta’s data centers.
