This article first appeared on GuruFocus.
Meta Platforms (META, Financials) plans to deploy four generations of internally developed AI processors by 2027. The new chips, the MTIA 300, 400, 450, and 500, are part of Meta's Meta Training and Inference Accelerator (MTIA) initiative. Built in partnership with Broadcom, they are intended to meet the rising computational demands of the AI systems running across Meta's platforms.

According to the company, the MTIA 300 is already in production and is used for ranking and recommendation training. The MTIA 400 has completed testing and will soon be deployed in data centers.

The later chips are designed to handle more complex generative AI workloads. Meta said the MTIA 450 will be optimized for GenAI inference, while the MTIA 500 will boost AI processing speed by expanding high-bandwidth memory capacity.

As its processing needs grow, Meta wants to control more of its AI infrastructure. The company invests heavily in AI hardware from Nvidia, AMD, and Alphabet, but its in-house processors are expected to play a greater role. By designing chips internally, Meta aims to improve efficiency, reduce costs, and keep its hardware compatible with rapidly evolving AI models.