OpenAI Launches Plans for Data Centers and Robotics

OpenAI is reportedly lining up suppliers to support a major product expansion over the next few years.


The artificial intelligence (AI) startup has put out a request for proposals from companies that manufacture in the United States and offer components related to data centers, robotics and consumer devices, Bloomberg reported Thursday (Jan. 15).

“AI is a catalyst for the reindustrializing of the country,” OpenAI Chief Global Affairs Officer Chris Lehane said in the release. “We do have to bring back the supply chain here.”

OpenAI said Wednesday (Jan. 14) that it will integrate 750 megawatts of ultra-low-latency compute from chipmaker Cerebras to accelerate the response time of its AI models. Cerebras said the rollout will amount to the world’s largest deployment of high-speed AI inference.

It was reported Jan. 1 that OpenAI is improving its audio AI models because it expects users of future personal AI devices to communicate with them through voice commands rather than via a screen. The startup plans to release the new audio model in the first quarter, its first personal AI device in about a year, and, over time, several more personal AI devices, including glasses and a smart speaker, The Information reported.

In November 2025, OpenAI CEO Sam Altman and former Apple designer Jony Ive said the company’s first device would be ready in less than two years. Ive, who led the design team for the iMac, iPhone and iPad and is now working on a device with OpenAI, said the design centers on creating something people can use without hesitation. Altman added that the device aims to reduce the number of steps required to interact with AI.


When it comes to AI infrastructure, Altman said in October 2025 that OpenAI would like to build an AI factory capable of producing 1 gigawatt of new compute capacity per week. At the time, the company had committed to about 30 gigawatts of compute with a total cost of ownership of about $1.4 trillion. Altman said OpenAI was comfortable with that “given what we see on the horizon for model capability growth and revenue growth.”

For all PYMNTS AI coverage, subscribe to the daily AI Newsletter.