AI as OS Will Transform the Future of Connected Devices

Generative artificial intelligence (AI) ushered in a transformative phase shift this past year.

And that phase shift — akin to the commercialization of the internet — has kicked off an AI arms race among the biggest tech companies as well as the most well-funded and ambitious startups in the space.

As we head into 2024, the AI arms race has an emergent new weapon: intelligent hardware devices engineered and built for the generative AI era.

This is technically still the initial development stage of the AI economy, in which companies such as OpenAI, Meta, Microsoft and Anthropic are seeking first and foremost to generate public interest in their products and capture customers for their platforms. But the growing emergence of AI-centric hardware devices signals that the AI ecosystem could be entering its next phase of commercialization and connected use.

The shift comes as Apple, whose efforts around AI to date have been regarded by observers as relatively lackluster compared to the pioneering work of its Big Tech peers, published a new research paper (Dec. 12) investigating the best way to run large language models (LLMs) within the limited compute capability of a smartphone.

As connected devices powered by generative AI become a technical and commercial reality, Apple’s research points to a crucial battle line emerging between on-device, hardware-based AI models and cloud-based ones, and perhaps even entire operating systems.

Read also: Who Will Power the GenAI Operating System?

Putting AI Into the Hands of Consumers With Connected Devices

The proliferation of smartphones, smart speakers, wearables and other connected devices has the potential to usher in a new era of AI accessibility for consumers. From virtual assistants that understand natural language to image recognition for enhanced photography, AI applications have transformed the way we interact with and utilize personal technology across various mediums.

OpenAI’s ChatGPT mobile app has been downloaded more than 110 million times and generated $28.6 million in consumer spending worldwide, while Baidu’s Ernie AI chatbot recently amassed 100 million-plus users.

Still, the smartphone market itself just had its worst year in a decade, meaning that device makers could look to AI to ship more units by riding 2023’s hottest technology trend.

And they will have two ways to do so: either by optimizing AI systems to run on battery-powered devices or by using cloud-hosted AI systems that depend on a broadband connection.

The hardware-native approach to implementing AI in consumer devices involves integrating dedicated components that process AI tasks locally and compressing existing AI models so they run efficiently on the device without an internet connection.

This hardware-based approach includes specialized processors like graphics processing units (GPUs) or dedicated AI accelerators. The advantage of this approach lies in its ability to execute AI algorithms directly on the device, minimizing latency and maintaining functionality even in the absence of a stable internet connection.

Processing data on-device also mitigates privacy concerns, as sensitive information remains within the confines of the device, reducing the risk of unauthorized access.
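The model compression mentioned above often comes down to quantization: storing weights in fewer bits so a large model fits in a device’s limited memory. The following is a minimal, purely illustrative sketch of 8-bit weight quantization; the function names and the tiny weight list are assumptions for demonstration, not any vendor’s actual method.

```python
def quantize_int8(weights):
    """Map float weights onto 8-bit integers in [-127, 127].

    Returns (quantized_values, scale); recover values as value * scale.
    """
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    return [round(w / scale) for w in weights], scale


def dequantize(quantized, scale):
    """Approximately reconstruct the original float weights."""
    return [v * scale for v in quantized]


# Each 32-bit float weight now fits in a single byte, roughly
# quartering memory use at a small cost in precision.
weights = [0.42, -1.27, 0.005, 0.9]
quantized, scale = quantize_int8(weights)
restored = dequantize(quantized, scale)
```

Production systems use far more sophisticated schemes (per-channel scales, 4-bit formats, calibration data), but the core trade-off is the same: smaller, slightly less precise weights in exchange for running locally.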

In contrast to the hardware-based approach, cloud-based AI models leverage remote servers for processing and analysis. In this model, the connected device acts as a conduit, transmitting data to the cloud, where AI algorithms perform the computation before sending back the results. This paradigm offers scalability, access to vast computational resources and continuously updated models, all without requiring hardware upgrades.

Hardware-based models shine in scenarios where low latency is critical, such as voice recognition and augmented reality applications. On the other hand, cloud-based models offer scalability and the ability to deploy cutting-edge algorithms without the need for frequent hardware upgrades.
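In practice, a device maker might blend both approaches, routing each task by the trade-offs just described. The sketch below is a hypothetical illustration of that routing logic; the function name, parameters and the 100-millisecond threshold are all assumptions for the example, not a real platform API.

```python
def route_inference(latency_budget_ms, model_fits_on_device, online):
    """Pick where to run an AI task, following the trade-offs above:
    on-device when latency is critical or connectivity is absent,
    the cloud when a larger model and scalable compute are needed."""
    if not online:
        return "on-device"  # cloud is unreachable; degrade gracefully
    if latency_budget_ms < 100 and model_fits_on_device:
        return "on-device"  # e.g., voice recognition, AR overlays
    return "cloud"  # scalable compute, continuously updated models


print(route_inference(50, True, True))     # latency-critical task
print(route_inference(2000, False, True))  # large model, relaxed budget
```

A hybrid router like this lets a phone keep core features working offline while reserving the cloud for heavyweight requests.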

See more: AI Copilots Usher in the Service-as-Software Era

Building an Operating System (OS) for the AI Economy

As PYMNTS CEO Karen Webster wrote, one of the most significant implications of the digital transformation of the global economy is the inevitable shift from smartphones and apps to a distributed network of connected devices and smart ecosystems to access information and conduct commerce. GenAI and LLMs will be important catalysts for that change. We will likely see a significant shift in that direction over the next three to five years.

“There’s not an operating system for everything to plug into … things need to be built for a world of AI in order for that AI to work and scale,” Adrian Aoun, CEO at Forward, told PYMNTS.

“Think about why the mobile computing revolution took off — it was all because within the iPhone Apple brought together a Broadcom chip, AT&T cell towers, Corning Glass, an LG screen, with a powerful operating system in the middle so that developers from around the world don’t need to think about hard drives. They don’t need to think about connectivity, they don’t need to think about cameras. They can just have an idea and ship it out to billions of people,” Aoun said.

The ideal solution for an AI-driven OS may vary depending on the specific use case, but the ultimate goal is to provide consumers with seamless, efficient and intelligent experiences.

After all, that’s something that PYMNTS Intelligence shows consumers around the world increasingly want — and expect.