PYMNTS MonitorEdge May 2024

Emotive AI Aims to Offer New Ways to Connect With Consumers


In a lab in London, a new generation of avatars is learning to smile, frown and wink — not just as programmed, but in response to human emotions conveyed through text. 

Backed by Nvidia, Synthesia’s latest creation aims to mimic and deeply understand feelings, turning every digital interaction into a near-human experience. As these emotional avatars prepare to enter the marketplace, they might redefine the future of customer service and beyond, one nuanced expression at a time.

“The biggest point here is that avatar usage is going to continue to grow,” Ben Ferguson, COO at Soundscape VR, which uses generative AI, told PYMNTS. “And more than likely, we will exist in a world where we have multiple avatars for different platforms. An avatar that has been created or assisted in creation by AI can really help narrow down the specific use case of those avatars and refine the look and appearance of what might be needed.”

AI-Fueled Avatars

Synthesia has developed “Expressive Avatars” that aim to shake up professional video production by blurring the lines between virtual and real characters. The technology supposedly eliminates the need for cameras, microphones, actors and lengthy editing processes, reducing production costs significantly.

The company’s studio employs actors who read scripts in front of a green screen to train the AI system. Synthesia showcased the platform’s capabilities in a demonstration by inserting lines of text that the AI-generated actor then read in the corresponding emotional tones.

To address concerns regarding the potential misuse of its technology for creating fake news content, Synthesia requires publishers to sign up as enterprise customers to develop synthetic avatars. Additionally, all content generated using Synthesia’s technology undergoes moderation to ensure its legitimacy.

Many Uses for Avatars

Until now, most avatars were just animation banks created for particular storytelling purposes, David Stark, CMO of Openstream.ai, told PYMNTS. Created offline, their expressions and emotions were crafted by skilled artists to suit the story.

“The advances in the field of AI have enabled rendering those expressions programmatically based on the situation,” he added. “For example, with AI, the avatar can be manipulated to put a sad face or happy face by using a command or even based on the text of the dialogue.”

AI avatars can be endowed with a combination of other AI capabilities and techniques, such as multimodality and emotional AI, Stark noted. The most sophisticated AI avatars can combine various human sensory inputs such as speech, vision, gestures and touch with emotional AI to understand and be empathetic to what a person is trying to convey, and can convey back an appropriate multimodal response to that person — in real time, without any scripts.

Stark emphasized the significant commercial potential of advanced avatars. Envision digital twins of human experts — far beyond mere visual or auditory replicas — crafted by AI to mimic their human counterparts perfectly. These avatars can access a broad range of enterprise knowledge, collaborating with customers, prospects or employees to accomplish a goal.

Stark highlighted the wide-ranging possibilities for employing AI avatars and digital twins of human experts across industries. Companies can leverage these technologies to broaden or improve access to specialized expertise, he suggested. By integrating AI avatars and digital twins into their operations, businesses can give their teams virtual experts that offer guidance, perform tasks or interact with customers, enhancing efficiency and the quality of service they deliver.

“[Considering] how to augment customer service agents, sales, training and HR scenarios across multiple industries are all viable considerations,” he added. 

Despite their potential, the rise of hyper-realistic avatars raises serious concerns, especially their potential misuse in deepfakes, identity theft and spreading misinformation, according to Yassine Tahi, the CEO of Kinetix, which allows games to integrate an AI-powered emotion feature. As avatars become more lifelike, distinguishing real from manipulated content grows tougher, increasing privacy and security risks online.

“Additionally, there are apprehensions regarding the social impact of hyper-realistic avatars, reminiscent of the detrimental effects seen with Instagram filters,” he added. “Just as these filters perpetuated unrealistic beauty standards and contributed to mental health issues, hyper-realistic avatars may exacerbate societal pressures by promoting unattainable ideals of perfection, ultimately affecting individuals’ self-esteem.”