Facebook Uses Retail Data To Train Chatbots To Be Like Humans

Facebook has introduced what it calls a new way to train artificial intelligence (AI) chatbots.

Researchers at the social network giant described how data from human conversations is key to improving shopping chatbots, according to a technical report shared by VentureBeat.

Dubbed “Situated Interactive MultiModal Conversations (SIMMC),” the effort aims to enable AI chatbots to show an object and explain what it is made of, drawing on images, memories of previous interactions and individual requests.

The online news service reported that Facebook used 13,000 human-to-human conversations across two retail domains, furniture and fashion, as training data.

Facebook’s mission is to imitate human chat by responding to images and messages as a person would.

The research suggests shopping chatbots can be improved with more data from conversations that live humans have with each other about products such as furniture and fashion, the report said.

In the furniture data, a user could interact with an assistant to get recommendations for items such as couches and side tables.

Facebook’s research team created the furniture data set by building a virtual environment in which volunteers were connected with humans posing as a full-featured virtual assistant, the report said. Users could request a specific piece of furniture, and the assistant could filter a catalog from online furniture retailer Wayfair to show products by price, color and material as users navigated the results.

For fashion, users asked humans posing as virtual assistants to suggest jackets, dresses and other clothing and accessories, the report said. As with furniture, assistants could sort by price, brand and color.

Building on these data sets, the Facebook researchers said they crafted an assistant consisting of an utterance and history encoder, a multimodal fusion module, an action predictor and a response generator, VentureBeat reported.
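To illustrate how such a pipeline fits together, here is a minimal toy sketch of the four components named in the report. All function names, the keyword-matching logic and the catalog format are illustrative assumptions for this article, not Facebook's actual models, which are neural networks trained on the SIMMC data.

```python
# Toy sketch of a SIMMC-style assistant pipeline: encoder -> multimodal
# fusion -> action predictor -> response generator. Purely illustrative;
# all names and the rule-based logic are assumptions, not Facebook's code.

def encode_utterance_and_history(utterance, history):
    """Toy encoder: represent the dialog as a bag of lowercase tokens."""
    tokens = set(utterance.lower().split())
    for turn in history:
        tokens |= set(turn.lower().split())
    return tokens

def fuse_multimodal(text_features, scene_items):
    """Toy fusion: keep catalog items whose attributes overlap the text."""
    return [item for item in scene_items
            if text_features & set(item["attributes"])]

def predict_action(text_features):
    """Toy action predictor: map keywords to a catalog-API call name."""
    if {"cheaper", "price", "cost"} & text_features:
        return "filter_by_price"
    if {"color", "blue", "red", "brown"} & text_features:
        return "filter_by_color"
    return "show_items"

def generate_response(action, matched_items):
    """Toy response generator conditioned on the predicted action."""
    names = ", ".join(item["name"] for item in matched_items) or "nothing"
    return f"[{action}] I found: {names}"

def assistant_turn(utterance, history, scene_items):
    """Run one dialog turn through all four pipeline stages."""
    feats = encode_utterance_and_history(utterance, history)
    matched = fuse_multimodal(feats, scene_items)
    action = predict_action(feats)
    return generate_response(action, matched)

catalog = [
    {"name": "brown leather couch", "attributes": ["brown", "leather", "couch"]},
    {"name": "glass side table", "attributes": ["glass", "table"]},
]
print(assistant_turn("show me a brown couch", [], catalog))
```

The point of the sketch is the modularity: each stage can be swapped for a learned model (a transformer encoder, a neural action classifier, a generative decoder) without changing the overall flow.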

After creating the SIMMC fashion and furniture data sets, Facebook’s researchers said their models outperformed two baseline AI systems. The best-performing action predictor chose the right application program interface (API), a set of protocols and tools for building software applications, nearly 80 percent of the time for furniture and 85 percent of the time for fashion, VentureBeat reported.

Facebook said it plans to release the data and models at a future date.

The research follows Facebook’s detailing of the AI systems behind its shopping experiences, which are evolving on Instagram, WhatsApp and Facebook.