Conversational AI: The user-interface of the future?

Lucie is my personal assistant; she helps organise meetings for me. If I want to meet someone, I just cc her on the email thread, and she handles the usual ping-pong of emails about suitable times, dates and locations, and rearranges the whole thing if there are last-minute cancellations. She can see my calendar; in fact, she has better knowledge and control of my schedule than I do.

This week, after a meeting she organised, I was asked if Lucie was a real person, or a robot.

The answer is: she is a real person. But the fact that my companion was unsure shows just how amazing AI has become at natural language communication.

I have another personal assistant: Amy. She also organises meetings for me. Amy is, in fact, not a person: she is a service by x.ai. Remarkably, the way you interact with Amy is very similar to how you would ask Lucie: you just cc her on the email thread, and she, too, will handle all the communication needed to schedule a meeting. I can send Amy an email about a new coffee place I discovered, and she will use it for future coffee meetings. Amy is powered by software and AI: like Lucie, she has access to my calendars, but unlike Lucie, she consumes them through an API.
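Once an assistant like Amy has pulled my busy intervals from a calendar API, scheduling largely reduces to finding a free slot. Here is a minimal sketch of that step; the function and data shapes are illustrative assumptions, not x.ai's actual API.

```python
from datetime import datetime, timedelta

def find_free_slot(busy, day_start, day_end, duration):
    """Return the start of the first gap of at least `duration`
    between `day_start` and `day_end`, given busy intervals as
    (start, end) tuples, or None if the day is full."""
    cursor = day_start
    for start, end in sorted(busy):
        if start - cursor >= duration:
            return cursor  # gap before this meeting is big enough
        cursor = max(cursor, end)
    if day_end - cursor >= duration:
        return cursor  # free time left at the end of the day
    return None

day = datetime(2016, 3, 1)
busy = [(day.replace(hour=9), day.replace(hour=10)),
        (day.replace(hour=10, minute=30), day.replace(hour=12))]
slot = find_free_slot(busy, day.replace(hour=9), day.replace(hour=17),
                      timedelta(minutes=30))
print(slot)  # the 10:00-10:30 gap fits a 30-minute meeting
```

The interesting work, of course, is everything around this function: parsing free-form email into constraints and negotiating them with the other party.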

Natural dialogue as a user interface

The most natural way for people to communicate is through language. For a long time, natural language has been the predominant means by which humans interact with each other and, therefore, access services. But in the past decade, we have had to learn an increasing number of new ways of interacting: buying things from online marketplaces, using Yelp to find restaurants, using Uber to book a car. These user interfaces were designed to be easy to learn and effective at specific tasks - and we humans were amazing at learning how to use each of them. We still are.

Yet, even though humans are amazingly adept at learning how to best use many different user interfaces, we see a reverse trend today. Instead of people learning new interfaces to access services, software has started to learn the most natural interface of humans: natural language dialogue.

Amy and x.ai constitute just one example of this phenomenon. Several other services are built around a conversational chat interface. Luka is an app that helps you find restaurants, using an interface that resembles texting, or chatting with another person. Cambridge, UK-based VocalIQ are developing a generic dialogue platform, complete with a speech interface, so that developers can eventually build their own dialogue bots, specialised to talk about any topic.

Google's Brain team has been rumoured to be working on conversational AI recently: top figures Ray Kurzweil, Jeff Dean and Geoff Hinton have all hinted at it in various recent interviews and talks. Here is an actual research preprint from Google on this topic, although my guess would be that they are doing far more than what is reported there.

The Ultimate UI

What's cool about the dialogue user experience is that it lends itself perfectly to a lean method of developing a product. You can start out with a concierge MVP, where a real person sits in the background and performs the task for the client with no or minimal AI involved. As you do this, you start capturing valuable data about problems your customers usually encounter, and the different ways they express themselves through natural language. Then, you can start feeding that data into an AI system that will understand more and more of what is being said, eventually learning to handle most situations. Human input is then only needed when the machine is uncertain, and eventually the AI will be able to take over entire conversations.
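The handoff described above can be sketched in a few lines: route each incoming message to the AI when its confidence is high, and fall back to a human operator otherwise. The keyword classifier below is a deliberately simple stand-in; a real system would use a trained intent model, and all names and thresholds here are illustrative assumptions.

```python
CONFIDENCE_THRESHOLD = 0.8  # below this, a human takes over

# Toy intent vocabulary: intent name -> trigger keywords.
KNOWN_INTENTS = {
    "reschedule": ("reschedule", "move", "postpone"),
    "cancel": ("cancel", "call off"),
    "book": ("book", "schedule", "set up"),
}

def classify(message):
    """Stand-in intent classifier: returns (intent, confidence)."""
    text = message.lower()
    for intent, keywords in KNOWN_INTENTS.items():
        hits = sum(kw in text for kw in keywords)
        if hits:
            return intent, min(1.0, 0.5 + 0.4 * hits)
    return "unknown", 0.0

def route(message):
    """Send confident predictions to the AI, the rest to a person."""
    intent, confidence = classify(message)
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"AI handles: {intent}"
    return "escalate to human operator"

print(route("Can we reschedule to Friday?"))  # AI handles: reschedule
print(route("My dog ate my laptop"))          # escalate to human operator
```

The crucial design choice is that every escalated message becomes labelled training data, so the threshold crossings shift over time and the human's share of the conversation shrinks.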


