Would your banking experience be more satisfying if you could gaze into the eyes of a chatbot and know it sees you?
At AMP’s recent Amplify conference, professor and entrepreneur Mark Sagar described the development of computer interfaces that look like humans and learn like humans. His startup, Soul Machines, is developing expressive digital faces for customer-service chatbots.
The digital faces are produced by simulating the anatomy and mechanics of the muscles and other tissues of the human face. The avatars can also read the facial expressions of the person talking to them, using a device’s front-facing camera. Sagar says people talking to something that looks human are more likely to be open about their thoughts and expressive with their own faces, letting a company pick up information about what vexes or confuses customers.
To simulate empathy, avatars can be programmed to react to a person’s facial expressions with their own simulated facial movements.
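The loop described here, detect the user’s expression, then echo it with simulated facial movements, can be sketched in a few lines. Note that the expression labels and “muscle activation” parameters below are purely illustrative assumptions, not Soul Machines’ actual API; a real system would drive a full anatomical face simulation from camera input.

```python
# Illustrative sketch of an expression-mirroring step. The labels and
# activation values are hypothetical, chosen only to show the idea of
# mapping a detected user expression to a simulated facial response.

# Hypothetical mapping from a detected expression to the avatar's
# simulated facial-muscle activations (0.0 = relaxed, 1.0 = full).
MIRROR_RESPONSES = {
    "smile":    {"zygomaticus_major": 0.8, "orbicularis_oculi": 0.4},
    "frown":    {"corrugator_supercilii": 0.7},
    "surprise": {"frontalis": 0.9, "jaw_open": 0.5},
    "neutral":  {},
}

def mirror_expression(detected: str) -> dict:
    """Return avatar muscle activations that echo the user's expression.

    Unrecognized labels fall back to neutral, so a misclassified camera
    frame never produces an unpredictable reaction.
    """
    return MIRROR_RESPONSES.get(detected, MIRROR_RESPONSES["neutral"])

# Example: an upstream camera/classifier stage (not shown) reports "smile".
activations = mirror_expression("smile")
```

In a full pipeline, the classifier stage would run continuously on front-camera frames, and the activation dictionary would feed the face-simulation renderer each frame.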
He suggests that will make them more useful and powerful, in the same way that meeting someone in person allows for richer communication than chatting via text. “It’s much easier to interact with a complex system in a face-to-face conversation,” says Sagar.
Soul Machines has already created an assistant avatar called Nadia for the Australian Government. Voiced by actor Cate Blanchett and powered by IBM’s Watson software, Nadia helps people find information about government services for people with disabilities. IBM has prototyped another avatar, Rachel, that helps with banking.