The Google Assistant AI won’t enslave us, but it might help us fix our hair
It’s an artificial intelligence agent that can make phone calls for you – and we don’t mean dialing the number. We mean it has actual conversations with real-life people. If you haven’t already seen the demo, it’s worth watching: skip to 1 hour, 55 minutes into the video of Google’s keynote.

Now, we know that keynote videos are the Instagram photos of the tech world: what you see is highly polished, carefully selected and often vastly different from what you can expect in reality... but Duplex could be really useful.
What is it?
Google Duplex isn’t designed to replace humans altogether. It’s designed to carry out very specific tasks in what Google calls "closed domains". So, for example, you wouldn’t ask Google Duplex to call your mum, but you might ask it to book a table at a restaurant.
Initially, Google Duplex will focus on three kinds of task: making restaurant reservations, scheduling hair appointments and finding out businesses’ holiday opening hours. Google is starting here partly because it reckons a phone call is the most efficient way to get this information, especially if there are variables and interruptions, and partly because if you got a phone call from the Terminator you’d probably hang up.
How does it work?
Google Duplex is the missing link between the Google Assistant and businesses, because it enables the Assistant to get information that isn’t available digitally. For example, you might want to know a business’s holiday opening hours that aren’t listed on its website, or whether a shop has a particular item in stock when it doesn’t publish stock availability online. From a tech perspective, Google Duplex uses a recurrent neural network (RNN) built using TensorFlow Extended (TFX). What RNNs like the one powering Duplex can do is process sequential, contextual information, and that makes them well suited to language modelling and speech recognition.
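The key idea behind an RNN – the thing that makes it "sequential" – is a hidden state that is carried forward from one step to the next, so earlier inputs keep influencing how later inputs are interpreted. Here’s a deliberately tiny, toy sketch of that mechanism in plain Python. This is not Duplex’s actual model (which is a far larger network trained in TensorFlow Extended); the function names and the weight values are made up purely for illustration.

```python
import math

def rnn_step(x_t, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    # The new hidden state blends the current input with the previous
    # state, so information from earlier steps persists over time.
    return math.tanh(w_x * x_t + w_h * h_prev + b)

def encode(sequence):
    # Thread the hidden state through the whole sequence, one step at a time.
    h = 0.0
    for x_t in sequence:
        h = rnn_step(x_t, h)
    return h

# Because the state is threaded through in order, the SAME inputs in a
# different order produce a different final state - the network is
# sensitive to sequence, which is exactly what speech and language need.
print(encode([1, 0, 0]))  # the early input has partly faded
print(encode([0, 0, 1]))  # the recent input dominates
```

Real speech and language models replace the scalar weights here with large learned matrices (and fancier cells such as LSTMs), but the state-carrying loop is the same shape.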
What's so clever about it?
Duplex talks like a normal person, and that makes it a natural – and natural-sounding – extension to the OK Google functionality we already know. Let’s stick with our restaurant example. With Duplex, you could say “OK Google, find me a table for Friday night” and the Google app would then call restaurants on your behalf. Not only that, but it would have conversations – so if you wanted a table for around 7:30 but there wasn’t one, it could ask what times were available and decide whether those times fit your criteria. If not, the Google app would call another restaurant. Similarly, if you wanted to arrange a meeting with Sarah, the Google app could call Sarah (or Sarah’s AI) to talk through the available time slots and agree which one would be best.
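Stripped of the voice synthesis, the booking behaviour described above is a simple decision loop: ask each restaurant what it has, check the offers against your criteria, and move on if nothing fits. The sketch below is purely hypothetical – Google hasn’t published an API like this, and every name here (`book_table`, `ask_available_times`, `confirm`) is invented to illustrate the flow, not to document Duplex.

```python
def to_minutes(hhmm):
    # Convert "19:30" to minutes past midnight so times can be compared.
    h, m = hhmm.split(":")
    return int(h) * 60 + int(m)

class Restaurant:
    # Stand-in for one phone call; in Duplex this would be a live conversation.
    def __init__(self, name, times):
        self.name, self.times = name, times
    def ask_available_times(self):
        return self.times
    def confirm(self, time):
        self.booked = time

def book_table(restaurants, preferred="19:30", window_minutes=30):
    # Try each restaurant in turn; accept the first slot close enough
    # to the preferred time, otherwise "call" the next restaurant.
    target = to_minutes(preferred)
    for restaurant in restaurants:
        for t in restaurant.ask_available_times():
            if abs(to_minutes(t) - target) <= window_minutes:
                restaurant.confirm(t)
                return restaurant.name, t
    return None  # nothing within the window anywhere

options = [Restaurant("A", ["21:00"]), Restaurant("B", ["19:45", "20:30"])]
print(book_table(options))  # A's 21:00 is too late; B's 19:45 fits
```

The hard part Duplex solves isn’t this loop, of course – it’s conducting the natural-language phone conversation that fills in `ask_available_times` and `confirm`.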
Who’s it for?
The benefits for people with hearing difficulties are obvious, but it can also overcome language barriers: you might not know the local language, but Google Assistant does – so it can converse in a language you don’t speak. And it can be asynchronous, so you can make the request and then go offline while Google Duplex gets on with the job: it will report back when you’re online again. That’s useful in areas of patchy connectivity, or if you’re just really, really busy.

Right now, our personal digital assistants are more about the digital than the assistance: you can ask them to turn up the lights or tune the radio, but they can’t book your car in for a service, make a dental appointment or do any of the many other bits of tedious admin we spend so much time on. Imagine the hours we’d save if we didn’t have to spend them on the phone changing the tiniest detail on a form, answering a simple customer service question or finding out why our broadband is on the blink again.