“Shut up, Alexa” – Why Amazon wants us to talk less to smart assistants
That was the view of Amazon Alexa SVP Tom Taylor, who spoke about the next evolution of smart assistants in a presentation at this year’s WebSummit in Lisbon.
Since launching Alexa in 2014, Amazon has reportedly sold more than 40 million of the devices in the US alone.
During Taylor’s presentation – Why Amazon wants you to talk to Alexa less – he revealed that customers interact with Alexa billions of times every week, and that the device’s capabilities have grown from just 13 relatively simple tasks at launch to more than 130,000 skills.
“Alexa has become a part of people’s lives in a way that we could have never imagined,” he said. “And it’s only going to get better. The team and I are inspired daily by stories of how Alexa is making a real difference.”
The next evolution, he said, is “ambient intelligence” – using AI to weave together multiple devices and intelligent functions to provide more value for the end user.
“This isn’t just more connected devices,” he added. “It’s about adding intelligence throughout the system to make the devices better. This is the next big leap forward for technology inside and outside the home. It understands you and adapts accordingly. It’s there when you need it, recedes into the background when you don’t, and it’s able to take action for you.”
This, he explained, means users will be “reaching for their phones less” and ultimately speaking to digital assistants, such as Alexa or Siri, less, because the AI will handle more tasks on its own.
“That means you’re spending more time looking up at the world and the people in it and not down at your hands.”
Amazon has two core focuses to enable this, according to Taylor: self-service AI and self-directed AI.
“From the beginning, a big part of the magic of voice assistants was their accessibility through a small, affordable speaker that sits in your kitchen or living room. You could communicate with a computer just as you would with another person, using just your voice,” he explained.
The next step is to “democratise AI” by allowing ordinary users to tailor AI to carry out specific tasks, instead of programming them in using code. This “low code” or “no code” approach would allow a chef to teach an AI to bake something, rather than a computer scientist, who may have never baked before, trying to program it in.
“In the past few years, we’ve enhanced routines with preset triggers, in which your devices are trained to respond to events in a specific way,” he said. These triggers could be based on audio, such as having Alexa play soothing music if it hears the sound of a baby crying, or based on location – known as geofencing – such as having Alexa turn on the heating when it detects you are almost home from work.
“We’re building on that by enabling people to make even more specific choices about the sounds that matter to them with a feature called Alexa custom sounds. Customers will be able to teach Alexa to recognise distinct sounds in their unique environment with just a few audio samples, and then have the option for Alexa to begin a routine once that sound is detected.”
In September, Amazon introduced Custom Event Alerts, which allow customers to create unique, personalised detectors on connected cameras for the specific objects in their homes or businesses that matter to them. This can help with issues such as leaving home and forgetting to switch off a light or lock the door.
He said: “Ultimately, AI will enable people to influence and shape the future of AI capabilities from a diverse set of use cases. This will help create a greater variety of customised AI assistants produced by non-AI specialists, including individual customers.
“We believe there will be multiple assistants, playing different roles in different contexts. This is the beginning of the second wave of AI, with AI becoming less reactive and more proactive on its own.”
However, even with self-directed AI, privacy will still underpin everything Amazon aims to do with Alexa, explained Taylor.
“We knew from the beginning that we had to give customers transparency and control over how their data is used and stored,” he added. “We continue to innovate in this area. In fact, coupled with advances in transfer, active and federated learning, we’ve reduced Alexa’s reliance on supervised learning by orders of magnitude. And I expect we’ll see much more of this in the coming years.
“As AI becomes more self-service and self-directed, it’s moving the world closer to true ambient computing. We’re excited by the positive impact these technologies are having on customers every day, and even more bullish about their potential in the long term. We look forward to a future where you can talk to your Alexa less and enjoy life more.”