By Chris Albrecht
Amazon’s Alexa plays a central role in my family’s morning routine: she tells us the weather, plays the radio and sets timers so we can catch the bus. Alexa is great for simple tasks, but she and other smart speakers stumble when it comes to more complicated requests. That’s an inherent limitation of an interface that is only a voice and a speaker.
But the way we interact with Alexa and Google Home is about to undergo a dramatic shift as those devices add screens and cameras. As they do, we’ll move from talking and listening to our virtual assistants to looking, touching and smiling to get what we want.
Smart speakers like the Amazon Echo and Google Home are quickly crossing over into the mainstream. Amazon touted that it sold “tens of millions” of Alexa devices this past holiday season, while rival Google said it sold “tens of millions” of its Home devices throughout last year. According to a recent study by NPR and Edison Research, roughly 39 million American adults own a smart speaker, and 65 percent of those surveyed “wouldn’t want to go back to life without their smart speaker.”
But in a room like the kitchen, voice assistance only gets you so far. Think about trying to follow a recipe by just listening. Sure, you can do it, but hearing a set of instructions read aloud is not the best way to make a meal. Enter the kitchen screen, which is fast becoming a big trend this year, with LG and Samsung building them into their fridges and GE unveiling a giant monitor meant to hang over your oven.
Screens will add a much-needed visual component to smart speakers, turning them into smart displays. Instead of just a voice walking you through a recipe, you’ll see accompanying photos and videos that demonstrate technique and show what the end product should look like. And since all of these screens are touch-enabled, everyday tasks like flipping through music and news become much easier and faster than saying “next” every time you want to skip ahead.
Both Amazon and Google already recognize this and are adding screens to their smart speakers. Amazon released its Echo Show last year and the smaller Echo Spot in time for the holidays. Instead of making its own device (for now, anyway), Google is putting its Assistant into new smart displays from JBL, Lenovo, LG and Sony.
These smart displays will also come with a built-in camera for things like video calling, but eventually these cameras will do more. Touchscreens are good for displaying information, but the “touch” part gets harder in the kitchen when your hands are greasy or covered in cookie dough. The smart display’s camera could then double as a motion sensor: rather than touching the screen, you’d wave your hand to go back a page, swipe through a list of ingredients or scrub a video to the exact moment you want.
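For the technically curious, this kind of wave detection isn’t exotic: a few dozen lines of Python and the open-source OpenCV library are enough to prototype it with simple frame differencing. The sketch below is purely illustrative; the threshold value and the “page turn” actions are placeholders I’ve invented, not how Amazon or Google actually do it.

```python
# A rough sketch of camera-as-motion-sensor: frame differencing with
# OpenCV to catch a hand waved over the left or right half of the view.
# The threshold value and the page-turn actions are placeholders.
import cv2

cap = cv2.VideoCapture(0)                    # default webcam
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

WAVE_THRESHOLD = 40_000                      # changed pixels; tuned by hand

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixels that changed since the last frame count as motion.
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    prev = gray

    _, w = mask.shape
    left = int(mask[:, : w // 2].sum() / 255)    # moving pixels, left half
    right = int(mask[:, w // 2 :].sum() / 255)   # moving pixels, right half

    if left > WAVE_THRESHOLD and left > right:
        print("wave on the left -> previous page")   # placeholder action
    elif right > WAVE_THRESHOLD:
        print("wave on the right -> next page")      # placeholder action

    cv2.imshow("motion", mask)
    if cv2.waitKey(30) & 0xFF == 27:                 # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A real smart display would use a proper gesture model rather than raw motion, but the principle is the same: the camera watches for movement and maps it to an on-screen action.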
In addition to gesture control, the camera could also be used for facial recognition. As each member of the family looks at the smart display, a personalized view of news, messages, reminders and more will appear on the screen.
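Here, too, the building blocks already exist. Below is a rough sketch of that personalization loop using the open-source face_recognition Python library; the photo file names, family members and feed contents are all invented for illustration, and no smart display vendor has published its actual approach.

```python
# Sketch: figure out which family member is looking at the display and
# pull up their personalized feed. Uses the open-source face_recognition
# library; the photo paths, names and feeds are invented for illustration.
import face_recognition

# One reference photo per family member (hypothetical file names).
family = {}
for name in ["mom", "dad", "kid"]:
    photo = face_recognition.load_image_file(f"{name}.jpg")
    family[name] = face_recognition.face_encodings(photo)[0]

# Each person's home screen (placeholder content).
feeds = {
    "mom": ["calendar", "headlines", "commute traffic"],
    "dad": ["recipe of the day", "sports scores"],
    "kid": ["school bus timer", "weather"],
}

def who_is_looking(snapshot_path):
    """Return the name of the family member in the snapshot, if any."""
    snapshot = face_recognition.load_image_file(snapshot_path)
    encodings = face_recognition.face_encodings(snapshot)
    if not encodings:
        return None                       # nobody in view
    names = list(family)
    matches = face_recognition.compare_faces(
        [family[n] for n in names], encodings[0]
    )
    return next((n for n, hit in zip(names, matches) if hit), None)

person = who_is_looking("snapshot.jpg")   # a frame from the display's camera
if person:
    print(f"Good morning, {person}!", feeds[person])
```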
But these cameras won’t just look at our faces; they’ll also see the contents of our kitchens. LG and Samsung fridges already come with internal cameras to show what food you have and help you label it. Eventually, machine learning, RFID tags and scent sensors will work in unison to automatically recognize and inventory the food in our fridges and pantries. These systems will all tie into our virtual assistant of choice to let us know when we’re running out of items, order replacements and make recipe recommendations.
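Once the recognition problem is solved, the reorder logic itself is simple bookkeeping. Here’s a toy Python sketch of that last step; the items and stock levels are made up, and in a real system they’d come from the fridge camera, RFID reads and purchase history.

```python
# Toy inventory bookkeeping: compare what the fridge "sees" against each
# household's usual stock and flag what to reorder. All the data here is
# invented; a real system would feed it from cameras, RFID and history.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    seen: int        # how many the fridge camera currently detects
    usual: int       # how many the household normally keeps on hand

fridge = [
    Item("milk", seen=0, usual=1),
    Item("eggs", seen=2, usual=12),
    Item("butter", seen=1, usual=1),
]

def shopping_list(items):
    """Everything running low, with how many to order."""
    return {i.name: i.usual - i.seen for i in items if i.seen < i.usual}

print(shopping_list(fridge))   # -> {'milk': 1, 'eggs': 10}
```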
In this more extreme case, the interface becomes invisible, predicting and presenting the information and items we want without our having to ask at all. And that will definitely be something to smile about.
Chris Albrecht writes about startups and foodtech for The Spoon. He also serves as Master of Ceremonies for the Smart Kitchen Summit. In previous roles he was a Creative Director, Editor and Staff Writer at Gigaom.