What happens when Alexa has eyes?
Alexa and Google have changed the way we cook, whether it's adding ingredients to a shopping list or choosing the music we listen to while we cook. (By the way, streaming the Fox goes great with cooking.)
But what if Alexa or Google could tell what you were looking at when you gave the command?
That’s the idea behind a new prototype from Synapse, a part of Cambridge Consultants.
The prototype uses machine vision to track where you are looking when you issue a voice command and applies that information to the cooktop: its camera determines which burner you're looking at, so the system knows which burner a command is addressing.
When the system detects a person standing in front of it and looking at a burner, they can say "turn that burner on," or simply state a level like "medium high."
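The interaction above can be sketched in a few lines of Python. Everything here (burner names, level values, the default "on" setting, and the command phrases beyond those quoted in the article) is a hypothetical illustration, not a detail of Synapse's actual prototype:

```python
# Hypothetical sketch of gaze-directed burner control.
# Burner layout, heat levels, and command handling are assumptions,
# not details of Synapse's system.

BURNERS = {"front-left": 0, "front-right": 0, "back-left": 0, "back-right": 0}

LEVELS = {"low": 2, "medium": 5, "medium high": 7, "high": 9}

def handle_command(gazed_burner: str, utterance: str) -> int:
    """Apply a spoken command to the burner the user is looking at."""
    if gazed_burner not in BURNERS:
        raise ValueError(f"unknown burner: {gazed_burner}")
    utterance = utterance.lower().strip()
    if utterance == "turn that burner on":
        BURNERS[gazed_burner] = LEVELS["medium"]  # assumed default level
    elif utterance in LEVELS:
        # A bare level like "medium high" targets the gazed-at burner.
        BURNERS[gazed_burner] = LEVELS[utterance]
    elif utterance == "turn up the heat":
        BURNERS[gazed_burner] = min(BURNERS[gazed_burner] + 1, 9)
    return BURNERS[gazed_burner]
```

The key idea is simply that gaze supplies the missing referent ("that burner") so the spoken command itself can stay short.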
In the concept video, a user is cooking and says, "Alexa, turn up the heat."