Google reportedly testing smart displays that activate when you look at them

Google is reportedly testing a new feature that lets users activate their smart displays just by looking at them. With it, there would be no need to repeat the wake word before each and every command to Google Assistant.

However, it is still unclear whether Google plans to bring this feature, codenamed Blue Steel, to the public version of the firmware.

According to the report, Boromeusz's Nest Hub Max smart display is running leaked internal firmware meant for testing within Google.

The video shows how, when the user is close enough to the Nest Hub Max, saying only the word "weather" is enough to have Google show the local weather forecast. Google announced last December that Nest displays would use ultrasound, but the original goal was to figure out how far away you were standing so the display could adjust what was shown on screen.

Today, Google's voice-command hardware is always listening, but only for the hotword "Hey Google." The new feature instead uses ultrasound to determine whether a user is nearby; if one is detected, the device starts processing additional commands. Using the camera, the device can also recognize the user's face, making it easier to tell when a command is directed at it.
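To make the gating logic concrete, here is a minimal sketch of how presence detection could relax the hotword requirement. All names are hypothetical; Google has not published any API for Blue Steel, so this only illustrates the decision described in the report.

```python
from dataclasses import dataclass


@dataclass
class SensorState:
    """Snapshot of the (hypothetical) inputs a display would consider."""
    heard_hotword: bool   # "Hey Google" detected in the audio stream
    user_in_range: bool   # ultrasound indicates someone is close by
    face_detected: bool   # camera sees a face looking at the screen


def should_process_command(state: SensorState) -> bool:
    """Decide whether the assistant should act on the next utterance."""
    if state.heard_hotword:
        # Classic behavior: the hotword always activates the assistant.
        return True
    # Hypothetical Blue Steel path: a nearby user looking at the display
    # can issue a command (e.g. "weather") without any hotword.
    return state.user_in_range and state.face_detected


# User close by and looking at the screen, no hotword spoken:
print(should_process_command(SensorState(False, True, True)))   # True
# User nearby but not looking at the display:
print(should_process_command(SensorState(False, True, False)))  # False
```

The point of the sketch is that the hotword path stays untouched; the camera and ultrasound checks only add an alternative trigger, which is also why the privacy trade-off below is about *when* audio gets processed, not how.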

There is also no doubt that if Google were to release this feature, there would be significant privacy concerns. A key element of current smart speakers and displays is that they only process what you are saying after they have heard the wake word.

We're all used to waking our virtual assistants with a phrase such as "OK Google" or "Hey Siri", but that may soon be a thing of the past, at least for Google Assistant.