Google Glasses may translate directional sound, speech for the deaf

Google's glasses, combined with speech recognition, will touch people's lives in many ways. Here is a great example…

Google Glasses conversation

As Google engineers continue to push forward on the development of Project Glass, features included within a recent patent may greatly benefit the hearing-impaired community.

Detailed within a patent related to Google’s Project Glass, the search company is developing a feature that would give hearing-impaired users text alerts when something dangerous, such as an automobile, is approaching. By analyzing the direction and intensity of incoming audio, the display on the glasses would show directional arrows and flashing lights that grow in intensity as a loud noise approaches the user. For instance, when a hearing-impaired person is crossing the street, the glasses would indicate that a nearby car is honking and point the user in the direction of the noisy vehicle.
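The patent does not disclose an implementation, but the idea described above — deriving a sound's direction from the timing difference between microphones and its urgency from loudness — can be illustrated with a minimal, hypothetical sketch. The function name, the two-microphone geometry, and all parameter values below are assumptions for illustration, not details from the patent:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, at room temperature


def estimate_alert(left, right, sample_rate=44100, mic_spacing_m=0.15):
    """Hypothetical sketch of a directional sound alert.

    Estimates a bearing (degrees; 0 = straight ahead, negative = left)
    from the inter-microphone time delay, and an urgency value from the
    root-mean-square loudness of the two channels.
    """
    # Urgency: RMS loudness across both channels combined.
    samples = list(left) + list(right)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))

    # Time delay via naive cross-correlation (O(n * lags), kept simple
    # for clarity; a real device would use an FFT-based method).
    n = len(left)
    max_lag = int(mic_spacing_m / SPEED_OF_SOUND * sample_rate)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i - lag]
                    for i in range(max(0, lag), min(n, n + lag)))
        if score > best_score:
            best_lag, best_score = lag, score

    # Convert the delay to a bearing using the far-field approximation.
    delay_s = best_lag / sample_rate
    ratio = max(-1.0, min(1.0, delay_s * SPEED_OF_SOUND / mic_spacing_m))
    bearing_deg = math.degrees(math.asin(ratio))
    return bearing_deg, rms
```

A display layer could then render an arrow at `bearing_deg` and scale its flash rate with `rms`, matching the behavior the patent describes.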

In addition to directional sound, Google is also attempting to work speech recognition into the glasses. Hypothetically, the speech-to-text feature would determine what a nearby person is saying and display the conversation as text on the glasses' display.

According to the patent, Google would differentiate between people standing in a group by placing a text bubble at the position on the display closest to whoever is currently talking. Beyond transcribing speech, the glasses would also indicate what type of object or living being is making a sound; for instance, if a dog is barking, the display would say so.
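The speaker-attributed bubbles described above amount to mapping a talker's estimated bearing onto a horizontal screen position. The following sketch shows one simple way that mapping could work; the function, field-of-view value, and display width are illustrative assumptions, not details from the patent:

```python
def caption_anchor(bearing_deg, display_width_px=640, fov_deg=90.0):
    """Hypothetical sketch of speaker-attributed caption placement.

    Maps a talker's estimated bearing (0 degrees = straight ahead,
    negative = left) to a horizontal pixel position, so the text bubble
    appears nearest the person currently speaking.
    """
    half_fov = fov_deg / 2.0
    # Clamp bearings outside the display's field of view to its edges.
    clamped = max(-half_fov, min(half_fov, bearing_deg))
    # Linear map: -half_fov -> left edge, +half_fov -> right edge.
    return int((clamped + half_fov) / fov_deg * (display_width_px - 1))
```

Pairing this with a per-speaker transcript stream would yield the patent's behavior of a bubble tracking whoever is talking in a group.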

The patent also specifies multiple video cameras and microphones to capture several angles, along with a “finger-operable touch pad” to accept user commands. Google further laid out a scenario in which the feed from the front-facing camera is combined with computer-generated images to create an augmented reality for the user. That augmented-reality display could include anything from an overlay of GPS directions while walking down the street to marketing advertisements when the user looks at a particular store.

