Last week Google laid out its vision of the future of computing, publicly showing off Project Glass, its so-called Google Glasses, for the first time.
The glasses, if you haven’t seen the demo, project digital information right in front of your eyes. They might appear to transform humans into cyborgs, but they make the computer more personal than ever before.
If Google has its way, the personal computer will change. And while glasses are one way, they are not the only way the search giant plans to change how we interact with digital information. At the Google I/O conference, it became clear: to Google, there isn’t just one future personal computer. They will come in different shapes and sizes, and we will interact with them in different ways.
Vision As A Control
While Google used an elaborate skydiving stunt to show how the glasses take and send images, it didn’t talk much on stage about how the glasses will work and how digital information will help us in the real world.
A few weeks ago, before the conference, ABC News visited Google’s Mountain View campus with the mission of trying to find out more about Project Glass and how Google pictures the future of computing.
“Today we already have a product called Google Goggles, where you can take a picture of something and we try and understand what it is and give you all the results about it,” Amit Singhal, Google’s VP of Search, explained when asked about how search might play into the glasses.
Google Goggles, rolled out in 2009, is an app for the iPhone and Android phones; you take a photo and the search engine tries to identify what’s in it and return relevant results. The feature even lets you take a photo of text, which Google will then translate on the fly.
“Search has become far more understanding about what our users are really looking for,” he said.
But location, as well as search, is going to play a big part in the experience. Or at least that’s the feeling we’ve gotten from the early video Google showed of the Project Glass experience.
Which is why we tracked down Google’s Geo, or Maps, team. “You can imagine at some point you might be able to get map data or other information about where you’re looking, and you might just see what is in this building or what are the hours of this business,” Peter Birch, Google Maps Project Manager, told ABC News.
Birch wouldn’t comment on the exact GPS technology in the glasses or how that information would appear to users who are wearing the glasses. But he did say Google is consistently working to place information where it most makes sense. “It’s all different ways of delivering this location information to users wherever they are in the most useful way.”
Birch and his team also showed ABC News Google’s Usability Testing Lab, where Google brings in real people and sets them up on computers to test future products. They aren’t just any computers; many of them have built-in webcams with eye-tracking software, so researchers can see what areas of the screen testers focus on the most and what features they may or may not notice.
As you will see in the video above, some of the testing rooms also have one-way tinted windows, so that a Google employee can sit on the other side to observe the tests.
A Conversational Computer
But while a computer that understands our eyes is one way Google envisions the future of computing, it became clear in speaking to Singhal that the company is also planning for a future where computers better understand all our senses.
“When all these technologies come together — computer vision, speech recognition, and search — products will emerge in the future that I can’t even dream of right now,” Singhal said during our conversation.
“What I’m really excited about is when computers start understanding language much closer to how you and I do,” Singhal said. “The possibilities are just tremendous. When computers understand language, you can imagine a conversational computer that does not exist today.”
There are other steps to take first. Google already offers Chrome OS laptops and desktops, which are browser-based computers, so a push into Android-powered computers might not be far off.
Tablets, Keyboards, and Android
“Android is the fastest growing computing platform ever. These desktop machines haven’t changed in 20 years,” Google’s Director of Android User Experience, Matias Duarte, told us. “There are so many ways computers can get so much richer, so much better, so much more immersive. Android is the perfect opportunity for us to do that. To change the status quo of how people use computers.”
Duarte addressed the success of the Asus Transformer Prime, an Android tablet that comes with a keyboard dock. “Already our partners have started to explore that space. Sure, there are a lot of ways where Android does not yet have all the capabilities of a desktop system, but the path is clear: we want Android to work on desktops and laptops,” he added. “I want people to touch and fling and gesture through things.”
So we may not have to look like cyborgs just yet. But the computers we’ve lived with — a screen, a keyboard, perhaps a mouse or other pointing device — have already changed. And if Google has its way, there’s much more to come.