Emerging Technologies: Google Glass

This post is the first in a two-part series about emerging technologies from Google and was written by Johanna Perez Strand, a Smithsonian Mobile Fellow.

As shown in the official video, wearing Google Glass is like wearing your smartphone… right in front of your eyes! The titanium-framed headset is comfortable and light. A small image about the size of a postage stamp appears just above your line of sight and allows you to read text messages, take photos, and receive directions to your destination. You can even listen to music through the built-in earbuds, and a touchpad on the device allows you to change settings.

Google is currently experimenting with voice commands and head gestures, both of which will allow the user to interact with the device. It remains unknown whether voice commands will activate all of the functions or just a few of them, whether notifications will appear in the center of the screen or can be repositioned, and whether these images will be opaque or partially transparent. The most important question right now, however, is this: if these glasses simply do what smartphones already do, why do they need to be glasses? What is the ideal scenario in which the device would make sense in our everyday lives?

According to the head of the project, Babak Parviz, a former bionanotechnology expert at the University of Washington who had already imagined augmented reality in contact lenses ("A Twinkle in the Eye"), the aim of the Google Glass project is twofold: to create a device that allows people to connect with others through images and video, "a device that would see the world through your eyes and allow you to share that view," and to create a technology that gives people instant access to information.

In the video "I Used Google Glass," a journalist asks the glasses he's wearing when the first "Terminator" movie was made. A vaguely robotic woman's voice instantly emerges from the device to deliver the answer: "1984." One of the things the journalist wonders is whether users who already wear prescription lenses, as he does, will have to remove them in order to wear Google Glass. Isabelle Olsson, the head designer, says that Google and glasses manufacturers are already working on this issue.

"My wish is to make it as minimal as possible without being boring," Olsson told Business Insider, demonstrating some of the device's gestures: "to take a photo, you squeeze an area of the eyepiece near the lens, like miming taking a photo; when using the touchpad, you point a finger towards your temple, as when thinking."

Precisely because Glass is worn on the face, it allows information to flow in, has the potential to capture the moment, and may be less intrusive than a camera or a phone. "My aspiration is not to interrupt people, but to give them access to information when they want it. They're in charge," says Parviz.

One of the main challenges the project faces right now is finding the appropriate level of obtrusiveness. Although these are still demos, notifications from the following apps are going to be hard to ignore, as they will show up right in front of our eyes:

  • The New York Times service can be configured to push breaking news straight to the Glass headset. A headline will appear, which can be selected by tapping the device’s trackpad, or just by looking up. While on the go, Glass can read articles aloud to the user.
  • The social networking app Path will forward photos taken by friends to your own Glass headset, to which you can respond using text or an emoticon.
  • Gmail can be set to push messages marked as “Important.”

Do we really want the eyes of the person sitting in front of us to wander up and to the right to check that important email during every conversation? Will we be able to manage so many notifications? Or will Google Glass drown us in data? At least for now, Parviz said Google doesn't have any plans to display advertising on the device.

Glass will most likely become available to consumers sometime between late 2013 and 2014. For the time being, the final product has not yet been defined. Google is crowdsourcing ideas as the project enters what it calls the "feedback gathering phase," during which the community is expected to chime in on what they would like or dislike in a fully realized product. "(…) there's a lot of experimentation going on. And a lot of rapid prototyping on the team," a Google spokesperson said.
