The most interesting part of the software update Google released for Glass yesterday was a new version of the MyGlass app for Android. Photos taken on Glass now sync instantly to your device, where you can apply filters and share them through whatever photo-sharing services are available on your phone, not just the options built into Glass.
As someone who owned Glass for several months, that sounds like a huge improvement to the experience of sharing photos from Glass itself. While Glass would automatically back up your photos to Google+ when connected to Wi-Fi, sharing to Facebook or Twitter required tapping through multiple layers of the interface and dictating the captions you wanted. Whenever I tried, I'd get to that last step and wish I had access to a keyboard. With this latest update, that's no longer an issue, and you get filters, which would have been a pain to flip through on Glass's display.
There are actually quite a few places in the Glass user experience where access to a more complex interface is helpful. For instance, navigating with Google Maps on Glass is great, but searching isn’t, especially when you don’t really know where you’re going.
But if you do a Google Maps search on your Mac that happens to be signed into your Google account, a card will show up in the Glass interface giving you the option to navigate to that address. It’s pretty neat to walk out the door, put on Glass, and instantly know where you’re headed.
Glass feels most useful when it leverages the other gadgets you use throughout your day. There's certainly an incentive for Google not to build it that way: one could argue that any feature requiring a smartphone makes using Glass redundant.
But that argument ignores the advantages that come with Glass, such as visual notifications that can carry more information than fits on a watch face. With a software update released last week, you can see your notifications just by glancing at the screen.
When I asked Google if it worries about making the features on the MyGlass app “too useful,” Steve Lee, Glass product management director, said:
We want the user to have the best experience possible. Period. There are experiences that are just naturally better on Glass, like capturing moments while living them. If it enhances the user experience, we want to make sure those experiences can extend to other platforms regardless of whether it’s iPhone, Android or desktop. When it comes to photo editing — cropping, filtering, etc. — that’s clearly easier on a phone. This functionality is something that our Explorers have asked for, so we wanted to be sure to deliver it to them.
In a recent column for Wired, Mat Honan argues that notifications are going to rule the smartphone interface. Google Glass is already there: everything you can do on the device is accessed through a notification-like card.
Maybe that’s why people don’t “get” Glass yet. They’re used to opening a specific app to do a specific task or find a particular piece of information, not responding to contextual updates as they happen. Perhaps the notification-based interfaces on smartphones will get people into that mindset by the time Glass is ready for consumer release.
If Google can get developers on board with making it as easy as possible to transition from seeing a notification on Glass to acting on that notification on a device better suited for the task, the device might start making a lot more sense to a lot more people.