How Google Took Street View For A Dive

Google’s underwater Street View launched last September, but Google’s Ocean program actually began six years ago, when one of the founders of Keyhole (which, after being acquired by Google, became Google Earth) was inspired to look into mapping the ocean as well. Google has been mapping the oceans for several years now, but bringing Street View underwater remains a major challenge.

“Our goal is to really make all of our maps data more comprehensive by adding more ocean data. We want to take you from your home to the turtle’s home,” Google’s Jennifer Austin Foulkes said. So far, Google has launched underwater Street View for six locations, including Oahu, Maui and sites around the Great Barrier Reef.

Because there is a strong scientific component to this project, the team set up a strict protocol for capturing this imagery. Richard Vevers, director of the Catlin Seaview Survey – Google’s partner on the project – said the cameras his team uses are very different from those on Google’s other Street View vehicles. The team had to use wider-angle lenses, for example. The underwater rig mounts three cameras on its front and takes images every three seconds. One of the cameras points downward, because that’s how reef-survey images have traditionally been taken. The back of the scooter features a tablet that controls the cameras.
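
The rig’s control software isn’t public; purely as an illustration of the cadence described above (three front-mounted cameras firing every three seconds for a roughly hour-long dive), here is a minimal sketch of a capture loop driven from a tablet-style controller. The camera names and the `trigger_camera` call are hypothetical.

```python
import time

# Hypothetical identifiers for the three front-mounted cameras; the
# downward-facing one mirrors how traditional reef surveys are shot.
CAMERAS = ["left", "right", "down"]
CAPTURE_INTERVAL_S = 3  # one frame per camera every three seconds


def trigger_camera(camera_id: str, elapsed_s: float) -> None:
    """Placeholder for the rig's actual (non-public) capture call."""
    print(f"[{elapsed_s:6.1f}s] captured frame on {camera_id} camera")


def run_dive(duration_s: int = 60 * 60) -> None:
    """Fire all cameras on a fixed cadence for the length of a dive (~1 hour)."""
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        elapsed = time.monotonic() - start
        for camera_id in CAMERAS:
            trigger_camera(camera_id, elapsed)
        time.sleep(CAPTURE_INTERVAL_S)


if __name__ == "__main__":
    run_dive(duration_s=15)  # short demo run instead of a full hour
```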

During a typical dive, the divers cover about 2 kilometers and take 3,000 to 4,000 images per camera; the team does three dives per day, each lasting about an hour. In total, the team has taken about 150,000 images so far, and Vevers expects that number to grow exponentially over the next few months. In the long run, the team hopes to build diverless systems that can stay underwater for 12 hours or more. The technology already exists, but it needs to be adapted to the kind of camera system Street View requires.
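
Purely as a back-of-envelope check, the per-dive and cumulative figures above fit together roughly like this (the per-camera counts, camera count and dive schedule are the numbers quoted here; everything else is plain arithmetic):

```python
# Figures quoted above.
images_per_camera_per_dive = (3_000, 4_000)
cameras = 3
dives_per_day = 3
total_images_so_far = 150_000

# Daily output if every camera hits the quoted range on every dive.
low = images_per_camera_per_dive[0] * cameras * dives_per_day    # 27,000
high = images_per_camera_per_dive[1] * cameras * dives_per_day   # 36,000
print(f"~{low:,}-{high:,} images per day")

# So the ~150,000 images collected so far correspond to only a handful
# of full survey days.
print(f"roughly {total_images_so_far / high:.1f}-{total_images_so_far / low:.1f} dive days")
```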

In addition to the usual cameras, the team is also testing stereo cameras to create 3D imagery, and it has recently experimented with underwater Hangouts and with using Photo Spheres to engage the public.

Each camera system costs about $50,000. Four of them currently exist, though two haven’t been in the water yet.

To get this underwater imagery into Street View, Vevers used Google’s standard Business Photos tool. The actual location of each image, by the way, is triangulated, and the images are also freely available to scientists.
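
The article doesn’t say how that triangulation is done. Purely as an illustration of one simple way frames on a diver-towed transect can be geotagged, here is a sketch that linearly interpolates each image’s position between surface GPS fixes at the start and end of a dive leg; the coordinates and timing are made up, and this is not necessarily the Seaview team’s actual method.

```python
from datetime import datetime, timedelta


def interpolate_position(t, t_start, t_end, start_fix, end_fix):
    """Linearly interpolate a (lat, lon) for an image taken at time t,
    given GPS fixes at the start and end of the transect.

    Illustration only: the positioning actually used by the Seaview
    team is not described here.
    """
    frac = (t - t_start) / (t_end - t_start)
    lat = start_fix[0] + frac * (end_fix[0] - start_fix[0])
    lon = start_fix[1] + frac * (end_fix[1] - start_fix[1])
    return lat, lon


# Example: a one-hour transect along a reef (coordinates are invented).
t0 = datetime(2013, 5, 1, 10, 0, 0)
t1 = t0 + timedelta(hours=1)
print(interpolate_position(t0 + timedelta(minutes=30), t0, t1,
                           (-16.500, 145.800), (-16.490, 145.815)))
```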

The team is focusing on the Americas right now, but it plans to bring underwater Street View to all of the world’s oceans over the next three years (that means select locations in each ocean, of course – not the oceans in their entirety). Another focus is getting more developers involved, both for crowdsourcing data and for developing better reef-recognition algorithms. The existing algorithms can only interpret images from a downward-facing camera, but the team hopes to create tools for working with all of the data the cameras generate.
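
The Seaview reef-recognition algorithms themselves aren’t public, so purely as an illustration of the kind of tool involved, here is a minimal sketch that classifies patches of downward-facing imagery using color-histogram features and scikit-learn; the file paths, labels and feature choice are all placeholders, not the team’s actual pipeline.

```python
import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression


def color_histogram(path: str, bins: int = 8) -> np.ndarray:
    """Flattened RGB histogram as a crude feature vector for a downward-facing frame."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return (hist / hist.sum()).ravel()


# Hypothetical training data: labelled image patches (e.g. "coral" vs "sand").
train_paths = ["patch_0001.jpg", "patch_0002.jpg"]   # placeholder paths
train_labels = ["coral", "sand"]                     # placeholder labels

X = np.stack([color_histogram(p) for p in train_paths])
clf = LogisticRegression(max_iter=1000).fit(X, train_labels)

# Classify a new downward-facing patch (placeholder path).
print(clf.predict([color_histogram("new_patch.jpg")]))
```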

Given the threats to the ocean, there is obviously a serious side to this project, something Vevers noted during his talk. Street View, he argues, is an important tool for informing the public about the threats the oceans face today. “People don’t want to protect anything they can’t see,” he said. Most people don’t dive, but there’s no reason we can’t take them diving virtually. There is no point in doing science, Vevers argues, if it doesn’t reach the public and policymakers.
