Project recreates cities in rich 3D from images harvested online

People are taking photos and videos all over major cities, all the time, from every angle. Theoretically, with enough of them, you could map every street and building — wait, did I say theoretically? I meant in practice, as the VarCity project has demonstrated with Zurich, Switzerland.

This multi-year effort has taken images from numerous online sources — social media, public webcams, transit cameras, aerial shots — and analyzed them to create a 3D map of the city. It’s kind of like the inverse of Google Street View: the photos aren’t illustrating the map, they’re the source of the map itself.

Because that’s the case, the VarCity data is extra rich. Over time, webcams pointed down streets show which direction traffic flows, when people tend to be out walking, and when lights tend to go out. Pictures taken from different angles of the same building provide dimensional data like how big windows are and the surface area of walls.
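The geometry behind that last trick is classic triangulation: two views of the same feature from known vantage points pin down where it sits in space. Here's a toy 2D sketch of the idea — the camera positions and bearings are invented for illustration, and the real VarCity pipeline of course runs structure-from-motion over thousands of images rather than intersecting two hand-fed rays:

```python
import math

def triangulate_2d(baseline, bearing_a, bearing_b):
    """Intersect two bearing rays: camera A sits at (0, 0), camera B at
    (baseline, 0). Returns the (x, y) position of the observed point."""
    c1, s1 = math.cos(bearing_a), math.sin(bearing_a)
    c2, s2 = math.cos(bearing_b), math.sin(bearing_b)
    # Solve t1*(c1, s1) = (baseline, 0) + t2*(c2, s2) for the ray lengths.
    det = c2 * s1 - c1 * s2
    if abs(det) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    t1 = -baseline * s2 / det
    return (t1 * c1, t1 * s1)

# Two "photos" of the same building corner, taken 6 m apart along a street.
point = triangulate_2d(
    baseline=6.0,
    bearing_a=math.atan2(4, 3),   # angle to the corner as seen from camera A
    bearing_b=math.atan2(4, -3),  # angle to the corner as seen from camera B
)
# point comes out at roughly (3.0, 4.0)
```

Repeat that across every matched feature in every photo pair and you start recovering window sizes and wall areas, not just outlines.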

The algorithms created and tuned over years by the team at ETH Zurich’s Computer Vision Lab can also tell the difference between sidewalk and road, pavement and grass, and so on. It looks rough, but those blobby edges and shaggy cars can easily be interpreted and refit with more precision.

The idea is that you could set these algorithms free on other large piles of data and automatically create a similarly rich set of data without having to collect it on your own.

“The more images and videos the platform can evaluate, the more precise the model becomes,” said a postdoc working on the project, Kenneth Vanhoey, in an ETH Zurich news release. “The aim of our project was to develop the algorithms for such 3D city models, assuming that the volume of available images and videos will also increase dramatically in the years ahead.”

Several startups have already emerged from the project: Spectando and Casalva offer virtual building inspections and damage analysis, and Parquery monitors parking spaces in real time through its 3D knowledge of the city. UniqFEED, in a different vein, monitors broadcast games to tell advertisers and players how long they’re featured in the feed.

The video above summarizes the research, but a longer one going deeper into the data and showing off the resulting model will be appearing next week.

Update: Here’s the full-length video, with much more detail on what sources were used and the applications of the VarCity mapping technique.

And in the interest of giving credit where it’s due, here’s the core of the team:

  • Luc Van Gool – Principal investigator and professor
  • Hayko Riemenschneider – Project leader and researcher
  • Kenneth Vanhoey – Video director and researcher
  • Carlos Eduardo Porto de Oliveira – Video animation, editing & compositing

New developments will be posted to the project’s Twitter account.