I got to cruise over to Smith Tower yesterday to talk with Microsoft Live Labs about Photosynth, and as of this writing the site should be live and all the things we talked about will be available for you to play with. It’s best to see it in motion, so either grab that camera or watch the video above for a taste of the Synthy goodness. Sorry the sound is so booty.
The video is a bit of behind-the-scenes stuff, but really there isn’t much in it you can’t see by just heading over to Photosynth.net and testing out a few synths. May I recommend my own? Or, of course, you could look at one of the many gorgeous panoramas made by people who work at some magazine called “National Geographic” — whatever that is. Read on for a basic explanation of what’s going on when it synths stuff.
So the whole process is automated: all you do is put a bunch of pictures in. They warned me that my first few synths might turn out a little weird, and that you have to be aware of some idiosyncrasies in the Synther’s code. What it does is look for similarities between pictures, and once it has found them, it uses depth cues and things like focal length and how far objects have moved between shots to create a 3D map of the area. The map itself (see it by holding down Control) is cool as hell and reminds me of voxel landscapes like in Outcast. I certainly had trouble making one of my apartment, though I’m getting better at it with every try.
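To give a flavor of what that 3D-map step involves, here’s a toy sketch — not Photosynth’s actual code, which builds on much more involved structure-from-motion research — of the one piece that’s easy to show in a few lines: once the same feature has been spotted in two photos taken by cameras whose positions are known, its 3D location can be triangulated. The camera numbers and the point below are all made up for illustration.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two images.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The 3D point (in homogeneous coords) is the null vector of A,
    # i.e. the singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # back to ordinary 3D coordinates

def project(P, X):
    """Pinhole projection of a 3D point into pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: same intrinsics, second one shifted sideways by one unit.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, -0.2, 4.0])  # a point 4 units in front of the cameras
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

Do that for thousands of matched features across dozens of photos and you get the sparse point cloud you see when you hold down Control.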
Some stock footage they gave me. Better resolution, but it doesn’t have that je ne sais quoi.
It uses that 3D map as an arena for a camera to move around in and look at your photos, projected flat onto where it thinks they fall in the map. It’s generally pretty good about this. Sometimes it connects things a little strangely or throws out shots that don’t have enough similarity with the other pictures, but if you’re careful you can make something work well. I recommend taking lots of pictures. Inside the synth, you can zoom in to the maximum detail available in your photograph; it doesn’t recompress anything, it’s just a different way of accessing that information. It streams the data using the Seadragon tech that made some of the Surface demos so compelling.
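The trick behind that smooth zooming is a multi-resolution tile pyramid: the viewer only ever fetches the handful of tiles covering what’s on screen at the current zoom, never the whole file. Here’s a rough sketch of the pyramid bookkeeping, assuming 256-pixel tiles and simple halving — the real Deep Zoom format differs in its details, so treat this as illustration only.

```python
import math

def pyramid_levels(width, height, tile=256):
    """Enumerate a Seadragon-style image pyramid: each level halves the
    previous one until the whole image fits in a single tile.
    Returns (level, w, h, tiles_x, tiles_y) tuples, finest level first."""
    levels = []
    w, h = width, height
    while True:
        levels.append((len(levels), w, h, math.ceil(w / tile), math.ceil(h / tile)))
        if w <= tile and h <= tile:
            break
        w, h = max(1, math.ceil(w / 2)), max(1, math.ceil(h / 2))
    return levels

# A 10-megapixel shot (3872x2592) only needs a few pyramid levels,
# and zooming just swaps in tiles from a finer level.
for lvl in pyramid_levels(3872, 2592):
    print(lvl)
```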
Once you create an account, you have 50 gigs of space for your synths, which is awesome, because mine were all at least 50MB each and I intend to make some probably 10 times that size. You do have to upload everything to their server, and everything is public for now, but they’re working on setting up private synths. I have a decent upstream, and making the synth never took longer than uploading all the pictures did, so processing time is nothing to worry about.
I can see this being used pretty widely as long as they can make people try it once or twice; it’s super easy and it’s really a great way to feel out a space remotely, for instance if you’re advertising a venue or apartment. It’s also a great educational tool, because it’s pretty intuitive and kids will enjoy zooming around a dinosaur skeleton or bug and getting up close to see the details. So what are you waiting for? Go try it out!