Snapchat has a secret team possibly building a pair of smart glasses

Snapchat has been known for some bold (and perplexing) moves when it comes to content features, attempting to revamp event marketing through Stories, redefine journalism through Discover and rethink the selfie through Lenses. It has tackled software features, but now it may be looking to master hardware.

When you’re valued at $16 billion… why not?

A report from CNET details a crop of hires from major augmented reality groups like Microsoft’s HoloLens, PTC’s (formerly Qualcomm’s) Vuforia and eye-tracking tech maker Eyefluence, which point to Snapchat’s possible development of a pair of smart glasses.

The company showed interest in augmented reality during the last hype wave surrounding Google Glass. Snapchat acquired Vergence Labs, which produced a pair of glasses equipped with an embedded camera, for $15 million in March of 2014. The acquisition accompanied a larger $50 million acquisition of Scan.me, a QR code-scanning/creating technology that would later manifest itself in the company’s Snaptags feature.

Snapchat did not openly advertise these acquisitions at the time; details of them were leaked in the Sony data hack in December of 2014. It seemed an especially odd move then, but fast-forward a couple of years past the acquisitions and the company has added a few billion dollars to its valuation, as well as a variety of mainstay features like its Discover tab and geofilters.


Vergence Labs’ camera-equipped Epiphany Eyewear

The real question is whether Snapchat is using these hires to learn more about augmented reality tech and better adapt its platform for use on smart glasses, or whether the engineers are actually building glasses intended for a consumer release.

Sophisticated consumer gadgets might make zero sense for Snapchat at the moment, as its current consumer products include a beach towel, a deck of playing cards and a backpack. Admittedly, for most people, Snapchat itself makes zero sense, so this could perhaps be very on-brand for the company.

That being said, I’d suspect the company is focused more on finding ways to optimize AR technology for its apps. The hire from Eyefluence (who recently left that company) is particularly interesting, as the eye-tracking startup focuses heavily on engineering more intuitive interface gestures.

In an interview with Forbes in January 2014 (just a couple of months before the Vergence Labs purchase), Thomas Laffont, the managing director of Coatue (which led Snapchat’s $50 million Series C), spoke a bit about the platform’s relationship with interfaces on AR devices.

“People haven’t thought about use cases on new computing platforms,” said Laffont. “In one tap you take a photo, one more and you can share it. Imagine [the difficulty] trying to post on Instagram from a Google Glass device.”

Input for augmented reality is kind of a shit show right now, with most companies experimenting with a wide variety of hand-tracking, eye-tracking, motion-tracking and head-tracking tech to control their devices. If Snapchat can use its AR-minded team to make capturing and consuming content on the platform easier, there would be some obvious benefits to the company moving forward.