Two new patent applications published by the USPTO today describe a couple of pieces of tech that both help make mobile devices smarter, albeit in very different ways. The first is a facial recognition system that can identify people and things using vector-based “faceprints” picked up from photos, and the second is a call waiting system that can provide inbound callers with a lot more than just a repeated tone to let them know they’re still on hold.
The image recognition patent bears a strong resemblance to the facial recognition system already built into iPhoto, which can identify and tag pictures in which friends and family appear after first learning what their faces look like. But the “Auto-recognition for noteworthy objects” patent isn’t about spotting your brother or wife; it’s intended to tag celebrities, landmarks and famous objects in order to serve up contextual links and information about those people and places.
To do this, it can either use locally stored photos or tap into pictures kept in the cloud, searching for faceprints that resemble the one it extracts from an original source image. Most of the patent describes these as representing actual people, but it also allows for the recognition of “iconic images.” Once a match is made, the system can group images of the same individual together. It can also add metadata to images to help improve hit rates, and even track people aging over the years. The exact reasons Apple would want to do this are unknown, but the patent does describe a system returning information and links about the subjects it identifies, including Facebook pages, Twitter profiles and iTunes Store links to related movies and music.
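The core idea here — comparing a query faceprint against a library of stored ones — is essentially nearest-neighbor search over vectors. The patent doesn’t spell out a distance measure, so here’s a minimal, hypothetical sketch using cosine similarity; the `match_faceprint` function, the threshold value and the toy three-dimensional “faceprints” are all illustrative assumptions (real face embeddings are high-dimensional):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two faceprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_faceprint(query, library, threshold=0.9):
    """Return labels of library entries whose faceprint resembles the query."""
    return [label for label, fp in library.items()
            if cosine_similarity(query, fp) >= threshold]

# Toy library: stand-ins for a celebrity and a landmark ("iconic image").
library = {
    "celebrity_a": [0.9, 0.1, 0.2],
    "landmark_b":  [0.1, 0.9, 0.3],
}
print(match_faceprint([0.88, 0.12, 0.21], library))  # → ['celebrity_a']
```

Once a match like this is made, attaching metadata (names, links, dates) to the grouped images is what would let later searches hit more often.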
This could be a building block for a visual search engine, perhaps intended as a complement or addition to Siri. If Apple is intent on building a more robust and complete virtual personal assistant over time, after all, it’ll need eyes as well as ears to really get the job done.
The second patent seems like something we could see implemented quite soon, and builds on the inbound-call handling features Apple introduced in iOS 6, such as responding with a pre-written text message. This time, the system would add a number of similar options to calls that come in while you’re already on the phone, providing a way to put callers on hold and give them an auto-response explaining your current situation. It would essentially be like having your own virtual office line on your device, complete with the ability to see how long you’ve had someone on hold and an option to send them to voicemail should a hold go on too long.
Users would be able to set up custom responses for their inbound callers to hear, selecting among them depending on the situation. Or they could ask a caller to speak a message, which the iPhone would transcribe using a speech-to-text engine in order to deliver a text message to the person receiving the call. Users could then text a return response, or select one of their pre-recorded custom messages. A user could even enter an estimated hold time to let their caller know how long they might be waiting.
An automated response feature could also use sensor cues from an iPhone’s accelerometer or GPS, as well as Calendar and Contacts data, to predetermine what message should be played in response to an incoming call. It could determine that you’re driving, for instance, or recognize that the caller is someone you were scheduled to have a conference call with, and deliver a response tailored to each situation.
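That kind of cue-driven selection amounts to a small decision policy. As a rough illustration — the function name, the rule ordering and the specific speed threshold are all assumptions, not anything the patent specifies — it could look like this:

```python
def choose_auto_response(speed_mph, in_meeting, caller_expected):
    """Pick a canned hold message from sensor and calendar cues.

    speed_mph       -- estimated from GPS/accelerometer data
    in_meeting      -- derived from a Calendar event in progress
    caller_expected -- True if Contacts/Calendar link this caller
                       to a scheduled call
    """
    if caller_expected:
        return "Running a few minutes behind for our call -- please hold on."
    if speed_mph > 10:
        return "I'm driving right now; I'll call you back shortly."
    if in_meeting:
        return "I'm in a meeting; I'll get back to you when it ends."
    return "I'm on another call; please hold."

# A scheduled caller gets the tailored message even if you're driving.
print(choose_auto_response(speed_mph=35, in_meeting=False,
                           caller_expected=True))
```

The ordering matters: a known, expected caller should arguably trump the generic driving or meeting rules, which is why that check comes first in this sketch.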
Both patents, like many recent Apple filings, are about making the smartphone smarter. Judging by Apple’s recent trajectory with software and services, a device that can recognize and process data on its own, then offer up better, more immediately useful actions, beats one that sits waiting for user input on every decision. As always, it’s worth noting that patent applications are seldom roadmaps of what will ship, but taken as a whole they should prove a good indicator of where Apple is focusing its R&D.