Last Friday, we posted about the negotiations between Apple and voice recognition company Nuance. While these talks have been going on for months, sources told us that it wasn’t yet entirely clear what the outcome would be — either a broad strategic partnership or, less likely, an acquisition. Things are looking a bit clearer now.
While digging into the relationship between the two companies, we had heard that Apple might already be using Nuance technology in its new (but not yet officially opened) massive data center in North Carolina. Since then, we’ve gotten multiple independent confirmations that this is indeed the case. And yes, this is said to be the keystone of a partnership that Apple is likely to announce with Nuance at WWDC next month.
More specifically, we’re hearing that Apple is running Nuance software — and possibly some of their hardware — in this new data center. Why? A few reasons. First, Apple will be able to process this voice information for iOS users faster. Second, it will prevent this data from going through third-party servers. And third, by running it on their own stack, Apple can build on top of the technology, and improve upon it as they see fit.
Obviously, Nuance, which owns the technology, would have to sign off on all of this. And we now believe that it has. Hence, the big-time partnership that should be formally announced soon.
All of this plays in nicely with our report that Siri would be a big part of the upcoming iOS 5 software. Apple is expected to show off iOS 5 at WWDC, but it isn’t expected to launch until this fall, as we previously reported. In order to work, Siri requires Nuance. When Apple bought Siri last year, it immediately began negotiations with Nuance to ensure that Siri could keep running. But it now appears that after months of tense negotiations, the two companies have decided it was best to take the relationship a step further.
Why would Apple go to all of this trouble to use a third-party technology? Probably because it’s well known that Nuance holds and strictly enforces a wide range of patents in this space. And there are only a handful of experts who understand this stuff well enough to build such a system from scratch — and our understanding is that those people now mainly work at Nuance or Google. In other words, even if it wanted to, Apple probably couldn’t build up such a system itself without being sued. And Nuance knows this, so it likely has no problem striking a huge deal that lets Apple use (and expand upon) its technology as Apple sees fit.
As for why Apple wouldn’t simply buy Nuance, again, it’s probably because that just would not be a very smart deal. Nuance is a public company with a market cap over $6 billion (the stock is surging today due to our report last week and has hit a 52-week high, adding some $500 million to that cap). This means that Apple would have to spend upwards of $10 billion to acquire the company, and immediately after it did, the value would plunge, since much of it is based on Nuance’s partnerships with companies not named Apple. It would be a straight-up strategic acquisition, and a very expensive one for Apple. Nuance, naturally, knows all of this.
So yes, it appears very likely at this point that a Nuance/Apple partnership is a big part of Apple’s cloud initiative. The next question is what this will mean, if anything, for developers at WWDC. Will they get access to this advanced voice recognition technology through iOS APIs right off the bat? Or will this technology mainly serve Apple’s own applications at first?
Oh, and one more thing: an anonymous tipster who correctly knew other information about the two companies tells us that Microsoft had been pushing Apple hard to use its own voice recognition technology in iOS. That attempt was rebuffed, apparently. It will be Nuance all the way.