We all saw what Watson did last night, but now the question becomes: what now? So what that IBM created an artificial intelligence that was able to answer a few trivia questions? (That’s a massively simplistic way of looking at Watson, and it discounts the incredible capabilities of the human brain and the complexity of interpreting human language with off-the-shelf hardware and finely tuned software.) Is there an end-game here? Maybe “end-game” is too strong a phrase to use, but IBM has announced a deal with Nuance Communications to “explore, develop, and commercialize the Watson computing system’s advanced analytics capabilities in the healthcare industry.”
Part of the backlash against Watson I’ve seen stems from people misunderstanding some of the basics surrounding its operation. A caller told Opie (of Sirius XM’s Opie & Anthony Show) yesterday that Watson, upon receiving the question, would search online for the answer. Wrong! Watson gets the text, “reads” it like a person would read it, analyzes what the question is asking, consults his database, then buzzes in with an answer. It wasn’t always right—IBM had mentioned last night that Watson may have thought Toronto was a U.S. city (not counting Toronto, Ohio, but that would be like saying Rome is an American city because there’s a Rome, New York) because the Toronto Blue Jays play in Major League Baseball’s American League—but the fact that engineers were able to create what amounts to a fancy computer program to parse and interpret human language, with its many idiosyncrasies, to a degree that it was able to compete in a game and win speaks volumes about the talent of IBM’s engineers.
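To make the caller’s misunderstanding concrete, the offline flow described above—parse the clue, score candidates against a local knowledge base, and “buzz in” only when confident—can be caricatured in a few lines. This is a deliberately toy sketch, nothing like IBM’s actual DeepQA (which ran hundreds of parallel analysis and scoring components); the knowledge base, the word-overlap scoring, and the confidence threshold are all invented for illustration.

```python
# Toy sketch of an offline question-answering loop (invented data/scoring;
# not IBM's DeepQA). No network access: everything is in local storage.
KNOWLEDGE_BASE = {
    "largest planet in the solar system": "Jupiter",
    "author of moby-dick": "Herman Melville",
    "capital of ontario": "Toronto",
}

def answer(clue: str, threshold: float = 0.5):
    """Score each stored fact by word overlap with the clue;
    'buzz in' only if the best score clears the confidence threshold."""
    clue_words = set(clue.lower().split())
    best_answer, best_score = None, 0.0
    for fact, candidate in KNOWLEDGE_BASE.items():
        fact_words = set(fact.split())
        score = len(clue_words & fact_words) / len(fact_words)
        if score > best_score:
            best_answer, best_score = candidate, score
    # Below the threshold, a cautious player doesn't buzz at all.
    return best_answer if best_score >= threshold else None

print(answer("This is the largest planet in the solar system"))  # Jupiter
```

The threshold is the interesting part: Watson famously attached a confidence estimate to every candidate answer and only rang in when that confidence was high enough, which is exactly what an online search would not give you.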
IBM should hire someone like Bill Nye The Science Guy or Michio Kaku or Neil deGrasse Tyson to go around various talk shows (Leno, Conan, Good Morning America, Fox & Friends, etc.) to explain in terms the average person can understand why Watson is so much more than a “robot” that can answer a few trivia questions.
But back to this Nuance Communications news. The first commercial product isn’t expected for another 18-24 months, but it will combine the best of IBM’s tech (Deep Question Answering, Natural Language Processing, and Machine Learning) with that of Nuance (speech recognition and Clinical Language Understanding). The idea is to create something “for the diagnosis and treatment of patients that provide hospitals, physicians, and payers access to critical and timely information.”
How about this scenario: a patient is rushed to a hospital with an unidentified illness. A loved one explains the symptoms to Dr. Watson, and the good doctor searches terabytes of information in a split second to narrow down the possible ailments, along with possible courses of action. “This patient would appear to have X, please prepare a drip of Y to begin treatment.”
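The “narrowing down” step in that scenario is, at its core, a ranking problem: score candidate conditions against the reported symptoms and surface the best matches with their suggested next steps. The sketch below shows that shape only—the conditions, symptoms, and treatments are invented placeholders, not medical information, and a real clinical system would weigh evidence far more carefully than simple symptom overlap.

```python
# Toy sketch of diagnosis narrowing (all data invented; not medical advice).
CONDITIONS = {
    "condition_a": {"symptoms": {"fever", "rash", "headache"},
                    "treatment": "prepare a drip of Y"},
    "condition_b": {"symptoms": {"cough", "fever"},
                    "treatment": "administer Z"},
}

def rank_conditions(observed):
    """Rank candidate conditions by the fraction of their known
    symptoms that appear in the patient's reported symptoms."""
    scored = []
    for name, info in CONDITIONS.items():
        match = len(observed & info["symptoms"]) / len(info["symptoms"])
        scored.append((match, name, info["treatment"]))
    return sorted(scored, reverse=True)  # best match first

for score, name, treatment in rank_conditions({"fever", "rash"}):
    print(f"{name}: {score:.2f} -> {treatment}")
```

Even in this caricature, the output is a ranked shortlist with confidence scores rather than a single verdict, which matches how the IBM/Nuance pitch frames Watson: giving physicians “critical and timely information,” not replacing their judgment.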
That’s a completely invented scenario, but imagine something along those lines for an idea of where this technology could be going.