By Greg Sikes
If you have watched TV in the last two years, or walked through an airport, you have likely seen an IBM Smarter Planet advertisement. At the core of this message is the idea that the world is becoming more Instrumented, Interconnected, and Intelligent. Just take a look around your house, or at the communication device you regularly clutch in your hands, and you will see evidence of the first two points. I looked up the word “intelligent” in the dictionary sitting on my desk; included in the definition was the text “the ability to learn or understand”.

How does a device earn the label “smart”? Aside from good marketing intended to increase sales, smart products hold the promise of optimizing performance based on behavior patterns. An example might be an energy consumption monitor that shifts appliance usage into the lowest-cost cycle within a given time frame.
How do products “understand”? Fundamentally, they process information, or data. At the most basic level, structured data provides a consistent, predictable source of information that can be used to measure, understand, and react to various situations. An example might be the buoys located in the world’s oceans that regularly measure and communicate wave height and water temperature. Unstructured data presents more of a challenge. Natural language is a great example of unstructured information; idiomatic phrases (“the ball is in your court”) and slang terms (“that’s cool”) present unique challenges when processing information.
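The contrast can be made concrete with a small sketch. This is hypothetical illustration code, not anything from IBM: the buoy record, the idiom table, and the `interpret` helper are all invented for the example.

```python
# Structured data: a buoy reading with fixed, predictable fields.
# A simple rule over a known field is all the "understanding" required.
buoy_reading = {"buoy_id": "46042", "wave_height_m": 2.1, "water_temp_c": 14.3}
high_seas_alert = buoy_reading["wave_height_m"] > 4.0  # False for this reading

# Unstructured data: an idiom means something its individual words do not.
# A tiny lookup table stands in for the far harder general problem.
idioms = {
    "the ball is in your court": "it is your turn to act",
    "that's cool": "that is impressive",
}

def interpret(text):
    """Return the idiomatic meaning if known; otherwise fall back to a literal reading."""
    return idioms.get(text.lower(), "literal reading: " + text)

print(interpret("The ball is in your court"))  # -> it is your turn to act
print(interpret("The ball is red"))            # -> literal reading: The ball is red
```

The point of the sketch: the structured case reduces to a one-line comparison, while the unstructured case needs knowledge that is nowhere in the words themselves.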
Let’s pick a simple example. Imagine a team of 10 medical doctors, each talking individually to your average teenager about an ailment. Would they all arrive at the same diagnosis if the symptoms were not as obvious as a cut finger? And if they did not, would they be influenced by how the information was delivered? Ideally never, but realistically sometimes. Imagine if the words from a patient could be reviewed by a neutral diagnostic assistant, and a set of sensible alternatives, with associated probabilities, could be presented to the medical professional.
In 2011 an IBM computer named Watson competed on Jeopardy. Watson had to succeed in an environment where the human brain excels: processing unstructured English text, including puns, turns of phrase, and indirect references, and then selecting the most likely answer from one or more alternatives, all in approximately 3 seconds. Watson correctly answered, “What is ‘chess’?” to the clue “Watson’s ancestor defeated the world champion at this in 1997, also a Broadway musical.” Did Watson get every question right? Well, the answer “What is Toronto?????” to the Final Jeopardy clue “US Cities: Its largest airport is named for a World War II hero; its second largest for a World War II battle” should answer that question (Chicago was the correct answer).
A key component of Watson is a capability called “deep analytics”. Fundamentally, it is the ability to process structured and unstructured data, sometimes in real time as it is received, and to piece together a conclusion that may not be obvious. Remembering previous situations and applying them to a current scenario is all part of being “intelligent” (humans do that every day).
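The “set of alternatives with associated probabilities” idea can be sketched in a few lines. This is a toy illustration, not Watson’s actual algorithm: the candidate answers, the per-source evidence scores, and the simple averaging in `confidence` are all invented assumptions.

```python
# Hypothetical evidence scores for candidate answers, each from several
# independent sources (e.g. text search, geography lookup, historical facts).
candidates = {
    "Chicago": [0.9, 0.8, 0.7],
    "Toronto": [0.6, 0.5, 0.2],
}

def confidence(scores):
    """Combine evidence scores into one confidence value (a simple average here)."""
    return sum(scores) / len(scores)

# Rank the alternatives so the most supported answer comes first.
ranked = sorted(candidates, key=lambda c: confidence(candidates[c]), reverse=True)
for name in ranked:
    print(name, round(confidence(candidates[name]), 2))
# -> Chicago 0.8
# -> Toronto 0.43
```

A real system would weigh hundreds of evidence sources and learn the weights from prior examples; the sketch only shows the shape of the idea, that alternatives are scored and presented rather than a single answer asserted.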
How can deep analytic systems such as Watson be leveraged in everyday life? Aside from the healthcare scenario discussed earlier, one can imagine others, such as in-field product service advice, or improved security through analysis of unstructured communications.
If you’d like to learn more about this subject, the Technology Council has an event scheduled for March 29th titled “Deep Analytics: IBM Watson and Jeopardy: Was it only a game?” For more information click here.