Tuesday, November 5, 2013
Posted by lelyholida
9:29 PM
Computers as we know them are close to reaching an inflection point—the next generation is in sight but not quite within our grasp. The trusty programmable machines that have proliferated since the 1940s will someday give way to cognitive systems that draw inferences from data in a way that mimics the human brain.
IBM treated the world to an early look at cognitive computing in February 2011, when the company pitted its Watson computing system against two former champions on TV’s Jeopardy! quiz show. Watson’s ability to search a factual database in response to questions, to determine a confidence level and, based on that level, to buzz in ahead of its competitors carried it to a convincing victory. The accomplishment will soon seem quaint when compared with next-generation cognitive systems. IBM has already increased Watson’s speed, shrunk its previously room-size dimensions and hooked it to vastly more data.
Last month two prominent medical facilities began testing the revamped Watson as a tool for data analysis and physician training. The University of Texas M. D. Anderson Cancer Center is using Watson to help doctors match patients with clinical trials, observe and fine-tune treatment plans, and assess risks as part of M. D. Anderson’s "Moon Shots" mission to eliminate cancer. Meanwhile, physicians at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University are hoping IBM’s technology can help them make more informed and accurate decisions faster, based on electronic medical records.
It’s a nice start, but still only the beginning, says John Kelly, an IBM senior vice president and director of IBM Research, one of the world’s largest commercial research organizations. In his recent book, Smart Machines: IBM’s Watson and the Era of Cognitive Computing, co-authored with IBM communications strategist Steve Hamm, Kelly notes that one important goal for cognitive computers, and for Watson in particular, is to help make sense of big data—the mountains of text, images, numbers and voice files that strain the limits of current computing technologies.
Scientific American spoke with Kelly about cognitive computing’s relative immaturity, its growing pains and how efforts to develop so-called neuromorphic computers, which mimic the human brain even more closely, could someday relegate Watson to little more than an early 21st-century artifact.