Chas' Compilation

A compilation of information and links regarding assorted subjects: politics, religion, science, computers, health, movies, music... essentially whatever I'm reading about, working on or experiencing in life.

Thursday, February 17, 2011

"Watson" won. But did it really?


A database dishing up answers can be quick, but just how intelligent is it?

Computer finishes off human opponents on 'Jeopardy!'
(CNN) -- Start the "computers are conquering the world" jokes now. "Jeopardy!" master Ken Jennings already has.

The IBM supercomputer Watson won its second "Jeopardy!" game in Wednesday's edition of the TV show, completing a sweep of its two human opponents, including Jennings, who acknowledged mankind's trivia inferiority before the match was even over.

"I for one welcome our new computer overlords," Jennings wrote under his correct Final Jeopardy! solution, prompting laughter from the studio audience.

Watson -- despite being far from perfect -- was too far ahead in the two-game match to be caught. It beat Jennings and fellow "Jeopardy!" champion Brad Rutter, earning $41,413 for the day and $77,147 for the two-game total.

Jennings, who led for a good portion of the second game before succumbing to a late string of correct Watson answers, ended the game ($19,200) and match ($24,000) in second place.

The "IBM Challenge" match was spread over three days, with the first game taking two days so that host Alex Trebek could take time explaining what Watson is.

A massive machine represented at the studio by a tablet-like avatar, Watson was in development for years and has the processing power of 2,800 "powerful computers." IBM trumpets Watson as a machine that can rival a human's ability to answer questions posed in natural human language.

For the games, the computer -- stored in a separate building in New York -- received clues through digital texts and buzzed in against the two other contestants like any other player would. [...]

It made some mistakes, but not many. The example of a mistake cited in the article wasn't a question I would have been able to answer, either. Watson won a million dollars, which IBM will donate to charity.

It did so well that I doubt it has a future on Jeopardy; the winner would be a foregone conclusion. But that may have more to do with the question format of the show than any real intelligence on the part of the machine.

Is calling it "Artificial Intelligence" too much? That depends on how you define the phrase. IBM calls Watson a "Question Answering System". If you look at some of the videos on YouTube, you can see that it went through quite a bit of training before it was ready to compete on Jeopardy; it was prone to breakdowns where it would start getting everything wrong. Perhaps it's really more of a victory for natural language processing and database retrieval?

This article goes into more detail about Watson's weaknesses:

Why Watson's win doesn't make humanity obsolete -- yet
(CNN) -- Well humans, it's been a good ride, but after being eviscerated by IBM's supercomputer Watson on "Jeopardy!," it's probably time to pack up the truck and let the machines inherit the Earth.

Or is it?

Despite Watson's tremendous performance, the Final Jeopardy question at the end of Tuesday night's airing revealed the Achilles' heel that computer scientists have known all along: Watson doesn't really "think" anything, and it struggles with simple questions that most humans can answer without a second thought.

Most of the clues on the "Jeopardy" board mention proper nouns -- specific places, events, people, songs, books and so on, says Dr. Douglas Lenat, a machine learning pioneer, former Stanford professor of computer science and CEO of Cycorp, a company that develops semantic technologies.

"This gives the Watson algorithm a great deal of 'traction.' To us viewing the show, it's impressive if it correctly knows that Franz Schubert's birth date was January 31, 1797. But if that date had been part of the clue, could Watson correctly pick out [Schubert's] maternal grandmother's birth date from a list where only one of the dates was earlier than 1797?"

We could, because we understand that everyone is younger than their own mother and grandmother, but Watson is unable to understand this, Lenat explained.
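Just to illustrate Lenat's point (this is my own toy example, not anything from IBM or Watson), the common-sense rule itself is trivial to write down once a machine already "knows" it. The hard part, as he says, is the machine knowing that the rule applies:

```python
# Toy illustration of Lenat's example: everyone is born after their
# own grandmother. Given Schubert's birth year (1797) and a list of
# candidate dates, one logical constraint picks out the right answer.
# The list of candidate years here is made up for the example.

def grandmothers_birth_year(person_birth_year, candidates):
    """Return the only candidate year consistent with the rule
    'a grandmother is born before her grandchild'."""
    earlier = [y for y in candidates if y < person_birth_year]
    # In Lenat's example, exactly one date is earlier than 1797.
    return earlier[0] if len(earlier) == 1 else None

# Only 1770 precedes Schubert's 1797 birth, so it must be the answer.
print(grandmothers_birth_year(1797, [1770, 1812, 1850, 1901]))  # 1770
```

A human applies this kind of constraint without even noticing; Watson, matching text statistically, has no concept of "grandmother" to hang the rule on.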

At the end of the day, Watson is not really conceptualizing a clue's meaning. It simply number-crunches its way to the right answers by comparing vast amounts of data. This is why it dominates the "fill in the blank" knowledge clues (Aeolic, spoken in ancient times, was a dialect of this), but falters on some more "common sense" deductions.
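To give a rough feel for what "number-crunching its way to the right answers" means (again, a made-up toy example of mine, nothing like IBM's actual algorithms), here's the flavor of it: score each candidate answer by how much its supporting text overlaps with the clue, and take the best score. No understanding required:

```python
import re

# Toy sketch of statistical answer-scoring. The "evidence" passages
# below are invented for the example; a real system would draw on a
# huge corpus and many scoring methods, not simple word overlap.

EVIDENCE = {
    "Greek": "Aeolic, spoken in ancient Greece, was a dialect of Greek",
    "Latin": "Latin was the language of ancient Rome and its empire",
}

def tokens(text):
    # Lowercase words only, punctuation stripped.
    return set(re.findall(r"[a-z]+", text.lower()))

def best_candidate(clue, evidence):
    # Word overlap stands in for Watson's many parallel scoring passes.
    return max(evidence, key=lambda c: len(tokens(clue) & tokens(evidence[c])))

clue = "Aeolic, spoken in ancient times, was a dialect of this"
print(best_candidate(clue, EVIDENCE))  # Greek
```

This kind of matching is why the "fill in the blank" clues play to Watson's strength, while the grandmother question above defeats it.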

The biggest blunder was in the first game's Final Jeopardy round. [...]

It goes on with more examples of Watson's limitations, then describes how the technology could be applied as a useful tool.

What people call "Artificial Intelligence" (or A.I.) is really just programming that compares data and mimics human intelligence. Some programs can even "learn" in a limited capacity, but all lack the depth and subtlety of a real living, intelligent consciousness. Still, it is an interesting, budding technology that will continue to grow and find new uses, as tools and as entertainment.

I like the spin IBM put on the victory:

Humans win!
The challenge is over. Watson, Ken Jennings and Brad Rutter concluded their final round of Jeopardy! and the winner was… resoundingly, humankind. Watson’s advances in deep analytics and its ability to process unstructured data and interpret natural language will now be applied to humanity’s most vexing problems. If we can teach a computer to compete on Jeopardy! what could it mean for science, finance, healthcare and the future of society?

Watch the video and see how Watson has the potential to transform industries. [...]

The video is interesting. It shows how, for some questions, the humans were able to think of the answers more quickly than Watson could. Watson also got some wrong answers. And if it hadn't been lucky enough to get a "Daily Double" question, perhaps it would not have won the tournament. So it was really a bit of a close call.

I recommend watching the video. It shows that, while Watson may seem to formulate answers like its human competitors, it actually uses very different processes to arrive at them. Questions with multiple components can slow it down or stump it. Still, it's fascinating to see how it works, and it has all sorts of possibilities for future use. The way it sorts through data to find answers, combined with voice recognition and speech, makes it a tool with great potential. For Watson and the team that built it, this is only the beginning.


Also see:

Ultra HAL, your personal computer assistant

Ultra Hal: His "Second Life" is really his first one

I have a new favorite Sci-Fi AI: "GERTY"

When ALICE met Jabberwacky