Artificial intelligence in the form of a new computer chip that takes up a fraction of the space and runs on low power could revolutionise computers in the near future
A computer chip is no different from Mr. Spock in Star Trek. Ask it a straight question and you’ll get a straight answer: a “yes” or a “no”. Now ask it a question that has no yes or no for an answer, only a “perhaps” or a “maybe”, and the chip will have its mundu in a twist.
Ben Vigoda isn’t having any of this anymore. He is an MIT scientist who’s been working to create a new type of chip for close to a decade now. On August 17, Lyric Semiconductor, a start-up that he co-founded in 2006, unveiled what it called the alternative to digital computing — a “probability processor” that can compute “quite often” or “very rarely” instead of just “yes” or “no”. Lyric claims its “LEC”, a specialised processor for flash memory, offers a 30-fold reduction in size and a 12-fold improvement in power efficiency over today’s digital counterparts, without sacrificing performance.
Vigoda believes that for many tasks in today’s world his processors can do a much better job than the stuff that Intel or AMD make. And before you jump, Lyric’s chips won’t be replacing Intel’s Core or Xeon or AMD’s Opteron or Phenom, but assisting them. In that sense Lyric’s processors will be like the graphics or audio-processing chips that help Intel’s or AMD’s chips draw that 3D image or play that MP3 song faster and better.
To understand why this is important, go back 25 years, to when graphics software programmes like Microsoft Paint first became commercially available.
The calculation-intensive work of generating the drawing was done, albeit at a slow pace, by the main chip. As consumers demanded that these drawings be rendered faster on the screen, all the “calculation logic” was put on a hardware platform, the video graphics adapter chip. And that’s how the main chip got a graphics assistant.
Dealing with Ifs and Buts
Similarly, Lyric’s processor is proof that many computer applications today need to deal with scenarios rather than conclusive events. And these scenarios keep changing as time goes by. Consider your email for instance. Was that email titled “Please collect your free gift” just spam or a mail from your friend? Take another example: movies. Presumably you have subscribed to one of those rent-a-DVD services. Now, what movie are you likely to order next from the online rental service?
All these require calculations based on probability and therefore nasty mathematics and logic. Such calculations require data about past behaviour, choices being made currently and the likelihood that the current choice will be consistent with past behaviour. For instance, it is quite possible that if you order Aamir Khan’s 3 Idiots on a rental Web site it will suggest Lagaan next, which is another Aamir Khan film. But you actually end up ordering Dead Poets Society. Now the algorithm changes its hypotheses. Maybe you like stories of charismatic teachers and students defying orthodoxy. It suggests Stand and Deliver. You look at the review and end up ordering that. Ah, goes the algorithm. It suggests Finding Forrester next. And the chain continues till you order Ghajini, which is when the algorithm goes into a spin again.
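The hypothesis-juggling described above is, at its core, Bayes’ rule applied over and over. A minimal sketch of that updating loop follows; the two hypotheses and every probability number in it are invented for illustration, not Lyric’s or any rental site’s actual algorithm.

```python
# Toy Bayesian updating over hypotheses about a viewer's taste.
# All priors and likelihoods below are made-up illustrative values.
hypotheses = {"likes Aamir Khan films": 0.5, "likes defiant-teacher stories": 0.5}

# P(this order | hypothesis): how likely each order is under each hypothesis.
likelihood = {
    "3 Idiots":           {"likes Aamir Khan films": 0.8, "likes defiant-teacher stories": 0.4},
    "Dead Poets Society": {"likes Aamir Khan films": 0.1, "likes defiant-teacher stories": 0.9},
}

def update(prior, order):
    """Bayes' rule: posterior is proportional to prior * likelihood, renormalised."""
    posterior = {h: p * likelihood[order][h] for h, p in prior.items()}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

beliefs = update(hypotheses, "3 Idiots")         # the Aamir Khan hypothesis gains ground
beliefs = update(beliefs, "Dead Poets Society")  # now the teacher-story hypothesis overtakes it
print(beliefs)
```

After the second order, the “defiant-teacher stories” hypothesis dominates, which is exactly the moment the article describes: the algorithm revising its guess about you and suggesting Stand and Deliver next.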
Today more and more computer applications need to deal with such logic which requires a change in prediction based on a change in behaviour. Some might be predicting traffic routes, others may be suggesting books. “Probabilistic programs are better for tasks that involve uncertainty, including where some information is uncertain or unknown. They can also reason in a way that is much more natural to humans, for instance: ‘I’ve observed the outputs of this program, can I now reason what its inputs might have been?’ That is how human reasoning works, for example when we look at an email’s subject and sender and reason if it was sent by a spammer,” says John Winn, a researcher with Microsoft Research in Cambridge, UK.
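Winn’s spam example can be sketched the same way: a short, hypothetical program that observes an email’s subject words (the program’s outputs) and reasons backwards to the hidden sender (its input). The word probabilities below are invented for illustration.

```python
# Toy "reason from outputs back to inputs" example: was this email sent by
# a spammer or a friend, given its subject words? All numbers are made up.
P_SPAMMER = 0.4  # prior belief that any given email is spam

# P(word appears in subject | sender type), assumed illustrative values
word_given_spam   = {"free": 0.7, "gift": 0.6, "collect": 0.5}
word_given_friend = {"free": 0.05, "gift": 0.2, "collect": 0.1}

def prob_spam(subject_words):
    """Posterior P(spammer | words) via Bayes' rule, assuming word independence."""
    p_spam, p_friend = P_SPAMMER, 1 - P_SPAMMER
    for w in subject_words:
        p_spam   *= word_given_spam.get(w, 0.5)
        p_friend *= word_given_friend.get(w, 0.5)
    return p_spam / (p_spam + p_friend)

print(prob_spam(["collect", "free", "gift"]))  # close to 1: almost certainly spam
```

With no evidence at all, the function just returns the prior; each observed word then tilts the belief one way or the other, which is the “uncertain inputs” reasoning Winn describes.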
Since more people depend on such calculations and they need it fast, the time has come for this logic to migrate to a hardware platform, just like the graphics processor. So Intel and AMD now need a “probability calculator” assistant.
Reading Between the Lines
This makes economic sense as well. To do just basic probability calculations, a conventional microprocessor ends up using almost 500 “transistors”, the fundamental building blocks of a chip. Lyric’s chip uses just a fraction of that. This means a smaller chip size and less power consumption. That’s a real help for the main microchip, whose power appetite has only grown: Intel’s Pentium 4 consumed between 60 and 80 watts, compared with the 30-40 watts of the Pentium III.
“The core of probabilistic computing is algorithms. The reason you’re seeing newer hardware and software in this space is because they can exploit over 30 years of work in algorithms that can solve diverse kinds of problems in very efficient ways,” says Prof. Manindra Agrawal, the head of the computer sciences department at IIT Kanpur.
Infographic: Hemal Sheth
(This story appears in the 24 September, 2010 issue of Forbes India.)