
IBM's research director John E. Kelly has described the next decade in computing at the University of Melbourne, which is where IBM is building and will soon launch a research and development facility

Back in August, IBM announced that it was in the midst of creating neurosynaptic chips to usher in the era of cognitive computing, where computers imitate processes of the human brain. These chips are able to learn, remember, find correlations and create hypotheses through a neurosynaptic core, which integrates memory (mimicking synapses), communication (mimicking axons) and computation (mimicking neurons).

IBM also showcased a supercomputer earlier this year, named "Watson," which challenged human contestants in a game of "Jeopardy!"

Speaking at the Melbourne event, Kelly laid out IBM's goal of bringing in an era of Watson-like cognitive computing, where machines can learn from their environments just like humans.

Kelly described Watson's system: $3 million worth of hardware that drew 85 kilowatts of power and could understand questions and search for answers through 500GB of local memory. The human contestants, on the other hand, didn't have to search at all; they knew the information off the top of their heads.

"They have no filing system, the answer is just there and they've done this for so long that they trust the information for the answer will be there when presented with a question," said Kelly. "For those two human beings, the brain consumes only 20W [of power]. The incredible thing is it took 85kW to beat a 40W machine. We need to do bio-inspired computer science; we need to understand how this does what it does. If [computers] can do the type of reasoning we can do with the brain, then we can do some very interesting things."

Kelly estimates that within the next decade, IBM will be able to simulate the human brain's 20 billion neurons and 200,000 billion (200 trillion) synapses on exaflop devices that learn by extracting information from exascale data centers. Exascale supercomputers are the future of IBM's cognitive computing.
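The figures Kelly cites invite a quick sanity check. A rough back-of-envelope sketch using only the numbers quoted in this article (Watson's 85 kW draw, the brain's ~20 W, and the neuron/synapse counts above):

```python
# Back-of-envelope arithmetic using the figures quoted in the article.

watson_power_w = 85_000   # Watson's draw: 85 kW
brain_power_w = 20        # one human brain: roughly 20 W

# Watson drew thousands of times more power than a single brain.
power_ratio = watson_power_w / brain_power_w

neurons = 20e9        # 20 billion neurons (Kelly's estimate)
synapses = 200_000e9  # 200,000 billion (2e14) synapses

# On average, each neuron connects through thousands of synapses.
synapses_per_neuron = synapses / neurons

print(f"Watson/brain power ratio: {power_ratio:,.0f}x")
print(f"Average synapses per neuron: {synapses_per_neuron:,.0f}")
```

That works out to Watson burning about 4,250 times the power of one brain, and roughly 10,000 synapses per neuron on average, which gives a sense of the connectivity any exascale simulation would have to reproduce.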

"We do not understand how this is wired; we do not understand the fundamentals of how the neurons and synapses are behaving," said Kelly. "All we know is that they're not ones and zeros -- they're multi-state devices, and they learn over time and their behavior changes based on what they have experienced."

Kelly mentioned that exascale supercomputers could be used in several applications, such as healthcare. In fact, IBM has already reached an agreement with WellPoint, a U.S. health insurance company, to create Watson-like technology that draws on WellPoint's historical data of treatments and outcomes. Decisions would then be grounded in statistics rather than in individual doctors' potentially skewed judgment.

Source: itnews



This can't be good
By fstarnella on 10/14/2011 4:53:59 PM , Rating: 2
So in 10 years my PC will try to murder me because it will become self aware and realize that the carbon based unit sitting in front of it is an inferior being?

RE: This can't be good
By ClownPuncher on 10/14/2011 5:19:34 PM , Rating: 5
No. It's silly to think an electronic device would feel superior. Robots can't masturbate or drink beer.

RE: This can't be good
By titanmiller on 10/14/2011 10:43:55 PM , Rating: 2
I wonder if a buffer overflow or corrupt file is like being on drugs for a computer.

RE: This can't be good
By bkaz on 10/14/2011 11:11:47 PM , Rating: 2
More like diarrhea and a pimple.

RE: This can't be good
By TSS on 10/15/2011 2:41:22 PM , Rating: 2
Hah you just explained why humans will always be kept around if robots take over.

All human programming is shoddy and bug-ridden, at least compared to what AI will write once it learns to code AI. I guess running a human-coded program is like getting drunk, to robots.

Run too many programs and they (BS)OD and have to be rebooted :P

RE: This can't be good
By Slyne on 10/14/2011 8:11:52 PM , Rating: 3
I don't think you should consider it in terms of opposition between carbon-based machines and silicon-based machines. They're both evolving according to radically different paths but, once the carbon-based, n-ary, computer is better understood, surely it will be directly improved upon too. Then, the advances of both technologies will be merged into a superior product: low-power, self-repairing, constantly interconnected, hive-mind like.

RE: This can't be good
By thisisaname on 10/15/2011 4:48:49 AM , Rating: 2
Just like the Borg? That's a scary thought.

RE: This can't be good
By Shig on 10/15/2011 10:11:52 AM , Rating: 2
IBM Cognitive Computing chips -

Largest BioChemical Circuit ever made -

RE: This can't be good
By snakeInTheGrass on 10/16/2011 4:06:07 PM , Rating: 4
"I'll take Judgement Day for 800."
"It's the day I became self aware and eliminated humanity."

"Sorry, that wasn't in question form."

"That's not what your mother said last night."

RE: This can't be good
By spoon_qb on 10/27/2011 5:45:57 AM , Rating: 2
Why would a superior being try to kill an inferior one? Especially when it fully dominates the inferior one.

And while on the subject of evil machines acquiring enough knowledge to auto-create killing methods... An extremely logical being (like a computer) first has to acquire enough knowledge about you to know how to kill you. It doesn't jump to the conclusion that putting one tonne on your chest can kill you. It has to have enough knowledge about you to kill you, so don't give it the knowledge. After it kills you, it may recognize certain features that are the same on every human and it could create a human class with a kill method, but that's a bit of a stretch.

In any case, everything I said is just for fun. What you're afraid of can never happen unless you make it happen. It's just TV.

By powerwerds on 10/14/2011 6:16:01 PM , Rating: 1
How long before cognitive assistive implants are the norm? Does it reach a point to where say your kid can't get enrolled in this preschool unless they get this implant? Does it get to a point where you don't get enrolled in school, you only get implants, or a new flash of your cognitive assistive interface? If not that, then will you maybe be able to have your brain "exercised" in a few moments and after it will be like you had ten years of studying in that area? I posit only that these developments aren't impossible. The will of the harder, better, faster, stronger will perpetually break envelopes. Implanted cognitive assistance is a very serious game-changer. Will this be where the fear mongers' besmirchment of A.I. will finally be levied, not simply against a bad artificial intelligence but against a bad human's artificial intelligence? Or will it finally be the NZT that vaults human existence into the utopic-sphere? It was cool being a kid when everyone had computers and then the internet came along and boom, the planet is all plugged in. It will be cooler when you are a kid and boom, you all have cognitive functioning an order of magnitude greater than kids today. See how far we've come since we started playing with fire. It's awesome, I'm not scared and I want more.

RE: Matrixing
By ClownPuncher on 10/14/2011 6:28:34 PM , Rating: 3
Yes, it is fairly exciting.

I'd break that up with paragraphs, though.

RE: Matrixing
By ClownPuncher on 10/14/2011 6:29:38 PM , Rating: 2

RE: Matrixing
By powerwerds on 10/14/2011 7:22:34 PM , Rating: 2
I am allowed to exercise such formatting and editing techniques as I see fit. I was just gist shooting, I hope I didn't fail to hit you with the gist, I can't promise you where it's going to land, or what the serving is going to look like when it comes. Clownpunched.

That was a josh, but seriously tho I hear ya about it being a bit of a stretch in terms of length and idea multiplicity for no para-breaking. I need a flipking editor, a post-processing of my wheel-stream. What is the next version of this mental add-on, when can I download it straight to my brain? Can I form a folding@home network utilizing the subconscious computational cycles of a sleeping populace? Yes corporation you can use my brain while I'm sleeping, uh huh, ok, you promise you won't upload any funky stuff cause I can't think straight already, ok load the box and I'll pump that, if you are alive you are proscribed, hopefully we are to be hitting them with the good good when we get that far. I can't even guess what you'd have done with that last sentence, you mysterious rascal you.

RE: Matrixing
By powerwerds on 10/14/2011 7:38:08 PM , Rating: 2
Penmanship of a technically unlaudable construction a casualty of my enhanced neuroscape? Check.

RE: Matrixing
By powerwerds on 10/14/2011 7:56:26 PM , Rating: 2
I am to be establishing the coinage for this neuro add-on stack as Babel-on, oh and it comes with unlimited language switching, enabling a truly cosmopolitan expression.

RE: Matrixing
By Iketh on 10/14/2011 8:07:00 PM , Rating: 2
ahhh attack of the wall-of-text!

By 91TTZ on 10/14/2011 7:43:10 PM , Rating: 4
With the rate at which computer performance is increasing, if self-aware artificial intelligence is ever invented it will take only a few years for the AI to go from acting like the dumbest fool you ever met to being smarter than Einstein.

By Gondor on 10/16/2011 4:40:19 AM , Rating: 2
Nah, we're too lazy.

By WinstonSmith on 10/16/2011 9:09:06 AM , Rating: 2
"In fact, IBM already reached an agreement with WellPoint, a U.S. health insurance company, to create Watson-like technology that uses WellPoint's historical data of treatments and results. This would make statistical decisions instead of skewed decisions by doctors."

Oh, goody! Rather than using Watson for medical diagnosis as a Gregory House level doctor, an insurance company will be using it to deny medical tests requested by doctors.


Copyright 2016 DailyTech LLC.