
Video games and immune system response modeling? It's all in the hardware.

Figuring out what happens when foreign bodies invade the human body is serious business. Confined mostly to the realm of test tubes, complex reactions like the human immune system's response to a tuberculosis infection have been difficult and time-consuming to study, since willing volunteers for this kind of research are hard to come by.

Computer modeling has made these trial-and-error studies easier by replicating a system's response to an introduced external factor: program the behavior of the system's components and hit the start button.

Agent-based modeling may take this kind of systems study to the next level, however, and a team of computer scientists at Michigan Technological University (MTU) is making it happen without the use of super-powerful, super-expensive supercomputers. Under the direction of Roshan D'Souza, computer science students at MTU have developed agent-based modeling software that harnesses the power of modern graphics processing units.
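The article doesn't show any of MTU's code, but the core idea behind GPU-accelerated agent-based modeling is to map one GPU thread to one agent, so a consumer graphics card can update many thousands of agents in parallel every time step. Below is a minimal CUDA sketch of that mapping; the Agent struct, the random-walk rule, and the kernel name are illustrative assumptions, not taken from the MTU software.

#include <cuda_runtime.h>
#include <curand_kernel.h>

// One thread per agent: the mapping that lets a consumer GPU advance
// thousands of agents simultaneously. (Illustrative sketch only.)
struct Agent {
    float x, y;   // position in the simulated environment
    int   state;  // application-defined behavioral state
};

__global__ void stepAgents(Agent *agents, curandState *rng, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // A trivial random-walk rule stands in for whatever behavior
    // the model defines for each agent.
    agents[i].x += curand_uniform(&rng[i]) - 0.5f;
    agents[i].y += curand_uniform(&rng[i]) - 0.5f;
}

Seeding the curandState array once with curand_init at startup and then launching stepAgents in a loop gives the basic simulate-and-observe cycle the article describes.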

"With a $1,400 desktop, we can beat a computing cluster. We are effectively democratizing supercomputing and putting these powerful tools into the hands of any researcher. Every time I present this research, I make it a point to thank the millions of video gamers who have inadvertently made this possible,” says D'Souza of the project.

Agent-based modeling is a very powerful technique in which many different components, factors and behaviors can be programmed and then let loose in a simulated environment. The outcomes of large systems are often unpredictable and surprising.

MTU's software, which was written by computer science student Mikola Lysenko, is not limited to small systems with few factors, such as the tuberculosis example above. Fellow computer science student Ryan Richards explained, "We can do very complex ecosystems right now. If you're looking at epidemiology, we could easily simulate an epidemic in the US, Canada and Mexico."
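Richards doesn't describe how such an epidemic simulation works internally, but a common agent-based formulation is the SIR model, in which each agent is susceptible, infected, or recovered, and infection spreads through contacts. A hedged sketch of what one GPU time step might look like under that assumption (the state encoding, contact-list layout, and kernel are hypothetical, not MTU's implementation):

enum State { SUSCEPTIBLE = 0, INFECTED = 1, RECOVERED = 2 };

// Hypothetical SIR-style infection step, double-buffered so every
// thread reads last step's states and writes only its own next state.
__global__ void infectStep(const int *stateIn, int *stateOut,
                           const int *contacts, int contactsPerAgent,
                           const float *rand01, float pTransmit, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    int s = stateIn[i];
    if (s == SUSCEPTIBLE) {
        for (int c = 0; c < contactsPerAgent; ++c) {
            int other = contacts[i * contactsPerAgent + c];
            if (stateIn[other] == INFECTED &&
                rand01[i * contactsPerAgent + c] < pTransmit) {
                s = INFECTED;  // caught it from a contact this step
                break;
            }
        }
    }
    stateOut[i] = s;
}

Because each of the millions of agents is just a few integers, a continent-scale population fits comfortably in the parallel, data-per-thread style a GPU rewards.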

It seems the days of supercomputers for complex simulations may be numbered, with the relatively inexpensive gamer's video card quickly turning them into an endangered species. Perhaps the next step for these modeling projects could be something similar to Stanford's very successful Folding@Home, using clients' GPUs to power the way to understanding new and complex systems.



Comments



memory
By winterspan on 9/23/2008 1:17:21 AM, Rating: 2
"It seems the days of supercomputers and complex simulations may be numbered..."

This is a very short-sighted comment. Only a certain subset of highly-parallel HPC applications can harness the unique capabilities of "GPGPU" processing on conventional graphics cards. Remember, even if a GTX 280-based stream computing card is rated at over 1 teraflop, it still only has 4GB of memory or whatever the latest is. Conventional CPU-array supercomputers don't just have massive number-crunching ability, they have MASSIVE combined pools of memory and memory bandwidth that blow away anything a GPU can do.




RE: memory
By CCRATA on 9/23/2008 2:04:54 AM, Rating: 2
If the work can be done on a large cluster, it can be split up for a GPU. They have obviously already done the work to parallelize the algorithm. Also, you aren't just limited to the RAM on the card; you can load subsets of the data you are working on in and out of memory. It would not be that expensive to build an SLI GTX 280 machine with 16GB of system RAM when you compare it to the cost of the cluster and infrastructure it can replace.
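As a rough illustration of the streaming CCRATA describes, a working set larger than the card's RAM can be processed in chunks that are copied in and out over the PCIe bus. A minimal CUDA sketch (the kernel, function names, and sizes are made up for illustration):

#include <cuda_runtime.h>

// Placeholder per-element work; stands in for the real simulation kernel.
__global__ void someKernel(float *d, size_t n)
{
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) d[i] *= 2.0f;
}

// Stream a host-side working set larger than the card's RAM through
// one device buffer, one chunk at a time.
void processLargeDataset(const float *hostIn, float *hostOut,
                         size_t total, size_t chunk)
{
    float *devBuf;
    cudaMalloc(&devBuf, chunk * sizeof(float));

    for (size_t off = 0; off < total; off += chunk) {
        size_t n = (total - off < chunk) ? (total - off) : chunk;

        cudaMemcpy(devBuf, hostIn + off, n * sizeof(float),
                   cudaMemcpyHostToDevice);
        someKernel<<<(unsigned)((n + 255) / 256), 256>>>(devBuf, n);
        cudaMemcpy(hostOut + off, devBuf, n * sizeof(float),
                   cudaMemcpyDeviceToHost);
    }
    cudaFree(devBuf);
}

With pinned host memory, cudaMemcpyAsync, and two buffers on separate streams, the transfers can overlap the kernel work, hiding much of the PCIe cost.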


RE: memory
By Radnor on 9/23/2008 5:43:57 AM, Rating: 2
Don't forget, in GPUs there is no northbridge or anything else delaying fetches to RAM; the memory controller is built in. And you can already put in several GPUs, each with its own RAM; system RAM will still be shared. In that vein, don't forget GDDR3 and GDDR5 are faster than their "generalist" counterparts.

As for sheer power, I advise you to search Google for FASTRA. It is a medical imaging computer that can be built for about 4,000 euros, versus CalcUA, which costs about 4 million. FASTRA loses, but not by much. Of course, the TCO and price differences are abysmal.


RE: memory
By Denithor on 9/23/2008 8:11:43 AM, Rating: 2
Meh... just get a blade server box with four Tesla cards. Those are what the server farms working for team WhoopAss over at the F@H project are using to crank out massive PPD (points per day).

http://www.nvidia.com/object/tesla_computing_solut...

Better yet, I believe up to three of those boxes can be connected to the same PC motherboard (TriSLI setup) for a total of 12 Tesla cards crunching per system.


"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki