


Video games and immune system response modeling? It's all in the hardware.

Figuring out what happens when foreign bodies invade the human body is serious business. Confined mostly to the realm of test tubes, complex reactions like the human immune system's response to a tuberculosis infection have been difficult and time-consuming to study, since willing volunteers for this kind of research are hard to come by.

Computer modeling has made these trial-and-error studies easier by replicating a system's response to an introduced external factor: simply program the behavior of the system's components and hit the start button.

Agent-based modeling may take this kind of systems study to the next level, however, and a team of computer scientists at Michigan Technological University is making it happen without the use of super-powerful, super-expensive supercomputers. Under the direction of Roshan D'Souza, computer science students at MTU have developed agent-based modeling software that harnesses the power of modern graphics processing units.

"With a $1,400 desktop, we can beat a computing cluster. We are effectively democratizing supercomputing and putting these powerful tools into the hands of any researcher. Every time I present this research, I make it a point to thank the millions of video gamers who have inadvertently made this possible,” says D'Souza of the project.

Agent-based modeling is a very powerful tool in which many different components, factors and behaviors can be programmed and then let loose in a simulated environment. The outcomes that emerge in large systems are often unpredictable and surprising.

MTU's software, which was created by computer science student Mikola Lysenko, is not limited to small systems with few factors, such as the tuberculosis example above. Ryan Richards, a fellow computer science student, explained, "We can do very complex ecosystems right now. If you're looking at epidemiology, we could easily simulate an epidemic in the US, Canada and Mexico."
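To give a sense of how such a simulation maps onto a graphics card, here is a minimal, hypothetical sketch of an agent-based epidemic model written in CUDA. It is not MTU's software; the agent states, parameters and kernel names below are invented for illustration, with each GPU thread updating a single agent per simulated day.

// Hypothetical agent-based SIR (susceptible/infected/recovered) sketch in CUDA.
// Not MTU's code: one GPU thread updates one agent per simulated day.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>
#include <curand_kernel.h>

enum { SUSCEPTIBLE = 0, INFECTED = 1, RECOVERED = 2 };

__global__ void initRng(curandState *rng, int n, unsigned long long seed) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) curand_init(seed, i, 0, &rng[i]);
}

__global__ void stepAgents(int *state, curandState *rng, int n,
                           float infectedFrac, float beta, float gamma) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    curandState local = rng[i];
    float r = curand_uniform(&local);                  // one random draw per agent
    if (state[i] == SUSCEPTIBLE && r < beta * infectedFrac)
        state[i] = INFECTED;                           // caught it from the infected pool
    else if (state[i] == INFECTED && r < gamma)
        state[i] = RECOVERED;                          // recovered (and immune)
    rng[i] = local;
}

int main() {
    const int n = 1 << 20;                             // one million agents
    const int threads = 256, blocks = (n + threads - 1) / threads;

    std::vector<int> host(n, SUSCEPTIBLE);
    for (int i = 0; i < 100; ++i) host[i] = INFECTED;  // seed 100 infections

    int *state;       cudaMalloc(&state, n * sizeof(int));
    curandState *rng; cudaMalloc(&rng, n * sizeof(curandState));
    cudaMemcpy(state, host.data(), n * sizeof(int), cudaMemcpyHostToDevice);
    initRng<<<blocks, threads>>>(rng, n, 42ULL);

    for (int day = 0; day < 120; ++day) {
        // Copy states back to count the infected fraction (a GPU reduction would be faster).
        cudaMemcpy(host.data(), state, n * sizeof(int), cudaMemcpyDeviceToHost);
        int infected = 0;
        for (int i = 0; i < n; ++i) infected += (host[i] == INFECTED);
        stepAgents<<<blocks, threads>>>(state, rng, n, (float)infected / n, 0.4f, 0.1f);
    }

    cudaMemcpy(host.data(), state, n * sizeof(int), cudaMemcpyDeviceToHost);
    int counts[3] = {0, 0, 0};
    for (int i = 0; i < n; ++i) counts[host[i]]++;
    printf("S=%d I=%d R=%d\n", counts[0], counts[1], counts[2]);

    cudaFree(state);
    cudaFree(rng);
    return 0;
}

The design point is that every agent update within a time step is independent of the others, which is exactly the kind of work a graphics card's hundreds of shader cores can chew through in parallel.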

It seems the days of needing a supercomputer for complex simulations may be numbered, with the relatively inexpensive gamer's video card quickly turning those machines into an endangered species. Perhaps the next step for modeling software like this is something similar to Stanford's very successful Folding@Home project: using clients' GPUs to power the way to understanding new and complex systems.



Comments



Quad-crossfire?
By silversound on 9/22/2008 6:01:01 PM , Rating: 1
quad-crossfire in the picture?
Is this Intel Skulltrail?




RE: Quad-crossfire?
By dajeepster on 9/22/2008 6:09:08 PM , Rating: 2
It's an AMD CrossFire setup.


RE: Quad-crossfire?
By Joz on 9/22/2008 7:02:00 PM , Rating: 5
It's a quad-socket VIA system using octa-core Nano CPUs and 16x Chrome 440GTX graphics cards!


RE: Quad-crossfire?
By Souka on 9/22/2008 7:15:53 PM , Rating: 4
Nothing beats a dual Monster 3D 12 MB PCI card setup... with a Matrox G200 running AGP.

;)


RE: Quad-crossfire?
By grath on 9/22/2008 9:06:01 PM , Rating: 4
... except for my two linked TI-85 graphing calculators running zShell in 8 kilopixel monochrome


RE: Quad-crossfire?
By SunAngel on 9/23/2008 1:54:00 AM , Rating: 2
My GMA X4500HD kills all of them put together!

Let's see you top that!


RE: Quad-crossfire?
By FaceMaster on 9/23/2008 2:20:50 AM , Rating: 2
Yeah but can it run Crysis?


RE: Quad-crossfire?
By StevoLincolnite on 9/23/2008 3:17:20 AM , Rating: 2
No it can't, but my S3 Virge can!


RE: Quad-crossfire?
By CommodoreVic20 on 9/23/2008 8:14:03 AM , Rating: 2
My Commodore VIC-20 beats the pants off all you guys!


RE: Quad-crossfire?
By greylica on 9/23/2008 9:55:01 AM , Rating: 2
Annnnd the question is:
Will it blend?


RE: Quad-crossfire?
By Souka on 9/23/2008 10:36:23 AM , Rating: 2
Actually.... just a few weeks ago I came across my TI-85 calculator from college, and we blended it in a blender that was in another box.

Wwiiizzzzzz... bang... bang... bang... crunch... crunch... then the blender broke. :)

Now, before any of ya make a complaint like "you could sell it," it had two problems. 1st: the batteries were left in and had completely leaked and corroded the guts. 2nd: it had my student ID inscribed deeply into the cover and body... back then your student ID was your SSN...

Cheers!


RE: Quad-crossfire?
By SunAngel on 9/23/2008 11:24:37 AM , Rating: 2
Dude, when were you in college, 1951? Excel has been the calculator of choice since Office 2000.


RE: Quad-crossfire?
By Suntan on 9/23/2008 11:51:39 AM , Rating: 2
Yes, of course, because nobody went to college between 1951 and 2000...

Good grief, some of the people we have to share this planet with.

-Suntan


RE: Quad-crossfire?
By Guerreiro29 on 9/23/2008 3:55:32 PM , Rating: 2
Excel? You must not have taken any math intensive courses in college if you were using Excel as a calculator. Sure it handles large data sets nicely, but everything it does with the data sets boils down to add, subtract, multiply or divide. No calculus, linear algebra, differential equations, or vector manipulations whatsoever. No thank you, I’ll take my trusty TI-89 over Excel any day, although I would prefer Matlab which actually is the calculator of choice for math, science, and engineering.


RE: Quad-crossfire?
By Fnoob on 9/23/2008 11:32:49 AM , Rating: 2
Crysis? Only in DX9 mode at 4 fps....

But it WILL run Solitaire at 64,000 fps!


RE: Quad-crossfire?
By Visk on 9/22/2008 8:47:35 PM , Rating: 2
That's a reference AMD 790FX motherboard with four Radeon 3850 video cards.

Here's a larger image of the thumbnail: http://i34.tinypic.com/hwzpn8.png


memory
By winterspan on 9/23/2008 1:17:21 AM , Rating: 2
"It seems the days of supercomputers and complex simulations may be numbered..."

This is a very short-sighted comment. Only a certain subset of highly-parallel HPC applications can harness the unique capabilities of "GPGPU" processing on conventional graphics cards. Remember, even if a GTX 280-based stream computing card is rated at over 1 teraflop, it still only has 4GB of memory or whatever the latest is. Conventional CPU-array supercomputers don't just have massive number-crunching ability, they have MASSIVE combined pools of memory and memory bandwidth that blow away anything a GPU can do.




RE: memory
By CCRATA on 9/23/2008 2:04:54 AM , Rating: 2
If the work can be done on a large cluster, it can be split up for a GPU; they have obviously already done the work to parallelize the algorithm. Also, you aren't just limited to the RAM on the card: you can load subsets of the data you are working on in and out of memory. It would not be that expensive to build an SLI GTX 280 system with 16 GB of system RAM when you compare it to the cost of the cluster and infrastructure it can replace.
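(For illustration only, here is a minimal, hypothetical sketch of that chunking idea in CUDA: the full data set lives in host RAM and is streamed through a fixed-size buffer on the card, one piece at a time. The kernel, buffer sizes and per-element work are invented, not taken from any of the projects discussed here.)

// Hypothetical sketch: process a data set larger than the card's memory by
// streaming fixed-size chunks between host RAM and the GPU.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void scale(float *d, size_t n, float k) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) d[i] *= k;                      // stand-in for real per-element work
}

int main() {
    const size_t total = 1ULL << 27;           // ~512 MB of floats held in host RAM
    const size_t chunk = 1ULL << 23;           // ~32 MB resident on the GPU at a time
    std::vector<float> host(total, 1.0f);

    float *dev;
    cudaMalloc(&dev, chunk * sizeof(float));   // single reusable device buffer

    for (size_t off = 0; off < total; off += chunk) {
        size_t n = (total - off < chunk) ? (total - off) : chunk;
        cudaMemcpy(dev, host.data() + off, n * sizeof(float), cudaMemcpyHostToDevice);
        scale<<<(unsigned)((n + 255) / 256), 256>>>(dev, n, 2.0f);
        cudaMemcpy(host.data() + off, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    }

    printf("first=%f last=%f\n", host.front(), host.back());
    cudaFree(dev);
    return 0;
}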


RE: memory
By Radnor on 9/23/2008 5:43:57 AM , Rating: 2
Don't forget, in a GPU there is no northbridge or anything else delaying fetches from RAM; the memory controller is built in. And you can already put in several GPUs, each with its own RAM, while system RAM is still shared. Also keep in mind that GDDR3 and GDDR5 are faster than their "generalist" counterparts.

As for sheer power, I advise you to Google "FASTRA." It is a medical imaging computer that can be built for about 4,000 euros, versus CalcUA, which costs about 4 million. FASTRA loses, but not by much. Of course, the TCO and price differences are enormous.


RE: memory
By Denithor on 9/23/2008 8:11:43 AM , Rating: 2
Meh...just get a blade server box with four Tesla cards. Those are what the server farms working for team WhoopAss over at the F@H project are using to crank out massive ppd.

http://www.nvidia.com/object/tesla_computing_solut...

Better yet, I believe up to three of those boxes can be connected to the same PC motherboard (TriSLI setup) for a total of 12 Tesla cards crunching per system.


Thanks to / Courtesy of
By phxfreddy on 9/22/2008 8:21:35 PM , Rating: 3
Get some new lines boys!




The way things are progressing...
By Oralen on 9/24/2008 9:51:19 AM , Rating: 2
One day, my watch will tell me when a microbe enters my body and at what rate the suckers multiply...

And cancel all my future appointments if I'm scheduled to die from that infection.

Bugger.




"Well, we didn't have anyone in line that got shot waiting for our system." -- Nintendo of America Vice President Perrin Kaplan

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki