


Fermi supercomputer will be ten times more powerful than today's fastest supercomputer

NVIDIA has been at the forefront of the push to move high-performance computing from CPUs to GPUs in scientific and other areas of research. As it turns out, the GPU is a very effective tool for calculations historically handled by the CPU.
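For readers curious what "moving calculations to the GPU" looks like in practice, below is a minimal CUDA sketch of the usual offload pattern: copy input arrays to the GPU, launch a kernel that runs the same operation across thousands of threads in parallel, and copy the result back. This is an illustration only, not code from the article or from ORNL's applications; the kernel name and sizes are arbitrary, and error checking is omitted for brevity.

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) buffers.
    float *hA = (float *)malloc(bytes);
    float *hB = (float *)malloc(bytes);
    float *hC = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) {
        hA[i] = 1.0f;
        hB[i] = 2.0f;
    }

    // Allocate device (GPU) buffers and copy the inputs over.
    float *dA, *dB, *dC;
    cudaMalloc(&dA, bytes);
    cudaMalloc(&dB, bytes);
    cudaMalloc(&dC, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(dA, dB, dC, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", hC[0]);

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    free(hA); free(hB); free(hC);
    return 0;
}

The same copy-launch-copy pattern, whether written by hand or generated through CUDA libraries and higher-level tools, is how scientific applications spread their inner loops across the hundreds of cores on a GPU.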

NVIDIA announced its new Fermi architecture at its GPU Technology Conference recently. The new architecture was designed from the ground up to enable a new level of supercomputing using GPUs rather than CPUs. At the conference, Oak Ridge National Laboratory (ORNL) associate lab director for Computing and Computational Sciences, Jeff Nichols, announced that ORNL would be building a next generation supercomputer using the Fermi architecture.

The new supercomputer is expected to be ten times faster than today's fastest supercomputer. Nichols said that Fermi would enable substantial scientific breakthroughs that would have been impossible without the technology.

Nichols said, "This would be the first co-processing architecture that Oak Ridge has deployed for open science, and we are extremely excited about the opportunities it creates to solve huge scientific challenges. With the help of NVIDIA technology, Oak Ridge proposes to create a computing platform that will deliver exascale computing within ten years."

ORNL also announced at the conference that it would create a Hybrid Multicore Consortium with the goal of working with developers of major scientific codes to prepare the applications for the next generation of supercomputers using GPUs.

“The first two generations of the CUDA GPU architecture enabled NVIDIA to make real in-roads into the scientific computing space, delivering dramatic performance increases across a broad spectrum of applications,” said Bill Dally, chief scientist at NVIDIA. “The ‘Fermi’ architecture is a true engine of science and with the support of national research facilities such as ORNL, the possibilities are endless.”



Comments



Crysis
By Bill0151 on 10/2/2009 10:59:44 AM , Rating: -1
I wonder if this will manage to run Crysis at a decent frame rate? It will probably be the only computer that does...




RE: Crysis
By 3minence on 10/2/2009 12:29:55 PM , Rating: 2
Actually, I was kinda thinking something similar.

It appears nVidia is designing their GPU to be a General Processing Unit rather than a Graphical Processing Unit. They are heading into the research realm. How long before they stop fighting ATI and (soon) Intel for the graphics crown and focus solely on research computing? How many compromises, with their associated loss of graphics performance, did they make in order to make the GT300 more dual-purpose? Maybe they made none, but you rarely get something for nothing.

It's an interesting tact they're taking. I shall enjoy watching where it leads.


RE: Crysis
By tviceman on 10/2/2009 1:26:57 PM , Rating: 2
There is no doubt that with this architecture they are expanding their horizons and going after fairly untapped, high-potential sources of revenue. However, I don't think they're going to abandon retail consumer GPUs - there is still a huge market for them, and nvidia will likely want a piece of the next Xbox, PSN, or even Wii.

Without new chipset or x86 licenses, nvidia was backing into a dead end and, as such, has adapted to its situation and will likely strike gold in the scientific community.


RE: Crysis
By foolsgambit11 on 10/2/2009 6:43:00 PM , Rating: 2
I hate to be that guy, but it's 'tack', not 'tact'. It's a sailing term denoting a course sailed with respect to the wind (among other things, like the action of changing course by bringing the bow through the eye of the wind, or the forward-lower corner of a fore-and-aft sail).


RE: Crysis
By GaryJohnson on 10/3/2009 12:24:21 AM , Rating: 1
I think that "tact" was actually a short form or misspelling of "tactic", instead of whatever the hell it is you're talking about.


RE: Crysis
By Mclendo06 on 10/4/2009 2:37:42 AM , Rating: 2
Yeah, tack doesn't jive with the meaning he had in his post.

Anyone who got the subtle word play, email me and I'll ship you a cookie :-)


RE: Crysis
By luke84 on 10/2/2009 4:27:56 PM , Rating: 4
Graphics horsepower isn't the problem, Crysis's bad coding is.


RE: Crysis
By oab on 10/2/2009 10:09:06 PM , Rating: 2
The Radeon HD 5870 can run Crysis at 'enthusiast' settings at high resolution (above 1680x1050) without breaking a sweat.

The meme is the problem; it is out of date.


RE: Crysis
By Lakku on 10/4/2009 12:37:26 AM , Rating: 2
So I suppose 26.6 FPS is not breaking a sweat? Yes, it's the first card to have enthusiast settings across the board, but it still struggles with the game. It's as much to do with bad coding as with the hardware it's run on, but the 5870 still has trouble, albeit less of it, running Crysis.


RE: Crysis
By luke84 on 10/4/2009 10:35:07 AM , Rating: 2
Put two of those puppies in Crossfire and you'll be able to run EVERYTHING, Crysis included. Just don't know up to what resolution... although the 2 GB variants will help performance a lot at high resolutions.


RE: Crysis
By Azsen on 10/4/2009 8:34:39 PM , Rating: 2
The point is you shouldn't need two cards in Crossfire/SLI to get good framerates and graphics in Crysis.

Compare the Source engine to the one in Crysis: you can play Half-Life 2 with full settings and HDR, it looks and plays better than Crysis, and you still get great framerates.

Crysis is poorly coded bloatware like Vista.


RE: Crysis
By Sunner on 10/5/2009 4:08:19 AM , Rating: 1
If you seriously think Half Life 2 looks anywhere near as good as Crysis you really need to lay off the shitty moonshine.


"Spreading the rumors, it's very easy because the people who write about Apple want that story, and you can claim its credible because you spoke to someone at Apple." -- Investment guru Jim Cramer

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki