
Fermi supercomputer will be ten times more powerful than today's fastest supercomputer

NVIDIA has been at the forefront of the push to move high-performance computing from CPUs to GPUs in scientific and other areas of research. As it turns out, the GPU is a very effective tool for running calculations historically run on the CPU.
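The workloads the article alludes to are data-parallel: the same arithmetic applied independently to every element of a large array. A minimal sketch in Python, with NumPy standing in for the GPU's parallel hardware (the SAXPY example and function names are illustrative, not NVIDIA's API):

```python
import numpy as np

def saxpy_scalar(a, x, y):
    """CPU-style loop: one multiply-add per iteration, in order."""
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vector(a, x, y):
    """Data-parallel form: every element is computed independently,
    which is the shape of work a GPU maps onto thousands of threads."""
    return a * x + y

x = np.arange(4, dtype=np.float32)   # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)
print(saxpy_vector(2.0, x, y))       # -> [1. 3. 5. 7.]
```

On an actual GPU, the vectorized form is what CUDA spreads across concurrent threads, one element per thread; the scalar loop is what a conventional CPU executes.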

NVIDIA announced its new Fermi architecture at its GPU Technology Conference recently. The new architecture was designed from the ground up to enable a new level of supercomputing using GPUs rather than CPUs. At the conference, Oak Ridge National Laboratory (ORNL) associate lab director for Computing and Computational Sciences, Jeff Nichols, announced that ORNL would be building a next generation supercomputer using the Fermi architecture.

The new supercomputer is expected to be ten times faster than today's fastest supercomputer. Nichols said that Fermi would enable substantial scientific breakthroughs that would have been impossible without the technology.

Nichols said, "This would be the first co-processing architecture that Oak Ridge has deployed for open science, and we are extremely excited about the opportunities it creates to solve huge scientific challenges. With the help of NVIDIA technology, Oak Ridge proposes to create a computing platform that will deliver exascale computing within ten years."

ORNL also announced at the conference that it would create a Hybrid Multicore Consortium with the goal of working with developers of major scientific codes to prepare the applications for the next generation of supercomputers using GPUs.

“The first two generations of the CUDA GPU architecture enabled NVIDIA to make real in-roads into the scientific computing space, delivering dramatic performance increases across a broad spectrum of applications,” said Bill Dally, chief scientist at NVIDIA. “The ‘Fermi’ architecture is a true engine of science and with the support of national research facilities such as ORNL, the possibilities are endless.”

Comments

bring on the supercomptuers
By Michael86 on 10/2/2009 10:47:20 AM , Rating: 5
i was wondering when these "supercomptuers" i keep hearing about will start to replace our current supercomputers

RE: bring on the supercomptuers
By Jaybus on 10/2/2009 12:36:07 PM , Rating: 5
They already have, and have been doing so since the beginning. Many if not most of the features of our CPUs and GPUs were pioneered in mainframes and supercomputers years or decades ago. For example, hardware-assisted virtualization was first introduced on the IBM System/370 in 1972. The Texas Instruments Advanced Scientific Computer, built in the late 1960s, had multi-core processors with dedicated vector processing instructions, or in other words "stream processors".

RE: bring on the supercomptuers
By teldar on 10/3/2009 6:35:26 PM , Rating: 2
EPIC FAIL, dude.
Reread his post.
NOT Computer.
Please try again.

Just like it says in the article.

RE: bring on the supercomptuers
By Camikazi on 10/2/2009 2:37:51 PM , Rating: 2
By the time we get supercomputers with those specs there will be supercomputers with 100 times the speed and ours won't seem so super anymore.

RE: bring on the supercomptuers
By luke84 on 10/2/2009 4:33:30 PM , Rating: 3
It's a lot like women. When you get a sexy 25 year old wife, it's "super". 50 years later when she's 75, it's not that "super" anymore, she's just a plain ol' Pentium 4.

RE: bring on the supercomptuers
By jconan on 10/3/2009 2:41:25 AM , Rating: 2
that's because "women have expiration dates" quote from Bleach's Haineko.

RE: bring on the supercomptuers
By Camikazi on 10/3/2009 9:31:27 PM , Rating: 2
BLEACH!!! WOOO!!! and I wouldn't mind playing with that little kitty Haineko :)

A glimpse of the future?
By captchaos2 on 10/2/2009 10:46:02 AM , Rating: 2
This is good news for nvidia. When can I expect to buy an Nvidia processor for an Nvidia mobo with a triple SLI vid card setup?

RE: A glimpse of the future?
By Jedi2155 on 10/2/2009 10:53:56 AM , Rating: 3
I don't believe nVidia will venture into the consumer-level desktop CPU space anytime soon, as the GPU is still far too different to run the vast number of x86 applications available.

Their research into this field is probably driven mainly by their fear of Larrabee; however, it seems it's Intel/AMD who should be more afraid of nVidia as it eats into their high-end CPU sales to government.


RE: A glimpse of the future?
By Yojimbo on 10/2/2009 11:21:20 AM , Rating: 2
Their research into this field is probably driven mainly by their fear of Larrabee; however, it seems it's Intel/AMD who should be more afraid of nVidia as it eats into their high-end CPU sales to government.

Their research into the area has nothing to do with Larrabee. It started years before Intel was talking about Larrabee. I remember seeing a paper by nvidia and some university about GPU computing at least 5 years ago. If anything, Larrabee is a response to Intel's fears about nvidia's research in this area.

RE: A glimpse of the future?
By F41TH00 on 10/2/2009 1:24:58 PM , Rating: 2
You are correct. Nvidia started researching what Intel now calls Larrabee approximately 5-7 yrs ago. It's known as CUDA.

RE: A glimpse of the future?
By habibo on 10/4/2009 1:42:01 AM , Rating: 2
CUDA was made public only 3 years ago - but Larrabee is most certainly a response to CUDA and not the other way around.

RE: A glimpse of the future?
By nafhan on 10/2/2009 11:53:00 AM , Rating: 2
If the Fermi architecture is Turing complete, they could run an x86 emulator on it... :)

Seriously though, x86 isn't the only game in town. My guess is they'll probably continue to build on their relationship with ARM by improving and evolving Tegra. Keep in mind that Google and Apple both have OSes that run on ARM.

RE: A glimpse of the future?
By MatthiasF on 10/2/2009 1:48:22 PM , Rating: 2
Don't forget about MIPS! They still have some very nice products that could be used in these situations.

RE: A glimpse of the future?
By surt on 10/4/2009 1:32:29 PM , Rating: 2
An x86 emulator on Fermi would be painfully slow as a CPU, because unlike a GPU, a CPU often wants to do something other than multiply floating-point numbers.
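surt's point can be made concrete: emulating a CPU is a fetch-decode-execute loop dominated by data-dependent branches, the opposite of the uniform arithmetic a GPU is built for. A toy sketch (the three-instruction bytecode is hypothetical, purely for illustration):

```python
def run(program, acc=0):
    """Toy bytecode interpreter: every step branches on the opcode,
    and each instruction depends on the previous one -- serial,
    branchy work that maps poorly onto a GPU's wide SIMD lanes."""
    pc = 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "JNZ":          # jump to `arg` if accumulator non-zero
            if acc != 0:
                pc = arg
                continue
        pc += 1
    return acc

# (0 + 3 + 4) * 2 = 14
print(run([("ADD", 3), ("ADD", 4), ("MUL", 2)]))   # prints 14
```

Each iteration branches on the opcode and depends on the result of the one before it, so the work can't be fanned out across thousands of threads the way a pixel shader or matrix multiply can.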

Im Fermilicious!
By SilthDraeth on 10/2/2009 10:46:17 AM , Rating: 3
Four, tres, two, uno

Listen up y'all cause this is it
The beat that I'm bangin' is delicious

Fermilicious definition
Them scientists resister
They want my treasure
So they get their pleasures from my transistors

RE: Im Fermilicious!
By StevoLincolnite on 10/2/2009 10:48:21 AM , Rating: 3
I'll help you people translate what the above poster said:

"I'm high as a kite and drunk...
I live in a padded room, surrounded by monkeys wearing pink tutus"

RE: Im Fermilicious!
By R3T4rd on 10/2/2009 10:58:59 AM , Rating: 1
No no no got it all wrong. What he's saying is....

ORNL will coat the servers in a nice shiny glossy white, then put an Apple-with-a-cape logo on them and call the thing a SuperMac.

It does depend on one factor though...
By Saist on 10/3/2009 1:42:30 AM , Rating: 2
The proposed supercomputer does depend on one important factor... that Fermi actually is real. The big problem Nvidia has at this point... is that it's not. As far as I'm aware, Nvidia wasn't even able to show Fermi silicon working on IKOS boxes... much less actually showing off fabbed versions of even beta silicon. White papers and theories are one thing... real-life performance is another.

Remember the P4 architecture? Northwood and Prescott? Scaling to 8 GHz and beyond? Remember Hyperthreading? Remember the Radeon 8500's 3 textures per pass? Just because those technologies worked on paper didn't mean they actually worked when put in silicon... and didn't mean that developers actually coded to use the features.

Nvidia's got even larger issues than non-working silicon. Their stock has taken a beating each time a vendor issues a recall on Nvidia hardware. It's no secret that Nvidia has a huge problem with runaway thermal envelopes. The promise to offer a certain amount of computing power within a certain thermal envelope... should send most investors into fits of hysteria.

From an outside viewpoint, having a vendor interested in Fermi this early is no doubt good news for Nvidia and its stock. However, if Fermi isn't all that Nvidia has hyped it to be... it could just be the nail that sends Nvidia into bankruptcy.

By habibo on 10/4/2009 1:46:20 AM , Rating: 2
What silicon do you propose that they were demoing this past week at their GPU Technology Conference? Are you suggesting a "faked moon landing" conspiracy theory?

Crysis
By Bill0151 on 10/2/09, Rating: -1
RE: Crysis
By 3minence on 10/2/2009 12:29:55 PM , Rating: 2
Actually, I was kinda thinking something similar.

It appears nVidia is designing their GPU to be a general processing unit rather than a graphical processing unit. They are heading into the research realm. How long before they stop fighting ATI and (soon) Intel for the graphics crown and focus solely on research computing? How many compromises, with their associated lost graphics performance, did they make in order to make GT300 more dual-purpose? Maybe they made none, but you rarely get something for nothing.

It's an interesting tact they're taking. I shall enjoy watching where it leads.

RE: Crysis
By tviceman on 10/2/2009 1:26:57 PM , Rating: 2
There is no doubt that with this architecture they are expanding their horizons and going after a fairly untapped, high-potential source of revenue. However, I don't think they're going to abandon retail consumer GPUs - there is still a huge market for them, and nvidia will likely want a piece of either the next Xbox, PlayStation, or even Wii.

Without new chipset or x86 licenses, nvidia was backing into a dead end and, as such, has adapted to its situation and will likely strike gold in the scientific community.

RE: Crysis
By foolsgambit11 on 10/2/2009 6:43:00 PM , Rating: 2
I hate to be the guy, but it's 'tack', not 'tact'. It's a sailing term denoting a course sailed with respect to the wind (among other things, like the action of changing course by bringing the bow through the eye of the wind, or the forward-lower corner of a fore-and-aft sail).

RE: Crysis
By GaryJohnson on 10/3/2009 12:24:21 AM , Rating: 1
I think that "tact" was actually a short form or misspelling of "tactic", not whatever the hell it is you're talking about.

RE: Crysis
By Mclendo06 on 10/4/2009 2:37:42 AM , Rating: 2
Yeah, tack doesn't jibe with the meaning he had in his post.

Anyone who got the subtle word play, email me and I'll ship you a cookie :-)

RE: Crysis
By luke84 on 10/2/2009 4:27:56 PM , Rating: 4
Graphics horsepower isn't the problem, Crysis's bad coding is.

RE: Crysis
By oab on 10/2/2009 10:09:06 PM , Rating: 2
The Radeon HD 5870 can run Crysis at 'enthusiast' settings at high resolution (above 1680x1050) without breaking a sweat.

The meme is the problem, it is out of date.

RE: Crysis
By Lakku on 10/4/2009 12:37:26 AM , Rating: 2
So I suppose 26.6 FPS is not breaking a sweat? Yes, it's the first card to handle enthusiast settings across the board, but it still struggles with the game. That's as much to do with bad coding as with the hardware it's run on, but the 5870 still has trouble, albeit less of it, running Crysis.

RE: Crysis
By luke84 on 10/4/2009 10:35:07 AM , Rating: 2
Put two of those puppies in CrossFire and you'll be able to run EVERYTHING, Crysis included. I just don't know up to which resolution... although the 2 GB variants will help performance a lot at high resolutions.

RE: Crysis
By Azsen on 10/4/2009 8:34:39 PM , Rating: 2
The point is you shouldn't need two cards in CrossFire/SLI to get good framerates and graphics in Crysis.

Comparing the Source engine to the one in Crysis: you can play Half-Life 2 with full settings and HDR, it looks and plays better than Crysis, and you still get great framerates.

Crysis is poorly coded bloatware like Vista.

RE: Crysis
By Sunner on 10/5/2009 4:08:19 AM , Rating: 1
If you seriously think Half Life 2 looks anywhere near as good as Crysis you really need to lay off the shitty moonshine.


Copyright 2016 DailyTech LLC.