
Machine uses Dell hardware

It may not top the TOP500, but the University of Florida's new 150-teraflop (TFLOP) HiPerGator is Florida's most powerful supercomputer.  The new supercomputer cost $3.4M USD to construct and is housed in a $15M USD, 25,000-square-foot data center on the UF campus.

The cluster uses Piledriver server chips (the sixteen-core Opteron 6378, to be precise) from Advanced Micro Devices, Inc. (AMD) in PowerEdge C6145 servers from Dell, Inc. (DELL).  In all, the cluster packs 16,384 CPU cores, tied together with interconnects from Mellanox Technologies, Ltd. (MLNX).
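Those figures line up with a quick back-of-the-envelope check. A rough sketch follows; the 2.4 GHz base clock and the 4 double-precision FLOPs per core per cycle are assumptions about this configuration, not figures from the article:

```python
# Sanity-check HiPerGator's published specs.
# Assumptions (not from the article): Opteron 6378 base clock of
# 2.4 GHz and 4 double-precision FLOPs per core per cycle.
cores_per_chip = 16          # sixteen-core Opteron 6378
total_cores = 16_384         # stated core count
clock_hz = 2.4e9             # assumed base clock
flops_per_core_cycle = 4     # assumed DP rate

chips = total_cores // cores_per_chip
peak_tflops = total_cores * clock_hz * flops_per_core_cycle / 1e12

print(chips)        # 1024 sixteen-core processors
print(peak_tflops)  # ~157 TFLOPS, in the ballpark of the quoted 150
```

Under those assumptions the cluster works out to 1,024 processors and a theoretical peak of roughly 157 TFLOPS, consistent with the quoted 150 TFLOP figure (sustained Linpack numbers typically land below theoretical peak).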

While it won't crack the top 10 of the TOP500 anytime soon, the powerful supercomputer is expected to be put to work producing more accurate weather forecasts, designing drugs, and engineering better body armor.

UF immunologist David Ostrov comments, "HiPerGator can help drugs get from the computer to the clinic more quickly. We want to discover and deliver safe, effective therapies that protect or restore people’s health as soon as we can.  UF’s supercomputer will allow me to spend my time on research instead of computing."

HiPerGator uses AMD Opteron CPU cores. [Image Source: AMD]

The university will share its new computing wealth through joint projects with other top institutions around the country.

While AMD's most recent server chips are a very cost-effective option and Intel Corp.'s (INTC) Xeon line remains an industry standard, most of the world's top supercomputers use chips from International Business Machines Corp. (IBM), whose designs dominate the high end of the TOP500.

Sources: University of Florida, AMD


RE: Power guzzler
By Ktracho on 5/20/2013 12:56:06 PM, Rating: 2
All the major server vendors - IBM, HP, Dell, Lenovo, etc. - have off-the-shelf servers that can be stuffed with GPUs. Plus, there are lots of people who can write decent code for GPUs, especially in the research community and among graduate students. In fact, if a university cannot find people willing (to learn) to use the latest technology, they shouldn't be building a supercomputer (using old technology) in the first place. As for being within budget, they could have opted for a system with far fewer CPUs and smaller operating costs (due to lower power usage, for example), and still end up with higher performance than the system they got.

