AMD Announces "Torrenza" Technology
June 1, 2006 2:20 PM
AMD's "Torrenza" platform mock up
AMD opens up the Opteron architecture to other microprocessor R&D companies
Today AMD unveiled what it calls the evolution of enterprise-level computing, called Torrenza. The new platform, says AMD, will utilize next-generation multi-core 64-bit processors that have the capability to work alongside specialized co-processors.
DailyTech previously reported that AMD was considering working with co-processing design firms such as ClearSpeed to develop and design platforms that would be able to utilize specialized processors for specific duties alongside the general host processor in a traditional Opteron socket.
With Torrenza, AMD has designed what it calls an open architecture, based on the next wave of Opteron processors, which allows what AMD calls "Accelerators." Using the add-in accelerators, a system will be capable of performing specialized calculations, similar in fashion to the way we use GPUs today.
Because of its flexibility, the HyperTransport protocol allows a multitude of co-processor designs that are already compatible with systems on other platforms. For example, with Torrenza, specialized co-processors are able to sit directly in an Opteron socket and communicate directly with the entire system. During the conference, Cray Inc. noted that it had worked with AMD to design a system that can contain up to three different co-processors, all dedicated to specialized tasks. All three processors would communicate harmoniously and directly with the Opteron processors and the system chipset. The open-ended nature of Torrenza will allow companies to design specialized processors to plug in and work with Torrenza-enabled Opteron systems. Although AMD acknowledges many of these applications can run over PCIe and other connection technologies, Torrenza emphasizes HT-3 and HTX in particular.
AMD representatives said that because of the architecture, Torrenza allows very low-latency communication between the chipset, main processor and co-processors. According to both Cray and AMD, applications can be written in a way where all the various processing architectures are recognized and fully usable. Torrenza-aware applications are on the way, said Cray, but the company did admit that developing them was very much "rocket science."
RE: How handy
6/9/2006 6:11:46 AM
Yeah. We'll solve issues, hit problems, get stuck for a while, until eventually the mechanical parts near light speed and we hit the event horizon and have to start accounting for what special relativity talks about in regards to time. That will be a huge issue. When time is going at one rate on the computer's end, and a different rate at our end, how do we account for certain things, and monitor in real time? We'll have to learn how to use it to our advantage, i.e. do the inverse and speed time up on the computing end to enable massive leaps in technology on our end (what if you could get 100 years of processing done in one second? CPU speed and technology leaps become irrelevant at that point). If you don't understand cosmology, please don't say I'm a quack. Go look and see what happens when you near a black hole in regards to time. Or what happens when you near light speed. We really do need to at least begin to consider different approaches as options as computing as a science is explored and exploited to its fullest. I'll reiterate: if I could use something like that to change our perspective on computing, hardware tech and developers as we know them would become obsolete, but how much would technology leap?
Yes, this was a bit random, but I'm an astronomer and a tech, so I tend to combine my knowledge of both. Heck, they've already created a micro black hole in a lab (an incredibly bad idea, I might add); we are approaching the time when things like this will be possible. Time travel is impossible. To warp timespace is. LOL, I'm WAY off topic.
RE: How handy
7/26/2006 6:07:54 PM
100 years of processing in 1 second would only get you a burned CPU in a second :)
Your thread reminds me of Prince of Persia Warrior Within. You go forward in time, everything is in ruins.
Anyway... although you don't see scientists talking much about hitting the event horizon, they are dealing with problems nowadays too. I am sure your concerns have been thought about more than a million times by genius electronics engineers. It's just not worth investing in right now.
We're not talking about something that could require decades to develop; we're talking about what right now we describe as impossible in our existence.
PS: I would never let you put a black hole in my computer to alter time :D
AMD Interested In Reviving the Math Co-processor
March 18, 2006, 1:30 AM
Copyright 2016 DailyTech LLC.
Terms, Conditions & Privacy Information