



Specs for the Wii U, set to launch in 2012, have partially leaked.

A POWER7 CPU from IBM -- the same core design used inside the Watson supercomputer, which recently smoked Ken Jennings at Jeopardy on national TV.  (Source: IBM via Engadget)

The Wii U reportedly packs a GPU superior to the PS3's or Xbox 360's -- an AMD chip similar to that found in the Radeon 4000 Series.  (Source: Anandtech)
The system's full specs have leaked -- supposedly

Various sources have been busy spilling a semi-complete set of specs for the Wii U, Nintendo Co., Ltd.'s (TYO:7974) quirky touch-screen successor to the best-selling Wii.

TIMEs "TechLand" blog claims that the console, set to launch in 2012, will pack a R700 series variant from Advanced Micro Devices, Inc. (AMD), built on a 32 nm process with 1 GB of video memory. R700 GPUs are found in AMD's two-generations-old Radeon 4000 Series -- the R700 architecture launched in 2008.

While the GPU may seem a bit underpowered by modern PC gaming standards, consider that the PlayStation 3 from Sony Corp. (TYO:6758) uses a modified version of the NVIDIA Corp. (NVDA) chip found inside the GeForce 7800 (2006-era), and the Xbox 360 from Microsoft Corp. (MSFT) uses a "Xenos" AMD GPU -- which falls somewhere between an R520 (2005 era) and an R600 (2006 era).  In other words, by console standards, the Wii U's reported GPU is quite advanced, with its architecture surpassing those found in the PS3 and Xbox 360.

Likewise, the CPU sounds like a pretty tough character as well.  Engadget reports that Nintendo is using a POWER7-architecture CPU from International Business Machines Corp. (IBM) similar to that found in the Watson supercomputer.  By comparison, the PS3 uses a somewhat older Cell processor design that is POWER4-compatible.  Noticeably missing are the Wii U CPU's core count and clock speed -- without that info it's unclear where the CPU will lie versus the PS3 in performance.


DRAM will reportedly be embedded directly on the CPU chip.  The amount of DRAM memory is still unknown -- Nintendo simply says it will be "a lot".


In an interview with Kotaku, Nintendo designer Katsuya Eguchi confirms that the Wii U will use a proprietary high-density optical disc format that isn't Blu-ray.  That can't make Sony too happy.  Reportedly the discs will pack up to 25 GB -- the same as the maximum for a single-layer Blu-ray disc.  Mr. Eguchi declined to reveal whether standard DVD playback would be supported, whether double-layer (50 GB) discs would be supported, and whether we might see movies shipping in this new format.

According to TIME, the console will also likely have 8 GB of internal flash memory storage.  Additionally, the system reportedly will have 4 USB ports and at least one SD card reader.  Using USB sticks or SD cards, the storage capacity can be expanded substantially.
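As a rough, back-of-envelope illustration of how that expansion could add up, here is a short Python sketch. Only the 8 GB internal flash figure comes from the report; the SD card and USB stick capacities below are hypothetical examples, not reported specs.

    # Back-of-envelope storage tally for the reported Wii U configuration.
    # Only the 8 GB internal flash figure is from the report; the SD card and
    # USB stick capacities are hypothetical examples of expansion.
    internal_flash_gb = 8            # reported internal flash storage
    sd_card_gb = 32                  # hypothetical SDHC card in the SD slot
    usb_stick_gb = [16, 16]          # hypothetical sticks in two of the four USB ports

    total_gb = internal_flash_gb + sd_card_gb + sum(usb_stick_gb)
    print(f"Total storage with expansion: {total_gb} GB")  # -> 72 GB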

A final item of interest is that the 6.2-inch touchscreen controller will be capable of outputting 1080p graphics via an HDMI connection.
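For a sense of what driving a 1080p signal over HDMI entails, here is a small sketch of the raw, uncompressed bandwidth involved. The 24-bit color depth and 60 Hz refresh rate are illustrative assumptions, not figures from the report.

    # Raw bandwidth of an uncompressed 1080p video signal.
    # 24-bit color and 60 Hz are illustrative assumptions, not reported specs.
    width, height = 1920, 1080
    bytes_per_pixel = 3              # 24-bit color
    refresh_hz = 60

    bits_per_second = width * height * bytes_per_pixel * refresh_hz * 8
    print(f"~{bits_per_second / 1e9:.2f} Gb/s of raw pixel data")  # roughly 3 Gb/s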

From here on out the most pressing questions seem to be what the specifics of the CPU are (core count, clock speed); what kind of hardware rivals Microsoft and Sony are cooking up; and when that rival hardware will arrive.



Comments



By RadnorHarkonnen on 6/15/2011 8:43:23 AM, Rating: 2
I still have a 4850 CF setup running. It chews through everything I throw at it, except for Crysis, but that's the 512 MB of memory's fault, not the cards' or CPU's fault. I play at 1680x1050. Going from 55 nm to 32 nm would increase its performance some 20% and reduce heat output a lot, and that's before whatever tweaks they surely put in.

Anyway, it would be the ONLY console with DX10.1 capabilities. Most likely they've updated it for DX11, and it's quite up to par with today's standards.

I personally like this approach. Tried and true. You probably won't have any "RROD" problems with these cards. Although I don't like to play on consoles (the Xbox and PS3 are quite substandard for me now) because of the interface and because I am an eye-candy junkie, I believe that card would do just fine at 1080p.
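A quick pixel-count comparison backs up that last point. This is only a rough sketch, using the 1680x1050 resolution mentioned above.

    # Relative pixel counts: the commenter's monitor resolution vs. full 1080p.
    monitor_pixels = 1680 * 1050     # 1,764,000 pixels
    full_hd_pixels = 1920 * 1080     # 2,073,600 pixels

    ratio = full_hd_pixels / monitor_pixels
    print(f"1080p pushes about {ratio:.0%} of the 1680x1050 pixel count")
    # -> roughly 118%, i.e. only ~18% more pixels per frame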


"Nowadays you can buy a CPU cheaper than the CPU fan." -- Unnamed AMD executive













