
Specs for the Wii U, set to launch in 2012, have partially leaked.

A POWER7 CPU from IBM -- the same core design used inside the Watson supercomputer, which recently smoked Ken Jennings at Jeopardy on national TV.  (Source: IBM via Engadget)

The Wii U reportedly packs a GPU superior to the PS3 or Xbox 360's. It reportedly uses an AMD chip similar to that found in the Radeon 4000 Series.  (Source: Anandtech)
The system's full specs have leaked -- supposedly

Various sources have been busy spilling a semi-complete set of specs for the Wii U, Nintendo Co., Ltd.'s (TYO:7974) quirky touch-screen successor to the best-selling Wii.

TIME's "TechLand" blog claims that the console, set to launch in 2012, will pack an R700-series variant from Advanced Micro Devices, Inc. (AMD), built on a 32 nm process with 1 GB of video memory. R700 GPUs are found in AMD's two-generations-old Radeon 4000 Series -- the R700 architecture launched in 2008.

While the GPU may seem a bit underpowered by modern PC gaming standards, consider that the PlayStation 3 from Sony Corp. (TYO:6758) uses a modified version of the NVIDIA Corp. (NVDA) chip found inside the GeForce 7800 (2005 era) and the Xbox 360 from Microsoft Corp. (MSFT) uses a "Xenos" AMD GPU -- which falls somewhere between an R520 (2005 era) and an R600 (2006 era).  In other words, by console standards, the Wii U's reported GPU is quite advanced, with its architecture surpassing those found in the PS3 or Xbox 360.

Likewise, the CPU sounds like a pretty tough character as well.  Engadget reports that Nintendo is using a POWER7-architecture CPU from International Business Machines Corp. (IBM) similar to that found in the Watson supercomputer.  By comparison, the PS3 uses a somewhat older Cell processor design that is POWER4-compatible.  Noticeably missing are the core count and clock speed of the Wii U's chip -- without this info it's unclear where the CPU will land versus the PS3 in performance.

DRAM will reportedly be embedded directly on the CPU chip.  The amount of DRAM memory is still unknown -- Nintendo simply says it will be "a lot".

In an interview with Kotaku, Nintendo designer Katsuya Eguchi confirms that the Wii U will use a proprietary high-density optical disc format that isn't Blu-ray.  That can't make Sony too happy.  Reportedly the discs will pack up to 25 GB -- the same as the maximum for a single-layer Blu-ray disc.  Mr. Eguchi declined to reveal whether standard DVD playback would be supported, whether double-layer (50 GB) discs would be supported, and whether we might see movies shipping in this new format.

According to TIME, the console will also likely have 8 GB of internal flash storage.  Additionally, the system will reportedly have four USB ports and at least one SD card reader.  Using USB sticks or SD cards, the storage capacity can be expanded substantially.

A final item of interest is that the 6.2-inch touchscreen controller will be capable of outputting 1080p graphics via an HDMI connection.

From here on out the most pressing questions seem to be what the specifics of the CPU are (core count, clock speed); what kind of hardware rivals Microsoft and Sony are cooking up; and when that rival hardware will arrive.


RE: Nintendo sucks
By GuinnessKMF on 6/14/2011 8:16:07 PM , Rating: 2
1080p is not the limiting factor; you can tax a modern GPU plenty at 1080p or less. Complex scenes often require multiple renderings of the same frame -- for anti-aliasing, reflections, and now even temporal effects (the reason film looks good at 24 fps when video games don't is the temporal 'aliasing' of film). GPUs are also responsible for a lot more than just displaying the contents of a game; physics and geometry are now often the responsibility of the GPU, sometimes even some AI.

I do agree that two-year-old hardware can handle most of what is needed to make a good game (in fact, good games don't *need* good graphics, but they're nice), but I think your understanding of graphics card requirements is a bit lacking.

"Top-end" PC monitors aren't used for gaming; they're used by graphics professionals. Gaming on PCs is most often done at 1080p (though this is changing rapidly with multi-monitor gaming).

RE: Nintendo sucks
By BZDTemp on 6/15/2011 2:20:28 PM , Rating: 2
"top-end" PC monitors aren't used for gaming...

Sure they are. Of course it's not mainstream, but among those who run the latest and greatest gaming hardware, playing on 30" 2560x1600 monitors -- and lately 27" 2560x1440 ones -- is not unheard of. In fact, it's likely more common than you think.

And why not? A 30" monitor is costly compared to even very good 24" monitors, but considering it's likely to outlast 3-4 generations of graphics cards and at least two generations of PCs, the cost is not that high. Plus, for those of us who have been gaming for a long time, all PC equipment is cheap as dirt (my first 16-bit sound card cost me approx. $1,000 in today's money, and it was a good deal at the time).

RE: Nintendo sucks
By GuinnessKMF on 6/15/2011 2:55:06 PM , Rating: 2
I knew that one line in my response would get nit-picked out and the substance ignored.

The point is that resolution alone does not determine the graphical processing power required. Resolution is a factor, but you could tax out the best GPU solution in existence right now by rendering a complicated enough scene at 640x480.

A scene rendered at 1920x1080 with a single extra full-resolution pass (say, reflections or un-optimized AA) requires roughly the same processing power as a 2560x1600 scene without that pass.
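The back-of-the-envelope arithmetic behind that comparison can be sketched as follows (illustrative numbers only, under the simplifying assumption that an extra full-resolution pass doubles the per-pixel work):

```python
# Rough pixel-throughput comparison: extra render passes vs. raw resolution.
def pixels_shaded(width, height, passes=1):
    """Total pixels shaded per frame, assuming each pass touches every pixel."""
    return width * height * passes

two_pass_1080p = pixels_shaded(1920, 1080, passes=2)  # 1080p + one extra pass
one_pass_1600p = pixels_shaded(2560, 1600)            # 2560x1600, single pass

print(two_pass_1080p)  # 4,147,200 pixels per frame
print(one_pass_1600p)  # 4,096,000 pixels per frame
# The two workloads differ by only about 1% -- one extra pass at 1080p
# costs roughly as much as jumping to 2560x1600.
```

Real GPU workloads are not this simple (passes at reduced resolution, varying shader cost per pixel, bandwidth limits), but the pixel counts show why the claim is in the right ballpark.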

All that said, I will stand by my statement that the highest tier of monitors is not "aimed" at gaming (yes, obviously there are people using them); most monitors above 1920x1080/1200 have noticeable lag and response-time issues, as they are designed for accurate color reproduction rather than gaming. If you want to talk about multi-monitor setups using HydraVision or whatever they're calling it now, then yes, that's targeted at gaming above 1080p resolutions.

