Specs for the Wii U, set to launch in 2012, have partially leaked.

A POWER7 CPU from IBM -- the same core design used inside the Watson supercomputer, which recently smoked Ken Jennings at Jeopardy on national TV.  (Source: IBM via Engadget)

The Wii U reportedly packs a GPU superior to the PS3 or Xbox 360's. It reportedly uses an AMD chip similar to that found in the Radeon 4000 Series.  (Source: Anandtech)
The system's full specs have leaked -- supposedly

Various sources have been busy spilling a semi-complete set of specs for the Wii U, Nintendo Co., Ltd.'s (TYO:7974) quirky touch-screen successor to the best-selling Wii.

TIME's "TechLand" blog claims that the console, set to launch in 2012, will pack an R700-series variant from Advanced Micro Devices, Inc. (AMD), built on a 32 nm process with 1 GB of video memory. R700 GPUs are found in AMD's two-generations-old Radeon 4000 Series -- the R700 architecture launched in 2008.

While the GPU may seem a bit underpowered by modern PC gaming standards, consider that the PlayStation 3 from Sony Corp. (TYO:6758) uses a modified version of the NVIDIA Corp. (NVDA) chip found inside the GeForce 7800 (2005-era) and the Xbox 360 from Microsoft Corp. (MSFT) uses a "Xenos" AMD GPU -- which falls somewhere between an R520 (2005-era) and an R600 (2007-era) GPU.  In other words, by console standards, the Wii U's reported GPU is quite advanced, with its architecture surpassing those found in the PS3 and Xbox 360.

Likewise, the CPU sounds like a pretty tough character.  Engadget reports that Nintendo is using a POWER7-architecture CPU from International Business Machines Corp. (IBM) similar to that found in the Watson supercomputer.  By comparison, the PS3 uses a somewhat older Cell processor design that is POWER4-compatible.  Noticeably missing are the Wii U CPU's core count and clock speed -- without that info, it's unclear where the chip will land versus the PS3's in performance.


DRAM will reportedly be embedded directly on the CPU chip.  The amount of embedded DRAM is still unknown -- Nintendo simply says it will be "a lot".


In an interview with Kotaku, Nintendo designer Katsuya Eguchi confirms that the Wii U will use a proprietary high-density optical disc format that isn't Blu-ray.  That can't make Sony too happy.  Reportedly the discs will hold up to 25 GB -- the same as the maximum for a single-layer Blu-ray disc.  Mr. Eguchi declined to reveal whether standard DVD playback would be supported, whether double-layer (50 GB) discs would be supported, and whether we might see movies shipping in this new format.

According to TIME, the console will also likely have 8 GB of internal flash memory storage.  Additionally, the system will reportedly have four USB ports and at least one SD card reader.  Using USB sticks or SD cards, the memory capacity can be expanded substantially.

A final item of interest is that, alongside the 6.2-inch touchscreen controller, the console will be capable of outputting 1080p graphics via an HDMI connection.
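For a sense of scale, here is a quick pixel-count sketch (frame sizes are standard video resolutions used as assumptions; exact render resolutions are unconfirmed) of the jump from the Wii's 480p ceiling to 1080p:

```python
# Pixel-count arithmetic: the jump from the Wii's 480p maximum output to the
# Wii U's reported 1080p-over-HDMI output. Frame sizes are standard video
# resolutions, not confirmed Nintendo specs.
resolutions = {
    "480p (Wii max)": (720, 480),
    "720p": (1280, 720),
    "1080p (Wii U, reported)": (1920, 1080),
}

base = resolutions["480p (Wii max)"][0] * resolutions["480p (Wii max)"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.1f}x the Wii's output)")
# Roughly a 6x increase in pixels per frame over the original Wii.
```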

From here on out, the most pressing questions seem to be what the specifics of the CPU are (core count, clock speed), what kind of rival hardware Microsoft and Sony are cooking up, and when that hardware will arrive.



Comments

RE: Nintendo sucks
By RussianSensation on 6/15/2011 2:03:22 AM , Rating: 2
Remember, PC hardware has to deal with a HUGE API bottleneck. John Carmack talks about it in his 20-minute interview at E3. The fact is a modern GPU is at least 10x more powerful than what's found in the PS3 and Xbox 360, but games barely look 2x better on the PC.
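To illustrate the shape of that argument, here is a back-of-envelope sketch (the per-call costs are illustrative assumptions, not measurements) of how draw-call submission overhead can eat a frame budget:

```python
# Back-of-envelope sketch of API/driver overhead: how much of a 60 fps frame
# budget goes to just submitting draw calls. Per-call costs below are
# ILLUSTRATIVE ASSUMPTIONS, not measured numbers.
frame_budget_ms = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps
draw_calls = 3000                 # hypothetical scene complexity

paths = {
    "PC (thick OS/API/driver path)": 3.0,   # assumed microseconds per draw call
    "console (close to the metal)": 0.5,    # assumed microseconds per draw call
}

for label, cost_us in paths.items():
    submit_ms = draw_calls * cost_us / 1000.0
    share = 100.0 * submit_ms / frame_budget_ms
    print(f"{label}: {submit_ms:.1f} ms of {frame_budget_ms:.1f} ms ({share:.0f}%) spent on submission")
```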

Programmers are coding directly to the hardware on the console, while for the PC they have to deal with the bloatware OS/API. Not only that, but since you have a fixed ecosystem with consoles, it's much easier to optimize. Personally, I think it would have made much more sense to get a 32nm version of the HD5770, which has similar performance to the 4870 plus DX11. However, I can see how Nintendo went with the cheaper HD4000 series, since the HD5770 isn't fast enough to make use of DX11-specific features such as tessellation.

Overall, if you look at the HD4870 vs. the 7900GS (PS3 level), the Radeon is about 4-5x faster in modern games:
http://techreport.com/articles.x/18682/6

Of course we don't know the exact specs of the HD4000-series part in the Wii U, but if it's anything along the lines of the 4870/4890, it's fast enough for Nintendo, as their strategy isn't to sell their consoles at a loss, nor to price them at $600 USD at launch.
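For what it's worth, a rough peak-throughput sketch (published specs, counting a multiply-add as 2 FLOPs; peak FLOPS ignores bandwidth, ROPs, and drivers) backs up the point that the 5770 and 4870 are in the same ballpark:

```python
# Rough theoretical shader throughput: stream processors * 2 FLOPs (multiply-add)
# * core clock. A ballpark sketch, not a benchmark.
gpus = {
    "Radeon HD 4870": {"stream_processors": 800, "core_clock_ghz": 0.750},
    "Radeon HD 5770": {"stream_processors": 800, "core_clock_ghz": 0.850},
}

for name, spec in gpus.items():
    gflops = spec["stream_processors"] * 2 * spec["core_clock_ghz"]
    print(f"{name}: ~{gflops:.0f} GFLOPS peak single precision")
# ~1200 vs ~1360 GFLOPS -- same ballpark, which is why real-game performance
# of the two cards ends up similar.
```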


RE: Nintendo sucks
By SilthDraeth on 6/15/2011 4:17:32 AM , Rating: 2
Except you just mentioned the API bottleneck; take that away, and the 5770 should be quick enough for all DX11 features.

I have a 4870 and it is still pretty fricken fast.


RE: Nintendo sucks
By TSS on 6/15/2011 11:00:38 AM , Rating: 2
John Carmack I'll trust on code, but not on beauty. They've admitted themselves, back when Quake Live was being developed, that Quake 3 didn't look as awesome as it could have - they were so busy innovating on the code that they didn't bother looking at stuff like light placement (the lighting in Quake Live is vastly different). And Rage - id's new title - really doesn't look that good. I wasn't very impressed during the first tech demos of the engine, and now that the game still isn't out and the UE4 tech demo is... well, it looks stale.

IMO that also shows the real culprit behind why games don't look as good as the hardware could make them look: development overhead. If it was Carmack coding Rage, it would've been out years ago. The problem is the game is too big for just him to code, and the others aren't quite the genius he is, unfortunately.

It was shown with, I believe, Doom for the iPhone. id Software said it couldn't be done - it would take a team of coders a month and that would cost too much. John Carmack then did it by himself in a weekend.

While he remains that genius, all games these days are made by that team of coders, thus limiting the potential of the game from the start.

Aside from all that, it's a fact that most games are console ports these days. Since they're developed for the consoles, most models have lower polycounts, which cannot be changed unless the entire model, skin, and animations are redone (in the worst case). Developers consider this too little bang for the buck these days, so they pick the easiest route: textures are made at double the console resolution and scaled back 50% for the consoles while the original is used for the PC. This also works because consoles have an abysmally small amount of memory (512 MB total for the PS3's texture and system RAM; my 2-year-old PC is sporting 6 GB system and 1 GB texture RAM). But for that same reason, even double the resolution looks ugly on PCs, because they can handle 4 times the resolution by now, and on selective props even 8 times the resolution.
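To put some numbers on the memory point, here is a quick sketch (assuming plain uncompressed RGBA8 textures with no mipmaps; real games compress, so the absolute sizes are illustrative) of how texture memory grows with resolution:

```python
# Texture memory arithmetic: doubling resolution in each dimension quadruples
# memory use. Assumes uncompressed RGBA8 texels and no mipmaps (illustrative).
BYTES_PER_TEXEL = 4  # RGBA, 8 bits per channel

def texture_mb(width, height):
    return width * height * BYTES_PER_TEXEL / (1024 * 1024)

base = (1024, 1024)  # hypothetical "console resolution" texture
for scale in (1, 2, 4):
    w, h = base[0] * scale, base[1] * scale
    print(f"{w}x{h}: {texture_mb(w, h):.0f} MB per texture")
# 4 MB -> 16 MB -> 64 MB: a handful of 4x-resolution textures alone would
# swamp a console's 512 MB total memory, but fit easily in PC VRAM + system RAM.
```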

Stuff like more vegetation, more physics, more debris, better-looking models, and more background stuff going on is all left out, even though PCs can handle it.

Not to mention games are being redesigned so consoles can handle them. Prime example: Crysis 2. It looks worse than Crysis 1 because consoles can't even handle Crysis 1. It's not an open world, rather a huge corridor, and the jungle vegetation of Crysis 1 IMO looks a lot better than the urban environments of Crysis 2.

So I'm sorry, but on this one it's the developers themselves.


RE: Nintendo sucks
By someguy123 on 6/15/2011 4:17:13 PM , Rating: 2
I don't know what conventions you've been to, but the engine looks amazing. You have a problem with the drab art style, not the engine itself. The same could be said about games based around ID4. ID made the decision to design their games tinted brown and "dirty". The engine is capable of much more.

The UE4 tech demo is merely a demo. If you look up the original UE3 photos and the original tech demo attached (before the public demos), it looked about as good as UE4 in character rendering. The same could be said about the alpha build of Unreal Tournament 3.

I don't agree with people's assessment that the PC has 10 times the bloat, but there are overhead problems, as well as having to deal with scaling for various builds compared to a single hardware spec. When it comes to brute force and just dumping massive textures on objects, you end up with aliasing issues like The Witcher 2, which relies on FSAA to clean them up. Looking at Crysis, the vegetation looks great, but there are a lot of texture problems, and indoor areas don't look nearly as good as the replicated vegetation. Even an engine like CryEngine 2, which relies heavily on sheer processing power, needs to cut corners. To be honest, I still think Uncharted 2 is the best-looking game regardless of platform, though obviously it's limited in resolution.

