
Specs for the Wii U, set to launch in 2012, have partially leaked.

A POWER7 CPU from IBM -- the same core design used inside the Watson supercomputer, which recently smoked Ken Jennings at Jeopardy on national TV.  (Source: IBM via Engadget)

The Wii U reportedly packs a GPU superior to the PS3 or Xbox 360's. It reportedly uses an AMD chip similar to that found in the Radeon 4000 Series.  (Source: Anandtech)
The system's full specs have leaked -- supposedly

Various sources have been busy spilling a semi-complete set of specs for the Wii U, Nintendo Co., Ltd.'s (TYO:7974) quirky touch-screen successor to the best-selling Wii.

TIME's "TechLand" blog claims that the console, set to launch in 2012, will pack an R700 series variant from Advanced Micro Devices, Inc. (AMD), built on a 32 nm process with 1 GB of video memory. R700 GPUs are found in AMD's two-generations-old Radeon 4000 Series -- the R700 architecture launched in 2008.

While the GPU may seem a bit underpowered by modern PC gaming standards, consider that the PlayStation 3 from Sony Corp. (TYO:6758) uses a modified version of the NVIDIA Corp. (NVDA) chip found inside the GeForce 7800 (2006-era), and the Xbox 360 from Microsoft Corp. (MSFT) uses a "Xenos" AMD GPU -- which falls somewhere between an R520 (2005 era) and an R600 (2006 era).  In other words, by console standards, the Wii U's reported GPU is quite advanced, with its architecture surpassing those found in the PS3 or Xbox 360.

Likewise, the CPU sounds like a pretty tough character as well.  Engadget reports that Nintendo is using a POWER7-architecture CPU from International Business Machines Corp. (IBM) similar to that found in the Watson supercomputer.  By comparison, the PS3 uses a somewhat older Cell processor design that is POWER4-compatible.  Noticeably missing are the core count and clock speed of the Wii U's chip -- without this info it's unclear where the CPU will lie versus the PS3 in performance.

DRAM will reportedly be embedded directly on the CPU chip.  The amount of memory is still unknown -- Nintendo simply says it will be "a lot".

In an interview with Kotaku, Nintendo designer Katsuya Eguchi confirms that the Wii U will use a proprietary high-density optical disc format that isn't Blu-Ray.  That can't make Sony too happy.  Reportedly the discs will pack up to 25 GB -- the same as the maximum for a single-layer Blu-Ray disc.  Mr. Eguchi declined to reveal whether standard DVD playback would be supported, whether double-layer (50 GB) discs would be supported, and whether we might see movies shipping in this new format.

According to TIME, the console will also likely have 8 GB of internal flash memory storage.  Additionally, the system reportedly will have 4 USB ports and at least one SD card reader.  Using USB sticks or SD cards, the memory capacity can be expanded substantially.

A final item of interest is that the 6.2-inch touchscreen controller will be capable of outputting 1080p graphics via an HDMI connection.

From here on out the most pressing questions seem to be what the specifics of the CPU are (core count, clock speed); what kind of hardware rivals Microsoft and Sony are cooking up; and when that rival hardware will arrive.

Comments


By CharonPDX on 6/14/2011 3:03:10 PM , Rating: 5
The GameCube used a disc that was physically the standard 8cm in size, using the standard DVD-ROM laser and essentially the same formatting - they just made it have a track closer to the center of the disc than the DVD format officially allows.

The Wii used a normal-size 12cm disc. Again, using fairly standard DVD-ROM formatting - but, again, they have a track that is non-DVD-standard.

However, in both cases, there are some standard DVD-ROM drives out there that are physically capable of reading the custom discs.

My guess is that the Wii U does the same thing, only updating the physical media to be based on Blu-ray, with some slight change to make it incompatible with standard Blu-ray drives. It would be *INSANE* to create their own 100% custom new format, when they can just follow what they've done for two generations already. Take an existing standard, and tweak it just enough to be not-trivial to pirate.

By therealnickdanger on 6/14/2011 3:08:47 PM , Rating: 4
It would be *INSANE* to create their own 100% custom new format

Not that insane. Remember carts?

I'm thinking that you're correct, however. Assuming they just modify the data structure on a conventional 30GB HD-DVD or a 25GB Blu-ray disc, anyone with a reader could still copy and distribute the game data. Real pirates don't take "no" for an answer.

By gamerk2 on 6/14/2011 3:37:56 PM , Rating: 2
Actually not; only a handful of DVD drives can read GC/Wii games, and none are currently in production that I know of.

That being said, people were smart enough to stockpile those drives and are now making a killing on eBay. Someone will find a way to crack the format on some drive somewhere...

By JonnyBlaze on 6/15/2011 10:49:37 PM , Rating: 2
As long as one person can read them, they will be copied. Wii games burn and play on standard DVDs just fine anyways. Not that anyone does that; they just put them on a USB hard drive and play them from there.

By Samus on 6/14/2011 11:32:39 PM , Rating: 3
I'd be willing to bet anything the disc format is simply HD-DVD.

The flexibility of using a red laser, Nintendo's long-standing ties to Toshiba (who manufactures Nintendo's drives) and the lack of licensing fees all make it a very attractive option for keeping costs down, profitability up, and piracy low (HD-DVD burners have gone the way of the dodo).

By CZroe on 6/15/2011 12:58:45 AM , Rating: 2
Long-standing ties to Toshiba? Panasonic has always made their optical drives. Panasonic made the SD card adapter and official SD card on GameCube too, IIRC. I can't think of anything they've done with Toshiba and the capacity is more in-line with a single-layer BD than a DL HD-DVD.

By Samus on 6/15/2011 2:51:44 PM , Rating: 3
Toshiba has provided low-power LCD technology used in Nintendo portables for nearly a decade, from the reflective LCD in the Game Boy Advance to the 3D LCD technology used in the 3DS...

As far as Panasonic is concerned, you are right; they have historically provided memory formats for Nintendo, such as the defunct 3DO (originally an add-on drive for the SNES) and the drives used in the GameCube onward.

However, don't forget Panasonic backed HD-DVD, not Blu-ray. And it is entirely possible for HD-DVD to have the same storage capacity as Blu-ray (25/50GB) with different error correction (HD-DVD has more error correction so as to be more durable... remember, original BD-ROM prototypes were so sensitive they were required to wear caddies until the production lines were modified!) and by using the Blu-ray stamping process (HD-DVD uses traditional DVD stamping...)

Fujitsu has provided audio processors and memory chips to Nintendo dating back to the NES. Fujitsu semiconductor is a subsidiary of Toshiba, and Toshiba bought their storage division in 2009.

By CZroe on 6/15/2011 7:30:57 PM , Rating: 1
Playstation was originally an add-on for the SNES. 3DO was marketed to multiple manufacturers including Panasonic as a standardized platform from the 3DO company. Panasonic bought the follow-up console lock, stock, and barrel to sell as the "M2" (Matsushita 2). I would expect someone to mix up the Sony Playstation and Philips CD-i before thinking that the 3DO was ever meant as an SNES expansion!

Also, Sharp and Hitachi provided the LCD technology for the GBA, NDS, and 3DS, NOT Toshiba. They were played against each other instead of getting exclusive component contracts in an attempt to lower costs but Sharp and Hitachi were caught price-fixing instead. Anyway, the 3D parallax barrier tech that made the 3DS possible was Sharp's and neither Hitachi's nor Toshiba's.

HD-DVD required more error correction in order to be possible to make the discs on equipment designed for DVD tolerances. They were by their very nature more fragile from the start due to the equipment not being designed for such densities. The early DVD recorders all had caddies too (remember the early Panasonic DVD-RAM drives?). BD has a protective coating that is also necessary to make it acceptably durable with the error correction used at those densities, but they are both engineered to an acceptable tolerance of durability and capacity and neither is notably more or less fragile than the other. Panasonic/Matsushita was an exclusive Blu-Ray supporter and out-right manufactured the first slim notebook BD recorders for Sony notebooks in 2006. I own one. Hell, I used to work there when they first got the "Starcube" contract (leaked name before settling on Gamecube).

Fujitsu is about as relevant as Sony. Sony made many critical components of the NES and SNES, including the SPC700 audio processor that blew the competing 16-bit consoles out of the water.

My point is, a reasonable analysis shows anything BUT Toshiba and HD-DVD.

By CZroe on 6/15/2011 11:37:04 PM , Rating: 1
Wow. I was voted down for knowing the industry better than some guy who's willing to "bet anything" on his faulty industry knowledge? I hope that was a slip.

By Ichinisan on 6/16/2011 1:07:13 AM , Rating: 2
Well, I hope that canceled it out.

Samus: What's with all that nonsense? You clearly have no idea what you're talking about.

By Shadowmaster625 on 6/16/2011 8:36:54 AM , Rating: 2
Since when was a 3DO an addon to a SNES?

By TranceHead on 6/16/2011 5:38:47 AM , Rating: 2
HD-DVD used a BLUE laser, not a red one.
They couldn't get the capacity with a red laser due to wavelength and physics (you can only focus a laser's spot so small).

By nafhan on 6/14/2011 3:44:05 PM , Rating: 2
Well... this is totally conjecture, but it seems like as long as they are going with a proprietary format, they might be able to get a good deal on some HD-DVD equipment :)

Judging by the capacity, though, it's probably a single layer bluray.

By omnicronx on 6/14/2011 4:30:55 PM , Rating: 2
Judging by the capacity, though, it's probably a single layer bluray.
20 bucks says they are merely offset BDs, and the console itself won't have the ability to actually play BD movies. This would give them the size advantage without paying the expensive licensing costs that movie disc support would require.

By icanhascpu on 6/14/2011 5:21:17 PM , Rating: 2
I do not think it would be insane at all to go a bit further than they have. It's just a matter of finding a balance between mitigating piracy and the cost of how far they are willing to stray from the base standard.

By Flunk on 6/14/2011 7:36:40 PM , Rating: 1
You missed that the gamecube's DVD Motor spins backwards, which is a real pain to replicate.

By sprockkets on 6/14/2011 11:31:24 PM , Rating: 4
You seriously still fall for that 10-year-old myth?


By Samus on 6/14/2011 11:34:20 PM , Rating: 2
I lol'd.

By ElFenix on 6/14/2011 8:22:58 PM , Rating: 2
If it were BR based they'd have to pay Sony money for it.

My guess would be HDDVD based. Tosh will probably take $5 for licensing that right now.

By someguy123 on 6/14/2011 8:50:05 PM , Rating: 2
Well, Nintendo did pour some money into that company developing holographic discs a few years ago.

Maybe it's just a really premature version of holographic discs?

By Hiawa23 on 6/15/2011 8:51:03 AM , Rating: 1
I find it almost laughable that the article compares the GPU in the Xbox 360 (seven years old by 2012) and the PS3 (six years old by 2012) to the Wii U's GPU. I was thinking about getting the Wii U, but I think I will wait for the next Xbox & PlayStation, as I am happy with the 360 & PS3, and the Wii U will be underpowered compared to the next-gen consoles MS or Sony launch. I didn't like the Wii at all, so it all depends on the games the console launches with. If the Wii U is simply a Wii in HD, no thanks.

4000 series - probably a good choice
By nafhan on 6/14/2011 3:49:10 PM , Rating: 3
I think the 5000 series had a large jump in the number of transistors just to get DX11 compatibility without any actual increase in speed (it was actually slower clock for clock, if I remember correctly). So... if they're going for cheap and AMD, the 4000 series makes sense.

RE: 4000 series - probably a good choice
By kmmatney on 6/14/2011 6:09:52 PM , Rating: 5
Also, it will be made on a 32nm process, so it will be more efficient than the HD 4890, which used a 55nm process. I still use an HD 4890, and it's plenty fast at 1080p resolutions.
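
As a rough sanity check on the shrink claim: ideal die area scales with the square of the feature size (real chips scale worse, so treat this as an upper bound; the snippet below is just illustrative arithmetic):

```python
# Ideal die-area scaling for a process shrink: area goes roughly
# with the square of the feature size. Real designs scale worse.
def area_scale(old_nm: float, new_nm: float) -> float:
    return (new_nm / old_nm) ** 2

# 55nm -> 32nm: the same design would need ~34% of the original area.
print(f"{area_scale(55, 32):.0%}")
```

Less die area generally means lower power and cost per chip, which is the efficiency argument being made above.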

RE: 4000 series - probably a good choice
By B3an on 6/14/2011 9:05:41 PM , Rating: 2
Even though I'm sure the Wii U will obviously not use any form of DirectX, if it has the same graphics capabilities as the Radeon 4xxx then DX11/OpenGL 4.0 features like tessellation may not be possible. And this is a must-have feature for a console that has to last many years. Tessellation can greatly increase scene complexity and overall graphics at relatively little performance cost.

By stupidvillager on 6/14/2011 10:12:18 PM , Rating: 2
The 4xxx series can do tessellation. AMD GPUs have had it built in for years. Plus it is only "based" on that series; it is still a custom chip with who knows what optimizations. I believe parts of the Japanese garden demo at E3 had some tessellation in it.

By piroroadkill on 6/15/2011 4:49:05 AM , Rating: 2
ATI hardware has a long history of tessellation. Of course it's easy to forget about TruForm, especially since it's now entirely deprecated, but that was out with the Radeon 8500, back in DirectX 8.

RE: 4000 series - probably a good choice
By Reaper20k on 6/15/11, Rating: 0
RE: 4000 series - probably a good choice
By BZDTemp on 6/15/2011 4:37:03 AM , Rating: 2
I just cannot get excited about Nintendo choosing hardware that old. Yes, it's better than the 360 and PS3, but those are old!

As for the visual fidelity of games for PC hardware similar to the PS3/360, I think you're wrong. Yes, Uncharted, Killzone and GOW look fine, but so does, for example, Far Cry, and it runs fine on old PCs if you're content with console screen resolutions.

With Nintendo running with old tech like that, their console is gonna look ancient before it's even halfway into its life cycle. Still - maybe we should just be glad they are not recycling GameCube tech a third time :-)

By RadnorHarkonnen on 6/15/2011 8:43:23 AM , Rating: 2
I still have a 4850 CF setup running. It chews through everything I throw at it. Except for Crysis, but that's the fault of the 512MB of memory, not the card/CPU. I play at 1680x1050. Going from 55nm to 32nm would increase its performance some 20% and reduce heat output a lot. And that's before the tweaks they surely put in.

Anyway, it is the ONLY console with DX10.1 capabilities. Most likely they updated it for DX11, and it's quite up to par with today's standards.

I personally like this approach. Tested and true. You probably won't have any "RROD" problems with these cards. Although I don't like to play on consoles (the Xbox and PS3 for me are quite substandard now) because of the interface and because I am an eye candy junkie, I believe that card would do just fine at 1080p.

By 4745454b on 6/15/2011 11:45:25 AM , Rating: 2
I am SHOCKED that they put 1GB of GPU RAM on it. If they are going to cheap out and not allow DVD playback, why go all the way and give it 1GB of RAM? I would have assumed they would have given us 512MB and called it good.

I'm guessing the disc format is based on HD-DVD...
By mmp121 on 6/14/2011 2:59:32 PM , Rating: 1
Only time will tell...

By 91TTZ on 6/14/2011 4:35:21 PM , Rating: 2
Why would it be? It would be more expensive since manufacturers stopped producing them years ago. Blu-Ray took over.

By dark matter on 6/14/2011 4:58:12 PM , Rating: 2
But no licensing costs on every disc.

Over hundreds of millions of discs!

By CharonPDX on 6/14/2011 7:42:35 PM , Rating: 2
Um, just because the format was declared dead doesn't mean the rights-holders behind the technology don't have any interest in earning money from uses of their rights.....

By phatboye on 6/14/2011 7:34:39 PM , Rating: 2
Yes, HD-DVD drives have not been produced in a while, but they are not that different from BD drives. In fact, from what I remember, Toshiba HD-DVD drives were physically capable of reading BD discs with a few minor changes. That is why Toshiba was able to switch to BD drives fairly easily once they dropped HD-DVD.

Also remember CBHD: those drives are still in production, so some of them may in fact still be able to read the old HD-DVD format, since CBHD is based on the HD-DVD format.

With that in mind it wouldn't be too far fetched to imagine Nintendo using something more closely related to HD-DVD than BD if they could in fact get that technology at a fairly decent price.

By Samus on 6/14/2011 11:48:46 PM , Rating: 2
The other main advantage of using CBHD/HD-DVD is that existing DVD production lines/techniques can be used, making it super cheap.

It's actually the old-fashioned disc production methods that limited capacity to 15/30GB, as opposed to Blu-ray's 25/50GB. Updating disc production to use Blu-ray pressing equipment would theoretically allow for 25/50GB HD-DVD storage capacity with a firmware update...

or as Nintendo stated, a 'proprietary drive'

By mallums on 6/16/2011 7:44:26 AM , Rating: 2
It's the other way around: modified Blu-Ray drives can read HD-DVD discs. LG produced some dual-format drives for a while. They can write to BD-R/RE, but can only read HD-DVD.

HD-DVD might be a good format because Nintendo can probably get its hands on some manufacturing equipment cheap. I expect the stamping machines (which are basically slightly fancier DVD machines, unlike BD machines) can be found in storage here and there. CBHD is an interesting idea, but I doubt it would fly. Blu-Ray or HD-DVD is the way to bet.

By ElFenix on 6/14/2011 8:24:24 PM , Rating: 2
iirc HDDVD discs were very easy to make compared to BR discs, using a lot of the in-place DVD infrastructure.

Can't wait to hear the price
By rikulus on 6/14/2011 3:21:30 PM , Rating: 2
I'm really looking forward to hearing the price of this console, seems like it could be quite a step up from the pricing of the Wii. Should be interesting to see if people will bite, and if Nintendo can regain some momentum.

You don't hear a whole lot about 3DS sales; I gather they have been less than stellar, just recently reaching the first million units in Japan.

And Wii sales have been on the decline... I can't help but think maybe Nintendo hit some mass-market magic, but that there isn't sustained interest from that segment. Are they interested in buying a new, more expensive Wii? Time will tell. I remember when it looked like the Wii was going to blow PS2 lifetime sales numbers out of the water... with the Wii U around the corner, I wonder if the Wii will even beat PS1 numbers (currently PS2 at 150 million, PS1 at 101 million, and Wii at 86 million).

RE: Can't wait to hear the price
By gamerk2 on 6/14/2011 3:40:25 PM , Rating: 2
What happened to Wii sales wasn't shocking; everyone wanted one, for no other reason than someone they knew wanted one.

Once you hit saturation, you got 90% shovelware that didn't take advantage of the Wii's unique capabilities. As such, every game was reduced to "chop, slash, stab," and the same exact actions, frankly, get boring after the 10th game or so.

Lack of quality third party titles hurt the Wii.

RE: Can't wait to hear the price
By Aloonatic on 6/14/2011 5:50:01 PM , Rating: 2
What happened to Wii sales wasn't shocking; everyone wanted one, for no other reason than someone they knew wanted one.
True, but it's one thing to want one; it's another to be able to afford to buy one. That's where the Wii really did well: it was cheap. Not only because it was cheap, of course, the core Xbox was pretty cheap too, but the "casual" gamer (or whatever you want to call the newcomers to the market that the Wii attracted) would never have bought into the Wii in the volumes that they did if it was the same price as a PS3. That's what the gamble with the Wii U and its tablet controller might be, but we've not seen the price.

I quite agree about the lack of 3rd party titles too.

Hope they die shrink that R700 GPU at least
By Bateluer on 6/14/2011 11:13:59 PM , Rating: 2
R700s are 55nm, if I recall? The 6xx0 series is 40nm, with a move to 28nm not too far off. Hopefully they launch that R700 GPU on a 40nm process, with a possible move to 28nm later. Keep power use down, make it cheaper to manufacture.

By DanNeely on 6/15/2011 12:40:09 AM , Rating: 2
See the second paragraph: it's going to be built at 32nm initially.

Output from the controller?
By rikulus on 6/14/2011 3:11:59 PM , Rating: 3
The paragraph about the controller outputting 1080P content seems very misstated. The console is expected to output 1080P video, which would of course be via HDMI. There's no reason to have the controller be able to output video... the controller can't even generate its own video. It only displays what is streamed from the console (Nintendo has stated that the controller cannot be used to play games on the go, outside the range of the console). So, if console and controller have to be in the same room, it would be silly to have an HDMI cable attached to the controller rather than the console.

By aspartame on 6/15/2011 6:39:41 AM , Rating: 2
The new Tegra 3 is at least as powerful as the Wii. That means we are going to have a gaming-console/desktop-PC-class CPU in a tablet. Within a year, Tegra 3-based tablets will start replacing gaming consoles. The Wintel monopoly will be dead. Even the iMac will introduce a Tegra version. I am sure that Steve Jobs is testing a Tegra-based iMac right now. Like floppy disks, optical discs will also vanish in the near future. They are a big waste of space. MicroSD is a better alternative.

Too little too late
By slickr on 6/15/2011 7:47:24 AM , Rating: 2
So the Wii U is using a Radeon 4000 series GPU? They probably aren't aware that the next Xbox 720 and PS4 will use DX11 parts and are coming out somewhere in 2013, which is basically only a year and a half later.

They also don't provide big storage with a hard drive, and this will hurt their sales a lot as well.

People want to download movies, music, mods, saved games, and map packs to their systems, and 8GB of storage won't cut it.

By Makaveli on 6/15/2011 2:53:29 PM , Rating: 2
Someone above said the 4800 series is more than enough for 1080p gaming. That's a stretch unless you are talking about low detail with no AA. I just replaced a 4890 with a 6950 that I unlocked into a 6970, and the 4890 will give sub-60fps on most modern games at 1920x1080 on high detail with, say, 2xAA or less. Try playing some modern games, not games from 2008.

I would hardly call either the Xbox 360 or the PS3 an HD system. First, most Xbox games are rendered at 720p and the system upscales them when you have your TV set to 1080p. The PS3 is even worse, because it has some games that aren't even rendered at 720p and go through the same upscaling.

There are maybe a handful of games -- fewer than you can count on ten fingers -- that are actually rendered at 1080p natively on both systems.

I'm hoping the next gen systems fix this and we actually see 1080p native games.

By edge929 on 6/14/11, Rating: -1
By gamerk2 on 6/14/2011 4:45:34 PM , Rating: 2
Consoles don't need state-of-the-art hardware, since you're accessing the hardware at a MUCH lower level than you would when going through an OS and using device drivers. Plus, using an older arch keeps costs down.

It should be noted: it hasn't been stated WHICH 4000 series GPU the Wii U is based on, so don't automatically assume they are using a 4800...

By FITCamaro on 6/14/2011 5:10:56 PM , Rating: 1
Even a 4870 is a pretty potent GPU today. And as mentioned, faster than the 360's or PS3's GPUs.

By Manch on 6/14/2011 6:22:39 PM , Rating: 2
The 360's GPU was a proprietary design that was equivalent in performance to the current crop of video cards that were out when it launched. The PS3 which released a year later was on par with the 360 and those cards as well. While I think the poster was a little wild with his comparison, I see his concern.

This system will release in 2012 with an architecture that will be 4 years old. If that's the case then it will be better than the 360/PS3 but will be way inferior to the next gen consoles released by MS/Sony.

It'll become the new Dreamcast. Great at first, but abandoned within 2 years. Nintendo may have some 3rd-party support for its new console, but that will evaporate with MS/Sony's next-gen consoles.

I love Mario games, Legend of Zelda and all of that, but Nintendo needs to compete on graphics also. Otherwise their consoles will have no staying power. MS/Sony already proved that they can copy Nintendo's offerings and compete.

Of course, all of this is just guessing based on "leaked" info. Maybe we'll be surprised when it launches. After all, it's a console and it doesn't have to compete with the ridiculously high resolutions that us PC gamers love, so yeah, not necessarily state of the art, but it does need to be able to compete on everything else.

By omnicronx on 6/14/2011 6:59:28 PM , Rating: 2
The 360's GPU was a proprietary design that was equivalent in performance to the current crop of video cards that were out when it launched. The PS3 which released a year later was on par with the 360 and those cards as well.
Comparing the graphics units of the PS3 and 360 is like comparing apples to oranges. Simply put, the ATI Xenos-based GPU found in the 360 works alone, while the RSX found in the PS3 works side by side with the 7-SPE Cell processor.

From a pure theoretical standpoint the GPU in the 360 is actually better, but because the PS3's RSX essentially works in unison with the Cell processor, they are pretty much equals in terms of graphical performance.

The point is, until we know more about the CPU architecture that the new Wii U will employ, it's pretty much pointless talking about how good its graphical performance is going to be...

By Manch on 6/14/2011 10:32:31 PM , Rating: 2
No, it's not. They may work differently, but they are both integrated to work with their CPUs for the sole purpose of playing games. To say that the 360's GPU is completely independent and that the CPU has no bearing on performance is absurd. The Cell proc in the PS3 doesn't help the graphics that much. Yeah, it will have an effect, but the meat and potatoes are in the GPU.

I already said in my previous post that we're talking about "leaked" info. Its accuracy is tenuous at best. I'm pretty sure final silicon hasn't even been produced yet. I'd bet money that the 4000 series GPU was probably a starting point for the Wii U, as they didn't just start planning it out yesterday.

When people hear they are releasing a console with 4-year-old tech, we're going to question whether or not it will be good enough to compete with MS/Sony's new consoles. Since those consoles are releasing two years after Nintendo's, the GPU tech will be 6 years old by then.

Speculation and conjecture isn't pointless; it's my escape from cubicle hell. I'm pretty sure the Big N realizes they cannot rely on blue-ocean BS or more crap to wave around. They need genuine innovation and graphics.

By icanhascpu on 6/14/2011 5:35:18 PM , Rating: 1
So you're telling me * E PEEN ALERT E PEEN ALERT *


Consoles do not need the hardware you see on PCs due to the nature of consoles being much more tightly controlled allowing programmers to optimize games much easier, not to mention the ease of optimizing for a small set of TV resolutions.

This is coming from a PC gamer.

By edge929 on 6/14/2011 6:03:05 PM , Rating: 2
I think you missed something in my post, it was just a reference to a cheap, last-gen GPU, no epeen stroking was done. I'm intimately familiar with programming for XYZ hardware and understand optimizations play a big role, hence why I alluded to this.

This coming from a PC gamer and programmer of 12 years.

By mallums on 6/16/2011 8:02:50 AM , Rating: 2
Consoles need the hardware they need. High end enthusiast-oriented machines need high end hardware, because they are going to have a 10-year lifespan.

The Wii-U is not going for that market. They are going after the same market that the Wii went after. Like the Wii, they don't need the horsepower. What they need is a good gimmick, and otherwise good gameplay. The Wii has the Wii controller, which took Sony and Microsoft years to catch up to. Sony's version is laughable. The Kinect at least has good hackability value.

With the Wii-U, the gimmick is obviously the fancy controller, again. This time, I'm skeptical, because I don't think that most people can divide their attention between two screens all that well. The DS really doesn't succeed, screenwise, and only a few games really make good use of both.

However, if the gameplay is good enough, people will forgive the awkwardness. It really is all about the games.

Nintendo sucks
By Roy2001 on 6/14/11, Rating: -1
RE: Nintendo sucks
By icanhascpu on 6/14/2011 5:39:54 PM , Rating: 1
Defining 'better' by hardware is a childish and foolish measure. So I'm going to have to go ahead and say you suck. Not Nintendo.

And I don't even own a Wii/DSi.

RE: Nintendo sucks
By Helbore on 6/14/2011 5:45:20 PM , Rating: 3
Nintendo sucks, and yet it currently has the most successful console on the market.

I'm not a fanboy, as I own a PS3 (and also don't give a monkey's what console anyone else chooses to buy. They're just toys, after all!) but you can't knock them when they've managed to take cheap, old tech and turn it into a winner.

It might not be what you or I want in a console, but it's clearly what an awful lot of people want - and let's not forget that Sony and Microsoft were playing catch-up in terms of user control.

RE: Nintendo sucks
By Nutzo on 6/14/2011 6:12:55 PM , Rating: 2
Yet the DS line outsells the PSP line....

Could it be that the DSi has more games available (all the DS games) than the PSP?

Both the DS and the Wii were marketed to the casual gamer and families, and Nintendo had great success in that market.
I'm not sure they are having much success moving upmarket.

If one of my kids' DS Lites dies, I can't see replacing it with a DSi-3D due to the much higher cost. I'd probably buy a used/refurbed DS Lite instead. Same with the Wii, since it's now only $149. Almost not worth getting a broken one fixed.

I doubt I'd buy the Wii U, especially if the cost is about the same as the Xbox.

RE: Nintendo sucks
By inperfectdarkness on 6/14/2011 7:15:04 PM , Rating: 2
and yet, you completely overlook the fact that a 2 year old radeon 4890 isn't taxed at all for 1080p gaming. who gives a crap about power levels higher than that--if your TV (which is what 99% of console gamers are going to play on) is the limiting factor?

it's a documented fact that hardware has closed the gap with software--even to the point of surpassing it. there's simply no need to put a $600 GPU solution in a console that will be putting out (at best) 1/2 the resolution of top-end PC monitors (2Mp vs. 4Mp).

besides, why shouldn't next generation consoles be "better" than the previous generation? nintendo didn't win the 6th generation console wars (3rd place, actually) so going "over the top" on 7th generation would have been a stupidly risky gamble. instead, nintendo went with "sufficient" on hardware & focused on user experience--and it paid off.

because of that success, nintendo is willing to step forward again & put out a more powerful console. ps3/n64 both demonstrate that "most powerful" != sales success. these wii u stats are sufficient for 1080p gaming, imho. that's all that's needed.

if the DS sucks so much, why is it DESTROYING everything else--smartphones and PSP included? smartphones don't have the buttons of a dedicated portable game system. the psp (while more powerful) doesn't have the library, userbase, or quality of games offered by the DS. again, power != success. heck, ask anyone in the UFC. power cannot compensate for superior technique.
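
The "2Mp vs. 4Mp" figures mentioned a couple of comments up are easy to verify; a quick sketch (Python used only for the arithmetic):

```python
# Pixel counts for a 1080p TV versus a 30" 2560x1600 PC monitor.
def megapixels(width: int, height: int) -> float:
    return width * height / 1_000_000

tv = megapixels(1920, 1080)       # ~2.07 Mp
monitor = megapixels(2560, 1600)  # ~4.10 Mp
print(f"{tv:.2f} Mp vs {monitor:.2f} Mp ({monitor / tv:.1f}x the pixels)")
```

So a top-end monitor of the era pushes roughly twice the pixels of a 1080p TV, which is the gap being argued about.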

RE: Nintendo sucks
By GuinnessKMF on 6/14/2011 8:16:07 PM , Rating: 2
1080p is not the limiting factor; you can tax a modern GPU plenty at 1080p or less. Complex scenes often require multiple renderings of the same frame for anti-aliasing, reflections, and now even temporal anti-aliasing (the reason film looks good at 24 fps when video games don't is the temporal 'aliasing' inherent in film). GPUs are also responsible for a lot more than just displaying the contents of a game: physics and geometry are now often handled on the GPU, sometimes even some AI.

I do agree that 2-year-old hardware can handle most of what is needed to make a good game (in fact, good games don't *need* good graphics, but they're nice), but I think your understanding of graphics card requirements is a bit lacking.

"top-end" PC monitors aren't used for gaming, they're used by graphic professionals, gaming on PCs is most often done at 1080p (this is changing rapidly with multi-monitor gaming).

RE: Nintendo sucks
By BZDTemp on 6/15/2011 2:20:28 PM , Rating: 2
"top-end" PC monitors aren't used for gaming...

Sure they are. Of course this is not mainstream, but among those who run the latest and greatest hardware, gaming on 30" 2560x1600 monitors, and lately the 27" 2560x1440 ones, is not unheard of. In fact it's likely more common than you think.

And why not? A 30" monitor is costly compared to even very good 24" monitors, but considering it's likely to outlast 3-4 generations of graphics cards and at least two generations of PCs, the cost is not that high. Plus, for those of us who have been gaming for a long time, all PC equipment is cheap as dirt (my first 16-bit sound card cost me approx. $1,000 in today's money, and it was a good deal at the time).

RE: Nintendo sucks
By GuinnessKMF on 6/15/2011 2:55:06 PM , Rating: 2
I knew that one line in my response would get nit-picked out and the substance ignored.

The point is that resolution alone does not determine the graphical processing power required. Resolution is a factor, but you could tax the best GPU solution in existence right now by rendering a complicated enough scene at 640x480.

A scene rendered at 1920x1080 with a single-pass filter (say reflections, un-optimized AA, etc.) requires roughly the same processing power as a 2560x1600 scene without the filter.
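As a rough sanity check of that equivalence (counting pixels touched only -- real GPU cost also depends on shader work, bandwidth, and so on):

```python
# Back-of-envelope check: a 1080p frame rendered twice touches
# roughly as many pixels as a single 2560x1600 pass.
two_pass_1080p = 1920 * 1080 * 2  # 4,147,200 pixel-passes
one_pass_2560 = 2560 * 1600       # 4,096,000 pixels

# The two figures differ by only about 1%.
print(two_pass_1080p, one_pass_2560)
```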

All that said, I will stand by my statement that the highest tier of monitors is not "aimed" at gaming (yes, obviously there are people using them for it); most monitors above 1920x1080/1200 have noticeable lag/response-time issues, as they are designed for proper color reproduction rather than gaming. If you want to talk about multi-monitor setups using HydraVision, or whatever they're calling it now, then yes, that's targeted at gaming above 1080p resolutions.

RE: Nintendo sucks
By B3an on 6/14/2011 9:15:35 PM , Rating: 2
The guy above is right.

But idiots like you also fail to realise that this hardware will probably have to last at least 5-6 years, and it isn't even out yet. The Wii U will likely be around until at least 2017.

Even now, PC games can bring a 4890 at 1080p res to it's knees. And if the GPU's graphics capabilities are also like the 4xxx series, it won't even be able to do tessellation and other DX11/OpenGL 4.0-comparable effects. You obviously have no idea what you're talking about.

RE: Nintendo sucks
By RussianSensation on 6/15/2011 2:03:22 AM , Rating: 2
Remember, PC hardware has to deal with a HUGE API bottleneck. John Carmack talks about it in his 20 minute interview at E3. The fact is a modern GPU is at least 10x more powerful than what's found in the PS3 and Xbox360. But games barely look 2x better on the PC.

Programmers code directly to the hardware on a console, while on the PC they have to deal with the bloated OS/API. Not only that, but since consoles are a fixed ecosystem, it's much easier to optimize. Personally, it would have made much more sense to get a 32nm HD5770 variant, which has similar performance to the 4870 plus DX11. However, I can see why Nintendo went with the cheaper HD4000 series, since the HD5770 isn't fast enough to use DX11-specific features such as tessellation anyway.

Overall, if you look at the HD4870 vs. the 7900GS (PS3 level), the Radeon is about 4-5x faster in modern games.

Of course we don't know the exact specs of the HD4000 series in the Wii-U, but if it's anything along the 4870/4890, it's fast enough for Nintendo as their strategy isn't to sell their consoles at a loss, nor to price them at $600 USD at launch.

RE: Nintendo sucks
By SilthDraeth on 6/15/2011 4:17:32 AM , Rating: 2
Except you just mentioned the API bottleneck; take that away, and the 5770 should be quick enough for all DX11 features.

I have a 4870 and it is still pretty fricken fast.

RE: Nintendo sucks
By TSS on 6/15/2011 11:00:38 AM , Rating: 2
John Carmack I'll trust on code, but not on beauty. They've admitted themselves, when Quake Live was being developed, that Quake 3 didn't look as good as it could have -- they were so busy innovating on the code that they didn't bother looking at stuff like light placement (the lighting in Quake Live is vastly different). And Rage -- id's new title -- really doesn't look that good. I wasn't very impressed during the first tech demos of the engine, and now that the game still isn't out and the UE4 tech demo is... well, it looks stale.

IMO that also shows the real culprit behind games not looking as good as the hardware allows: development overhead. If it were Carmack coding Rage, it would've been out years ago. The problem is the game is too big for just him to code, and the others aren't quite the genius he is, unfortunately.

It was shown with, I believe, Doom for the iPhone. id Software said it couldn't be done -- it would take a team of coders a month, and that would cost too much. John Carmack then did it by himself in a weekend.

While he remains that genius, all games these days are made by that team of coders, thus limiting the potential of the game from the start.

Aside from all that, it's a fact that most games are console ports these days. Since they're developed for the consoles, most models have lower polycounts, which cannot be changed unless the entire model, skin, and animations are redone (in the worst case). Developers consider this too little bang for the buck, so they pick the easiest route: textures are made at double the console resolution and scaled back 50% for the consoles, while the original is used for the PC. This also works because consoles have an abysmally small amount of memory (512MB total for the PS3's texture and system RAM; my 2-year-old PC is sporting 6GB of system RAM and 1GB of texture RAM). But for this same reason, even double the resolution looks ugly on PCs, because they can handle 4 times the resolution by now, and on select props even 8 times.

Stuff like more vegetation, more physics, more debris, better-looking models, and more background activity is all left out, even though PCs can handle it.

Not to mention games are being redesigned so consoles can handle them. Prime example: Crysis 2. It looks worse than Crysis 1 because consoles can't even handle Crysis 1. It's not an open world, rather a huge corridor, and the jungle vegetation of Crysis 1 IMO looks a lot better than the urban environments of Crysis 2.

So I'm sorry, but on this one it's the developers themselves.

RE: Nintendo sucks
By someguy123 on 6/15/2011 4:17:13 PM , Rating: 2
I don't know what conventions you've been to, but the engine looks amazing. You have a problem with the drab art style, not the engine itself. The same could be said about games built on id Tech 4. id made the decision to design their games tinted brown and "dirty". The engine is capable of much more.

The UE4 tech demo is merely a demo. If you look up the original UE3 screenshots and the original tech demo (before the public demos), it looked about as good as UE4 in character rendering. The same could be said about the alpha build of Unreal Tournament 3.

I don't agree with people's assessment that the PC has 10 times the bloat, but there are overhead problems, as well as having to scale across various builds compared to a single hardware spec. When it comes to brute force and just dumping massive textures on objects, you end up with aliasing issues like The Witcher 2, which relies on FSAA to be cleaned up. Looking at Crysis, the vegetation looks great, but there are a lot of texture problems, and indoor areas don't look nearly as good as the replicated vegetation. Even an engine like CryEngine 2, relying heavily on sheer processing power, needs to cut corners. To be honest, I still think Uncharted 2 is the best-looking game regardless of platform, though obviously it's limited in resolution.

RE: Nintendo sucks
By piroroadkill on 6/15/2011 4:50:52 AM , Rating: 2
4890 is brought to its knees? I don't really think so, and it's massively more powerful than the existing consoles.

Also, you call someone an idiot, yet make a common mistake: "it's knees". It is knees?

RE: Nintendo sucks
By superunknown98 on 6/15/2011 4:00:58 PM , Rating: 2
You also have to take into account that Sony and Microsoft start designing new consoles way ahead of release. So if a new Xbox is released in 2014, its CPU and GPU will certainly be a year or two old by that point. I estimate the Wii U's GPU will only be 2 or 3 generations behind the next consoles from Sony and Microsoft. Obviously a big gap, but not as big as the current one.

RE: Nintendo sucks
By someguy123 on 6/15/2011 4:23:28 PM , Rating: 2
And idiots like yourself fail to realize that Microsoft and Sony subsidize hardware.

Do people really think they're entitled to cheap hardware subsidized heavily by corporations? Nintendo is a for-profit company. You can't judge the evolution of their console against arbitrary hardware specs that have cost, and will cost, Sony and Microsoft billions in losses to produce each console refresh.

RE: Nintendo sucks
By mircea on 6/15/2011 12:57:28 PM , Rating: 2
Even more so: the most powerful console has never finished in first place in any generation since the first console war. Just remember:
Atari - Intellivision
NES - Sega Master System
SNES - Neo Geo / Sega Genesis
PlayStation - N64 / Sega Saturn
PS2 - Xbox / GameCube
Wii - PS3 / Xbox 360

I don't know exactly where to place the Dreamcast: with the N64/PlayStation or with the PS2/GameCube/Xbox?

Anyway, if you look at all of these, you will see that with the SOLE exception of the Wii, the winner was the one with the most third-party support. That's what Nintendo lost with the N64 and has yet to really get back; even though the Wii helped, it's still a fluke, since the Xbox 360 and PS3 have more varied third-party support.
