
An alleged screen capture of the ATI version of Rydermark (saved and reposted to preserve metadata)

An alleged screen capture of the NVIDIA version of Rydermark (saved and reposted to preserve metadata)

A difference map of the two images
"Rydermark developers" make bold claims which turn out to be nothing more than a Photoshop hoax

It would appear The Inquirer jumped the gun on a story accusing NVIDIA of lying about full DirectX 9 support. The story accused NVIDIA of preventing developers from using 24-bit or 32-bit shader precision, instead forcing them to use 16-bit shader precision because it is faster. This matters because DirectX 9 compliance requires 24-bit shader precision or better. "Rydermark" is not yet a commercially shipping application, and very little information has been published to confirm its authenticity.

The original story lacked physical evidence; The Inquirer claimed its sources were developers of the program. Images allegedly proving that NVIDIA forces developers to use 16-bit shader precision were later posted on The Inquirer, comparing a scene in Rydermark 2006 as rendered on ATI and NVIDIA graphics cards.

It turns out the images "proving" NVIDIA's wrongdoing were nothing more than poorly executed Photoshop edits. The NVIDIA-rendered image appeared to have blurrier water, while the ATI-rendered image had sharper water detail. However, the ATI-rendered image simply didn't look right, with rough cut-offs and a creation date three minutes after the NVIDIA-rendered image. A difference image of the two JPG files can be seen to the right, with the outline of the modified area clearly visible in the ATI image. This suggests the NVIDIA image was the original source image and that the ATI version was modified afterwards.
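A difference map works by subtracting the two images pixel by pixel: untouched regions come out black, while edited regions stand out. Below is a minimal stdlib-only sketch of the idea, representing images as lists of (R, G, B) tuples; an actual comparison of the Rydermark screenshots would first decode the two JPEGs with an imaging library.

```python
# Per-pixel absolute difference of two RGB images -- the technique behind
# the difference map shown at right. Images are represented here as flat
# lists of (R, G, B) tuples rather than decoded JPEG data.

def difference_map(img_a, img_b):
    """Return the per-pixel absolute channel differences of two images."""
    return [
        tuple(abs(a - b) for a, b in zip(pa, pb))
        for pa, pb in zip(img_a, img_b)
    ]

# Tiny synthetic example: only the second pixel was "retouched" in img_b.
img_a = [(10, 20, 30), (100, 100, 100), (0, 0, 0)]
img_b = [(10, 20, 30), (120,  90, 100), (0, 0, 0)]

diff = difference_map(img_a, img_b)
# Identical pixels yield (0, 0, 0); the edited pixel yields (20, 10, 0),
# so the outline of any modified area becomes clearly visible.
```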

A comparison of the metadata from both images reveals that the NativeDigest value is identical in both, but that the two carry different InstanceIDs. This is consistent with an image that was modified and saved twice. In the author's defense, images created and saved on his computer carry distinct, easily identifiable metadata tags. These are not present in the two images supplied by The Inquirer for "Rydermark" -- suggesting the images may not have been modified by the author.
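Fields like InstanceID live in the XMP packet, which Photoshop embeds as plain XML text inside the JPEG, so they can be pulled out with a simple text search. A sketch of that extraction is below; the `xmpMM:InstanceID` tag name comes from Adobe's XMP Media Management schema, and the packet shown is a made-up stand-in, not data from the actual Rydermark files.

```python
import re

# XMP metadata is stored as readable XML text inside the JPEG, so a field
# like InstanceID can be matched straight out of the file contents.
# This packet is a fabricated example for illustration only.
xmp_packet = "<xmpMM:InstanceID>uuid:8B14D5F2AAAA11DA</xmpMM:InstanceID>"

def instance_id(xmp_text):
    """Extract the xmpMM:InstanceID value from an XMP packet, if present."""
    m = re.search(r"<xmpMM:InstanceID>([^<]+)</xmpMM:InstanceID>", xmp_text)
    return m.group(1) if m else None

# Running this over both Rydermark shots would show two distinct IDs,
# i.e. two separate save operations on what began as one image.
```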

There has been an outcry over The Inquirer's images on various forums, including Ace's Hardware, AnandTech and Beyond3D.

Update 07/19/2006: The Inquirer has posted something resembling a rebuttal to this article. Incredibly, a user from the forums managed to track down some of the stock art used in the screen renders, and believes the entire image is actually fraudulent.

Comments


By Spoonbender on 7/19/2006 6:43:34 AM , Rating: 2
I'd arrived at the same conclusion, although only through guessing.
Just seemed unlikely that NVIDIA, who made a big fuss about the GeForce 6 (or was it 5?) running 32-bit internally, would ditch that and go 16-bit only on the GeForce 7.
Moreover, it seemed fishy that it'd taken so long before someone spotted it. I mean, it should be relatively easy for developers to spot. They'd run into *a lot* more precision issues if it ran 16 bit, which should be easily noticeable.

RE: Clever
By defter on 7/19/2006 6:55:48 AM , Rating: 2
All desktop cards output images at 8 bits per color of precision (some workstation cards use 10-bit precision). There is no general-purpose display that can show 16 bits, let alone 32 bits, per color.

RE: Clever
By The Cheeba on 7/19/2006 7:02:48 AM , Rating: 2
Just FYI, that's 8 bits per subpixel. You get 256 shades per subpixel, which correlates to 256^3 colors on better displays.

RE: Clever
By Sunday Ironfoot on 7/19/2006 9:07:29 AM , Rating: 2
It's 8 bits for each of the three main colours, Red, Green and Blue, or 256 shades per colour. Hence 24-bit colour, or 16.7 million colours.
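The arithmetic in this comment is easy to verify:

```python
# 8 bits per channel gives 2^8 = 256 shades; three independent channels
# combine to 256^3 total colours -- the familiar "16.7 million" of
# 24-bit colour.
shades_per_channel = 2 ** 8
total_colours = shades_per_channel ** 3  # 16,777,216
```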

RE: Clever
By Ecmaster76 on 7/19/2006 10:18:46 AM , Rating: 2
It's not the colors, it's the data precision.

They are still running 32-bit color or whatever you pick. This is completely different from the ALU precision, which is the issue at hand. 16-bit calculations are pretty inaccurate and can lead to noticeable geometry distortions (like old Quake engine games had with MD2, though that's more of an engine problem).
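The precision gap the commenter describes can be demonstrated with Python's struct module, which supports IEEE 754 half (16-bit, format 'e') and single (32-bit, format 'f') floats. This illustrates float precision in general, not NVIDIA's shader ALUs specifically:

```python
import struct

def roundtrip(value, fmt):
    """Pack a value into the given float format and unpack it again,
    showing how much precision that format preserves."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 0.1
half = roundtrip(x, "<e")    # 16-bit half precision: noticeably off
single = roundtrip(x, "<f")  # 32-bit single precision: far closer

err16 = abs(half - x)
err32 = abs(single - x)
# err16 is orders of magnitude larger than err32, which is why 16-bit
# shader math can produce visible artifacts where 24/32-bit does not.
```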

RE: Clever
By Gooberslot on 7/19/2006 6:03:37 PM , Rating: 2
Actually, a CRT can output an effectively unlimited number of colors; it's only limited by what the video card can produce.



Copyright 2016 DailyTech LLC.