"G80" To Feature 128-bit HDR, 16X AA
October 5, 2006 11:21 AM
More G80 features abound
As if we mere mortals needed more reasons to be excited about G80, here are a couple more tidbits: 128-bit high dynamic range and antialiasing with 16X sampling.
The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards technically performs 64-bit rendering. G80's new HDR approach comes from OpenEXR, a file format developed by Industrial Light & Magic (the Lucasfilm folks). In a nutshell, we will have 128-bit floating-point HDR as soon as applications adopt code to use it.
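To see where those bit counts come from, here is a minimal C++ sketch (our own illustration, not any vendor's pipeline code): the labels simply reflect the per-pixel storage cost of four floating-point channels.

    #include <cstdint>

    // 16-bit floating-point storage, the "half" type popularized by OpenEXR
    // and Cg. Shown here only as an opaque 16-bit placeholder; a real
    // implementation decodes a sign bit, 5 exponent bits, and 10 mantissa bits.
    struct half16 { std::uint16_t bits; };

    struct HdrPixel64  { half16 r, g, b, a; };  // 4 x 16 bits =  64 bits/pixel
    struct HdrPixel128 { float  r, g, b, a; };  // 4 x 32 bits = 128 bits/pixel

    static_assert(sizeof(HdrPixel64)  == 8,  "64-bit HDR pixel");
    static_assert(sizeof(HdrPixel128) == 16, "128-bit HDR pixel");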
OpenEXR's features include the following (a short write example follows the list):
- Higher dynamic range and color precision than existing 8- and 10-bit image file formats.
- Support for 16-bit floating-point, 32-bit floating-point, and 32-bit integer pixels. The 16-bit floating-point format, called "half", is compatible with the half data type in NVIDIA's Cg graphics language and is supported natively on their new GeForce FX and Quadro FX 3D graphics solutions.
- Multiple lossless image compression algorithms. Some of the included codecs can achieve 2:1 lossless compression ratios on images with film grain.
- Extensibility. New compression codecs and image types can easily be added by extending the C++ classes included in the OpenEXR software distribution. New image attributes (strings, vectors, integers, etc.) can be added to OpenEXR image headers without affecting backward compatibility with existing OpenEXR applications.
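To make the "half" bullet concrete, the sketch below writes a 16-bit floating-point RGBA image with OpenEXR's C++ API. It follows the basic write pattern from OpenEXR's programmer's guide; the flat gray contents and the file name gray.exr are our own illustration.

    #include <ImfRgbaFile.h>   // OpenEXR's simple RGBA interface
    #include <vector>

    int main()
    {
        const int width = 256, height = 256;

        // Imf::Rgba stores each channel as "half" -- the same 16-bit
        // floating-point type as in NVIDIA's Cg language.
        std::vector<Imf::Rgba> pixels(width * height,
                                      Imf::Rgba(0.18f, 0.18f, 0.18f, 1.0f));

        // Scanline-based output file; compression and other header
        // attributes take their defaults.
        Imf::RgbaOutputFile file("gray.exr", width, height, Imf::WRITE_RGBA);
        file.setFrameBuffer(pixels.data(), 1, width);  // xStride = 1 pixel, yStride = width
        file.writePixels(height);
        return 0;
    }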
NVIDIA already has 16X AA available for SLI applications. The GeForce 8800 will be the first card to feature 16X AA on a single GPU. Previous generations of GeForce cards have only been able to support 8X antialiasing in single-card configurations.
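For context, a game or benchmark of the era could probe for that capability through Direct3D 9's CheckDeviceMultiSampleType. The sketch below is an assumption-laden illustration (the X8R8G8B8 back-buffer format and windowed mode are our choices), not NVIDIA sample code:

    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Ask whether the default adapter's hardware rasterizer supports
        // 16-sample multisampling for a common back-buffer format.
        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, TRUE,              // windowed mode
            D3DMULTISAMPLE_16_SAMPLES, &qualityLevels);

        std::printf("16x MSAA %s\n", SUCCEEDED(hr) ? "supported" : "not supported");
        d3d->Release();
        return 0;
    }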
This new 16X AA and 128-bit HDR will be part of another new engine, similar in spirit to the PureVideo and Quantum Effects engines also featured on G80.
RE: PS3's video chip?
10/5/2006 1:49:08 PM
My guess is that he meant the PC will once again be capable of putting out better graphics than the latest consoles. This will likely be true. But then, it always has been. The big upside to a console will always be the amount of gaming power you get for a "reasonable" price.
Of course, I might not be the right person to talk about "reasonable" prices. I owned all three of the major consoles during the last generation, as well as an overclocked gaming PC. I currently own a gaming PC and an Xbox 360, and I plan on buying a Wii when it launches and, eventually, a PS3 down the road. So, I may not be the "voice of reason". LOL
"Death Is Very Likely The Single Best Invention Of Life" -- Steve Jobs
"Prepare to be Punished": Microsoft is Killing OneDrive With Cuts, Blames Users
November 3, 2015, 8:23 PM
Apple's New "Magic" Peripheral Line Packs High Tech, High Prices
October 13, 2015, 9:39 PM
Samsung Adds 2 TB 850 EVO, PRO SSDs for $800, $1000
July 7, 2015, 4:23 PM
Seagate Senior Researcher: Heat Can Kill Data on Stored SSDs
May 13, 2015, 2:49 PM
How to Recover Most Apps After Your NVIDIA Driver Crashes in Windows 10
March 30, 2015, 12:54 PM
Tinkerer Gets Old School Mac Plus Running on the Modern Web
March 24, 2015, 6:41 PM
Latest Blog Posts
Sceptre Airs 27", 120 Hz. 1080p Monitor/HDTV w/ 5 ms Response Time for $220
Dec 3, 2014, 10:32 PM
Costco Gives Employees Thanksgiving Off; Wal-Mart Leads "Black Thursday" Charge
Oct 29, 2014, 9:57 PM
"Bear Selfies" Fad Could Turn Deadly, Warn Nevada Wildlife Officials
Oct 28, 2014, 12:00 PM
The Surface Mini That Was Never Released Gets "Hands On" Treatment
Sep 26, 2014, 8:22 AM
ISIS Imposes Ban on Teaching Evolution in Iraq
Sep 17, 2014, 5:22 PM
More Blog Posts