
Competition to DX 11 heats up

DirectX 11 has been gathering a lot of support, especially over the last six months as ATI has released an entire top-to-bottom lineup with support for the standard. Although DirectX is probably the best known collection of Application Programming Interfaces for games, OpenGL still remains relevant as a competitor in driving gaming technology forward.

OpenGL is managed by the Khronos Group, which recently released the OpenGL 4.0 specification. The twelfth revision to the original spec adds many new features, some of which are also supported by current hardware through the new OpenGL 3.3 spec.

OpenGL 3.3 adds support for OpenGL Shading Language (GLSL) 3.3, which includes built-in functions for getting and setting the bit encoding for floating-point values. There are also new color blending functions and performance enhancements.

The real meat is in OpenGL 4.0, adding support for GLSL 4.0 and the fragment shader texture functions it allows. Per-sample fragment shaders and programmable fragment shader input positions will allow for increased rendering quality and anti-aliasing flexibility. The shader subroutines have been redesigned for significantly increased programming flexibility.

New tessellation stages and two new corresponding shader types are introduced. Tessellation control and tessellation evaluation shaders operate on "patches" (fixed-sized collections of vertices). Tessellation can increase visual quality significantly by taking a rough object and generating new vertices to smooth out the object and provide more detail without excessive performance penalties. These two new shader stages will enable the offloading of geometry tessellation from the CPU to the GPU.

A new object type called "sampler objects" will allow the separation of texture states and texture data. 64-bit double precision floating point shader operations and inputs/outputs will increase rendering accuracy and quality, while performance improvements will come from instanced geometry shaders, instanced arrays, and a new timer query. The drawing of data generated by OpenGL or external APIs such as OpenCL can be done without any CPU intervention.

The new spec is also supposed to improve interoperability with OpenCL for accelerating computationally intensive visual applications. OpenCL competes with DirectCompute, which was introduced with DirectX 11 and also runs on DirectX 10 and 10.1 class hardware.

Support for both the Core and Compatibility profiles first introduced with OpenGL 3.2 is continued, enabling developers to use a streamlined API or retain backwards compatibility for existing OpenGL code, depending on their market needs.

ATI has been working extensively on OpenGL support and in shaping the standard. The functionality introduced in OpenGL 3.3 is supported by all ATI discrete graphics products released since the spring of 2007.  That includes the consumer Radeon lineup and the workstation FirePro and FireGL cards.

The ATI Radeon HD 5900 and 5800 series are also fully compatible with the OpenGL 4.0 standard, including tessellation and integration with the OpenCL API. This means that full OpenGL 4.0 GPU acceleration will be available when software that is coded for the standard hits the market.

Almost all of the OpenGL 4.0 functionality is also available on ATI Radeon HD 5400, 5500, 5600, and 5700 series graphics cards, with the exception of double precision support. ATI will enable this feature at a later date.

The features are enabled through the ATI Catalyst OpenGL 4.0 preview driver. Full support for OpenGL 4.0 will eventually be folded into the regular monthly Catalyst driver updates.

"The fact that we are able to announce our support for OpenGL 3.3 and OpenGL 4.0 at launch is an incredible feat on the part of our OpenGL software team, and speaks volumes to the commitment and continued support that the entire team brings to the many developers utilizing OpenGL.  In fact, with the launch of these updates, industry pundits have commented that OpenGL is in for a renaissance of sorts.  As a company that believes in and encourages open and industry standards, maintaining OpenGL as a strong and viable graphics API is important to AMD," stated Chris James, Social Media Strategist for the company's Global Communications team in a blog post.


RE: I don't miss the old days
By Zirconium on 3/26/2010 6:49:47 PM , Rating: 2
If OpenGL is serious about being status-quo again, they better make sure nobody gets to make extensions to it but themselves.

That's not the way the Khronos Group works, at least as far as OpenGL is concerned. Wikipedia gives a good summary of their methods.

Khronos doesn't "push" vendors the way Microsoft does. Microsoft may put a feature into an upcoming version of D3D that is not supported by any current graphics card, which then prompts the vendors to support it because clearly, ATI's Radeon 5450, which supports DirectX 11, is better than Nvidia's GeForce 295, which only supports DirectX 10.1 - it's a whole DirectX 0.9 better! I would posit that the Khronos Group never will either; it is not in ATI's, Nvidia's, or Intel's interest to ratify a standard that none of them support.

The way OpenGL advances is that the different vested companies create extensions, and then other companies adopt them. Eventually, the Architecture Review Board ratifies an extension, and in the next release of OpenGL it becomes part of the standard spec.

I can still remember how messy OpenGL was in terms of vendor extensions; NVIDIA would name their extensions NV_EXT_* to alienate ATI and S3 cards, so that if you wanted to play a certain game you HAD to buy an NVIDIA card.

No, they didn't do it to "alienate ATI and S3." They did it because that's how the spec handles introducing new functions. It's a poor man's way of doing namespaces. Initially, a new function is prefixed or suffixed with NV; then, when other companies start adopting it, that suffix can be changed to EXT; when the Architecture Review Board gets to it, it will be changed to ARB; and when it is included in the GL core spec, the suffix is dropped altogether.

"Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be." -- Steve Ballmer

Copyright 2016 DailyTech LLC.