


Competition to DX 11 heats up

DirectX 11 has been gathering a lot of support, especially over the last six months as ATI has released an entire top-to-bottom lineup with support for the standard. Although DirectX is probably the best known collection of Application Programming Interfaces for games, OpenGL still remains relevant as a competitor in driving gaming technology forward.

OpenGL is managed by the Khronos Group, which recently released the OpenGL 4.0 specification. The twelfth revision of the original spec adds many new features, some of which are also supported by current hardware through the new OpenGL 3.3 spec.

OpenGL 3.3 adds support for OpenGL Shading Language (GLSL) 3.3, which includes built-in functions for getting and setting the bit encoding for floating-point values. There are also new color blending functions and performance enhancements.

The real meat is in OpenGL 4.0, which adds support for GLSL 4.0 and the fragment shader texture functions it allows. Per-sample fragment shaders and programmable fragment shader input positions will allow for increased rendering quality and anti-aliasing flexibility. The shader subroutines have been redesigned for significantly increased programming flexibility.

New tessellation stages and two new corresponding shader types are introduced. Tessellation control and tessellation evaluation shaders operate on "patches" (fixed-sized collections of vertices). Tessellation can increase visual quality significantly by taking a rough object and generating new vertices to smooth out the object and provide more detail without excessive performance penalties. These two new shader stages will enable the offloading of geometry tessellation from the CPU to the GPU.

A new object type called "sampler objects" will allow the separation of texture states and texture data. 64-bit double precision floating point shader operations and inputs/outputs will increase rendering accuracy and quality, while performance improvements will come from instanced geometry shaders, instanced arrays, and a new timer query. The drawing of data generated by OpenGL or external APIs such as OpenCL can be done without any CPU intervention.

The new spec is also supposed to improve interoperability with OpenCL for accelerating computationally intensive visual applications. OpenCL competes with Microsoft's DirectCompute, which debuted with DirectX 11 and also runs on DirectX 10 and 10.1 class hardware.

Support for both the Core and Compatibility profiles, first introduced with OpenGL 3.2, is continued, enabling developers to use a streamlined API or retain backwards compatibility with existing OpenGL code, depending on their market needs.

ATI has been working extensively on OpenGL support and on shaping the standard. The functionality introduced in OpenGL 3.3 is supported by all ATI discrete graphics products released since the spring of 2007, including the consumer Radeon lineup and the workstation FirePro and FireGL cards.

The ATI Radeon HD 5900 and 5800 series are also fully compatible with the OpenGL 4.0 standard, including tessellation and integration with the OpenCL API. This means that full OpenGL 4.0 GPU acceleration will be available when software that is coded for the standard hits the market.

Almost all of the OpenGL 4.0 functionality is also available on ATI Radeon HD 5400, 5500, 5600, and 5700 series graphics cards, with the exception of double precision support. ATI will enable this feature at a later date.

The features are enabled through the ATI Catalyst OpenGL 4.0 preview driver, which can be found here. Full support for OpenGL 4.0 will eventually be folded into the regular monthly Catalyst driver updates.

"The fact that we are able to announce our support for OpenGL 3.3 and OpenGL 4.0 at launch is an incredible feat on the part of our OpenGL software team, and speaks volumes to the commitment and continued support that the entire team brings to the many developers utilizing OpenGL.  In fact, with the launch of these updates, industry pundits have commented that OpenGL is in for a renaissance of sorts.  As a company that believes in and encourages open and industry standards, maintaining OpenGL as a strong and viable graphics API is important to AMD," stated Chris James, Social Media Strategist for the company's Global Communications team in a blog post.




I don't miss the old days
By carniver on 3/26/2010 2:16:55 PM , Rating: 4
I can still remember how messy OpenGL was in terms of vendor extensions; NVIDIA would name their extensions NV_EXT_* to alienate ATI and S3 cards, so that if you wanted to play a certain game you HAD to buy an NVIDIA card. In those days the slogan "The way it's meant to be played" carried a lot of weight. Developers would either give little to no support for non-NVIDIA cards, or they'd have to write separate rendering routines for different brands, and even then there was no guarantee that one brand's performance or quality wouldn't be superior to the other's.

And then the culture changed. NVIDIA and ATI started following DirectX feature sets. If your game asks for DX8 and your card supports it, you're pretty much guaranteed you can play the game just fine. Now, as we wait for Fermi from NVIDIA, everyone's comfortable that it'll be feature-compatible with ATI's 5800 series.

We're now enjoying fair competition between GPU vendors, thanks to the standard MS has set in place. Of course, proprietary is BAD, but so is a lack of standards. If OpenGL is serious about being status-quo again, they better make sure nobody gets to make extensions to it but themselves.




RE: I don't miss the old days
By Zirconium on 3/26/2010 6:49:47 PM , Rating: 2
quote:
If OpenGL is serious about being status-quo again, they better make sure nobody gets to make extensions to it but themselves.

<facepalm>
That's not the way the Khronos Group works, at least as far as OpenGL is concerned. Wikipedia gives a good summary of their methods (http://en.wikipedia.org/wiki/Opengl#Extensions).

Khronos doesn't "push" vendors the way Microsoft does. Microsoft may put a feature into an upcoming version of D3D that no current graphics card supports, which then prompts the vendors to support it because clearly, ATI's Radeon 5450, which supports DirectX 11, is better than Nvidia's GeForce 295, which only supports DirectX 10.1 - it's a whole DirectX 0.9 better! I would posit that the Khronos Group never will either; it is not in ATI's/Nvidia's/Intel's interest to ratify a standard that none of them support.

The way OpenGL advances is that the different vested companies create extensions, and then other companies adopt them. Eventually, the Architecture Review Board ratifies an extension, and in the next release of OpenGL it becomes part of the standard spec.

quote:
can still remember how messy OpenGL was in terms of vendor extensions; NVIDIA would name their extensions NV_EXT_* to alienate ATI and S3 cards, so that if you want to play a certain game you HAVE to buy an NVIDIA card.

No, they didn't do it to "alienate ATI and S3." They did it because that's how the spec says new functions are introduced. It's a poor man's way of doing namespaces. Initially an extension is prefixed or suffixed with NV; when other companies start adopting it, that suffix can change to EXT; when the Architecture Review Board gets to it, it becomes ARB; and when it is included in the GL core spec, the suffix is dropped altogether.


RE: I don't miss the old days
By Scali on 3/29/2010 4:09:08 AM , Rating: 2
quote:
If OpenGL is serious about being status-quo again, they better make sure nobody gets to make extensions to it but themselves.


Well, the existence of the extensions isn't the problem; it's developers relying on them.
Back in those days, developers had to rely on various extensions, because they enabled fundamental functionality for games, such as shaders, render-to-texture, and vertex buffers.

These days there are standardized extensions for all this functionality, so you don't need to rely on vendor-specific extensions anymore.
OpenGL 4.0 will pull the OpenGL standard up to roughly DirectX 11 level... which means that OpenGL will support pretty much every hardware feature, and there's not a lot of room for custom extensions at this point anyway.
This is a result of the culture change you described. nVidia and ATi mainly follow DirectX feature sets these days.

The main battleground for new functionality has shifted from graphics to GPGPU. But that's OpenCL's territory.


RE: I don't miss the old days
By gamerk2 on 3/29/2010 8:50:37 AM , Rating: 2
quote:
If OpenGL is serious about being status-quo again, they better make sure nobody gets to make extensions to it but themselves.


Funny; especially when you consider that in the DX spec, Pixel Shader 1.1 was basically developed by NVIDIA, then backported into the DX API. [PS 1.4 was ATI's implementation, then we got unified Shader Model 2.0, which took the best of both implementations.]

I could also point to ATI's and NVIDIA's different implementations of AA. Proprietary extensions are common, as it's how you grow the API.


RE: I don't miss the old days
By Scali on 3/29/2010 9:41:36 AM , Rating: 2
That's not entirely true.
nVidia may have been the first with SM1.1, but it was a standard feature of the DirectX API, and this API enforces backwards compatibility.
This means that all hardware supporting anything above SM1.1 also needs to be compatible with SM1.1.
The result is that when ATi came out with SM1.4 hardware, it ran all previous SM1.1 software without a problem.

This is a fundamental difference with vendor-specific extensions.
SM1.1 wasn't nVidia-specific. nVidia was just the first to have hardware on the market.
Likewise, SM1.4 wasn't ATi-specific, ATi was just the first.
All current hardware, including Intel's and S3's as well, is capable of running this SM1.x code.


"There's no chance that the iPhone is going to get any significant market share. No chance." -- Microsoft CEO Steve Ballmer














Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki