"Five exclamation marks, the sure sign of an insane mind."
- Terry Pratchett
OpenGL 3
Friday, April 10, 2009

In my framework work I've started adding OpenGL 3.x support. I was one of the angry voices about what happened with OpenGL 3.0 when it was released. It just didn't live up to what had been promised, in addition to being a whole year too late. While OpenGL 3.0 didn't clean up any of the legacy crap, OpenGL 3.1 finally does. A bunch of old garbage that was merely labelled deprecated in 3.0 has now actually been removed in OpenGL 3.1. As it looks now, OpenGL is on the right track; the question is just whether it can make up for lost time and keep up with Direct3D going forward.

However, along with the OpenGL 3.1 specification they also came up with the horrible GL_ARB_compatibility extension. I don't know what substance they were under the influence of while coming up with this brilliant idea. Basically this extension brings back all the legacy crap that we finally got rid of. This extension really should not exist. Please IHVs, I urge you NOT to implement it. Please don't. Or at the very least don't ship drivers with it enabled by default. If you're creating a 3.1 context, you're making a conscious decision not to use any of the legacy crap. Really, before venturing into OpenGL 3.1 land, you should've made the full transition to OpenGL 3.0. Fine, if a developer for some reason still needs the legacy garbage during a transitional period while porting the code to 3.1, he can enable it via a registry key or an IHV-provided tool. But if you ship drivers with it enabled by default, you know what's going to happen. An important game or application will ship requiring support for this extension, all the work done on cleaning up the OpenGL API will be wasted, the extension would have to be supported forever, and we'd be back at square one.
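If you want to check whether the 3.1 context you were handed carries this baggage anyway, you can walk the extension list and look for it. A rough sketch, assuming the GL 3.x tokens and the glGetStringi entry point are available (for instance through an extension loader, since glGetString(GL_EXTENSIONS) is itself gone in 3.1):

#include <string.h>

int hasCompatibilityExtension(void)
{
    GLint numExtensions = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);

    // In 3.1 the extension list is enumerated entry by entry
    for (GLint i = 0; i < numExtensions; i++)
    {
        const char *ext = (const char *) glGetStringi(GL_EXTENSIONS, i);
        if (ext && strcmp(ext, "GL_ARB_compatibility") == 0)
            return 1; // The legacy baggage is still exposed
    }
    return 0;
}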

In my framework I'm going to use the WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB flag. This means that all the legacy crap will be disabled. For instance, calling glBegin()/glEnd() will be a no-op. This will ensure that I'm sticking to modern rendering techniques. All new code should be using this flag. The only reason not to use it is if you're porting over some legacy code. The deprecation model that OpenGL 3.0 added is fundamentally good. It allows the API to be cleaned up while giving developers the ability to upgrade their code. If done right, the whole OpenGL ecosystem would modernize by necessity. This of course relies on Khronos not pooping all over the efforts with stuff like GL_ARB_compatibility.
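For reference, this is roughly what creating such a context looks like through WGL_ARB_create_context. A minimal sketch, assuming a DC with a pixel format already set and a temporary legacy context made current so that wglGetProcAddress can return the entry point:

#include <windows.h>
#include <GL/gl.h>

// Tokens from the WGL_ARB_create_context spec
#define WGL_CONTEXT_MAJOR_VERSION_ARB           0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB           0x2092
#define WGL_CONTEXT_FLAGS_ARB                   0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB  0x0001

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

HGLRC createForwardCompatibleContext(HDC hdc)
{
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC) wglGetProcAddress("wglCreateContextAttribsARB");
    if (wglCreateContextAttribsARB == NULL)
        return NULL; // Driver doesn't expose WGL_ARB_create_context

    const int attribs[] =
    {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 1,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0 // End of attribute list
    };

    // Returns NULL if the driver can't create a forward-compatible 3.1 context
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}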

Aras Pranckevičius
Friday, April 10, 2009

Fully agree.

The biggest problem with OpenGL for production use is driver quality (and no, users never update their drivers, at least in casual/small game space). Having GL3 with all the legacy bits does nothing to improve the situation.

What I think should have been done: make GL3 a whole new set of headers/DLL/dylib/framework, and only have "the new way" APIs in there. Then provide a single "GL2 emulation on top of GL3" library (that is mostly hardware independent), which could just be statically linked into applications and whatnot. Make drivers only contain the GL3 bits; the old APIs should be emulated on top of actual GL3.

Of course, most of my views are probably very naïve, and I'm underestimating the impact of emulating old GL on top of new GL (performance, backwards compatibility, ...). But the approach taken by GL 3.1 does not solve one of the biggest problems with OpenGL either (stability).

Jan
Friday, April 10, 2009

I am in the process of porting to GL3. For now I still use GL2 but remove everything deprecated. The "forward compatible" flag is a good idea, but I wonder whether it is implemented at all. Just like with the ARB_compatibility extension, I fear that IHVs will simply give us the old crap with a new label. So, do you know whether this flag actually has any influence? Does it raise errors and ignore calls to deprecated functions?

Jan

Humus
Friday, April 10, 2009

From the WGL_ARB_create_context spec:

"If the WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB is set in WGL_CONTEXT_FLAGS_ARB, then a <forward-compatible> context will be created. Forward-compatible contexts are defined only for OpenGL versions 3.0 and later. They must not support functionality marked as <deprecated> by that version of the API, while a non-forward-compatible context must support all functionality in that version, deprecated or not."

So yes, providing this flag disables all deprecated stuff. I have also found that both the AMD and Nvidia implementations do it right, so deprecated function calls are in fact ignored. I haven't checked whether any errors are generated, but at least no rendering comes out of it.
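If anyone wants to check the error side of it, a probe along these lines should tell you whether an error is raised or the call is just silently dropped. A small sketch, assuming a forward-compatible 3.x context is current:

#include <stdio.h>
#include <GL/gl.h>

void probeDeprecatedCalls(void)
{
    glBegin(GL_TRIANGLES);            // Deprecated; should do nothing here
    glVertex3f(0.0f, 0.0f, 0.0f);
    glEnd();

    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("Deprecated call raised error 0x%04X\n", err);
    else
        printf("Deprecated call was silently ignored\n");
}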

Rohit
Friday, April 10, 2009

Actually, since all the crap is an extension now, won't all code have to put the ARB suffix on all of its gl calls? And BTW, whatever the merits of their decision, they are doing what they said they would do. Now that the deprecated stuff has been moved out to an extension and there is no requirement to implement it, I'd expect these things to be dropped from consumer cards over time, at least in the future. IHVs will have an incentive to do this: they'll make sure that only the workstation cards have the drivers with the legacy crap, so that they can screw those who don't update their code. Some poetic justice, eh!

BTW, in your framework, you'll be enabling the forward-compatible flag by default, right?

Rohit
Friday, April 10, 2009

I meant enabling that bit in the Linux port of your code?

n00body
Saturday, April 11, 2009

Slow though it may be, progress is progress. Since they've shown they will follow through on their deprecation plan, I'm hopeful, if still a bit skeptical about the future.

Question:
I thought the whole point of framework 4 was to exploit all the new features of DX11. So won't supporting OGL 3.1 severely inhibit that goal?

Mave
Saturday, April 11, 2009

I totally agree with you. There is so much crap in the OGL 1.x spec that makes writing an OpenGL driver much more complicated than it should be.

Of course, for companies such as nVidia with a huge amount of resources, they do not care about the fact that writing an optimized OpenGL driver is incredibly difficult. In fact, it's an advantage for them. They've spent a huge amount of time optimizing all this legacy stuff, so why would they want to lose their edge over competitors?

The problem is that the Khronos group is more or less controlled by nVidia, because they are pretty much the only ones contributing new stuff. nVidia deserves a lot of credit for contributing so much, but it also has a downside: their priority is making money. Their goal is not to write the most elegant specification to the detriment of their profit.

Also, will someone one day contribute some validation tests for hardware vendors? That's another big problem that I hope will be resolved one day.

Overlord
Sunday, April 12, 2009

@n00body:
Why would it? Technically speaking, DX11 is not that different from DX10.1, at least not in a dramatic way; I look at it more as an update to DX itself than to the hardware.
OpenGL development may be slower, but it's also faster in some respects, since once a card supports something it is usually exposed directly as an extension. Besides, Nvidia doesn't even have 10.1 hardware out yet, so just take a breather.
Once the hardware is out there, OpenGL will support it in some way.

So supporting 3.1 doesn't mean that one is limiting oneself; it may just be the other way around.
