
20.4 Portably Using OpenGL Extensions

The advantage of using OpenGL extensions is getting access to cutting edge rendering functionality so that your application can achieve higher performance and higher quality rendering. OpenGL extensions give you access to the latest features of the hottest new graphics hardware. The problem with OpenGL extensions is that many OpenGL implementations, particularly older implementations, will not support the extensions that you would like to use. When you write an OpenGL application that uses extensions, you should make sure that your application still works when the extension is not supported. At the very least, your program should report that it requires the missing extension and exit without crashing.

The first step to using OpenGL extensions is to locate the copy of the <GL/gl.h> header file that advertises the API interfaces for the extensions that you plan to use. Typically you can get this from your OpenGL implementation vendor or OpenGL driver vendor. You could also get the API interface prototypes and macros directly from the extension specifications, but getting the right <GL/gl.h> from your OpenGL vendor is definitely the preferred way.

You will notice that <GL/gl.h> sets C preprocessor macros to indicate whether the header advertises the interface of a particular extension or not. For example, the basic <GL/gl.h> supplied with Microsoft Visual C++ 4.2 has a section reading:

/* Extensions */
#define GL_EXT_vertex_array               1
#define GL_WIN_swap_hint                  1
#define GL_EXT_bgra                       1
#define GL_EXT_paletted_texture           1
#define GL_EXT_clip_disable               1
These macros indicate that the header file advertises the above five extensions. The EXT_bgra extension lets you read and draw pixels in the Blue, Green, Red, Alpha component order as opposed to OpenGL's standard RGBA color component ordering. If you wanted to write a program to use the EXT_bgra extension, you could test that the extension is supported at compile time like this:
#ifdef GL_EXT_bgra
   glDrawPixels(width, height, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
#endif

When GL_EXT_bgra is defined, you can expect to find the GL_BGRA_EXT enumerant defined. Note that if the EXT_bgra extension is not supported, expect the glDrawPixels() line above to generate a compiler error because the standard unextended OpenGL header does not define the GL_BGRA_EXT enumerant.

So based on the extension name #define in <GL/gl.h>, you can write your code so that it compiles in the extension functionality if your development environment supports the extension's interfaces. The next problem is that even though your development environment may support the extension's interface at compile time, the target system where your application runs may not support the extension at run time. In UNIX environments, different systems with different graphics hardware often support different sets of extensions. Likewise, in the Win32 environment, different OpenGL accelerated graphics boards will support different OpenGL extensions because they have different OpenGL drivers. The point is that you cannot just assume a given extension is supported. You must make a run-time check to verify that the extension you wish to use is supported.

Assuming that your application thread is made current to an OpenGL rendering context, the following routine can be used to determine at run-time if the OpenGL implementation really supports a particular extension:

#include <GL/gl.h>
#include <string.h>

int
isExtensionSupported(const char *extension)
{
  const GLubyte *extensions = NULL;
  const GLubyte *start;
  GLubyte *where, *terminator;

  /* Extension names should not have spaces. */
  where = (GLubyte *) strchr(extension, ' ');
  if (where || *extension == '\0')
    return 0;

  extensions = glGetString(GL_EXTENSIONS);

  /* It takes a bit of care to be fool-proof about parsing the
     OpenGL extensions string.  Don't be fooled by sub-strings,
     etc. */
  start = extensions;
  for (;;) {
    where = (GLubyte *) strstr((const char *) start, extension);
    if (!where)
      break;
    terminator = where + strlen(extension);
    if (where == start || *(where - 1) == ' ')
      if (*terminator == ' ' || *terminator == '\0')
        return 1;
    start = terminator;
  }
  return 0;
}

With the isExtensionSupported routine, you can check if the current OpenGL rendering context supports a given OpenGL extension. To make sure that the EXT_bgra extension is supported before using it, you can do the following:

  /* At context initialization. */
  int hasBGRA = isExtensionSupported("GL_EXT_bgra");

  /* When trying to use EXT_bgra extension. */
#ifdef GL_EXT_bgra
  if (hasBGRA) {
    glDrawPixels(width, height, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
  } else
#endif
  {
    /* No EXT_bgra so bail (or implement software workaround). */
    fprintf(stderr, "Needs EXT_bgra extension!\n");
    exit(1);
  }

Notice that if the EXT_bgra extension is lacking at either run time or compile time, the code above will detect the lack of EXT_bgra support. The code is a bit messy, but it works. You can skip the compile-time check if you know what development environment you are using and you do not expect to ever compile with a <GL/gl.h> that does not support the extensions that your application uses. But the run-time check really should be performed, since you cannot know in advance on what system your program will end up running.
