View Full Version : Porting Linux App to Windows

Jeff is Lost
01-19-2009, 04:05 PM
I am new to OpenGL, but I have been given the task of making a Linux-based gtkglext project run on Windows. I got it converted and running with the GDK runtime, and it renders the screen fine. However, I am getting about 2 fps on Windows and better than 15 fps on Linux. I suspect it is not using hardware rendering, but I am not sure. Is there a way to verify whether a project is using hardware rendering?

Simon Mihevc
01-20-2009, 12:05 AM
Check glGetString with GL_VENDOR, GL_RENDERER and GL_VERSION to get some information about your implementation.


Jeff is Lost
01-20-2009, 05:44 PM
I put those commands in the code but it doesn't seem to indicate one way or another about the state of hardware rendering.

Simon Mihevc
01-21-2009, 12:27 AM
Those commands return strings that describe your renderer. You'd need to check their output to see what they do.

To get the string in C++ you'd do something like this:

std::string renderer(
    reinterpret_cast<const char*>(glGetString(GL_RENDERER)));

...and similarly for the others. If you have software rendering, the renderer string will probably be "GDI Generic". Otherwise, if the vendor string contains NVIDIA or ATI, you have acceleration.

Jeff is Lost
01-21-2009, 12:23 PM
I am getting GDI_GENERIC so software rendering it is. Now to figure out how to get the Hardware Rendering to work.

Thank you very much for the help.

01-21-2009, 12:48 PM
For that you have to know your OS version and your video card model, then download and install the appropriate graphics drivers (not the ones suggested by Windows autodetection).
That's all.

It can be more tricky on laptops.

Jeff is Lost
01-21-2009, 12:57 PM
OK so I found some code that pointed me a bit further:

int pf = ChoosePixelFormatEx(hdc, &bpp, &depth, &dbl, &acc);
ZeroMemory(&pfd, sizeof(pfd));
pfd.nSize = sizeof(pfd);
pfd.nVersion = 1;
// Try to set the pixel format.
if (!SetPixelFormat(hdc, pf, &pfd))
{
    TRACE0("Can't set the pixel format for OpenGL!\n");
    return FALSE;
}

// See if we are indeed accelerated.
DescribePixelFormat(hdc, pf, pfd.nSize, &pfd);
// Driver type: generic + accelerated = MCD, generic alone = software,
// neither generic flag = full vendor ICD.
if (0 != (pfd.dwFlags & PFD_GENERIC_ACCELERATED))
    TRACE1(" PD: %d ************** MCD ACCELERATED **************\n", pf);
else if (0 != (pfd.dwFlags & PFD_GENERIC_FORMAT))
    TRACE1(" PD: %d ************** Generic Software **************\n", pf);
else
    TRACE1(" PD: %d ************** ICD ACCELERATED **************\n", pf);

My results were: ICD ACCELERATED

Jeff is Lost
01-21-2009, 01:07 PM
Must be something about how it is initialized, I am guessing. OpenGL Extension Viewer is showing my ATI card.

01-21-2009, 02:07 PM
What parameters do you use for ChoosePixelFormatEx?
This advice is quite old ("many cards cannot accelerate when the desktop is at 32bpp in high resolution") but it may help:

Jeff is Lost
01-21-2009, 02:43 PM
That is where I started. I used the same parameters that are specified on that page.

Simon Mihevc
01-22-2009, 12:32 AM
Check GLenum (http://download.microsoft.com/download/platformsdk/other/1.0/win98me/en-us/glen.exe) to see what parameters need to be set on the PIXELFORMATDESCRIPTOR structure to get an accelerated pixel format.

Jeff is Lost
01-22-2009, 10:31 AM
That link appears to be broken. Anyone have another link to it?

01-22-2009, 10:57 AM
The program linked above just WorksForMe (tm).

Jeff is Lost
01-22-2009, 11:16 AM
I get "page cannot be found".

Jeff is Lost
01-22-2009, 11:54 AM
nice... I switched to my Linux Box and the link worked.

01-22-2009, 12:07 PM
You might want to check your <Windows folder>\system32\drivers\etc\hosts ... maybe microsoft.com was redirected to something else ...

Jeff is Lost
01-22-2009, 02:01 PM
So the pixel format that ChoosePixelFormatEx picks shows up as ICD accelerated. There are approximately 26 accelerated formats. Should I just try all of them, filling the structure for SetPixelFormat?

01-22-2009, 03:01 PM
Yes. You can try to initialise the structure before the Choose/SetPixelFormat calls. Example:

PIXELFORMATDESCRIPTOR pfd =         // pfd tells Windows how we want things to be
{
    sizeof(PIXELFORMATDESCRIPTOR),  // Size of this pixel format descriptor
    1,                              // Version number
    PFD_SWAP_EXCHANGE |             // Prefer buffer swap by exchange
    PFD_DRAW_TO_WINDOW |            // Format must support window
    PFD_SUPPORT_OPENGL |            // Format must support OpenGL
    PFD_DOUBLEBUFFER,               // Must support double buffering
    PFD_TYPE_RGBA,                  // Request an RGBA format
    32,                             // Select our color depth
    0, 0, 0, 0, 0, 0,               // Color bits ignored
    8,                              // 8-bit alpha buffer
    24,                             // Alpha shift bit ignored
    0,                              // No accumulation buffer
    0, 0, 0, 0,                     // Accumulation bits ignored
    24,                             // 24-bit Z-buffer (depth buffer)
    0,                              // No stencil buffer
    0,                              // No auxiliary buffer
    PFD_MAIN_PLANE,                 // Main drawing layer
    0,                              // Reserved
    0, 0, 0                         // Layer masks ignored
};

Jeff is Lost
01-30-2009, 10:53 AM
Well, I generally gave up on trying to get hardware rendering with the Win32 version of the GTK OpenGL interface. I converted the implementation over to GLUT, and I am now getting hardware rendering. The frame rate problem turned out to be related to the Unix gettimeofday function, which has no direct Win32 equivalent. Once I wrote a Win32 function that did the same thing, my frame rates jumped up to something similar to the Linux frame rates.

Thanks for all the help.