64-bit OpenGL for 64-bit Windows XP (AMD)?

Will a 64-bit version of OpenGL be available for the 64-bit version of XP for the AMD chipset? If so, will it be OpenGL 2.0?

Will the necessary include files, libs, dlls, etc. be provided for developers by Microsoft or from another source?

Will a software-only version be available, as with OpenGL 1.1? This support is vital for diagnosing driver problems.

Originally posted by A Developer:
[b]Will a 64-bit version of OpenGL be available for the 64-bit version of XP for the AMD chipset? If so, will it be OpenGL 2.0?

Will the necessary include files, libs, dlls, etc. be provided for developers by Microsoft or from another source?

Will a software-only version be available, as with OpenGL 1.1? This support is vital for diagnosing driver problems.[/b]
There’s no change in the ICD model for Win64; both software rendering and hardware ICDs are supported. The headers and .lib files should come with the Windows SDK or with your compiler of choice; the opengl32 .dlls, as always, come with the OS, and ICDs should come from your graphics card vendor.

In Win64, OpenGL can work under the wow64 compatibility layer (for 32-bit applications) or natively (for 64-bit apps).
In any case, for hw-accelerated OpenGL you are going to need a 64-bit display driver from your graphics card manufacturer (all kernel drivers are 64-bit in Win64) plus both a 32-bit and a 64-bit ICD.
Software rendering for 32-bit and 64-bit apps should work straight out of the box.

OpenGL on Windows is currently a 32-bit graphics engine. What I need is a 64-bit graphics engine that allows the internals of OpenGL to maintain full double precision. Even though you can push coordinates down to OpenGL as doubles, internally they get converted to single precision, so instead of up to 15 significant digits the numbers are truncated to about 7.

Will there be an “OpenGL64.dll” that allows the internal computations done by OpenGL to stay in full double precision?

It appears that full 64 bit versions of OpenGL are available on 64-bit Linux…

Originally posted by A Developer:
[b]OpenGL on Windows is currently a 32-bit graphics engine. What I need is a 64-bit graphics engine that allows the internals of OpenGL to maintain full double precision. Even though you can push coordinates down to OpenGL as doubles, internally they get converted to single precision, so instead of up to 15 significant digits the numbers are truncated to about 7.

Will there be an “OpenGL64.dll” that allows the internal computations done by OpenGL to stay in full double precision?

It appears that full 64 bit versions of OpenGL are available on 64-bit Linux…[/b]
I would like to see those “full 64 bit versions of OpenGL”: most graphics cards don’t support 64-bit calculations, so even if the matrix multiplies inside the ICD were done in doubles, hw T&L would still operate in singles and the geometry would have to be converted into singles before being consumed by the graphics card.

You will have to resort to a software renderer; maybe Mesa’s offscreen renderer also supports double-precision calculations :?

BTW, whether the calculation is done in doubles or in singles has nothing to do with the ICD being a 64-bit or 32-bit binary (in fact, on x87, operations are carried out internally in 80-bit, but the results are rounded/truncated to the desired precision when spilled to memory).

It’s really interesting that most cards can’t handle 64-bit calculations. That would imply that 64-bit DirectX isn’t fully 64-bit either, once the work reaches hardware acceleration on the card.

It is a pity though, 15 digit coordinate precision would be helpful.

Of course not. Software being “64 bit” just refers to its operating model on the CPU, i.e. pointers and integer registers are (or at the very least are allowed to be) 64 bits wide.

64-bit floating-point calculations have been available since the early days of math coprocessors (the 287!?), and no one ever talked about “64 bit software” because of that.

It also refers to the instruction set of the CPU.
It has nothing to do with the FPU. A CPU need not have an FPU integrated at all, and there’s not much reason to support doubles in GPUs; it would be a waste of silicon.

Quoting an earlier reply:
[b]There’s no change in the ICD model for Win64; both software rendering and hardware ICDs are supported. The headers and .lib files should come with the Windows SDK or with your compiler of choice; the opengl32 .dlls, as always, come with the OS, and ICDs should come from your graphics card vendor.

In Win64, OpenGL can work under the wow64 compatibility layer (for 32-bit applications) or natively (for 64-bit apps).
In any case, for hw-accelerated OpenGL you are going to need a 64-bit display driver from your graphics card manufacturer (all kernel drivers are 64-bit in Win64) plus both a 32-bit and a 64-bit ICD.
Software rendering for 32-bit and 64-bit apps should work straight out of the box.[/b]

What is an ICD?
What is hw accelerated?

ICD = installable client driver. The OpenGL implementation proper, which you get from your graphics chip vendor.

hardware accelerated = faster than a software emulation of OpenGL :wink:
OpenGL is implemented on top of a hardware device and uses its capabilities to accelerate rendering.
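One practical way to tell the two apart at runtime is to inspect the strings the implementation reports: Microsoft’s unaccelerated software fallback identifies itself with the renderer string “GDI Generic”. The string check is sketched below in plain C; in a real program the string would come from glGetString(GL_RENDERER) with a current GL context.

```c
#include <string.h>

/* Returns nonzero if the GL_RENDERER string looks like Microsoft's
   unaccelerated software fallback. The caller is expected to pass the
   result of glGetString(GL_RENDERER), obtained with a current context. */
int is_software_renderer(const char *renderer) {
    return renderer != NULL && strstr(renderer, "GDI Generic") != NULL;
}
```

If this returns nonzero, the app fell back to software rendering, usually because no ICD was found for the process’s bitness.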

Hi,

Can anybody explain this?

I have been reading that an x64 graphics driver package should contain 2 ICDs.

I have an NVIDIA graphics adapter and an ATI graphics adapter.

When I use the ATI graphics adapter on XP64 Build 1184 I have 2 ICD drivers (version 8.01):
1. \Windows\System32\atio6axx.dll (64-bit GL)
2. \Windows\SysWow64\atioglxx.dll (32-bit GL)

When I use the NVIDIA graphics adapter on XP64 Build 1184 I have only 1 ICD driver (version 57.30):
1. \Windows\System32\nvoglnt.dll (64-bit GL)

But when I run e.g. Quake3 (32-bit GL) the game works. How does the NVIDIA driver do that when there is no 32-bit GL driver in \Windows\SysWow64?

Also, when I extract the NVIDIA 57.30 package I only see 1 GL driver file, while I do see 2 GL driver files when extracting the ATI 8.01 package.
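The two directory locations above reflect WoW64’s file-system redirection: when a 32-bit process opens \Windows\System32 it is transparently redirected to \Windows\SysWow64, so each process finds DLLs matching its own bitness under the same nominal path. A toy model of that mapping (illustration only, not a real Windows API):

```c
#include <stddef.h>
#include <string.h>

/* Toy model of WoW64 file-system redirection: which system directory a
   process actually reaches depends on the process's own pointer width,
   not on the OS being 64-bit. Illustration only, not a Windows API. */
const char *effective_system_dir(size_t pointer_bytes) {
    return (pointer_bytes == 8) ? "\\Windows\\System32"   /* native 64-bit process */
                                : "\\Windows\\SysWow64";  /* 32-bit process under WoW64 */
}
```

In a real program the 32-bit/64-bit choice is made by the OS loader, not by the application; calling effective_system_dir(sizeof(void *)) merely mimics that decision.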

WoW stands for “windows on windows” and is a more or less universal translation layer, so that 32 bit applications can use 64 bit DLLs.

The NVIDIA GL driver looks like it’s purely 64 bit. If a 32 bit application attempts to load this driver (in this case by creating a GL context), communication between the application and the driver will be funneled through a translation layer. This is automatic.

The ATI GL driver appears to have native support for 32 bit apps and 64 bit apps (hence the two DLLs). No translation is necessary between applications and ICDs (the 32 bit ICDs may still need to go through a translation to access the actual hardware layer).

In theory, ATI’s model should be a bit more efficient for 32 bit applications, because there’s less overhead per function call. Maybe you can do a few tests, since you have both? I’d find this very interesting :slight_smile:

Originally posted by zeckensack:
[b]WoW stands for “windows on windows” and is a more or less universal translation layer, so that 32 bit applications can use 64 bit DLLs.[/b]

Just a note on that: WoW only works for “selected” system DLLs, not for every DLL. WoW doesn’t know how to do the marshalling/parameter realignment for arbitrary DLLs, only for those it has been designed to support.


Originally posted by zeckensack:
[b]The NVIDIA GL driver looks like it’s purely 64 bit. If a 32 bit application attempts to load this driver (in this case by creating a GL context), communication between the application and the driver will be funneled through a translation layer. This is automatic.[/b]

Nope, that cannot be. A 32-bit OpenGL program will use the 32-bit opengl32.dll, which will only load 32-bit ICDs, using the 32-bit registry view (the ICD loading mechanism fetches some registry values to get the ICD name and other tidbits).

Also, 32-bit apps cannot use 64-bit DLLs (except those supported by WoW, and opengl32.dll is not one of them). This also means that 32-bit apps cannot use 64-bit in-process COM objects, for example.

My guess is that the reason Quake3 works is that it’s using Microsoft’s DX layer, which, strange as it sounds, would mean that NVIDIA doesn’t accelerate 32-bit apps in Win64 :?
This is as easy to test as looking at which renderer Quake3 is using when run on an NVIDIA card on Win64.

Kernel-space drivers are always 64-bit in Win64 (note that an ICD is not a kernel-space driver).

Thanks, evanGLizr. Always nice to learn something :slight_smile:

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.