Rendering in a xf86dga framebuffer?

How can I do this?
I will try the following when I come back home this evening:

1/ set up the desired video mode
2/ call glXCreateGLXPixmap() with pixmap = the XDGADevice’s pixmap
3/ render to that pixmap (rough sketch below)
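For concreteness, here is an untested transcription of those steps into C. The wrapper name and `mode_num` parameter are placeholders of mine; a real program would pick a mode returned by XDGAQueryModes(), and the mode must advertise the XDGAPixmap flag for `dev->pixmap` to be valid:

```c
/* Untested sketch of the three steps above; assumes DGA2 (XFree86 4.x). */
#include <X11/Xlib.h>
#include <X11/extensions/xf86dga.h>   /* Xxf86dga.h on newer systems */
#include <GL/glx.h>

GLXPixmap dga_glx_pixmap(Display *dpy, int screen, int mode_num)
{
    XDGADevice *dev;
    XVisualInfo *vis;
    int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, 16, None };

    if (!XDGAOpenFramebuffer(dpy, screen))
        return None;
    dev = XDGASetMode(dpy, screen, mode_num);   /* 1/ switch video mode */
    if (!dev)
        return None;

    vis = glXChooseVisual(dpy, screen, attribs);
    if (!vis)
        return None;
    /* 2/ wrap the DGA pixmap in a GLX pixmap; 3/ render to it */
    return glXCreateGLXPixmap(dpy, vis, dev->pixmap);
}
```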

Is this the right method?

Julien.

Originally posted by deepmind:
How can I do this?

Do what? GL rendering? Render in a regular window. Full screen? Make that window fit the whole screen (and set its override_redirect bit to bypass the WM). The latest NVIDIA drivers (2313) will even optimize this case (actually, I guess many drivers will).
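A minimal sketch of that suggestion: an override_redirect window sized to the whole screen, created from a GLX visual. The helper name is made up for illustration, and `vis` would come from glXChooseVisual():

```c
/* Sketch of the "regular window, full screen" approach. */
#include <X11/Xlib.h>
#include <GL/glx.h>

Window fullscreen_gl_window(Display *dpy, XVisualInfo *vis)
{
    int screen = vis->screen;
    Window root = RootWindow(dpy, screen);
    Window win;
    XSetWindowAttributes attr;

    attr.override_redirect = True;   /* bypass the window manager */
    attr.colormap = XCreateColormap(dpy, root, vis->visual, AllocNone);
    attr.border_pixel = 0;

    win = XCreateWindow(dpy, root, 0, 0,
                        DisplayWidth(dpy, screen),
                        DisplayHeight(dpy, screen),
                        0, vis->depth, InputOutput, vis->visual,
                        CWOverrideRedirect | CWColormap | CWBorderPixel,
                        &attr);
    XMapRaised(dpy, win);
    return win;
}
```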

DGA has no role in GL rendering; it’s a way to access the framebuffer directly, and it won’t cooperate with a GL driver…

Offscreen rendering is not widely available. Mesa with its software backend can of course do it, and NVIDIA has recently added support through the pbuffer extension.
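For reference, a rough sketch of pbuffer creation via the core GLX 1.3 entry points (NVIDIA’s drivers of that era exposed the equivalent GLX_SGIX_pbuffer extension); the helper name and attribute choices are illustrative:

```c
/* Rough sketch of offscreen rendering with a GLX 1.3 pbuffer. */
#include <GL/glx.h>

GLXPbuffer make_pbuffer(Display *dpy, int screen, int width, int height)
{
    int fb_attribs[] = {
        GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_ALPHA_SIZE,    8,
        None
    };
    int pb_attribs[] = {
        GLX_PBUFFER_WIDTH,  width,
        GLX_PBUFFER_HEIGHT, height,
        None
    };
    int n;
    GLXFBConfig cfg, *cfgs = glXChooseFBConfig(dpy, screen, fb_attribs, &n);

    if (!cfgs || n == 0)
        return None;
    cfg = cfgs[0];
    XFree(cfgs);
    return glXCreatePbuffer(dpy, cfg, pb_attribs);
}
```

You would then create a context with glXCreateNewContext() and bind the pbuffer with glXMakeContextCurrent().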

Originally posted by zerodeux:
DGA has no role in GL rendering; it’s a way to access the framebuffer directly, and it won’t cooperate with a GL driver…

DGA is the only way to get a proper display depth without asking the user “Please edit your /etc/X11/XF86Config-4 and set the display depth to 24 bits per pixel”…

Using X11 really sucks when you need an alpha buffer and the desktop has a 16bpp depth :stuck_out_tongue:
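To illustrate the complaint, here is a small hypothetical probe: asking GLX for a visual with destination alpha, which typically fails on a 16bpp desktop.

```c
/* Hypothetical probe for a visual with an alpha buffer. */
#include <stdio.h>
#include <GL/glx.h>

int have_alpha_visual(Display *dpy, int screen)
{
    int attribs[] = { GLX_RGBA, GLX_ALPHA_SIZE, 8,
                      GLX_DOUBLEBUFFER, None };
    XVisualInfo *vis = glXChooseVisual(dpy, screen, attribs);

    if (!vis) {
        fprintf(stderr, "no visual with an 8-bit alpha buffer "
                        "(16bpp desktop?)\n");
        return 0;
    }
    XFree(vis);
    return 1;
}
```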

Julien.

Well, the thing is that DGA is not a simple add-on to switch bit depth, ‘et voilà’… X11 lets the programmer make the very comfortable and safe assumption that visuals are constant; it’s not designed to support bit-depth changes on the fly, and there’s nothing you can really do about that. If you use DGA2 to switch bit depth, you’ll just get framebuffer access; I wouldn’t bet that the GL driver catches and supports such a change. Moreover, when DGA is activated, most X calls are ignored, yet access to a plain X window is necessary to do GL rendering.

However, you should consider that the end user is responsible enough to choose his own resolution. 16bpp is disappearing; it’s practically considered a non-option for 3D gaming today (32bpp is the only way!). If you can support 16bpp and feel like it, do it. If you need 32bpp and it’s not available, just tell the user. If he’s running 16bpp and has a good reason to (a G200…), he’ll forget about your app. If not, he’ll realize it’s high time to switch to 32bpp.

You can also change bit depth without editing the XF86Config file: simply use server options (’-bpp 32’ in /etc/X11/xinit/xserverrc; it depends on your distro). I also used to run two X servers on the same machine when I had a TNT, one in 16bpp and one in 32bpp; I could switch between them with Ctrl+Alt+F7/F8 at any time, which was pretty efficient. Plus it makes two desktops (multiply by 6 virtual desktops per desktop).

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.