Thread: OpenGL and how the driver works?

  1. #11 - Alfonse Reinheart (Senior Member, OpenGL Guru; Join Date: May 2009; Posts: 4,948)
    Strictly speaking, this isn't specific to GLX. The same issues would apply to using a graphics card in a system whose CPU has a different byte order to the GPU.
    Actually no. The OpenGL standard requires that, if the client writes a string of bytes as a "GLuint", then the server must interpret those bytes as a proper "GLuint". So whatever bit fiddling that the server needs to do must be built into whatever processes the server uses to read that memory.

    FWIW, I have trouble understanding why there seems so little interest in exploiting one of the features which really sets OpenGL apart from DirectX.
    Because:

    1: It requires having more than one computer.

    2: Doing so requires being Linux-only.

    3: It relies on the asymmetric computing situation, where your local terminal is weak and a central server has all the processing power. This situation becomes less valid every day. Between GLES 3.0-capable smart phones and Intel's 4.1-class integrated GPUs, the chance of not being able to execute OpenGL code locally is very low.

    It's very difficult to exploit this feature unless it's explicitly part of your application's design requirements. It may differentiate OpenGL from Direct3D, but it's such a niche thing that very few people ever have a bona fide need for it. It's nice for when you need to do it, but you can't say that it's a pressing need for most OpenGL users.

  2. #12 - GClements (Member, Regular Contributor; Join Date: Jun 2013; Posts: 491)
    Quote Originally Posted by Alfonse Reinheart View Post
    Actually no. The OpenGL standard requires that, if the client writes a string of bytes as a "GLuint", then the server must interpret those bytes as a proper "GLuint". So whatever bit fiddling that the server needs to do must be built into whatever processes the server uses to read that memory.
    I don't really see your point. If the GPU can be made to use either byte order, then the X server can tell it to use the (X) client's byte order rather than the server's byte order. If the GPU's byte order is hard-coded, then a driver for a big-endian system with a little-endian GPU would need to twiddle the buffer contents based upon the commands which use the buffer.
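    For what it's worth, here is a minimal sketch (plain C; the helper name is mine, not from any driver) of the kind of in-place 32-bit swap such a driver would have to apply; it can only do so correctly if it knows which ranges of the buffer a given command actually interprets as 32-bit values:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical helper: byte-swap a run of 32-bit values in place.
     * A driver bridging a big-endian CPU and a little-endian GPU would
     * need something like this, applied only to the regions a command
     * treats as 32-bit data (e.g. a GL_UNSIGNED_INT index buffer). */
    static void swap_u32_in_place(uint32_t *data, size_t count)
    {
        for (size_t i = 0; i < count; ++i) {
            uint32_t v = data[i];
            data[i] = (v >> 24) | ((v >> 8) & 0x0000FF00u)
                    | ((v << 8) & 0x00FF0000u) | (v << 24);
        }
    }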

    Quote Originally Posted by Alfonse Reinheart View Post
    1: It requires having more than one computer.
    That's the case for practically anything beyond "home" use.

    Quote Originally Posted by Alfonse Reinheart View Post
    2: Doing so requires being Linux-only.
    I regularly run an X server on Windows systems.

    Quote Originally Posted by Alfonse Reinheart View Post
    3: It relies on the asymmetric computing situation, where your local terminal is weak and a central server has all the processing power. This situation becomes less valid every day. Between GLES 3.0-capable smart phones and Intel's 4.1-class integrated GPUs, the chance of not being able to execute OpenGL code locally is very low.
    The example of smartphones is one where it's useful. The local terminal has decent graphics capability (where the server may have none) but limited CPU, memory and storage capacity, making it a reasonable "terminal" for a back-end system but not so good as a stand-alone system.

    Quote Originally Posted by Alfonse Reinheart View Post
    It's very difficult to exploit this feature unless it's explicitly part of your application's design requirements.
    It's trivial to exploit this feature. Every X11 GUI application automatically has the ability to be run remotely. Well, except for ones which rely upon OpenGL 3 support, although it's not just the lack of GLX wire protocol which makes such reliance problematic at present.
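    To illustrate just how trivial: whether a program displays locally or remotely is decided purely by the display string it connects with, and the application code is identical either way. A minimal sketch in plain C ("workstation:0" is a made-up host name):

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        /* Pass NULL to use $DISPLAY instead; that is how ordinary X11
         * programs end up displaying remotely without containing any
         * networking code of their own. */
        Display *dpy = XOpenDisplay("workstation:0");
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        printf("connected to %s\n", DisplayString(dpy));
        XCloseDisplay(dpy);
        return 0;
    }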

    It's useful enough that there is no shortage of attempts to retrofit similar functionality onto other platforms.

  3. #13 (Junior Member, Regular Contributor; Join Date: Dec 2009; Posts: 211)
    Quote Originally Posted by GClements View Post
    I don't really see your point. If the GPU can be made to use either byte order, then the X server can tell it to use the (X) client's byte order rather than the server's byte order. If the GPU's byte order is hard-coded, then a driver for a big-endian system with a little-endian GPU would need to twiddle the buffer contents based upon the commands which use the buffer.
    Well, indirect GLX allows the sharing of buffer objects between different clients, so "the" client byte order may be ambiguous. The way GLX handles this is that it doesn't allow the creation of any GL context that includes buffer objects unless the client explicitly opts in to the different byte-order semantics (via the GLX_CONTEXT_ALLOW_BUFFER_BYTE_ORDER_MISMATCH_ARB attribute), and then the client (i.e. your application) is responsible for filling the buffer in the server byte order.

  4. #14 - thokra (Senior Member, OpenGL Pro; Join Date: Apr 2010; Location: Germany; Posts: 1,128)
    Quote Originally Posted by GClements
    I regularly run an X server on Windows systems.
    That just feels wrong. Seriously though, you probably are among a select few there.

    Quote Originally Posted by GClements View Post
    The example of smartphones is one where it's useful. The local terminal has decent graphics capability (where the server may have none) but limited CPU, memory and storage capacity, making it a reasonable "terminal" for a back-end system but not so good as a stand-alone system.
    Am I getting this right? Do you suggest offloading rendering to your smart phone over the network is a reasonable use-case for supporting such capabilities?

    Quote Originally Posted by GClements View Post
    Every X11 GUI application automatically has the ability to be run remotely. Well, except for ones which rely upon OpenGL 3 support, although it's not just the lack of GLX wire protocol which makes such reliance problematic at present.
    At least on Linux distributions that go down that path, as soon as X is dropped in favor of Wayland or Wayland-like architectures, remote rendering isn't available anymore. At least not with vanilla Wayland. You can layer stuff on top of Wayland but in general the capability is gone.

  5. #15 - kRogue (Advanced Member, Frequent Contributor; Join Date: Apr 2009; Posts: 593)
    I'd like to make my (usual) case for why/how I think the entire remote rendering jazz of X is borderline useless. Here goes: in times past, the idea was that the terminal (the thing that did the displaying) had a very poor CPU and could only really be used for displaying stuff. This idea made perfect sense ages ago.

    Then X came along, and now that terminal needs to run an XServer. The powerful remote machine would then send the drawing commands over the wire for the terminal to display. To be honest, this sounds kind of neat and in decades past it was not a bad idea.

    Now enter OpenGL; that means the terminal needs to have a good GPU to render stuff at a reasonable speed. If a box has a good GPU, it likely has a reasonable CPU. I suppose there are severe corner cases where some super-hefty CPU box is doing lots of calculations and the terminal needs to visualize the data, and the visualization does not require sending oodles of data. Seems to me like a rare corner case.

    It gets worse: implementing a good XServer driver system is pain, severe pain. OpenGL remote rendering is very touch and go anyway; it can be tricky to set up, there are limits on what one can expect to work well... can you imagine how poorly something like glMapBuffer is going to work? It is hideous. X imposes a very severe implementation burden, and the benefits of that burden are rarely used; more often than not, when remote rendering really is used, bad things and bad surprises happen.

    Even ignoring the OpenGL thing, most UI toolkits usually do NOT want to use X to draw. Qt prefers to draw everything itself (it does have an X backend which is labeled as native, and it performs horribly compared to the raster backend). Similar story with Cairo, GDK, and on and on.

    When X dies, it will likely be a very, very good thing for the Linux desktop. To give an idea of how bad X really is, watch the talk below; the fellow talking was a major contributor to X and essentially concludes that X is not working:

    http://www.youtube.com/watch?v=RIctzAQOe44


    fast forward to 18:45... bit of a shocker.
    Last edited by kRogue; 08-26-2013 at 02:31 PM.

  6. #16 (Member, Regular Contributor; Join Date: Apr 2009; Posts: 268)
    Quote Originally Posted by thokra View Post
    That just feels wrong. Seriously though, you probably are among a select few there.
    We are many. I do that as well. It's actually pretty handy (well, the X part, not GLX). But then again, it's primarily used for some pretty obscure stuff.

  7. #17 - GClements (Member, Regular Contributor; Join Date: Jun 2013; Posts: 491)
    Quote Originally Posted by thokra View Post
    That just feels wrong. Seriously though, you probably are among a select few there.
    Hummingbird (since acquired by OpenText) basically built their business on eXceed (a commercial X server for Windows), so it can't be that rare.

    Quote Originally Posted by thokra View Post
    Am I getting this right? Do you suggest offloading rendering to your smart phone over the network is a reasonable use-case for supporting such capabilities?
    If you're going to use a smartphone or tablet as a terminal, using X avoids having to construct a separate client for each platform for each application.

    Quote Originally Posted by thokra View Post
    At least on Linux distributions that go down that path, as soon as X is dropped in favor of Wayland or Wayland-like architectures,
    That would be Ubuntu. Everyone else seems to view Wayland/Mir as an API for the X server to communicate with device drivers.

  8. #18 - Alfonse Reinheart (Senior Member, OpenGL Guru; Join Date: May 2009; Posts: 4,948)
    Quote Originally Posted by GClements View Post
    If you're going to use a smartphone or tablet as a terminal, using X avoids having to construct a separate client for each platform for each application.
    Let's look at the evolution of, well, all computing.

    In the earliest days, computers were gigantic. But they were kinda useful. So people found a way to make these large, centralized computers which could be used by multiple people. Thus, the smart server/dumb terminal paradigm came to be. Time passes and computers get a lot smaller. Personal computers made dumb terminals... effectively obsolete. They're still used in places, but it is exceedingly rare. Even when you're networking to a smart server, you're generally using it from a smart terminal.

    In the earliest days, the web was very much server-only. The server coughed up HTML. Someone invented PHP scripts that allowed server-side mucking with the HTML. Again, you have smart server/dumb terminal, just with the web browser as the dumb one. Fast-forward to... today. Sure, PHP scripts still exist, but client-side scripting via JavaScript is all the rage. You can't effectively navigate half the web without JavaScript on.

    In every case, we started with dumb terminals, then slowly traded them up for smart ones. That is the nature of computing: client-side wins in the long term. And the same is true for OpenGL: client-side won. There are numerous features of modern OpenGL that only improve performance if everything is running on the same machine. Mapping buffers for example would absolutely murder performance for a networked renderer compared to even a much slower client-side GPU.

    That doesn't mean that some people can't find uses for it. But it's very much a niche application, so niche that the ARB is spending precious little time keeping the protocol up-to-date.

    Quote Originally Posted by GClements View Post
    If you're going to use a smartphone or tablet as a terminal, using X avoids having to construct a separate client for each platform for each application.
    Or you could make your application completely independent of a network, and therefore more usable and reliable. No network hiccup or going through a tunnel or whatever can interrupt your client-side application. Not to mention faster in many cases. Smartphones may not have the best GPUs, but they're reasonably serviceable for most needs.

    Also, using X does nothing for being able to write a platform-independent client. Sure, your rendering code may be independent, but that would be just as true if you were using straight OpenGL ES. You still need the platform-specific setup work; even initializing an application that will use X differs between platforms. Not to mention processing input or any of the other tasks you need to do. Oh sure, minor quirks between implementations would not exist, but the majority of your porting work doesn't deal with them anyway.

  9. #19 - GClements (Member, Regular Contributor; Join Date: Jun 2013; Posts: 491)
    Quote Originally Posted by kRogue View Post
    Now enter OpenGL; that means the terminal needs to have a good GPU to render stuff at a reasonable speed. If a box has a good GPU, it likely has a reasonable CPU. I suppose there are severe corner cases where some super-hefty CPU box is doing lots of calculations and the terminal needs to visualize the data, and the visualization does not require sending oodles of data. Seems to me like a rare corner case.
    Not really. Dedicated server systems often don't have any kind of GPU. It's not that useful when the system is serving many users, none of whom are in physical proximity to the server.

    Quote Originally Posted by kRogue View Post
    It gets worse: implementing a good XServer driver system is pain, severe pain. OpenGL remote rendering is very touch and go anyway; it can be tricky to set up,
    It shouldn't require any setup, beyond what is required for X itself and the OpenGL driver. To the driver, the X server is just another client.

    Quote Originally Posted by kRogue View Post
    there are limits on what one can expect to work well...
    To be honest, I don't expect OpenGL with direct rendering to work well on Linux. It isn't a high priority for the hardware vendors, the hardware is complex, and the hardware vendors historically haven't been particularly open with technical specifications.

    Quote Originally Posted by kRogue View Post
    can you imagine how poorly something like glMapBuffer is going to work?
    That depends upon how badly it's misused. If you map an entire buffer but only read/write a portion of it, that's going to be inefficient. It will be far more inefficient with GLX, but it's significant in any case. Use of glMapBufferRange() with the invalidate/flush bits shouldn't be any worse than glBufferSubData() or glGetBufferSubData() (clearly, you can't avoid actually transferring data over the network).
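    To make that concrete, here is a rough sketch (desktop GL 3.0+, assuming an existing buffer object and a current context; the GL calls are standard, everything else is illustrative) of the pattern being described, next to the glBufferSubData equivalent:

    #include <string.h>
    #include <GL/glew.h>    /* or any other loader exposing GL 3.0+ entry points */

    /* Update bytes [offset, offset+len) of 'buf' from 'src'. With
     * GL_MAP_INVALIDATE_RANGE_BIT and GL_MAP_FLUSH_EXPLICIT_BIT the
     * implementation only has to move the flushed range, so over an
     * indirect/GLX connection this is roughly the same traffic as a
     * glBufferSubData of the same range. */
    static void update_range_mapped(GLuint buf, GLintptr offset,
                                    GLsizeiptr len, const void *src)
    {
        glBindBuffer(GL_ARRAY_BUFFER, buf);
        void *ptr = glMapBufferRange(GL_ARRAY_BUFFER, offset, len,
                                     GL_MAP_WRITE_BIT |
                                     GL_MAP_INVALIDATE_RANGE_BIT |
                                     GL_MAP_FLUSH_EXPLICIT_BIT);
        if (!ptr)
            return;
        memcpy(ptr, src, (size_t)len);
        /* The flush offset is relative to the start of the mapped range. */
        glFlushMappedBufferRange(GL_ARRAY_BUFFER, 0, len);
        glUnmapBuffer(GL_ARRAY_BUFFER);
    }

    /* The single-call equivalent being compared against. */
    static void update_range_subdata(GLuint buf, GLintptr offset,
                                     GLsizeiptr len, const void *src)
    {
        glBindBuffer(GL_ARRAY_BUFFER, buf);
        glBufferSubData(GL_ARRAY_BUFFER, offset, len, src);
    }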

    Quote Originally Posted by kRogue View Post
    It is hideous. X imposes a very severe implementation burden, and the benefits of that burden are rarely used; more often than not, when remote rendering really is used, bad things and bad surprises happen.
    This isn't my experience.

    Quote Originally Posted by kRogue View Post
    Even ignoring the OpenGL thing, most UI toolkits usually do NOT want to use X to draw. Qt prefers to draw everything itself (it does have an X backend which is labeled as native, and it performs horribly compared to the raster backend). Similar story with Cairo, GDK, and on and on.
    All of those use X. Maybe you're confusing "core X protocol" with XRender?

  10. #20 - kRogue (Advanced Member, Frequent Contributor; Join Date: Apr 2009; Posts: 593)
    Quote Originally Posted by GClements View Post
    All of those use X. Maybe you're confusing "core X protocol" with XRender?
    No; all of those use X to do exactly the following:
    1. Create -one- window
    2. Poll X for events


    All the drawing is done to a -buffer- by the toolkit. The entire "remote" rendering thing is dead. Running the program on one machine and displaying on another usually means that the buffer (the window contents) is sent over the wire. What you have now is essentially a really crappy per-window VNC. One can claim that if GL were network-happy on the XServer then the application would send the GL commands to the XServer and all would be great; but it does not happen that way. Sorry.
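    To spell out that pattern, here is a deliberately minimal Xlib sketch (plain C, not taken from any real toolkit; assumes a 32-bpp TrueColor default visual): one window, an event loop, and all drawing done into a client-side buffer that X only gets to copy onto the screen.

    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    #define W 256
    #define H 256

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                         W, H, 0, 0, BlackPixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* The "toolkit" part: render into an ordinary memory buffer... */
        char *pixels = calloc((size_t)W * H * 4, 1);
        for (int i = 0; i < W * H * 4; i += 4)
            pixels[i + 2] = (char)0xff;            /* solid red, BGRX layout */
        XImage *img = XCreateImage(dpy, DefaultVisual(dpy, scr),
                                   DefaultDepth(dpy, scr), ZPixmap, 0,
                                   pixels, W, H, 32, 0);

        /* ...and use X only to create the window, poll events, and push
         * the finished buffer. Run remotely, every expose ships the whole
         * image over the wire: the "per-window VNC" behaviour above. */
        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XPutImage(dpy, win, DefaultGC(dpy, scr), img,
                          0, 0, 0, 0, W, H);
            else if (ev.type == KeyPress)
                break;
        }
        XDestroyImage(img);        /* also frees 'pixels' */
        XCloseDisplay(dpy);
        return 0;
    }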


    Quote Originally Posted by GClements View Post
    It shouldn't require any setup, beyond what is required for X itself and the OpenGL driver. To the driver, the X server is just another client.
    OpenGL resides on the XServer. The OpenGL implementation is then required to be able to take commands from a remote device (the client). OpenGL itself, together with GLX, is often part of the X driver. Pretending that it will just work is putting one's head in the sand; it requires heroic efforts to make a GL implementation take commands from a remote source. Compounding the pain is that many GL features do not even really make sense in this case; my favorite is glMapBuffer, but there are others.

    Quote Originally Posted by GClements View Post
    To be honest, I don't expect OpenGL with direct rendering to work well on Linux. It isn't a high priority for the hardware vendors, the hardware is complex, and the hardware vendors historically haven't been particularly open with technical specifications.
    Huh?! AMD has released the specs for its GPUs (outside of video decode); Intel's GL driver for Linux is entirely open source. Let's take a real look at why it is not there: the effort to make remote rendering just work is borderline heroic. The underlying framework (DRI2) does not work over a network.

    Regardless, this proves my point: remote rendering is such a rarely used/wanted feature that it is not really implemented. Exactly my point. If there were commercial demand, it would be. Therefore the only ones wanting it are, well, no offense, borderline Slashdot trolls.

    Please, everyone who thinks X is network-transparent and great: take the hour to watch that video (or stop when he talks about how great Wayland is); it will wake you up to the reality that X should die.

