Thread: VBO and vertex program in software

  1. #1
    Advanced Member Frequent Contributor
    Join Date
    Oct 2001
    Posts
    596

    VBO and vertex program in software

    I got strange results when I tried VBO on my GF4 MX (nForce2). Normally it works as it should, with increased FPS. But when I tried bump mapping, the FPS dropped fast. Moving the vertex arrays back to the normal routines (bypassing the VBO) restored the FPS again.
    On my GF3 it worked fine the whole time.

    So the question is: doesn't the GF4 MX emulate vertex programs? And if so, can the slowdown be because it has to transfer the data from AGP/video memory back to system memory, do the calculations, and transfer it back? In that case they must either let the driver take care of this (using a 'vertex shadow' in system memory while vertex programs are active), or let the developer know what will happen through some glGet, so that we can avoid this behavior.

    Or maybe it's just some bug in my code.
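
    For reference, the two submission paths being compared look roughly like this. This is a minimal sketch using the ARB_vertex_buffer_object entry points; on Windows these must be fetched with wglGetProcAddress (omitted here), and the tightly packed XYZ vertex layout is just an illustration:

        #define GL_GLEXT_PROTOTYPES
        #include <GL/gl.h>
        #include <GL/glext.h>

        /* Path A: vertex data lives in a buffer object. */
        void draw_with_vbo(GLuint vbo, GLsizei vertex_count)
        {
            glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
            glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0); /* offset into the VBO */
            glEnableClientState(GL_VERTEX_ARRAY);
            glDrawArrays(GL_TRIANGLES, 0, vertex_count);
            glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);
        }

        /* Path B: bypass the VBO and source from system memory. */
        void draw_with_client_array(const GLfloat *vertices, GLsizei vertex_count)
        {
            glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);   /* no buffer bound */
            glVertexPointer(3, GL_FLOAT, 0, vertices); /* plain CPU-side pointer */
            glEnableClientState(GL_VERTEX_ARRAY);
            glDrawArrays(GL_TRIANGLES, 0, vertex_count);
        }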

  2. #2
    Senior Member OpenGL Guru
    Join Date
    Mar 2001
    Posts
    3,576

    Re: VBO and vertex program in software

    VBOs and software vertex programs are a bad combination. Typically, VBO memory is not memory that the CPU should read from. Of course, software vertex programs require the CPU to read from that memory. Hence the slowdown.

    I don't think nVidia is going to put much driver emphasis on what to do for someone who is using software vertex programs.
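
    To make that concrete: a software vertex-program path effectively has to do something like the following every frame. This is an illustrative sketch, not actual driver code; run_vertex_program is a hypothetical CPU transform, and tightly packed XYZ floats are assumed:

        #define GL_GLEXT_PROTOTYPES
        #include <GL/gl.h>
        #include <GL/glext.h>

        extern void run_vertex_program(const GLfloat *xyz); /* hypothetical CPU transform */

        void software_vp_over_vbo(GLuint vbo, GLsizei vertex_count)
        {
            glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
            /* Mapping for reading pulls the data back to the CPU... */
            const GLfloat *src = (const GLfloat *)
                glMapBufferARB(GL_ARRAY_BUFFER_ARB, GL_READ_ONLY_ARB);
            for (GLsizei i = 0; i < vertex_count; ++i)
                run_vertex_program(&src[i * 3]); /* ...and every access is an
                                                    uncached read over AGP */
            glUnmapBufferARB(GL_ARRAY_BUFFER_ARB);
            glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);
        }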

  3. #3
    Advanced Member Frequent Contributor
    Join Date
    Oct 2001
    Posts
    596

    Re: VBO and vertex program in software

    I mean 'software vertex program' as in: ARB_vertex_program is emulated on the CPU where the chip isn't able to do it but the driver is (i.e., the emulation is hidden from the developer).

  4. #4
    Advanced Member Frequent Contributor
    Join Date
    May 2001
    Location
    France
    Posts
    765

    Re: VBO and vertex program in software

    With VBO, vertex data may be stored on the graphics card, but the vertex program software emulation has to be performed on the CPU side.
    In that case, the vertex program computations need to download the vertex data from the graphics card to the CPU side every time; and because AGP is optimized for upload (not for download), downloading the vertex data every time is expensive.

  5. #5
    Senior Member OpenGL Pro Zengar's Avatar
    Join Date
    Sep 2001
    Location
    Germany
    Posts
    1,932

    Re: VBO and vertex program in software

    The problem is: on cards that support vertex programs only in software, the driver should keep a copy of the VBO in system memory to avoid the performance slowdown.
    However, how can I detect whether vertex programs are supported in hardware? If that's impossible, then VBO doesn't make much sense for pre-GeForce3 cards. I like to use both VBO and vertex programs in my programs, so I must be sure it will work on all cards.
    nVidia should really consider it.

  6. #6
    Advanced Member Frequent Contributor
    Join Date
    May 2001
    Location
    France
    Posts
    765

    Re: VBO and vertex program in software

    I don't think that "nVidia should consider it". Every nVidia chip prior to NV20 emulates vertex programs in software. So when you detect an NVxx graphics card, if xx is less than 20 you may disable vertex programs.

  7. #7
    Senior Member OpenGL Guru
    Join Date
    Mar 2001
    Posts
    3,576

    Re: VBO and vertex program in software

    I don't think that "nVidia should consider it". Every nVidia chip prior to NV20 emulates vertex programs in software. So when you detect an NVxx graphics card, if xx is less than 20 you may disable vertex programs.
    Actually, I disagree on two points.

    First, I think you should turn off VBO before you turn off vertex programs. Given a moderately decent CPU, you can still get decent speed out of the feature. Also, I would imagine that the power of a vertex program is more important to the user (who has an older card) than the speed of VBOs.

    Secondly, there is no readily available way to "detect an NVxx graphics card." If nVidia isn't going to expose a means of testing vertex-program-with-VAR/VBO performance, then their drivers should have to deal with the situation. Granted, I seriously doubt nVidia is going to spend significant driver development resources on two-year-old cards, but that's the correct solution.
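
    A minimal sketch of that fallback order, where vp_runs_in_software() stands in for whatever detection the application can manage (the function and struct names are hypothetical):

        extern int vp_runs_in_software(void); /* hypothetical app-side guess */

        typedef struct {
            int use_vbo; /* keep vertex data in buffer objects? */
            int use_vp;  /* enable ARB_vertex_program effects?  */
        } render_paths;

        render_paths choose_paths(int have_vbo, int have_vp)
        {
            render_paths p;
            p.use_vp  = have_vp; /* keep the feature the user actually sees... */
            p.use_vbo = have_vbo &&
                        !(have_vp && vp_runs_in_software());
            /* ...but source from client-side arrays when the CPU
               has to read the vertex data anyway. */
            return p;
        }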

  8. #8
    Senior Member OpenGL Pro Zengar's Avatar
    Join Date
    Sep 2001
    Location
    Germany
    Posts
    1,932

    Re: VBO and vertex program in software

    I agree with Korval on all points. Vertex programs are more important than VBO :-)

  9. #9
    Senior Member OpenGL Guru Humus's Avatar
    Join Date
    Mar 2000
    Location
    Stockholm, Sweden
    Posts
    2,345

    Re: VBO and vertex program in software

    The problem is solvable on the driver side, but I don't expect this to be a top priority until apps using VBO appear on the market. The driver can make system memory copies for the CPU processing. The hard problem is deciding when to do that. The driver would have to track how each buffer is used and see whether it would be beneficial to make a system memory copy. For well-behaved apps this should be feasible.
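
    The bookkeeping could look something like this per buffer. This is purely illustrative and not any real driver's internals; the structure and threshold are made up:

        #include <stdlib.h>
        #include <string.h>

        typedef struct {
            void        *vidmem;        /* the buffer's video/AGP storage    */
            void        *sysmem_shadow; /* NULL until the heuristic kicks in */
            size_t       size;
            unsigned int cpu_reads;     /* times software T&L has touched it */
        } buffer_record;

        /* Called whenever software vertex processing needs the vertex data. */
        const void *cpu_source(buffer_record *buf)
        {
            if (++buf->cpu_reads >= 2 && buf->sysmem_shadow == NULL) {
                /* Read twice already: assume it will keep happening, and
                   pay for one last slow download to build the shadow.   */
                buf->sysmem_shadow = malloc(buf->size);
                memcpy(buf->sysmem_shadow, buf->vidmem, buf->size);
            }
            return buf->sysmem_shadow ? buf->sysmem_shadow : buf->vidmem;
        }

    The shadow would also have to be refreshed whenever the application updates the buffer contents, which is exactly the usage tracking cost mentioned above.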

  10. #10
    Advanced Member Frequent Contributor
    Join Date
    May 2001
    Location
    France
    Posts
    765

    Re: VBO and vertex program in software

    Originally posted by Korval:
    First, I think you should turn off VBO before you turn off vertex programs.
    So far, I don't remember having written that vertex programs should be disabled 'before' VBO.
    Though it all depends on the application (as usual), since sometimes quality is a must and sometimes speed is a must.


    Originally posted by Korval:
    Secondly, there is no readily available way to "detect an NVxx graphics card."
    Even though I agree that some kind of glGetIntegerv(GL_CHIPSET_NV) is still lacking, you can actually check glGetString(GL_RENDERER), which IMO gives sufficient information for at least the 4-5 years to come. And if your software survives that period, you can still deliver an online patch that tests brand-new 2008 graphics cards.
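
    For example, a sketch of such a renderer-string test. The substrings are examples only; real renderer strings vary between driver versions, so treat this as an approximation:

        #include <string.h>
        #include <GL/gl.h>

        int looks_like_software_vp(void)
        {
            const char *renderer = (const char *)glGetString(GL_RENDERER);
            if (renderer == NULL)
                return 0;
            /* GeForce2 and GeForce4 MX are NV1x-class parts, so
               ARB_vertex_program runs on the CPU there. */
            return strstr(renderer, "GeForce2")    != NULL ||
                   strstr(renderer, "GeForce4 MX") != NULL;
        }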


    Originally posted by Korval:
    Granted, I seriously doubt nVidia is going to spend significant driver development resources on two-year-old cards.
    That is at least one point on which we agree.
