View Full Version : Enabling the GPU for VBOs



gmseed
11-11-2011, 12:54 AM
Hi

I coded up the VBOs sample at:

http://www.java-tips.org/other-api-tips/jogl/vertex-buffer-objects-nehe-tutorial-jogl-port-2.html

and load a large model.

I then installed Open Hardware Monitor:

http://openhardwaremonitor.org

to see how hard the GPU was working while rotating the model.

However, Open Hardware Monitor reports that the GPU is not doing any work.

When using VBOs with JOGL does the GPU have to be explicitly activated?

Thanks

Graham

Alfonse Reinheart
11-11-2011, 01:10 AM
I then installed Open Hardware Monitor:

to see how hard the GPU was working while rotating the model.

The hardware monitor appears to simply look at GPU temperatures, fan speed, and clock speed. Considering that you're throwing 32K static, unlit, untextured triangles at the GPU, I doubt the GPU would notice.

That workload isn't going to put the GPU into higher stress modes, especially if this is a high-end GPU. My 80-shader integrated HD-3300 can chew through 32K untextured, unlit triangles without noticing. So any real GPU will barely skip a beat.

Odds are, your GPU is bored.

Or, to put it another way, the Open Hardware Monitor (or any other hardware monitoring program) should not be taken as a reliable measure of whether the GPU is being used for rendering.

gmseed
11-11-2011, 01:44 AM
Hi

Thanks for your prompt reply.

How can I tell that my code is set up correctly and is using the GPU as expected, rather than falling back to my backup glVertexPointer() implementation?

Graham

_arts_
11-11-2011, 03:07 AM
Theoretically, if you use STATIC_DRAW_ARB for your VBO, the buffer will be stored in graphics memory (the usage flag is only a hint to the driver, though).

Note: you always need to call glVertexPointer or glVertexAttribPointer. Without them, you simply can't draw any vertex arrays, with or without VBOs.
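For reference, the usual JOGL setup order is: pack the vertex data into a direct NIO buffer, upload it once with glBufferData(..., GL_STATIC_DRAW), then point glVertexPointer at an offset while the buffer is bound. A minimal sketch of the packing step follows; the GL calls are shown only as comments, and all names are illustrative rather than taken from your code:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class VboPacking {
    // Pack an array of xyz coordinates into a direct FloatBuffer,
    // which is the form JOGL's glBufferData expects.
    static FloatBuffer packVertices(float[] xyz) {
        FloatBuffer buf = ByteBuffer
                .allocateDirect(xyz.length * Float.BYTES) // direct = reachable by the driver
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        buf.put(xyz);
        buf.rewind(); // glBufferData reads from the current position
        return buf;
    }

    public static void main(String[] args) {
        float[] triangle = { 0f, 0f, 0f,  1f, 0f, 0f,  0f, 1f, 0f };
        FloatBuffer buf = packVertices(triangle);
        System.out.println(buf.remaining()); // 9 floats ready to upload

        // With a GL context current, the one-time upload would then look like:
        // int[] ids = new int[1];
        // gl.glGenBuffers(1, ids, 0);
        // gl.glBindBuffer(GL.GL_ARRAY_BUFFER, ids[0]);
        // gl.glBufferData(GL.GL_ARRAY_BUFFER, buf.remaining() * Float.BYTES,
        //                 buf, GL.GL_STATIC_DRAW);
        // gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0); // last arg is an offset, not a pointer
    }
}
```

Note that with a VBO bound, the final argument to glVertexPointer is a byte offset into the buffer, not a client-memory pointer.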

mbentrup
11-11-2011, 03:10 AM
Also, the GPU would have to do *more* work if VBOs were disabled, because it would have to read the data from system memory every frame.

Alfonse Reinheart
11-11-2011, 10:20 AM
How do I tell that my code's setup correctly and using the GPU as expected and not using my backup glVertexPointer() implementation?

What `backup glVertexPointer() implementation`? Are you talking about code that you wrote? If you want to know if you're running code you wrote, I suggest looking into a good debugger. Put a breakpoint in this "backup" function and see if it gets called.
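One other low-tech check, sketched below: NeHe-style VBO samples typically choose between the VBO path and the plain glVertexPointer fallback by scanning the GL_EXTENSIONS string for GL_ARB_vertex_buffer_object, so logging which branch is taken tells you which path runs. The helper name and the fake extensions string here are illustrative, not from the linked sample:

```java
public class VboCheck {
    // Decide between the VBO path and the plain glVertexPointer fallback,
    // the way NeHe-style samples do: by scanning the GL_EXTENSIONS string.
    // In JOGL you would obtain it with gl.glGetString(GL.GL_EXTENSIONS).
    static boolean useVboPath(String extensions) {
        if (extensions == null) return false;
        for (String ext : extensions.split(" ")) {
            if (ext.equals("GL_ARB_vertex_buffer_object")) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // A made-up extensions string, standing in for the real driver string.
        String fake = "GL_ARB_multitexture GL_ARB_vertex_buffer_object GL_EXT_fog_coord";
        System.out.println(useVboPath(fake) ? "VBO path" : "fallback glVertexPointer path");
    }
}
```

A one-line log like this at startup answers the "which path am I on" question without a debugger at all.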

gmseed
11-11-2011, 02:33 PM
Are you talking about code that you wrote? --> Yes.

If you want to know if you're running code you wrote, I suggest looking into a good debugger. --> OK. And a debugger that steps into the GPU?

Put a breakpoint in this "backup" function and see if it gets called. --> Yes, I know about breakpoints.

Thanks.

kyle_
11-11-2011, 02:57 PM
it would have to read the data from system memory every frame.


This is not entirely true.

Alfonse Reinheart
11-11-2011, 03:04 PM
If you want to know if you're running code you wrote, I suggest looking into a good debugger. --> OK. And a debugger that steps into the GPU?

I don't know how that has anything to do with what you asked. You asked, "How do I tell that my code's setup correctly and using the GPU as expected and not using my backup glVertexPointer() implementation?"

The "GPU" cannot use your "backup glVertexPointer() implementation", because as you stated, that is your code. Since it cannot use it, it is not using it. So your question answers itself.

I think you're trying to verify something that really doesn't matter. If your buffer object rendering code works, then obviously the GPU is doing something with it. If you're wondering whether it is the optimal way to render something, that is a different (and generally unanswerable) question.