VBO C code sample?

Hello!

Does anyone have a sample program for using vertex buffer objects in C under Linux?

I followed a tutorial but it does not work :(

Thanx
Christiane.

First, get a program working with traditional vertex arrays.
Then use the ARB_vertex_buffer_object specification to port the traditional VAs to VBOs, following the example at the end of the document.

Linux or not doesn’t make any difference (except extension loading maybe).

At which point are you stuck? Getting traditional VAs working? Porting to VBOs? Loading the extension? Finding an adequate VBO format? Mapping/unmapping the VBO? Filling a sub-part of the VBO?
As you can see there are many possible issues with VBOs, so please explain your problem in more detail.
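
Roughly, the port looks like this (just a sketch; it assumes the ARB entry points are already loaded, and `vertices`/`N` stand in for your own point data):

/* traditional vertex array: the data stays in client memory
   and is pulled over the bus at every draw call */
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_POINTS, 0, N);

/* same drawing with a VBO: upload the data once, then the last
   argument of glVertexPointer becomes an offset into the buffer */
GLuint vbo;
glGenBuffersARB(1, &vbo);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, N * 3 * sizeof(GLfloat), vertices, GL_STATIC_DRAW_ARB);

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, (void *)0);
glDrawArrays(GL_POINTS, 0, N);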

I read this tutorial:
http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=45
and “converted” it to C.

I added the following lines:

#include "glext.h"

PFNGLGENBUFFERSARBPROC glGenBuffersARB = NULL;
PFNGLBINDBUFFERARBPROC glBindBufferARB = NULL;
PFNGLBUFFERDATAARBPROC glBufferDataARB = NULL;
PFNGLDELETEBUFFERSARBPROC glDeleteBuffersARB = NULL;

…

glGenBuffersARB = (PFNGLGENBUFFERSARBPROC) wglGetProcAddress("glGenBuffersARB");
glBindBufferARB = (PFNGLBINDBUFFERARBPROC) wglGetProcAddress("glBindBufferARB");
glBufferDataARB = (PFNGLBUFFERDATAARBPROC) wglGetProcAddress("glBufferDataARB");
glDeleteBuffersARB = (PFNGLDELETEBUFFERSARBPROC) wglGetProcAddress("glDeleteBuffersARB");

…

The first error message I get is a parse error before glGenBuffersARB. The glext.h is there but for some reason the definition of PFNGLGENBUFFERSARBPROC (and all the other definitions) is not found.

I did a workaround and copied the following lines into my C file:

typedef void (APIENTRY * PFNGLBINDBUFFERARBPROC) (GLenum target, GLuint buffer);
typedef void (APIENTRY * PFNGLDELETEBUFFERSARBPROC) (GLsizei n, const GLuint *buffers);
typedef void (APIENTRY * PFNGLGENBUFFERSARBPROC) (GLsizei n, GLuint *buffers);
typedef void (APIENTRY * PFNGLBUFFERDATAARBPROC) (GLenum target, int size, const GLvoid *data, GLenum usage);

Now there is no parse error, but I get a segmentation fault when calling wglGetProcAddress("glGenBuffersARB");

I tried to use the SDL library, and that works in a way, but I want to use GLUT, so I wondered if that is possible.

Any ideas about the problems with glext.h and wglGetProcAddress?

thnx
Christiane

You should learn how to use OpenGL under Linux, and especially how to load extensions. There is documentation about that on this site. Just do a search.

Or you can, as you already started to, use SDL.

Hope this helps.

A link would help even more :)

I searched opengl.org before I posted to this forum. The search functionality doesn’t work; I get a server error. And Documentation->OpenGL Extensions just leads to SGI. I couldn’t find anything in "coding resources".

Christiane.

Instead of using wglGetProc…, use glxGetProcAddress or glxGetProcAddressARB (it seems to depend on your graphics card and drivers).

I don’t think you need something else.

Hope this helps.

Does your program even compile and link under Linux when you call wglGetProcAddress?!
Yes, you have to use glXGetProcAddressARB (with an upper-case X). It works essentially like wglGetProcAddress, granted that your GL driver supports GLX_ARB_get_proc_address (almost all drivers do).
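
For what it’s worth, here is a minimal GLUT skeleton of how it can look under Linux (just a sketch, untested on your setup; it assumes your GL/gl.h, GL/glext.h and GL/glx.h headers are recent enough and that the driver exposes GLX_ARB_get_proc_address):

/* build: gcc vbo.c -o vbo -lglut -lGL */
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* PFNGL...ARBPROC typedefs; needs gl.h first */
#include <GL/glx.h>     /* glXGetProcAddressARB */
#include <GL/glut.h>

static PFNGLGENBUFFERSARBPROC    glGenBuffersARB    = NULL;
static PFNGLBINDBUFFERARBPROC    glBindBufferARB    = NULL;
static PFNGLBUFFERDATAARBPROC    glBufferDataARB    = NULL;
static PFNGLDELETEBUFFERSARBPROC glDeleteBuffersARB = NULL;

static GLuint vbo = 0;

static void load_vbo_extension(void)
{
    /* glXGetProcAddressARB is the GLX counterpart of wglGetProcAddress */
    glGenBuffersARB    = (PFNGLGENBUFFERSARBPROC)    glXGetProcAddressARB((const GLubyte *)"glGenBuffersARB");
    glBindBufferARB    = (PFNGLBINDBUFFERARBPROC)    glXGetProcAddressARB((const GLubyte *)"glBindBufferARB");
    glBufferDataARB    = (PFNGLBUFFERDATAARBPROC)    glXGetProcAddressARB((const GLubyte *)"glBufferDataARB");
    glDeleteBuffersARB = (PFNGLDELETEBUFFERSARBPROC) glXGetProcAddressARB((const GLubyte *)"glDeleteBuffersARB");

    if (!glGenBuffersARB || !glBindBufferARB || !glBufferDataARB || !glDeleteBuffersARB) {
        fprintf(stderr, "GL_ARB_vertex_buffer_object not available\n");
        exit(EXIT_FAILURE);
    }
}

static void init_vbo(void)
{
    /* a single triangle, just to prove the VBO path works */
    static const GLfloat verts[] = { -0.5f, -0.5f, 0.0f,
                                      0.5f, -0.5f, 0.0f,
                                      0.0f,  0.5f, 0.0f };

    glGenBuffersARB(1, &vbo);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(verts), verts, GL_STATIC_DRAW_ARB);
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (void *)0);   /* offset into the bound VBO */
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("VBO test");

    load_vbo_extension();   /* after glutCreateWindow, so a GL context exists */
    init_vbo();

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

The headers are included in that order on purpose: glext.h only provides the typedefs and tokens, but it relies on the basic GL types from gl.h, which may well be why you got the parse error. Also note that the pointers are only looked up after glutCreateWindow, once a GL context exists.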

Thanx for your help!! It works now.

I’m drawing 1,000,000 points and now get 15 fps (instead of 7 fps without VBOs), but that’s not very much, is it? Is this rendered in software?

I use SUSE 9.1 and a GeForce FX 5200. I installed the NVIDIA drivers via YaST Online Update. How can I find out whether the NVIDIA driver is used by the system or not?

Christiane.

cat /proc/drivers/nvidia/agp/status will show you the current state of your system. If it’s enabled, then you already have it.

15,000,000 vertices per second sounds good for a GeForce FX 5200.
Is texturing enabled? Lighting? Fog? Vertex colors (do you call glColor per vertex)?

Originally posted by jide:
cat /proc/drivers/nvidia/agp/status will show you the current state of your system. If it’s enabled, then you already have it.
Status: Enabled
Driver: AGPGART
AGP Rate: 2x
Fast Writes: Disabled
SBA: Disabled

looks good :)

thanks a lot
Christiane

btw: it’s cat /proc/drivers/nvidia/agp/status

Originally posted by Christiane:

btw: it’s cat /proc/drivers/nvidia/agp/status

sorry, I meant
cat /proc/driver/nvidia/agp/status

Originally posted by vincoof:
15,000,000 vertices per second sounds good for a GeForce FX 5200.
Is texturing enabled? Lighting? Fog? Vertex colors (do you call glColor per vertex)?

Of course it’s not rendered in software, otherwise there wouldn’t be any speedup from using VBOs, right :) ?
I had a look at NVIDIA’s site, and they say a GeForce FX 5500 can handle 68 million vertices/sec. So yes, 15 million doesn’t sound that bad for a 5200 :)
I disabled lighting already and I’ll try the other things now.

Christiane.

It’s very difficult to reach the performance figures the graphics card vendor quotes.
I already tried that some years ago with a GeForce 2 MX. If I remember well (it was discussed a bit here), I managed to reach about half the maximum performance, or a little over. But I was using colors, lighting and single texturing…

Another factor in your current limit is that you’re using AGP 2x instead of 8x. And maybe there are other things that are not optimal.

Hope this helps.

AGP speed shouldn’t really matter for performance if the vertices are stored on the GPU side. It only matters while uploading, which is performed once. Then, if the arrays are big enough, you won’t notice the performance difference between a PCIe x16 bus and AGP 1x.
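
For illustration, with a GL_STATIC_DRAW_ARB usage hint the vertex data typically crosses the bus only once, at upload time (a sketch; `vertices` and `n_points` stand in for your data):

/* init: one transfer from system memory; the driver can then keep
   the data in video memory */
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, n_points * 3 * sizeof(GLfloat), vertices, GL_STATIC_DRAW_ARB);

/* every frame: only the draw command travels over the bus,
   so AGP 2x vs 8x hardly matters here */
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
glVertexPointer(3, GL_FLOAT, 0, (void *)0);
glDrawArrays(GL_POINTS, 0, n_points);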
