ARB_VBO in pascal



satan
04-03-2003, 12:31 AM
Hello everyone,

I am really not an advanced OGL coder, but I want to try ARB_VBO in my little engine. So far I have had very little success, meaning I just could not get it to work.
I think my problem is that I am unable to translate the following macro to Pascal.

#define BUFFER_OFFSET(i) ((char *)NULL + (i))

Perhaps this is considered OT; if so, I am very sorry. But I would really like to know how to get ARB_VBO working, since it is included in the extension string of the current NVIDIA Linux driver.

thanks in advance

tfpsly
04-03-2003, 01:39 AM
> #define BUFFER_OFFSET(i) ((char *)NULL + (i))

That's a weird line. It just converts a number to a pointer.
Try something like this:
var ptr: Pointer = nil;
Inc( ptr, i );

satan
04-03-2003, 01:45 AM
Originally posted by tfpsly:
> #define BUFFER_OFFSET(i) ((char *)NULL + (i))

That's a weird line. It just converts a number to a pointer.
Try something like this:
var ptr: Pointer = nil;
Inc( ptr, i );

I will try it, but I read in the specs that I sometimes have to use BUFFER_OFFSET(0), which would correspond to a nil pointer, if I understand it correctly.

Tom Nuydens
04-03-2003, 01:52 AM
Or even just a simple typecast:

ptr := Pointer(i);

It's normal that you'll end up with nil pointers sometimes. When using VBO, the "pointers" aren't really pointers anymore, they're offsets relative to the start of the data (measured in bytes). The offset of the first vertex in a vertex array would hence be 0 (or nil), the second would be sizeof(Vertex), and so on. The only reason you have to typecast the offset to a pointer is because glVertexPointer() and others require a pointer as an argument.

-- Tom

Roderic (Ingenu)
04-03-2003, 01:53 AM
NULL is often (if not always) equal to 0,

so you could probably safely replace nil with 0.

satan
04-03-2003, 03:04 AM
If it is just a typecast then my problem must be somewhere else, because I get an access violation at my glDrawRangeElements call (with standard VA everything was fine).
So here is my code:



// global variables
var
  VertexArray : ArrayOfGLFloat;
  NormalArray : ArrayOfGLFloat;
  Model       : ArrayOfGLuint;
  buffers     : array [0..2] of GLuint;

// my buffer init procedure
glGenBuffersARB(3, buffers);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[0]);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, VertexArray.Length*SizeOf(GLfloat), VertexArray.Data, GL_STATIC_DRAW_ARB);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[1]);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, NormalArray.Length*SizeOf(GLfloat), NormalArray.Data, GL_STATIC_DRAW_ARB);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[2]);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, Model.Length*SizeOf(GLuint), Model.Data, GL_STATIC_DRAW_ARB);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);

// rendering
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[0]);
glVertexPointer(3, GL_FLOAT, 0, pointer(0));
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[1]);
glNormalPointer(GL_FLOAT, 0, pointer(0));
glBindBufferARB(GL_ARRAY_BUFFER_ARB, buffers[2]);
glDrawRangeElements(GL_TRIANGLES,0,count-1,count,GL_UNSIGNED_INT,pointer(0));
glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0);

Tom Nuydens
04-03-2003, 03:06 AM
You need to use GL_ELEMENT_ARRAY_BUFFER_ARB as the target for your index array. GL_ARRAY_BUFFER_ARB is only valid for vertex data.

-- Tom

satan
04-03-2003, 03:22 AM
Originally posted by Tom Nuydens:
You need to use GL_ELEMENT_ARRAY_BUFFER_ARB as the target for your index array. GL_ARRAY_BUFFER_ARB is only valid for vertex data.

-- Tom

I replaced GL_ARRAY_BUFFER_ARB with GL_ELEMENT_ARRAY_BUFFER_ARB in the two glBindBuffer calls for my index array, but with no effect. I also added a glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0) in the buffer creation procedure. Still no effect (I would have been surprised if that had changed anything). Right before the glDrawRangeElements call, glGetError returns 0, so everything seems to be OK.

vincoof
04-03-2003, 04:01 AM
Ah, my mistake, sorry!

You may use GL_ELEMENT_ARRAY_BUFFER_ARB with glBindBufferARB and also with glBufferDataARB when specifying indices.

satan
04-03-2003, 06:49 AM
Despite so much help from all you nice people, I cannot get it to work.
Now I made a new project which just draws one little triangle: no normal or index array, just glDrawArrays. There is no access violation any more, but nothing is drawn, and after a few seconds the program simply dies without any error message. I should add that I had to write the bindings for gl.h myself; I don't expect that to be the problem, but since I am not sure I am posting that code too.
Here is the code:



// Variables
var VertexArray: array [0..8] of GLFloat;
buffers: array [0..0] of GLuint;

// Init
glEnableClientState(GL_VERTEX_ARRAY);
//
VertexArray[0]:=0;
VertexArray[1]:=1;
VertexArray[2]:=0;
VertexArray[3]:=-1;
VertexArray[4]:=-1;
VertexArray[5]:=0;
VertexArray[6]:=1;
VertexArray[7]:=-1;
VertexArray[8]:=0;
glGenBuffersARB(1,buffers);
glBindBufferARB(GL_ARRAY_BUFFER_ARB,buffers[0]);
glBufferDataARB(GL_ARRAY_BUFFER_ARB,9*SizeOf(GLFloat),@VertexArray,GL_STATIC_DRAW_ARB);
glBindBufferARB(GL_ARRAY_BUFFER_ARB,0);

// Render
glBindBufferARB(GL_ARRAY_BUFFER_ARB,buffers[0]);
glVertexPointer(3,GL_FLOAT,0,pointer(0));
glDrawArrays(GL_TRIANGLES,0,3);

// Bindings
procedure glBindBufferARB(target: GLenum; buffer: GLuint); cdecl; external;
procedure glBufferDataARB(target: GLenum; size: GLsizeiptrARB; data: GLvoid; usage: GLenum); cdecl; external;
procedure glGenBuffersARB(n: GLsizei; buffers: PGLuint); cdecl; external;

// types
GLvoid = Pointer;
GLenum = Cardinal; //(DWord)
GLuint = Cardinal;
PGLuint = ^GLuint;
GLsizei = LongInt;
GLsizeiptrARB = LongInt;


Hopefully one of you gurus can enlighten me.

(edit: corrected the code tag)

[This message has been edited by satan (edited 04-03-2003).]

Zengar
04-03-2003, 06:54 AM
Shouldn't it be stdcall?

And it would be a nice idea to load the entry points dynamically (via wglGetProcAddress).


[This message has been edited by Zengar (edited 04-03-2003).]

vincoof
04-03-2003, 07:14 AM
Do you have backface culling enabled? Is the triangle visible with 'traditional' vertex arrays?

[This message has been edited by vincoof (edited 04-03-2003).]

satan
04-03-2003, 08:00 AM
Originally posted by Zengar:
Shouldn't it be stdcall?

And it would be a nice idea to load the entry points dynamically (via wglGetProcAddress).


[This message has been edited by Zengar (edited 04-03-2003).]
If you take a look at my first post you will see that it is definitely not a good idea to use wglGetProcAddress at all.
And what do you mean by stdcall? Sounds like C to me.
And of course it is visible with 'traditional' vertex arrays, or else I would not be posting here.
So it sounds like there is nothing really wrong with my code; or how should I interpret your answers?

vincoof
04-03-2003, 08:06 AM
Everything seems fine, apart from the fact that I'd rather call glBufferDataARB(GL_ARRAY_BUFFER_ARB,9*SizeOf(GLFloat),VertexArray,GL_STATIC_DRAW_ARB); (remove the '@' character in front of VertexArray), since you're sending the array, not a pointer to the array. Or if I'm wrong, then you may call glGenBuffersARB(1, @buffers); (add the '@' character). It's been a long time since I used C functions in Pascal, so I don't remember whether it's better to send a pointer to the array or the array itself. It's just a detail, but it can make the difference.

Zengar
04-03-2003, 08:37 AM
satan, I didn't understand whether you code under Windows or under Linux. As you work with Delphi, I assume it's Windows. In Windows, all API entry points should be stdcall. If I am wrong, please correct me.

Zengar
04-03-2003, 08:40 AM
I guess, I type too fast... This spelling of mine...

satan
04-03-2003, 09:48 AM
@Zengar: No M$ or Borland products involved here. Compiler: Freepascal, OS: customized RedHat Linux 8.0

@vincoof: I think it is fine as I posted it. The @ before buffers in the GenBuffers call is optional (I think because it is a typed pointer), and in the BufferData call it is necessary since the parameter is a pointer.
In gl.h it is 'const GLvoid *data', which looks like a pointer to me.

Tom Nuydens
04-03-2003, 10:27 AM
DEFINITELY replace all "cdecl" by "stdcall".

-- Tom

satan
04-03-2003, 11:22 AM
Originally posted by Tom Nuydens:
DEFINITELY replace all "cdecl" by "stdcall".

-- Tom
Why should I do this? cdecl has worked just fine. Just so you know, I did not write bindings for only a few commands: I wrote my own bindings for the complete gl.h against NVIDIA Linux driver version 2980, and it all worked great. I can guarantee that this stdcall stuff is not needed on Linux.

but I am open to any new ideas

satan
04-03-2003, 11:44 AM
I told you that I have very little C knowledge, but I took NeHe's lesson 1 (the GLX port, as I had it on disk) and put my VBO code in. I get exactly the same behaviour as I did in Pascal.
I am coming to believe that it is not a problem with Pascal: either my code is wrong (which is not very likely, as you people said it is OK) or there is a driver bug.

please comment on this (and if you need code I will post it on demand)

thanks again to everyone trying to help

vincoof
04-03-2003, 11:52 AM
It may not be positive feedback, but I'm currently using Linux drivers which "seem" to have bugs with VBO. I opened a topic yesterday, and you can still take a look at it if it helps.

Actually I'm having problems with vertex buffer objects for GL_VERTEX_ARRAY, but the index array works fine as long as I use GL_ELEMENT_ARRAY_BUFFER_ARB.

satan
04-03-2003, 12:00 PM
I have seen your thread, but unfortunately it does not help me.
My first try used glDrawRangeElements with vertex, normal and index arrays.
My second one just used glDrawArrays with only one vertex array.
I thought that with VBO I should be able to do everything I can do with 'traditional' VAs, and therefore my programs should work.

jra101
04-03-2003, 12:27 PM
Can you try this Linux arb_vbo demo and let me know if it works for you?

http://www.cfxweb.net/~delphigl/files/simple_vbo.zip

To compile, extract the zip file to an empty folder, go to DEMOS/OpenGL/src/simple_vbo and run 'make'. To run, just execute './simple_vbo' after a successful compile.

vincoof
04-03-2003, 12:27 PM
Originally posted by satan:
I thought that with VBO I should be able to do everything I can do with 'traditional' VAs, and therefore my programs should work.

In theory you can. Maybe you need a bit of organization when you want to deal with multiple buffers simultaneously, but still it is possible.

In practice, well...

satan
04-03-2003, 12:42 PM
It does not work for me. Same problems as with my own tries: black screen, then a few seconds later it aborts.

[satan@darkbreed simple_vbo]$ make
make: *** Warning: File `Makefile' has modification time in the future (2003-04-04 01:30:09 > 2003-04-04 00:37:14)
g++ -Wall -g -DUNIX -I../../inc -I../../../../inc -c simple_vbo.cpp
g++ simple_vbo.o -o simple_vbo -lglut -lGLU -lGL
make: warning: Clock skew detected. Your build may be incomplete.
[satan@darkbreed simple_vbo]$ ./simple_vbo
Aborted

My hardware if it matters is a 400MHz K6-II and a Geforce2MX.

I will be away for some days now, but I will check this when I am back. I hope there is a solution to my problems as I would really like to use vbo in my engine.

[This message has been edited by satan (edited 04-03-2003).]

vincoof
04-03-2003, 12:51 PM
Same here: black screen, wait a few seconds, "Aborted".

pkaler
04-03-2003, 01:03 PM
Runs fine for me with the 43.49 drivers.

Hey satan,

I have never seen that clock skew warning before. What distro are you running? What kernel?

jra101
04-03-2003, 01:07 PM
Ya, looks like there is a problem with GeForce2 MX and GeForce4 MX cards. I've filed a bug on this and it should get fixed in the next driver release.

vincoof
04-03-2003, 01:07 PM
PK, what graphics card do you have, please?

He's running Redhat8.0 (I think he's written it somewhere in the thread).
I'm running Mandrake9.0 and the clock skew warning appears too.

vincoof
04-03-2003, 01:10 PM
Thanks a lot, Jason.
Let's hope the "next driver release" comes ASAP :)

pkaler
04-03-2003, 02:21 PM
Originally posted by vincoof:
PK, what graphics card do you have please ?

He's running Redhat8.0 (I think he's written it somewhere in the thread).
I'm running Mandrake9.0 and the clock skew warning appears too.


Geforce3 Ti500 (NV22)
43.49 drivers
$ uname -a
Linux localhost 2.4.19-gentoo-r10 #6 SMP Thu Feb 6 16:48:24 PST 2003 i686 AMD Athlon(tm) Processor AuthenticAMD GNU/Linux

vincoof
04-03-2003, 09:33 PM
The fact that you've got a GeForce Titanium may be the reason why it works on your machine, according to Jason.

btw:
$ uname -a
Linux linux 2.4.19-16mdk #1 Fri Sep 20 18:15:05 CEST 2002 i686 unknown unknown GNU/Linux