View Full Version : Slow textures... glDrawElements



Kek
06-26-2002, 04:48 AM
Hello,
I have had many problems of slow texture mapping, and I'm trying to use glDrawElement: I have an array for the vertices (eight for a cube) and another one for the faces (6*2 for a cube). So I have made a vector with all the indices of the vertices to display and I call:

glVertexPointer(3, GL_FLOAT, 0, vert_array);
glTexCoordPointer(2, GL_FLOAT, 0, tex_array);
glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_INT, indices);

(approximately)
but it is still too slow...

Maybe I should initialise the arrays with the gl*Pointer calls just once? But then I'd have to keep the first index of each object... Is there another, simpler way???
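
For reference, a minimal sketch of the set-up-once / draw-per-frame split being asked about, assuming 3-float vertices, 2-float texture coordinates, and unsigned-int indices (the array and count names here are illustrative, not from the original post):

```c
#include <GL/gl.h>

/* Illustrative geometry data, filled in at load time. */
extern GLfloat vert_array[];   /* 3 floats per vertex       */
extern GLfloat tex_array[];    /* 2 floats per vertex       */
extern GLuint  indices[];      /* index list for the faces  */
extern GLsizei index_count;

/* Called once, after the geometry is loaded: */
void setup_arrays(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vert_array);
    glTexCoordPointer(2, GL_FLOAT, 0, tex_array);
}

/* Called every frame: one draw call, no pointer re-setup needed
   as long as the arrays have not moved. */
void draw(void)
{
    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, indices);
}
```

Note this fragment needs a live OpenGL context to run, so it cannot be executed standalone.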

Thanks a lot!

06-26-2002, 10:54 PM
The fillrate seems to be the problem, so vertex arrays will not help. If you have a fillrate problem, the speed depends heavily on the window size.

Perhaps you are running in software mode, or you have old hardware.

Kek
06-26-2002, 11:32 PM
Well, I'm afraid I don't understand what fillrate means... but concerning my hardware, you're right, it's a ****ing Compaq with an 8 MB video card... not very, very good!
But most 3D games work on it, and if mine doesn't, it's because I am a bad programmer, not because my computer is lazy...
Maybe I'm trying to map textures that are too nice...

Well, I'll try to see what I can do...
Thank you

BlackJack
06-27-2002, 08:22 AM
8 MB video card..... hmm...... well, three possible things:
-either your hardware is not supported by OpenGL at all,
-your desktop is set to 32-bit color depth, but your card only supports 16 bit (often the case with this old stuff),
-or you haven't installed the correct drivers.

What card exactly do you have? Otherwise it's a bit hard to answer your question.

BlackJack

Kek
06-28-2002, 03:01 AM
My card is an Intel integrated card in a Compaq Deskpro...
But well, I'm going to change it today, so I'll see if it gets better.
I don't think I'm working in 32-bit mode, but in 16-bit, and my card supports 24 bit... but on my Linux box, I'm not sure of anything.
Maybe I have an old OpenGL? Or maybe my beautiful Compaq is a very big ....
Well... I'll see what happens with my new card (which is also a very small one, 8 or 16 MB?)

BlackJack
06-28-2002, 03:29 AM
Well, I am working at a software firm. We also have a couple of these IBM ones in our test computers, and they only work in 16 bit. For 2D you can also switch them to 32/24 bit, but then they are no longer hardware accelerated in OpenGL.

Anyhow, if you have no AGP port, try to get a Riva TNT or so; it should cost next to nothing today ($20 or so, if that). If you have an AGP port, at least a GeForce 1 is worth the investment, so you can already learn to work with T&L. In any case you should get an NVidia card; if you are going to buy an old one, the old ATI or 3dfx ones, like the Rage, are total crap if you want to use them for OpenGL (buggy and non-standard drivers).

That said... depending on which IBM chip exactly you have on your board... normally some of them have fine GL support, but the IBM drivers are relatively buggy, at least the ones I had to work with in the past. They do things like switching on cull facing when you enable lighting, and such things = /, so with any of the old NVidia ones you're on the right track in any case.

BlackJack

ioquan
07-06-2002, 11:30 PM
The problem is probably that you are using glTexImage2D on every frame update. You need to use texture objects.

07-07-2002, 02:33 PM
Good call. Here's how you do that:
Call glGenTextures to get a texture identifier, then call glBindTexture ( GL_TEXTURE_2D, ... ). With the texture bound, use glTexImage2D once, at load time, to upload the image data. Then each frame, just call glBindTexture ( GL_TEXTURE_2D, ... ) again to select the texture; no re-upload is needed.


Originally posted by ioquan:
The problem is probably that you are using glTexImage2D on every frame update. You need to use texture objects.
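
The steps above can be sketched as follows; this is a minimal example assuming an RGB image already in memory, with illustrative names and no error checking:

```c
#include <GL/gl.h>

static GLuint tex_id;   /* texture object name, kept for the app's lifetime */

/* Once, at load time: create the texture object and upload the image. */
void load_texture(const unsigned char *pixels, int width, int height)
{
    glGenTextures(1, &tex_id);                 /* get a texture identifier */
    glBindTexture(GL_TEXTURE_2D, tex_id);      /* make it the current 2D texture */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);   /* upload ONCE */
}

/* Every frame: just rebind — the driver keeps the image on the card. */
void draw_frame(void)
{
    glBindTexture(GL_TEXTURE_2D, tex_id);
    /* ... draw the geometry with glDrawElements etc. ... */
}
```

The whole point is that glTexImage2D is expensive (it copies the image to the driver/card), so it belongs in load_texture, while glBindTexture per frame is cheap. A live OpenGL context is required, so this fragment cannot be executed standalone.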