Hi
I'm using an NVIDIA Riva 128 card.
When I call glTexImage2D(GL_TEXTURE_2D, 0, 4, 128, 128, 0, GL_RGBA, GL_UNSIGNED_BYTE, PPBits^);
it crashes in NV3OGL.DLL with a read access violation. It crashes in the software renderer too.
But OpenGL reports 1024 when I query GL_MAX_TEXTURE_SIZE with glGetIntegerv.
When I call it with a 128x64 size everything is OK (and even fast).
Can't I determine the real max texture size without a crash?
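For what it's worth, the GL-level way to ask whether a given size/format combination will actually be accepted, without allocating anything, is the proxy texture target. A minimal Delphi-style sketch (the function name is made up, and it assumes a current rendering context and the standard OpenGL unit):

function TextureSizeSupported(W, H: GLsizei): Boolean;
var
  ProbedWidth: GLint;
begin
  { The proxy target runs the full consistency check but never
    loads pixel data, so passing nil is safe here. }
  glTexImage2D(GL_PROXY_TEXTURE_2D, 0, 4, W, H, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, nil);
  { On failure the proxy's state is zeroed, so a width of 0 means
    the driver rejected this size/format combination. }
  glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                           GL_TEXTURE_WIDTH, @ProbedWidth);
  Result := ProbedWidth <> 0;
end;

TextureSizeSupported(1024, 1024) would then tell you whether the driver really accepts what GL_MAX_TEXTURE_SIZE advertises; note that GL_MAX_TEXTURE_SIZE is only an upper bound, not a guarantee for every format. If the proxy check passes but the real call still crashes, the problem is almost certainly on the application side rather than a driver size limit.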
I can't believe such a bug could exist in the driver for such a card and never have been reported before. IMO you are doing something wrong. Check the pointer to the texture pixels you're passing in the arguments.
In my experience, crashes like this are most often the result of accessing memory that you shouldn't be accessing, such as going past the bounds of a dynamically allocated array. You should definitely take Bob's advice and check that you have memory allocated for the pointer you're passing in.
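To make that concrete: for a 128x128 upload with GL_RGBA and GL_UNSIGNED_BYTE, the driver reads 128*128*4 = 65536 bytes from the pointer you pass, so the buffer must be at least that large. A rough Delphi sketch with made-up names, assuming a current rendering context:

procedure UploadTexture;
var
  Bits: Pointer;
begin
  { 128 x 128 texels at 4 bytes each (RGBA, unsigned byte). }
  GetMem(Bits, 128 * 128 * 4);
  try
    { ... fill Bits with real pixel data here ... }
    glTexImage2D(GL_TEXTURE_2D, 0, 4, 128, 128, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, Bits);
    { OpenGL copies the data during the call, so the buffer can
      be freed right after it returns. }
  finally
    FreeMem(Bits);
  end;
end;

And if PPBits is a pointer to a pointer, make sure both levels are valid before dereferencing it with PPBits^.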
Fixed. I was passing an invalid pointer; it only worked when the texture was resized.
BTW, the Riva 128 can render 1024x1024 at 5.83 fps (when changing the texture each time).
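One guess about those 5.83 fps: if the texture is re-specified every frame with glTexImage2D, it may be cheaper to create it once and then replace only the texels with glTexSubImage2D, which lets the driver reuse the existing texture storage. A hedged sketch (NewBits is a made-up name for a buffer holding 1024*1024*4 bytes of RGBA data):

procedure UpdateTexture(NewBits: Pointer);
begin
  { Overwrite the whole level-0 image of the currently bound
    1024x1024 texture without reallocating it. }
  glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1024, 1024,
                  GL_RGBA, GL_UNSIGNED_BYTE, NewBits);
end;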