Vertex Array Range (nvidia) Problem!!

06-09-2002, 02:49 AM
I have trouble getting VAR working!
I try to allocate some memory, but I always get a null pointer...
Here's my code:

unsigned char *AGPMemory;

// This call allocates memory in AGP memory!!
AGPMemory = (unsigned char*)wglAllocateMemoryNV(1024*sizeof(float), 0.2f, 0.2f, 0.5f);

if (!AGPMemory)
{
    // No AGP memory available
    VARSupported = false;
}
else
{
    // Everything worked fine and the memory could be allocated...
    // ...proceed with memcpy etc.
}



But actually it's always !AGPMemory... so what's wrong with that?
BTW: The NVIDIA demos work perfectly, so it can't be my machine...

06-09-2002, 03:45 AM
Hi Mr.Blob :), try these values for wglAllocateMemoryNV( 1024*sizeof(float), 0, 0.1f, 0.75f ).
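I.e. something like this (just a sketch; the readFrequency/writeFrequency/priority combination 0 / 0.1 / 0.75 is what asks for AGP memory):

// Sketch: readFrequency 0, writeFrequency 0.1f, priority 0.75f -> request AGP memory
AGPMemory = (unsigned char*)wglAllocateMemoryNV(1024*sizeof(float), 0.0f, 0.1f, 0.75f);
if (!AGPMemory)
    VARSupported = false;   // allocation still failed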

06-09-2002, 05:03 AM
Well, I tried those values, but it still doesn't work...
Or is my check code incorrect? I dunno...
Please help me!

06-09-2002, 05:36 AM
Are you sure you have a valid OpenGL rendering context when you do this? I assume you do, or you wouldn't have been able to get the address of the allocate function. But that's all I can think of.
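You could rule that out with a quick check before the allocation (just a sketch):

// Sketch: make sure an OpenGL rendering context is current before using the extension
if (wglGetCurrentContext() == NULL)
{
    // No current context -> wglAllocateMemoryNV has nothing to work with
    VARSupported = false;
    return;
}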


06-09-2002, 07:10 AM
That. Or your machine doesn't have the right AGP drivers installed (common on VIA chipset machines, I hear). Or your card is actually a PCI card, so there is no AGP memory. Or you are out of AGP memory. Or one of a million other things.

It's perfectly fine for AllocateMemory() to return NULL; the fallback case is to use malloc() for your memory. You can still call VertexArrayRange() on malloc()-ed memory, and it will still make vertex arrays draw faster when they're in that memory.

bool inMalloc = false;
void * varMemory = 0;

void init()
{
    inMalloc = false;
    // Try AGP/video memory first; fall back to plain malloc() if that fails.
    if( !(varMemory = wglAllocateMemoryNV(...)) ) {
        varMemory = malloc(...);
        inMalloc = true;
    }
}

void terminate()
{
    if( inMalloc ) {
        free( varMemory );
    }
    else {
        wglFreeMemoryNV( varMemory );
    }
    varMemory = 0;
}
[This message has been edited by jwatte (edited 06-09-2002).]

06-09-2002, 07:18 AM
I had this problem too... does your driver support AGP?
Check out the renderer string.
If you only have a PCI version (this can be true even on AGP machines), then try to allocate video mem with: wglAllocateMemoryNV(size, 0, 0, 1);
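Something like this (just a sketch; the "PCI" substring test assumes the NVIDIA renderer string mentions the bus type, and "size" is your byte count):

// Sketch: inspect the renderer string, then request pure video memory (priority 1.0)
const char *renderer = (const char *)glGetString( GL_RENDERER );
if( renderer && strstr( renderer, "PCI" ) )
{
    // PCI-only driver/card: no AGP aperture, so ask for video memory instead
    AGPMemory = (unsigned char *)wglAllocateMemoryNV( size, 0, 0, 1.0f );
}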

06-09-2002, 07:24 AM
Although he said the nvidia demos work fine, I suppose he should verify that they are allocating AGP memory. Perhaps they're just falling back to video memory, or sys memory.

06-09-2002, 08:38 AM
Nutty, you are right. The nvidia demos check for AGP mem first, and when they can't allocate it, they allocate pure video mem.
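Roughly this pattern (a sketch of the idea, not the actual demo code; "size" is assumed):

// Sketch: prefer AGP memory, then video memory, then plain system memory
mem = (unsigned char *)wglAllocateMemoryNV( size, 0.0f, 0.1f, 0.5f );      // AGP
if( !mem )
    mem = (unsigned char *)wglAllocateMemoryNV( size, 0.0f, 0.0f, 1.0f );  // video mem
if( !mem )
    mem = (unsigned char *)malloc( size );                                  // system mem fallback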

06-09-2002, 09:30 AM

Try this app, it reports where the mem is allocated...

06-09-2002, 10:17 AM
OK, I tried the app and I found out that it's allocating the memory in GeForce RAM, i.e. video mem. I have a GeForce1, but actually I thought the GeForce would provide AGP memory... the problem is that I want to allocate about 1024*1024*11 bytes *g*! That's a little bit too much for video mem!
But I can't help it... thank you!