Vertex Array Range (NVIDIA) Problem!!

Hi!
I have trouble getting VAR working!
I try to allocate some memory, but I always get a null pointer…
Here's my code:

  unsigned char *AGPMemory;

  if(VARSupported)
  {
  	// This call allocates memory in AGP memory!!
  	AGPMemory = (unsigned char*)wglAllocateMemoryNV(1024*sizeof(float), 0.2f, 0.2f, 0.5f);

  	if (!AGPMemory)
  	{
  		// No AGP memory available
  		VARSupported = false;
  	}
  	else
  	{
  		// Everything worked fine and the memory could be allocated...
  		// ...proceed with memcpy etc.
  	}
  }

But actually it's always !AGPMemory… so what's wrong with that?
BTW: The NVIDIA demos work perfectly, so it can't be my machine…

Hi Mr.Blob, try these values for wglAllocateMemoryNV(1024*sizeof(float), 0, 0.1f, 0.75f).

Well, I tried those values, but it still doesn't work…
Or is my check code incorrect?? I dunno…
Please help me!

Are you sure you have a valid OpenGL rendering context when you do this? I assume you do, or you wouldn't have been able to get the address of the allocate function. But that's all I can think of.

Nutty

That. Or your machine doesn’t have the right AGP drivers installed (common on VIA chipset machines, I hear). Or your card is actually a PCI card, so there is no AGP memory. Or you are out of AGP memory. Or one of a million other things.

It’s perfectly fine for AllocateMemory() to return NULL; the fallback case is to use malloc() for your memory. You can still call VertexArrayRange() on malloc()-ed memory, and it will still make vertex arrays draw faster when they’re in that memory.

bool inMalloc = false;
void * varMemory = 0;

void init()
{
    inMalloc = false;
    // wglAllocateMemoryNV() may legitimately return NULL; fall back to malloc().
    if( !(varMemory = wglAllocateMemoryNV(…)) ) {
        varMemory = malloc(…);
        inMalloc = true;
    }
    // glVertexArrayRangeNV() takes the size first, then the pointer.
    glVertexArrayRangeNV(…, varMemory);
    glEnable(GL_VERTEX_ARRAY_RANGE_NV);
}

void terminate()
{
    glDisable(GL_VERTEX_ARRAY_RANGE_NV);
    if( inMalloc ) {
        free(varMemory);
    }
    else {
        // wglFreeMemoryNV() takes just the pointer.
        wglFreeMemoryNV(varMemory);
    }
}

[This message has been edited by jwatte (edited 06-09-2002).]

I had this problem too… Does your driver support AGP?
Check out the renderer string.
If you have a PCI-only version (this can be true even on AGP machines), then try to allocate video memory with: wglAllocateMemoryNV(size, 0, 0, 1);

Although he said the NVIDIA demos work fine, I suppose he should verify that they are allocating AGP memory. Perhaps they're just falling back to video memory or system memory.

Nutty, you are right. The NVIDIA demos check for AGP memory first, and when they can't allocate it, they allocate pure video memory.

http://tyrannen.starcraft3d.net/PerPixelLighting

Try this app; it reports where the memory is allocated from…

OK, I tried the app and found out that it's allocating the memory in GeForce RAM, i.e. video memory. I have a GeForce1, but I actually thought the GeForce would provide AGP memory… The problem is that I want to allocate about 1024*1024*11 bytes *g*! That's a little bit too much for video memory!
But I can't help it… thank you!