AllocateMemoryNV always returns NULL



jide
01-22-2002, 12:47 AM
Hello,

I posted this in the Linux forum but didn't get any reply. Since it's an extension problem, I'm posting it here, just in case...

glXAllocateMemoryNV always returns NULL,
and
glIsEnabled() says GL_NV_vertex_array_range is not supported.

Any ideas?

I haven't tried it under Windows.

Bob
01-22-2002, 12:59 AM
glIsEnabled() says GL_NV_vertex_array_range is not supported.

glIsEnabled() tells you whether something is currently enabled, not whether it is supported. To find out which features are supported, check the extension string.

jide
01-22-2002, 01:02 AM
And how do I do that?

If you can't tell me for Linux, just tell me for Windows.

Thanks

jide
01-22-2002, 02:49 AM
Can anyone help?

glXAllocateMemoryNV() always returns NULL.

Must I use GLX directly, or does GLUT work with it?

Nutty
01-22-2002, 05:09 AM
const char* extensions = (const char*) glGetString(GL_EXTENSIONS);

Scan the extensions string for
"GL_NV_vertex_array_range"

If it's in there, then VAR is supported.
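
In case it helps, here's a minimal sketch of that check in C (hasExtension is just an illustrative helper name; a plain strstr can false-match a prefix such as GL_NV_vertex_array_range2, so this checks token boundaries):

#include <string.h>
#include <GL/gl.h>

/* Return 1 if 'name' appears as a complete token in the extension string. */
int hasExtension(const char* name)
{
    const char* all = (const char*) glGetString(GL_EXTENSIONS);
    const char* p = all;
    size_t len = strlen(name);

    if (!all)
        return 0;
    while ((p = strstr(p, name)) != NULL) {
        /* Accept only matches bounded by spaces or the string's ends. */
        if ((p == all || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

/* e.g. if (hasExtension("GL_NV_vertex_array_range")) ... */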

I assume it is, since you're using glXAllocateMemoryNV, which is part of the extension.

Try reducing the size of the memory request.

Nutty

kon
01-22-2002, 06:09 AM
Try changing the priority:
glXAllocateMemoryNV(nNumBytes, 0, 0, priority);

If priority is in the range [0.25f,0.75f] then AGP mem will be used. If it's [0.75f,1.0f] then video mem will be used.
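
For instance (just a sketch; nNumBytes is the byte count from above, and the 0.0f read/write frequencies mirror the example call):

/* Mid-range priority: request AGP memory. */
void* agpMem = glXAllocateMemoryNV(nNumBytes, 0.0f, 0.0f, 0.5f);

/* Top priority: request video memory. */
void* vidMem = glXAllocateMemoryNV(nNumBytes, 0.0f, 0.0f, 1.0f);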

kon



jide
01-22-2002, 07:07 AM
I'll try tonight.

It's not a problem of too large an allocation, since I only allocate around 5*sizeof(GLfloat).

And if the priority behaves differently on every system: yuck!

Because (in the Linux forum) I tried Lev's example and tried changing some of the data (priority and other parameters).

We'll see.

Thanks

JD

jide
01-23-2002, 12:32 AM
OK,

I still haven't got the AGP drivers. But you said that isn't important for getting this working, so it doesn't seem to be the problem.

And the allocation always fails. I tried everything, without success.

Also, glGetString(GL_EXTENSIONS) returns a null string, yet glxinfo reports that everything needed is supported. How strange!
I can't figure out where the problem is.
The glXAllocateMemoryNV function pointer is not null, nor are glXFreeMemoryNV and glVertexArrayRangeNV, but they all have the same address (something like 0x80700400, but I'm not sure).

JD

kon
01-23-2002, 12:49 AM
Just to be sure: Do you have a valid rendering context when calling these functions? What does glGetString(GL_RENDERER) return?
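
For example (a one-line sketch, assuming stdio is available):

printf("GL_RENDERER = %s\n", (const char*) glGetString(GL_RENDERER));

If it prints something like "Mesa X11" instead of your card's name, you're likely on a software renderer.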

kon

jide
01-23-2002, 12:51 AM
I haven't tried that. I can't try now; I'll try tonight.
But I can get a display.
I am using GLUT; is that a problem?

And if the context is not valid, what should I do?

JD

jide
01-23-2002, 01:04 AM
I don't know if the context is good.
Anyway, I can see my display on screen and it works fine.

I use GLUT; is that a problem?

If my context is not valid, what should I do?

Thanks

JD

kon
01-23-2002, 01:28 AM
Call OpenGL functions only after glutCreateWindow() has been called.
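
A minimal ordering sketch (window title and callback are placeholders):

#include <stdio.h>
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("VAR test");

    /* Only from here on is there a GL context, so queries work now. */
    printf("GL_EXTENSIONS = %s\n", (const char*) glGetString(GL_EXTENSIONS));

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}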

kon

jide
01-23-2002, 02:12 AM
Ah!

What an interesting thing!
But I don't remember whether my call is before or after.
I think it's after glutCreateWindow() and before glutMainLoop().

And what about the difference between glGetString() and glxinfo?
I will try to remove all of OpenGL and rebuild it cleanly, but I'm fairly sure that's not the source of my problem.

JD

Lev
01-23-2002, 03:59 AM
You said you were using GLUT? Why do you mention glX functions then? I would advise reading either a book or at least the article on how to use extensions, available here at opengl.org. You're mixing up several things that shouldn't be mixed up.

-Lev

kon
01-23-2002, 04:43 AM
He's using the VAR extension, which requires calling an OS-specific function: glXAllocateMemoryNV on X Windows or wglAllocateMemoryNV on MS Windows, respectively.

kon

jwatte
01-23-2002, 09:02 AM
It's perfectly valid for AllocateMemoryNV to return NULL. It will do this if it can't allocate the specific type of memory that you want. For example, if your Linux kernel or BIOS doesn't give the driver any AGP memory, it may return NULL. If your graphics card is a PCI card, it will certainly return NULL.

Proper code to use AllocateMemoryNV and VertexArrayRangeNV looks something like this (glX names shown; substitute wgl on Windows):

bool inAGP = false;
void* ptr = 0;

/* Try to get AGP memory first. */
ptr = glXAllocateMemoryNV( size, 0, 0, 0.7f );
if( ptr )
    inAGP = true;
else
    ptr = malloc( size );  /* fall back to system memory */
glVertexArrayRangeNV( size, ptr );
glEnableClientState( GL_VERTEX_ARRAY_RANGE_NV );


The shutdown code looks something like this:


glDisableClientState( GL_VERTEX_ARRAY_RANGE_NV );
if( inAGP )
    glXFreeMemoryNV( ptr );  /* wglFreeMemoryNV on Windows */
else
    free( ptr );

Lev
01-23-2002, 09:42 AM
Originally posted by kon:
He's using the VAR extension, which requires calling an OS-specific function: glXAllocateMemoryNV on X Windows or wglAllocateMemoryNV on MS Windows, respectively.
kon

Sure, but asking about the difference between glGetString and glxinfo (or whatever it is supposed to be)? And checking extensions with glIsEnabled? This article (http://www.opengl.org/developers/code/features/OGLextensions/OGLextensions.html) is very good for getting things straight. Although it's Win32-focused, under Linux it's sufficient to replace wgl with glX in most cases.

Regards,
-Lev


jide
01-24-2002, 07:47 AM
Thank you all.

Lev, I've never used glGetString before. No utility libraries for me. Any problem with that?
And if the source code you are using and showed me had been clear enough, I might not have tried glIsEnabled().
I use GLUT because for me it's easier to write a program that works on both Win32 and Linux. Moreover, GLUT uses glX under Linux, so glX must be supported by GLUT, as kon said.
Do you know everything? Maybe, but then you're lucky. Don't take this badly; I don't like that, and after all I found you helpful.

Now, my problem remains the same, and it now seems to be purely a Linux problem.

Thanks all

JD

jide
01-24-2002, 07:52 AM
Lev, I've never pretended to be a star or a god or anything like that (nor have you, I know).

But I thought this forum was here to be helpful to everyone. I think it already is.

Sorry if my lack of knowledge causes any harm.

JD

Lev
01-24-2002, 08:05 AM
Originally posted by jide:
Lev, I've never pretended to be a star or a god or anything like that (nor have you, I know).

But I thought this forum was here to be helpful to everyone. I think it already is.

Sorry if my lack of knowledge causes any harm.

JD

So where's the problem? I just suggested that you read the article about extensions, and explained to kon why I suggested it. Maybe I wasn't very polite in saying that you mix some things up; sorry for that. But if you're checking whether an extension is supported with glIsEnabled, then that's wrong. This is not a prejudice; it's a fact. Everybody makes mistakes, me too, and there's nothing bad about making mistakes (well, as long as nobody gets hurt). I was just trying to help you.

It wasn't my intention to offend you; I'm sorry if I have done so.

Regards,
-Lev



jide
01-24-2002, 11:05 PM
Lev, no problem!

To all:

I tried many things yesterday evening.
First, I was wrong about where I was calling glGetString(): it was before glutCreateWindow(), so the context hadn't been created yet at the time of the call. That's why it gave me a null string.

So now glGetString(GL_EXTENSIONS) gives me many things: it reports ARB, NV, EXT, SGIS, IBM and KTX extensions (under Linux). Of course vertex_array_range, vertex_array_range2, vertex_array_program and draw_range_elements are among them. This was under GLUT.
Under glX (I tried a glX demo program), no array or draw extensions are supported (maybe because it uses Mesa).
So I understand even less why it doesn't work.

I have tried all values and combinations of x, y, z in glXAllocateMemoryNV(1000*sizeof(GLfloat), x, y, z);
It didn't work.

Concerning glGetString(GL_RENDERER):
under GLUT it returns GeForce2 MX/PCI/3DNOW!
under glX it returns Mesa X11

Do you think I have to recompile GLUT? My version is the original provided by Mandrake (so it uses Mesa).

When I try to allocate in the glX demo program, it stops with a segmentation fault in glXAllocateMemoryNV, obtained via glXGetProcAddressARB.

Any ideas are welcome.

Thanks

JD

tfpsly
01-25-2002, 09:02 AM
> GeForce2 MX/PCI/3DNOW!

I am afraid the answer is right there: PCI, not AGP. Check your BIOS settings.

jide
01-27-2002, 04:19 AM
Right there?
You may be right.

I thought that if it wasn't recognized as AGP, it would still work over PCI.

Since I haven't got AGP drivers, enabling AGP under Linux means X can't load and it just stops at the console. But there is no AGP driver for the VIA chipset on the Asus A7V266.

JD