
View Full Version : Point Sprites and glslang crash my system



Lurker_pas
01-29-2005, 01:37 AM
I recently wanted to toy with point sprites. I added them to one of my programs and they worked fine. However, in order to test the performance I started increasing their number. At 1024 everything was OK, but the next step - 2048 - crashed my computer: Windows failed to respond, the monitor turned off and on a few times, and then a restart...
I was using simple
glBegin(GL_POINTS);
for p1 := 1 to PointCount do
begin
glVertex3fv(Addr(Points[p1]));
end;
glEnd;
I switched to
glVertexPointerEXT(3,GL_FLOAT,3*4,PointCount*3,Addr(Points));
glDrawArraysEXT(GL_POINTS,1,PointCount);
Same effect...

My vertex program is
void main(void)
{
gl_Position = ftransform();
gl_PointSize = 2000.0/length(gl_Position);
}
my fragment is
uniform sampler2D SpriteTexture;
void main(void)
{
vec4 Color = texture2D(SpriteTexture,vec2(gl_TexCoord[0]));
gl_FragColor = vec4(vec3(Color),Color.b);
}
Nothing fancy as you see.
My question is:
Is it a bug, am I doing something wrong or is my PC falling apart? Has anyone had similar problems?

I was trying to solve it myself, but I just lost my patience after the nth restart...
My system is Radeon 9800 pro with Catalyst 5.1.

zed
01-29-2005, 10:24 AM
I'd check whether you've declared enough memory for Points[p1] (it shouldn't crash the computer anyway, though).

Also, this is incorrect:
glVertexPointerEXT(3,GL_FLOAT,3*4,PointCount*3,Addr(Points));

Here's what it's meant to be:
void VertexPointer( int size, enum type, sizei stride,
void *pointer );

Lurker_pas
01-29-2005, 11:42 AM
1)Points is a static array -
Points : packed array[1..PointCount] of TVector3f;
Besides, during the initialization I fill it with random values - and the program crashes *after* the init, not during it.
2)glVertexPointer - yes, my mistake there:
now I changed to
glVertexPointerEXT(3,GL_FLOAT,3*4,PointCount,Addr(Points));
Still, fixing it doesn't solve the problem. Besides, the problem first occurred while I was using my favourite glBegin()/glEnd() :) . I tried glVertexPointer to see whether it was a problem of too many glVertex calls.

3) copy/paste from *EXT*_vertex_array spec:
void VertexPointerEXT(int size,
enum type,
sizei stride,
sizei count,
const void* pointer);


And it does work in my program, but only if the number of the points is 1024 or less...

Any other suggestions?

NitroGL
01-29-2005, 08:10 PM
Vertex arrays are a part of GL 1.1, why would you use the extension? It's outdated.

Lurker_pas
01-29-2005, 11:55 PM
One reason is that I didn't know that until yesterday. Really. In fact I wasn't interested in such functionality because scenes I usually draw don't have much geometry and glBegin() glEnd() is sufficient for me (I don't say that vertex arrays and VBO are useless, I just didn't need them so far).
But when this point sprite problem occurred, I wanted to eliminate it (I thought maybe it was immediate-mode related or something), and so I tried to change my drawing method. And I found the EXT version first...
Please guys, could anyone suggest something about *point sprites*, not my usage of vertex arrays (unless there's an error)?

Relic
01-31-2005, 05:35 AM
Not sure about the Addr() operator, but if Addr is the same as & in C then it needs to be Addr(Points[1]).
Did you call glEnableClientState(GL_VERTEX_ARRAY)?

Lurker_pas
02-05-2005, 07:07 AM
I wasn't programming during the last few days due to my exams. Today - hopefully - was the last one. I started doing this point sprite thing again from scratch. In my last program, they were an add-on. This one is all about them - no interaction with anything else possible. No arrays, no vertex arrays, no pbuffers, no VBO, no Addr() operator, so please don't even mention them...

Fragment and vertex shaders are the same. The program (GL related thing) is as follows:

procedure InitGL;
var
TabI : array[0..2] of integer;
begin
InitOpenGL;
InitDevIL;
ilinit;
h_DC := getDC(Form1.Handle);
h_RC := CreateRenderingContext(h_DC,[opDoubleBuffered],32,24,0,0,0,0);
ActivateRenderingContext(h_DC,h_RC);

SpriteTexture := iltexloader.LoadILReadyToUseTextureRGBA('sprite.bmp');
glBindTexture(GL_TEXTURE_2D,SpriteTexture);
glTexEnvi (GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);

SpriteVertexShader := dotshaders.dotGLSLLoadShaderFromFile('SpriteVertexShader.shader',GL_VERTEX_SHADER_ARB);
SpriteFragmentShader := dotshaders.dotGLSLLoadShaderFromFile('SpriteFragmentShader.shader',GL_FRAGMENT_SHADER_ARB);
TabI[0] := SpriteVertexShader;
TabI[1] := SpriteFragmentShader;
SpriteShader := dotshaders.dotGLSLLinkPrograms(TabI);
SpriteTextureHandle := dotshaders.dotGLSLUniformLocation(SpriteShader,'SpriteTexture');
glUseProgramObjectARB(SpriteShader);
glUniform1iARB(SpriteTextureHandle,0);

ResizeGL;

end;

procedure DeInitGL;
begin
destroyRenderingContext(h_RC);
end;

procedure ResizeGL;
var
x,y : integer;
begin
if (h_RC <> 0) then
begin
x := Form1.ClientWidth;
y := Form1.ClientHeight;

glViewport(0,0,x,y);
glMatrixMode(GL_PROJECTION);
glLoadIdentity;
gluPerspective(55,x/y,1,1000);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity;

end;
end;

procedure DrawGl;
var
p1: integer;
begin
glClear(GL_DEPTH_BUFFER_BIT or GL_COLOR_BUFFER_BIT);

glLoadIdentity;
gluLookAt(10,10,10,0,0,0,0,1,0);

glEnable(GL_DEPTH_TEST);
glEnable(GL_POINT_SPRITE_ARB);
glEnable(GL_BLEND);
glEnable(dglopengl.GL_VERTEX_PROGRAM_POINT_SIZE_ARB);
glBlendFunc(GL_SRC_ALPHA,GL_ONE);

glEnable(GL_TEXTURE_2D);


glBegin(GL_POINTS);

for p1 := 1024 downto 1 do
begin
glVertex3f(p1/256,-p1/256,-p1/256);
end;

glEnd;

swapBuffers(h_DC);
end;

When the loop is 1024 or less IT DOES REALLY WORK, so EVERYTHING SEEMS to be set up correctly. But again, when I change it to 2048: crash... Any ideas? My card seems to work correctly with other apps. What is going on?