screen corruption

Hi, I am having a problem on some displays where part of the screen is corrupted.

It happens on a:

GeForce4 Ti 4200, Win 98 with the latest drivers
color mode set to 16 bit
screenshot: http://beta.codebaby.com/testfiles/yikes.JPG

Something similar happens on an Intel onboard 3D accelerator (an 82815, I believe). That one is limited to 16-bit texture color depth,
and we use 32-bit textures, so I thought that might be why. (Unfortunately, I don't have a screenshot of that one.)
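
I did wonder whether forcing 16-bit internal texture formats would sidestep that limit; something like this (an untested sketch, the texture handle, sizes and pixel data are just placeholders):

// Untested idea: ask GL for a 16-bit internal format instead of a 32-bit one,
// so the driver does not have to convert/truncate the 32-bit source textures itself.
GLuint tex;
glGenTextures( 1, &tex );
glBindTexture( GL_TEXTURE_2D, tex );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
// GL_RGB5_A1 (or GL_RGBA4) is a 16-bit internal format; 'pixels' is still
// the usual 32-bit RGBA source data.
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB5_A1, width, height, 0,
              GL_RGBA, GL_UNSIGNED_BYTE, pixels );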

Do you think it is hardware related?
Any ideas of what might be causing it?
Or a workaround?


"The screen is corrupted" lacks a little information. In what way is it corrupted? Is it in any particular area of the window? Is it just garbage on the screen, or what is it?

A screenshot might be helpful.

Mikael

I had this type of corruption when using buffers (pbuffers) that were deallocated by mistake.
Maybe you reuse the back buffer after a SwapBuffers (that works on some cards, but is not required to work by the GL spec).
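
For example, a pattern like this (a hypothetical sketch, not your code) only works by luck on some drivers, because the back buffer contents are undefined after the swap:

// NOT guaranteed to work: after SwapBuffers the back buffer contents are
// undefined, so reading it back (or drawing over it without clearing)
// may give garbage on some cards/drivers.
SwapBuffers( hDC );
glReadBuffer( GL_BACK );
glReadPixels( 0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels ); // undefined data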

How do you render the image ?

Thanks,

re: mikael_aronsson

There is a screenshot in my previous post.
Here it is again: http://beta.codebaby.com/testfiles/yikes.JPG

re: ZbuffeR

The rendering is pretty straightforward:

I animate
I render
I swapbuffers
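
Per frame it boils down to something like this (a minimal sketch; Animate and RenderScene stand in for my actual code):

// Minimal per-frame sketch (function names are placeholders for my code):
void Frame( HDC hDC )
{
    Animate();          // update animation state
    RenderScene();      // issue the GL draw calls
    glFinish();         // currently in there before the swap (see my note below)
    SwapBuffers( hDC ); // present the back buffer
}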

Your idea about reusing the buffer after a SwapBuffers caught my eye, though, because I was doing something like that in order to create a Windows region out of an object. But that code has been taken out, so I am sure that wasn't it.

I noticed that I call glFinish before SwapBuffers. I think I read in a post somewhere that SwapBuffers effectively calls glFinish (or maybe glFlush). Could that be causing it? Maybe it is trying to swap a buffer that has already been purged?
(I know I could try it, but I don't have access to the test box.)


glFlush should be avoided for performance reasons.
glFinish should not cause these artifacts, but it is redundant with SwapBuffers; you can remove the glFinish call.

To be more precise, when I asked 'how', I wanted to see the actual window creation code, as well as the stuff related to the front and back buffers, paint events, etc.

Thanks for helping.

Here is the window creation code:

BOOL InitEditor (HWND hWindow, HWND theNotRealWindow, DWORD nWidth,
                 DWORD nHeight, BYTE nBitsPerPixel, DWORD nFrequency)
{
    PIXELFORMATDESCRIPTOR pfd;
    DEVMODE newDM;
    RECT rect;
    HWND hWindowPos;
    HDC hDC;
    DWORD nMode;
    DWORD nWindowStyle;
    INT nPixelFormat;

    ShowWindow( hWindow, SW_HIDE );

    nWindowStyle = WS_CHILD;
    hWindowPos = HWND_NOTOPMOST;

    hDC = GetDC( hWindow );
    if ( !hDC )
    {
        return FALSE;
    }

    // Request a double-buffered RGBA pixel format with depth and stencil.
    memset( &pfd, 0, sizeof( PIXELFORMATDESCRIPTOR ) );
    pfd.nSize = sizeof( PIXELFORMATDESCRIPTOR );
    pfd.nVersion = 1;
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cDepthBits = 32;
    pfd.cColorBits = 32;
    pfd.iLayerType = PFD_MAIN_PLANE;
    pfd.cStencilBits = 8;

    nPixelFormat = ChoosePixelFormat( hDC, &pfd );
    if ( !nPixelFormat )
    {
        ReleaseDC( hWindow, hDC );
        return FALSE;
    }

    if ( !SetPixelFormat( hDC, nPixelFormat, &pfd ) )
    {
        ReleaseDC( hWindow, hDC );
        return FALSE;
    }

    // Read back the format that was actually chosen and reject paletted modes.
    DescribePixelFormat( hDC, nPixelFormat, sizeof( PIXELFORMATDESCRIPTOR ), &pfd );
    if ( ( pfd.dwFlags & PFD_NEED_PALETTE ) != 0 )
    {
        sprintf( strError, "Palettes unsupported, please choose 16 bit color or higher" );
        ReleaseDC( hWindow, hDC );
        return FALSE;
    }

    // Create and activate the OpenGL rendering context.
    m_hGLRC = wglCreateContext( hDC );
    if ( !m_hGLRC )
    {
        ReleaseDC( hWindow, hDC );
        return FALSE;
    }

    if ( !wglMakeCurrent( hDC, m_hGLRC ) )
    {
        wglDeleteContext( m_hGLRC );
        ReleaseDC( hWindow, hDC );
        return FALSE;
    }

    ReleaseDC( hWindow, hDC );

    // Restyle and resize the window to the requested client size.
    rect.left = 0;
    rect.right = nWidth;
    rect.top = 0;
    rect.bottom = nHeight;
    AdjustWindowRect( &rect, nWindowStyle, FALSE );

    SetWindowLong( hWindow, GWL_STYLE, nWindowStyle );
    SetWindowPos( hWindow, hWindowPos, 0, 0, rect.right - rect.left,
                  rect.bottom - rect.top, SWP_FRAMECHANGED | SWP_NOCOPYBITS );
    SetWindowPos( hWindow, HWND_TOP, 0, 0, nWidth, nHeight,
                  SWP_FRAMECHANGED | SWP_NOCOPYBITS );
    SendMessage( hWindow, WM_SIZE, SIZE_RESTORED, MAKELPARAM( nWidth, nHeight ) );

    ShowWindow( hWindow, SW_NORMAL );

    return TRUE;
}

As for the other stuff:
the only thing that really touches the buffers is SwapBuffers;
there are also some tests set:

glDepthFunc(GL_LEQUAL);

// Alpha test

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, alphatest);

The window creation seems correct, but I am not a PFD guru.

there are some tests set:

glDepthFunc(GL_LEQUAL);

// Alpha test
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, alphatest);

That makes me think of one thing: do you clear the color, stencil and z-buffers at each frame, or at least on startup? Otherwise you may keep uninitialized data around…
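
Something along these lines at the top of each frame (a minimal sketch; the clear values are just examples):

// Clear every buffer the pixel format allocates, even the ones you
// do not actively use, so no uninitialized data is left around.
glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
glClearDepth( 1.0 );
glClearStencil( 0 );
glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT );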

Second point: does the corruption move around or stay the same if you resize the window or cover/uncover it?

Third, it can be linked to a texture that is not correctly bound. Does the corruption cover simple filled triangles too?

(And please, try to enclose your code in UBB 'code' markup.)

Can you give an example of not correctly binding a texture?

Unfortunately, the corruption was uncovered at a testing company; we cannot reproduce it here. I will need to ask them about the things you suggested.

I will look into the buffer clearing too! Thanks

What is UBB 'code' markup? I would have done it if I had known.

Having re-read your first post, it may well be connected to texture color depth. How come you talk about 'color mode set to 16 bit' when in your code you specify 32 bits?

Well, I was under the impression that if 32 bits wasn't available, it would choose the next available pixel format (maybe a naive assumption). And I have had the application working with the display color depth set to 16 bits before.
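
To be sure, I suppose I could log what ChoosePixelFormat actually gave me, something like this (a quick sketch, reusing hDC, nPixelFormat and strError from my InitEditor code; the logging itself is made up):

// Ask what pixel format was really selected; on a 16-bit desktop the returned
// cColorBits may be 16 even though 32 was requested.
PIXELFORMATDESCRIPTOR chosen;
memset( &chosen, 0, sizeof( chosen ) );
DescribePixelFormat( hDC, nPixelFormat, sizeof( chosen ), &chosen );
sprintf( strError, "color bits: %d, depth bits: %d, stencil bits: %d",
         chosen.cColorBits, chosen.cDepthBits, chosen.cStencilBits );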

Originally posted by ZbuffeR:
Having re-read your first post, it may well be connected to texture color depth. How come you talk about 'color mode set to 16 bit' when in your code you specify 32 bits?

Is it possible that you don't clear your depth buffer or your stencil buffer (even if you don't use them)? That may be the cause of those artifacts; I have seen similar things happen on my ATI board if I don't clear all the buffers I use.

Ok, so we tracked down a GeForce4 Ti 4200, put it in, and we get the screen corruption as noted above. I made sure the buffers are being cleared, but I still get the problem.

Since I am able to reproduce it, I can give more details about when it happens…

The rendering goes on like normal, but at certain times in the character's speech, GDI message bubbles pop up on top of the GL window. This is when the corruption happens. Some kind of Windows painting issue, I guess. Maybe it is amplified on Win98/ME.

Has anybody had similar problems, and is there something I should be taking extra care of in these situations? I know there are problems with OpenGL and GDI, so maybe there is nothing I can do?

FYI, I am using VCL (C++ Builder, egad!), and have overridden the WM_ERASEBKGND handler to return non-zero.
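
For what it's worth, here is roughly the kind of thing I mean (a hypothetical window proc fragment, not my actual VCL code). I am also wondering whether the GL window is missing the clip styles usually recommended so that overlapping GDI windows don't paint into it:

// Hypothetical sketch: WS_CLIPCHILDREN/WS_CLIPSIBLINGS keep GDI siblings from
// painting into the GL client area, and answering WM_ERASEBKGND stops GDI
// from erasing the background over the GL frame.
LRESULT CALLBACK GLWndProc( HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam )
{
    switch ( msg )
    {
    case WM_ERASEBKGND:
        return 1;   // background handled, don't let GDI erase over the GL frame
    default:
        return DefWindowProc( hWnd, msg, wParam, lParam );
    }
}

// at window creation:
// nWindowStyle = WS_CHILD | WS_CLIPCHILDREN | WS_CLIPSIBLINGS;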


Fixed it: I got an old NVIDIA driver (30.82). This one has a WHQL cert, and it works great!