old skool pixels blur

Hi

I cannot get games with resolutions smaller than 640x480 to render without blur. I want pixel-perfect rendering for my old-school pixel art games. I am using OpenGL, obviously for speed.

I render my sprites into textures (power-of-two sizes). When loading these textures I use glTexImage2D() and glTexParameteri() with GL_LINEAR (tried GL_NEAREST too). I cannot seem to get rid of the blurriness.

Can anyone please advise me on how to apply 2x, 3x, or 4x nearest-neighbour filtering (or something else?) to make it pixel-clear?

:slight_smile:

Can anyone please advise me on how to apply 2x, 3x, or 4x nearest-neighbour filtering (or something else?) to make it pixel-clear?

You need to use GL_NEAREST and make sure your fragments fall on whole pixel coordinates. To do that, translate everything by 0.375f in the x and y directions. Obviously, you’ll need to use a pixel-perfect orthographic projection.
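Roughly something like this (just a sketch, not tested; SCREEN_WIDTH/SCREEN_HEIGHT stand for whatever your window size is):

glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, SCREEN_WIDTH, SCREEN_HEIGHT, 0.0, -1.0, 1.0);  /* one unit = one pixel, origin top-left */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.375f, 0.375f, 0.0f);  /* nudge so integer coordinates land on pixel centres */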

If you are still getting blur, then the problem is something else entirely. Are you using FBOs?

GL_NEAREST in both MAG and MIN modes.

Currently I set up the projection like this:


glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, SCREEN_WIDTH, SCREEN_HEIGHT, 0, -1, 1);  /* one unit = one pixel, origin top-left */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

Currently I set these when loading a texture:


glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

When I draw a texture onto the screen I draw it as a quad with clipping:


glBindTexture(GL_TEXTURE_2D, texture);
glBegin( GL_QUADS );
   glTexCoord2f(x1, y1);
   glVertex3f(x, y, 0.0f );
   glTexCoord2f(x2, y1);
   glVertex3f(x + w, y, 0 );
   glTexCoord2f(x2, y2);
   glVertex3f(x + w, y + h, 0 );
   glTexCoord2f(x1, y2);
   glVertex3f(x, y + h, 0 );
glEnd();

x1, x2, y1, y2 are double-type clipping (texture) coordinates used to draw only the relevant frame of the spritesheet/texture.
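For reference, I compute them roughly like this (simplified sketch; frameX/frameY/frameW/frameH and texW/texH are just my names for the frame rectangle and the texture size):

/* texture coordinates for one frame of the spritesheet */
double x1 = (double)frameX / texW;
double y1 = (double)frameY / texH;
double x2 = (double)(frameX + frameW) / texW;
double y2 = (double)(frameY + frameH) / texH;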

Why do I need to translate everything by 0.375f in the x and y directions?

I'm not very experienced in OpenGL; what do you mean by making fragments fall on whole pixel coordinates?

What are FBOs?

Thank you for the responses :slight_smile:

GL_NEAREST should do what you want.
I would check that you have the correct texture bound when you set the texture settings.
I would also check for OpenGL errors with a glGetError() somewhere in the code.
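Something as simple as this will do (rough sketch; checkGLError and the label string are just my own naming):

#include <stdio.h>  /* for fprintf */

/* call after suspicious GL calls; prints any queued errors */
static void checkGLError(const char *label)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        fprintf(stderr, "GL error 0x%04X at %s\n", err, label);
}

e.g. call checkGLError("texture upload"); right after glTexImage2D().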

Hi sqrt[-1]

I bind my texture right before I set the texture settings (and then call glTexImage2D), and all of them have the same filter setting. I know the settings are being applied because I get white blocks instead of the images when set to GL_NEAREST_MIPMAP_LINEAR, etc., plus at 640x480 no blur is visible.
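For reference, my loading code boils down to roughly this (simplified sketch; texture, texW, texH and pixels are just my names, pixels being the decoded RGBA data from the PNG):

glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texW, texH, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);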

I checked with glGetError() now and no error was found when loading or rendering the textures, nor during projection setup and GL initialization. I also translated everything by 0.375f as suggested. It didn't work.

I must be doing something silly somewhere. Could alpha textures be the problem? I load my textures as transparent PNGs (with power-of-two width and height). I call


glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);

right before I bind a texture and draw the quads like in my previous post, and disable it again after the quad is drawn.

Or maybe a setting on my graphics drivers/card is wrong?

I also enable GL_TEXTURE_2D and GL_BLEND permanently, and disable GL_DEPTH_TEST and GL_LIGHTING permanently.

Can you post a screenshot comparing the actual unblurred image alongside the blurry texture?

Sometimes mipmaps can be forced by the driver; e.g. in the Nvidia Control Panel there is an option for that, AFAIK. No more ideas.

Well, when I print-screen the program and paste it into a graphics program, it turns out pixel-clear. It's only blurry while the program is running. When I up the resolution to 640x480 it's as clear as day. Maybe I should write a screenshot to file from inside my app instead (rather than a screen capture from the keyboard)? Will that capture the blurriness?
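If I do write the screenshot from inside the app, it would be roughly this (untested sketch; writing the buffer out to a file is left out):

/* grab the back buffer right after drawing, before swapping buffers
   (needs <stdlib.h> for malloc/free) */
unsigned char *pixels = malloc(SCREEN_WIDTH * SCREEN_HEIGHT * 3);
glPixelStorei(GL_PACK_ALIGNMENT, 1);  /* rows tightly packed */
glReadPixels(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);
/* rows come out bottom-to-top, so flip vertically before saving */
free(pixels);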

I played around with the Nvidia control panel settings, especially forcing mipmaps, but to no avail. I tried a game engine called AGS, and a game below 320x240 is also blurry there, but you can turn nearest-neighbour filtering on (2x, 3x, or 4x), and that works perfectly on my PC, so I assume it's not a card error but something in my code. I've mailed the creator (though I think he uses DirectX).

Anyways, if I ever turn up with an answer I’ll report it here. Thanks for your responses :slight_smile:

The only thing I can think of is that you might be running some sort of “override” filter effect from the driver. Nvidia and ATI have previously supplied options in their drivers to adjust the final render with post FX.

e.g. for Nvidia:
http://www.nzone.com/object/nzone_ambientocclusion_home.html

OK, do you set up a fullscreen window at a 320x240 resolution? If so, be advised that most modern video cards and monitors don't support anything lower than 640x480 and will upscale.

Your best bet is to use 640x480 and adjust your projection matrix to account for that (trivial). Even that will entail some amount of blurring on LCD monitors. Best solution? Keep the current resolution and make a “best effort” fit to the aspect ratio.
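Something along these lines (sketch; 320x240 here stands for whatever logical resolution the game was designed for):

/* run the window at 640x480 but keep drawing in 320x240 units,
   so each logical pixel covers exactly 2x2 screen pixels */
glViewport(0, 0, 640, 480);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 320.0, 240.0, 0.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();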

Thanks guys.

Yes, I run my app fullscreen at 320x240, and at first I also tried 400x300, which was the actual resolution I'm going for. I use OpenGL in combination with SDL; though I don't think SDL makes the difference, I'll take a look in that direction.

I have changed to 640x480 and then scaled the graphics, which works, but I want a generic solution that will work on all systems consistently. There must be a solution to my problem. Hopefully the maker of AGS comes back to me. Someone must have had this problem before, or does nobody else use OpenGL for such small-res apps :frowning:

bdude, did you actually read what Stephen A just said?
This blurring is not caused by OpenGL, but by your LCD monitor.
To avoid it, either run your program at the native LCD resolution, or never use fullscreen, or play with your monitor settings to disable blurring.
In the general case, the best solution is the first.
You can do your rendering in a 400x300 viewport inside your bigger fullscreen window, then glCopyTexSubImage2D it to a texture, then render your texture as a fullscreen quad in GL_NEAREST mode. Guaranteed without blur. Ideally the fullscreen window size should be an integer multiple of the width and height. I.e. for 400x300, 800x600 will do perfectly, 1200x900 too (but as it is non-standard, you will have to add some black borders to fill up to 1280x1024).
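Roughly like this (just a sketch; lowresTex is assumed to be a 512x512 texture created once at startup with glTexImage2D and a NULL pointer, WIN_W/WIN_H your real fullscreen size, and drawGame() your existing drawing code):

/* 1. render the game into a 400x300 viewport in the corner of the framebuffer */
glViewport(0, 0, 400, 300);
drawGame();

/* 2. copy that region of the framebuffer into the texture */
glBindTexture(GL_TEXTURE_2D, lowresTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 400, 300);

/* 3. redraw it as a fullscreen quad; 400x300 -> 800x600 is an exact 2x blow-up */
glViewport(0, 0, WIN_W, WIN_H);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f);                    glVertex2f(0.0f, 0.0f);
    glTexCoord2f(400.0f/512.0f, 0.0f);           glVertex2f(1.0f, 0.0f);
    glTexCoord2f(400.0f/512.0f, 300.0f/512.0f);  glVertex2f(1.0f, 1.0f);
    glTexCoord2f(0.0f, 300.0f/512.0f);           glVertex2f(0.0f, 1.0f);
glEnd();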

Also, beware of aspect ratio:
400x300 is 4:3, only seen on CRTs, AFAIK.
LCDs are often 5:4 at 1280x1024.
And more and more monitors are widescreen nowadays, sometimes with 16:9 or 16:10 aspect ratios.

Yes, I did read it, and I do understand that. But I'm running the program on my old 17-inch CRT screen (I even disable the LCD screen; my bad for not mentioning it clearly). I understand completely that LCD monitors and new graphics cards upscale resolutions smaller than 640x480. I just figured that if someone else's 320x240 program runs successfully unblurred on my 17-inch, there must be a code-side solution, since that disproves that the monitor is responsible, unless the other program uses tricks like the ones you mention (note that his program also blurs if I turn his filters off).

Thanks for your comments, I will do as you say.

(I did attempt to play with the monitor settings and will continue trying to disable the blurring, but haven't had success.)

In my Nvidia driver settings there is an option that “doubles lines for low resolution modes”, as was done with 8/16-bit consoles and old-school 320x200 MCGA. Maybe that is the effect you are looking for.

Just out of curiosity, how do you know that someone else’s program is running the monitor at 320x240? (i.e. how do you know they are actually switching the monitor to this mode and not rendering to a small target and up-scaling with nearest filtering?)

Thanks zbuffer, I’m at work now but will try this when at home.

sqrt: I have no idea; that’s why I’ve tried to contact him. Hopefully he will reveal a bit about his process. All I know is that I can create a game in his engine at 320x240 and it blurs. Then, once the game is compiled and distributed, a settings file is distributed with it where windowed mode and things like that can be set. The filtering options are in there too, and setting them fixes the blurriness. I have no idea what he does at that point.

Thanks for all of the suggestions, you’ve given me a lot of insight into this.