View Full Version : sharpening a texture when rendering
12-14-2007, 05:44 AM
Since I start from a low-resolution video, the image is quite blurred when played at fullscreen. Is there a way to sharpen the image when rendering?
thanks in advance
12-14-2007, 05:55 AM
If you start from a low-resolution video, there's not much you can do to improve the resolution. You can set the magnification filter for your texture to GL_LINEAR to get rid of the pixel blocks, but this will still result in blurry images when your magnification factor is high.
Other than that, you can take a look at the unsharp filter, or at deblurring algorithms such as deconvolution using FFT- or wavelet-based approaches. Just google for deblurring (http://www.google.be/search?q=deblurring).
Don't be fooled by the CSI series on TV where they're able to extract megapixel resolution images from surveillance cameras with 4x4 pixels ;)
12-14-2007, 06:36 AM
Thanks for answering,
Well, I know it's not really possible to improve the quality. The only thing I was thinking is that maybe OpenGL has some built-in sharpening filter that can be applied when rendering. That could maybe give the viewer the impression that the quality is better...
12-14-2007, 07:52 AM
Nvidia cards have a sharpening filter that can be used to sharpen textures when anti-aliasing is enabled. It should be somewhere in the Nvidia control panel. Maybe there's also something similar for ATI cards.
12-14-2007, 09:15 AM
I am pretty sure the "sharpen texture" option is based on using other mipmap levels, so it will not help when you are already magnifying level 0.
Otherwise, you could write a shader that does an unsharp-mask pass; there is an example in the Orange Book.
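For reference, the unsharp-mask idea behind such a shader can be sketched offline in NumPy (the 3x3 box blur, the function names, and the `amount` parameter here are my own illustration, not the Orange Book's code):

```python
import numpy as np

def box_blur(img):
    """3x3 box blur with edge clamping -- a crude stand-in for the
    blurred term a shader would get from neighbouring texel fetches."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

def unsharp_mask(img, amount=0.5):
    """sharpened = original + amount * (original - blurred).
    Flat regions are unchanged; edges get extra contrast."""
    img = img.astype(float)
    return np.clip(img + amount * (img - box_blur(img)), 0.0, 255.0)
```

On the GPU, the blurred term would come from extra texture fetches around the current texel (or from a lower mipmap level) rather than a CPU-side convolution.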
12-14-2007, 11:48 AM
If you aren't at the point of magnification, you can sharpen by subtracting out part of the next (lower-resolution) mipmap level (use the texture LOD bias to do this):
tex2D(,,0) - sharpenAmount * tex2D(,,1)
This also assumes that you generated your mipmaps manually with a Gaussian blur and downsampling, rather than with glGenerateMipmapEXT(), which I believe just does a non-overlapping box filter (it depends on the driver; I'm not sure). In Photoshop terms, what you are doing is an unsharp mask: subtracting out a blurred version of an image results in a "sharpened" image.
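To illustrate the mipmap-subtraction idea numerically, here is a rough NumPy sketch. The 2x2 box downsample standing in for mip level 1, and the `(1 + amount)` weighting so that flat areas keep their brightness, are my own assumptions, not part of the post above:

```python
import numpy as np

def mip_level1(img):
    """2x2 box downsample then nearest-neighbour upsample -- a crude
    stand-in for sampling mip level 1 at the same texture coordinates."""
    h, w = img.shape
    small = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

def mip_sharpen(img, amount=0.3):
    """(1 + amount) * level0 - amount * level1.
    Flat areas are unchanged; edges overshoot, i.e. get sharpened.
    A real pipeline would clamp the result to the displayable range."""
    img = img.astype(float)
    return (1.0 + amount) * img - amount * mip_level1(img)
```

In a shader, the two terms would be two texture lookups of the same texture at different LODs, as in the one-liner above.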
If you are trying to sharpen under magnification, I believe you might want to read Chris Green's (Valve) paper "Improved Alpha-Tested Magnification for Vector Textures and Special Effects"; it's available from their publications page.
Hope that helps!
12-14-2007, 01:21 PM
Timothy, that is a nice paper, but not useful for video.
Nvidia cards have a sharpening filter that can be used to sharpen textures when anti-aliasing is enabled. It should be somewhere in the Nvidia control panel.
This only increases the maximum anisotropic filtering level (or does nothing when you already use 16x AF). It's a completely redundant setting.