Lower-End Users of OpenGL

There are a lot of people I know who have lower-end computers. Personally, I'm running a P133 and I'm not going to be upgrading anytime soon due to a lack of money. I'd like to see support for fake 12-bit color using 256-color modes.

This fake 12-bit color mode is as simple as putting two pixels side by side: one pixel carries only the red component, and the other carries the combined blue and green components. As far as the eye is concerned, the two blend into a single color. The only real problem is that the display appears darker. Making the palette is simple; a rough sketch follows below.
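Roughly, the palette could be built like this with Allegro (the 16x15 green/blue plus 16 red layout comes from the example quoted further down; the entry order and the function name here are just one way of doing it):

#include <allegro.h>

/* Build the 256-colour palette for the fake 12-bit trick:
 * entries 0-239 hold every green/blue pair (green 0-15, blue 0-14),
 * entries 240-255 hold the 16 red levels. Allegro palette components
 * range 0-63, so the 4-bit levels get scaled up. */
void set_fake12_palette(void)
{
   PALETTE pal;
   int r, g, b;

   for (g = 0; g < 16; g++) {
      for (b = 0; b < 15; b++) {
         pal[g * 15 + b].r = 0;
         pal[g * 15 + b].g = g * 63 / 15;
         pal[g * 15 + b].b = b * 63 / 14;
      }
   }

   for (r = 0; r < 16; r++) {
      pal[240 + r].r = r * 63 / 15;
      pal[240 + r].g = 0;
      pal[240 + r].b = 0;
   }

   set_palette(pal);
}

You'd call it right after setting the 256-colour mode, before doing any drawing.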

I got this idea from something I saw when I was playing with DJGPP. I can locate a sample for anyone who wants it (or create a zip and mail it with the source to show how it's done). I would really like to hear some input on this idea. It wouldn't be difficult to implement, and since it still uses a palette, palette fades are possible (see the small example below), so I'm sure cool ideas using that will pop right to mind.
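For example, with Allegro the usual palette calls still work, because the hardware really is in an 8-bit paletted mode (my_palette stands for whatever PALETTE you set up; the speed of 4 is arbitrary):

fade_out(4);              /* smoothly fade the whole screen to black */
fade_in(my_palette, 4);   /* and bring it back up again */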

SteelGolem

Cut from the top of the example:

/*
 *    Example program for the Allegro library, by Richard Mitton.
 *
 *    or "How to get a 12-bit mode on an 8-bit card"
 *
 *    This program sets up a 12-bit mode on any 8-bit card, by setting up
 *    a 256-colour palette that will fool the eye into grouping two 8-bit
 *    pixels into one 12-bit pixel.
 *
 *    It's quite simple (sort of). You make your 256-colour palette with
 *    all the combinations of blue and green, assuming green ranges from
 *    0-15 and blue from 0-14. This takes up 16x15=240 colours. This
 *    leaves 16 colours to use as red (red ranges from 0-15).
 *
 *    Then you put your green/blue in one pixel, and your red in the pixel
 *    next to it. The eye gets fooled into thinking it's all one pixel.
 *
 *    It's all very simple really. Honest.
 *
 *    To start with, you set a normal 256-colour VESA mode, and construct
 *    a special palette for it. But then comes the trick: you need to
 *    write to a set of two adjacent pixels to form a single 12-bit dot.
 *    Two eight-bit pixels is the same as one 16-bit pixel, so after
 *    setting the video mode you need to hack the screen bitmap about,
 *    halving the width and changing it to use the 16-bit drawing code.
 *    Then, once you have packed a colour into the correct format (using
 *    the makecol12() function below), any of the normal Allegro drawing
 *    functions can be used with this 12-bit display!
 *
 *    Things to note:
 *
 *    The horizontal width is halved, so you get resolutions like
 *    320x480, 400x600, and 512x768.
 *
 *    Because each dot is spread over two actual pixels, the display will
 *    be darker than in a normal video mode.
 *
 *    Any bitmap data will obviously need converting to the correct
 *    12-bit format: regular 15- or 16-bit images won't display
 *    correctly...
 *
 *    Although this works like a truecolor mode, it is actually using a
 *    256-colour palette, so palette fades are still possible!
 *
 *    Note: This code only works in linear screen modes (so don't try
 *    Mode-X).
 */
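To make the pixel-pairing concrete, here's a rough sketch based on the comment above. It writes the pair directly at 8 bpp, and the makecol12() body is my guess at the packing (little-endian byte order assumed); it may not match Richard Mitton's actual code.

#include <allegro.h>

/* Write one logical 12-bit dot as two physical 8-bit pixels:
 * the green/blue entry on the left, the red entry on the right.
 * (Which side carries red is a guess; it just has to be consistent.) */
void putdot12(BITMAP *bmp, int x, int y, int r, int g, int b)
{
   putpixel(bmp, x * 2,     y, g * 15 + b);   /* entries 0-239: green/blue */
   putpixel(bmp, x * 2 + 1, y, 240 + r);      /* entries 240-255: red */
}

/* Pack the same dot into one 16-bit value, for use once the screen
 * bitmap has been hacked to 16 bpp as the comment describes (on a
 * little-endian PC the low byte lands on the left pixel). */
int makecol12(int r, int g, int b)
{
   return (g * 15 + b) | ((240 + r) << 8);
}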

Using this information, I'm sure it wouldn't be hard to do.

SteelGolem

The problem with this method is that pixels are packed in a 16-bit format, with only the first 12 bits used. So transferring the data (writing and reading pixels) would run at the same speed as 16 bpp (or 8 bpp at twice the horizontal resolution).

xvs:

> The problem with this method is that pixels are packed in a 16-bit format, with only the first 12 bits used. So transferring the data (writing and reading pixels) would run at the same speed as 16 bpp (or 8 bpp at twice the horizontal resolution).

Your problem is that you don't understand what this is all about.

> The only real problem is that the display appears darker
Not just darker: twice as dark.

It’s simply unacceptable in almost all cases.

BTW, this method has no real quality advantage over a regular RGB palette (for example, 8x8x4) with reduced brightness (something like the darker part of a 16x16x8 RGB cube).
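For comparison, such a palette is also trivial to build; a sketch (RGB 3-3-2 layout, with brightness halved to roughly match the fake 12-bit mode; the function name and the exact factor are arbitrary):

#include <allegro.h>

/* Plain 8x8x4 RGB palette: 3 bits red, 3 bits green, 2 bits blue,
 * filling all 256 entries, with overall brightness halved. */
void set_rgb332_palette(void)
{
   PALETTE pal;
   int i;

   for (i = 0; i < 256; i++) {
      pal[i].r = (((i >> 5) & 7) * 63 / 7) / 2;   /* top 3 bits: red   */
      pal[i].g = (((i >> 2) & 7) * 63 / 7) / 2;   /* mid 3 bits: green */
      pal[i].b = ((i & 3) * 63 / 3) / 2;          /* low 2 bits: blue  */
   }

   set_palette(pal);
}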