Right now, in OpenGL 1.x, we assume several different kinds of buffers: the frame buffer, depth buffer, texture memory, stencil buffer, and so on. My suggestion is to have only one kind of buffer: treat them all as a memory resource and manage them the way we manage texture objects.
For example, when we initialize GLUT (or whatever windowing layer), we could say: we want two buffers, buffer1 with GL_RGBA storage per pixel and buffer2 with GL_FLOAT storage per pixel. We could allocate textures the same way, except that a texture is not sized per screen pixel.
Then the programmer can link the various buffers to different uses: say, use buffer1 as the frame buffer for rendering, use buffer2 as the depth buffer during rendering, and show buffer1 on the actual screen when the rendering is done.
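A rough sketch of how that linking API might look. Every function and enum name below is hypothetical, invented purely for illustration; none of these are real GL calls:

```
/* allocate two untyped per-pixel buffers (hypothetical API) */
buf1 = glAllocBuffer(width, height, GL_RGBA);   /* interpreted as RGBA bytes  */
buf2 = glAllocBuffer(width, height, GL_FLOAT);  /* interpreted as one float   */

/* link the buffers to roles for this rendering pass */
glLinkBufferAs(buf1, GL_COLOR_TARGET);   /* buf1 acts as the frame buffer */
glLinkBufferAs(buf2, GL_DEPTH_TARGET);   /* buf2 acts as the depth buffer */

drawScene();

/* when rendering is done, show buf1 on the actual screen */
glPresentBuffer(buf1);
```

The point of the sketch is that the allocation carries no fixed role; only the link calls decide what each buffer is used for in a given pass.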
At the same time, these buffers could be unified with the pixel shader machinery, so that a pixel shader can read these values directly.
In this way, buffer1 acts as the frame buffer in one pass and can act as a texture source in a pixel shader later on.
In this sense, a buffer is just memory. The most important point is that no type is permanently bound to a buffer; what matters is how we link it to a given use and how we interpret its contents (FLOAT, RGBA, ...).
Furthermore, we could treat textures in exactly the same way. Is this possible?