New way to create objects

Here is an example for textures


GLTEXTURE1 mytexture;
mytexture.Version=1;
mytexture.Type=GL_TEXTURE_2D;
mytexture.Pixels=mypointer;
mytexture.Width=xxxx;
mytexture.Height=xxxx;
mytexture.Border=0;
//No more internal and external format
mytexture.Format=GL_BGRA8;
//mytexture.Handle is filled by OpenGL

glCreateTexture(mytexture);
glGetError();  //GL_ERROR_NOTSUPPORTED, sorry, that format is not supported by this GPU
glNGenerateMipmap(mytexture.Handle);

glNBindTexture(GL_TEXTURE5, mytexture.Handle);
glNSetTexEnvStatei(GL_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glNSetTexEnvStatei(GL_MAG_FILTER, GL_LINEAR);
glNSetTexEnvStatef(GL_ANISO, 0.0);
//RENDER THE OBJECT

Yes, I am aware DSA exists but this way is better.
TexEnvState now includes mipmap state and aniso state.
glNBindTexture eliminates the need for glActiveTexture.
glCreateTexture can create any type of texture: 2D, 3D, cubemap.

First of all, if such an idea were to be implemented, I hope that the image and the sampler would be separate objects, so that the stupid idea of the texture object would disappear.

Second, why two different ways to set the object values? Immutable and mutable? What if something immutable becomes mutable in a future release? .Pixels would be mutable anyway …

I prefer DSA and no more bind to edit :stuck_out_tongue:

This was Longs Peak.

Longs Peak is dead. It is not coming back. Get over it.

  1. What do you mean by image and sampler?
    I made it so that anisotropy and filter states are part of the EnvState.

  2. There isn’t a second way.

  3. DSA sucks. It is based on old OpenGL. The example I gave shows that you create a texture in one shot. There are no glGenTextures and glBindTexture calls. You only bind if you want to render something.

It must come back :slight_smile:

We may see Longs Peak and Mt. Evans yet, though perhaps not as one or two giant mountains but rather as an assortment of smallish hills.

From what I understand, the OpenGL texture object is composed of an image, sampler states, and other states:

  • The meaning of the image should be clear.
  • The sampler states determine how the texture is sampled, i.e. they specify texture filtering and all associated states (aniso, lod), and image boundary conditions (repeat, clamp). Theoretically all mutable texture unit states on modern hardware should be considered sampler states.
  • By other states I mean e.g. generate_mipmap.

Generally, one should be able to create a separate “sampler state object” (something that is highly anticipated in OpenGL) and an “image object” (the rest of the current texture object). The texture unit should then have two attachment points taking these two kinds of objects.

This was suggested many times before (e.g. here and here).

Generate mipmaps is supposed to be a state, but it is really an action or a function; I don’t know what to call it. I’m very happy now with glGenerateMipmap();

There is some mutable state related to the image, like the LOD base and the LOD max.

Several samplers would allow applying different filtering to a single image. Separating the sampler from the image could also be a step toward programmable samplers … I’m not sure I’m 100% for this idea.

Using EnvState is confusing. “Env” has been used previously for fixed-pipeline capabilities … well, image sampling is still not programmable.

I think the new texture object has to work pretty much the same way as VBOs do now. So to upload, instead of this
glBufferDataARB( GL_ARRAY_BUFFER_ARB, elements * sizeof(float), data, GL_DYNAMIC_DRAW_ARB );
you do this
glBufferDataAttribsARB( GL_TEXTURE_BUFFER_ARB, elements * sizeof(float), data, GL_STATIC_READ_ARB, attribList );

and to use it, you point to it like you would in GLSL 1.4

The only confusing thing to me is all the texture sampler states that need to be set, especially if you want to be able to change some of them in the shader.
I am pretty sure using a sampler state struct of some kind is the better approach, since you can upload a bunch of textures and samplers and then mix’n’match as you go: maybe you want some samplers to have high anisotropic filtering normally, but less at the edges, far away, where it’s blue, or something like that.

Groovounet: I am pretty sure more programmability is always better in the long run.

I’m not sure how this glBufferDataAttribsARB function is supposed to work, but it needs to allow specifying whether it is a 1D, 2D, or 3D image, to allow a 2D cache (a 3D cache in the future?).

zeoverlord: I don’t believe in Larabee :wink:
Well, for filtering maybe, but still unsure.

I hope for Larrabee. Imagine a fresh new development force rippling through the still water in the ARB.

Well, the attribList adds the missing metadata like size, format, and so on; in it you specify 1D, 2D, 3D, cube, or whatever texture format the data is in.
A little bit like the OP did, but in an array, like wglCreateContextAttribsARB.

Well, neither do I; I think it’s too little, too late. But still, programmability has worked well every time it has replaced something in graphics.

Imagine a fresh new development force rippling through the still water in the ARB.

How would Larrabee help the ARB do anything?

I mean Intel may become an active member in the ARB to influence the future of OpenGL. There hasn’t been a reason to do it with their lazy IGPs, but that may change with Larrabee…

I mean Intel may become an active member in the ARB to influence the future of OpenGL.

It would only be to turn OpenGL into LarrabeeGL, similar to how NVIDIA tries to turn OpenGL into NVIDIA_GL.

What other reason would Intel have to help improve OpenGL? They could just release Larrabee-only extensions.

It seems pretty likely that eventually things will generalize into something like a Larrabee, but we’ll probably always have a need for something like a GL to make things accessible to mere mortals. It’s also pretty likely that folks will continue to differ on what form that accessibility should take, so we’re probably looking at a future equally rife with choices and tradeoffs.

It is tempting to imagine casting the GL and DX APIs to the winds and unfurling a single, all-encompassing API in a beam reach for the white sands of ease and plenty. Though personally I don’t think that stands a ghost of a chance of happening any time soon, if ever.