Applying 2 image textures to simple shape object?

I have some sample code that draws an image onto a sphere or simple shape.
It’s covered with triangle vertices, and the texture is made active and bound with a simple UV setup.
Is it possible to use 2 different images to texture the object (i.e. 1/2 of the sphere with 1 image and 1/2 sphere with different image) by setting mapping and offset functions?
If so, which OpenGL commands should I dig into? Is the only way to rebuild the UV setup or to use 1 large image with the images already combined?

Since these are dynamic and changing images I wanted to avoid compositing them together into a larger image before texturing if possible.
If it does require combining 2 texture maps side by side into a larger one, what is the most efficient way to do that in OpenGL?
(On CPU I’d combine them as a pixmap with offset)

This is straightforward with shaders. If you don’t want to use shaders, the simplest approach is probably to draw the object twice, with a different texture each time.

The main question is: how are you planning on specifying which parts of the object should use which texture?

Thanks for responding.

I guess my question is: can I apply textures to an existing simple shape (even a plane) by specifying rectangle coordinates, without changing shaders or vertices?
(i.e. 0 to 1 across the entire object or mesh, so if I took an image texture and applied it from (0,0) to (0.5,0.5), it would cover 1/4 of the object, for example)
That would certainly allow changing layouts of images without having to resort to messing with the layout itself or the shaders.
By default, all the sample code shows a texture covering the entire geometry it’s been assigned to, with the only options seemingly being to repeat or clamp for coordinates > 1.

If it has to be done with shaders then as I understand it I’d have to do something like this in the shader:
if (gl_TexCoord[0].s < 0.25) {
    gl_FragColor = texture2D(MyTexture0, gl_TexCoord[0].st);
    gl_FragColor[1] = gl_FragColor[1] * 0.90;
}

Is there a suggested example for this on the web?

i.e. 0 to 1 across the entire object or mesh, so if I took an image texture and applied it from (0,0) to (0.5,0.5), it would cover 1/4 of the object, for example

How could the system possibly know what “the entire object” is? That would imply that there is some mapping from a position on an object to a location in the texture. That mapping? That’s what texture coordinates provide. Hence the term “texture mapping”.

So unless your mesh contains texture coordinates, there’s no way to do that for any arbitrary object.

You could do it for specific objects, but that would only be with specialized shader code that knows exactly what the object being rendered is. And it would still basically be doing texture mapping for you.

If it has to be done with shaders then as I understand it I’d have to do something like this in the shader:

Actually, that would fail due to accessing a texture within non-uniform control flow.

Seems like a basic shader steps through the texture coordinates and returns the corresponding sample color (assuming just a flat image).

So is there a recommended way for a shader to take 2 textures and select which texture to sample based on coordinates?

You can define either the texture coordinates or a texture coordinate transformation so that the corners of the shape are mapped to values outside of the 0…1 range, then set the texture’s wrap mode to GL_CLAMP_TO_BORDER and the border colour to transparent.

Or you could composite the textures into a single texture then apply that to the entire surface.

Or you could do something like your shader, except that you need to either move the texture() call outside of the conditional, or calculate the derivatives outside the conditional and use e.g. textureGrad() instead. Derivatives are undefined inside of non-uniform control flow, and texture sampling functions which don’t take derivatives as a parameter calculate derivatives implicitly, so those functions shouldn’t be used inside non-uniform control flow.
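To make the fix concrete, here is a minimal sketch of the first suggestion (moving the texture lookups out of the conditional). The sampler and varying names (tex0, tex1, texcoords) are hypothetical, not from the original code:

#version 330 core

uniform sampler2D tex0;   // hypothetical sampler names
uniform sampler2D tex1;

in vec2 texcoords;
out vec4 fragColor;

void main()
{
    // Sample both textures unconditionally, in uniform control flow,
    // so the implicit derivatives stay well-defined...
    vec4 c0 = texture(tex0, texcoords);
    vec4 c1 = texture(tex1, texcoords);

    // ...then select between the already-sampled results.
    fragColor = (texcoords.s < 0.25) ? c0 : c1;
}

The alternative mentioned above would be to compute dFdx(texcoords)/dFdy(texcoords) outside the conditional and pass them to textureGrad() inside it.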

It depends upon what’s convenient for the program.

One common technique is to have a separate “map” texture which behaves like multiple alpha channels. E.g.


vec4 blend = texture(map_tex, texcoords);   // four blend weights, one per texture
vec4 c0 = texture(tex0, texcoords);
vec4 c1 = texture(tex1, texcoords);
vec4 c2 = texture(tex2, texcoords);
vec4 c3 = texture(tex3, texcoords);
// weighted sum: each channel of the map controls one texture's contribution
vec4 color = c0 * blend[0] + c1 * blend[1] + c2 * blend[2] + c3 * blend[3];

But note that this can only affect which texture(s) are sampled, it can’t affect the mapping (scale, offset, rotation, etc) of a texture.

If you want different textures to have different mappings, you can supply multiple sets of texture coordinates, or you can have a single set of source texture coordinates and add a uniform for each texture containing a transformation matrix. But this alone won’t let you specify arbitrary portions to be cut out of a texture.
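A sketch of the second option, per-texture transformation matrices, might look like the following. The mat3 uniforms and their names are assumptions for illustration; the application would fill them with whatever scale/rotation/offset each texture needs:

#version 330 core

uniform sampler2D tex0;
uniform sampler2D tex1;
uniform mat3 tex0_transform;   // hypothetical: 2D affine transform for tex0
uniform mat3 tex1_transform;   // hypothetical: 2D affine transform for tex1

in vec2 texcoords;
out vec4 fragColor;

void main()
{
    // Apply each texture's own transform in homogeneous 2D coordinates.
    vec2 uv0 = (tex0_transform * vec3(texcoords, 1.0)).xy;
    vec2 uv1 = (tex1_transform * vec3(texcoords, 1.0)).xy;

    vec4 c0 = texture(tex0, uv0);
    vec4 c1 = texture(tex1, uv1);

    // Blend however the application requires; a 50/50 mix is shown here.
    fragColor = mix(c0, c1, 0.5);
}

Since both lookups are in uniform control flow, this also avoids the derivative problem discussed above.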

Thanks, I’ll explore those options.

[QUOTE]If so, which OpenGL commands should I dig into? Is the only way to rebuild the UV setup or to use 1 large image with the images already combined?

Since these are dynamic and changing images I wanted to avoid compositing them together into a larger image before texturing if possible.
If it does require combining 2 texture maps side by side into a larger one, what is the most efficient way to do that in OpenGL?
(On CPU I’d combine them as a pixmap with offset)[/QUOTE]

I have used two of the options discussed above to show the dark and light sides of the earth; neither involves shaders.

In the attached picture, two full earth images are read in and carefully combined into one texture, which is mapped onto the earth. The advantage of this technique is that I can do a gradient across the terminator. This is probably a hard way to do what could be done with shaders.

A simpler, less attractive approach is to draw two earth spheres, one with a daytime texture and one with a nighttime texture. Clipping planes can then be used to cut away half of each sphere, with the plane normals pointing directly at or away from the sun. This is a very simple technique, but it results in a sharp terminator which doesn’t look all that good, though it might be fine for your application.

[ATTACH=CONFIG]2111[/ATTACH]