Type: Posts; User: GClements
What you need is a framebuffer object (FBO). See glGenFramebuffers, glBindFramebuffer, glFramebufferRenderbuffer and glFramebufferTexture2D. This allows you to direct rendering into textures and/or...
Your best bet is likely to be to create a framebuffer object whose colour attachment is a single-channel 8-bpp texture (GL_R8 or GL_R8UI). Once you've finished rendering to it, you can use it as a...
That would occur if the magnification filter is nearest rather than linear. Mipmaps are only used for minification, i.e. when each base-level texel maps to a screen area smaller than a pixel (i.e....
It's not necessary for the image format to support pre-multiplied alpha directly. You can just convert the data after loading it.
You really need to use pre-multiplied alpha when using alpha in conjunction with filtering (either bilinear interpolation or mipmaps or both). This means that the texture values must be...
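As a sketch of the conversion step (the function name is illustrative, not from any particular library): each colour channel is scaled by alpha, leaving alpha itself unchanged.

```cpp
#include <cstdint>
#include <vector>

// Convert straight-alpha RGBA8 pixels to pre-multiplied alpha in place.
// Each colour channel is scaled by alpha/255 (with rounding); alpha is unchanged.
inline void premultiplyAlpha(std::vector<std::uint8_t>& rgba)
{
    for (std::size_t i = 0; i + 3 < rgba.size(); i += 4) {
        const unsigned a = rgba[i + 3];
        for (int c = 0; c < 3; ++c)
            rgba[i + c] = static_cast<std::uint8_t>((rgba[i + c] * a + 127) / 255);
    }
}
```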
glReadPixels() can be used to read the contents of a framebuffer.
If the OBJ file contains vertex normals, there's no need to calculate them; they should normally be preferred to anything you calculate yourself, as a...
No, it's the vertex normal for that vertex.
Vertex normals are used for smooth shading, where the polygon mesh is an approximation to a smooth surface. The vertices are samples of the surface,...
Yes.
When each vertex is only affected by a single bone, the calculation is straightforward. The vertex position is transformed by the bone transformation.
When vertices are affected by...
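A minimal sketch of the multi-bone case (linear blend skinning): transform the vertex by each bone's matrix, then take the weighted sum. The names (Mat4, skinPosition) are illustrative, not from any particular engine.

```cpp
#include <array>
#include <cstddef>

using Vec3 = std::array<float, 3>;
using Mat4 = std::array<float, 16>;   // column-major, as OpenGL expects

// Transform a position (w = 1) by a 4x4 matrix, keeping only xyz.
inline Vec3 transformPoint(const Mat4& m, const Vec3& p)
{
    Vec3 r{};
    for (int row = 0; row < 3; ++row)
        r[row] = m[0*4 + row]*p[0] + m[1*4 + row]*p[1]
               + m[2*4 + row]*p[2] + m[3*4 + row];   // translation column
    return r;
}

// Weighted sum of the vertex transformed by each influencing bone.
// The weights are assumed to sum to 1.
template <std::size_t N>
Vec3 skinPosition(const Vec3& p,
                  const std::array<Mat4, N>& bones,
                  const std::array<float, N>& weights)
{
    Vec3 result{0, 0, 0};
    for (std::size_t b = 0; b < N; ++b) {
        const Vec3 t = transformPoint(bones[b], p);
        for (int i = 0; i < 3; ++i)
            result[i] += weights[b] * t[i];
    }
    return result;
}
```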
Yes.
Yes.
Yes; good catch. For some reason, AMD's GLSL didn't notice that.
Sorry, that should be:
vec3 y = normalize(cross(z, x));
(the compiler did pick that one up; I forgot to copy the...
vec4 c = modelview * vec4(centerIn, 1.0);
vec4 u = modelview * vec4(up, 0.0); // w=0 as this is a direction, not a position
vec3 z = -c.xyz;
vec3 x = normalize(cross(u, z));
vec3...
No. Vertex normals are only used for lighting calculations.
In the absence of any other source, a common approach is to average the face normals of all of the faces to which a vertex belongs.
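A sketch of that averaging approach, assuming triangulated faces (the function name is illustrative): accumulate each face's normal into every vertex it uses, then normalise the sums.

```cpp
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

// Average the face normals of all faces sharing each vertex.
// tris holds triangles as triples of vertex indices.
std::vector<Vec3> vertexNormals(const std::vector<Vec3>& pos,
                                const std::vector<std::array<int, 3>>& tris)
{
    std::vector<Vec3> n(pos.size(), Vec3{0, 0, 0});
    for (const auto& t : tris) {
        const Vec3 &a = pos[t[0]], &b = pos[t[1]], &c = pos[t[2]];
        const Vec3 e1{b[0]-a[0], b[1]-a[1], b[2]-a[2]};
        const Vec3 e2{c[0]-a[0], c[1]-a[1], c[2]-a[2]};
        const Vec3 fn{e1[1]*e2[2] - e1[2]*e2[1],     // face normal = e1 x e2
                      e1[2]*e2[0] - e1[0]*e2[2],
                      e1[0]*e2[1] - e1[1]*e2[0]};
        for (int i : t)
            for (int k = 0; k < 3; ++k)
                n[i][k] += fn[k];
    }
    for (auto& v : n) {                               // normalise the sums
        const float len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        if (len > 0)
            for (float& k : v) k /= len;
    }
    return n;
}
```

Note that because the cross product isn't normalised before accumulation, larger faces get proportionally more weight; whether that's desirable is a modelling choice.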
That isn't the reason. Vertex normals are only used for lighting. Face culling is based upon face normals, which are calculated from vertex positions. If the object is inside-out, either your Z...
In the general case, you need to use a topological sort. Note that it's possible for the ordering graph to contain cycles, in which case you need to split polygons to break the cycles. If the...
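A generic sketch of a topological sort (Kahn's algorithm) with cycle detection; here nodes stand in for polygons and edges[i] lists polygons that must be drawn after polygon i. An empty result signals a cycle, i.e. the case where polygons would have to be split.

```cpp
#include <queue>
#include <vector>

// Kahn's algorithm: repeatedly emit nodes with no remaining predecessors.
// Returns an empty vector if the ordering graph contains a cycle.
std::vector<int> topoSort(int n, const std::vector<std::vector<int>>& edges)
{
    std::vector<int> indeg(n, 0), order;
    for (int u = 0; u < n; ++u)
        for (int v : edges[u]) ++indeg[v];
    std::queue<int> ready;
    for (int u = 0; u < n; ++u)
        if (indeg[u] == 0) ready.push(u);
    while (!ready.empty()) {
        const int u = ready.front(); ready.pop();
        order.push_back(u);
        for (int v : edges[u])
            if (--indeg[v] == 0) ready.push(v);
    }
    if (static_cast<int>(order.size()) != n)
        order.clear();                        // cycle detected
    return order;
}
```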
The second shader is identical to the first, except that it rotates the billboard so that the billboard's orientation is fixed in model space rather than screen space. The up vector is a vector in...
You can't determine the rotation simply from a vertex position. You need at least the object-space position of the billboard's centre and the position of the vertex within the billboard (i.e. which...
It's glGet() that's slow. Typical OpenGL commands simply append a command (an opcode and parameters) to the command queue then return immediately. glGet() has to do that, then flush the commands to...
"Row-major" means that the outermost array dimension is the row. "Column-major" means that the outermost array dimension is the column.
Given a matrix
[ 0 1 2 3]
[ 4 5 6 7]
[ 8 9 10...
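For a 4x4 matrix like the one above, the flat index of element (row, col) differs between the two conventions; a minimal sketch:

```cpp
// Flat index of element (row, col) in a 4x4 matrix, per convention.
inline int rowMajorIndex(int row, int col)    { return row * 4 + col; }
inline int columnMajorIndex(int row, int col) { return col * 4 + row; }
```

With the values shown stored row-major (so element values equal their flat index), the element in row 1, column 2 has value 6 and sits at flat index 6; stored column-major, the same element would sit at flat index 9.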
The above converts polar coordinates (radius and angle) to rectangular (Cartesian) coordinates (X and Y). Thus, the loop generates a sequence of points with a fixed radius r and increasing angles...
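A sketch of such a loop (the function name is illustrative): a fixed radius with the angle stepping through a full turn traces out points on a circle.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Generate n points on a circle of radius r: polar (r, theta) to
// rectangular (x, y) via x = r*cos(theta), y = r*sin(theta).
std::vector<std::pair<double, double>> circlePoints(double r, int n)
{
    const double pi = std::acos(-1.0);
    std::vector<std::pair<double, double>> pts;
    for (int i = 0; i < n; ++i) {
        const double theta = 2.0 * pi * i / n;
        pts.emplace_back(r * std::cos(theta), r * std::sin(theta));
    }
    return pts;
}
```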
That matrix performs an oblique projection, but I'm unsure as to what theta and alpha are meant to represent.
Oblique projections are normally expressed as
[1 0 a*cos(phi) 0]
[0 1 a*sin(phi)...
No.
In this case, a quaternion+translation (QT) pair or a dual quaternion (DQ) is just a more compact representation of a matrix constructed from a rotation and a translation (note that the matrix can't have any scaling or skew, which is why you...
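A sketch of applying a quaternion+translation pair to a point (the function name is illustrative): rotate by the unit quaternion, then translate, which is equivalent to multiplying by the corresponding rigid-body matrix.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Quat = std::array<double, 4>;   // (w, x, y, z), assumed unit length

// p' = q * p * conj(q) + t, using the expansion
// rotated = p + 2w(u x p) + 2u x (u x p), where u is the vector part of q.
Vec3 applyQT(const Quat& q, const Vec3& t, const Vec3& p)
{
    const double w = q[0];
    const Vec3 u{q[1], q[2], q[3]};
    const double uv0 = u[1]*p[2] - u[2]*p[1];   // u x p
    const double uv1 = u[2]*p[0] - u[0]*p[2];
    const double uv2 = u[0]*p[1] - u[1]*p[0];
    return {
        p[0] + 2.0*(w*uv0 + u[1]*uv2 - u[2]*uv1) + t[0],
        p[1] + 2.0*(w*uv1 + u[2]*uv0 - u[0]*uv2) + t[1],
        p[2] + 2.0*(w*uv2 + u[0]*uv1 - u[1]*uv0) + t[2],
    };
}
```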
OBJ uses 1-based indices throughout. You appear to be using 1-based indices for positions but 0-based indices for texture coordinates and normals.
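A hypothetical helper illustrating the fix: an OBJ face corner such as "3/7/3" is 1-based for the position, texture coordinate *and* normal, so all three indices must have 1 subtracted before indexing your arrays.

```cpp
#include <array>
#include <cstdio>

// Parse one "v/vt/vn" corner of an OBJ "f" line, converting all three
// indices from 1-based (OBJ) to 0-based (C++) uniformly.
bool parseObjCorner(const char* s, std::array<int, 3>& out)
{
    int v = 0, vt = 0, vn = 0;
    if (std::sscanf(s, "%d/%d/%d", &v, &vt, &vn) != 3)
        return false;
    out = {v - 1, vt - 1, vn - 1};
    return true;
}
```

(This only handles the full v/vt/vn form; real OBJ files may also use "v", "v/vt", "v//vn", and negative relative indices.)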
You don't appear to be checking the link status of your program.
Beyond that, there's not much point in asking other people to debug your code based upon isolated fragments. More often than not,...
That looks as if depth testing isn't working, resulting in back faces being drawn over front faces.
Is depth testing enabled? Does the window have a depth buffer? Is the depth buffer being cleared...