I think your solution could be the answer to my question. But I have a small request: how exactly should I get the color value from the texture, and how do I convert it to RGB and pass the corresponding values to glColor3f()?
To tell you the truth, it’s part of a project I’m doing. The objective is to achieve texture mapping without using OpenGL’s texture mapping functions.
I hope this clears up your doubts. So, is this acceptable?
That’s a strange goal. But I think it could be accomplished with fragment shaders. Just send in the texture without UV data, then generate the UVs within the shader.
Complicated solution to a simple problem. I still like the crayon idea
First of all, I would like to thank all the contributors for spending their precious time looking into this problem for me. Actually, I already tried to build tons of polygons. That idea came from some of the contributors, and I would like to take this opportunity to thank them again.
Although I’ve managed to build a 1000 x 1000 grid for a quad, I still can’t figure out how to get the texel colour and apply it with glColor3f(). Can anyone show me how?
I can send the application and source code to anyone who is interested and willing to help.
Might as well forget about OpenGL and write your own rasterizer. That way it would make some sense, but trying to do texturing in OpenGL without the OpenGL texturing functions… why not draw triangle meshes without using triangles while you’re at it?
I’m sorry if programmers out there think that my objective in this project is foolish, but I have no way back except to find a solution to my problem!
I hope some of the programmers can give me an answer.
I apologize if I sounded unduly harsh. It’s more interesting to work on a problem if it’s not just “make work”, so if you had a real need for this kind of functionality and you couldn’t use the texture mapping API, then it might have been semi-interesting.
Ultimately, though, texture mapping, while straightforward, is fairly expensive. You can display images that use texture mapping in OpenGL without using OpenGL texture mapping, but to do so you have to implement texture mapping in software, and that makes OpenGL (and any hardware support it provides) pretty much irrelevant.
So your options are:
software renderer w/ glDrawPixels at the end
polygon dicing w/ per-vertex glColor
I’ll leave it to you to figure out how to implement these options.
I’m sorry too, if I sounded harsh. But you are trying to find a solution to a solved problem and your solution is obviously going to be slower and more difficult than the existing one.
Maybe you could implement displacement mapping instead? You could subdivide the polys and do the lookups just like you would with texturing, but then you’d displace the surfaces instead of just painting them. That way you’ll get an effect that’s remotely worth the effort.
Or are you trying to draw more complex geometry with texture mapping and not using the GL functions?
If it’s the latter I’d say “Good luck” or “Use D3D (booo!)”.
If it’s the former, then just create a grid of x by y quads, where x by y is the dimension of your image, and set the color values of the vertices according to the corresponding pixel color in your image. Each quad will span x -> x+1 and y -> y+1 with z = 0.
It’s quite easy really, but it’s going to be flipping slow (a 256 x 256 image means 65536 quads and 262144 vertices!)
Of course you could just do something sensible and use the GL functions. Or, if you don’t want to use glTexCoord(), you could use glTexCoordPointer() and glDrawElements()…