PDA

View Full Version : glLookat, you see what?



reader1
04-29-2015, 02:09 AM
Still talking about gluLookAt: what will you see if the eye is at its default position? Nothing.

glLoadIdentity();
gluLookAt(0.0,0.0,0.0.0, 0.0,0.0,0.0, 0.0,1.0,0.0); //look at what?
glTranslatef(0.0,0.0,-15.0);
glRotatef(20,1.0,0.0,0.0);
glutSolidSphere(2,22,22);
This is standard view code, where the camera is located at the origin, as the default.
What will you see? Nothing. The doc says this is its default position, but if you comment out the gluLookAt call, you can see the sphere. Why?
Then we move the camera's z coordinate to 0.000001:

glLoadIdentity();
gluLookAt(0.0,0.0,0.000001, 0.0,0.0,0.0, 0.0,1.0,0.0); //look at what?
glTranslatef(0.0,0.0,-15.0);
glRotatef(20,1.0,0.0,0.0);
glutSolidSphere(2,22,22);
Well, now you can see the sphere. And if you comment out the gluLookAt call, the sphere is still visible.
That means no matter how small the number is, it does the trick. But if it's zero, you see nothing. What's the matter?

reader1
04-29-2015, 02:42 AM
Similarly, if the eye position is at zero, the object position has to be a tiny negative number; say, -0.000000001 does the trick.
That means the eye and the object have to have some distance between them, even if it's vanishingly small.
From the viewpoint of physics this makes sense, but the docs have avoided it.

Alfonse Reinheart
04-29-2015, 02:46 AM
This is standard view code, where the camera is located at the origin, as the default.
What will you see? Nothing. The doc says this is its default position

There is nothing about that code that is "standard" or "default", with regard to gluLookAt (https://www.opengl.org/sdk/docs/man2/xhtml/gluLookAt.xml).

Just look at the math on that page. Your eye position and your look-at point are equal. When you subtract one from the other, you get a zero vector. Which has no direction. Therefore... what direction are you looking in?

Garbage in, garbage out. You fed it bad parameters, you got out bad values.

There is no "default position" or default look-at point. There is no one right answer that always does the right thing. You pick the parameters that work for your needs.

reader1
04-29-2015, 02:59 AM
There is nothing about that code that is "standard" or "default", with regard to gluLookAt (https://www.opengl.org/sdk/docs/man2/xhtml/gluLookAt.xml).

Just look at the math on that page. Your eye position and your look-at point are equal. When you subtract one from the other, you get a zero vector. Which has no direction. Therefore... what direction are you looking in?

Garbage in, garbage out. You fed it bad parameters, you got out bad values.

There is no "default position" or default look-at point. There is no one right answer that always does the right thing. You pick the parameters that work for your needs.
A zero vector? What is it? You know what?
Try (0.0,1.0,0.0) for the eye and (0.0,0.0,0.0) for the object. What do you see?
Is that a zero vector? Have you ever learned math or physics?

reader1
04-29-2015, 03:23 AM
If you think that is not enough, how about (60.0,11.0,0.0) for the eye and (10.0,10.0,0.0) for the object?
Have you been taught that this is a zero vector? No direction? No size?

Agent D
04-29-2015, 04:52 AM
First of all, there is no such thing as a camera in OpenGL(R). The gluLookAt function is a function from the old utility
library that pieces together a view matrix from the eye and look-at positions and the up vector you give it. Look at
the manual to see how the matrix is calculated.

The function internally computes a view vector by subtracting the eye position from the target and takes the cross product
of that with the up vector to get a vector pointing right. Those three vectors are supposed to form a basis for a 3-dimensional
sub-space and are used for the view matrix.

In your first example, you use "gluLookAt(0.0,0.0,0.0.0, 0.0,0.0,0.0, 0.0,1.0,0.0);". If we ignore the fact that there are
too many decimal points in one of the numbers, you have a view position and target position that are equal, thus the
forward vector is a zero vector and the cross product with the up-vector is also zero. Your result matrix is garbage.

If you use "gluLookAt(0.0,0.0,0.000001, 0.0,0.0,0.0, 0.0,1.0,0.0);", you get a non-zero forward vector and a valid right
vector, so it works. The same happens when you use a zero view position and a very small target position.

reader1
04-29-2015, 06:42 AM
First of all, there is no such thing as a camera in OpenGL(R). The gluLookAt function is a function from the old utility
library that pieces together a view matrix from the eye and look-at positions and the up vector you give it. Look at
the manual to see how the matrix is calculated.
Alfonse posted the link to the function's documentation. I see; in particular, the normalization would produce a meaningless result. However, his wording wasn't strict, so I made an example. Forget it.
You say this is an old function; do we still use it in newer versions of OpenGL?



...of that with the up vector to get a vector pointing right. Those three vectors are supposed to form a basis for a 3-dimensional
sub-space and are used for the view matrix.

In your first example, you use "gluLookAt(0.0,0.0,0.0.0, 0.0,0.0,0.0, 0.0,1.0,0.0);". If we ignore the fact that there are
too many decimal points in one of the numbers, you have a view position and target position that are equal, thus the
forward vector is a zero vector and the cross product with the up-vector is also zero. Your result matrix is garbage.

If you use "gluLookAt(0.0,0.0,0.000001, 0.0,0.0,0.0, 0.0,1.0,0.0);", you get a non-zero forward vector and a valid right
vector, so it works. The same happens when you use a zero view position and a very small target position.

However, I think it should define a default direction in this case, just as 0! = 1.
This function is just like the viewfinder of a camera: put the target at the proper position to look at it. But many docs take so many paragraphs to discuss it that they make your head spin.

Alfonse Reinheart
04-29-2015, 07:02 AM
However, I think it should define a default direction in this case.

glScalef doesn't define a default scale if you pass all zeros. Also, if a user passes the wrong values, it's better to fail in a way that's unmistakably a failure, than for the user to think that they've done the right thing.

reader1
04-29-2015, 08:17 AM
glScalef doesn't define a default scale if you pass all zeros. Also, if a user passes the wrong values, it's better to fail in a way that's unmistakably a failure, than for the user to think that they've done the right thing.

Partly agree, but this is different from scaling: the object would collapse to a dot if you passed zero scale factors.

Alfonse Reinheart
04-29-2015, 08:29 AM
Partly agree, but this is different from scaling: the object would collapse to a dot if you passed zero scale factors.

No, it won't. OpenGL's rasterization rules for triangles are very clear: if two (or more) of the vertices of a triangle are equal, then the triangle has no area. And you only rasterize the area within the triangle.

So a zero-sized triangle isn't visible.

reader1
04-29-2015, 09:35 AM
No, it won't. OpenGL's rasterization rules for triangles are very clear: if two (or more) of the vertices of a triangle are equal, then the triangle has no area. And you only rasterize the area within the triangle.

So a zero-sized triangle isn't visible.
It can't be helped, now that they made such a rule. However, oddly enough, if two vertices of a triangle are equal it should form a line, and if all three are equal it should be a dot.

GClements
04-29-2015, 10:13 AM
It can't be helped, now that they made such a rule. However, oddly enough, if two vertices of a triangle are equal it should form a line, and if all three are equal it should be a dot.
The current rule is correct. If it behaved like you suggest, user code would have to analyse the data before passing it to OpenGL and explicitly discard degenerate triangles, which would be a nightmare both for performance and complexity.

reader1
04-29-2015, 06:39 PM
The current rule is correct. If it behaved like you suggest, user code would have to analyse the data before passing it to OpenGL and explicitly discard degenerate triangles, which would be a nightmare both for performance and complexity.

I haven't thought about that too much. I'm still confused by too many bewildering names: glew, glu, glut, freeglut, gl, etc.
Well, why did you say
"user code would have to analyse the data before passing it to OpenGL and explicitly disca..."

If there were no such rule, you wouldn't have to go through such a procedure.

reader1
04-29-2015, 06:55 PM
Just think about it: if the screen displays nothing, the user will have no clue where the error is, and OpenGL would have to have a procedure that analyses the data passed to it to check whether the vertices are equal or not.
Isn't that becoming more complex?

Alfonse Reinheart
04-29-2015, 08:20 PM
That assumes that, to the user, an edge-on triangle not being drawn represents a bug.

For most people, that's a feature. For many reasons.

It is impossible to design a rasterization algorithm that decides that zero-area triangles are still rasterized and still provides a contiguity guarantee. What I mean by that is the following. Multisampling aside, if you have two triangles, and they have binary-identical vertex positions for two of their vertices (i.e.: they share an edge), OpenGL guarantees that each pixel along that edge will be built from fragments generated by one triangle or the other. No pixel will be considered covered by both triangles.

If you allow zero-area triangles to be rasterized, then you cannot make this guarantee. Just consider the case of two degenerate triangles that share an edge (and not the edge where the two vertices have the same position). In this case, the two triangles form two lines that are the same. Well, this means that both triangles will generate fragments, and they will generate the same fragments covering the same pixels. Thus violating the guarantee.