why this code can't draw a line on screen

03-13-2004, 02:28 AM
#include <GL/glut.h>
#include <stdlib.h>

#define drawOneLine(x1,y1,x2,y2) glBegin(GL_LINES); \
glVertex2f ((x1),(y1)); glVertex2f ((x2),(y2)); glEnd();

void myinit (void)
{
    /* background to be cleared to black */
    glClearColor (0.0, 0.0, 0.0, 0.0);
    glShadeModel (GL_FLAT);
}

void display(void)
{
    glClear (GL_COLOR_BUFFER_BIT);

    /* draw all lines in white */
    glColor3f (1.0, 1.0, 1.0);

    /* stippling must be enabled or glLineStipple is ignored */
    glEnable (GL_LINE_STIPPLE);
    glLineStipple (1, 0x0101);
    drawOneLine (50.0, 125.0, 150.0, 125.0);

    glFlush ();
}

int main(int argc, char** argv)
{
    glutInit (&argc, argv);
    glutInitDisplayMode (GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize (400, 150);
    glutInitWindowPosition (0, 0);
    glutCreateWindow (argv[0]);
    myinit ();
    glutDisplayFunc (display);
    glutMainLoop ();
    return 0;
}

It compiles fine, but when I run it I can't see the line. Thanks.

03-13-2004, 03:28 AM
0x0101 isn't hex You mean 0x5. I am a newbie to Opengl so that might not work but I do know that

03-13-2004, 03:32 AM
You have not set up your projection matrix, so I am guessing the line is being drawn out of view.

Add this to the start of your display routine

glMatrixMode (GL_PROJECTION); // Tell OpenGL that we are doing projection matrix work
glLoadIdentity(); // Clear the matrix
glOrtho(0.0, 130.0, 0.0, 130.0, 0.0, 60.0); // Set up an ortho view
glMatrixMode(GL_MODELVIEW); // Tell OpenGL that we are doing model matrix work (drawing)
glLoadIdentity(); // Clear the model matrix

03-13-2004, 06:00 AM
Originally posted by grimoire:
0x0101 isn't hex You mean 0x5. I am a newbie to Opengl so that might not work but I do know that
Of course it's a valid hex number; it begins with 0x and contains only characters in the (regexp) set [0-9a-fA-F]. The decimal value is 257.

03-16-2004, 01:34 AM
great! You are right nexusone.

03-16-2004, 01:43 AM
I found that glOrtho()'s arguments are very important for making those lines show on screen. I tried several sets of values; some work and some don't. Next I plan to get a better understanding of this puzzling function.