What do you think of the future of OpenGL?



PixelDuck
02-18-2003, 11:35 PM
Yeah, the topic says it all :)

Do you think OpenGL will avoid being fully replaced by DirectX (specifically D3D)? Many developers in the gaming industry have turned to DX because it provides many tools that make programming with it very efficient. But are there other reasons, like speed? I don't think there's much difference in efficiency between OpenGL and DirectX, apart from small things like constant assignment for vertex/pixel shaders: in D3D you can assign values to multiple constants in one call, which you can't in OpenGL. Or am I missing something? Is there some function that takes an array of values to be assigned to the constant registers (environment/local parameters)? I didn't see one in the ARB_vertex_program/ARB_fragment_program specs, but then, I have been known to miss some "details" ;)

Anyway, please post your opinions.

Btw, am I just another in a long line of people asking this? That is, have there been topics like this recently? ;)

Cheers!

Tom Nuydens
02-18-2003, 11:53 PM
Originally posted by PixelDuck:
Btw, am I just another in a long line of people asking this? That is, have there been topics like this recently? ;)

Yeah, it's come up once or twice before :rolleyes:

-- Tom

Nutty
02-19-2003, 12:44 AM
Do you think OpenGL will avoid being fully replaced by DirectX (specifically D3D)? Many developers in the gaming industry have turned to DX because it provides many tools that make programming with it very efficient.

OpenGL is used in a lot more things than just games. And until Apple and Linux, and SGIs and Suns, etc. etc., support D3D, there's no chance that OpenGL will die.

If anything, it will be D3D that dies, when there's hardly anything between the two APIs in a few years. There'd be no point keeping it around for a single OS and platform.

knackered
02-19-2003, 02:14 AM
Unless, of course, that OS runs on 95% of all desktop PCs and one of the leading games consoles.

[This message has been edited by knackered (edited 02-19-2003).]

Eric
02-19-2003, 04:14 AM
Originally posted by knackered:
one of the leading games consoles.

You're not talking about the X-Box, are you??? ;)

(OK, I stop flooding...)

V-man
02-19-2003, 05:42 AM
>>>In D3D you can assign values to multiple constants in one call, which you can't in OpenGL. Or am I missing something? Is there some function that takes an array of values to be assigned to the constant registers (environment/local parameters)? I didn't see one in the ARB_vertex_program/ARB_fragment_program specs, but then, I have been known to miss some "details" <<<

Sure you can. For example, in ARB_vertex_program you can do this:

PARAM myPI = { 4.0, 1.5708, 3.14159, 6.28319 };

You can also pass values to one of the constant registers by function (four floats at a time).
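For example, a minimal sketch of the function route (assuming the extension's entry points have already been obtained; the register count and table are made up): each call sets one four-float register, so loading an array of vectors is a short loop.

GLfloat data[8][4];   /* hypothetical table of constants */
/* ... fill data ... */
for (GLuint i = 0; i < 8; ++i)
    glProgramLocalParameter4fvARB(GL_VERTEX_PROGRAM_ARB, i, data[i]);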

I think your real question is GL vs DX (or D3D). And I think you may have asked this question on other forums, because I spot these once in a while. Please don't pollute forums; do your own study of which you prefer.

PixelDuck
02-19-2003, 05:50 AM
Originally posted by knackered:
Unless, of course, that OS runs on 95% of all desktop PCs and one of the leading games consoles.


And Microsoft seems to be losing a great deal of Windows users to Linux. It would seem logical for Microsoft (when/if it comes to a huge drop in the number of Windows users) to port DirectX to Linux, for instance.

Though until the gamedev community adopts Linux as a serious target for games, DX will probably stay Windows-only. The only company taking Linux, MacOS X, IRIX and Solaris seriously, for instance, is id Software. And they're pretty much one of the few big companies using OpenGL for their graphics engines, although a great part of the industry uses id Software technology, so that expands the user range a bit. But really, I haven't heard of any other BIG company using OpenGL :\

PixelDuck
02-19-2003, 05:56 AM
V-man: I have asked the same question, or for others' opinions, on pcplayground but not here.

Anyway, I meant arrays of four-component vectors, not a single four-component vector, that much is apparent. Like SetVertexConstantF(...), with which you can set an array of vectors, multiple registers in one call. Does this exist in OpenGL?

And btw, I am asking for opinions and what you people think, I'm not trying to pollute you with my opinions. If it seems like that, then just discard them, but I do have my own opinions, like everyone :)

Cheers!

JustHanging
02-19-2003, 07:35 AM
Oh my god, not another duel. At least not this one, it's been discussed to death so many times before.

Sorry, I'm waiting for my animation assignment to render and I'm just bored.

knackered
02-19-2003, 08:00 AM
Originally posted by PixelDuck:
And Microsoft seems to be losing a great deal of Windows users to Linux. It would seem logical for Microsoft (when/if it comes to a huge drop in the number of Windows users) to port DirectX to Linux, for instance.

Oh please, Linux people have been saying this for years - where are the sales figures of these mysteriously popular distros?
Linux is popular for web servers and database management, and that's pretty much it. You can't argue with market forces.

dabeav
02-19-2003, 08:13 AM
You can't rely on marketing numbers for Linux, because it's FREE. You download it "freely", so there are NO numbers. A lot of people do develop for Linux, though. How do I know? Well, someone out there is writing the cross-platform code I see posted all over the net.

SirKnight
02-19-2003, 08:51 AM
Heck, I don't run a server or do database management, and I run Linux. I use it because some of my computer science classes use it to compile our programs on, and I want to do my programs at home, and because I think it's a really good OS. I'd actually like to drop Windows and go pure Linux, but of course I can't do that for obvious reasons. The only thing I don't like about it is that installing stuff can sometimes be a pain. Like updating the kernel, or installing things that do stuff with the kernel, like that one package to read NTFS partitions, etc. It's having to log in as root and type in all of those cryptic commands. Bleh. I hope one day this will change and it will be easier to install stuff in Linux, like it is with Windows: run the executable and let it go, reboot or re-login and you're good to go. Linux IS getting more popular. It's getting easier to use, more graphical and Windows-like. Plus it's a way to escape the wrath of MS. :D

-SirKnight

zed
02-19-2003, 09:26 AM
Will D3D die if Sony chooses OpenGL for the PS3?

Of course not, but it would certainly hurt D3D.
FWIW, Sony is the largest games console maker by a LONG way.

Also, going by NPD figures, in 2002 in the USA more number-one PC games used OpenGL than D3D.

Which raises the other question:
if I want to make a PC game, should I choose D3D?

Personally, OpenGL seems to just work out of the box (with commercial games or apps) nearly every time; D3D has about a 30% failure rate on my machine. Better driver quality, most likely.

Anyone have an idea why this is?

HalcyonBlaze
02-19-2003, 09:27 AM
Hey everyone,

I think both OpenGL and DirectX are going to hang around. They both have their places in the graphics world, and each has its advantages and disadvantages. For example, just because we have C++ does not mean that Java is useless and will die off. I don't know about all of you, but if I had to make an application to go on the internet, I'd definitely use Java instead of ActiveX in C++.
What I'm trying to say is that I don't see a reason why the two can't just coexist. They'll be competition for each other: if one API pulls ahead and provides more functionality, the other will respond with more features. It's a lot like the ATI and NVIDIA war. If we had just one API, its makers would be free to update as slowly as they wanted.

- Halcyon

PixelDuck
02-19-2003, 10:22 AM
HalcyonBlaze: A good point. It's just a question of who takes the upper hand :\ The battle between nVidia and ATI is quite diverse, and I would never have believed that ATI would get the upper hand for the moment, as it seems to have done. Well, we never know, perhaps nVidia will take the lead again :\

Actually, to take part in the Linux debate, I've heard that a great part of the administrative side of the municipalities (don't know if that's the right word) in Finland is actually turning to Linux. But gaming on Linux is still in its infancy.

Btw, I didn't expect this to become some kind of a battle when I posted this topic :) I just thought people could tell what they think about the two and their future.

Anyways, cheers!

pkaler
02-19-2003, 10:32 AM
Originally posted by SirKnight:
The only thing I don't like about it is that installing stuff can sometimes be a pain. Like updating the kernel, or installing things that do stuff with the kernel, like that one package to read NTFS partitions, etc. It's having to log in as root and type in all of those cryptic commands. Bleh. I hope one day this will change and it will be easier to install stuff in Linux, like it is with Windows: run the executable and let it go, reboot or re-login and you're good to go.

You need to try a distro with better package management (Debian, Gentoo). If certain features of the kernel were compiled as modules, you don't need to reboot even when you install a driver. I haven't had to reboot the last bunch of times I've installed new nVidia drivers: I kill X, emerge nvidia-glx, emerge nvidia-kernel, and then I'm ready to go. The drivers are downloaded and installed automagically.

DIE RPM DIE. RedHat is actually holding back Linux on the desktop with their poor package management system.

Gentoo is definitely a power user's desktop, but the Portage system is quite cool. I haven't had much exposure to Debian and apt, but it looks quite cool also. Red Hat needs to adopt one of these package management systems and throw a pretty UI around it to make it more intuitive for new users.

dorbie
02-19-2003, 10:40 AM
Personally, OpenGL seems to just work out of the box (with commercial games or apps) nearly every time; D3D has about a 30% failure rate on my machine. Better driver quality, most likely.

Anyone have an idea why this is?


The reason is that OpenGL is much more clearly specified and hammered out before it gets accepted. In addition, extensions are approved by significant numbers of developers and voted on before being rubber-stamped, instead of being a deal between Microsoft and one developer forced on everyone else.

This means that stuff in OpenGL is either clearly core, clearly vendor-specific, or somewhere in the middle, which these days tends to mean good support on ATI and NVIDIA for most of the important stuff.

Other advantages, like lead time and early publication of specs to all implementors, help too.

All of this contributes to the quality and consistency of OpenGL implementations.

[This message has been edited by dorbie (edited 02-19-2003).]

knackered
02-19-2003, 10:46 AM
Originally posted by SirKnight:
Heck, I don't run a server or do database management, and I run Linux. I use it because some of my computer science classes use it to compile our programs on, and I want to do my programs at home, and because I think it's a really good OS. I'd actually like to drop Windows and go pure Linux, but of course I can't do that for obvious reasons. The only thing I don't like about it is that installing stuff can sometimes be a pain. Like updating the kernel, or installing things that do stuff with the kernel, like that one package to read NTFS partitions, etc. It's having to log in as root and type in all of those cryptic commands. Bleh. I hope one day this will change and it will be easier to install stuff in Linux, like it is with Windows: run the executable and let it go, reboot or re-login and you're good to go. Linux IS getting more popular. It's getting easier to use, more graphical and Windows-like. Plus it's a way to escape the wrath of MS. :D

-SirKnight

Sorry - forgot about universities.
Other than that, the Linux user counts are negligible - hobbyists and masochists.

richardve
02-19-2003, 11:41 AM
Originally posted by knackered:
Oh please, Linux people have been saying this for years - where are the sales figures of these mysteriously popular distros?
Linux is popular for web servers and database management, and that's pretty much it. You can't argue with market forces.

Oh please, stop bragging **** like that.

Don't bash (*grin*) the penguin or it will bite!

pkaler
02-19-2003, 12:17 PM
Originally posted by knackered:
Sorry - forgot about universities.
Other than that, the Linux user counts are negligible - hobbyists and masochists.




$ uptime
13:14:39 up 18 days, 12 min


Who's the masochist now?

masterpoi
02-19-2003, 12:46 PM
My uptime record is about three months (SuSE Linux). Win2K: two weeks.

MtPOI

Tom Nuydens
02-19-2003, 01:04 PM
"My uptime's longer than yours!" http://www.opengl.org/discussion_boards/ubb/rolleyes.gif

That's pathetic. Why don't you all just drop down your pants and see who's got the biggest d***?

-- Tom

pkaler
02-19-2003, 01:39 PM
It was pretty obvious that this was going to be a flamewar by just reading the subject.

But you can't say the flamewars aren't diverse here. I'm just waiting for this to degenerate into C vs C++, with Tom trying to convince everyone of the merits of Delphi. ;)

Tom Nuydens
02-19-2003, 01:53 PM
My dad can beat up your dad!

-- Tom

V-man
02-19-2003, 02:17 PM
Originally posted by PixelDuck:

Anyway, I meant arrays of four-component vectors, not a single four-component vector, that much is apparent. Like SetVertexConstantF(...), with which you can set an array of vectors, multiple registers in one call. Does this exist in OpenGL?

In the spec it says this wasn't added because it would mean adding more entry points, plus GL has old tricks up its sleeve that solve any future problem that comes up.

In this case, you can use display lists, for example.
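A hedged sketch of that trick (the register indices and table are made up): the parameter values get baked in when the list is compiled, so this only pays off for constants that don't change every frame.

GLuint constants = glGenLists(1);
glNewList(constants, GL_COMPILE);
for (GLuint i = 0; i < 8; ++i)
    glProgramEnvParameter4fvARB(GL_VERTEX_PROGRAM_ARB, i, table[i]);
glEndList();
/* later: one call replays all the uploads */
glCallList(constants);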


Side note: go ahead and ask questions such as the above, but asking which is better (GL or DX) is ridiculous for two reasons:

1. It's been asked on a weekly basis in the past.

2. Most people here are biased towards GL, so the answer is obvious to them.

Some informative links:
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/000903.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/003901.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/008241.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/000963.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/000915.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/008292.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/006316.html
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/003928.html

pkaler
02-19-2003, 03:26 PM
Originally posted by Tom Nuydens:
My dad can beat up your dad!


Hey there's no need for that.

Maybe I can help port some of your demos to Linux. I looked over the shadow mapping paper by Brabec and I'd like to see the theory in action.

Then with our superior kernel, compiler, and API we can rule the world with an iron fist. Muwahaha. We'll get a mini penguin clone that cannot get enough fudge. We'll force everyone to cough up one meeellion dollars or else mini-tux will mess you up with his shiny, bumpmapped teapot. Muwahahahaha!!! Muwahahahah!!! Muwahahaha!!!

Alright, ignore that last paragraph. I've had too much coke.

SirKnight
02-19-2003, 05:10 PM
Originally posted by PK:
You need to try a distro with better package management (Debian, Gentoo). If certain features of the kernel were compiled as modules, you don't need to reboot even when you install a driver. I haven't had to reboot the last bunch of times I've installed new nVidia drivers: I kill X, emerge nvidia-glx, emerge nvidia-kernel, and then I'm ready to go. The drivers are downloaded and installed automagically.

DIE RPM DIE. RedHat is actually holding back Linux on the desktop with their poor package management system.

Gentoo is definitely a power user's desktop, but the Portage system is quite cool. I haven't had much exposure to Debian and apt, but it looks quite cool also. Red Hat needs to adopt one of these package management systems and throw a pretty UI around it to make it more intuitive for new users.

I'm using Red Hat 8.0. I'm still pretty new to Linux, so I haven't heard of these package management systems you speak of. Well, I recognize Debian, but I never knew what it was. :) Is there any way I can use one of these systems, like Debian or whatever, with my Red Hat 8? It sure seems like it would make installing stuff easier.

I seem to be having a problem logging in as root in Red Hat. When it comes to the point where it says "login:" or something like that, a few seconds later X starts and KDE loads up, not letting me type in root and then my password. This is off-topic, but how the heck do I kill X so I can log in as root in Red Hat 8 to install things like drivers or update the kernel (oh the horror :D)?

-SirKnight

SirKnight
02-19-2003, 05:12 PM
Originally posted by PK:

Alright, ignore that last paragraph. I've had too much coke.

Coke eh?

And this, children, is why you DON'T do drugs. :D

hehe

-SirKnight

SirKnight
02-19-2003, 05:17 PM
Originally posted by Tom Nuydens:
"My uptime's longer than yours!" http://www.opengl.org/discussion_boards/ubb/rolleyes.gif

That's pathetic. Why don't you all just drop down your pants and see who's got the biggest d***?

-- Tom


LOL! That sounds about like something us guys would do. ;)

-SirKnight

FXO
02-19-2003, 06:06 PM
Please don't do it!

EDIT:
DOH! This was supposed to go one step up and was directed at SirKnight & friends ;)

[This message has been edited by FXO (edited 02-19-2003).]

pkaler
02-19-2003, 06:06 PM
Alright this thread is pretty off-topic already so here goes.


Originally posted by SirKnight:
I'm using Red Hat 8.0. I'm still pretty new to Linux, so I haven't heard of these package management systems you speak of. Well, I recognize Debian, but I never knew what it was. :) Is there any way I can use one of these systems, like Debian or whatever, with my Red Hat 8? It sure seems like it would make installing stuff easier.


I don't know of a way. A quick search on Google for "redhat apt-get" revealed this: http://apt-rpm.tuxfamily.org/

First time I've heard of it, though.

EDIT: Just ran into this: http://irtfweb.ifa.hawaii.edu/~ao/software/aoic/redhat_updates.html





I seem to be having a problem logging in as root in Red Hat. When it comes to the point where it says "login:" or something like that, a few seconds later X starts and KDE loads up, not letting me type in root and then my password. This is off-topic, but how the heck do I kill X so I can log in as root in Red Hat 8 to install things like drivers or update the kernel (oh the horror :D)?


TMTOWTDI

You should be able to choose a console login rather than a graphical login somewhere in your KDE menu.

Otherwise, go for an "init 3" (as root). I believe RedHat uses runlevel 3 for multi-user, console-only mode. Check /etc/inittab to be sure.

You could change the line in /etc/inittab that has X set to respawn to once. Then "ps -x --forest", then "kill xxxx" where xxxx is the process number of the X server.

You could also hit Ctrl-Alt-F1 to switch to a virtual console and ignore the fact that X is running. The drivers might complain, though, when you try to install them.

[This message has been edited by PK (edited 02-19-2003).]

pkaler
02-19-2003, 06:08 PM
Originally posted by SirKnight:
Coke eh?

And this, children, is why you DON'T do drugs. :D

hehe


I meant Coca-Cola. But I'm not sure which one is worse.

richardve
02-19-2003, 06:09 PM
Originally posted by SirKnight:
This is off-topic, but how the heck do I kill X so I can log in as root in Red Hat 8 to install things like drivers or update the kernel (oh the horror :D)?

- Open up a terminal/console (Konsole is good)
- Type, without quotes ('su' to root first): "/sbin/telinit 3"
- Wait a few seconds; X will be killed and you'll be dropped into the mighty shell (probably bash in your case)
- Do whatever you need to do to update those drivers (read the readme! and learn to use vi(m)!)
- When you're done, type this without quotes: "/sbin/telinit 5; exit"
- Wait a few seconds; X will be started, and so will GDM, KDM or just your DE/WM.

No need to reboot the system. Eat that, Bill! :D

[/offtopic]

SirKnight
02-19-2003, 06:39 PM
Originally posted by PK:
I meant coca cola. But I'm not sure which one is worse.


Ya, I know, just couldn't resist. :D

And thanks PK and richardve for the WAY off-topic Linux help. ;) I should probably get a book on Linux, specifically Red Hat 8. One thing I still need to do is somehow downgrade my version of g++ to an older version. The one I have, which I think is the second-newest or something, is a big pos. The only program I can get working correctly on there is "hello world." :(

OpenGL FOREVER!! :D

-SirKnight

zed
02-19-2003, 07:59 PM
>> I've heard that a great part of the administrative side of the municipalities (don't know if that's the right word) in Finland is actually turning to Linux <<

I know the German government is very supportive of KDE.

Some recent good news for Linux (from memory):
* in 2001 desktop (not server) growth was something like 20%, while Windows managed something like 2% (this is why MS is very worried by Linux)
* in 2002 Walmart (biggest USA department store) started selling computers with Linux preinstalled
* Mandrake seems to be near dead, which is a good thing (using Mandrake 8.2 at the mo, btw)
* KDE + GNOME will talk a bit more with each other.

Personally, though, at the moment Linux (even Lindows) is still not ready for the general populace.

john
02-19-2003, 08:38 PM
Richard Stallman once said: vi is not a sin, it's a penance.

john
02-19-2003, 08:49 PM
SirKnight -

I think the problem you're having with whatever version of g++ you're using is that g++ now conforms to the official C++ standard. From my understanding, C++ has been slowly evolving and has only ~very recently~ been locked to THE official ANSI C++ coding standard. I used to have some of the changes in my head, but now I can't think of anything. ;-(

Anyway, you could try using kgcc instead of g++. kgcc tends to be an older version kept around to compile the kernel.

cheers,
John

masterpoi
02-19-2003, 09:18 PM
pfff, anyways my uptime IS longer ;-)

MtPOI

pkaler
02-19-2003, 09:32 PM
Originally posted by SirKnight:

One thing I still need to do is somehow downgrade my version of g++ to an older version. The one I have, which I think is the second-newest or something, is a big pos. The only program I can get working correctly on there is "hello world." :(


I'm going to have to agree with john. I'm running gcc 3.2.2 and everything works pretty dandy. Stuff like Koenig lookup works according to the spec. Make sure you use the standard headers correctly: #include <iostream> instead of #include <iostream.h>. All the iostream stuff is in the std namespace.
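For example, a minimal conforming program (just a sketch) looks like this:

#include <iostream>   // standard header, no .h

int main()
{
    std::cout << "hello" << std::endl;   // iostream names live in std
    return 0;
}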

gcc has a new parser that fixes a bunch of bugs, and precompiled headers are (finally!!!) coming pretty soon.

If you're still not satisfied, you can try Intel's compiler. It's available on Linux for personal use.

PixelDuck
02-20-2003, 03:58 AM
>> "V-man: In the spec it says that this wasn't added because it would mean having to add more entry points plus GL has old tricks up it's sleeve that solves any future problem that comes up." <<

Duh! Now I remember that part, I read it through some time ago and seem to have forgotten it. Sorry http://www.opengl.org/discussion_boards/ubb/smile.gif But I still hope they will add it later on. And one thing is BADLY lacking: all brancing (and looping, for that matter) http://www.opengl.org/discussion_boards/ubb/frown.gif

Edit: But, NV_vertex_program has a plural version.

Edit: A.t.m. the lack isn't that bad, since DX has introduced it only recently in version 9.0. But for the future it might even become essential. Reducing shader objects, for instance.

PK: Is there a compiler by AMD? Since I'd prefer AMD optimized compilation :\

Cheers!

[This message has been edited by PixelDuck (edited 02-20-2003).]

DJSnow
02-20-2003, 06:29 AM
@PixelDuck:

>> PK: Is there a compiler by AMD? Since I'd
>> prefer AMD-optimized compilation :\

Yes, there is a 3DNow! SDK available, somewhere at amd.com/developers I think (not sure if that is the correct URL, but go to the site and look around).
You can find the Intel optimization routines on intel.com, called "the Intel Performance Primitives" or something like that; on intel.com there is also a SIMD-optimized matrix/vector library that is fully usable without big modifications - use the search button.

SirKnight
02-20-2003, 07:35 AM
Ok, well, I have gcc 3.2.0 I think. I have tried both #include <iostream.h> and the other way, #include <iostream> with using namespace std;, and my programs still don't work. They are simple searching and hashing algorithms I am doing in my data structures class atm. See, these programs work perfectly if I trace the code myself, use VC++ 6.0 and .NET, or whichever version of gcc is on Red Hat 7.3, I think it's 2.8 or something like that. My code is standard C++, so I don't see what the problem is. Maybe I'll try to get the latest version and try that before downgrading. I would use another compiler, but I'm not sure if my professor will let me use something other than g++. It's what he tells us to use. Maybe I'll go talk to him about that and see if I can use a different Linux compiler.

-SirKnight

Ysaneya
02-20-2003, 07:37 AM
Flame wars, yeah!

Who's up for a small Iraq war vs peace debate ?

Y.

pkaler
02-20-2003, 08:28 AM
Originally posted by SirKnight:
Ok, well, I have gcc 3.2.0 I think.


Try "gcc -v" to figure out which version you are using.



I have tried both #include <iostream.h> and the other way, #include <iostream> with using namespace std;, and my programs still don't work. They are simple searching and hashing algorithms I am doing in my data structures class atm. See, these programs work perfectly if I trace the code myself, use VC++ 6.0 and .NET, or whichever version of gcc is on Red Hat 7.3, I think it's 2.8 or something like that.


I believe it was gcc 2.96 or something, IIRC. I also remember that version of gcc being *VERY* buggy.

Are you having compile-time problems or run-time?

Use standard headers (#include <iostream>). Don't use "using namespace std;". Prefix everything, as in "std::cout"; the using declaration kinda makes the use of namespaces pointless. You shouldn't have to use std:: for algorithms because of Koenig lookup.
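A tiny sketch of that last point (the function and names are just for illustration): argument-dependent (Koenig) lookup searches the namespaces of a call's arguments, so an unqualified call can still find a std function.

#include <vector>

void example()
{
    std::vector<int> a, b;
    swap(a, b);   // unqualified, but Koenig lookup finds std::swap
                  // because the arguments' type lives in namespace std
}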



My code is standard C++, so I don't see what the problem is. Maybe I'll try to get the latest version and try that before downgrading.

I don't recommend downgrading. gcc below 3.0 is pretty buggy, and gcc 3.0 and 3.1 are not compatible with other versions. I haven't had any problems so far with gcc 3.2.2.

You can try Comeau online to test snippets: http://www.comeaucomputing.com/tryitout/

Also, try out www.stlport.org (http://www.stlport.org) and STLFilt: http://www.bdsoft.com/tools/stlfilt.html

[This message has been edited by PK (edited 02-20-2003).]

richardve
02-20-2003, 09:05 AM
Oh no! Please don't downgrade GCC; 3.2 is much better than 2.96 (or whatever) and earlier versions.

(And like was said before, GCC 3.whatever will get support for precompiled headers somewhere in the near future. Finally! :))

DJSnow
02-20-2003, 09:11 AM
Wasn't the thread about the future of OpenGL???

SirKnight
02-20-2003, 09:36 AM
Originally posted by DJSnow:
Wasn't the thread about the future of OpenGL???

Yeah, but this has already been done to death. Off-topic is more fun anyway. :D

Back to off-topic. :D I'm having runtime problems. It's like most of my variables don't get changed, even though they should be. Like, I had some globals counting the number of comparisons, number of successful searches, etc. They clearly should have had something other than 0 in them, but for some reason, compiled with 3.2 or whatever version comes with RH8, their contents never changed; they stayed at my initial value of 0. And no, they are not constants. Same with the array that held my numbers: the order should have been different at the end of the program, but it wasn't. Like I said, by looking at the code it should work; with the RH 7.3 version of g++ it works, and with VC++ 6.0 and .NET it works. Something is gimped!

-SirKnight

pkaler
02-20-2003, 10:01 AM
Originally posted by SirKnight:

Back to off-topic. :D I'm having runtime problems. It's like most of my variables don't get changed, even though they should be.

Hmm, works in VC6 and .NET but not gcc. I'd blame RedHat; they have their own patched version of gcc. I've had problems with RedHat's 2.96 with code that looked kinda like this:




void foo( float f )
{
    double d = f;
}


The value of d would not change, no matter what I passed in for f. I had to cast f explicitly:




void foo( float f )
{
    double d = static_cast<double>(f);
}

SirKnight
02-20-2003, 02:38 PM
Hmm, that sounds exactly like my problem: some of my variables won't change when they should. Maybe if I get the newest dist of gcc from the GNU site (I just checked, I have 3.2), it will work fine and get rid of whatever crap RedHat did. Or maybe my copy just got corrupted somehow, who knows.

-SirKnight

john
02-20-2003, 03:40 PM
Hello,

I'm using g++ v2.96 and not having any problems...?

I #include <iostream> (not iostream.h)...

<shrugs>

cheers,
JOhn

MichaelK
02-21-2003, 03:48 AM
Originally posted by PK:

void foo( float f )
{
    double d = f;
}


What is foo? I can't find it in an English dictionary. Is it some sort of slang? Every time I read some example code on the net or some tutorial, there always has to be a function named foo!
Or at least, in place of "Hello World!" it says "Foo Bar".

Tom Nuydens
02-21-2003, 04:42 AM
http://info.astrian.net/jargon/terms/f/foo.html

-- Tom

MichaelK
02-21-2003, 05:50 AM
Funny... You Americans...

DJSnow
02-21-2003, 07:23 AM
@MichaelK:

>> What is foo? I can't find it in an English
>> dictionary.

Install the Babylon translator,
with all the internet/computer-slang dictionaries (the tax, technical-terms, mathematical-terms, RFC, IP-port and top-level-domain dictionaries are also recommended) and all the other dictionaries needed for translating into <insertfavoritelanguagehere>.

This will help you understand the (most of the time really-not-good) internet/forum/newsgroup slang...

Give it a try - it's really good.

SirKnight
02-21-2003, 09:23 AM
I noticed that article says FUBAR means "****ed Up Beyond All Repair." Funny how it doesn't also mention that it can mean "****ed Up Beyond All Recognition." The latter is usually how I know it. Both ways are correct, though. Actually, I think they say "...Recognition" in Saving Private Ryan; that's how I figured out what FUBAR meant back then. :D

-SirKnight

V-man
02-21-2003, 12:11 PM
Originally posted by PixelDuck:

Duh! Now I remember that part, I read it through some time ago and seem to have forgotten it. Sorry :) But I still hope they will add it later on. And one thing is BADLY lacking: any branching (and looping, for that matter) :(

Edit: But NV_vertex_program has a plural version.
[This message has been edited by PixelDuck (edited 02-20-2003).]

NV_vertex_program
NV_vertex_program1_1
NV_vertex_program2

The last one represents the future version of ARB_vp.

All that branching, looping, subroutine-executing hot stuff is in there. I'm still reading it. Very long stuff, isn't it?

Anyway, you have to remember the ARB is made up of many companies collaborating. They are very careful about what gets accepted.

Besides, the Radeon 9700 is old in terms of vp and fp. When the newer vp and fp get accepted, the R9700 won't be able to handle them. I'm betting the GFFX already has the necessary circuits laid out, and I'm sure they have beta drivers and are experimenting with NV_f_p_2_1 or whatever it is.

That last paragraph is speculation on my part.

Julien Cayzac
02-22-2003, 02:38 AM
my guess about the future of GPUs & opengl:

- better programmability (the usual one)
- DMA access (like on the P10)
- HDM (like on the Parhelia)
- HOS coming back and generalized
- ability to generate vertices onboard instead of uploading them (primitive_programs?)
- hardware Perlin noise (Ken Perlin has a patent on a hardware implementation)

...hum, I don't see anything else. Anyway, having all of the above would keep me happy for two or three years, at least... :D

Julien.

pkaler
02-22-2003, 12:08 PM
To add to the list:

- deeper colour buffers
- better floating point throughout the pipeline

PixelDuck
02-23-2003, 10:14 AM
V-man: I took a look at the specs and they still say that vertex programs are executed without branching and/or looping :\ Well, I haven't read them all the way through (I have an R9700, so I can't test the NV extensions anyway). Damn, so the R9700 can't use branching at all? That is, not in OGL :\

I'm seriously considering buying an NV35 when it comes out :\ Well, that remains to be seen.

Cheers!

Antorian
02-23-2003, 01:27 PM
What a subject :)

What I don't understand is why so many people want one API to become the only one??

I use OpenGL and D3D a bit, and both seem good....
Why only one?
If there were 3 or 4... etc... it would be a pleasure to make your own choice and apply it :)
Isn't it?

Ok I know, I know... I'm French, ok... but....
why not?
Those two graphics libraries work fine and can be improved.
So let's go... use OpenGL, use D3D, whatever you want, and please still use your pencil to draw :)

Sorry for this "Filozofikal" mind of a desperate crazy programmer :D

rgpc
02-23-2003, 04:26 PM
Originally posted by SirKnight:
I noticed that article says FUBAR means "****ed Up Beyond All Repair." Funny how it doesn't also mention that it can mean "****ed Up Beyond All Recognition." The latter is usually how I know it. Both ways are correct, though. Actually, I think they say "...Recognition" in Saving Private Ryan; that's how I figured out what FUBAR meant back then. :D

-SirKnight


Perhaps Recognition is the medical term and Repair is the technical term. I've heard both, and Recog. tends to relate to a person while Repair tends to relate to equipment, etc. (But they're interchangeable.) :)

Oh, and I'm really glad we're discussing the future of GL again - it's been almost a week since the last thread...

MZ
02-23-2003, 04:51 PM
What I don't understand is why so many people want one API to become the only one??

I personally think the existence of 2 APIs is just as useful for the industry as the existence of PAL and NTSC, meters and inches/feet/miles, grams and pounds, etc. That is, in some cases it doesn't bother you at all, in other cases it's a pain in the ass, but it is _never_ a good or useful thing.

If there were 3 or 4... etc... it would be a pleasure to make your own choice and apply it :)
Isn't it?

I imagine driver developers have enough trouble with drivers for 2 APIs, and you would like them to have 3 or 4 times the bugs to fix :p

Talisman
02-23-2003, 10:48 PM
Originally posted by john:
I think the problem you're having with whatever version of g++ you're using is that g++ now conforms to the official C++ standard. From my understanding, C++ has been slowly evolving and has only ~very recently~ been locked to THE official ANSI C++ coding standard. I used to have some of the changes in my head, but now I can't think of anything. ;-(


And it only took the gcc maintainers 6 years to finally "support" the standard. My experience with gcc has been that after 2.8.1 it has just been garbage. Heck, even 2.8.1 sucked on many platforms (especially SPARC). A certain hardware/OS vendor's abominable mangling of gcc for their latest OS is even worse than the main branch as regards C++ "support". MS Visual C++ and Metrowerks' C++ compilers are far better. Too bad Metrowerks has apparently dropped their project to port their compiler to Linux.




Anyway, you could try using kgcc instead of g++. kgcc tends to be an older version kept around to compile the kernel.




Yep, kgcc these days is usually one of the pre-2.8 stable gcc releases, since the kernel can't be compiled with experimental code generators.

gcc's quality of late has just been so shoddy that I'm embarrassed to have once been a fan of it.

As for Linux itself, I used Linux exclusively for about 8 years or so. I finally got tired of having to fight with my machine to do things that computers are supposed to make easier (this is why Linux is still mainly used by hobbyists and fringe elements). I also got tired of the way these "open source" folks who scream at you about "freedom" and so forth are so adamant (a) about forcing their views down everyone else's throats, and (b) that they never ever code bugs, and therefore any fixes or enhancements submitted to them are the work of Satan. I'm surprised anything gets done in the "open source community" given the fascist stance so many of these folks take. My other boxes don't get any use.

But that's just me.

Okay, off my soapbox now. :)

Oh, and please send flames to /dev/null. ;)

velco
02-23-2003, 11:28 PM
Speaking of performance, is there something like EXT_compiled_vertex_array, NV_vertex_array_range, NV_vertex_array_range2 or ATI_array_object in D3D ?

~velco

Talisman
02-23-2003, 11:31 PM
Okay, now for my on-topic post. http://www.opengl.org/discussion_boards/ubb/smile.gif

The future of OpenGL... I think the fact that the OpenGL specification has been revised very little, overall, in what? over 10 years now?, says quite a lot for its stability and usefulness. As someone else pointed out, there are lots of "old tricks" in the existing spec that can be used rather creatively.

I'm pretty excited about the ARB's ratification of the vertex_program and fragment_program extensions (and even more so that both Nvidia's and ATI's latest drivers support them, though fragment_program is mysteriously absent from Nvidia's 41.09 release, while ATI has both), which is partly due, no doubt, to pressure from the competition of Microsoft's D3D 8 and 9 features. However, there is probably more credit due to the games developers for pushing for these features in OpenGL. I know there are a great many developers who will most likely always refuse to use D3D for various reasons.

The main reason against D3D is portability, of course. If your application absolutely must run on the widest range of platforms possible with the minimum amount of re-coding, then the choice is easy: OpenGL. If you're only concerned about Windows, it's not so simple, since you no longer have the portability issue.

Another point strongly in favor of OpenGL: D3D is still, for all the work MS have done to simplify the API and make it easier to use than it used to be, a horribly complicated API to get up and running with. The amount of setup code you need to write -- even still -- is more than you really ought to have to do to get a darn library ready to work with. OpenGL is still the clear winner here, particularly since it's just as easy to use from C as from C++. D3D can be called from C, but since it was designed as a C++ API and implemented in C++, you have to explicitly reference the vtable of the object whose methods you're calling. OpenGL doesn't make you do that, since it was designed as a C API, so its usage is identical either way.
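For instance, with a hypothetical already-created device pointer (BeginScene really is a method of the D3D device interface):

// C++: the compiler goes through the vtable for you
device->BeginScene();

// C: you dereference the vtable yourself and pass the object explicitly
device->lpVtbl->BeginScene(device);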

The biggest thing D3D has going for it, in my opinion, is that it's one part of a suite of multimedia APIs. While you can use any part of the suite independently of the rest, I think a lot of developers who use DirectX exclusively do so because, after all the setup work to get the DirectX libraries ready to work with, they probably want to "get their money's worth", so to speak.

I think that OpenGL will always be with us, and will continue to gain features as the industry drives it to. D3D will continue to improve, with new features added there, too, and that will also drive some of the new features OpenGL will gain in the future. But, keep in mind that recent versions of D3D had lots of stuff pruned from them, while OpenGL has never (to my knowledge) had official, "core" features removed. The graphics card vendors' driver implementations will also continue to improve, and all of us will improve our code, too.

I think there will always be some disagreement in the developer community as to which API is "better". Which is "better" is really a matter of opinion -- I happen to prefer OpenGL as a matter of course, while many of my colleagues prefer D3D. I'm probably biased because I learned OpenGL first, and when I first looked at D3D, it was really awful. That first impression is really difficult to overcome. Besides which, it's only been recently that I've been doing work on the Windows platform (everything before was on one version of UNIX or another), and, since OpenGL is there and I know it already, I'm comfortable continuing to use it. I've played a bit with D3D, but I don't think I'll be making the switch any time soon.

Ysaneya
02-24-2003, 12:26 AM
Velco: yeah, that's one of the most basic things in D3D. When you create a vertex or index buffer, you specify the pool it's going to live in. The default pool uses video or AGP memory (static/dynamic VBs/IBs).

Talisman: agreed about portability. I'll also add my contribution: ugly, long type names. I disagree about the setup length: you can do very complex stuff if you want, but if you just want to get it up and running quickly, it's even faster than OpenGL. The C vs C++ thing is obviously true, but it can be seen as both an advantage and a disadvantage (i.e. you've got an OOP API).

Y.

PixelDuck
02-24-2003, 03:02 AM
Originally posted by Antorian:
What a subject :)

What I don't understand is why so many people want one API to become the only one??

I use OpenGL and D3D a bit, and both seem good....
Why only one?
If there were 3 or 4... etc... it would be a pleasure to make your own choice and apply it :)
Isn't it?

Ok I know, I know... I'm French, ok... but....
why not?
Those two graphics libraries work fine and can be improved.
So let's go... use OpenGL, use D3D, whatever you want, and please still use your pencil to draw :)

Sorry for this "Filozofikal" mind of a desperate crazy programmer :D

Antorian: To tell you the truth, I too want at least two APIs to be available, widely available. Since I do both D3D and OpenGL, I have no quarrel with D3D, not at all. Well... one thing: to tell you the truth, I don't like the COM style of it :\ At least I hope that both survive, and that perhaps even some new ones will emerge. The reason I asked the question was that I just wanted to hear people's opinions as to which MIGHT become dominant. I hope it will be neither, but dominance tends to occur; it happened with internet browsers, at least on Windows. And since we're talking about Microsoft's DX, I get a déjà vu feeling :\ Anyway, I like both, I use both, and I hope both will survive.

Cheers!

velco
02-24-2003, 05:01 AM
Originally posted by Talisman:
And it only took the gcc maintainers 6 years to finally "support" the standard.

Huh? The C++ standard is from 1998, the C standard from 1999. What six years?



Yep, kgcc these days is usually one of the pre-2.8 stable gcc releases, since the kernel can't be compiled with experimental code generators.

Way off. 2.95.x, RedHat's 2.96, 3.0, 3.1 and 3.2.x compile 2.5.x kernels just fine.



gcc's quality of late has just been so shoddy that I'm embarrassed to have once been a fan of it.


At least the SPEC results (http://www.suse.de/~aj/SPEC) do not agree with you.

~velco

richardve
02-24-2003, 06:16 AM
Originally posted by Talisman:
gcc's quality of late has just been so shoddy that I'm embarrassed to have once been a fan of it.


Are you sure it's not your code that's shoddy?

GCC 3.2 is a bit picky on some stuff, but it's not a bad compiler at all.
(MSVC6 is/was worse)

DJSnow
02-24-2003, 07:11 AM
@richardve / @all:

BTW: has anyone already tested the new Intel C++ 7.0 compiler? In an article they said it is up to 30% faster than gcc (and that means, reading between the lines: xxxx% faster than MSVC??).
I think there is an eval version available; is it worth giving it a try?
Any experience?

velco
02-24-2003, 07:17 AM
Originally posted by DJSnow:
@richardve / @all:

BTW: has anyone already tested the new Intel C++ 7.0 compiler? In an article they said it is up to 30% faster than gcc


It is not bad. Not at all.

I ran some integer benchmarks and it appeared somewhat worse than gcc 3.2.1.

I also ran some floating point benchmarks (a couple of matrix multiplication algos). It has a distinct advantage here, especially vectorizing loops. It was worse than handcrafted GCC builtins code (e.g., __builtin_ia32_mulps, etc.), but then again one could code exactly the same thing with ICC builtins.

~velco

EDIT: In any case, changing the algorithms to optimize cache usage had far more impact than either compiler's optimizations.
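To illustrate (a sketch only, with made-up sizes, assuming N is a multiple of the tile width T): a tiled matrix multiply keeps one block of each matrix in cache while it is being worked on, instead of streaming whole rows and columns through repeatedly.

for (int ii = 0; ii < N; ii += T)
    for (int kk = 0; kk < N; kk += T)
        for (int jj = 0; jj < N; jj += T)
            for (int i = ii; i < ii + T; ++i)      /* one TxT block */
                for (int k = kk; k < kk + T; ++k)
                    for (int j = jj; j < jj + T; ++j)
                        C[i][j] += A[i][k] * B[k][j];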


[This message has been edited by velco (edited 02-24-2003).]

pkaler
02-24-2003, 11:58 AM
icc is supposed to be better than gcc on (surprise, surprise) P4s: http://www.coyotegulch.com/reviews/intel_comp/intel_gcc_bench2.html

But supposedly gcc is catching up: http://kerneltrap.org/node.php?id=583 http://people.redhat.com/dnovillo/spec2000/gcc/global-run-ratio.html

I just emerged icc. I'll give it a try (if it works with my makefiles). Don't expect as thorough a benchmark as the ones above.

Humus
02-24-2003, 02:21 PM
Originally posted by velco:
Speaking of performance, is there something like EXT_compiled_vertex_array, NV_vertex_array_range, NV_vertex_array_range2 or ATI_array_object in D3D ?

~velco



How about vertex buffers?

john
02-24-2003, 04:13 PM
to sirknight:

As of yesterday I had gcc 2.96; now I have gcc 3.2, and my s/w that compiled perfectly happily under the old gcc dies horribly when I try recompiling under 3.2. The reason my code was dying is that the standard library stuff is bound under the std namespace, and that namespace no longer appears to be pulled in by default. My code compiled after I added

using namespace std;

after I #include'd some standard library stuff (like <iostream>, for instance). So give that a shot and see if gcc 3.2 is more amenable to your code.

cheers
John

Talisman
02-24-2003, 05:44 PM
Originally posted by richardve:
Are you sure it's not your code that's shoddy?

GCC 3.2 is a bit picky on some stuff, but it's not a bad compiler at all.
(MSVC6 is/was worse)

I admit I haven't tried 3.2. Of the gcc versions I have played with post-2.8, they've mostly been junk.

Whoever posted the C++ spec date of 1998: my copy direct from ISO says 1997... could be a typo, I suppose. It's now 2003, and there was a post saying gcc "only recently" came up to spec on C++, so without any other definition of "recently" (as well as 3.1 blowing major chunks on C++ "Hello World" when I tried it), I made the estimate of 6 years.

I don't want to get into any arguments over this, so if the current spec says 1998, cool, I'll assume my copy has a typo. As for gcc's quality, well, I gave up on Linux a couple of years ago for various reasons, so except for Mac OS X (where gcc really severely sucks at C++), I don't have any reason to muck with gcc (and Metrowerks is better on the Mac anyway). The Intel compiler is pretty good on Windows, and Microsoft's compiler is decent as well. I haven't yet run into any of the conformance problems that others keep harping on in the MS compiler, so I can only assume these problems either aren't particularly common or are just inconsequential. In any case, I can't argue that they don't exist, only that I haven't run into them. Besides, does anyone really use partial template specialization? I'm using primarily VC7, not 6, so I can't comment on 6 vs gcc.

The SPEC benchmarks have been used for both sides of all arguments, so I'm just not going to go there.

Incidentally, I agree with whoever posted the bit about RedHat's modifications to gcc making it pretty much useless. I always had to get the unmodified source and bootstrap the compiler myself. It's a royal pain in the butt on non-Intel systems, but worth it if you're serious about developing on Linux.

SirKnight
02-24-2003, 05:55 PM
Yeah, I'm going to have to get a REAL copy of gcc and overwrite this RedHat-modified one I have. The install process for gcc on Linux looked kinda crazy, definitely not a "click setup and watch it go" kind of thing. :(

Too bad I can't run VC.NET in Linux. :D I like MS compilers a lot. I have never had any problems with them: easy to use, the best debugger there is, etc. I think they are great. To me, Visual Studio is the best thing MS has. But that's just me. :)

-SirKnight

Julien Cayzac
02-25-2003, 09:03 AM
Originally posted by SirKnight:
The install process for gcc on Linux looked kinda crazy, definitely not a "click setup and watch it go" kind of thing.

Don't write "Linux" when you should write "RedHat".

RPM based distros have been known to be crap. Whoever use one of them can't argue.

I have no problem installing gcc on a regular basis, I just type "emerge -u gcc" and I watch it being upgraded (every month or so). I don't need to do any other things (such as the mess with alternative versions on Mandrake or RedHat).

Julien.

SirKnight
02-25-2003, 11:17 AM
Well, regardless of whether it's RedHat or not, it's STILL Linux. I know there are differences between them all, but ultimately it's still Linux.

Anyway I figured out why my programs for my class weren't working. Ok this is what I was doing:

ifstream file;
file.open( filename );
// ... use the file ...
file.close();
file.open( differentfile );
// ... use the file with my data structure ...
file.close();
// ... more stuff ...

Ok, turns out that it wasn't opening my second file there, when it SHOULD have been. I have done programs like this on many different compilers before and it works fine, but not with gcc 3.2. What I had to do was create another object of ifstream type called file2. Then it would open my second data file, and the program worked perfectly after that. Now WTF is up with this? I should not have to create an object of ifstream or ofstream, depending on what I'm doing, for each file I plan on opening. Once I finish with one file, I should be able to use the same ifstream object to open something else. Weird.

BTW, I just finished d/l'ing the latest gcc, 3.2.2. I d/l'ed the whole package, a tar.bz2 at 19 MB. Ok, so now I have a folder called home/myusername/gcc322/gcc.3.2.2 (well, I'm not sure exactly about that last folder name, it's something like that anyway). That last folder contains all the gcc stuff. Now how the heck do I upgrade my current gcc with this new one? The install docs on the gcc.gnu website don't help at all; all they do is confuse me. :D I tried doing what they say, but it didn't work, and I don't know wtf they are talking about anyway. So if someone can help me out here, it would be great. If you want to email me this instead of keeping this thread growing, go ahead. Doesn't matter to me. :D

-SirKnight

velco
02-25-2003, 11:52 AM
SirKnight:

I. The following program works just fine with "c++ (GCC) 3.2.2 20021228 (prerelease)":

#include <iostream>
#include <fstream>

int
main()
{
    int i;
    std::ifstream f;

    f.open ("foo");
    f >> i;
    f.close ();

    std::cout << i << std::endl;

    f.open ("bar");
    f >> i;
    f.close ();

    std::cout << i << std::endl;

    return 0;
}
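One caveat, assuming an extraction ever runs into end-of-file (e.g. the file has no trailing newline): under the 1998 standard, close() does not reset a stream's error state, so a sticky eofbit/failbit from the first file makes every later extraction fail silently and leave its variable unchanged. The old pre-standard library cleared the state for you. A sketch of the fix:

f.close ();
f.clear ();   /* reset eofbit/failbit before reusing the stream object */
f.open ("bar");

Whether a given run trips on this depends on exactly how the data files end, which would also explain the same code working on one setup and not another.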

II. Do not mess with the package manager on your distro, i.e. do not override your distro's gcc installation; either uninstall it (but only after building the new compiler) or leave it alone.

III. Building GCC;

a) Assuming your sources are in gcc-3.2.2, choose some working directory, e.g. ~/build/gcc. It is important that your working dir is different from the GCC source dir and is not a subdirectory of it.

b) Decide where you want the new GCC installed.
/usr/local, /usr/local/gcc and ~/opt/gcc are common choices. That dir is hereafter referred to as $prefix.

c) go to your work dir

cd ~/build/gcc

d) configure GCC build

$srcdir/gcc-3.2.2/configure --prefix=$prefix --enable-languages=c,c++

e) build GCC
make bootstrap

f) install GCC
make install

g) add the $prefix/bin directory to the PATH

h) enjoy

~velco

Julien Cayzac
02-25-2003, 11:54 AM
Originally posted by SirKnight:
Well, regardless of whether it's RedHat or not, it's STILL Linux.
Linux is just a kernel. RedHat Linux is an operating system consisting of a Linux kernel and a whole bunch of GNU packages. Gentoo Linux (my current OS) is a different operating system, as are Debian Linux, etc.



Now how the heck do I upgrade my current gcc with this new one?

I'm not into RedHat, so I don't know the details, sorry.
However, I don't know if you actually want to upgrade the version of gcc that came with your distribution: gcc's C++ ABI has been a moving target and only recently got somewhat stabilized. Replacing your gcc with the latest one might break many things in your OS.
Play it safe and install this version beside the other one (and maybe create a small script to update the /usr/bin/gcc etc. links. Maybe RedHat uses the /etc/alternatives scheme?).

The good point of Gentoo is that we have one and only one version of gcc at a time (the bad(?) point being that we must recompile everything with the new gcc if the ABI changed ;) )

Julien.

SirKnight
02-25-2003, 02:28 PM
Cool, I finally got it to compile and stuff. It sure did take a long time to compile, at least 45 min. :D So I added /usr/local/gcc322/bin to the profile file like so: PATH=$PATH:"/usr/local/gcc322/bin", but when I type g++ -v I still get 3.2. In order to use 3.2.2 I have to type /usr/local/gcc322/bin/g++ -o foo bar.cpp. I guess I have to make a script or something like Julien said; the bad thing is I don't know how. :D

The bad thing is that my program, just like the sample one posted above, doesn't work right even with 3.2.2 for me. I made a file called foo and put a 5 in it. Made a file called bar and put a 10 in it. When I run that program above I get:
5
5

I checked for f.fail(), but it didn't seem to fail, since my cout saying "error opening bar" never executed. This is so strange.

Thanks for all the help so far, though. :D

-SirKnight

[This message has been edited by SirKnight (edited 02-25-2003).]

pkaler
02-25-2003, 02:54 PM
You can make a symbolic link:

# ln -s /usr/local/gcc322/bin/gcc /usr/bin/gcc322

Then explicitly call gcc322 when you want to use your newly compiled compiler.



Linux is just a kernel. RedHat Linux is an operating system consisting of a Linux kernel and a whole bunch of GNU packages. Gentoo Linux (my current OS) is a different operating system, as are Debian Linux, etc.


If you're gonna be pedantic, that's GNU/Linux. :)

SirKnight
02-25-2003, 03:20 PM
A symbolic link, that sounds good. Thanks! :D

-SirKnight

V-man
02-25-2003, 04:33 PM
>>>It sure did take a long time to compile, at least 45 min.<<<

Compile what? gcc?

I had RH8 installed on another disk and it had a package called KDevelop. Nice-looking and everything, but I wasn't able to compile anything, GL or non-GL.

You guys don't use an IDE for your work?

SirKnight
02-25-2003, 06:44 PM
Yes, to compile gcc.

I usually use KDevelop to write the code, but I use the console to compile. Right now I'm only using Linux for my data structures class, and we have to use the console to compile because he wants to see that we understand both the compile and linking stages. :-/ We have to use script to capture everything so he can see it compiled and see the output.

The trick for me now is to get KDevelop to run my new g++ instead of the old version. I haven't yet found the spot to tell it where the new g++ executable is located.

-SirKnight

pkaler
02-25-2003, 06:52 PM
Originally posted by V-man:

You guys don't use an IDE for your work?



emacs. All the IDE that I need.

Actually I tried KDevelop a while ago. Didn't like it so much. I might give it another try when it hits 3.0.

velco
02-25-2003, 11:38 PM
Originally posted by SirKnight:
Cool, I finally got it to compile and stuff. It sure did take a long time to compile, at least 45 min. :D


That's partly because it did three compiles; that's what "make bootstrap" does:

a) compile the new compiler with the old system compiler

b) compile the new compiler with the compiler built in a)

c) compile the new compiler with the compiler built in b)

d) compare the compilers built in b) and c); if they are not identical (byte for byte), report an error.



So I added /usr/local/gcc322/bin to the profile file like so: PATH=$PATH:"/usr/local/gcc322/bin", but when I type g++ -v I still get 3.2.


You should put it at the front of PATH:




export PATH=/usr/local/gcc322/bin:$PATH


~velco