Number of textures in OpenGL



vince
09-05-2003, 07:40 AM
Hi,

I'm wondering if anybody knows why, on a GeForce FX 5900, we can use only 4 textures simultaneously in OpenGL, while DirectX allows 8 textures on the same card.

Thanks.

AdrianD
09-05-2003, 08:16 AM
This is only true for fixed-pipeline stuff (it's just an API restriction). When you're using fragment programs, you can use more texture units.
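
Both limits can be queried at runtime; a minimal sketch in C (GL_MAX_TEXTURE_IMAGE_UNITS_ARB is the ARB_fragment_program query, and the #defines are only needed if your gl headers predate these tokens):

#include <stdio.h>
#include <GL/gl.h>

/* tokens, in case the headers are too old to define them */
#ifndef GL_MAX_TEXTURE_UNITS
#define GL_MAX_TEXTURE_UNITS 0x84E2
#endif
#ifndef GL_MAX_TEXTURE_IMAGE_UNITS_ARB
#define GL_MAX_TEXTURE_IMAGE_UNITS_ARB 0x8872
#endif

/* call with a current GL context */
void print_texture_limits(void)
{
    GLint fixed_units = 0, fp_units = 0;

    /* classic fixed-function units: 4 on a GeForce FX */
    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &fixed_units);

    /* textures addressable from a fragment program: more on NV3x */
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &fp_units);

    printf("fixed function: %d, fragment program: %d\n",
           (int)fixed_units, (int)fp_units);
}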

vince
09-05-2003, 08:41 AM
That's interesting, but if it's an API restriction, how come I can go up to 8 textures on a Radeon? It's the same API...

Ostsol
09-05-2003, 08:45 AM
Probably because that Radeon allows 8 textures in the fixed function pipeline.

Korval
09-05-2003, 09:45 AM
It isn't an API restriction. It is a restriction imposed by nVidia (for whatever reason).

jwatte
09-05-2003, 10:02 AM
nVIDIA didn't want to expose 8 classical texture environments, with the texture crossbar and everything. Thus, they said "fixed function is a thing of the past (GF4 and below) and we only support up to that level."

I can kind of see where they're coming from. If you need more textures, you probably want to be doing fragment programs anyway.

vince
09-05-2003, 10:29 AM
That's right; what I really want to do is use fragment programs, but they are just way too slow to be usable for my kind of application. So I'm stuck with the fixed pipeline.

Jan
09-05-2003, 10:51 AM
Hm, I can't remember a single post in the last two months where someone said something positive about the GeForce FX.
Does it have any really good feature the 9700 or 9800 doesn't have?

vince
09-05-2003, 10:58 AM
It has better drivers (ATI's are still very buggy), but other than that...

Ostsol
09-05-2003, 11:15 AM
Originally posted by vince:
It has better drivers (ATI's are still very buggy), but other than that...
Sorry, but I've just gotta ask: when was the last time you used ATI cards to come to that conclusion?

vince
09-05-2003, 11:18 AM
This morning...

Seriously, there are a few very annoying bugs in their driver, mostly if you use Win2000. The WinXP version seems a lot better.

Coconut
09-05-2003, 11:26 AM
I am not kidding: many of my friends have sworn they are not going to buy any ATI products... and they all use and develop software on PCs for various applications (other than OpenGL).

Ostsol
09-05-2003, 11:40 AM
Originally posted by vince:
This morning...

Seriously, there are a few very annoying bugs in their driver, mostly if you use Win2000. The WinXP version seems a lot better.
Ah... ok... (Strange, though, since 2000 and XP use the same driver set...)

vince
09-05-2003, 11:43 AM
Originally posted by Ostsol:
Ah... ok... (Strange, though, since 2000 and XP use the same driver set...)

Sure, but they probably use different code paths inside the driver. ATI could tell you better than I can, though ;-)

tellaman
09-05-2003, 07:13 PM
The activetexture method seems a bit messy. Can anyone provide some sample code, or links other than the spec, on how to get this to work?

[This message has been edited by tellaman (edited 09-05-2003).]
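
A minimal ARB_multitexture sketch for the question above, assuming the extension entry points have already been fetched (via wglGetProcAddress or your platform's equivalent) and that tex0 and tex1 are valid texture objects:

#include <GL/gl.h>
#include <GL/glext.h>

/* fetched elsewhere with wglGetProcAddress / glXGetProcAddress */
extern PFNGLACTIVETEXTUREARBPROC   glActiveTextureARB;
extern PFNGLMULTITEXCOORD2FARBPROC glMultiTexCoord2fARB;

void draw_quad_two_textures(GLuint tex0, GLuint tex1)
{
    /* unit 0: base texture */
    glActiveTextureARB(GL_TEXTURE0_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex0);

    /* unit 1: second texture, modulated over unit 0 */
    glActiveTextureARB(GL_TEXTURE1_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex1);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    /* each vertex needs coordinates for every active unit */
    glBegin(GL_QUADS);
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 0, 0);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 0, 0);
    glVertex2f(-1, -1);
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 1, 0);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 1, 0);
    glVertex2f(1, -1);
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 1, 1);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 1, 1);
    glVertex2f(1, 1);
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 0, 1);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 0, 1);
    glVertex2f(-1, 1);
    glEnd();

    /* turn unit 1 off again and leave unit 0 active */
    glDisable(GL_TEXTURE_2D);
    glActiveTextureARB(GL_TEXTURE0_ARB);
}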

DopeFish
09-05-2003, 07:53 PM
My Radeon 9800 is running flawlessly, using the Catalyst 3.4 drivers (3.6 and 3.7 have a bugged OpenGL driver where it's forced to 16-bit; you cannot use 32-bit rendering).

What are these problems you've been having with them?

vince
09-05-2003, 09:42 PM
I had problems on my 9800: minimizing a window crashes, large texture coordinates cause noticeable pixelisation, txf fonts appear screwy, and some unknown conditions freeze my computer.

There were more with Catalyst 3.6, but ATI has already fixed most of them in what will be the 3.7. So the good news is they're progressing.

Ostsol
09-05-2003, 09:51 PM
Well, we seem to be getting off topic, but as long as Vince is helping us to go off topic, I guess it's ok. ;)

Anyway, it's been mentioned before that it's just forcing 16-bit textures when you don't specify RGB8; if you do specify it, it forces 32-bit. It seems that the "force 16 bit textures" option in ATI's drivers merely tells OpenGL to default to 16-bit. It doesn't really force it, as that would imply there would be no way at all to get 32-bit textures.
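
In code the difference is just the internalformat argument to glTexImage2D; a sketch (pixels, w and h are placeholders):

#include <GL/gl.h>

void upload_rgb_texture(const void *pixels, int w, int h)
{
    /* generic GL_RGB: the driver picks the precision, so the
       control panel's 16-bit default can apply */
    /* glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                    GL_RGB, GL_UNSIGNED_BYTE, pixels); */

    /* sized GL_RGB8: explicitly requests 8 bits per channel */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}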

Nutty
09-06-2003, 04:36 AM
It's to do with texture compression, and it's not just when you don't specify RGB8.

From an ATI driver person:

It is a minor annoyance that only 16 people had reported to us!
When you go to the High Quality setting on the texture compression setting in the OGL control panel, we are mistakenly converting textures to 16 bit as opposed to 32 bit.

I have seen a fix today and we are analyzing the quality of the fix. It will be fixed in the next CATALYST release.



[This message has been edited by Nutty (edited 09-06-2003).]

jwatte
09-06-2003, 09:14 AM
> some unknown conditions freeze my computer

If you have a VIA motherboard based on the KT333 or newer, it's likely that the problem is with the VIA motherboard, not with the Radeon graphics card.

We've seen a lot of this, and that's the diagnosis we've arrived at. My next computer's an Intel! (unless those Athlon64s REALLY manage to convince me otherwise)

Nutty
09-06-2003, 12:32 PM
I currently have a VIA-based motherboard, and it's stable as hell with my GF4.

Of course, my next mobo will most likely be an nVidia chipset...

vince
09-06-2003, 12:39 PM
Originally posted by jwatte:
If you have a VIA motherboard based on the KT333 or newer, it's likely that the problem is with the VIA motherboard, not with the Radeon graphics card.

We've seen a lot of this, and that's the diagnosis we've arrived at. My next computer's an Intel! (unless those Athlon64s REALLY manage to convince me otherwise)


I actually have no clue what my motherboard is, except that it's a dual-CPU one. But if it was the motherboard, I assume it would freeze no matter what graphics card is installed. The problem occurs only with the Radeon 9800, so I assume it's the card (even the 9700 is ok).

Korval
09-06-2003, 01:54 PM
My next computer's an Intel! (unless those Athlon64s REALLY manage to convince me otherwise)

Slightly OT, but you can get an nForce-based motherboard for Athlons. Many people consider them superior to VIA chipsets.

SirKnight
09-06-2003, 04:09 PM
AFAIK the nForce chipsets are only for Athlon chips. I have an nForce2 mobo with an Athlon XP 2600 right now, with a GeForce 4 Ti 4400 in it, and everything is perfectly stable... and fast, might I add.

-SirKnight

Korval
09-06-2003, 04:41 PM
AFAIK the nForce chipsets are only for Athlon chips.

The point I was making is that he doesn't have to switch to Intel processors just for a more stable/higher quality motherboard.

davepermen
09-06-2003, 07:05 PM
Well, long story short:

Nothing's perfect.

And, because we were all nVidiots, we are still trained to think
"you have an ATI card and bugs? it's the ATI drivers"
as opposed to
"you have an nVidia card and bugs? your system is badly set up".

This was true for a long time.

Today we can say that BOTH run rock-stable in a rock-stable system, and BOTH can get rather unstable in badly configured systems. I have crashing systems with both nVidia and ATI, but only in bad systems.

I never got any crash on, for example, the Compaq/HP PCs at work, no matter what mess I tried.

And yes, there is not much good about the GeForce FX that anybody cares about anymore. It has one big problem: it performs badly on DX9 code or ARB_fp, and it is missing some features which are important for real future-style effects programming: real float texture support, for example.

Well well... :D

I'm off to sleep now. Night.

M/\dm/\n
09-07-2003, 03:08 AM
The only problem with the FX is that you must learn to write programs in a new style; otherwise FX =~= R.

davepermen
09-07-2003, 03:36 AM
Well, that's the "only" problem for developers. For gamers, the problem is that the card will never perform well for anything except top-seller games. Which is nice for top-seller games, but for all the smaller things it isn't.

Yeah, "GlideFX" rocks: a mix of OpenGL extensions, Cg, and huge tool/SDK packages.

Fun :D

Nutty
09-07-2003, 10:37 AM
A mix of OpenGL extensions, Cg etc...

What?? Cg has nothing to do with the performance of the GeForce FX. It's a tool for writing shaders more simply. If you don't like its output, you don't have to use it. Simple as that.

There isn't much difference between NV_fp and ARB_fp except the precision issues, which don't account for the performance differences that much. It's all to do with register usage, apparently... Hopefully the Det50 drivers will alleviate this problem somewhat.

davepermen
09-07-2003, 11:21 AM
You haven't informed yourself well about the extensions, or about the hardware, actually.

Yes, register usage is a main point, but it's not all you have to take care of. Cg is just about required to get good performance, and it is a proprietary add-on over the NV extensions => part of it.

jwatte
09-07-2003, 08:05 PM
Actually, NV_fp is much more powerful than ARB_fp, because NV_fp has derivatives "for free" at any point, and also supports predication ("only do this instruction if this status bit is set").

I _really_ wish ARB_fp had the same smarts. Of course, once the R400 is available, it'll probably leapfrog the NV35, and then we'll all wait to see how the NV40 performs :-)
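
For illustration, a tiny NV_fragment_program string sketching both features, written from a reading of the NV_fp spec (untested):

/* sketch: a free derivative (DDX) plus a predicated write */
static const char nvfp[] =
    "!!FP1.0\n"
    "DDX  R0, f[TEX0];\n"             /* screen-space d/dx of the texcoord */
    "MOVC R1, R0;\n"                  /* copy, updating the condition code */
    "MOV  o[COLR] (GT.x), f[COL0];\n" /* write only where d/dx > 0 */
    "MOV  o[COLR] (LE.x), {0, 0, 0, 1};\n"
    "END\n";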

tfpsly
09-07-2003, 11:53 PM
Originally posted by davepermen:
[Cg] a proprietary add-on over the NV extensions => part of it.

At least it makes it possible to write shaders that run quite fast, and with broader hardware support than ARB_fp.
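
That portability comes from Cg's profile mechanism; a minimal runtime sketch, where "bumpmap.cg" and its "main" entry point are made-up names:

#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* compile for the best fragment profile the current card supports:
   fp30 on a GeForce FX, arbfp1 on a Radeon 9x00, etc. */
CGprogram load_fragment_shader(CGcontext ctx)
{
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE,
                                             "bumpmap.cg", profile,
                                             "main", NULL);
    cgGLLoadProgram(prog);
    return prog;
}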

Nutty
09-08-2003, 12:39 AM
and it is a proprietary add-on over the NV extensions => part of it.

It's nothing of the sort. You can use Cg without ever having to use any NV extensions at all. I don't understand why you criticise it so much; at least they got off their arses and actually made a working HLSL compiler that works right now, and some games are even shipping with it. Can that be said for the GL2 shading language? No, it can't.

If ATI had produced it, you'd probably think it was the best thing since sliced bread.

davepermen
09-08-2003, 01:20 AM
No, definitely not. At least not if it were only an ATI proprietary marketing tool.

Sure, Cg is not officially there just for the GeForce FX, but they work very hard on making people believe Cg and the GeForce FX are one. I see tons of people asking what GeForce FX they should buy to develop for Cg, and tons of people who buy a GeForce FX and don't even know you "could" work on it without Cg. And all of those are newbies from 16 upwards.

Sure, Cg is not required and not nVidia-only. But it's a psychological need they want to build with marketing.

Bah, nVidia's marketing is crap anyway. They put Dawn on the GeForce FX 5200 package... I want to see at which res you can watch her run smoothly :D

tfpsly
09-08-2003, 04:14 AM
Originally posted by davepermen:
I see tons of people asking what GeForce FX they should buy to develop for Cg [...]

Indeed... :-/
The very thing I like about Cg is that it is not limited to a few cards.


They put Dawn on the GeForce FX 5200 package... I want to see at which res you can watch her run smoothly :D

Very slowly. Strangely, it runs at the same slow rate whatever the resolution is; I see little or no improvement when reducing the resolution.

At least I do not need a hacked naked version to make it run on an ATI ;)

EDIT : presentation/quoting fix

[This message has been edited by tfpsly (edited 09-08-2003).]

davepermen
09-08-2003, 06:52 AM
Hehe, but the hack rocks :D I can at least run it smoothly at every res... independent of res, too, yep :D

Ostsol
09-08-2003, 07:46 AM
Originally posted by tfpsly:
At least I do not need a hacked naked version to make it run on an ATI ;)
Actually, it works perfectly fine with her clothed, on ATI cards...