3D Texturemap Verdict

Well, sorry everyone.

I have tested my simple 3D texture mapping code both under Windows (12.10 drivers) and under Linux (0.9-769 drivers) with my VisionTek GeForce3, and it is NOT hardware accelerated.

I can tell for sure, as I have seen this code run with hardware support (Onyx, around 50 fps) and without (TNT2, around 0.2 fps), and on the GeForce3 it runs at the same speed as my old TNT2.
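For reference, the heart of the test is nothing exotic. A minimal sketch (not my exact code, and assuming glTexImage3D has been resolved; it's core in OpenGL 1.2, but Windows' gl.h only covers 1.1, so you'd fetch it with wglGetProcAddress):

```c
/* Minimal 3D-texture smoke test (sketch).  Assumes a GL context is
 * current and that glTexImage3D and the GL_TEXTURE_3D enums are
 * available (OpenGL 1.2 core, or GL_EXT_texture3D). */
#include <GL/gl.h>

#define DIM 64

static GLubyte volume[DIM][DIM][DIM];   /* one luminance byte per voxel */

void init3DTexture(void)
{
    int x, y, z;
    for (z = 0; z < DIM; z++)           /* fill with an arbitrary pattern */
        for (y = 0; y < DIM; y++)
            for (x = 0; x < DIM; x++)
                volume[z][y][x] = (GLubyte)((x ^ y ^ z) << 2);

    glBindTexture(GL_TEXTURE_3D, 1);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE, DIM, DIM, DIM, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, volume);
    glEnable(GL_TEXTURE_3D);
}

/* Draw one quad sampling the volume at depth r, and time a few hundred
 * of these per frame.  A software fallback is unmistakable: the frame
 * rate collapses by two or three orders of magnitude. */
void drawSlice(float r)
{
    glBegin(GL_QUADS);
    glTexCoord3f(0, 0, r); glVertex2f(-1, -1);
    glTexCoord3f(1, 0, r); glVertex2f( 1, -1);
    glTexCoord3f(1, 1, r); glVertex2f( 1,  1);
    glTexCoord3f(0, 1, r); glVertex2f(-1,  1);
    glEnd();
}
```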

This sucks, since it was one of the features I was most looking forward to. I guess nVidia must have changed (removed?) it sometime after Carmack mentioned it.

Just thought I’d let everyone know so that you don’t go buying a GF3 for that ability.

– Zeno

The bastards!

In that case, I’m going to have to buy me a Radeon.

  • Tom

Yeah, I read that somewhere: because of time constraints, money, and NV being lazy, they didn't put it in. This kind of sucks!!! I wanted to show a four-dimensional shape on a 3D cube (you can show 3D on 2D, so supposedly 4D on 3D), and 3D textures would help show this, with the 4D change varying along the 3D cube. With some transparency it would look cool.
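To make the idea concrete, here is a minimal sketch (the hypersphere is just an illustrative example): a 4D object can be shown as an animated 3D cross-section, the same way a 3D object can be shown as an animated 2D slice. Re-bake the volume each frame for the current w, then draw it as a stack of blended slices:

```c
/* Sketch: bake the 3D cross-section of a 4D unit hypersphere at a given
 * w into a 3D alpha texture.  Sweep w over [-1, 1] across frames and
 * render the volume with blended slices; the translucent ball grows and
 * shrinks as w passes through the hypersphere.  Re-uploading the whole
 * volume per frame is crude, but fine at this size. */
#include <GL/gl.h>

#define N 32
static GLubyte vol[N][N][N];

void bakeCrossSection(float w)          /* w in [-1, 1] */
{
    int i, j, k;
    for (k = 0; k < N; k++)
        for (j = 0; j < N; j++)
            for (i = 0; i < N; i++) {
                float x = 2.0f * i / (N - 1) - 1.0f;
                float y = 2.0f * j / (N - 1) - 1.0f;
                float z = 2.0f * k / (N - 1) - 1.0f;
                /* voxel is translucent if inside the hypersphere at w */
                vol[k][j][i] = (x*x + y*y + z*z + w*w <= 1.0f) ? 64 : 0;
            }
    glTexImage3D(GL_TEXTURE_3D, 0, GL_ALPHA, N, N, N, 0,
                 GL_ALPHA, GL_UNSIGNED_BYTE, vol);
}
```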

P.S. I have found a GeForce3 really cheap: an Elsa Gladiac TV-out 64 MB card for under £300, while GeForce3s usually sell for £450. Does anyone know what it is like? Is it a good implementation? It says it has 460 MHz DDR RAM (i.e. a 230 MHz actual clock), while some of the others have only had 200 MHz DDR RAM.

Elsa cards were always top quality, so I'd buy the Elsa in your situation. I personally place Elsa and Asus (and Creative, before they got out of this business) at the top of the ranks.

And about 3D textures: don't say "it doesn't work!" just yet. Let's ask Matt or Cass for a qualified response and rely on that. Maybe it's a buggy driver, or the motherboard, or something else. Remember NV_vertex_array_range: it worked perfectly on Intel chipsets, but on some VIA-based motherboards it failed to allocate more than one chunk of AGP memory. That does not mean the card is bad; it's just a combination of problems.
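For those who never hit it, that VIA failure mode looked roughly like this (a sketch with the Windows entry points; on Linux the call is glXAllocateMemoryNV, and the glVertexArrayRangeNV prototype comes from the extension headers):

```c
/* NV_vertex_array_range allocation sketch.  The allocator must be
 * fetched at runtime via wglGetProcAddress. */
#include <windows.h>
#include <GL/gl.h>

typedef void *(APIENTRY *PFNWGLALLOCATEMEMORYNVPROC)(GLsizei, GLfloat,
                                                     GLfloat, GLfloat);

void demoAgpAlloc(void)
{
    void *chunk1, *chunk2;
    PFNWGLALLOCATEMEMORYNVPROC wglAllocateMemoryNV =
        (PFNWGLALLOCATEMEMORYNVPROC)wglGetProcAddress("wglAllocateMemoryNV");
    if (!wglAllocateMemoryNV)
        return;                         /* extension not exported */

    /* readFreq 0, writeFreq 0, priority 0.5 => request AGP memory */
    chunk1 = wglAllocateMemoryNV(1 << 20, 0.0f, 0.0f, 0.5f);
    chunk2 = wglAllocateMemoryNV(1 << 20, 0.0f, 0.0f, 0.5f);

    if (!chunk2) {
        /* the problem VIA boards ended up here on the SECOND chunk,
         * with AGP memory to spare -- so always check the result and
         * fall back to plain system-memory vertex arrays */
    }
    /* otherwise hand chunk1 to glVertexArrayRangeNV as usual */
}
```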

Also, I want to say that IMHO nVidia is the best company out there in the consumer card market. It gave us the ability to play wonderful modern games, to do our work faster in modelling software, and to pursue our game and CAD development efforts, and it helped push the graphics software industry to its limits, opening new markets and possibilities for developers. So we should be forgiving toward a company that gave us all that, and still gives it today.
Also, on the OpenGL side: remember who gave consumers and hobbyist developers mainstream, bug-free OpenGL support at an affordable cost? ATI? Matrox? 3DLabs? 3dfx? No, it was nVidia )

I browsed nVidia's web site before answering this one: why the hell did you assume that the GeForce3 would support 3D textures in hardware??? I don't remember anything/anyone from nVidia stating that! I even remember mccraighead saying that the GeForce3 DOES NOT support them in HW…

OK, the Radeon does it… So if you NEED 3D textures, buy a Radeon!

I really don't understand it when people buy a piece of hardware and then complain because a functionality that is not supposed to be there is actually missing… Why don't you read the technical briefs before buying???

Granted though: as far as I remember, the box of the ELSA Gladiac 920 has got “3D Textures” written somewhere on it, which can be misleading…

Regards.

Eric

3D textures were supposed to be in the GF3, but I hear that due to some difficulties they were disabled.

A shame, as it would have been a more complete implementation than the Radeon's and would have gone well with the new shader capabilities.

Still, Radeon2 & GF3.5 GTS, anyone?

Originally posted by Tim Stirling:
P.S. I have found a GeForce3 really cheap: an Elsa Gladiac TV-out 64 MB card for under £300, while GeForce3s usually sell for £450.

Damn. I just ordered one of those for $500. It’s probably because I live in Belgium though - all hardware seems to be more expensive here. Hercules’ GF3 cards, for example, are shipping at prices of almost $700 (!) over here.

[EDIT] Oops, you were talking pounds, not dollars. That means we end up in about the same price range.

  • Tom

[This message has been edited by Tom Nuydens (edited 05-08-2001).]

Originally posted by Eric:
I really don't understand it when people buy a piece of hardware and then complain because a functionality that is not supposed to be there is actually missing… Why don't you read the technical briefs before buying???

Here is an excerpt from JC's .plan that I was referring to:

The depth buffer optimizations are similar to what the Radeon provides,
giving almost everything some measure of speedup, and larger ones
available in some cases with some redesign.

3D textures are implemented with the full, complete generality. Radeon
offers 3D textures, but without mip mapping and in a non-orthogonal
manner (taking up two texture units).

Vertex programs are probably the most radical new feature, and, unlike
most “radical new features”, actually turn out to be pretty damn good.
The instruction language is clear and obvious, with wonderful features
like free arbitrary swizzle and negate on each operand, and the obvious
things you want for graphics like dot product instructions.

We are not "assuming" the idea of 3D textures and then complaining about its loss. We are complaining because it was obviously there and was later dropped. Have you noticed how quiet Matt and Cass are about this?

– Zeno

Here is a link to the whole thing if you want:
http://finger.planetquake.com/plan.asp?userid=johnc&id=15205

[This message has been edited by Zeno (edited 05-08-2001).]

Originally posted by Zeno:
We are not "assuming" the idea of 3D textures and then complaining about its loss.

I am really sorry, but you ARE!

AFAIK, John Carmack doesn’t work for nVidia and hence his words shouldn’t be taken as words from an nVidia PR guy…

The early prototype of the NV20 had 3D textures built in? Fine…

It was removed in the final GeForce3 design? A shame, I agree…

Nevertheless, NOWHERE IS IT WRITTEN THAT THE GEFORCE3 SUPPORTS 3D TEXTURES IN HARDWARE! Hence, when you bought it, you "assumed" the 3D textures were supported in HW…

If you had looked at the OFFICIAL spec of the GeForce3 chip, and not at JC's words about the NV20, you would have known that… And then you would have gone for a Radeon… or wouldn't you?

If you are complaining about the dropping of 3D textures (but are you even sure they dropped it???), I can see what you mean… But if you are complaining because nVidia sold you a chip that doesn't support 3D textures, well, a chip without 3D textures is the only thing they ever meant to sell…

Regards.

Eric

[This message has been edited by Eric (edited 05-08-2001).]

I am taking up a collection of the GeForce3 cards that you want to dump because they don't have 3D texture support. I will even give you a written receipt from a non-profit organization, so you may be able to write it off on your tax return (some restrictions apply).

So, don’t complain, donate!

Look, I can't speak for other people, but here is a timeline of how it went for me personally:

I saw JC's .plan long before nVidia had released anything official. I was very excited about 3D texture mapping being hardware accelerated.

Later, I did see something from nVidia claiming full OpenGL 1.2 support in hardware. Since 3D textures are part of core OpenGL 1.2, this confirmed my belief that they would be there.

Someone on this board later noticed that it says "sw" in nVidia's OpenGL extensions PDF, meaning that the GF3 would only support them in software.

Cass confirmed this, also on this board. At this point, I was sure they would NOT be supported.

After this, I saw a review of Hercules' 3D Prophet III that said it supported DX8 volumetric textures and volumetric texture compression in hardware. This reintroduced my confusion about whether or not the feature was there.

At this point, I bought the card. I was assuming 3D texture mapping WASN'T supported, but hoping that it was, that's all. I tested it, and this feature is missing. The card is amazing for a huge variety of other reasons, and I am not a bit upset with the purchase. I wouldn't trade it for a Radeon.

I would, however, like to know what happened with that one feature. Why isn't there an official explanation? Is MS forcing this to be DX8 only? Xbox only? Is it for Quadros only? For an Ultra version of the card?

JC doesn't work for nVidia, but he may as well. They optimize their drivers around his code. They give him alpha versions of their cards to use. He has input on the future direction of hardware. Therefore, if he says a feature is there, I believe it. His words are certainly better than any from a "PR guy", who would just spew out cool-sounding words like Quincunx and nfiniteFX without really knowing what they mean.

nuff said
– Zeno

Zeno, I understand what you meant: sorry if I sounded angry at you… That wasn't the intent!

As far as 3D textures in HW are concerned, I suppose it is up to nVidia to make an official statement… I have been told things about that but I think it is under NDA (and it was unofficial anyway…).

Hey, it is possible to turn a GeForce into a Quadro, so perhaps there's a way to activate 3D textures on a GeForce3!

Regards.

Eric

Note that no one who works for NV, like Cass, is saying anything.

I think there may be 3D texturing in some future GeForce3, in a kind of Ultra edition or something. If not, then it had better be in the GeForce4 or whatever comes next.

Agreed. The fact that they are not allowed to make statements about it probably means that it’s not as simple as it being “dropped”.

The marketing types probably didn’t want to introduce all their new features at once. Instead they wanted to save it as a selling point on a future product, particularly since the feature will be free at that point (already in the silicon).

It's sorta weird to think that my chip may be capable of doing something that I'd like it to do, but has just been crippled… I would have at least liked the option to pay more to have it enabled. :P

– Zeno

An official statement will be made soon that will clear things up. In the interim, I’m sorry about the confusion.

Cass

Thanks, Cass.

This is why I love nVidia hardware: the support and interest in us as developers. I personally would love 3D textures in hardware. My thesis is on volume rendering, and I use 3D texturing already. It would be great to get it accelerated on nVidia before the end of the year.
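For context, the standard approach my code uses is just a blended stack of slices through the volume. A minimal sketch (assuming a 3D texture is already bound and enabled, and that the camera looks roughly down -z so the slice order is back to front):

```c
/* Texture-slice volume rendering (sketch): draw many slices through the
 * volume back to front and let blending integrate them.  With axis-
 * aligned slices like these, the slice axis should be re-chosen (or the
 * slices made view-aligned) as the camera moves. */
#include <GL/gl.h>

void drawVolume(int nSlices)
{
    int s;
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    for (s = 0; s < nSlices; s++) {
        float r = (s + 0.5f) / nSlices;   /* texture depth in [0, 1] */
        float z = 2.0f * r - 1.0f;        /* slice position in object space */
        glBegin(GL_QUADS);
        glTexCoord3f(0, 0, r); glVertex3f(-1, -1, z);
        glTexCoord3f(1, 0, r); glVertex3f( 1, -1, z);
        glTexCoord3f(1, 1, r); glVertex3f( 1,  1, z);
        glTexCoord3f(0, 1, r); glVertex3f(-1,  1, z);
        glEnd();
    }
}
```

Without hardware 3D textures, the trilinear fetch between slices is exactly the part that has to be emulated (e.g. with stacks of 2D textures and no filtering across slices).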

Thanks for responding, Cass. I'm looking forward to the statement.

Like ffish, I would benefit to no end from no longer having to emulate effects that can easily be achieved with 3D texture mapping.

I also wanted to say that, except for this one missing feature, I have been completely happy with every nVidia product I've used. The stable drivers, Linux support, and responsiveness of Cass and Matt are all great.

In comparison, it's been a nightmare when I've had to port code to a Wildcat or Voodoo card at work. I don't have any experience with ATI, so I can't comment on that.

I just wrote my first GF3 vertex program last night (thanks for the example programs, Nutty) and I REALLY like this feature. Do you think pixel programs could ever work in a similar way, with this much flexibility?
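For anyone who hasn't tried it yet, here is roughly what the syntax looks like (a minimal sketch, not Nutty's code; the enums and entry points come from the NV_vertex_program extension headers, and c[0]–c[3] are assumed to track the modelview-projection rows via glTrackMatrixNV):

```c
/* Minimal NV_vertex_program sketch. */
#include <string.h>
#include <GL/gl.h>

static const char vp[] =
    "!!VP1.0\n"
    /* transform position: one DP4 per clip-space component */
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    /* the free swizzle and negate JC's .plan mentions: any operand can
     * be rearranged and negated at no cost (a silly color, but it shows
     * the syntax) */
    "MOV o[COL0], -v[OPOS].zxyw;\n"
    "END";

void bindVertexProgram(GLuint id)
{
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                    (GLsizei)strlen(vp), (const GLubyte *)vp);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
    /* have c[0]..c[3] track the concatenated modelview-projection */
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glEnable(GL_VERTEX_PROGRAM_NV);
}
```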

– Zeno

As a registered developer at both NVidia and ATI, I can say this:
NVidia support is good. It always was, and therefore I understand people who use it intensively.
ATI support used to be nonexistent. But they changed their crew, and now the ATI developer support is, in my opinion, very good. I always get quick answers to my questions, and their SDK is now much better, with all the OpenGL extensions documented. Also, the GL drivers are now really awesome.
And the RadeON's price is unbeatable.