
View Full Version : GeForce4...



Toinou
02-20-2002, 07:16 AM
Hello, what do you think of Nvidia's new product GeForce4? I was told it was engineered to work with Direct3D, but would it work well with OpenGL?

richardve
02-20-2002, 07:52 AM
Uhm.. hello? You're talking about NVIDIA, the one and only company on this planet with the best OpenGL support (thumbs up!), and you're asking if it'd work well with OpenGL?!


[This message has been edited by richardve (edited 02-20-2002).]

Humus
02-20-2002, 08:17 AM
Bah, a driver without WGL_ARB_render_texture is not complete.

Diapolo
02-20-2002, 08:19 AM
Of course GF4 will work GREAT in OpenGL.

But that's in NO WAY a question for the advanced OpenGL forum!

During the last days quite a few questions have been posted here that don't need to be in the advanced forum ... it's really annoying :(.

@Humus:
I guess we will VERY soon see an NVIDIA driver that supports WGL_ARB_render_texture :).

Diapolo

[This message has been edited by Diapolo (edited 02-20-2002).]

Toinou
02-20-2002, 08:54 AM
Yes, you're right, I should have posted my request on the beginners forum.
Anyway, thanks a lot for answering so quickly!

Humus
02-20-2002, 09:35 AM
Originally posted by Diapolo:
@Humus:
I guess we will VERY soon see an NVIDIA driver that supports WGL_ARB_render_texture :).


I sure hope you're right :)

cass
02-21-2002, 06:12 AM
Yes, the next driver update will support WGL_ARB_render_texture, WGL_NV_render_texture_rectangle, WGL_NV_render_depth_texture, GL_NV_depth_clamp, GL_NV_occlusion_query, GL_HP_occlusion_test, GL_NV_point_sprite, and GL_NV_texture_shader3.

These extension specs should be posted on developer.nvidia.com in the next day or so.

Thanks -
Cass

Diapolo
02-21-2002, 06:31 AM
Thanks for your reply cass :).

Will all these extensions be hw accelerated on GF3 or are some of them GF4 only?

What about GL_NV_vertex_program1_1?

Diapolo

cass
02-22-2002, 10:49 AM
Most of the new extensions will be accelerated on GeForce3. Texture shader 3 is GeForce4 Ti only.

The new extension specs will be posted to the web this evening.

Thanks -
Cass

zed
02-22-2002, 11:15 AM
Originally posted by cass:

Yes, the next driver update will support
WGL_ARB_render_texture,


Wasn't render-to-texture "just around the corner" a year ago? ;)

NitroGL
02-22-2002, 06:08 PM
Originally posted by cass:

Most of the new extensions will be accelerated on GeForce3. Texture shader 3 is GeForce4 Ti only.

The new extension specs will be posted to the web this evening.

Thanks -
Cass

LIES!!!!! It's 10:00 PM here, and it still hasn't been posted! :)

Diapolo
02-22-2002, 06:19 PM
Perhaps I'm dreaming, because it's 5 o'clock in the morning here in Germany, but I swear I saw the new extensions listed on this page (http://developer.nvidia.com/view.asp?IO=nvidia_opengl_specs) along with the new PDF file for download. After I refreshed the browser window, the update was gone and the old version was there again?

Guess they are working on the page at the moment :).

Diapolo

NitroGL
02-22-2002, 06:50 PM
Originally posted by Diapolo:
Perhaps I'm dreaming, because it's 5 o'clock in the morning here in Germany, but I swear I saw the new extensions listed on this page (http://developer.nvidia.com/view.asp?IO=nvidia_opengl_specs) along with the new PDF file for download. After I refreshed the browser window, the update was gone and the old version was there again?

Guess they are working on the page at the moment :).

Diapolo

LOL, I just got the same!

dorbie
02-22-2002, 10:40 PM
I have to agree, NVIDIA has outstanding OpenGL support IMHO.

The issue of D3D vs OpenGL is an interesting one when it comes to hardware design. Much of this is politics and PR guff. Every IHV needs to come out and say "3 bags full" to stroke Microsoft and to sell cards, but we know that DX8.1 is out and DX9 is not here yet. OpenGL is the only way to expose new features in the meantime. In addition, it's the only way for a vendor to release new features without signing in blood with Microsoft. The ARB may be slow, but an IHV needn't wait on the ARB. With D3D they not only have to wait on Microsoft, they have to persuade them, and that's if there's a handy DX release you're on board with.

Who knows what Microsoft is demanding from IHVs in exchange for inclusion in DX9. It can't be pleasant, judging by the stories of friction that have leaked to the press.

Microsoft seems to play favourites on various releases; perhaps they pick the underdog, or maybe other factors motivate them. But the massaging of APIs to fit 3D hardware design as envisioned by the real experts at NVIDIA, ATI and elsewhere is not what drives graphics innovation. D3D is a vehicle for Microsoft control and nothing else; I suspect the more intelligent people at the IHVs realize this.

davepermen
02-22-2002, 11:39 PM
I hope depth_clamp will be available for the GF2 MX, too.

PH
02-23-2002, 12:15 AM
The depth_clamp extension is certainly useful but it won't make implementing shadow volumes a piece of cake. There's still one "hard" case that nobody seems to worry about and that's the problem of lights too close to occluders. This requires clipping the shadow volumes ( not that difficult but moderately expensive ).

[This message has been edited by PH (edited 02-23-2002).]

davepermen
02-23-2002, 12:50 AM
Hm... since when? Any documents about this?

PH
02-23-2002, 01:08 AM
Since always :). One of the papers/presentations on NVIDIA's site briefly mentions this. I don't remember exactly which one, but all the documents there contain some important details that are easily missed.

davepermen
02-23-2002, 01:10 AM
Hm... never heard of this. You mean if a light source is close to an object whose shadow volume I expand, I get errors?

PH
02-23-2002, 01:26 AM
The closer a light is to a surface, the farther you need to extend the silhouette quads and back-capping triangles. This paper mentions the clipping required:
http://developer.nvidia.com/view.asp?IO=cedec_stencil

davepermen
02-23-2002, 03:13 AM
This is not a problem at all for faked shadow volumes, where you expand the back-facing vertices: you always generate a volume behind the occluder, it just has to be expanded far enough (onto a far-away sphere, for example).

PH
02-23-2002, 05:11 AM
Extending the vertices is not enough; the edges and triangles that use those vertices are the real concern. These will intersect the sphere even if the vertices are extended far enough.
Imagine what happens when the light's distance from a surface approaches zero ( how far would you need to extend? And it won't be the same for all vertices. )

davepermen
02-23-2002, 05:21 AM
The faces near the light don't get affected at all, because their normals face it; only the faces pointing in the other direction get stretched away. And for normal meshes those are reasonably far from the light source to be pushed out enough. You can even stretch by the inverse of the distance if you want, or something :). When I have my GF4 I'll see for myself how it's best done.

cass
02-23-2002, 07:01 AM
Originally posted by PH:
The closer a light is to a surface, the farther you need to extend the silhouette quads and back capping triangles. This paper mentions the clipping required,
http://developer.nvidia.com/view.asp?IO=cedec_stencil

The advice in the above document is not valid in the context of NV_depth_clamp. You can extend your shadow volumes to infinity without worrying about them being clipped by the far plane.

Cass
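The usual way to exploit this is to extrude in homogeneous coordinates: emit the far-side vertex as a pure direction with w = 0, which the projection matrix maps to a point at infinity along the ray from the light, while depth clamping keeps the resulting fragments from being cut by the far plane. A rough sketch of the vertex math (my own illustration, not NVIDIA code):

```c
#include <assert.h>

typedef struct { double x, y, z, w; } Vec4;

/* Extrude a surface point p away from a point light L "to infinity":
 * in homogeneous coordinates the result is a direction (w = 0), which
 * a projective modelview-projection matrix maps to a point at infinity
 * along the ray from the light through p. */
static Vec4 extrude_to_infinity(Vec4 L, Vec4 p) {
    Vec4 r;
    r.x = p.x - L.x;
    r.y = p.y - L.y;
    r.z = p.z - L.z;
    r.w = 0.0;  /* a direction, not a position */
    return r;
}
```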

PH
02-23-2002, 07:09 AM
Yes, but I didn't want to mention anything about infinity ( you know why :) ).

EDIT:
I would still recommend clipping the shadows when the light becomes stationary and extending to infinity when/if they move ( my current approach ).

[This message has been edited by PH (edited 02-23-2002).]

SirKnight
02-23-2002, 07:31 AM
What cards will support the GL_NV_depth_clamp ext? I know the GF4 Ti will, but will it also work with the GF3 and possibly the GF2? Maybe I just don't completely understand, but if we use NV_depth_clamp with shadow volumes, will it eliminate all the fancy stuff we have to do when the volume intersects the near/far clip planes? If so, that would be great! :)

-SirKnight

PH
02-23-2002, 07:41 AM
The GeForce3 supports it; I don't know about the others. But even if they do, there are still cards from other companies that may not.

jwatte
02-23-2002, 08:03 AM
A quick and dirty hack that avoids fancy clipping is to use the "min" operation on the transformed Z component to pull it in closer than the far Z plane. You can do this with a single instruction in a vertex shader. Thus, you can generate the shadow volume geometry entirely in a vertex shader, by extruding vertices with away-facing normals, and clamping the Z component hither of yon. Both these techniques are imprecise hacks, but I think the gain of keeping it all in hardware might be worth it.

mcraighead
02-23-2002, 09:55 AM
GF3 and beyond support NV_depth_clamp in hardware.

- Matt

SirKnight
02-23-2002, 02:33 PM
Originally posted by mcraighead:
GF3 and beyond support NV_depth_clamp in hardware.


Darn, I was afraid of that. Oh well.

Also Matt, I have a question for you. I was recently messing with (well, trying to anyway) some different extensions from that OpenGL extension PDF thingy, and I came across two that it seems odd I couldn't use: EXT_texture_compression_s3tc and EXT_multi_draw_arrays. I thought my GeForce 256 supported S3TC texture compression, but I get an error back from the glh init extension func. Am I just wrong about the GeForce 256 supporting S3TC? And about the other extension: why couldn't my card support it? It's not a special feature like texture shaders that needs dedicated hardware on the chip; it just lets us draw multiple lists of vertices with one function call. That seems like something simple any GeForce could do in the drivers. What is up with this? Thanks.

P.S. I'm using driver version 23.11. I think that is still the newest driver for XP.

-SirKnight

[This message has been edited by SirKnight (edited 02-23-2002).]

cass
02-24-2002, 05:03 AM
SirKnight,

Make sure this isn't a glh problem. Did it work on previous drivers and then stop?

Thanks -
Cass
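One way to rule out a wrapper problem is to test the raw extension string directly, being careful not to let a plain strstr() match one extension name inside another. A sketch (my own helper, not part of glh; in a real program you would pass it (const char *)glGetString(GL_EXTENSIONS)):

```c
#include <assert.h>
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
 * the space-separated extension string `exts`, 0 otherwise.  A raw
 * strstr() is not enough: it would match "GL_EXT_texture" inside
 * "GL_EXT_texture_compression_s3tc". */
static int has_extension(const char *exts, const char *name) {
    size_t n = strlen(name);
    const char *p = exts;
    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == exts) || (p[-1] == ' ');
        int ends_token = (p[n] == '\0') || (p[n] == ' ');
        if (starts_token && ends_token)
            return 1;
        p++;  /* partial match; keep scanning */
    }
    return 0;
}
```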

SirKnight
02-24-2002, 12:40 PM
Well Cass, I never tried it on any other drivers. The ones I use are the latest official ones from NVIDIA for XP. It works fine for the other extensions I use, like NV_vertex_program, NV_register_combiners, etc. Maybe I'll try out some of those "leaked" drivers ;). A while back I tried some of those, I don't know what version, 27.00 or something, and they made my main OpenGL test demo all screwed up. They even made my games slow, but they're beta and unofficial so I can't say too much :).

I'm not sure what version of drivers I should try, but I guess I'll try something :).

Also Cass, I want to ask you something: where is the NVIDIA office you work at in Austin located? Every day I drive through south Austin and downtown, since I go to the Rio Grande ACC campus, and even when I sometimes go to other parts of Austin I have never seen an NVIDIA building. I'm just curious, that's all :).

-SirKnight