
View Full Version : Any tips for converting from Direct3D



yoyoyo
04-11-2002, 12:57 AM
Note this isn't a troll, I'm just frustrated.

Let's see, problems:

1. No AGP/video RAM usage built in.
2. I get one big vertex buffer with wglAllocMemoryNV? wglAllocMemoryNV returns NULL for almost everything.

3. Interleaved vertices don't seem to be a priority, but I thought that was what made D3D fast.
4. I can't find the HTML specs for 1.3 or 1.2.1, only Unix PS and PDFs written with bad fonts.
5. D3D spits out error messages to the debug window. Does OGL have error messages?
6. glGetError has a total of 8 error types, which doesn't help me find out what I'm doing wrong.
7. The extension spec format is almost worthless for finding out how to use the extensions. What am I supposed to do, merge it with the spec and then read the whole thing? Specs don't prepend GL_ to any identifiers; it's implied.
8. I can't set a texture stage to use the coordinates from another stage, so 3 stages = 3 texcoords and huge vertices.
9. TEX_ENV_COMBINE doesn't seem to work for stages without a texture, even if I'm using GL_PRIMARY_COLOR_EXT.
10. Display lists are good, but don't seem to support any sort of indexed primitives.
11. After I learn NVidia, I have to start over with ATI?
12. When's 2.0 coming out?

If you've got the answers to these, that would be greatly appreciated. I get the feeling the only reason Quake 3 got made was because id had driver source code for the cards. OpenGL would be great for a quick class project, but game ready it is not.

zeckensack
04-11-2002, 03:06 AM
I'll pick up the easier ones ;)

3. Pick one of the premade interleaved vertex array formats, or construct an arbitrary interleaved vertex format with the gl***Pointer and glEnableClientState functions. You can then submit single vertices with glArrayElement if you want to do that. glDrawElements and glDrawRangeElements should be reasonably fast.
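For example, a minimal sketch of an arbitrary interleaved format (the struct and function names are illustrative, not from any particular engine): one struct describes a vertex, and sizeof/offsetof give you the stride and offsets for the gl*Pointer calls.

```c
#include <GL/gl.h>
#include <stddef.h> /* offsetof */

/* One interleaved vertex: position, normal, one set of texcoords. */
typedef struct {
    GLfloat pos[3];
    GLfloat normal[3];
    GLfloat uv[2];
} Vertex;

/* Point every array into the same buffer; the stride is sizeof(Vertex). */
static void setup_interleaved(const Vertex *verts)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glVertexPointer(3, GL_FLOAT, sizeof(Vertex),
                    (const GLubyte *)verts + offsetof(Vertex, pos));
    glNormalPointer(GL_FLOAT, sizeof(Vertex),
                    (const GLubyte *)verts + offsetof(Vertex, normal));
    glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex),
                      (const GLubyte *)verts + offsetof(Vertex, uv));
}
```

After that, a single glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_SHORT, indices) submits indexed, interleaved geometry in one call.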

4. Err, looks good enough for me. :confused:

5/6. Some things are allowed, some are not. It's precisely noted in the specs which arguments are valid for which function. The only other error you have to worry about is GL_OUT_OF_MEMORY ... or maybe not ;)

7. The extension specs always state what they change. You should understand the basic behaviour of that part of OpenGL before trying to extend it, no? I like the format, because I've read enough of the core spec to just know it; having it all repeated in the extension specs would be redundant. Learn the ins and outs of core GL first and it'll get easier ;)

9. This is clearly noted. You have to bind a texture in all cases; I like to call mine GLuint dummy_texture;.
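A sketch of that workaround (the 1x1 white texture is one common choice, not the only one): a combine stage that should only pass through interpolated colour still gets a real texture bound, picked so it has no visual effect.

```c
#include <GL/gl.h>

/* Create a 1x1 white texture to bind on combine stages that don't
   actually sample a texture. Modulating by white is a no-op. */
static GLuint make_dummy_texture(void)
{
    static const GLubyte white[4] = { 255, 255, 255, 255 };
    GLuint dummy_texture;

    glGenTextures(1, &dummy_texture);
    glBindTexture(GL_TEXTURE_2D, dummy_texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, white);
    return dummy_texture;
}
```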

10. They do. Basically they support everything. You just have to understand that display lists are a snapshot in time: data is copied at list creation and can't be changed afterwards. They are best used for static geometry and - my favourite - blocks of state changes. E.g. 'updating' a vertex array contained in a display list has no effect on the list.
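The snapshot behaviour looks like this in practice (draw_mesh is a hypothetical helper standing in for whatever emits the geometry):

```c
#include <GL/gl.h>

extern void draw_mesh(void); /* hypothetical: emits the geometry */

static GLuint list = 0;

/* Compile once: everything draw_mesh() emits is copied into the list. */
void build_list(void)
{
    list = glGenLists(1);
    glNewList(list, GL_COMPILE);
    draw_mesh();          /* vertex data is captured here, by value */
    glEndList();
}

/* Replay every frame. Editing the original vertex arrays after
   build_list() has no effect on what glCallList() draws. */
void draw_frame(void)
{
    glCallList(list);
}
```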

11. There are some resources available, for example Delphi3D (http://www.delphi3d.net/), where you can get an overview of the supported extensions of different cards. For trouble-free portability between ATI and NVIDIA (all caps!), only use stuff that is supported by both.

Lucretia
04-11-2002, 04:11 AM
Originally posted by yoyoyo:
Note this isn't a troll, I'm just frustrated.

Let's see, problems:

1. No AGP/video RAM usage built in.


The OpenGL driver does the management.



2. I get one big vertex buffer with wglAllocMemoryNV? wglAllocMemoryNV returns NULL for almost everything.


Haven't used that yet, so I cannot comment.



3. Interleaved vertices don't seem to be a priority, but I thought that was what made D3D fast.


Use compiled vertex arrays for the models. I do this and keep separate arrays for each attribute; that way, if we have vertex weights, we can enable that array when the extension exists. The same applies to anything else that could be added extension-wise.

Plus, it's really fast.
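A sketch of the pattern, assuming EXT_compiled_vertex_array is present (the entry points come from wglGetProcAddress after checking the extension string; draw_locked is my name for the wrapper):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Fetched at startup after confirming GL_EXT_compiled_vertex_array
   appears in the GL_EXTENSIONS string. */
PFNGLLOCKARRAYSEXTPROC   glLockArraysEXT;
PFNGLUNLOCKARRAYSEXTPROC glUnlockArraysEXT;

void draw_locked(GLsizei vertex_count, GLsizei index_count,
                 const GLushort *indices)
{
    /* Promise the driver the arrays won't change between lock and
       unlock, so shared vertices can be transformed once and reused. */
    glLockArraysEXT(0, vertex_count);
    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_SHORT, indices);
    glUnlockArraysEXT();
}
```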



4. I can't find the HTML specs for 1.3 or 1.2.1, only Unix PS and PDFs written with bad fonts.


The 1.3 spec I have has a really nice font. It looks like it was made with LaTeX, which has some gorgeous fonts - although some translations from PS to PDF do screw up. Mine is fine.



5. D3D spits out error messages to the debug window, Does OGL have error messages?


Yup, glGetError().



6. glGetError has a total of 8 error types, which doesn't help me find out what I'm doing wrong


This should be enough to go by, because you should know where it's failing and can use that as your context.
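A sketch of a checkpoint helper along those lines. The hex values are the codes the GL spec assigns to these errors, hard-coded here so the snippet stands alone; in real code you'd include <GL/gl.h> and drain glGetError() in a loop, since several errors can be queued.

```c
#include <stdio.h>

/* Error codes as assigned by the OpenGL specification. */
const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case 0:      return "GL_NO_ERROR";
    case 0x0500: return "GL_INVALID_ENUM";
    case 0x0501: return "GL_INVALID_VALUE";
    case 0x0502: return "GL_INVALID_OPERATION";
    case 0x0503: return "GL_STACK_OVERFLOW";
    case 0x0504: return "GL_STACK_UNDERFLOW";
    case 0x0505: return "GL_OUT_OF_MEMORY";
    default:     return "unknown error";
    }
}

/* Sprinkle checkpoints like this after suspect blocks of GL calls:
       unsigned int e;
       while ((e = glGetError()) != 0)
           fprintf(stderr, "%s: %s\n", where, gl_error_name(e));
   The 'where' string is your context for narrowing down the bad call. */
```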



7. The extension spec format is almost worthless for finding out how to use the extensions. What am I supposed to do, merge it with the spec and then read the whole thing? Specs don't prepend GL_ to any identifiers; it's implied.


You really should have the spec next to you as well as the extension docs. I've had no problem implementing stuff from extension docs so far ;-)



8. I can't set a texture stage to use the coordinates from another stage, so 3 stages = 3 texcoords and huge vertices.


OpenGL doesn't have texture stages like D3D, but there are extensions to do something similar. I think most of them are NV extensions.
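One workaround under ARB_multitexture (a sketch; the entry point comes from wglGetProcAddress after checking the extension string): each texture unit has its own client texcoord pointer, but nothing stops you pointing several units at the same array, so the vertex doesn't need duplicate coordinates.

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Fetched at startup after confirming GL_ARB_multitexture is exposed. */
PFNGLCLIENTACTIVETEXTUREARBPROC glClientActiveTextureARB;

/* Reuse ONE texcoord array for three texture units. */
void share_texcoords(const GLfloat *uv, GLsizei stride)
{
    int unit;
    for (unit = 0; unit < 3; ++unit) {
        glClientActiveTextureARB(GL_TEXTURE0_ARB + unit);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glTexCoordPointer(2, GL_FLOAT, stride, uv);
    }
}
```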



9. TEX_ENV_COMBINE doesn't seem to work for stages without a texture, even if I'm using GL_PRIMARY_COLOR_EXT.


Dunno :-/



10. Display lists are good, but don't seem to support any sort of indexed primitives.


The only things they don't actually support are commands that change client state (such as the gl*Pointer calls), which are executed immediately rather than compiled into the list - look at the Red Book for details.



11. After I learn NVidia, I have to start over with ATI?


Depends. If you're using shaders/vertex programs, then yes: NVidia have their own language (which is based on assembly language) and ATI have theirs, which is a higher-level language.

OpenGL 2.0 will have a standardised language for all programmable interfaces.



12. When's 2.0 coming out?


Dunno.




If you've got the answers to these, that would be greatly appreciated. I get the feeling the only reason Quake 3 got made was because id had driver source code for the cards. OpenGL would be great for a quick class project, but game ready it is not.



Nope, Q3A was written by people who know OpenGL inside out. There is an "article" somewhere on the net in which John Carmack describes most of the problems and how their engine falls back to different modes of operation depending on what extensions you have in the driver for your graphics card.

This is the sort of thing you'll need to do to get things working well across a range of drivers.

Luke.