ATI and lack of h/w support

So I have this gfx card (running on XP, latest drivers I think):

OpenGL Version : 2.1.7293 FireGL Release
OpenGL Renderer : ATI FireGL V7300
OpenGL Vendor : ATI Technologies Inc.

It says it supports OpenGL 2.1. Great!
But the extension list doesn’t include non-power-of-two textures, even though that is a core OpenGL 2.0 feature. I also remember my old 9600SE didn’t support non-power-of-two textures, even though it claimed OpenGL 2.0 support. So what is going on, have I overlooked something? Or has ATI just claimed OpenGL 2.0/2.1 support without bothering to implement all the features?

And yes, I did try using non-power-of-two textures on the FireGL card; it just crashes the driver.

It could be that the problem is in your code. Are you calling glPixelStorei(GL_UNPACK_ALIGNMENT, 1)?
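(For context on why that call matters: the default GL_UNPACK_ALIGNMENT is 4, so with tightly packed 3-byte RGB pixels and a row width that isn’t a multiple of 4, the driver steps further per row than your data provides and can read past the end of the buffer. A minimal sketch of the row stride GL assumes; the function name is mine, not a GL call:)

```c
#include <assert.h>

/* Bytes per row that glTexImage2D/glTexSubImage2D will read from client
   memory, given width in pixels, bytes per pixel and the current
   GL_UNPACK_ALIGNMENT (1, 2, 4 or 8; the default is 4). */
static int unpack_row_stride(int width, int bytes_per_pixel, int alignment)
{
    int tight = width * bytes_per_pixel;                    /* tightly packed row */
    return (tight + alignment - 1) / alignment * alignment; /* rounded up */
}
```

With a 125-pixel-wide RGB image, the tight row is 375 bytes, but at the default alignment GL steps 376 bytes per row, walking off the end of your buffer by the last row. Setting GL_UNPACK_ALIGNMENT to 1 makes the two agree.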

PS: It’s not an obligation to have GL_ARB_texture_non_power_of_two in the extension list when you claim GL 2.1 support.
More info here http://www.opengl.org/wiki/index.php/NPOT_Textures
“OpenGL 2.0 and GL_ARB_texture_non_power_of_two”

No, I wasn’t calling glPixelStorei.
I was simply trying to render a textured quad, and it crashed the driver. Making it a power-of-two texture solved it.
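For anyone hitting the same crash: the classic workaround when NPOT support is missing or flaky is to pad (or scale, e.g. with gluScaleImage) the image up to the next power of two and shrink the texture coordinates accordingly. The usual bit-twiddling round-up, as a sketch (the helper name is mine):

```c
#include <assert.h>

/* Smallest power of two >= x, valid for 1 <= x <= 2^31.
   Smears the highest set bit of x-1 into all lower bits, then adds 1. */
static unsigned next_pow2(unsigned x)
{
    x--;
    x |= x >> 1;
    x |= x >> 2;
    x |= x >> 4;
    x |= x >> 8;
    x |= x >> 16;
    return x + 1;
}
```

A 640x480 image then lives in a 1024x512 texture, and you sample it with maximum texcoords of 640/1024 and 480/512 instead of 1.0.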

According to
http://www.opengl.org/documentation/specs/version2.1/glspec21.pdf
“The name string for non-power-of-two textures is GL_ARB_texture_non_power_of_two. It was promoted to a core feature in OpenGL 2.0.”

Are you allowed to implement core features in software? Or to simply not bother implementing them at all?

The wiki you pasted says:

“If you don’t have GL_ARB_texture_non_power_of_two, then you can make NPOT textures”

Okay, but what happens if you have an OpenGL 1.5 card, and the manufacturer wants to just throw in the non-power-of-two extension? If the presence of the extension in the list means nothing, how are you supposed to know whether the card supports it?

I think ATI have basically f*cked up here. No doubt they realised their older generation of cards has no hardware support for non-power-of-two textures… but they want to claim OpenGL 2.0 support anyway.

http://www.opengl.org/registry/specs/ARB/texture_non_power_of_two.txt
says nothing about the extension string being merely an indicator of whether the feature is hardware accelerated. No other extension works like that…

I am so close to buying a POS ATI card and throwing it in the girlfriend’s PC, just so I can post the damn driver state here for people to see what is what. I myself am sick of ATI not publishing a listing of the extensions they support. Nvidia has one; come on ATI, get your act together. Until ATI fully supports texture arrays, VTF, FBOs, and geometry shaders in OpenGL, I am staying on Nvidia.

As I already stated in another thread, a friend of mine just got a Radeon 4850. It is the biggest nightmare I have EVER seen. This card should be able to run our application out of the box with ease, but it does a whole lot of not-funny things. First it compiles all our shaders flawlessly, then crashes, because we do things in the shaders that ATI doesn’t seem to like. Great; figuring out what that is takes ages.

Then the funniest problem of all: it renders our image upside-down (or mirrored, I’m not sure). This is a really strange problem.

Render-to-texture is pretty much impossible; it produces all sorts of artifacts. We use glBlitFramebuffer and stencil masks, and I don’t know which of those is responsible (maybe all of them), but it ends up rendering a regular grid of white dots on the screen on top of our image. Well, that’s only the part of the story we were able to figure out in two days.

Oh yes, and ATI simply did this: they checked what hardware features OpenGL 2.1 requires (DAMN you ARB for setting them so low!), and then said “hey, we DO support 2.1, let’s write that on the package”.
Thing is, their OpenGL driver is artificially castrated, so even though the hardware would be capable (all D3D10.1 requirements fulfilled), we OpenGL users are left with nothing.

For example, the package says 8 MRTs, but GL only allows 4. I am pretty sure D3D10 requires 4096 shader uniforms (or the DX equivalent), but GL gives you 512. Etc. etc. etc.

My friend asked his hardware store, and they will exchange his Radeon 4850 for a GeForce 9600 tomorrow.

Jan.

Unless the core feature is optional (in which case the API contains some form of check for the feature itself, or for limits related to it), the driver must implement that feature. If the hardware does not support it, it must be emulated in software. At least, that would be the ideal world; in the real world, the software emulation is often buggy.

Okay, but what happens if you have an OpenGL 1.5 card, and the manufacturer wants to just throw in the non-power-of-two extension? If the presence of the extension in the list means nothing, how are you supposed to know whether the card supports it?

If the GL_ARB_texture_non_power_of_two string is present in the extension list, NPOT textures are supported regardless of the OpenGL version itself.

What the wiki page is trying to say is: if you find GL_ARB_texture_non_power_of_two in the extension string, NPOT textures are supported in hardware. If OpenGL is at version 2.0, NPOT textures are supported (you can create them and use them for rendering), but they might run in software (always, or unless some special conditions are met), so if you wish to be “sure” that they are hardware accelerated, you should check for the extension string.
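That policy can be sketched in a few lines of C. The version and extension strings below would come from glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS); they are plain parameters here so the logic works without a GL context, and has_extension/npot_support are my own helper names. Note the whole-token match: a naive strstr would claim GL_EXT_texture is present when only GL_EXT_texture3D is listed.

```c
#include <stdio.h>
#include <string.h>

/* 1 if `name` occurs as a whole space-separated token in `extlist`. */
static int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extlist) || (p[-1] == ' ');    /* token start */
        int ends   = (p[len] == ' ') || (p[len] == '\0'); /* token end   */
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}

/* 2 = NPOT advertised via the extension (likely HW accelerated),
   1 = core in GL >= 2.0, but possibly a software path,
   0 = not available. */
static int npot_support(const char *version, const char *extensions)
{
    int major = 0;
    sscanf(version, "%d", &major);  /* e.g. "2.1.7659 Release" -> 2 */
    if (has_extension(extensions, "GL_ARB_texture_non_power_of_two"))
        return 2;
    return (major >= 2) ? 1 : 0;
}
```

On the FireGL strings from the first post this would return 1: core NPOT, but no promise of a hardware path.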

Here is a listing of what is supported on an ATI Radeon 3650. I put it in the girlfriend’s HP desktop as an upgrade from a GF6-series XT model. A nice upgrade, but the GL support still SUCKS.

Renderer: ATI Radeon HD 3600 Series
Vendor: ATI Technologies Inc.
Memory: 512 MB
Version: 2.1.7659 Release
Shading language version: 1.20
Max number of light sources: 8
Max viewport size: 8192 x 8192
Max texture size: 8192 x 8192
Max anisotropy: 16
Max samples: 8
Max draw buffers: 4
Max texture coordinates: 8
Max vertex texture image units: 16

Extensions: 102

GL_AMD_performance_monitor
GL_AMDX_vertex_shader_tessellator
GL_ARB_depth_texture
GL_ARB_draw_buffers
GL_ARB_fragment_program
GL_ARB_fragment_shader
GL_ARB_multisample
GL_ARB_multitexture
GL_ARB_occlusion_query
GL_ARB_pixel_buffer_object
GL_ARB_point_parameters
GL_ARB_point_sprite
GL_ARB_shader_objects
GL_ARB_shading_language_100
GL_ARB_shadow
GL_ARB_shadow_ambient
GL_ARB_texture_border_clamp
GL_ARB_texture_compression
GL_ARB_texture_cube_map
GL_ARB_texture_env_add
GL_ARB_texture_env_combine
GL_ARB_texture_env_crossbar
GL_ARB_texture_env_dot3
GL_ARB_texture_float
GL_ARB_texture_mirrored_repeat
GL_ARB_texture_non_power_of_two
GL_ARB_texture_rectangle
GL_ARB_transpose_matrix
GL_ARB_vertex_buffer_object
GL_ARB_vertex_program
GL_ARB_vertex_shader
GL_ARB_window_pos
GL_ATI_draw_buffers
GL_ATI_envmap_bumpmap
GL_ATI_fragment_shader
GL_ATI_meminfo
GL_ATI_separate_stencil
GL_ATI_texture_compression_3dc
GL_ATI_texture_env_combine3
GL_ATI_texture_float
GL_EXT_abgr
GL_EXT_bgra
GL_EXT_blend_color
GL_EXT_blend_func_separate
GL_EXT_blend_minmax
GL_EXT_blend_subtract
GL_EXT_compiled_vertex_array
GL_EXT_copy_texture
GL_EXT_draw_range_elements
GL_EXT_fog_coord
GL_EXT_framebuffer_blit
GL_EXT_framebuffer_multisample
GL_EXT_framebuffer_object
GL_EXT_framebuffer_sRGB
GL_EXT_gpu_program_parameters
GL_EXT_multi_draw_arrays
GL_EXT_packed_depth_stencil
GL_EXT_packed_float
GL_EXT_packed_pixels
GL_EXT_point_parameters
GL_EXT_rescale_normal
GL_EXT_secondary_color
GL_EXT_separate_specular_color
GL_EXT_shadow_funcs
GL_EXT_stencil_wrap
GL_EXT_subtexture
GL_EXT_texgen_reflection
GL_EXT_texture3D
GL_EXT_texture_compression_s3tc
GL_EXT_texture_cube_map
GL_EXT_texture_edge_clamp
GL_EXT_texture_env_add
GL_EXT_texture_env_combine
GL_EXT_texture_env_dot3
GL_EXT_texture_filter_anisotropic
GL_EXT_texture_lod_bias
GL_EXT_texture_mirror_clamp
GL_EXT_texture_object
GL_EXT_texture_rectangle
GL_EXT_texture_shared_exponent
GL_EXT_texture_sRGB
GL_EXT_vertex_array
GL_KTX_buffer_region
GL_NV_blend_square
GL_NV_texgen_reflection
GL_SGIS_generate_mipmap
GL_SGIS_texture_edge_clamp
GL_SGIS_texture_lod
GL_WIN_swap_hint
WGL_ARB_buffer_region
WGL_ARB_extensions_string
WGL_ARB_make_current_read
WGL_ARB_multisample
WGL_ARB_pbuffer
WGL_ARB_pixel_format
WGL_ARB_render_texture
WGL_ATI_pixel_format_float
WGL_ATI_render_texture_rectangle
WGL_EXT_extensions_string
WGL_EXT_framebuffer_sRGB
WGL_EXT_pixel_format_packed_float
WGL_EXT_swap_control

Core features
v1.1 (100 % - 7/7)
v1.2 (100 % - 8/8)
v1.3 (100 % - 9/9)
v1.4 (100 % - 15/15)
v1.5 (100 % - 3/3)
v2.0 (100 % - 10/10)
v2.1 (100 % - 3/3)

OpenGL driver version check (Current: 6.14.10.7659, Latest known: 2.1.7536 Release):
Latest version of display drivers found.
According to the database, you are running the latest display drivers for your video card.

Compiled vertex array support
This feature improves OpenGL performance by using video memory to cache transformed vertices.

Multitexture support
This feature accelerates complex rendering such as lightmaps or environment mapping.

Secondary color support
This feature provides an alternate method of coloring specular highlights on polygons.

S3TC compression support
This feature improves texture mapping performance in some applications by using lossy compression.

Texture edge clamp support
This feature improves texturing quality by adding clamping control to edge texel filtering.

Vertex program support
This feature enables a wide variety of effects via flexible vertex programming (equivalent to DX8 Vertex Shader.)

Fragment program support
This feature enables a wide variety of effects via per pixel programming (equivalent to DX9 Pixel Shader.)

Texture anisotropic filtering support
This feature improves the quality of texture mapping on oblique surfaces.

Occlusion test support
This feature provides hardware accelerated culling for objects.

Point sprite support
This feature improves performance in some particle systems.

OpenGL Shading Language support
This feature enables high level shading language for shaders.

Frame buffer object support
This feature enables render to texture functionality.

Extension verification:
GL_ARB_color_buffer_float was not found, but has the entry point glClampColorARB
GL_ARB_imaging was not found, but has the entry point glBlendEquation
GL_EXT_vertex_shader was not found, but has the entry point glBeginVertexShaderEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindLightParameterEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindMaterialParameterEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindParameterEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindTexGenParameterEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindTextureUnitParameterEXT
GL_EXT_vertex_shader was not found, but has the entry point glBindVertexShaderEXT
GL_EXT_vertex_shader was not found, but has the entry point glDeleteVertexShaderEXT
GL_EXT_vertex_shader was not found, but has the entry point glDisableVariantClientStateEXT
GL_EXT_vertex_shader was not found, but has the entry point glEnableVariantClientStateEXT
GL_EXT_vertex_shader was not found, but has the entry point glEndVertexShaderEXT
GL_EXT_vertex_shader was not found, but has the entry point glExtractComponentEXT
GL_EXT_vertex_shader was not found, but has the entry point glGenSymbolsEXT
GL_EXT_vertex_shader was not found, but has the entry point glGenVertexShadersEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetInvariantBooleanvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetInvariantFloatvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetInvariantIntegervEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetLocalConstantBooleanvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetLocalConstantFloatvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetLocalConstantIntegervEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetVariantBooleanvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetVariantFloatvEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetVariantIntegervEXT
GL_EXT_vertex_shader was not found, but has the entry point glGetVariantPointervEXT
GL_EXT_vertex_shader was not found, but has the entry point glInsertComponentEXT
GL_EXT_vertex_shader was not found, but has the entry point glIsVariantEnabledEXT
GL_EXT_vertex_shader was not found, but has the entry point glSetInvariantEXT
GL_EXT_vertex_shader was not found, but has the entry point glSetLocalConstantEXT
GL_EXT_vertex_shader was not found, but has the entry point glShaderOp1EXT
GL_EXT_vertex_shader was not found, but has the entry point glShaderOp2EXT
GL_EXT_vertex_shader was not found, but has the entry point glShaderOp3EXT
GL_EXT_vertex_shader was not found, but has the entry point glSwizzleEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantPointerEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantbvEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantdvEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantfvEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantivEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantsvEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantubvEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantuivEXT
GL_EXT_vertex_shader was not found, but has the entry point glVariantusvEXT
GL_EXT_vertex_shader was not found, but has the entry point glWriteMaskEXT
GL_ARB_half_float_pixel was not found, but is available in driver version 2.1.7536 Release

Does anyone even update Delphi3D anymore? I would send the data for the 3650 if I knew the database would be updated…

A few days ago I tried to upload the data for a 3870, but the link for automatic upload did not work.

I’m seeing this too on all Radeon HD chips. It happens with both FBOs and pbuffers, and it seems to me that the driver is simply putting the texcoord origin at the upper-left corner when texturing from an off-screen render target, which is wrong. It does not happen on earlier chips (like the X1800), but I get different problems there.

Direct from ATI/AMD HQ…

• Non-power-of-two: as stated in the thread, it is in core, so there is no need to export the string. Older AMD DirectX 9 hardware could support NPOT textures, although with some hardware restrictions. Please report any crash issue, with sample code, to AMD.

• Regarding the problems mentioned (image up-side-down, render-to-texture), we are not aware of any problems. Please send any sample code and we can help you debug the issue.

• HW Capabilities (MRT, shader uniform) - Full HW Capability will be implemented and supported in our driver shortly.

• Regarding the missing extensions, we plan to support these features once they are standardized by the ARB and not using proprietary extension. These extensions are being standardized and we will support them soon.

If you look at the currently posted Viewperf 10 results, http://www.spec.org/gwpg/gpc.data/vp10/summary.html, ATI graphics owns the top spots in almost all benchmarks; the FireGL V7700 posted the fastest numbers ever. The AMD press release of 6/26/2008, http://www.amd.com/us-en/Corporate/VirtualPressRoom/0,51_104_543~126871,00.html, shows that ATI graphics Viewperf scores are industry leading for Linux as well. If you look at the list of 3rd-party applications that have certified ATI OpenGL drivers, http://ati.amd.com/products/workstation/ISVCertsFireGL.pdf, it is difficult to believe that the ATI drivers are horribly broken or non-functional, as the drivers have been tested by dozens of third-party software developers for hundreds of applications.

I think many programs use the presence of the extension string to detect whether NPOT textures are really supported. Without that string, such programs would consider even the latest ATI card unsupported and might fall back to less efficient workarounds.

If you look at the list of 3rd party applications that have certified ATI OpenGL drivers at http://ati.amd.com/products/workstation/ISVCertsFireGL.pdf, it is difficult to believe that the ATI drivers are horribly broken or non-functional, as the drivers have been tested by dozens of third party software developers for hundreds of applications.

Hmm. And which applications have certified drivers for the consumer range of ATI cards (e.g. the Radeon 4850 mentioned in the post you are replying to)? Consumer cards are the target for many people on this board. CAD-type applications also likely use a different subset of OpenGL than gaming applications do.

“Direct from ATI/AMD HQ…”
Is this meant to tell me that you are a PR guy who now got the answer from the people who actually know what they are talking about? (Sorry for being so rude, but I hate PR guys, and they definitely do not belong in a technical discussion.)

“• Regarding the problems mentioned (image up-side-down, render-to-texture), we are not aware of any problems. Please send any sample code and we can help you debug the issue.”

I would, if I needed to, but since we have a bit of control over the hardware used in the end, it is simply easier to tell people to use nVidia hardware. It should run pretty fine on ATI’s Radeon X1xxx series, though (the last time I checked was ~4 months ago).

I assume you have been following these forums for a while, so you might have noticed that I have always told people “ATI isn’t really worse than nVidia; both have their problems”, because I actually used ATI for a long time and usually had no more problems than with nVidia (cards I used: Radeon 9600, Mobility Radeon 9700, Mobility Radeon X1600). Most other people kept complaining about ATI, although they mainly used nVidia, so of course there were a few issues when they made their apps work on ATI. Now I have switched to nVidia and can assure you I was right: ATI was not worse than nVidia.

All until the 3xxx and 4xxx series, that is. Seeing that the 3xxx series has the same MRT/uniform limitations as the 4xxx, I really DOUBT that those are limitations “which will be lifted soon”, because the 3xxx series has been out for quite some time.

Also, I don’t quite get where the problem is in supporting 512 versus 4096 uniforms, if the hardware can do it. I have no knowledge of hardware internals, but even if the driver needed to do some bookkeeping that is not efficient enough to allow all 4096 uniforms to be used, it should still be possible to allow at least 2048.

Performance is an entirely different matter. No one said the cards were slow; in fact, they are amazingly fast. The 4xxx series is definitely worth its money for gamers and D3D developers. But for OpenGL developers the situation isn’t really great, simply because there are so many problems: unsupported framebuffer formats, depth/stencil issues, and of course a long list of unsupported extensions.

“• Regarding the missing extensions, we plan to support these features once they are standardized by the ARB and not using proprietary extension. These extensions are being standardized and we will support them soon.”

That is great to hear; I actually assumed that ATI was simply waiting for GL 3. Of course, 2.1 with standardized extensions is also good, for starters. But still, it is not there YET, so people WILL complain.

Oh, and by the way: two of your three links are broken (404, etc.). And the ONE link that is working tells me that AMD improves performance on Linux considerably. Sorry, but I am used to too much PR talk; I really don’t believe this. Simply throttle the drivers more and more over a year or so, then release their full speed again, and you have a nice PR statement! Whether AMD did that or not, I don’t care; this is simply news I don’t buy, no matter whether it is true or not (it’s not AMD’s fault, it’s the whole industry that I don’t trust).

“it is difficult to believe that the ATI drivers are horribly broken or non-functional, as the drivers have been tested by dozens of third party software developers for hundreds of applications”

Yes, the older drivers are very good. And even the new drivers are certainly quite good, if you only use 95% of the feature set, or only develop on ATI so that you know its kinks. The problem is simply that when a new hardware generation is released and your app worked well on the previous generation, you expect the new generation to have no NEW/OTHER issues than the previous one. You expect that new features might not work well, but all OLD features should work as on older hardware. THAT’s the real problem with the 3xxx/4xxx series: it introduced several problems/changes that the previous generation did not have. And THAT’s what pissed ME off, because I simply expect a working app to continue to work on new hardware from the same vendor, only faster.

Well, we will see whether ATI gets it done “soon”. “Soon” does have quite a different definition in the OpenGL world…

Jan.

Personally, I’m also in “wait and see to believe” mode. After all, how many years did it take ATI/AMD to add multisampled FBOs/blit to their driver? And that’s not a proprietary vs. non-proprietary extension problem; the ARB version has been out for years.

The last time I contacted AMD devrel, I received amazing answers like “it is normal that your shader isn’t working; the ATI X1xxx doesn’t support loops and conditionals in hardware”. WTF? If I contact devrel, I don’t expect to speak to a PR guy who doesn’t know anything about his own hardware…

Despite this, I still like ATI. I agree their cards are really good hardware-wise, and I agree with Jan that NVidia drivers aren’t necessarily better. I’ve seen my share of bugs that happen on NVidia cards after switching from ATI ones…

Y.

The inner child in me wants to believe that the only reason for the ATI situation is that GL 3.0 is around the corner…