NVIDIA's GL2 drivers
I just took a look at nvidia's new OpenGL2.0 drivers for win32.
While I'm very proud I can finally run the 3Dlabs demos on my PC, I notice the lack of the built-in Perlin noise function in shaders.
Does anyone know if it's going to ever be implemented?
Is there a possibility the hardware won't support it directly (I'm actually on an NV43)?
Is there a chance there will be an "emulator", like nvEmulate, just to have it in place, maybe at reduced performance?
07-18-2005, 02:05 PM
I believe 3Dlabs is the only vendor that currently supports the noise function.
07-18-2005, 05:04 PM
Does anyone know if it's going to ever be implemented?
I doubt it. If it were up to me, supporting noise() and other such functions would be the absolute last thing on my list of things to do for glslang functionality.
Is there a possibility the hardware won't support it directly (I'm actually on NV43)?
I, for one, would certainly not bother putting that in my hardware.
Originally posted by Korval:
I doubt it. If it were up to me, supporting noise() and other such functions would be the absolute last thing on my list of things to do for glslang functionality.
Well, you're right for now, but considering the growing gap between processing power and bandwidth, I fear we'll have to switch to procedural texturing pretty soon to avoid being bandwidth limited.
However, pixel derivatives are another poorly supported operation, so procedural texturing is not really possible anyway.
Originally posted by Korval:
I, for one, would certainly not bother putting that in my hardware.
Ok, I'll check again... next year, I think. I guess it would take less than 400 bytes of cache, since noise is basically a LUT, plus some extra transistors in the decode unit.
For 16-bit noise, that would rise to about 128 KiB. It does not seem to be a very expensive thing.
Anyway, thank you both for the replies!
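Those back-of-envelope figures can be checked directly. This is a rough sketch under my own assumption about what "noise is basically a LUT" means: a classic 256-entry permutation table for the 8-bit case, and one two-byte entry per possible input value for the 16-bit case.

```python
# Rough sanity check of the storage estimates above. The interpretation
# of "noise is basically a LUT" is my assumption, not a vendor statement.

perm_table_bytes = 256 * 1        # classic 256-entry permutation table, 1 byte each
lut_16bit_bytes = (2 ** 16) * 2   # one 16-bit entry per 16-bit input value

print(perm_table_bytes)           # 256 -> under the ~400-byte estimate
print(lut_16bit_bytes // 1024)    # 128 -> the "128k" (KiB) figure from the post
```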
07-19-2005, 01:50 AM
You can implement your own. Just search this forum for "simplex noise" and pick what you need.
Sure, I already implemented it on the CPU, so I have only one doubt.
The problem lies in encoding the permutation table in a texture or a uniform array. I guess most people do this with a texture; however, since the accesses are random, this method will kill the texture cache.
I don't think the same applies to a uniform array, since I guess all the uniforms are loaded into fast registers, so this could be a win.
Do you think I should care about that?
The point is that the performance of an "overridden" noise function could be very different from a hardware-accelerated one.
I understand, however, that for development purposes this is the only way, so I guess I'll do that; I just want it to be as similar as possible (performance-wise) to hardware-accelerated noise.
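For reference, the permutation-table approach discussed here can be sketched on the CPU as classic Perlin-style gradient noise. This is a hedged sketch, not any vendor's implementation; the `PERM` table is exactly the data one would upload to the GPU as a 256-texel texture or a uniform int array, and all names are illustrative.

```python
import math
import random

random.seed(42)          # fixed seed: a noise function must be repeatable
PERM = list(range(256))
random.shuffle(PERM)
PERM = PERM * 2          # doubled so corner lookups never need wrapping

def fade(t):
    # Perlin's quintic smoothstep: 6t^5 - 15t^4 + 10t^3
    return t * t * t * (t * (t * 6 - 15) + 10)

def lerp(a, b, t):
    return a + t * (b - a)

def grad(h, x, y):
    # pick one of four diagonal gradients from the hashed corner value
    return (x if h & 1 else -x) + (y if h & 2 else -y)

def noise2(x, y):
    xi, yi = int(math.floor(x)) & 255, int(math.floor(y)) & 255
    xf, yf = x - math.floor(x), y - math.floor(y)
    u, v = fade(xf), fade(yf)
    # hash the four cell corners through the permutation table;
    # on the GPU these would be the "random" texture/array fetches
    aa = PERM[PERM[xi] + yi]
    ab = PERM[PERM[xi] + yi + 1]
    ba = PERM[PERM[xi + 1] + yi]
    bb = PERM[PERM[xi + 1] + yi + 1]
    return lerp(lerp(grad(aa, xf, yf),     grad(ba, xf - 1, yf),     u),
                lerp(grad(ab, xf, yf - 1), grad(bb, xf - 1, yf - 1), u),
                v)
```

Note that the four corner hashes are dependent lookups (`PERM[PERM[xi] + yi]`), which is exactly why random access into a texture-encoded table thrashes the cache.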
07-19-2005, 10:38 AM
Well, you're right for now, but considering the growing gap between processing power and bandwidth, I fear we'll have to switch to procedural texturing pretty soon to avoid being bandwidth limited.
I can find stuff to do between texture lookups, like high-quality lighting and so forth.
Besides, procedural textures don't look good for everything. Indeed, they don't look terribly good for most things. It's only really a select few kinds of objects (marble, wood-grain, etc) where procedural textures provide a decent visual output.
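The marble case mentioned above works because the classic pattern is just a sine wave warped by a noise term, roughly marble(p) = sin(x + turbulence(p)). A toy sketch, with a cheap hash standing in for a real noise function (the helper and constants are illustrative, not from any shader library):

```python
import math

def hash_noise(x):
    # toy pseudo-noise: NOT a real noise function, just enough to perturb.
    # The magic constants are a common shader-folklore hash, used here
    # purely for illustration.
    s = math.sin(x * 12.9898) * 43758.5453
    return s - math.floor(s)   # fractional part, in [0, 1)

def marble(x, y, turbulence=5.0):
    # stripes along x, distorted by the pseudo-noise; result in [-1, 1]
    return math.sin(x + turbulence * hash_noise(x * 0.1 + y * 0.07))
```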
It's only really a select few kinds of objects (marble, wood-grain, etc) where procedural textures provide a decent visual output
Ehem (instancing) cough ;)
Seriously though, I'm in agreement; noise is about the last thing I want to see the hardware do. It's a gimmick.
07-19-2005, 09:55 PM
Noise may be a gimmick, and there seems to be no incentive to make it more than that.
GLSL (and Cg?) define noise by characteristics, not algorithms. Which noise du jour should GPU vendors build into their compilers, let alone hardware? Does it matter if Nvidia doesn't match ATI? Maybe not for a game, where framerate trumps quality once quality is sufficient. Maybe yes for a production flow where initial work is hardware accelerated but final rendering is done in software (possibly with non-real-time shader programs, as GPUs improve). It may matter that the exact noise algorithm is known, so results track through the production flow.
A framework which allowed for a variety of noise functions (algorithms, derivatives, precision) might let compilers substitute tuned code as dictated by market response. (Isn't something like this done for compressed textures?) As a gimmick, none of this matters. It's just noise. :)