
Thread: GLSL noise fail? Not necessarily!

  1. #51
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Your benchmark result is 25% lower than mine on the same hardware and software (MacBook Pro, GF9400M, MacOS X, 1280x800 fullscreen). Did you run the demo on a single screen? Mirroring, or even just having a second display active, tends to slow down the display subsystem quite a lot on MacOS X. An earlier post on page 4 of this thread contains my results.

  2. #52
    Member Regular Contributor
    Join Date
    Mar 2003
    Location
    Spain
    Posts
    273

    Re: GLSL noise fail? Not necessarily!

    Yes, I ran it on a single screen, same resolution, same graphics chip, OS: Mac OS X Snow Leopard, but my MacBook is not the "Pro" version. Perhaps there are slight differences in CPU speed. I will try again, making sure there are no background applications that could slow down the system.
    "I don't know... fly casual"

  3. #53
    Junior Member Newbie
    Join Date
    May 2005
    Posts
    17

    Re: GLSL noise fail? Not necessarily!

    On my Giaida N20 nettop with...

    Ubuntu 10.10 32 bit
    Intel Atom D525 (1.8 GHz, dual core)
    NVIDIA ION2 with 512MB Graphics

    Hooked into my gaming/browsing TV

    Code :
    GL vendor:    NVIDIA Corporation
    GL renderer:  GeForce 210/PCI/SSE2
    GL version:   3.3.0 NVIDIA 260.19.06
    Desktop size: 1280 x 720 pixels
     
    2D simplex noise, version 2011-03-25, 133.1 Msamples/s
    3D simplex noise, version 2011-03-25, 64.3 Msamples/s
    4D simplex noise, version 2011-03-25, 36.0 Msamples/s

    Even though the console showed "Fragment shader compile error:" as the first line of output, the results looked as expected.

  4. #54
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    For some reason, that "Fragment shader compile error:" message shows up on many platforms, even though the error message you get when you ask what went wrong is an empty string. On some systems I have tried, the "error" reported is even "Shader successfully compiled", so the notion of when to signal an error seems to be rather hazy in many GLSL compilers.

  5. #55
    Member Regular Contributor trinitrotoluene's Avatar
    Join Date
    Sep 2008
    Location
    Montérégie, Québec
    Posts
    362

    Re: GLSL noise fail? Not necessarily!

    I have not read the recent source code of the program, so this is only a suggestion, but the program should not rely on the info log to detect a shader compile failure; it should rely on the compile status:

    Code :
    GLint is_compiled;
    glGetShaderiv(theShader, GL_COMPILE_STATUS, &is_compiled);
     
    // Fetch the info log separately; it may be non-empty even on success.
    GLchar shaderLog[4096];
    glGetShaderInfoLog(theShader, sizeof(shaderLog), NULL, shaderLog);
     
    if(is_compiled != GL_TRUE)
    {
     cout<<"Fragment shader compile error: "<<shaderLog<<endl;
    }
    else
    {
     cout<<"Fragment shader compile success: "<<shaderLog<<endl;
    }

    Oops, I should have read the code before posting, because the compile status check is already done in the noisebench.c file.

  6. #56
    Junior Member Newbie
    Join Date
    Apr 2009
    Posts
    1

    Re: GLSL noise fail? Not necessarily!

    His permutation function maps 0 to 0. Otherwise it is good.
    I do not understand the replacement for the gradient table, though... how does that work?

  7. #57
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    His permutation function maps 0 to 0. Otherwise it is good.
    That 0->0 mapping is not a problem. It is perfectly all right for a permutation to have one or even several fixed points that map to themselves, as long as they do not appear in too regular a pattern.

    The permutation is a permutation polynomial: permute(x) is computed as (34*x^2 + x) mod 289.
    This is one of the two neat and original ideas in Ian's implementation. (The other one is the clever generation of gradients.)
    You can read about permutation polynomials on Wikipedia.
    It is not a new idea in mathematics, it is just new for this application. A proper journal article on this noise implementation is on its way, but please have patience.
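
    For the curious, the polynomial boils down to a one-liner in GLSL. This is a sketch of the idea; see the repository for the exact function used:

    Code :
    // Permutation polynomial: (34*x^2 + x) mod 289, applied componentwise.
    // 289 = 17*17, and the polynomial permutes the integers 0..288.
    vec3 permute(vec3 x) {
        return mod((34.0 * x + 1.0) * x, 289.0);
    }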

  8. #58
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    The github repository has now been updated with some slight speedups, code cleanups and classic Perlin noise in regular and periodic versions.

    2D simplex noise is now only about 20 mult and add operations (including five dot operations), one division, three mod, two floor and one each of step, max, fract and abs.

    I get 1.5 billion 2D noise samples per second on my relatively measly Nvidia GTX260. An ATI HD5870 spits out 5 billion samples per second.
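
    For reference, here is roughly what the 2D function now looks like. This is a sketch reproduced from the published webgl-noise code, including tweaks made after this post, so treat the github repository as the authoritative version:

    Code :
    vec3 mod289(vec3 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec2 mod289(vec2 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec3 permute(vec3 x) { return mod289(((x * 34.0) + 1.0) * x); }
     
    float snoise(vec2 v)
    {
      const vec4 C = vec4(0.211324865405187,   // (3.0 - sqrt(3.0)) / 6.0
                          0.366025403784439,   // 0.5 * (sqrt(3.0) - 1.0)
                         -0.577350269189626,   // -1.0 + 2.0 * C.x
                          0.024390243902439);  // 1.0 / 41.0
      // Skew the input coordinates to find the containing simplex cell.
      vec2 i  = floor(v + dot(v, C.yy));
      vec2 x0 = v - i + dot(i, C.xx);
      // Offsets to the other two corners of the simplex.
      vec2 i1 = (x0.x > x0.y) ? vec2(1.0, 0.0) : vec2(0.0, 1.0);
      vec4 x12 = x0.xyxy + C.xxzz;
      x12.xy -= i1;
      // Hash the three corners with the permutation polynomial.
      i = mod289(i);
      vec3 p = permute(permute(i.y + vec3(0.0, i1.y, 1.0))
                             + i.x + vec3(0.0, i1.x, 1.0));
      // Radial falloff for the three corner contributions.
      vec3 m = max(0.5 - vec3(dot(x0, x0), dot(x12.xy, x12.xy),
                              dot(x12.zw, x12.zw)), 0.0);
      m = m * m;
      m = m * m;
      // Compute gradients from the hash values, without a lookup table.
      vec3 x = 2.0 * fract(p * C.www) - 1.0;
      vec3 h = abs(x) - 0.5;
      vec3 ox = floor(x + 0.5);
      vec3 a0 = x - ox;
      // Approximate gradient normalization, folded into the falloff.
      m *= 1.79284291400159 - 0.85373472095314 * (a0 * a0 + h * h);
      // Sum the three corner contributions and scale to roughly [-1, 1].
      vec3 g;
      g.x  = a0.x * x0.x + h.x * x0.y;
      g.yz = a0.yz * x12.xz + h.yz * x12.yw;
      return 130.0 * dot(m, g);
    }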

  9. #59
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    The 2D simplex noise was just optimized some more. I replaced a division with a multiplication and removed one multiplication and one addition by introducing two more constants. The speedup I see on my system (ATI HD4850) is about 5%.
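
    The kind of rewrite involved, as a generic illustration rather than the exact diff: a division by a constant is replaced by a multiplication with its reciprocal, which the compiler folds into a single constant.

    Code :
    // Before: one division per shader invocation.
    float y_div = x / 289.0;
    // After: (1.0 / 289.0) folds to a constant, leaving one multiplication.
    float y_mul = x * (1.0 / 289.0);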

    The level of hand feeding you need to do to optimize GLSL code reminds me of C compilers from the early 1990's.

  10. #60
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    3,183

    Re: GLSL noise fail? Not necessarily!

    Quote Originally Posted by StefanG
    ...The speedup I see on my system (ATI HD4850) is about 5%. The level of hand feeding you need to do to optimize GLSL code reminds me of C compilers from the early 1990's.
    I'm curious if this was your general GLSL experience with ATI, NVidia, and Intel drivers, or just regarding ATI drivers in particular.

    With NVidia, I've been amazed at how much complexity/infrastructure you can stack on top, and yet how aggressively and effectively it throws away what it can and transforms the code into something very efficient.
