Thread: GeForce GTX 280... what for us, programmers? :)

  1. #1
    Groovounet (Super Moderator Frequent Contributor)
    Join Date: Jul 2004 | Posts: 937

    GeForce GTX 280... what for us, programmers? :)

    This is it: the GeForce GTX 280 is announced, the reviews are read, and the real question is: what can we do with it, and how? I want OpenGL extensions! NVIDIA is always fast with these, so I guess they will come soon...

    if any!

    I mean, it seems that the only new feature is double-precision floating-point numbers... Not really exciting for real-time, but I'm sure it could be useful for some people... for scientists... for CUDA.

    Moreover, it doesn't seem that double-precision images are possible... so no rendering to a double-precision framebuffer. It seems that stream-out could be the only way to get double-precision results out. I guess double-precision buffers could be available.

    I hope there will be more; let's see. But if anyone has an extension log from a GeForce GTX 280... please share it in this thread ^_^.
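    (If anyone wants to dump one: a minimal sketch, assuming a current GL context is already bound. It uses the classic GL_EXTENSIONS string query, which is how drivers of this era report their extension list.)

        #include <stdio.h>
        #include <GL/gl.h>

        /* Print the space-separated extension list of the bound context. */
        void print_extensions(void)
        {
            const GLubyte *ext = glGetString(GL_EXTENSIONS);
            if (ext != NULL)
                printf("%s\n", (const char *)ext);
        }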

  2. #2
    Senior Member OpenGL Pro
    Join Date: Sep 2004 | Location: Prombaatu | Posts: 1,386

    Re: GeForce GTX 280... what for us, programmers? :)

    That is undoubtedly the best part of extensions: getting a new card (and sometimes just a new driver) and watching that extension list roll by for the first time.

  3. #3
    Junior Member Regular Contributor
    Join Date: Aug 2007 | Location: USA | Posts: 243

    Re: GeForce GTX 280... what for us, programmers? :)

    I thought the GTX 280 didn't support anything new over the G80 core?

    EDIT: Ohh nice, doubles... does this mean we'll see 64-bit integer textures too???

  4. #4
    Senior Member OpenGL Guru
    Join Date: Dec 2000 | Location: Reutlingen, Germany | Posts: 2,042

    Re: GeForce GTX 280... what for us, programmers? :)

    Great! Now, if they were actually able to implement 32-bit float textures that don't kill your performance, I might see a point in adding 64-bit support.

    Oh, and another thing: render-to-texture that INCLUDES early-z and early-stencil tests in non-trivial use cases. Now THAT would make their hardware somewhat useful.

    The whole GTX 280 thing reminds me of 3dfx: "hey, we don't need to improve our chips, let's just put MORE of them on one board!"

    I doubt this new chip is anything good. It's just brute force, and that has never worked in this industry.

    Jan.
    GLIM - Immediate Mode Emulation for GL3

  5. #5
    Junior Member Regular Contributor
    Join Date: Aug 2007 | Location: USA | Posts: 243

    Re: GeForce GTX 280... what for us, programmers? :)

    Is it possible to do ANYTHING out of the ordinary without killing optimizations such as early-z? I hate having to design around that; it's too much to remember.
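    (For what it's worth, the one pattern that reliably keeps early-z alive is a depth-only pre-pass followed by a GL_EQUAL shading pass. A minimal sketch; draw_scene() stands in for your own draw calls:)

        /* Pass 1: lay down depth only, no color writes. */
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthMask(GL_TRUE);
        glDepthFunc(GL_LESS);
        draw_scene();

        /* Pass 2: shade only the visible fragments. Depth writes stay off
           so early-z can cull everything failing the GL_EQUAL test. Avoid
           discard and gl_FragDepth writes in the fragment shader here,
           since those typically disable the early tests. */
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_FALSE);
        glDepthFunc(GL_EQUAL);
        draw_scene();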

  6. #6
    Advanced Member Frequent Contributor
    Join Date: Feb 2006 | Location: Sweden | Posts: 744

    Re: GeForce GTX 280... what for us, programmers? :)

    Technically speaking, I suspect the GTX 280 doesn't just bring a new naming scheme and double-precision floating-point support; it might have gotten rid of the last fixed-function hardware and now does graphics in "software". That might be why there is little mention of new extensions.
    Any NVIDIAn can correct me if I'm way out of line here, but if that is true, it means they can implement new extensions as fast as new standards are created, the only restriction being total processing power.

  7. #7
    Groovounet (Super Moderator Frequent Contributor)
    Join Date: Jul 2004 | Posts: 937

    Re: GeForce GTX 280... what for us, programmers? :)

    "it might have gotten rid of the last fixed function hardware"

    Do you mean blending operations and maybe filtering? Because that the only parts I can see that would be programmable. I guest that on Radeon HD cards the blending is already programmable... by them hardware ready, extension isn't.

    Maybe the input assembler would become programmable (with DX11) for tessellation unit purpose but I don't think that most of the ROPs would ever be (except blending.)

  8. #8
    Intern Newbie
    Join Date: Mar 2008 | Posts: 36

    Re: GeForce GTX 280... what for us, programmers? :)

    I've just read in an article from the German gaming magazine Gamestar (www.gamestar.de) that the GTX 280 partly supports DirectX 10.1. So there should be some new features, but I couldn't find any more detailed information on that yet.

  9. #9
    Groovounet (Super Moderator Frequent Contributor)
    Join Date: Jul 2004 | Posts: 937

    Re: GeForce GTX 280... what for us, programmers? :)

    I think the G80 also partly supports DX10.1, because I believe GL_EXT_draw_buffers2 is a DX10.1 feature and it is already supported on the G80.

  10. #10
    Junior Member Regular Contributor
    Join Date: Aug 2007 | Location: USA | Posts: 243

    Re: GeForce GTX 280... what for us, programmers? :)

    Actually, it doesn't. From the spec:

        While this extension does provide separate blend enables, it does not
        provide separate blend functions or blend equations per color output.

    I believe DX10.1 allows separate blend functions/equations per output.
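    (To make the distinction concrete, a minimal sketch of what EXT_draw_buffers2 does let you do: toggle blending per color output, while a single global blend function still applies to every enabled output. Entry points as named in the extension spec; assumes an FBO with two color attachments is bound and routed via glDrawBuffers.)

        /* Blend into color output 0, but write output 1 unblended. */
        glEnableIndexedEXT(GL_BLEND, 0);
        glDisableIndexedEXT(GL_BLEND, 1);

        /* Still one global blend function for every enabled output;
           per-output functions/equations are what DX10.1 adds and this
           extension does not. */
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);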
