
Thread: Bad bit depth when telneted from an SGI

  1. #1
    Regular Contributor
    Join Date
    Nov 2000
    Location
    State College, PA
    Posts
    186

    Bad bit depth when telneted from an SGI

    Okay, I'm not sure this is an OpenGL issue, but if someone out there can direct me to an answer, I'd appreciate it.

    I am developing an application that uses OpenGL, and it works great. I run it natively on Linux machines with varying video cards, and remotely logged in from other Linux machines, and it works fine. But on SGIs, I don't get enough colors when I run it remotely (I telnet from the SGI to the Linux box to run it). It works fine when run natively on the SGI! Aargh!
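    When the app is run remotely like this, it renders through GLX against the X server on the SGI side, so the colors available are whatever visuals *that* server advertises, not the Linux box's. A quick way to see what the remote display actually offers is a small Xlib/GLX probe like the sketch below. This is a hedged illustration, not code from the thread: it assumes a build with `-lX11 -lGL`, and the specific attribute list (24-bit RGBA with a 16-bit depth buffer) is just an example request.

    ```c
    /* Sketch: report the default depth of the display named by $DISPLAY,
       and check whether GLX can supply a 24-bit TrueColor RGBA visual.
       Compile with something like: cc probe.c -lX11 -lGL */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        /* Honors $DISPLAY, i.e. the SGI's X server when logged in remotely */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        int screen = DefaultScreen(dpy);
        printf("default depth: %d bits\n", DefaultDepth(dpy, screen));

        /* Ask GLX for a 24-bit RGBA visual with a depth buffer; a NULL
           result means the server cannot supply one, and the app will
           end up on a shallower visual (hence "not enough colors"). */
        int attribs[] = { GLX_RGBA,
                          GLX_RED_SIZE, 8,
                          GLX_GREEN_SIZE, 8,
                          GLX_BLUE_SIZE, 8,
                          GLX_DEPTH_SIZE, 16,
                          None };
        XVisualInfo *vi = glXChooseVisual(dpy, screen, attribs);
        if (vi) {
            printf("got visual 0x%lx, depth %d\n",
                   (unsigned long)vi->visualid, vi->depth);
            XFree(vi);
        } else {
            printf("no 24-bit RGBA visual on this display\n");
        }
        XCloseDisplay(dpy);
        return 0;
    }
    ```

    Running this once on the SGI's display and once on the Linux console should show whether the two servers simply offer different visuals, which would explain the difference in color depth.
    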

    Any suggestions?

    Chris

  2. #2
    Regular Contributor
    Join Date
    Nov 2000
    Location
    State College, PA
    Posts
    186

    Re: Bad bit depth when telneted from an SGI

    Clarification: when I say "telnet", I mean "ssh"...

    Chris
