PDA

View Full Version : OpenGL 2.0 drivers



Csiki
05-20-2003, 11:53 PM
I didn't know where to open this topic; maybe this is the best place.
So: will there be any OpenGL 2.0 drivers for the older graphics cards?
I mean, GeForce4 and ATI Radeon 8xxx cards are great, but they don't support fragment shaders.
I don't need this feature yet, but I would like to use GL2 on these cards (because of the new OpenGL objects).

IDen
05-21-2003, 01:40 AM
Originally posted by Csiki:
I didn't know where to open this topic; maybe this is the best place.
So: will there be any OpenGL 2.0 drivers for the older graphics cards?
I mean, GeForce4 and ATI Radeon 8xxx cards are great, but they don't support fragment shaders.
I don't need this feature yet, but I would like to use GL2 on these cards (because of the new OpenGL objects).


It looks like there will be no OpenGL2 driver for old video cards.

If you have something like an R9500 or later, you can download the 3.4 driver. It contains a beta OpenGL 2.0 implementation (only glSlang: GL2_fragment_shader, GL2_vertex_shader, GL2_shader_object); you can find the entry points if you view atioglxx.dll with FAR or another text editor. Unfortunately, there are no exact specifications for it yet, but it's very interesting. It looks like there is a glslang 1.0 implementation in it (3.4) where gl_FBColor and others are supported.

[This message has been edited by IDen (edited 05-21-2003).]

Csiki
05-21-2003, 01:45 AM
Originally posted by IDen:
It looks like there will be no OpenGL2 driver for old video cards.

If you have something like an R9500 or later, you can download the 3.4 driver. It contains a beta OpenGL 2.0 implementation (only glSlang: GL2_fragment_shader, GL2_vertex_shader, GL2_shader_object); you can find the entry points if you view atioglxx.dll with FAR or another text editor. Unfortunately, there are no exact specifications for it yet, but it's very interesting. It looks like there is a glslang 1.0 implementation in it (3.4) where gl_FBColor and others are supported.

[This message has been edited by IDen (edited 05-21-2003).]

I have a GeForce4 Ti4200. :(

Mihail121
05-21-2003, 04:15 AM
Hmmm... my info is a little different. I heard that there will be OpenGL 2.0 for all accelerators (of course, only if the vendors want it). The thing is that OpenGL 2.0 will be separate from OpenGL 1.x: it will cover the old features and add new ones.

Humus
05-21-2003, 07:39 AM
There is always the option of running in software mode. For older cards like the R8500, which will always run in software mode even for simple shaders because they lack floating-point capabilities, it is questionable whether adding GL2 shader support is useful. On the other hand, you'll need a software fallback path anyway for cases where you can't accelerate things on high-end chips either, so supporting a software path for older chips may come more or less for free.

NitroGL
05-21-2003, 08:27 AM
Originally posted by Humus:
There is always the option of running in software mode. For older cards like the R8500, which will always run in software mode even for simple shaders because they lack floating-point capabilities, it is questionable whether adding GL2 shader support is useful. On the other hand, you'll need a software fallback path anyway for cases where you can't accelerate things on high-end chips either, so supporting a software path for older chips may come more or less for free.

Doesn't the 3DLabs P10 processor lack floating point? They have GL2 "support".

Korval
05-21-2003, 10:09 AM
For older cards like the R8500, which will always run in software mode even for simple shaders because they lack floating-point capabilities, it is questionable whether adding GL2 shader support is useful.

Actually, the thing that limits the 8500 is not its floating-point capabilities. The 8500 internally uses either 16-bit or 24-bit floats (I'm thinking 24-bit, but I've forgotten). Indeed, I'm not sure the fragment capabilities of the 9500+ line have changed much compared to the 8500 line, except in the number of instructions/passes/etc. If that's true (and there is some evidence for it), then the 8500 line might be able to run some non-trivially simple shaders. Obviously, something that requires more than one dependent texture address or uses too many ALU instructions can't work.

That, and there's the problem of the 32 varying floats. The 8500 line only supports 6 texture coordinates, which translates to only 24 varying floats. Personally, I think it was a mistake to put a fixed number into the spec; make it a queryable resource with low mandatory limits (like 16).

Ostsol
05-21-2003, 10:26 AM
Is OpenGL 2.0 going to redefine the standard drawing calls and state manipulation functions? Or is it going to be more of an upgrade, adding on to what already exists in OpenGL 1.x?

Humus
05-21-2003, 11:12 AM
Originally posted by NitroGL:
Doesn't the 3DLabs P10 processor lack floating point? They have GL2 "support".

Yes, but if I'm not mistaken their GL2 support is limited to vertex shaders.

KRONOS
05-22-2003, 01:00 PM
And again, I'm getting tired of waiting for anything regarding GL2... :(

jwatte
05-22-2003, 07:05 PM
Regarding GL2: until you can install it and use it, it doesn't exist. Waiting for things that don't exist is seldom useful (although I'm often guilty of doing this, too :-( )

Regarding 1.x support: the recommendation in the original GL2 spec from 3dlabs was to provide 1.x compatibility as a wrapper library on top of the 2.0 driver. Yes, it would be supported. No, it wouldn't be "core" anymore.

cass
05-22-2003, 07:50 PM
Originally posted by jwatte:
Regarding 1.x support: the recommendation in the original GL2 spec from 3dlabs was to provide 1.x compatibility as a wrapper library on top of the 2.0 driver. Yes, it would be supported. No, it wouldn't be "core" anymore.

For what it's worth, I don't think anybody's really advocating this position within the ARB anymore. Backward compatibility and continued strong support for the existing installed base is one of the main reasons OpenGL still exists today.

V-man
05-23-2003, 06:51 AM
That's good to hear. I believe that GL was done right from the get-go, and if it ain't broke, don't DirectX it. Pardon the expression.

From what I remember reading, GL2 was supposed to be backwards compatible. I didn't know someone wanted to boot out the current GL.

From what I saw (the few docs I skimmed), GL2 will be a big introduction. There will be a lot of reading to do to understand and use the features.

glVertexArrayPointer
glDrawIndexArrays

and look at these funny ones
glDrawArraysAsync
glDrawIndexedArraysAsync
....

There is stuff about setting up policies, accessing memory directly. All in all, it's getting more sophisticated.

Are GL2 drivers in the works at NVidia? ATI?

PixelDuck
05-25-2003, 11:19 AM
The memory policies will be much more controllable, giving more control over memory resources, seemingly much more than in D3D. Parallelism has also been improved a lot: partial synchronization of command execution and issuing, and background processing. The parallel background processing sounds really good. Also, the proposed GLsync should give more information on the execution of single commands or command sequences, and the proposed improvement to flushing is also welcome. There is much good in OGL2, and I have high expectations for it. (Hope I didn't get anything wrong; I'm writing this around midnight ;)

Cheers,
Pix

Korval
05-25-2003, 03:40 PM
Are GL2 drivers in the works at NVidia? ATI?

There is no finalized, approved GL 2.0 spec yet. Maybe ATI and NVIDIA have some of that functionality in the works, but the actual specs aren't final.

Zengar
05-26-2003, 11:46 AM
I don't think NVIDIA will release any GL2 drivers before the specs become official. They have enough problems with their GL1 drivers...

Csiki
05-26-2003, 09:53 PM
Originally posted by Zengar:
I don't think NVIDIA will release any GL2 drivers before the specs become official. They have enough problems with their GL1 drivers...
I don't know. If they have time to cheat in their driver just for one program, they may have time to write a GL2 driver too. :)

Zengar
05-27-2003, 03:31 PM
That's what I wanted to point out...

CybeRUS
05-28-2003, 03:06 AM
Hi all,
I tested GL2 shaders on a 9700 and a 9800; thanks to IronPeter for the code.
There is something strange with it, but it works.
The vertex shader works fine, but a simple fragment shader doesn't work correctly with gl_FBColor.
Shader code:
varying vec4 texc;
void main()
{
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
texc = gl_Vertex*0.1;
}

varying vec4 texc;
void main()
{
vec4 tmp,t1;
tmp=texture4(0, texc);
tmp=texture4(0, tmp);
tmp=texture4(0, tmp);
tmp=texture4(0, tmp);
gl_FragColor = tmp+gl_FBColor;
}

On the ATI Radeon 9800, gl_FBColor has only the red component (it's a bug).
On the ATI Radeon 9700, gl_FBColor doesn't get added at all.
But it's fantastic :))

You can download this sample code: http://www.gamedev.ru/download/?id=34

Humus
05-28-2003, 03:12 AM
To my knowledge gl_FBColor was dropped from the spec.

knackered
05-28-2003, 04:03 AM
Oh how embarrassing, what's wrong with the ARB? They've only gone and spelt 'colour' wrong AGAIN.

Mazy
05-28-2003, 04:21 AM
CybeRUS: how did you compile a glslang program into vertex and fragment shaders? I want to know; I want to do it :)

kansler
05-28-2003, 04:32 AM
Knackered:

Main Entry: col·our
Pronunciation: 'k&-l&r
chiefly British variant of COLOR

So I guess you're british?

CybeRUS
05-28-2003, 04:39 AM
Humus:
Yes, gl_FB is dropped from the spec, but it's still there in the drivers :)

CybeRUS
05-28-2003, 04:41 AM
Mazy:
I compiled it with glCompileShaderGL2.
The GL2_vertex_shader / GL2_fragment_shader extensions are in the driver.

knackered
05-28-2003, 04:56 AM
Right on brother, you bet yo ass I'm british.
We 'evolved' the english language, don't you know? If we say it's colour, then it's goddam colour, got it!? Ain't no goddam 'variant', it's the goddam root spelling of the word. jeez, brother.


Originally posted by kansler:
Knackered:

Main Entry: col·our
Pronunciation: 'k&-l&r
chiefly British variant of COLOR

So I guess you're british?

Mazy
05-28-2003, 05:10 AM
CybeRUS: do you happen to know the exact name and spec of the extension?

edit: looked over the thread again :) so I guess there's no extension name and no spec, but do I really have to be a member of a Russian site in order to get the source?


[This message has been edited by Mazy (edited 05-28-2003).]

CybeRUS
05-28-2003, 05:52 AM
You can download: http://www.3dlabs.com/support/developer/ogl2/OGL2SDK.002.zip

But with the following fix:



Change from:
#define GL_VERTEX_SHADER_GL2 0x00010
#define GL_FRAGMENT_SHADER_GL2 0x00020
To:
#define GL_VERTEX_SHADER_GL2 0x7010
#define GL_FRAGMENT_SHADER_GL2 0x7011

Or get my source:



#include <stdio.h>
#include <windows.h>

#define USE_IJL 1
#ifdef USE_IJL
#include "IJL\ijl.h"
#pragma comment( lib, "IJL\\ijl15.lib" )
#endif

#include <GL\gl.h>
#include <GL\glu.h>
#include <GL\glut.h>

#include <string.h> // for strlen()

float a,b,c;
UINT tex;

int LoadJPGFile(char *FileName)
{
int Height, Width, Bpp;
BYTE Bpp8;
int RetVal;
BYTE *img = NULL;
JPEG_CORE_PROPERTIES image;
ZeroMemory( &image, sizeof( JPEG_CORE_PROPERTIES ) );
if( ijlInit( &image ) != IJL_OK )
{
return -1;
}
image.JPGFile = FileName;
if ((RetVal = ijlRead(&image,IJL_JFILE_READPARAMS)) == IJL_OK)
{
Height = image.JPGHeight;
Width = image.JPGWidth;
Bpp8 = 3;
Bpp = Bpp8*8;
UINT ImageSize=Height*Width*Bpp8;
img = new BYTE[ImageSize]; /* assign to the outer img so the delete[] below frees it */
if (img)
{
image.DIBBytes = img;
image.DIBColor = IJL_RGB;
image.DIBHeight = Height;
image.DIBWidth = Width;
image.DIBChannels = Bpp8;
if ((RetVal = ijlRead(&image,IJL_JFILE_READWHOLEIMAGE)) == IJL_OK)
{
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
//glTexImage2D(GL_TEXTURE_2D, 0, Bpp == 24?GL_RGB:GL_RGBA, Height, Width, 0, Bpp == 24?GL_RGB:GL_RGBA, GL_UNSIGNED_BYTE, img);
gluBuild2DMipmaps(GL_TEXTURE_2D, Bpp8, Width, Height, Bpp == 24?GL_RGB:GL_RGBA, GL_UNSIGNED_BYTE, img);
}
}


delete[] img;
}
ijlFree(&image);

return 0;
}

void redraw()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity ();
glTranslatef (0.0f, 0.0f, -1.5f);
glRotatef(160.0f,1.0f,0.0f,0.0f);
glRotatef(0.0f,0.0f,1.0f,0.0f);
glRotatef(c,0.0f,0.0f,1.0f);
glBegin(GL_QUADS);
glTexCoord2f(1,1); glVertex2f(+1,+1);
glTexCoord2f(1,0); glVertex2f(+1,-1);
glTexCoord2f(0,0); glVertex2f(-1,-1);
glTexCoord2f(0,1); glVertex2f(-1,+1);
glEnd();
glutSwapBuffers();
}
void motion(int x, int y)
{
c=x*0.1f;
}
void reshape(int width, int height)
{
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
gluPerspective(60,(width+.1)/(height+.1),0.1f,100.0f);
glMatrixMode (GL_MODELVIEW);
glViewport (0, 0, width, height);
}
typedef GLuint (APIENTRY *CreateShaderObjectGL2)(GLenum shaderType);
CreateShaderObjectGL2 CreateShaderObject;
typedef GLvoid (APIENTRY *LoadShaderGL2)(GLuint shaderObj, GLuint nseg, char **seg, const GLint *length);
LoadShaderGL2 LoadShader;
typedef GLboolean (APIENTRY *CompileShaderGL2)(GLuint shaderObj);
CompileShaderGL2 CompileShader;
typedef const char * (APIENTRY *GetInfoLogGL2)(GLuint shaderObj, GLint *length);
GetInfoLogGL2 GetInfoLog;
typedef GLuint (APIENTRY *CreateProgramObjectGL2)(GLvoid);
CreateProgramObjectGL2 CreateProgramObject;
typedef GLboolean (APIENTRY *AttachShaderObjectGL2)(GLuint programObj,GLuint shaderObj);
AttachShaderObjectGL2 AttachShaderObject;
typedef GLboolean (APIENTRY *LinkProgramGL2)(GLuint programObj);
LinkProgramGL2 LinkProgram;
typedef GLboolean (APIENTRY *UseProgramObjectGL2)(GLuint programObj);
UseProgramObjectGL2 UseProgramObject;

#define VERTEX_SHADER_GL2 0x7010
#define FRAGMENT_SHADER_GL2 0x7011


int main(int argc, char *argv[])
{
glutInit(&argc, argv);
glutInitWindowSize(800,600);
glutInitDisplayMode(GLUT_RGBA|GLUT_DEPTH|GLUT_DOUBLE);
glutCreateWindow("Peter Popov GLUT");
glutIdleFunc(redraw);
glutDisplayFunc(redraw);
glutMotionFunc(motion);
glutReshapeFunc(reshape);
if (LoadJPGFile("phong.JPG"))
{
printf("Error loading texture");
return -1;
}
printf("Texture loaded.\n");
glEnable(GL_TEXTURE_2D);
CreateShaderObject=(CreateShaderObjectGL2)wglGetProcAddress("glCreateShaderObjectGL2");
LoadShader=(LoadShaderGL2)wglGetProcAddress("glLoadShaderGL2");
CompileShader=(CompileShaderGL2)wglGetProcAddress("glCompileShaderGL2");
GetInfoLog=(GetInfoLogGL2)wglGetProcAddress("glGetInfoLogGL2");
CreateProgramObject=(CreateProgramObjectGL2)wglGetProcAddress("glCreateProgramObjectGL2");
AttachShaderObject =(AttachShaderObjectGL2)wglGetProcAddress("glAttachShaderObjectGL2");
LinkProgram = (LinkProgramGL2)wglGetProcAddress("glLinkProgramGL2");
UseProgramObject =(UseProgramObjectGL2)wglGetProcAddress("glUseProgramObjectGL2");

if (!CreateShaderObject)
{
printf("Critical error: GL2 shaders are not supported!\n");
//return -1;
}
else
{
GLuint fp,vp,obj;
{
FILE *file = fopen("GL2vertex.vp", "rb");
fseek(file, 0, SEEK_END);
int fsize = ftell(file);
fseek(file, 0, SEEK_SET);
char *string = new char[fsize+1];
fread((void *)string, fsize, 1, file);
string[fsize] = 0;
fclose(file);
printf("Vertex program: %s\n",string);

char **pieces=&string;
GLint length[]={strlen(string)};
vp=CreateShaderObject(VERTEX_SHADER_GL2);
LoadShader(vp,1,pieces,length);
if(!CompileShader(vp)) printf("%s \n",GetInfoLog(vp,length));
else printf("vertex shader was compiled \n");

}
{
FILE *file = fopen("GL2fragment.fp", "rb");
fseek(file, 0, SEEK_END);
int fsize = ftell(file);
fseek(file, 0, SEEK_SET);
char *string = new char[fsize+1];
fread((void *)string, fsize, 1, file);
string[fsize] = 0;
fclose(file);
printf("Fragment program: %s\n",string);
char **pieces=&string;
GLint length[]={strlen(string)};
fp=CreateShaderObject(FRAGMENT_SHADER_GL2);
LoadShader(fp,1,pieces,length);
if(!CompileShader(fp)) printf("%s \n",GetInfoLog(fp,length));
else printf("fragment shader was compiled \n");
}
obj=CreateProgramObject();
if(AttachShaderObject(obj,vp))printf("vertex shader was attached \n");
if(AttachShaderObject(obj,fp))printf("fragment shader was attached \n");
if(LinkProgram(obj))printf("link completed \n");
printf("linking:\n %s \n",GetInfoLog(obj,NULL));
if(UseProgramObject(obj)) printf("All is Ok! \n");
}
glClearColor(1.0,1.0,1.0,1.0);
glutMainLoop();
return 0;
}

Mazy
05-28-2003, 07:52 AM
Thanks a lot.

Humus
05-28-2003, 09:29 AM
Originally posted by CybeRUS:
Humus:
Yes, gl_FB is dropped from the spec, but it's still there in the drivers :)

Why develop for something that's obviously being dropped?

Btw, I don't think you're supposed to talk too widely about the actual state of the implementation (hoping you're a legitimate user of the GL2 stuff, etc.)

Humus
05-28-2003, 09:31 AM
Originally posted by knackered:
Right on brother, you bet yo ass I'm british.
We 'evolved' the english language, don't you know? If we say it's colour, then it's goddam colour, got it!? Ain't no goddam 'variant', it's the goddam root spelling of the word. jeez, brother.

If the British hadn't constantly been fighting with the French throughout the centuries, I would bet they would have settled for "color" too. :)

knackered
05-28-2003, 11:55 AM
Huh? Aren't you glad we fought the French?
Who likes the French?

Humus
05-28-2003, 12:58 PM
Well, they probably like themselves, at least :)

Either way, I'm not sure it was worth destroying the oh-so-beautiful original language you shared with us Scandinavians and Germans ;)

matt_weird
05-28-2003, 12:59 PM
Hey, CybeRUS ;), what is 1C's opinion on your source code posting? (Can you get into any trouble for that? Just curious ;) )

CybeRUS
05-28-2003, 08:47 PM
Humus:
I don't know about the actual state of the implementation, but in the glslang spec on this site, gl_FB is dropped.

matt_weird:
Heh, it's not source from the game; it's just simple code to get glslang working.
Btw, I can post any source code; there are no secrets from OpenGL people. I'm writing an open-source game engine (http://gungine.gamedev.ru) to help people get started with OpenGL programming.
I want OpenGL to make things better :)

matt_weird
05-29-2003, 12:17 AM
I saw GUNgine; I guess it's time to download all the steps. A good source of examples (I see you had some updates there recently ;) )

"There are no secrets from OpenGL people" -- this sounds a bit... silly, lol (as there could be some Direct3D people in here as well, hehe ;)

C++
05-29-2003, 04:23 AM
Stop feeding the trolls (knackered and Humus)! They've started going off-topic!
#include <directkill.h>
HRESULT hRes;
LPELECTRICALCHAIR13 g_pElCh;
LPDIRECTKILL13 g_pK;
g_pK = DirectKillCreate13(DK_SDK_VERSION);
hRes = g_pK->CreateChair(DK_MORTAL_CHAIR, DK_CHAIR_DEFAULT, &g_pElCh);
if(FAILED(hRes)) Expel("Knackered", "Humus");
ass *ptr;
g_pElCh->Lock(AND_NEVER_UNLOCK, (ASS**)&ptr);
freq_contributor_cpy(ptr, &knackered, &humus);
g_pElCh->SetTormentShader(DKFVF_DESTROY);
g_pElCh->SetKillingStageState(0, DK_ALL);
g_pElCh->ExecutePrimitiveHuman();...

Humus
05-29-2003, 04:28 AM
http://esprit.campus.luth.se/~humus/temp/tryingtobefunny.jpg

matt_weird
05-31-2003, 10:26 PM
...jeez, just what are they waiting for?? :confused: I wanna get GL2.0 and forget even thinking about that D3D puzzle for at least the next four years, can I? :eek:

GIVE ME GL2.0 RIGHT NOW! RIGHT NOW!! :mad: :eek: hehehe :p ;)

matt_weird
06-04-2003, 09:40 AM
No, really, doesn't the DOOM III release delay mean that JC is waiting for an official GL2.0 release? And when does it come out?

Humus
06-04-2003, 02:04 PM
I doubt GL2 will affect the release date of Doom III. If GL2 is ready by the time Doom III is done, I guess JC might write a GL2 backend, but I doubt he would wait for GL2 to be ratified and delay the game because of it.

matt_weird
06-04-2003, 08:25 PM
Probably you're right, Humus. Here is what I read about GL2.0 & DOOM III; JC says (June 2002):


“I am now committed to supporting an OpenGL 2.0 renderer for Doom through all the spec evolutions. If anything, I have been somewhat remiss in not pushing the issues as hard as I could with all the vendors. Now really is the critical time to start nailing things down, and the decisions may stay with us for ten years.”
http://www.3dlabs.com/support/developer/ogl2/p10.htm

(Someone was talking about "asm without crack", hehe; here we go without it, but it's not effective this way :D :


"I get some amusement from the all-assembly crowd, and it can be impressive, but it is certainly not effective"

[This message has been edited by matt_weird (edited 06-04-2003).]

barthold
06-11-2003, 01:30 PM
Originally posted by NitroGL:
Doesn't the 3DLabs P10 processor lack floating point? They have GL2 "support".

The vertex shader is a floating point machine. The fragment shader is split in both a floating point and integer machine.


Originally posted by Humus:
Yes, but if I'm not mistaken their GL2 support is limited to vertex shaders.

Incorrect. Current drivers, available on our website, (which implement earlier versions of the GL2 specifications) support both GL2 vertex and fragment shaders on a P10.

Barthold

glw
06-12-2003, 03:21 AM
Originally posted by barthold:
Incorrect. Current drivers, available on our website, (which implement earlier versions of the GL2 specifications) support both GL2 vertex and fragment shaders on a P10.

Barthold



Barthold, what about the P9 in the VP560? Is there support for fragment and vertex programs across the whole VP range? And does that support vary from card to card?

You say that the fragment shader is split between integer and floating point. What effect will this have on the capabilities of fragment programs?

Thanks in advance.

Gavin

barthold
06-12-2003, 01:46 PM
Originally posted by glw:
Barthold, what about the P9 in the VP560? Is there support for fragment and vertex programs across the whole VP range?

The whole VP range supports GL2 vertex and fragment shaders. The P9 is a low-cost version of the P10 chip, but it is architecturally the same, so it can support GL2 shaders, although most likely at lower performance.



You say that fragment shader is split between integer and floating point. What effect will this have on the capabilities of the fragment programs?


Nothing; that is completely hidden by the compiler. I was just mentioning it to point out that the fragment shader can do floating point.

Barthold