Abstract Phong Material Description for OpenGL & Raytracing

Hi,

I have a Phong material implementation for OpenGL up and running, using the usual setDiffuseColor() / setAmbientColor() / setSpecularColor() style of interface for Phong materials.

Now I also want to export the same scene to a raytracer, which has a completely different way of describing (Phong) materials.

My question is: does anyone have a pointer to a discussion about creating an abstract material description which can serve as a base material, from which both the OpenGL and the raytracing descriptions of a given (Phong) material can be derived?

I could not find anything with Google. Limiting the raytracing side of things to the OpenGL description of materials is crippling, but I am somehow missing the underlying theory for a common description.

Thanks,

Markus

So what’s the “raytracing” way?

Are you referring to BRDF models?

Yes, BRDFs; in other words, how to map a BRDF-based material description to an OpenGL approximation.

Thinking about it, I would be happy to know how to express a Phong-specific BRDF as an OpenGL material description.

Thanks.

The diffuse component corresponds to a constant BRDF. The specular component corresponds to a BRDF of k[sub]s[/sub]·(R·ω[sub]o[/sub])[sup]n[/sup]/cos(θ[sub]i[/sub]), i.e. the specular term from the Phong model divided by cos(θ[sub]i[/sub]). In either case, the BRDF may need to be scaled by 1/dω[sub]i[/sub], or that term may be implicit.

The fact that the specular term has a 1/cos(θ[sub]i[/sub]) factor (to cancel out the cos(θ[sub]i[/sub]) in the incident energy density) means that the Phong model isn’t a true BRDF (it doesn’t conserve energy and isn’t bidirectional; at grazing angles, where cos(θ[sub]i[/sub]) is small, the reflected light will exceed the incident light).
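
To spell that out (just a sketch; k[sub]d[/sub] and k[sub]s[/sub] are the diffuse and specular coefficients, n the Phong exponent): plugging the BRDF into the reflection integral makes the cos(θ[sub]i[/sub]) cancel in the specular term, so for a single point light you recover the familiar fixed-function terms:

\[
L_o(\omega_o) = \int_\Omega f_r(\omega_i,\omega_o)\,L_i(\omega_i)\cos\theta_i\,d\omega_i,
\qquad
f_r(\omega_i,\omega_o) = k_d + \frac{k_s\,(R\cdot\omega_o)^n}{\cos\theta_i}
\]

\[
L_o \approx k_d\,L_i\cos\theta_i + k_s\,L_i\,(R\cdot\omega_o)^n
\]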

If you’re using shaders, it’s straightforward to use a BRDF directly. E.g.


in vec3 light_dir;
in vec3 eye_dir;
in vec3 normal;
in vec3 light_color;

void main() {
    vec3 N = normalize(normal);
    vec3 L = normalize(light_dir);
    vec3 E = normalize(eye_dir);
    float cos_wi = max(dot(L, N), 0.0); // clamp so lights behind the surface contribute nothing
    float k = cos_wi * BRDF(L, E); // BRDF() is user-supplied; it will typically need additional parameters including N
    gl_FragColor = vec4(light_color * k, 1);
}

This results in a surface which is colourless. Colour may be handled by making wavelength an additional parameter to the BRDF and summing over multiple wavelengths (necessary to support accurate colour fringes with refraction, but usually overkill), by having separate red/green/blue BRDFs, or by having the BRDF return an RGB triple rather than a scalar.
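
With the last option, the only real change to the shader above is the return type of the BRDF. A sketch (the diffuse_color uniform name is made up), using a constant Lambertian BRDF per channel:

in vec3 light_dir;
in vec3 eye_dir;
in vec3 normal;
in vec3 light_color;

uniform vec3 diffuse_color;   // per-channel k_d

// Constant (Lambertian) BRDF, one value per colour channel.
vec3 BRDF(vec3 L, vec3 E, vec3 N) {
    return diffuse_color;
}

void main() {
    vec3 N = normalize(normal);
    vec3 L = normalize(light_dir);
    vec3 E = normalize(eye_dir);
    float cos_wi = max(dot(L, N), 0.0);
    vec3 k = cos_wi * BRDF(L, E, N);           // k is now a colour rather than a scalar
    gl_FragColor = vec4(light_color * k, 1.0);
}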

Thanks for the explanation!

My (open source) framework runs completely in JavaScript, with an OpenGL abstraction layer which is implemented in WebGL on the Web and in C++ using native OpenGL on the desktop.

Now, based on what you say, I would define JavaScript classes for the different material types (Blinn, Lambertian, Phong, etc.) which implement the shader code for the OpenGL part. On the C++ bridge to the raytracer, I would then create the matching C++ BRDF description for the tracer and use that.
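
For the OpenGL part, the snippet such a PhongMaterial class might emit could look roughly like this (just a sketch, the names and uniforms are made up; the same kd / ks / shininess values would then feed the C++ BRDF on the raytracer side):

uniform vec3 kd;          // diffuse reflectance per channel
uniform vec3 ks;          // specular reflectance per channel
uniform float shininess;  // Phong exponent n

// Phong BRDF already multiplied by cos(theta_i), per colour channel;
// this would plug into a fragment shader like the one above.
vec3 phong_brdf_times_cos(vec3 L, vec3 E, vec3 N) {
    float cos_wi = max(dot(L, N), 0.0);
    vec3 R = reflect(-L, N);  // mirror direction of the incoming light around N
    float spec = cos_wi > 0.0 ? pow(max(dot(R, E), 0.0), shininess) : 0.0;
    return kd * cos_wi + ks * spec;  // the 1/cos(theta_i) in the BRDF cancels here
}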

The question is: is it possible in OpenGL to have multiple different materials (BRDFs) in a scene, assigned to individual objects, or only one global BRDF? If only one global BRDF is possible, this would make approximating the scene in OpenGL difficult, as in raytracing it is common to use different material types (like Lambertian for matte surfaces, Blinn for specular ones, etc.) in one scene.

Thanks

Markus

Any aspect of the state (e.g. the shader, or its uniform variables) can be changed between draw calls (glDrawElements() etc).

You can potentially even use multiple materials within a single draw call, by assigning a material index to each vertex and having the shader use that as an index into an array of material parameters (or as the condition of a switch statement to select distinct functions).
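
Shader-side, that material-index approach might look roughly like this (just a sketch: the vertex shader would pass the index through as a flat varying, and the struct layout and names are made up):

flat in int material_index;      // per-vertex material index, not interpolated
in vec3 light_dir;
in vec3 eye_dir;
in vec3 normal;
in vec3 light_color;

struct Material {
    vec3 kd;
    vec3 ks;
    float shininess;
};
uniform Material materials[8];   // one entry per material used in the scene

// Phong-style BRDF times cos(theta_i), per colour channel.
vec3 brdf_times_cos(Material m, vec3 L, vec3 E, vec3 N) {
    float cos_wi = max(dot(L, N), 0.0);
    vec3 R = reflect(-L, N);
    float spec = pow(max(dot(R, E), 0.0), m.shininess);
    return m.kd * cos_wi + m.ks * spec;
}

void main() {
    Material m = materials[material_index];
    vec3 N = normalize(normal);
    vec3 L = normalize(light_dir);
    vec3 E = normalize(eye_dir);
    gl_FragColor = vec4(light_color * brdf_times_cos(m, L, E, N), 1.0);
}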

Whether to use multiple draw calls, each with a specialised shader, or whether to have a single “ubershader” is a balancing act. Too many draw calls can hurt performance, but so can an excessively complex shader.

Ok great, got it.

Now just need to find the right contractor …