GLSL: Shaders not working on Intel drivers

I have a test shader program that does some lighting effects. It works fine on NVIDIA devices, but fails to display anything at all when using Intel graphics devices.

I’ve tested on both a laptop with a Haswell i7-4700MQ CPU (GL version 4.0, GLSL 400) and a desktop system with an Ivy Bridge i5-3470 (GL version 3.3, GLSL 330).

The shaders fail to draw anything (just a black screen) on both devices. No errors are reported; nothing gets displayed. Yet they work fine on every NVIDIA machine I’ve tried (3 machines, 4 operating systems: Win7, Linux Mint 13, 15, and 17).

I have tried compiling with the GLSL reference compiler and got no compiler errors. I’m kind of at a loss for what to do. Any ideas?

test.frag


#version 150
/*
 *  Copyright © 2012-2014 Brian Groenke
 *  All rights reserved.
 * 
 *  This file is part of the 2DX Graphics Library.
 *
 *  This Source Code Form is subject to the terms of the
 *  Mozilla Public License, v. 2.0. If a copy of the MPL 
 *  was not distributed with this file, You can obtain one at 
 *  URL removed to prevent post denial...
 */
#define MAX_LIGHTS 30

uniform int tex_bound;

uniform sampler2D tex;
uniform int light_count;
uniform vec2 lights[MAX_LIGHTS];
uniform vec3 light_colors[MAX_LIGHTS];
uniform float radius[MAX_LIGHTS];
uniform float intensity[MAX_LIGHTS];
uniform float ambient;
uniform vec3 ambient_color;

in vec4 color, tex_out;
in float light_max_dist[MAX_LIGHTS];

out vec4 frag_out;

void main() {
    vec4 rgba;
    if(tex_bound != 0) {
        vec2 uv = tex_out.xy;
        vec4 tex_frag = texture2D(tex, uv);
        rgba = tex_frag;
    } else
        rgba = color;
        
    vec3 light_sum = ambient_color * ambient;
    
    for(int i=0; i < light_count; i++) {
        float max_dist = light_max_dist[i];
        float r = radius[i];
        float v = intensity[i];
        float att = 0;
        vec2 coord = lights[i];
        vec2 diff = gl_FragCoord.xy - coord;
        float dist = length(diff);
        if(dist < max_dist) {
            att += v / pow(dist / r + 1, 2);
        }
        
        light_sum += light_colors[i] * att * (1-ambient);
    }
    
    frag_out.rgb = rgba.rgb * light_sum;
    frag_out.a = rgba.a;
}
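For reference, the per-light term above is an inverse-square-style falloff, att = intensity / (dist/radius + 1)^2, which yields the full intensity at the light’s position and decays smoothly with distance. A quick numeric sketch of that behavior in Python (illustrative only, with arbitrary example values; not part of the shader):

```python
# Sketch of the fragment shader's per-light attenuation term:
#   att = intensity / (dist / radius + 1)^2

def attenuation(dist, radius, intensity):
    return intensity / (dist / radius + 1.0) ** 2

# At the light's position the attenuation equals the raw intensity.
print(attenuation(0.0, 50.0, 2.0))   # 2.0
# One radius away the intensity is quartered: 2 / (1 + 1)^2 = 0.5
print(attenuation(50.0, 50.0, 2.0))  # 0.5
```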

NOTE: The vertex shader is actually two files combined; I’ve concatenated them here for simplicity. The first file defines the uniforms and functions needed for the orthographic transformation. The second is the actual vertex shader used by the program.

transform.vert/test.vert


#version 150
// snap2d-transform.vert - Snap2D GLSL Vertex Transformation Utility
/*
 *  Copyright (C) 2012-2014 Brian Groenke
 *  All rights reserved.
 * 
 *  This file is part of the 2DX Graphics Library.
 *
 *  This Source Code Form is subject to the terms of the
 *  Mozilla Public License, v. 2.0. If a copy of the MPL 
 *  was not distributed with this file, You can obtain one at 
 *  URL removed to prevent post denial...
 */

uniform mat4 mOrtho;
uniform vec2 vTranslate;
uniform vec2 vScale;
uniform vec2 vPivot;
uniform float fRotate;

void transform(vec4 vertex) {
    mat4 mTranslation = mat4( 1, 0, 0, 0,
                              0, 1, 0, 0,
                              0, 0, 1, 0,
                              vTranslate.x, vTranslate.y, 0, 1);
    float c = cos(fRotate);
    float s = sin(fRotate);
    mat4 mTransRot = mat4( 1, 0, 0, 0,
                           0, 1, 0, 0,
                           0, 0, 1, 0,
                           vPivot.x, vPivot.y, 0, 1);
    mat4 mRotation = mat4( c, -s, 0, 0,
                           s, c, 0, 0,
                           0, 0, (1-c)+c, 0,
                           0, 0, 0, 1 );
    mat4 mNegateTransRot = mat4( 1, 0, 0, 0,
                                 0, 1, 0, 0,
                                 0, 0, 1, 0,
                                 -vPivot.x, -vPivot.y, 0, 1);
    mat4 mScale = mat4( vScale.x, 0, 0, 0,
                       0, vScale.y, 0, 0,
                       0, 0, 1, 0,
                       0, 0, 0, 1 );
    mat4 mvp = mOrtho * mTransRot * mRotation * mNegateTransRot * mTranslation * mScale;
    gl_Position = mvp * vertex;
}

// --------------------------------------------------------------

// version (150) declared by transform shader
/*
 *  Copyright (C) 2012-2014 Brian Groenke
 *  All rights reserved.
 * 
 *  This file is part of the 2DX Graphics Library.
 *
 *  This Source Code Form is subject to the terms of the
 *  Mozilla Public License, v. 2.0. If a copy of the MPL 
 *  was not distributed with this file, You can obtain one at 
 *  URL removed to prevent post denial...
 */
#define MAX_LIGHTS 30

uniform int tex_bound;

uniform sampler2D tex;
uniform int light_count;
uniform vec2 lights[MAX_LIGHTS];
uniform vec3 light_colors[MAX_LIGHTS];
uniform float radius[MAX_LIGHTS];
uniform float intensity[MAX_LIGHTS];
uniform float ambient;

in vec4 vert_color;
in vec2 vert_coord, tex_coord;

out vec4 color, tex_out;
out float light_max_dist[MAX_LIGHTS];

void main() {
    vec4 vert = vec4(vert_coord.xy, 0, 1);
    transform(vert);
    color = vert_color;
    tex_out = vec4(tex_coord.xy, 0, 1);
        
    float min_lum = 0.001;
    for(int i=0; i < light_count; i++) {
        light_max_dist[i] = radius[i] * (sqrt(intensity[i]/min_lum) - 1);
    }
}
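The transform() function in the first file composes, reading right to left: scale, translation, a translate-rotate-translate-back sequence about the pivot, and finally the orthographic projection. The pivot pattern (T(pivot) * R * T(-pivot)) can be sanity-checked with plain 2D math; a sketch in Python (illustrative only, using the standard counter-clockwise convention rather than the shader’s exact matrix layout):

```python
import math

def rotate_about_pivot(point, pivot, angle):
    """Rotate a 2D point about a pivot: T(pivot) * R(angle) * T(-pivot)."""
    # Shift so the pivot sits at the origin.
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    # Standard counter-clockwise rotation.
    rx, ry = c * px - s * py, s * px + c * py
    # Shift back.
    return (rx + pivot[0], ry + pivot[1])

# Rotating the pivot itself is a no-op.
print(rotate_about_pivot((3.0, 4.0), (3.0, 4.0), 1.0))  # (3.0, 4.0)
# A quarter turn about the origin maps (1, 0) to (0, 1).
x, y = rotate_about_pivot((1.0, 0.0), (0.0, 0.0), math.pi / 2)
print(round(x, 9), round(y, 9))  # 0.0 1.0
```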
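The light_max_dist computed in the loop above is the fragment shader’s attenuation formula solved for distance at a minimum luminance cutoff: setting intensity / (d/r + 1)^2 = min_lum and solving for d gives d = r * (sqrt(intensity/min_lum) - 1). A quick check of the algebra in Python (illustrative only, with arbitrary example values):

```python
import math

MIN_LUM = 0.001  # matches min_lum in the vertex shader

def max_dist(radius, intensity):
    # Distance at which the falloff drops to MIN_LUM.
    return radius * (math.sqrt(intensity / MIN_LUM) - 1.0)

def attenuation(dist, radius, intensity):
    # Same falloff as the fragment shader's att term.
    return intensity / (dist / radius + 1.0) ** 2

d = max_dist(50.0, 2.0)
# Plugging the cutoff distance back into the falloff recovers MIN_LUM.
print(round(attenuation(d, 50.0, 2.0), 9))  # 0.001
```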

Are you certain it’s a shader problem and not something with the setup in your C code?

If you are using a core profile, it may be something as simple as forgetting to bind a vertex array object. I once forgot to bind one, and the result was a black screen on a strictly core-compliant context. NVIDIA may be a bit lax in that regard.

I am using Java with the JOGL bindings.

I can’t find anything wrong with the main OpenGL calls, nor do I have any reason to believe it’s an issue with JOGL. My compatibility-profile (fixed-function) path works on everything I’ve tried; it’s only the shaders that haven’t been working.

The shaders actually DO work on Linux with the Intel driver (as I have just found out). So the problem only exists on Intel/Windows. I’m fairly certain the problem is with the shaders.

Hmmm. After a lot of further testing I’m not so sure. I created some much simpler shaders to help try and locate where the problems are.

In another test program, independent of the project having the issues, I have very similar code that does work on the Haswell machine, and it continues to work with the simplified shaders I just mentioned swapped in.

I’m combing through the GL calls to try to find any discrepancies, but I have yet to find any. The process seems to be more or less the same as in the program that works.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.