win10 trouble

Yo all,

A ‘new’ refurbished ultrabook running Win 10 Pro, with an updated graphics driver that reports OpenGL 4.4, cannot execute my previous OpenGL programs, which were built on Win 7 against a forward-compatible 3.3 profile. Why?
It shows no errors except a very general draw-call error … it runs the program and prints my cout debug info, but shows nothing on screen.

I will not ask Microsoft, since it’s in their best interest to see OpenGL fail, and at best I’d get the merry-go-round there.

Has anyone got a clue?

Edit: an oddity: I tested the graphics card and it reached 900+ fps … that should be plenty for a 60 Hz screen …

You might provide more details. There’s not much here to latch onto.

For instance, you haven’t told us:

[ul]
[li]what GPU and graphics driver you had installed on there before,[/li]
[li]what GPU and graphics driver you have installed now (try running GPU-Z or GL Caps Viewer), and[/li]
[li]what the error is.[/li]
[/ul]
You have told us that your program(s) were/are targeting a “forward compatible 3.3 profile”. Why? Was your old box running Mac OS? Try removing the forward compatibility bit.
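
For the driver details, something like this minimal sketch (GLFW assumed, purely for illustration) prints what the driver actually reports for the context it gives you:

// Minimal probe sketch: create a hidden window and print the driver strings.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);      // no need to show a window
    GLFWwindow* w = glfwCreateWindow(64, 64, "probe", NULL, NULL);
    if (!w) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(w);

    std::printf("GL_VENDOR   : %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("GL_RENDERER : %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("GL_VERSION  : %s\n", (const char*)glGetString(GL_VERSION));

    glfwDestroyWindow(w);
    glfwTerminate();
    return 0;
}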

hi Photon
I cannot provide what you ask right away (I have to leave home to get to the net at the local library), but I’ll try from memory:
The PC that created the programs is pretty old and has a modest GeForce graphics card that provides OpenGL 3.3. The programs have been worked on to some degree of perfection (meaning they execute there without problems). That PC runs Win 7 Pro.
The problems appear on my /new/ and modern ultrabook running Win 10 Pro with an Intel family CPU (it displays the number 5300 somewhere). The GPU is integrated with the CPU. This PC is new to me, and I’m slowly catching up on its peculiarities. I’ve only tried to load one commercial program (a game) and it failed too … I can get a proper display or fair sound (each under its own compatibility choice), but not both.

As for the host of my own OpenGL works saved over time, they seem to have one thing in common: they display nothing on screen except perhaps the result of the first call, glClearColor(some_color). I’ve run the extensions viewer and it finds support up to and including version 4.4.
Most of my works share the same initialization. It’s wrapped in a try/if that ought to break out of the program. I’ve just looked at the error output from some of my latest work, and it does show that there seem to be errors in the shaders. Since I have not yet gotten around to building the tool chain on this PC, I have only glanced briefly at the code … the errors don’t make much sense (they refer to ‘FragUserData’ and ‘aTexture’ … not variables that I recognise). The try/if rescue code for breaking out of the program may be faulty … I’ve not used it very much, and it does not shut the programs down as I would expect.
Anyway, this shader stuff is an error path for me to follow.
But … I find it strongly suspect that ALL my stuff (and professional programs), all of which works well everywhere else, fails here.

The bare idea of setting up the tool chain to work on THAT PC makes me feel as if I’ve got chicken pox, but that was the whole idea of buying it.

Thanks for your input DP


Doesn’t forward-compatible mean that my code will be rock solid once it meets future GPUs? … which is exactly what is happening here? I might not be the brightest tree in the wood, but what else should it mean?

[QUOTE=CarstenT;1292488]The problems appear on my /new/ and modern ultrabook running Win 10 Pro with an Intel family CPU (it displays the number 5300 somewhere).
As for the host of my own OpenGL works saved over time, they seem to have one thing in common:
they display nothing on screen except perhaps the result of the first call, glClearColor(some_color).
I’ve run the extensions viewer and it finds support up to and including version 4.4. …

there seem to be errors in the shaders. …
the errors don’t make much sense (they refer to ‘FragUserData’ and ‘aTexture’ … not variables that I recognise).[/QUOTE]

Ok, so you’re switching from an old NVidia GeForce GPU to an Intel HD Graphics 5300 GPU embedded in your CPU which has OpenGL 4.4 drivers installed (though we’re not sure whether that’s OpenGL 4.4 Core Profile, or OpenGL 4.4 Compatibility Profile).

Doing a quick websearch for the shader error fragments you mentioned, I see:

[ul]
[li]Anti Aliasing and LightScatteringFilter[/li]
[li]‘texture2D’ : function is removed in Forward Compatibile context #13[/li]
[li]Crash on startup, OpenGL.error.NullFunctionError (windows, playscii 0.5.1)[/li]
[/ul]

The common thread is that folks running on their embedded Intel GPU with Intel drivers seem to hit this problem, but when they flip to an NVidia GPU with NVidia drivers the problem goes away.

A piece of the error messages listed gives a clue:


WARNING: 0:52: 'texture2D' : function is deprecated and not available in Core Profile context 

or


ERROR: 0:15: 'texture2D' : function is removed in Forward Compatibile context

So, it may be that you are (as you indicated) creating a forward-compatible or core context, and your shaders are using deprecated or removed OpenGL features. That seems like the best bet. Alternatively, it could be that your shaders are buggy, the drivers don’t support the compatibility profile very well, or the driver may have a bug.

If you want the widest OpenGL feature support possible, create a GL context with the “compatibility profile”. NVidia and AMD have wide support for this across all OpenGL versions. Other vendors have more limited support. You should check into your Intel’s GL drivers and determine the maximum “core profile” version your drivers support as well as the maximum “compatibility profile” version they support, and write your code accordingly.
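
If it helps, here is a minimal sketch (GLFW assumed, purely for illustration) of asking for a 3.3 compatibility profile and printing what you actually get:

// Sketch: request a 3.3 *compatibility* profile instead of a forward-compatible core profile.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_COMPAT_PROFILE);
    // GLFW_OPENGL_FORWARD_COMPAT is deliberately not set (defaults to off).
    GLFWwindow* w = glfwCreateWindow(640, 480, "compat test", NULL, NULL);
    if (!w) {
        std::printf("driver refused a 3.3 compatibility context\n");
    } else {
        glfwMakeContextCurrent(w);
        std::printf("got: %s\n", (const char*)glGetString(GL_VERSION));
        glfwDestroyWindow(w);
    }
    glfwTerminate();
    return 0;
}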

Doesn’t forward-compatible mean that my code will be rock solid once it meets future GPUs? … what else should it mean?

I can see that interpretation. But what it actually does is disable as many “old OpenGL features” as possible. The idea is that if you accidentally use a feature deprecated in some OpenGL version, it might be removed in a future version. So “forward compatible” doesn’t even let you get started using it when you are writing the code. It makes it completely unavailable.

For more on Forward-compatible Contexts, see this link: OpenGL_Context#Forward_compatibility
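
To make the practical difference concrete, here is a small before/after sketch (the sampler, varying, and uniform names are made up, not taken from your code):

// Legacy-style GLSL: compiles in a compatibility context, but texture2D() and
// gl_FragColor are deprecated/removed in a core or forward-compatible context.
const char* legacy_frag =
    "#version 330 compatibility\n"
    "uniform sampler2D tex;\n"
    "void main() { gl_FragColor = texture2D(tex, gl_TexCoord[0].st); }\n";

// Core-profile GLSL 3.30: declare your own output and use the generic texture() call.
const char* core_frag =
    "#version 330 core\n"
    "uniform sampler2D tex;\n"
    "in vec2 uv;\n"
    "out vec4 FragColor;\n"
    "void main() { FragColor = texture(tex, uv); }\n";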

Hi DP,

“Ok, so you’re switching from an old NVidia GeForce GPU to an Intel HD Graphics 5300 GPU embedded in your CPU which has OpenGL 4.4 drivers installed (though we’re not sure whether that’s OpenGL 4.4 Core Profile, or OpenGL 4.4 Compatibility Profile).”

… just to prevent misunderstanding: my OpenGL work has been done on the Win 7 PC, and the .exe’s are then run on the Win 10 PC … which is where everything goes wrong. Execution shows no errors on Win 7. The extensions viewer says version 4.4 for the Win 10 PC and 3.3 for the Win 7 PC.

I had a quick go at the Win 7 source code, changing the targeted version from 3.3 to 3.1 or 3.0, and the debugger pointed to a few functions that were used but not supported (one of which was somewhere in the uniform-location handling) … the cascade of changes needed kept me from going any further.

I should have a look at the texture2D stuff … it could make sense as a single source for some of the errors.

Both the PC type (ultrabook, Lenovo ThinkPad) and Win 10 are new to me, so I’m faced with a lot of changes to my usual PC routines. I didn’t install Win 10 myself, and whoever did may not have installed any of Lenovo’s updates - I’m investigating that possibility right now. It would make sense if all the problems could be fixed there.

I’ve copied your reply for closer scrutiny.

I’ll pop by and tell you the progress,
thanks & Cheers

hi again
For now, my ‘new’ PC is a lost cause.
I just checked the graphics driver for that PC online (it provides no OpenGL info on its own). The driver that is loaded should support OpenGL 4.x. You probably know the bewildering situation on previous Windows versions: you had to update the driver yourself to get the OpenGL functionality. I shall assume that the driver side is OK. That leaves the principal question:
If I build an executable on Win 7 against an OpenGL 3.3 forward-compatible core profile, it should run on Win 10 with OpenGL 4.x. Right?
If I’ve got that wrong, which profile should I build against on Win 7, if I want to avoid the full shebang?

Hi,
I’ve been tempted to start a new thread on this, but that’s bad etiquette …
To recap: I’ve built a program (with the tool chain on the Win 7 machine, OpenGL 3.3) that runs flawlessly there, but not on the other laptop with Win 10 (OpenGL 4.4). I’ll skip further info here since the error info I get from the Win 10 machine sort of cries to the skies:

It’s compiling the fragment shader that fails.
The program contains two program objects, each with its own shaders.
The first one does no more than set the fragment color:

fragment_shader_string=
    "#version 330 core\n"
    "precision highp int;\n"
    "precision highp float;\n"
    "flat in vec4 coll;\n"
    "flat out vec4 FragColor;\n"
    "void main(){\n"
    " FragColor = coll ;\n"
    "}\n";

the error it spawns is at line 5: flat out vec4 FragColor;
Oddly enough, the short error refers to ‘FragUserData’ (‘… cannot be interpolated’) and not to the name of the variable.

The other program-object is more complicated since it involves a texture:
fragment_shader_string=
    "#version 330 core\n"
    "precision highp int;\n"
    "precision highp float;\n"
    "uniform samplerRect aTexture;\n"

the error here is at line 4, and the error string does refer to the variable by its name, ‘aTexture’.

One of the links DP posted refers to some texture2D trouble, but, as you can see, the simple shader breaks even on the simplest variable definition. … Changing my entire texture setup would be next to impossible, and probably wouldn’t solve the real problem, whatever that is.

I’ve had a look at the context used, and I cannot find anything suspect about it. I’m using GLFW, setting a forward-compatible core profile, adding "#version 330 core" to all shaders, and requesting 3.3 as major/minor in the context request. If I had used anything deprecated, the compilation on the Win 7 machine would have warned me.

I have attempted to update the graphics driver but was apparently dismissed with ‘you already have the best one’.
If I should look for drivers anywhere other than the vendor of the machine or of the GPU/CPU (it’s a combined chip), I’m blank on what and how.
I suppose this is the sweet revenge in an ongoing war.
I suppose that this is the sweet revenge in an ongoing war

It’s up to you how much effort this is worth. Personally, I’d just pick up an inexpensive NVidia GPU and install NVidia graphics drivers. Then you can stop tripping over driver issues and get on with solving more interesting problems. Later on, if/when you need more GPU horsepower, you can consider an upgrade.

But you said this was a laptop. Sorry. That’s one reason I avoid them – upgradability on those things stinks.

hi dp,
If I get you right: this PC has the GPU built into the CPU … no physical card to replace.
A previous laptop of mine allowed graphics-driver updates … the default driver did indeed ignore OpenGL.

Wasn’t the problem of missing compatibility solved by the arrangement between Microsoft (and others) and OpenGL? … as a prerequisite for making software that involves graphics, a prerequisite for cooperation?
I want to know what I’m doing wrong, or who is responsible for the foul-up. Microsoft is my obvious suspect. This would … back in the day … have stirred quite an uproar. Why doesn’t it now? The problem is general enough to involve both of my Win 10 laptops.

DP, tell me: how do I dump two years of intense work into the bin, or live with the fact that my work is not presentable on modern PCs?

[QUOTE=CarstenT;1292974]
If I get you right: this PC has the GPU built into the CPU … no physical card to replace.[/QUOTE]

Probably. As I mentioned before, check the specs of your laptop or run GPU-Z or a similar program to see if yours has another GPU in it. Some laptops have both the on-board Intel GPU in the CPU chip and an NVidia GPU that can render frames alongside it, and you can toggle rendering between the two GPUs. You can also just check for the presence of another GPU in your Device Manager (look under “Display Adapters”).

Wasn’t the problem of missing compatibility solved by the arrangement between Microsoft (and others) and OpenGL?

Before you go there, you should verify that your problems are definitely not due to something you’re doing wrong. It doesn’t sound like it, but we should be sure. Post a short, stand-alone test program that illustrates your problem for folks to review and try locally. If your issue isn’t due to an error on your part, then…

This isn’t a question of whether graphics drivers can work well on Windows “in general”. This is an issue of the quality of the graphics drivers written by individual GPU vendors.

Some GPU vendors “are” producing graphics drivers with high quality and capability. Some are not. As always, as a consumer you have to be aware of which vendors are which and make your hardware purchases accordingly.

DP, tell me: how do I dump two years of intense work into the bin, or live with the fact that my work is not presentable on modern PCs?

If in fact your problems are due to driver quality (yet to be proven), you have several choices: 1) figure out how to work around the issues in the graphics driver you have, 2) punt on that and just choose GPUs from vendors with better graphics driver quality, or 3) run an OpenGL implementation that executes the pipeline on your CPU (e.g. Mesa3D).

Nothing so drastic as dumping years of intense work. Just vote with your dollars like the rest of us do.

hi dp,
I’ll scrutinize the points you’ve raised.
I’d hoped to get a transportable, low-power-consumption PC (to power with a solar cell in the field) so I could leave home and work elsewhere. The first laptop was a rather expensive one, but second hand; the second was the cheapest you can get. There doesn’t seem to be any difference between the errors they throw on the program - the cheap one doesn’t fail on the solar recharging though, but that is off topic. Due to my goal, the setup has already provoked a cascade of dubious investments, which has made me touchy. Both of the PCs are Lenovo (ThinkPad/IdeaPad) with ‘Intel inside’ … it’s hard to imagine Win 10 and Intel fouling up their cooperation, but I’ll certainly take a look at it. Neither PC mentions the word OpenGL anywhere, but you know how it is. The net and my PCs are in two different places, but I’ll let you know later.
Thank you for responding.

hi dp,
I’ve copy-pasted some usable sample code for setting up initialization to test the shader that does not work. In the process I happened to notice GLEW_EXPERIMENTAL=true … this could be an obvious candidate in the forward-context problem. I’ll have to do some running back and forth to check which version of GLEW I’m using.
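
The sample code I copied uses roughly this pattern (a sketch assuming GLEW + GLFW; I still have to verify it against my own setup):

// Sketch: typical GLEW initialization for a core/forward-compatible context.
// Assumes a GL context has already been created and made current (GLFW in my case).
#include <GL/glew.h>
#include <iostream>

bool initGlew() {
    glewExperimental = GL_TRUE;           // let GLEW load core-profile entry points
    GLenum err = glewInit();
    if (err != GLEW_OK) {
        std::cout << "glewInit failed: " << glewGetErrorString(err) << std::endl;
        return false;
    }
    // glewInit() can leave a spurious GL_INVALID_ENUM behind in core contexts;
    // read and discard it so later error checks start clean.
    while (glGetError() != GL_NO_ERROR) {}
    return true;
}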

---- the forum app fouled up my first attempt to post the sample code

hi dp,
This editor failed my first attempt to add the sample code.
Here is a copy-paste of the original functions used. The GLFW and GLEW versions used are still the latest released versions.
The vertex shaders pass compilation, but even the simplest fragment shader does not. The PC that compiles the .exe has a ‘real’ graphics card (NVidia).

– ugly code-warning –
I have organised all the ‘program stuff’ into a base class and derived classes.
“glSystem::gui_program* p_gui = new glSystem::gui_program;”
creates such a derived object. The gui_program mBuffer struct holds all the IDs (program, VAO, …) and much more.


GLFWwindow *initGlfw(const char* titl){
    if (!glfwInit())
    {
        std::cout << "initting glfw failed\n" << std::endl ;
        exit(EXIT_FAILURE);
        return NULL;
    }
    else{
        std::cout << "glfw is initiating\n" << std::endl ;
    }
    glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GL_TRUE);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_CLIENT_API, GLFW_OPENGL_API) ;
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE );
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_RED_BITS, 8 );
    glfwWindowHint(GLFW_GREEN_BITS, 8 );
    glfwWindowHint(GLFW_BLUE_BITS, 8 );
    glfwWindowHint(GLFW_ALPHA_BITS, 8 );
    glfwWindowHint(GLFW_DEPTH_BITS, 24 );
    glfwWindowHint(GLFW_DOUBLEBUFFER, GL_TRUE );
    GLFWwindow *m_window = glfwCreateWindow((int) sys::s.paneWidth, (int) sys::s.paneHeight, titl, NULL, NULL);
    if (!m_window)
    {
        glfwTerminate();
        std::cout << "init window failed\n" << std::endl ;
        exit(EXIT_FAILURE);
        return NULL;
    }
    else{
        std::cout << "init window should be successful\n" << std::endl ;
    }
    glfwMakeContextCurrent(m_window);

    return m_window;
}
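
One thing I notice while pasting: I request GLFW_OPENGL_DEBUG_CONTEXT above but never install a callback. A sketch of what could be added right after glfwMakeContextCurrent (assuming GLEW is initialized and the driver actually exposes KHR_debug / GL 4.3):

// Sketch: route driver debug messages to stdout. Assumes GLEW has been
// initialized and the context supports KHR_debug (core in GL 4.3).
#include <GL/glew.h>
#include <iostream>

void GLAPIENTRY debugCallback(GLenum source, GLenum type, GLuint id, GLenum severity,
                              GLsizei length, const GLchar* message, const void* userParam)
{
    std::cout << "GL debug: " << message << std::endl;
}

void enableDebugOutput()
{
    if (GLEW_KHR_debug) {
        glEnable(GL_DEBUG_OUTPUT);
        glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report at the offending call
        glDebugMessageCallback(debugCallback, nullptr);
    }
}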








glSystem::gui_program* get_GUI(){
    cout << "builds get_GUI()\n";
    glSystem::gui_program* p_gui = new glSystem::gui_program;
    vt.add_type(geometry::eVertexComponent::Position);
    p_gui->setVertexType(vt);
    p_gui->add_uniform_name("color", glSystem::eUnifType::vec4);

    p_gui->add_uniform_name("matrix", glSystem::eUnifType::matrix);
    p_gui->add_uniform_name("offset", glSystem::eUnifType::vec4);
    p_gui->add_uniform_name("utility", glSystem::eUnifType::ivec4);
    p_gui->vertex_shader_string=
        "#version 330 core \n"
        "precision highp int;\n"
        "precision highp float;\n"
        "layout(location = 0) in vec4 position; \n"
        "uniform vec4 color; \n"
        "uniform mat4 matrix; \n"
        "uniform vec4 offset; \n"
        "uniform ivec4 utility;\n"
        "vec4 tmp ; \n"
        "vec4 tmp_2 ; \n"
        "flat out vec4 coll;\n"
        "void main()\n"
        "{\n"
        " tmp_2 = position+offset ;\n"
        " tmp = matrix*tmp_2 ;\n"
        " tmp.w = 1.0;\n"
        " coll = color ;\n"
        " if(utility.y==1){coll.w=0.5;} ;\n"
        " gl_Position= tmp ;\n"
        "}\n";

    p_gui->fragment_shader_string=
        "#version 330 core \n"
        "precision highp int;\n"
        "precision highp float;\n"
        "flat in vec4 coll;\n"
        "flat out vec4 FragColor;\n" //compile-error: not exportable (cannot be interpolated) in win10
        "void main(){ \n"
        "     FragColor = coll ;\n"
        "}\n";

    sys::makeSysProgram(p_gui, GL_FALSE);

    p_gui->init_locations();

    vector<GLuint> e(6*64, 0);
    e.at(1)=1;
    e.at(2)=2;
    e.at(4)=2;
    e.at(5)=3;
    for(GLuint i=1; i<64; i++){
        e.at(i*6 + 0) = e.at(0) + 4*i ;
        e.at(i*6 + 1) = e.at(1) + 4*i ;
        e.at(i*6 + 2) = e.at(2) + 4*i ;
        e.at(i*6 + 3) = e.at(3) + 4*i ;
        e.at(i*6 + 4) = e.at(4) + 4*i ;
        e.at(i*6 + 5) = e.at(5) + 4*i ;
    }
    p_gui->allocate_indices(e);
    return p_gui;
}






GLuint loadProgram( glSystem::Program_base* sProg, GLboolean bTransfFeed ){
    cout << "enters loadProgram\n";
    sProg->mBuffer.Program = glCreateProgram();
    GLuint vertShader;
    try
    {
        vertShader = loadShader( sProg->vertex_shader_string, eShader::eVertex ) ;
        sys::checkErrors("loadProgram_VerShader err? : ") ;
    }
    catch(std::exception &e){ std::cout << "an exception occurred in vertShader:\n " << e.what() << std::endl ; }
    glAttachShader(sProg->mBuffer.Program, vertShader);
    cout << "loadProgram created vertShader: " << vertShader << "\n";

    GLuint fragShader;

    try
    {
        fragShader = loadShader( sProg->fragment_shader_string, eShader::eFragment ) ;
        sys::checkErrors("loadProgram_FragShader err? : ") ;
    }
    catch(std::exception &e){ std::cout << "an exception occurred in eFragment:\n " << e.what() << std::endl ; }
    glAttachShader(sProg->mBuffer.Program, fragShader);
    cout << "loadProgram created fragShader: " << fragShader << "\n";
    if(bTransfFeed){
        glTransformFeedbackVaryings(sProg->mBuffer.Program, 1, attribNames, GL_SEPARATE_ATTRIBS);
        sys::checkErrors("loadProgram glTransformFeedbackVaryings err? : ") ;
    }

    glLinkProgram(sProg->mBuffer.Program) ;
    sys::checkErrors("loadProgram link err? : ") ;
    GLint status1;
    glGetProgramiv(sProg->mBuffer.Program, GL_LINK_STATUS, &status1);
    if (status1 == GL_FALSE){
        checkErrors("loadProgram after linking ") ;
        std::cout << "Shader status NOT ok\n" << std::endl ;
        char infoLog[1024];
        glGetProgramInfoLog(sProg->mBuffer.Program, 1024, NULL, infoLog) ;
        std::cout << "shader failed with error : \n" << infoLog << std::endl ;
        GLuint err = 0;
        std::cerr << stderr << " Error: \n" << glewGetErrorString(err) << std::endl ;
        glDeleteProgram(sProg->mBuffer.Program) ;
        glDeleteShader(vertShader);
        glDeleteShader(fragShader);
        throw std::runtime_error("Shader could not be linked.\n");
        glfwTerminate();
        exit(1);
    }
    else{ }

    GLint verif;
    char infoLog[1024];
    glValidateProgram(sProg->mBuffer.Program);
    glGetProgramiv(sProg->mBuffer.Program, GL_VALIDATE_STATUS, &verif);
    if(verif==true){}
    else{ std::cout << "shader.verification NOT ok" << endl ; }
    glGetProgramInfoLog(sProg->mBuffer.Program, 1024, NULL, infoLog) ;
    std::cout << "ValidationLog : \n" << infoLog << std::endl ;

    glDeleteShader(vertShader);
    glDeleteShader(fragShader);
    sys::checkErrors("loadProgram exit err? : ") ;
    return sProg->mBuffer.Program ;
}
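
loadShader isn’t pasted above; as a stand-in, here is a minimal sketch of the kind of per-shader compile check it needs (a hypothetical helper, not my actual code). The driver’s compile log is where the ‘FragUserData’ / ‘aTexture’ messages show up:

// Sketch: compile one shader and print the driver's compile log on failure.
#include <GL/glew.h>
#include <iostream>
#include <stdexcept>
#include <vector>

GLuint compileShader(const char* source, GLenum type)   // e.g. GL_FRAGMENT_SHADER
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok == GL_FALSE) {
        GLint len = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
        std::vector<char> log(len > 1 ? len : 1);
        glGetShaderInfoLog(shader, (GLsizei)log.size(), NULL, log.data());
        std::cout << "shader compile failed:\n" << log.data() << std::endl;
        glDeleteShader(shader);
        throw std::runtime_error("shader compile failed");
    }
    return shader;
}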

So, you’ve written shaders that compile on an Nvidia driver, but fail on another laptop with Intel graphics.

And what have you done to debug this?

Have you tried reading the instructions?
For example, look at page 27 in the GLSL 3.30 specification, Section 4.3 “Storage Qualifiers”, where it says:

These interpolation qualifiers … do not apply to inputs into a vertex shader or outputs from a fragment shader.

Indeed, copy-pasting your shader here on OS X, the “flat” fragment output fails to compile:

ERROR: 0:5: Invalid qualifiers 'flat' in global variable context

Or, for another easy second-opinion verification, you could paste your shader into Shader Playground (a godbolt-like website for shader compilers). Khronos’s glslang compiler there also errors as expected:


ERROR: ...tmp:5: 'flat/smooth/noperspective' : can't use interpolation qualifier on a fragment output 

Similarly, look for “samplerRect” in the GLSL specification. It doesn’t exist, does it? But look at page 14 “Variables and Types”, where all the sampler types are listed, including sampler2DRect.

The expected behavior when you type in a malformed shader is compilation failure. You’ll need to fix your shaders to follow the documented syntax if you want them to compile.

(If they actually are compiling on an Nvidia driver, that driver is broken.)
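
For illustration only, here is a sketch of how the two fragment shaders quoted above could be written to follow the spec. Whether sampler2DRect or sampler2D is the right choice depends on which texture target you actually bind, which isn’t shown in the thread:

// Sketch of spec-conforming versions of the two fragment shaders quoted above.
const char* gui_fragment_shader =
    "#version 330 core\n"
    "precision highp int;\n"
    "precision highp float;\n"
    "flat in vec4 coll;\n"          // 'flat' is fine on the input...
    "out vec4 FragColor;\n"         // ...but not on a fragment output
    "void main(){\n"
    "    FragColor = coll;\n"
    "}\n";

const char* textured_fragment_shader =
    "#version 330 core\n"
    "precision highp int;\n"
    "precision highp float;\n"
    "uniform sampler2DRect aTexture;\n"   // 'samplerRect' does not exist in GLSL 3.30
    "in vec2 texCoord;\n"                 // hypothetical input name
    "out vec4 FragColor;\n"
    "void main(){\n"
    "    FragColor = texture(aTexture, texCoord);\n"
    "}\n";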

Ooh, am I relieved …
I had gotten error responses that did not repeat and that were not handled by me … maybe the PC has been ‘wise’ about what it runs and had the capacity to do something on its own, I don’t know. Making all this fuss about a program that fails, when I could have error-tracked it on the other PC, would be insane … it works perfectly there. I’ll send you a thumbs up when I’ve corrected my way to a working program.

I’m very grateful.
Carsten

8oO … it works …
I’m down on my knees in awe and admiration.
Just changed the three words.
I’ve made an algorithm that splits a contour array (covering an irregular polygon with islands) into an array of geometric fan data, to be processed in OpenGL or wherever you choose. The test app, including fonts and text, is all my own doing and spans 50,000+ lines. The only good thing to say about it is that it works. I still haven’t figured out whether it solves a problem of general interest.

I deeply appreciate your intervention
Carsten

I’ve tried to attach an image of the app and how far it can go: a screenful of hand-clicked polygon-points:

[ATTACH=CONFIG]1893[/ATTACH]
