Use a texture from another application

Hi,
I am not a complete beginner in OpenGL; I know the basics, but I would like to perform a more complex task.

Let’s say I have an OpenGL app running on Win7; call it AppA. I don’t have full control over that app, but I can get the texture IDs it is using. Then I would like to code another app (AppB) to use the texture from AppA. AppB is a DLL.

The running process is as follows:

  1. Start AppA
  2. AppA sends the texture ID to AppB through a DLL
  3. AppA starts AppB (set pixel format, create window) through a DLL (see the sketch after this list).
    Then AppA and AppB work in parallel.
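
For concreteness, here is a minimal sketch of steps 2–3 from AppA’s side, assuming AppA can load a DLL and call plain C exports (the export names are the ones detailed later in this thread; “hookUpAppB” and “AppB.dll” are hypothetical):

    #include <windows.h>

    typedef void (*SetTextureIDFn)(int);
    typedef void (*InitStereoWindowFn)(void);

    void hookUpAppB(int tid)   /* hypothetical helper running inside AppA */
    {
        HMODULE dll = LoadLibraryA("AppB.dll");   /* hypothetical DLL name */
        SetTextureIDFn setTextureID =
            (SetTextureIDFn)GetProcAddress(dll, "setTextureID");
        InitStereoWindowFn initStereoWindow =
            (InitStereoWindowFn)GetProcAddress(dll, "initStereoWindow");

        setTextureID(tid);     /* step 2: hand over the texture name */
        initStereoWindow();    /* step 3: create the stereo window */
    }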

AppA and AppB have different pixel formats (AppB uses quad-buffer stereo). Each app has its own window (and therefore its own device context, right?).

My problem : I cannot get the texture to work in AppB.

My attempts
(in every attempt I bind the texture from AppA using its ID with “glBindTexture(GL_TEXTURE_2D, tid)”):
a) Directly use the texture in AppB with glBindTexture and the texture ID (tid) from AppA => not working.
b) Init AppB with the same render context as AppA (using freeglut: “glutSetOption(GLUT_RENDERING_CONTEXT, GLUT_USE_CURRENT_CONTEXT)”) => not working.
c) Make the two RCs share data with “wglShareLists(…)”.
AppB is a DLL, here is its init function.


    void EXPORT_API initStereoWindow()
    {
        int argc = 1;
        char arg0[] = "plugin";     // glutInit may modify argv, so give it writable storage
        char* argv[] = { arg0 };
        glutInit(&argc, argv);

        // keep the DLL alive if the stereo window gets closed
        glutSetOption(GLUT_ACTION_ON_WINDOW_CLOSE, GLUT_ACTION_CONTINUE_EXECUTION);
        //glutSetOption(GLUT_RENDERING_CONTEXT, GLUT_USE_CURRENT_CONTEXT);

        // grab AppA's context before freeglut creates a new one
        HGLRC originalRC = wglGetCurrentContext();

        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_STEREO);
        gWindowId = glutCreateWindow("StereoRenderer");

        // the context freeglut just created for the stereo window
        HGLRC quadRC = wglGetCurrentContext();

        bool ok = (originalRC == quadRC);                   // ok is false: two distinct contexts
        ok = (wglShareLists(originalRC, quadRC) != FALSE);  // ok is true: sharing "succeeds"

        glutDisplayFunc(display);
    }

So AppA starts, loads the AppB DLL, then calls the “initStereoWindow()” function. That function uses “wglShareLists(…)” to make the RCs share data.
==> not working.

In each case, AppA calls “glutMainLoopEvent()” (the freeglut equivalent of a non-blocking “glutMainLoop()”) before swapping its own buffers.
One more thing: in case c) the clear color in AppB is the same as in AppA. Since I never set it (with glClearColor), I assume that data is shared between the apps.
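
In other words, the per-frame hook in the DLL is essentially the following sketch (the name updateStereoWindow comes from a clarification further down; note that glutMainLoopEvent() leaves AppB’s context current, so AppA may need to wglMakeCurrent its own context back before rendering):

    void EXPORT_API updateStereoWindow()
    {
        glutSetWindow(gWindowId);   // make freeglut target AppB's window
        glutPostRedisplay();        // ask freeglut to call display()
        glutMainLoopEvent();        // run one non-blocking iteration of the event loop
    }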


Does anyone know how I can achieve that?
Put simply: I would like to share a texture between two apps (one that I can control only by calling into a DLL, and another one that I have coded).

Thanks in advance.


Appendix
Display and helper functions


    void drawTexture(unsigned int tid)
    {
        glEnable(GL_TEXTURE_2D);
        //glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
        glBindTexture(GL_TEXTURE_2D, tid);

        // textured quad covering the viewport in normalized device coordinates
        glBegin(GL_QUADS);
        glTexCoord2d(0.0, 0.0); glVertex2d(-1.0, -1.0);
        glTexCoord2d(1.0, 0.0); glVertex2d(+1.0, -1.0);
        glTexCoord2d(1.0, 1.0); glVertex2d(+1.0, +1.0);
        glTexCoord2d(0.0, 1.0); glVertex2d(-1.0, +1.0);
        glEnd();

        glDisable(GL_TEXTURE_2D);
    }

    void EXPORT_API display(void)
    {
        glPushMatrix();

        // left eye
        glDrawBuffer(GL_BACK_LEFT);
        glClear(GL_COLOR_BUFFER_BIT);
        glLoadIdentity();
        drawTexture(gLeftTexId);

        // right eye
        glDrawBuffer(GL_BACK_RIGHT);
        glClear(GL_COLOR_BUFFER_BIT);
        glLoadIdentity();
        // note: with the default identity projection this translate pushes the
        // quad outside the [-1,1] clip volume, so the right eye draws nothing
        glTranslatef(0, 0, -2.0f);
        drawTexture(gRightTexId);

        glPopMatrix();
        glutSwapBuffers();
    }

> Then I would like to code another app (AppB) to use the texture from AppA. AppB is a DLL.

Then it is not “another app”, since it’s just a .dll.

> glutInit(&argc, argv);

I’m not comfortable at all with using GLUT to create this window. I’m not saying that it won’t work or that it doesn’t work. But GLUT is intended to be used by itself, and this use case is not something that GLUT ensures will work.
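
For anyone avoiding GLUT here, the raw WGL path is roughly the following sketch (hwnd is a window created with CreateWindow and originalRC is AppA’s context; both names are assumptions):

    #include <windows.h>

    HGLRC createStereoContext(HWND hwnd, HGLRC originalRC)
    {
        PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1 };
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                       | PFD_DOUBLEBUFFER | PFD_STEREO;   /* request QBS */
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 24;

        HDC hdc = GetDC(hwnd);
        int fmt = ChoosePixelFormat(hdc, &pfd);   /* the driver may silently drop PFD_STEREO */
        SetPixelFormat(hdc, fmt, &pfd);

        HGLRC rc = wglCreateContext(hdc);
        wglShareLists(originalRC, rc);            /* share while rc still owns no objects */
        wglMakeCurrent(hdc, rc);
        return rc;
    }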

> AppA sends the texture ID to AppB through a DLL

Do you mean you generate a new DLL? That seems strange.

Sorry, I have to clarify that…
AppB is a DLL with 4 important functions (sketched as a header after this list):
- setTextureID(int id): called by AppA to specify the texture’s ID; stores the ID in a global variable.
- initStereoWindow(): called by AppA to init the “second window”. @Alfonse: perhaps using GLUT is not the right way.
- display(): never directly called by AppA.
- updateStereoWindow(): called by AppA every frame, before it swaps its own buffers. This function calls “glutMainLoopEvent()”.
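
In header form, the interface is roughly this (a sketch; EXPORT_API stands for the dllexport declaration used in the code above):

    extern "C" {
        void EXPORT_API setTextureID(int id);      /* store AppA's texture name in a global */
        void EXPORT_API initStereoWindow(void);    /* create the QBS window and context */
        void EXPORT_API updateStereoWindow(void);  /* pump freeglut once per AppA frame */
        void EXPORT_API display(void);             /* freeglut display callback */
    }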

I hope it’s clearer with all that in mind.

By the way, thank you for your replies (arts and Alfonse).

> Does wglShareLists succeed?

Yes, it returns true, and the HGLRC values are different (0x10001 and 0x10002, if that is useful to know).
Moreover, it seems the clear color is shared: AppB clears its buffer with the same color as AppA, even though I never explicitly set it.

Hi everyone.
In fact, my true goal is to run a regular OpenGL application in quad-buffer (QB) mode.
Here I am trying to generate two textures with AppA; AppB will then use them in QB mode. Perhaps there is a simpler way to achieve that.

I am thinking of other solutions, such as forcing OpenGL to run in QB mode:
a) By using a custom OpenGL DLL (opengl32.dll) that always activates the GL_STEREO pixel format => Do you know how to do that?
b) By tweaking the driver (NVIDIA Quadro 4000) => After googling, I found that NVIDIA cards have a “force stereo shuttering” option, but I see nothing like that in my driver…

After that, I can use an external DLL from AppA to select the correct buffer (left/right), but I am not sure that will work…

Any help is welcome, thanks.

Hi, perhaps I should ask this question in the “Advanced” forum; I am not sure it is a beginner task, since it involves some specific skills…

> By using a custom OpenGL DLL (opengl32.dll) that always activates the GL_STEREO pixel format => Do you know how to do that?

You can shove that into the pixel format all you want, but you will only get a QBS context if the driver wants to give you one.
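
A quick way to check whether the driver actually granted it, as a sketch:

    #include <windows.h>

    /* Does the pixel format actually set on this DC still carry the stereo flag? */
    BOOL hasStereo(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        int fmt = GetPixelFormat(hdc);
        DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
        return (pfd.dwFlags & PFD_STEREO) != 0;   /* FALSE => no QBS context */
    }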

Hi,
the driver is able to run in QB mode; I have already coded QB stereo programs. The custom DLL would be there to force AppA to init in QB mode; then I would only need to select the correct buffer during the render process.
But I don’t know how I can build my own opengl32.dll…

I imagine having your own opengl32.dll means having to intercept all GL 1.1 calls and also the wglGetProcAddress calls. Have a look at the GLIntercept code.
http://www.opengl.org/wiki/Debugging_Tools

If you mean you want to create a real opengl32.dll like Microsoft does, that is not possible. It would require a license from MS, and that is not going to happen.
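
For what it’s worth, the usual trick is a proxy DLL: a fake opengl32.dll placed next to AppA’s executable that forwards everything to the real one in system32 and rewrites the pixel format on the way through. A heavily simplified sketch (it assumes AppA’s pixel-format request goes through opengl32’s wglChoosePixelFormat; a real proxy, which is what GLIntercept automates, must also forward every GL 1.1 entry point and wglGetProcAddress):

    #include <windows.h>
    #include <string.h>

    typedef int (WINAPI *PFN_wglChoosePixelFormat)(HDC, const PIXELFORMATDESCRIPTOR *);

    static HMODULE realGL;   /* the genuine system32\opengl32.dll */

    static FARPROC realFunc(const char *name)
    {
        if (!realGL) {
            char path[MAX_PATH];
            GetSystemDirectoryA(path, MAX_PATH);
            strcat(path, "\\opengl32.dll");
            realGL = LoadLibraryA(path);
        }
        return GetProcAddress(realGL, name);
    }

    __declspec(dllexport) int WINAPI wglChoosePixelFormat(HDC dc, const PIXELFORMATDESCRIPTOR *pfd)
    {
        PIXELFORMATDESCRIPTOR forced = *pfd;
        forced.dwFlags |= PFD_STEREO;   /* force quad-buffer stereo into every request */
        PFN_wglChoosePixelFormat real =
            (PFN_wglChoosePixelFormat)realFunc("wglChoosePixelFormat");
        return real(dc, &forced);
    }
    /* ...plus forwarders for every other opengl32 export (all GL 1.1
       entry points, wglGetProcAddress, etc.). */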

Ok, hummmm :stuck_out_tongue:
I found GLIntercept during my previous research and downloaded the sources… but it’s a maze. I am not sure I can compile a DLL that just forces QB mode.
I need to keep performance as high as possible; that is the reason for my interest in “texture sharing”/“custom DLL”. The simplest solution would have been to copy the texture buffer between the apps, but that involves some dirty, slow code.

Thanks again, I hope we can get to a solution.

I don’t understand something. You want to force the application to run in QBS mode. But… didn’t you write that application? You seem to have some control over it, if you can force it to load your “AppB” .dll file.

In any case, you had better have control over it, because you’re not going to make QBS work without it. You can’t simply flip a switch and get quad-buffer stereo to work. You have to change what buffers are rendered to (GL_BACK_LEFT and GL_BACK_RIGHT). You have to change what the viewing matrices are, to have two individual eyes and actual stereoscopic effects. You have to render the scene twice. And so forth.

Simply overriding OpenGL32.dll to make it force QBS into the pixel format isn’t going to be enough. Unless you actually are the OpenGL driver, you can’t make an application go stereoscopic without its consent. And even then, the driver sucks at it.

It’s a special case, because I did not code AppA… AppA has its own initialization process, but it lets me execute some actions during its rendering step: I can call foreign functions from a DLL. Since I want to make AppA go QBS, my plan is as follows:

  1. Force AppA to initialize with the QBS pixel format by using a custom DLL.
  2. In AppA, select the correct buffer by calling into a DLL (execute glDrawBuffer(GL_BACK_LEFT); see the sketch after this list).
  3. Modify the frustum/camera + render…
  4. In AppA, select the correct buffer via the DLL (execute glDrawBuffer(GL_BACK_RIGHT)).
  5. Modify the frustum/camera + render…
  6. Let AppA swap the buffer(s) as it usually does (since the pixel format is QBS, swapping will act on all the back buffers).
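
Steps 2 and 4 would then be tiny exports in the helper DLL, something along these lines (the function names are hypothetical):

    void EXPORT_API selectLeftBuffer(void)
    {
        glDrawBuffer(GL_BACK_LEFT);    /* subsequent draws go to the left back buffer */
    }

    void EXPORT_API selectRightBuffer(void)
    {
        glDrawBuffer(GL_BACK_RIGHT);   /* ...and these to the right back buffer */
    }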

It’s ok, I am aware of the particular steps involved in stereo mode. I plan to execute them via a DLL, as explained previously.

In summary, I need to force AppA to run in QBS mode.

Perhaps this is too obvious, but it’s worth asking: are you sure that the texture you want to share is created after the wglShareLists call?
My GL is a little rusty, but I think (correct me if I am wrong) that only things created after the wglShareLists call are going to be shared.

Also, just out of curiosity: what does glIsTexture return in AppB with the relevant texture id?
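
That is, with AppB’s context current, something like:

    /* Sketch: run in AppB while its context is current. */
    GLboolean visible = glIsTexture(tid);  /* GL_TRUE only if tid names a texture in this share group */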

Thanks for your reply.
You are right: wglShareLists is called after the texture creation. I think you have pinpointed the problem.
What if I call wglShareLists before the texture creation, but after AppA’s initialization?

The glIsTexture test is good advice; I will try it and report back.

Thanks.

You should always call wglShareLists before creating any objects. Preferably before objects in any context are created; if not, then only the source context should have objects.
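
In code, the safe ordering is roughly this (srcRC is AppA’s context and hdcB is AppB’s device context; both names are hypothetical):

    /* Create AppB's context and share it while it still owns nothing. */
    HGLRC makeSharedContext(HGLRC srcRC, HDC hdcB)
    {
        HGLRC dstRC = wglCreateContext(hdcB);  /* brand new, owns no objects yet */
        wglShareLists(srcRC, dstRC);           /* srcRC may own objects; dstRC must not */
        return dstRC;                          /* now create/bind textures freely */
    }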

I used to write a driver that did exactly this. It’s a real headache. There are so many things an OpenGL program can do to make your life a nightmare: creating layer buffers, off-screen render targets with separate contexts and shared lists, using the ARB extensions to set up the pixel format for the window… I could go on. Single-buffered windows!

Quad-buffer stereo, thanks to Vista and 7, is pretty much a dead API anyway.

> Quad-buffer stereo, thanks to Vista and 7, is pretty much a dead API anyway.

What do they have to do with QBS?

Windows Aero doesn’t allow OpenGL to create windows with quad-buffered pixel formats, due to the way Aero is rendered (i.e., through Direct3D). You can create them in exclusive mode (i.e., fullscreen), but from what I remember NVIDIA will only allow shutter mode, I guess for their glasses and their own stereo system.

In XP the driver, not Direct3D, controlled the final presentation of the rendering… so you could do things like set two monitors to clone mode, and then, with an OpenGL window with a quad-buffered pixel format, get a different image on each display, for mirror mode or dual projectors etc.

I created a workaround for this under Vista which basically created two shared windows and drew to both simultaneously, but that created a new set of nightmares. You can’t have full frame rate for two windows with v-sync enabled. You can with OpenGL swap groups, but that’s an NVIDIA Quadro-only extension, and Quadros cost a bomb.

Conclusion:
until MS creates a stereo API in DirectX, things will suck.