I have a program I wrote that behaves completely differently on each of my three test machines, and the difference comes from glDrawElements().

- On my work laptop (Intel graphics chipset, OpenGL 3.0, Windows XP), everything draws perfectly.
- On my home desktop (ATI Radeon HD 5770, OpenGL 4.2, Windows Vista), I get a machine exception the moment the glDrawElements call is hit.
- On my personal Lenovo laptop (ATI Radeon Mobility chipset, Windows 7), the program runs and doesn't crash, but nothing drawn with glDrawElements appears.

This is the same binary on all three computers, and all three have updated drivers. Making the situation more bizarre is that all three machines work perfectly with glDrawArrays. The only difference I can think of is the OpenGL version each machine provides — but then why does the crappy Intel chipset work perfectly?

I have attached the relevant source below. It's in Euphoria, so I'm not sure how much help it will be. Also, if my code were simply bad, why would it work perfectly on my work laptop?

This is a compiled OpenGL program built from the following Euphoria (EU) code. The entire codebase is far too large to post, but I can provide a binary if anyone thinks it would help.
vbo creation:
----------------------------
gl_genBuffers(1, vboverts)		-- vertex positions
gl_bindBuffer(GL_ARRAY_BUFFER, peek4u(vboverts))
gl_bufferData(GL_ARRAY_BUFFER, length(Verts)*4, VertsArray, GL_STATIC_DRAW)	-- 4 bytes per float
gl_bindBuffer(GL_ARRAY_BUFFER, 0)
 
gl_genBuffers(1, vbotxt)		-- texture coordinates
gl_bindBuffer(GL_ARRAY_BUFFER, peek4u(vbotxt))
gl_bufferData(GL_ARRAY_BUFFER, length(Txtr)*4, TextArray, GL_STATIC_DRAW)
gl_bindBuffer(GL_ARRAY_BUFFER, 0)
 
gl_genBuffers(1, vbonorms)		-- normals
gl_bindBuffer(GL_ARRAY_BUFFER, peek4u(vbonorms))
gl_bufferData(GL_ARRAY_BUFFER, length(Norms)*4, NormArray, GL_STATIC_DRAW)
gl_bindBuffer(GL_ARRAY_BUFFER, 0)
 
gl_genBuffers(1, iboidx)		-- triangle indices (GL_UNSIGNED_SHORT)
gl_bindBuffer(GL_ELEMENT_ARRAY_BUFFER, peek4u(iboidx))
gl_bufferData(GL_ELEMENT_ARRAY_BUFFER, length(Shapes)*2, IdxArray, GL_STATIC_DRAW)	-- 2 bytes per index
gl_bindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0)


Render:
----------------------------
gl_enable(GL_BLEND)		
gl_useProgram(glprogs[2][1])
gl_activeTexture(GL_TEXTURE0)		
gl_bindTexture(GL_TEXTURE_2D, peek4u(texture_mem + ( ( n_3dx[4][glcmds[$]]-1 ) * 4 ) ) )  
gl_uniform1i(glprogs[2][3][4], 0)
 
gl_uniform1i(glprogs[2][3][5], shade)
 
gl_uniformMatrix4fv(glprogs[2][3][1], 1, GL_FALSE, CoreMatrix)		-- push model matrix
gl_uniformMatrix4fv(glprogs[2][3][2], 1, GL_FALSE, xCamera[7] )		-- push view matrix	
gl_uniformMatrix4fv(glprogs[2][3][3], 1, GL_FALSE, PerspectiveMatrixArray )	-- push perspective matrix
 
gl_enableVertexAttribArray(0)				-- coord3d
gl_bindBuffer(GL_ARRAY_BUFFER, peek4u(n_3dx[6][1]))
gl_vertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0)	
 
gl_enableVertexAttribArray(1)				-- textcoords
gl_bindBuffer(GL_ARRAY_BUFFER, peek4u(n_3dx[6][2]))
gl_vertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, 0)	
 
gl_bindBuffer(GL_ELEMENT_ARRAY_BUFFER, n_3dx[6][3])
gl_drawRangeElements(GL_TRIANGLES, 0, n_3dx[6][5], n_3dx[6][5], GL_UNSIGNED_SHORT, n_3dx[6][4])		
errCode = gl_getError()
 
if errCode > 0 then
	writetofile(sprintf("\nRender code %d thrown!   %d\n", {errCode, rendnum}), ZA_LOG)
end if
 
gl_disableVertexAttribArray(0)
gl_disableVertexAttribArray(1)	
 
cntp += 1
 
gl_disable(GL_BLEND)


Thanks for any help!
Steve A.