Reading Ints, Graphics card crashes

I am attempting to pass integers to a shader and use them as attributes, but my graphics card keeps crashing. Are integers harder on the GPU, and should I just stick to floats and cast them later?

Provided that your card supports OpenGL 3.0 or later, there should be no problem with using integer attributes.
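One common pitfall worth ruling out: integer attributes have to be specified with glVertexAttribIPointer (note the I), since plain glVertexAttribPointer converts the data to floats. Here is a minimal sketch of the setup; the function name, the attribute name "inIndex", and the program/VBO handles are placeholders, not taken from your code:

```c
#include <GL/glew.h>   /* or whatever loader you already use */

/* Sketch: feed one per-vertex GLint to the shader attribute "inIndex"
 * (declared in the vertex shader as:  in int inIndex;  // GLSL 1.30+). */
static void setup_int_attrib(GLuint program, GLuint vbo)
{
    GLint loc = glGetAttribLocation(program, "inIndex");

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(loc);

    /* Integer attributes need the "I" variant; glVertexAttribPointer
     * would silently convert the values to float. */
    glVertexAttribIPointer(loc, 1, GL_INT, 0, (const void *)0);
}
```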

Try to reduce your program to a simple test case which demonstrates the problem. Ensure that your code compiles without warnings (even with most warnings enabled), and that glGetError() reports no errors (you don't need to call it after every OpenGL call, just at the end of each major function in your program).
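For the glGetError() checks, something like the following helper works; the macro and function names here are my own, not from your code:

```c
#include <stdio.h>
#include <GL/glew.h>

/* Call at the end of each major function; drains all pending error flags. */
#define CHECK_GL_ERROR() check_gl_error(__FILE__, __LINE__)

static void check_gl_error(const char *file, int line)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        fprintf(stderr, "%s:%d: glGetError() = 0x%04X\n", file, line, err);
}
```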

Alright. Since I knew it wasn't OpenGL, I was able to single out the error within my code. Thanks.