View Full Version : array of integer in GLSL
09-16-2010, 01:55 PM
I'm having trouble trying to implement an array of integers. I have an array of ints on the OpenGL side, and it contains millions of items. At first I thought I could store it as a 1D texture (isampler1D) and use texelFetch to implement an intArray[index]-like operation, but I'm stuck since my isampler1D/texelFetch always returns 0. So I was wondering: is there any other option for representing a huge int array in GLSL? If so, how?
thanks in advance
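(For reference: one common reason an isampler1D fetch always returns 0 is texture completeness. Integer textures cannot be linearly filtered, and the default minification filter is GL_NEAREST_MIPMAP_LINEAR, so an integer texture created with default parameters is incomplete and every sample returns 0. A minimal sketch of a setup that avoids this, assuming a current GL 3.x context; `data` and `count` are hypothetical placeholders for your array:)

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_1D, tex);

/* Integer textures must use NEAREST filtering; leaving the default
 * mipmapped-linear min filter makes the texture incomplete, and
 * sampling an incomplete texture returns 0. */
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* A sized integer internal format (GL_R32I) must be paired with an
 * *_INTEGER pixel transfer format. */
glTexImage1D(GL_TEXTURE_1D, 0, GL_R32I, count, 0,
             GL_RED_INTEGER, GL_INT, data);
```

In the shader the fetch would then look like `texelFetch(intArray, index, 0).r` with `uniform isampler1D intArray;`.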
09-16-2010, 02:57 PM
There is an implementation-defined maximum to the dimension(s) of texture maps. It's probably on the order of 16,384 or so... much smaller than the several million you need. You could get around your problem by using a 2D texture map, which allows you to store far more elements, but it will involve somewhat more complicated addressing, since you'll have to convert your 1D index into a 2D pair of indices.
09-16-2010, 03:11 PM
Yeah, I noticed that too just now. How can I check this limitation? Is there some kind of command I can use? What about the 2D implementation, is there a limitation on that too? If I use, for example, a 2D texture with a width of 1 and a height of a million, can that be done to simplify my addressing?
thanks in advance
09-17-2010, 06:20 AM
Consider using a Texture Buffer Object. It shouldn't have this limitation.
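(A buffer texture keeps the flat 1D indexing and is bounded by GL_MAX_TEXTURE_BUFFER_SIZE rather than GL_MAX_TEXTURE_SIZE. A hedged sketch of the setup, assuming GL 3.1+ or ARB_texture_buffer_object; `data` and `count` are placeholders:)

```c
GLuint buf, tex;

/* Upload the int array into a buffer object. */
glGenBuffers(1, &buf);
glBindBuffer(GL_TEXTURE_BUFFER, buf);
glBufferData(GL_TEXTURE_BUFFER, count * sizeof(GLint), data, GL_STATIC_DRAW);

/* Attach the buffer's storage to a buffer texture with a
 * one-component signed-integer format. */
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_BUFFER, tex);
glTexBuffer(GL_TEXTURE_BUFFER, GL_R32I, buf);
```

On the shader side the sampler type changes, and note that texelFetch on a buffer texture takes no LOD argument:

```glsl
uniform isamplerBuffer intArray;
// ...
int value = texelFetch(intArray, index).r;
```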
09-17-2010, 11:41 AM
See glGet with GL_MAX_TEXTURE_SIZE for the maximum texture size; for texture buffer objects, the maximum size can be queried with GL_MAX_TEXTURE_BUFFER_SIZE. One thing that is *funny*: a GL3 implementation is only required to support values of at least 1024 and 65536 respectively, though if memory serves correctly, on both NVIDIA and ATI the actual values are much, much larger (for NVIDIA it is 134,217,728). Another approach is to use GL_NV_shader_buffer_load, but it is NVIDIA-only.
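(Querying these limits is one call each; a sketch, assuming a current GL context:)

```c
GLint maxTexSize = 0, maxTexBufSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize);           /* per-dimension limit */
glGetIntegerv(GL_MAX_TEXTURE_BUFFER_SIZE, &maxTexBufSize); /* texels in a buffer texture */
printf("max texture size: %d, max texture buffer size: %d\n",
       maxTexSize, maxTexBufSize);
```

Note that GL_MAX_TEXTURE_SIZE applies to each dimension independently, so the 1 x 1,000,000 texture asked about earlier would exceed the height limit just as a 1,000,000 x 1 texture exceeds the width limit; a buffer texture sidesteps this entirely.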