Using GL_BYTE for normal array
02-18-2008, 12:30 PM
I know that some ATI cards fall back to software rendering if a 24-bit color array is used. Does anyone know whether using GL_BYTE as the data type for the normal array causes the same problem? I would prefer not to use a generic vertex attribute if I can avoid it.
Erm, AFAIK it won't drop to software mode, but it will definitely be slow (I tested that about half a year ago).
It's better to use GL_UNSIGNED_BYTE with 4 components and unpack the data into the [-1, 1] range manually in the shader. You might even find some useful data to store in the fourth component.