You really don’t seem to understand the whole buffer object thing.
Buffer objects are memory. They are memory that OpenGL owns, but you tell OpenGL when to allocate it and what it contains.
glBufferData (stop using the ARB extension; buffer objects have been core OpenGL since version 1.5, over half a decade ago) allocates the memory and stores data into it. It’s a combination of “malloc” and “memcpy”.
So first, there’s no need to map the buffer since you just copied data into it. You know what’s there. There is no need to confirm it.
Second, “pattern” is the location in CPU memory that you told glBufferData to upload data from. Once the data has been uploaded, the contents of “pattern” are irrelevant.
Let’s say you have this:
int pattern[5] = {1, 2, 3, 4, 5};
int bufferObjectData[5];
memcpy(bufferObjectData, pattern, sizeof(pattern));
pattern[2] = 10;
What is the value of “bufferObjectData[2]”? It’s not 10.
After you have uploaded the data, if you are using the original pointer for anything other than size information, you’re doing it wrong. So I have no idea what you were trying to do with that glMapBufferRange call, but “pattern[0]” is likely not doing anything useful.
Third, why are you mapping the buffer just to read one byte? That’s what the third argument of glMapBufferRange is: the number of bytes to map. If all you want is a small amount of data, call glGetBufferSubData instead.
Fourth, even if you want to map a single byte (and you shouldn’t), you can’t pretend that the pointer you get back is a pointer to an int. Because you only mapped a byte, it is only legal to read a byte from the pointer you get back. The C/C++ type “int”, for any hardware you would actually run this code on, is larger than a byte. If you wanted to read just an integer’s worth of data (and again, use glGetBufferSubData), you need to get at least “sizeof(int)” bytes.
If you’re going to read from a buffer object, be serious about it. Buffer objects live in OpenGL’s memory; reading them back can force a GPU-to-CPU transfer and stall the pipeline, so do it only when you really need to.
Fifth:
glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(pattern), pattern, GL_READ_WRITE_ARB);
GL_READ_WRITE_ARB is not a legitimate buffer object usage hint; it’s an access flag for glMapBuffer. Usage hints pair a frequency (STATIC, DYNAMIC, STREAM) with a nature (DRAW, READ, COPY): GL_STATIC_DRAW, GL_STREAM_READ, and so on. The wiki article on the subject could tell you that.