Problems playing an animation created in Blender

I am attempting to play an animation created in Blender inside my OpenGL program. First of all, I wrote a custom Python exporter to convert to a custom format. It is similar to .obj, except that it also stores the bone hierarchy, vertex weight influences, as well as keyframes and bone transformation matrices per keyframe. (It also stores raw binary data rather than text.) I know my Python exporter and C++ importer handle everything correctly except the bone matrices. The animation is wrong, so either I am exporting the wrong matrices from Blender, or there is something wrong with the OpenGL/GLSL code that renders it.

My Python code to export the matrix is basically this (from what I gathered browsing Stack Exchange, it should be correct):

            for i in list_of_frames:
...
                bpy.context.scene.frame_set(i) #set this as the current frame in blender
                for bone in arm.pose.bones:  #loop through bones
                    mat = bone.matrix_basis
...
                    for m in range(4): #write matrix to file. Note that Blender matrices are row-major and I write it in the form <row1><row2> etc.
                        fw(struct.pack('f',mat[m][0]))
                        fw(struct.pack('f',mat[m][1]))
                        fw(struct.pack('f',mat[m][2]))
                        fw(struct.pack('f',mat[m][3]))

My C++ code to read it goes like this:

				glm::mat4 mat;
				for (int y = 0; y < 4; y++) {
					for (int x = 0; x < 4; x++) {
						dataToFloat(buffer.data() + ptr, &mat[y][x]); //note: glm matrices are column-major, indexed [column][row]
						ptr += 4;
					}
				}
				mat = glm::rotate(mat, -3.14f / 2.f, glm::vec3(1.0f, 0.0f, 0.0f)); //glm::rotate returns a new matrix rather than modifying in place
				mat = glm::scale(mat, glm::vec3(0, -1, 0)); //use OpenGL coordinate system

Now, other than the code that converts from Blender's coordinate system to OpenGL's, I know it copies the data correctly, so either my Python exporter is writing the wrong data or the C++ renderer is using it incorrectly.
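
For clarity, this is the kind of coordinate-system change I am talking about; a minimal sketch with GLM, assuming column-vector convention (M * v) on both sides, and blenderToGL is just an illustrative name:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/constants.hpp>

//illustrative helper: re-express a Blender (Z-up) transform in an OpenGL-style (Y-up) frame
glm::mat4 blenderToGL(const glm::mat4& m) {
	//rotating -90 degrees about X maps Blender +Z to +Y and Blender +Y to -Z
	const glm::mat4 C = glm::rotate(glm::mat4(1.0f), -glm::half_pi<float>(), glm::vec3(1.0f, 0.0f, 0.0f));
	//a full change of basis conjugates the transform; for a single root/world matrix, C * m may be enough
	return C * m * glm::inverse(C);
}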

A ‘keyframe’ is stored as a time stamp and a list of matrices, one per bone:

struct KeyFrame {
	float time;
	std::vector<glm::mat4> mat4s; //stores the matrices for each bone at this keyframe.
};

And a bone is just:

struct Bone {
	std::vector<int> children;
	Bone* parent;
	int id;
	glm::mat4 matrix; //doesn't store data between frames; just used for temporary calculations
};
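
For completeness, an ‘Animation’ (used by the renderer below) is essentially just the current playback time plus the keyframe list; reconstructed from how it is used, it is roughly:

struct Animation {
	float time; //current playback time, advanced by delta each frame
	std::vector<KeyFrame> keyFrames; //sorted by time
};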

Now, this is my renderer; note that an ‘AnimationMesh’ is a collection of vertices, normals, vertex weights, etc. plus a bone hierarchy, and an ‘Animation’ is a list of keyframes:


//used to recursively calculate a bone's final matrix:
glm::mat4 calcMatrix(Bone* bone) {
	if (bone->parent == NULL) {
		return bone->matrix;
	}
	else {
		return calcMatrix(bone->parent)* bone->matrix;
	}
}

void RenderAnimation(AnimationMesh* aM, Animation* anim, float delta, GLuint shader) {
        //make sure current time is in bounds
	if (anim->time >= anim->keyFrames.back().time) {
		anim->time = anim->keyFrames[0].time;
	}	
	if (anim->time < anim->keyFrames[0].time) {
		anim->time = anim->keyFrames[0].time;
	}
        //find the previous and next keyframe for the current time 
	KeyFrame *start = NULL, *stop = NULL;
	for (int i = 0; i < anim->keyFrames.size(); i++) {
		if (anim->time >= anim->keyFrames[i].time) {
			if (anim->time < anim->keyFrames[i + 1].time) {
				start = &anim->keyFrames[i];
				stop = &anim->keyFrames[i + 1];
				break;
			}
		}
	}
        //calculate a value used for interpolating between the previous and next keyframe
	float inter = (anim->time - start->time) / (stop->time - start->time);
	for (int i = 0; i < aM->Hierarchy.size(); i++) {
               //for each bone, interpolate between the matrices of the bone at the previous and next keyframe. Store this to the corresponding bone's matrix.
		aM->Hierarchy[i].matrix = glm::mat4_cast(glm::slerp(glm::quat_cast(start->mat4s[i]), glm::quat_cast(stop->mat4s[i]), inter));
	}
	//calculate each bone's final matrix (that will be sent to shader) by multiplying it by the bone's parent's matrices recursively (see function above).
	std::vector<glm::mat4> mat4out(aM->Hierarchy.size());
	for (int i = 0; i < aM->Hierarchy.size(); i++) {
		mat4out[i] = calcMatrix(&aM->Hierarchy[i]);
	}

	//upload to shader
	glUniformMatrix4fv(glGetUniformLocation(shader, "boneMats"), mat4out.size(), GL_FALSE, glm::value_ptr(mat4out[0]));

       //increment time
	anim->time += delta;

        //bind VBOs and render it
	RenderAnimationMeshStatic(aM, shader);
}
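
For context, I call this once per frame, roughly like so (the names here are just placeholders, and delta must be in the same units as the keyframe time stamps):

	float frameDelta = 1.0f / 60.0f; //or whatever units KeyFrame::time was exported in
	glUseProgram(shader); //the program must be current, since RenderAnimation sets uniforms with glUniformMatrix4fv
	RenderAnimation(&mesh, &walkAnim, frameDelta, shader);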

https://imgur.com/a/bHbFe shows how the animation looks in my program vs. how it looks in Blender.

I have not been able to solve this for a week, so any ideas as to what I am doing wrong would be appreciated :slight_smile:

OK, so I got this mostly working by exporting a vec3 position and a quaternion rotation instead of a matrix. The position gets interpolated linearly and the quaternion with slerp. However, I have Blender's axes mapped to my engine's axes like so: Y->X, Z->Y, X->Z. I tried converting the quaternion to Euler angles, swizzling them, and then converting back to a quaternion, but this seems to fail sometimes. What is the correct way to do this?
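
Roughly, the per-bone blend now looks like this (just a sketch; PosRotKey and blendKey are illustrative names):

#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <glm/gtc/matrix_transform.hpp>

struct PosRotKey {
	glm::vec3 position;
	glm::quat rotation;
};

//blend one bone between the previous and next keyframe
glm::mat4 blendKey(const PosRotKey& a, const PosRotKey& b, float t) {
	glm::vec3 pos = glm::mix(a.position, b.position, t); //linear interpolation
	glm::quat rot = glm::slerp(a.rotation, b.rotation, t); //spherical interpolation
	return glm::translate(glm::mat4(1.0f), pos) * glm::mat4_cast(rot);
}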

This corresponds to a rotation of 120 degrees about the axis [1, 1, 1], which as a quaternion (written here as [x, y, z, w]) is [-0.5, -0.5, -0.5, 0.5]. So multiply the other quaternion by that (and don’t forget to swizzle the translation as well). Alternatively, you might need its inverse (when you say “mapped”, it’s not clear which direction you mean), which is [0.5, 0.5, 0.5, 0.5].
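
If it helps, here is a quick sketch of that in GLM (the names are just for illustration; note that glm::quat's constructor takes (w, x, y, z)):

#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

//the permutation quaternion described above; in [x, y, z, w] form it is [-0.5, -0.5, -0.5, 0.5]
const glm::quat remap(0.5f, -0.5f, -0.5f, -0.5f);

//rotating a vector by it performs the swizzle (x, y, z) -> (y, z, x),
//i.e. Blender's Y lands in X, Z in Y, X in Z
glm::vec3 remapTranslation(const glm::vec3& t) {
	return remap * t; //e.g. (1, 2, 3) -> (2, 3, 1)
}

//for the rotation: conjugating gives a full change of basis; depending on where in
//the pipeline you convert, the plain product remap * q may be what you want instead
glm::quat remapRotation(const glm::quat& q) {
	return remap * q * glm::conjugate(remap);
}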