[glm] Use quaternions to move the camera direction according to mouse movement

I am trying to use quaternions to move the camera direction vector in the following way.

This code works as expected.

Code :

glm::quat temp1 = glm::normalize( glm::quat((GLfloat)( -Input1.MouseMove.x * mouse_sens * time_step), glm::vec3(0.0, 1.0, 0.0)) );
glm::quat temp2 = glm::normalize( glm::quat((GLfloat)( -Input1.MouseMove.y * mouse_sens * time_step), dir_norm) );
Camera1.SetCameraDirection(temp2 * (temp1 * Camera1.GetCameraDirection() * glm::inverse(temp1)) * glm::inverse(temp2));

This code does not.

Code :

glm::quat temp1 = glm::normalize( glm::quat((GLfloat)( -Input1.MouseMove.x * mouse_sens * time_step), glm::vec3(0.0, 1.0, 0.0)) );
glm::quat temp2 = glm::normalize( glm::quat((GLfloat)( -Input1.MouseMove.y * mouse_sens * time_step), dir_norm) );
glm::quat temp3 = temp2 * temp1;
Camera1.SetCameraDirection(temp3 * Camera1.GetCameraDirection() * glm::inverse(temp3));

From my understanding of GLM, the two pieces of code should produce the same result. However, they do not. The first piece of code produces the expected result. With the second, when I move the mouse I get extremely small movements in an apparently random direction.

Why can I not multiply the quaternions together first? Am I using GLM the wrong way?