Dear carsten neumann, thank you for your contribution. After making the following change,

FragColor = vec4(FaceID.x/255.0f, FaceID.y/255.0f, FaceID.z/255.0f, 1.0);

The default framebuffer shows me the correct result, proving that the unsigned bytes are successfully passed from the host program to the vertex shader, and then from the vertex shader to the fragment shader.

However, further testing suggests that uvec3 cannot be used as the output type when rendering to a custom framebuffer.

Here are the details of my testing process:
1). Set GL_RGB8UI as the internal format of the custom renderbuffer storage:

glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB8UI, width, height);
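For context, the surrounding setup that step 1 assumes might look like the sketch below (hedged: fbo_ and rbo_ are hypothetical names, and this is not verified against the actual code). One detail that may matter: as far as I know, three-component integer formats such as GL_RGB8UI are not on the spec's list of required color-renderable formats (the RGBA integer formats are), so the completeness check is worth watching here.

```c
/* Sketch only: allocate integer renderbuffer storage, attach it to an
 * FBO, and verify completeness. fbo_/rbo_ are assumed names. */
glBindRenderbuffer(GL_RENDERBUFFER, rbo_);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB8UI, width, height);

glBindFramebuffer(GL_FRAMEBUFFER, fbo_);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, rbo_);

/* GL_RGB8UI may not be color-renderable on every implementation,
 * so check before rendering: */
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    ; /* handle incomplete framebuffer, e.g. fall back to GL_RGBA8UI */
```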

2). Set uvec3 as the fragment output type in the fragment shader:
#version 400

flat in uvec3 FaceID;
layout( location = 0 ) out uvec3 FragFaceID;
void main() {
    FragFaceID = FaceID;
}

3). Read the pixels in the custom render buffer:

glReadPixels(0, 0, glViewportWidth_, glViewportHeight_, GL_RGB, GL_UNSIGNED_BYTE, src_);

Debugging the code, I found that the data I read back are all zeros.
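One thing that may be worth ruling out here (hedged, since I have not run this against the code in question): for integer internal formats such as GL_RGB8UI, glReadPixels is paired with the *_INTEGER pixel formats; reading an integer buffer with plain GL_RGB raises GL_INVALID_OPERATION and leaves the destination buffer untouched, which would also produce all zeros. The variant readback would be:

```c
/* Sketch, not verified here: integer color buffers are read back with
 * the *_INTEGER pixel formats, so for GL_RGB8UI the call would be
 * (glViewportWidth_, glViewportHeight_, src_ taken from the post): */
glReadPixels(0, 0, glViewportWidth_, glViewportHeight_,
             GL_RGB_INTEGER, GL_UNSIGNED_BYTE, src_);
```

Checking glGetError() after the original glReadPixels call would confirm or rule this out.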

By contrast, using GL_RGB8 in step 1, and normalizing the data with a vec3 output in step 2,

#version 400

flat in uvec3 FaceID;
layout( location = 0 ) out vec3 FragFaceID;

void main() {
    FragFaceID = vec3(FaceID.x/255.0f, FaceID.y/255.0f, FaceID.z/255.0f);
}

we get the correct result in step 3.

It seems that my assumption that "fragment shaders can only output normalized values" turns out to be right. But I really hope somebody can tell me this is not the case, and point out where my assumption goes wrong.