Music in programs

This is more just a curiosity thing than anything else, but I have noticed that every demo I have ever seen, produced by anybody, has music in it. That isn't what my question is about, though; my question is about the synchronization of the music with the animation. The things happening in the scene are usually in sync with the music in some way. Do the demo makers hardcode their camera movements to be in time with the music, or is there some other, easier way? Just curious!

First, I have not been much involved in the demo scene, but this is what I either know or can come up with after some thinking.

It depends on the type of music source and the type of event to trigger. For streaming sound, like mp3/vorbis, you generally hardcode events: when the music changes, you note the time and hardcode an event in your demo at that time. For rhythm events you can instead analyse the sound in real time, since it's really difficult to hardcode every beat. With modules, like mod/it/s3m, rhythm events are easier. With some formats you can insert markers in the file; put those markers on beats and trigger an event on each marker. You can also use these markers for scene changes.
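To make the hardcoded-events idea concrete, here is a minimal sketch in C++. Everything in it is made up for illustration: getMusicTimeSeconds() stands in for whatever playback position your sound library reports, and the times and callbacks are just examples of what you would note down while listening to the track.

#include <cstdio>

// Stand-in for whatever your sound library reports as the current
// playback position of the streamed music, in seconds (hypothetical).
double getMusicTimeSeconds();

// One hardcoded event: a time noted down while listening to the track,
// plus the thing to do in the demo at that time.
struct SyncEvent {
    double time;        // seconds into the song
    void (*trigger)();
};

void startTunnelScene() { std::printf("tunnel scene starts\n"); }  // placeholder effects
void flashWhite()       { std::printf("white flash\n"); }

SyncEvent events[] = {
    { 12.5, startTunnelScene },   // times are made up for illustration
    { 31.0, flashWhite },
};
const int numEvents = sizeof(events) / sizeof(events[0]);
int nextEvent = 0;

// Call this once per frame: it fires every event whose time has passed.
void updateSync()
{
    double t = getMusicTimeSeconds();
    while (nextEvent < numEvents && events[nextEvent].time <= t)
    {
        events[nextEvent].trigger();
        ++nextEvent;
    }
}

Because the events are sorted by time and fired from a single index, this stays cheap no matter how many events you list, and it still works if a frame takes long enough for several events to pass at once.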

Camera movements are generally hardcoded if you need to fly through a scene in a controlled way (like through a house).
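For what it's worth, a hardcoded camera path usually boils down to a list of keyframes interpolated by the same clock that drives the music. Here is a tiny linear-interpolation sketch; the positions are made up, and real demos typically use splines (e.g. Catmull-Rom) for smoother motion.

struct CamKey { double time; float x, y, z; };   // position only, for brevity

CamKey path[] = {                 // made-up fly-through of a house
    {  0.0, 0.0f, 1.7f, 10.0f },
    {  8.0, 0.0f, 1.7f,  0.0f },
    { 16.0, 4.0f, 1.7f, -3.0f },
};
const int numKeys = sizeof(path) / sizeof(path[0]);

// Linear interpolation between the two keyframes surrounding time t,
// where t is the current music time in seconds.
void getCameraPos(double t, float &x, float &y, float &z)
{
    if (t <= path[0].time)          t = path[0].time;
    if (t >= path[numKeys-1].time)  t = path[numKeys-1].time;
    for (int i = 0; i < numKeys - 1; ++i)
    {
        if (t <= path[i+1].time)
        {
            float f = float((t - path[i].time) / (path[i+1].time - path[i].time));
            x = path[i].x + f * (path[i+1].x - path[i].x);
            y = path[i].y + f * (path[i+1].y - path[i].y);
            z = path[i].z + f * (path[i+1].z - path[i].z);
            return;
        }
    }
}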

I also believe that most music player toolkits out there have support for music event information (e.g. pattern positions etc.), which can be used for synchronization. Check out the documentation for e.g. libmikmod and FMOD. One problem may be if you need really precise sync, since there are sound buffer and OpenGL rendering delays etc.
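As a rough example of the pattern-position idea: with the old FMOD 3.x module API you can poll the current order and row while the song plays and trigger things from that. I'm writing this from memory, so double-check against your FMOD version's headers; the file name and the empty trigger spots are just placeholders. libmikmod exposes similar information, I believe through the sngpos/patpos fields of its MODULE struct.

#include <fmod.h>   // old FMOD 3.x C API

int main()
{
    if (!FSOUND_Init(44100, 32, 0))
        return 1;

    FMUSIC_MODULE *mod = FMUSIC_LoadSong("tune.xm");   // file name made up
    if (!mod) { FSOUND_Close(); return 1; }
    FMUSIC_PlaySong(mod);

    int lastOrder = -1, lastRow = -1;
    while (FMUSIC_IsPlaying(mod))
    {
        int order = FMUSIC_GetOrder(mod);  // position in the pattern order list
        int row   = FMUSIC_GetRow(mod);    // row within the current pattern

        if (order != lastOrder)
        {
            // a new pattern just started: a good spot for a scene change
        }
        if (row != lastRow && row % 4 == 0)
        {
            // typically about one beat at default speed/tempo: flash, pulse, etc.
        }
        lastOrder = order;
        lastRow   = row;

        // ...render one frame of the demo here...
    }

    FMUSIC_FreeSong(mod);
    FSOUND_Close();
    return 0;
}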