View Full Version : MIDI sequencing and timing offsets

July 11th, 2013, 04:07 PM
I'm trying to make a tool that converts old music-box tunes to MIDI using techniques like image processing. I have most of the basics done, among them an algorithm that produces a list of pixel-offset/tone-name data pairs.

The idea is to use the data pairs to produce a playable MIDI file. In theory this shouldn't be hard, but the .mid sequencing format doesn't seem designed for it. The problem is the timebase, which appears to be centered on quarter notes combined with a global beat. My data, however, is based directly on microsecond time offsets. While it's certainly possible to translate raw time data into the MIDI timebase, I don't have nearly enough information about the .mid format to do so. I've searched the internet, but everyone who tries to explain the .mid format and its timebase fails to give a clear overview.

I'm not really asking for a description of the .mid format here. I assume programs that translate raw time/tone data into MIDI already exist, and at the moment I'm mainly interested in testing how the output of the algorithm sounds. I'd appreciate any suggestions.
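(For reference, the conversion the standard metric timebase requires is actually small. The header's division word gives ticks per quarter note (PPQ), and a Set Tempo meta event gives microseconds per quarter note; a sketch of the conversion, with a function name of my own choosing, would be:)

```python
def us_to_ticks(offset_us, ppq=480, tempo_us_per_qn=500000):
    """Convert a microsecond offset to MIDI ticks under a metric timebase.

    ppq: ticks per quarter note (the SMF header's division word).
    tempo_us_per_qn: value of the Set Tempo meta event
                     (500000 us/quarter = 120 BPM, the MIDI default).
    """
    return round(offset_us * ppq / tempo_us_per_qn)

# At 120 BPM and 480 PPQ, half a second is one quarter note = 480 ticks:
print(us_to_ticks(500000))   # 480
```

Delta-times in the track are then just the differences between successive tick values.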

July 11th, 2013, 05:19 PM
Have you looked at The MIDI Specification (http://www.oktopus.hu/imgs/MANAGED/Hangtechnikai_tudastar/The_MIDI_Specification.pdf)? Starting on page 183, it explains why the oddball quarter-note time scheme is used.

July 16th, 2013, 08:16 AM

I figured out that it was possible to just set the number of ticks per second, and used that approach instead. Now I'm finally able to hear how these melodies are actually supposed to sound; none of them plays correctly on real hardware due to quite heavy wear.
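(For anyone finding this thread later: "ticks per second" is done through the SMPTE form of the header's division word, where the high byte holds a negative frames-per-second value in two's complement and the low byte holds ticks per frame. For example, 25 fps x 40 ticks/frame = 1000 ticks/second, so one tick is exactly one millisecond. A minimal sketch of a format-0 file built that way; the helper names are mine, and each note is arbitrarily held for 200 ms:)

```python
import struct

def vlq(n):
    """Encode a non-negative integer as a MIDI variable-length quantity."""
    out = bytearray([n & 0x7F])
    n >>= 7
    while n:
        out.insert(0, 0x80 | (n & 0x7F))
        n >>= 7
    return bytes(out)

def build_smf(events_ms, fps=25, ticks_per_frame=40):
    """Build a format-0 SMF with an SMPTE timebase from (offset_ms, note) pairs.

    With 25 fps and 40 ticks/frame, one tick = 1 ms, so millisecond
    offsets can be used as tick values directly.
    """
    # Expand each note into a note-on and a note-off 200 ms later.
    raw = []
    for t, note in events_ms:
        raw.append((t, 0x90, note))        # note on, channel 0, velocity 64
        raw.append((t + 200, 0x80, note))  # note off
    raw.sort()

    track = bytearray()
    prev = 0
    for t, status, note in raw:
        track += vlq(t - prev) + bytes([status, note, 64])
        prev = t
    track += b'\x00\xff\x2f\x00'           # End of Track meta event

    # SMPTE division: high byte = -fps (two's complement), low byte = ticks/frame.
    division = ((256 - fps) << 8) | ticks_per_frame
    header = b'MThd' + struct.pack('>IHHH', 6, 0, 1, division)
    return header + b'MTrk' + struct.pack('>I', len(track)) + bytes(track)

# Three notes at 0 ms, 500 ms, 1000 ms:
data = build_smf([(0, 60), (500, 64), (1000, 67)])
```

Writing `data` to a .mid file gives something any sequencer should play back with the original microsecond-derived timing intact.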