Let's Make Robots! | RobotShop

Making sticks sync for a drum-machine

To The Cow God (and anyone else interested)

Making a microcontroller do beats is an awful thing.

The code needs to be in 2 parts:

  1. The "Hit"
  2. The "Sequence / note-part"

In 2) there is all the logic that makes up a song, consisting of lines of code divided into segments of 16.

In 2) bits are set, such as BD = 1 (play bass drum), and also head, speaker-click etc., once for every 16th note.

Every other "line" in 2) should also have an extra pause; how long it is makes the beat more or less shuffle.
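The 16-step sequence idea above could be sketched like this in C. The BD/SN/HH bit masks, the example pattern, and the step_ms shuffle helper are all my own illustration, not the original code:

```c
#define BD 0x01  /* bass drum              */
#define SN 0x02  /* snare                  */
#define HH 0x04  /* speaker-click / hi-hat */

/* one bar = 16 sixteenth-notes; each byte holds the bit-set for that step */
unsigned char bar[16] = {
    BD|HH, 0, HH, 0,  SN|HH, 0, HH, 0,
    BD|HH, 0, BD|HH, 0,  SN|HH, 0, HH, HH
};

/* every other step gets an extra pause; a bigger extra = more shuffle */
unsigned step_ms(int step, unsigned base_ms, unsigned shuffle_ms)
{
    return base_ms + ((step % 2) ? shuffle_ms : 0);
}
```

The player loop would walk `bar` step by step, test each bit (`bar[step] & BD` etc.), run the "Hit" part 1) for that step, then wait `step_ms(step, ...)` before moving on.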


It is awful because:

If you make a sound with the speaker, everything halts, and the music gets non-rhythmic / out of beat.

If you (in 1) have a lot of instruments hitting at the same time, typical coding will make this beat slow, and the music gets non-rhythmic / out of beat.

And... not all instruments have the same speed.


Ways to work around this:

"Nodes"-segments in 2) should always run 1)

1) is built up this way:

A) All routines for every instrument should always be run.

Say "no bass drum is to be played?" Well, run the routine anyway, but just set the pin to 0. "No speaker-click?" Run the routine anyway, just run the "no sound" code.

B) Make a mark in the code, called "BANG". At this part, ALL instruments should hit!

Have a delay hard-coded (or measured, as we talked about) for each instrument.

When starting 1), start a counter that has micro-pauses.

  • When the slowest instrument (long stick)'s delay-factor is up, fire that stick
  • When the middle instrument (shorter stick)'s delay-factor is up, fire that stick / sampler / whatever
  • At "bang", also fire the instant speaker-click
  • BANG
  • Hold a short break to ensure all sticks are really out
  • Release all instruments / pull back the sticks
  • Return to the next "note" in 2)
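The countdown above can be sketched as a tiny scheduler: each instrument stores its travel delay, and a loop of micro-pauses fires each one at bang_time minus its delay, so every hit lands on the BANG together. The instrument names and times here are made up for illustration:

```c
typedef struct {
    const char *name;
    unsigned travel_ms;   /* how long this stick takes to reach the drum */
    unsigned fired_at;    /* when we actually fired it (for checking)    */
} Instrument;

Instrument kit[3] = {
    { "long stick (bass)",   40, 0 },
    { "short stick (snare)", 25, 0 },
    { "speaker click",        0, 0 },
};

/* walk time forward in 1 ms micro-pauses up to bang_ms; fire each
   instrument when bang_ms - travel_ms is reached, so all land at bang */
void run_bang(Instrument *inst, int n, unsigned bang_ms)
{
    for (unsigned t = 0; t <= bang_ms; t++) {
        for (int i = 0; i < n; i++) {
            if (t == bang_ms - inst[i].travel_ms)
                inst[i].fired_at = t;   /* fire this instrument now */
        }
        /* a real sketch would delay() / delayMicroseconds() here */
    }
}
```

With a 40 ms bang, the long stick fires immediately, the short stick 15 ms later, and the speaker-click exactly on the BANG; the short hold-and-release then follows before returning to the next note in 2).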



I hope this helps in a good way; it was actually quite fun and easy to code.



Ah, that makes sense. That's a little different than the approach I was taking -- since I'm a musician, I approached it from the point of view of actually defining how long a quarter note, eighth note, etc. are in ms and using that. But it starts to get complicated with more complex beats, plus like you said, it takes different amounts of time with different numbers of "instruments". My drum routine looks like this right now:


void drum(byte drum1, byte drum2, byte drum3, unsigned int duration)
{
  digitalWrite(drum1, HIGH);
  digitalWrite(drum2, HIGH);
  digitalWrite(drum3, HIGH);

  delay(50); // hold the pins high long enough for the sticks to strike

  digitalWrite(drum1, LOW);
  digitalWrite(drum2, LOW);
  digitalWrite(drum3, LOW);

  if (duration > 50)
  { delay(duration - 50); } // the 50 ms strike time counts toward the note
}


And an example 'beat' is something like:


void play_beat1(int are_we_sampling)
{
  // unused drum slots passed as 0 here for illustration; the real code
  // presumably has a constant or helper for hitting fewer drums
  drum(SNARE, 0, 0, (are_we_sampling ? QUARTER_NOTE - 30 : QUARTER_NOTE));
  // if we're sampling, we cut this last beat short by 30 ms
  // so the sample ends a little early, giving the board time
  // to reset so we can play it in a loop in time
}


(There are some other helper functions and constants that I didn't include.) But I think I'll shift it more towards your type of approach, where you just set bits for each of 16 beats and have a routine to play them. That sounds a lot more orderly and logical.