SweetBeats: audio case study

3 min read · Aug 17, 2021


How do you create a music app that always sounds good, no matter what the user inputs? One that is also fun for an advanced user? And that results in a track that feels unique?

Those were some of the challenges we faced taking on this project together with Wildlife and Duncan Channon.

SweetBEATS is an online beat-making campaign to promote SweeTARTS candy. Plan8 had the chance to be a part of it by creating the music and the step sequencer functionality.

Creating your beat

You start out by selecting two styles: one for melody and one for rhythm. Each style represents a group of instruments, and in one tap you’ve made a beat based on your combination of styles.

A classic step sequencer has a toggle button for each step, and a common step resolution is sixteenth notes. To allow a chord progression without it getting too repetitive, we decided that the minimum track length should be eight bars. That makes 128 sixteenth-note steps, a bit too much to fit in a simple mobile user interface.
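The step count follows directly from the arithmetic (assuming 4/4 time, so sixteen sixteenth-note steps per bar):

```javascript
// Step count for the minimum track length described above,
// assuming 4/4 time (16 sixteenth-note steps per bar).
const bars = 8;
const stepsPerBar = 16;
const totalSteps = bars * stepsPerBar;
console.log(totalSteps); // 128 — too many toggle buttons for a phone screen
```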

The interface

We considered a coarser step resolution, such as a beat or a bar, but discovered that it wouldn’t give us enough variation between tracks.

To make the experience simple and fun, the designers wanted each instrument to be controlled with a candy slider. That solved the issue of too many toggle buttons: instead, we could define multiple patterns of different intensity and populate the sequencer grid as you drag the sliders. (A pattern is like a MIDI track containing information about the pitch, start time, and length of the notes.)
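The slider-to-pattern idea can be sketched roughly like this (the names, data shape, and thresholds are our own illustration, not the production code): each instrument has several pre-written patterns ordered by intensity, and the slider position picks which one fills the grid.

```javascript
// Hypothetical sketch: map a candy-slider position (0..1) to one of
// several pre-written patterns of increasing intensity.
// A pattern holds note events: step index, pitch, and length in steps.
const kickPatterns = [
  [],                                            // slider at 0: silent
  [{ step: 0, pitch: "C1", lengthSteps: 1 }],    // sparse
  [{ step: 0, pitch: "C1", lengthSteps: 1 },
   { step: 8, pitch: "C1", lengthSteps: 1 }],    // denser
];

function patternForSlider(patterns, sliderValue) {
  // Clamp the slider to [0, 1] and pick an index proportional to it.
  const v = Math.min(1, Math.max(0, sliderValue));
  const index = Math.min(patterns.length - 1, Math.floor(v * patterns.length));
  return patterns[index];
}
```

Dragging the slider from 0 to 1 then steps through silence, a sparse pattern, and a denser one, without the user ever editing individual notes.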

Making sure it always sounds good

To make it impossible to mess things up and sound bad, we had to take some precautions.

Each instrument is like a ‘single-lane’ drum machine where each sixteenth step can be toggled on or off. Behind the scenes, a step can consist of multiple notes and samples, for example a kick and a snare, or a three-note chord. So the user gets to choose whether a step should play, but not what should be played. By constructing patterns that sound good when all steps are checked, we could make sure the user couldn’t make it sound too weird or off.
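A minimal sketch of that idea (again our own illustration, under assumed names): the user toggles a single lane of booleans, and each active step expands behind the scenes into whatever notes the pattern defines for it.

```javascript
// Hypothetical sketch: one toggle lane per instrument in the UI, but a
// toggled step can expand into several notes (kick + snare, or a chord).
const stepContents = {
  0: ["C1", "D2"],        // kick and snare together
  4: ["C3", "E3", "G3"],  // a three-note chord
  8: ["C1"],
};

function notesToPlay(toggles, stepContents) {
  // toggles: array of 16 booleans coming from the UI lane.
  const events = [];
  toggles.forEach((on, step) => {
    if (on && stepContents[step]) {
      events.push({ step, notes: stepContents[step] });
    }
  });
  return events;
}
```

Because every step’s contents are pre-composed, any combination of toggles stays musical.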

Getting Reaper to export a song in JSON format

Altogether we created 120 patterns. This was done by our composers in our go-to DAW, Reaper. Doing this in a MIDI editor was crucial, as we had to listen to the patterns in different combinations and tweak them to make sure everything sounded good together. However, to set everything up to play in a web sequencer, we’d much rather have this information in JSON format. By writing a custom Reaper action using the ReaScript Python API, we could easily achieve this and use Reaper as an editor for our patterns.

JSON data output from Reaper
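The exported structure might look something like this (an illustrative sketch of the shape based on the note data described above, not the actual file):

```json
{
  "bpm": 120,
  "patterns": [
    {
      "instrument": "kick",
      "notes": [
        { "step": 0, "pitch": "C1", "lengthSteps": 1 },
        { "step": 8, "pitch": "C1", "lengthSteps": 1 }
      ]
    }
  ]
}
```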

The JSON file was then used to set up the Web Audio API framework Tone.js. For each instrument we created a Tone.js sampler, and Tone.js parts were used to play back the patterns.
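Roughly, the playback setup can be sketched like this (an illustration under our assumptions about the data shape, not the production code). Tone.Part takes a list of [time, value] events, with times expressed in Tone’s "bars:quarters:sixteenths" notation:

```javascript
// Hypothetical sketch: convert a pattern's note list into the
// [time, value] events Tone.Part expects, using Tone's
// "bars:quarters:sixteenths" time notation.
function toPartEvents(notes) {
  return notes.map(({ step, pitch, lengthSteps }) => {
    const bar = Math.floor(step / 16);
    const quarter = Math.floor((step % 16) / 4);
    const sixteenth = step % 4;
    return [`${bar}:${quarter}:${sixteenth}`, { pitch, lengthSteps }];
  });
}

// With Tone.js loaded, this data could drive a sampler roughly like:
//   const sampler = new Tone.Sampler({ urls: { C1: "kick.wav" } }).toDestination();
//   const part = new Tone.Part((time, value) => {
//     sampler.triggerAttackRelease(value.pitch, "16n", time);
//   }, toPartEvents(pattern.notes)).start(0);
```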


To create a music app that will always sound good, we can’t give the user too much freedom. We used predefined progressions and melodies, and only let the user select instruments and control whether a step should play. Having all drums on one lane also gave us more control over the output. For a playful app like this, we found it’s more gratifying for the user to play around with real-time effects than with the actual editing of the patterns. To ensure track variation we used sixteenth-note step resolution and made sure all layers could be combined together.