How We Sync Electron and DAW






Electron and DAW can be synced in a number of ways, but the best way to sync is via MIDI clock.


Syncing multiple devices together can be a real pain, especially if you’re using gear from different manufacturers. We’re lucky that Electron can speak MIDI clock, because it makes syncing our devices together far more accurate and reliable. MIDI clock was developed for exactly this purpose, so why not use it?

What Is MIDI Clock?

MIDI clock is a stream of messages that tells every connected device what tempo the sender is playing at. This lets us easily sync up with our DAW or any other device that supports the feature. The best part about MIDI clock is that it’s actually quite simple to implement.

It takes only a handful of message types to get the idea across. A “start” message (0xFA) tells all connected devices to begin playing from the top. Then a “clock” pulse (0xF8) is sent 24 times per quarter note, and receivers derive the tempo from the spacing between pulses. Finally, a “stop” message (0xFC) halts playback, and “continue” (0xFB) resumes it from the current position.
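To make that message flow concrete, here is a minimal sketch in JavaScript. The `send` callback is a hypothetical stand-in for a real MIDI output port (for example, one obtained via the Web MIDI API):

```javascript
// MIDI real-time status bytes, as defined by the MIDI 1.0 spec.
const MIDI_START = 0xfa; // start playing from the top
const MIDI_CLOCK = 0xf8; // timing pulse, 24 per quarter note
const MIDI_STOP  = 0xfc; // stop playing

// Milliseconds between clock pulses at a given tempo:
// one quarter note lasts 60000 / bpm ms and must contain 24 pulses.
function clockIntervalMs(bpm) {
  return 60000 / (bpm * 24);
}

// Send Start, then stream clock pulses at the right rate.
// Returns the interval handle so the caller can stop the clock later.
function startClock(send, bpm) {
  send(MIDI_START);
  return setInterval(() => send(MIDI_CLOCK), clockIntervalMs(bpm));
}
```

At 120 BPM this works out to one pulse roughly every 20.8 ms, which is exactly why receivers can recover the tempo just by timing the gaps.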

Electron is a framework for creating native applications with web technologies like JavaScript, HTML, and CSS. It takes care of the hard parts so you can focus on the core of your application.

This blog post is about how we sync the audio clock between an Electron UI and a DAW (Digital Audio Workstation) via MIDI. For those who are unfamiliar with MIDI, it is a communication protocol that allows electronic musical instruments to communicate with each other. As it turns out, MIDI is also great for syncing audio between two different software applications on a computer.

I was sitting at my studio when a strange question hit me in the head: Is it possible to sync Electron with a Digital Audio Workstation (DAW)?

I’ve been able to sync Electron with other hardware and software but never with a DAW. I’ve read a lot about it and it seems to be a very complicated process due to the nature of DAWs. In fact, some people use DAWs as VST instruments inside their DAWs. That’s so meta!

To answer the question, I started researching how people attempt to solve this problem. The first thing I found was this video of Ableton Live synced with Elektron Analog Rytm:

Let’s see how they did it.

First, they are using a master clock in Live. This is important because Live doesn’t send MIDI clock messages by default; sync has to be switched on for the output port in Live’s MIDI preferences.

Then, they are using the CV Gate Out plugin in Live to send pitch CV messages from Live to Rytm.

Finally, they have set up Rytm as an external instrument in Live and can play it from Live just like any other instrument.

The setup in the video works really well and is worth watching if you want to see the sync in action.

Audio is a time-sensitive and time-critical task. If a sound arrives too early or too late, it will be noticed. If the sound is distorted in any way, it will be noticed. To make things worse, if audio processing is slow, not only will the sounds arrive late but also some other important timing information may be missing.

In order to run an audio application on a computer (and especially in the browser), we need to synchronize two distinct clocks: the DAW’s clock, which follows a variable musical tempo, and Electron’s main-loop clock, which runs at a fixed frame rate. We want to update the music engine as often as possible, but we also need the timing to stay precise.
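One bridging strategy (a sketch, not necessarily what any particular engine ships) is to estimate the DAW’s tempo on the Electron side from the timestamps of incoming clock pulses. Since MIDI clock runs at 24 pulses per quarter note, the average gap between pulses gives the BPM directly:

```javascript
// Estimate tempo (BPM) from timestamps (in ms) of incoming MIDI
// clock pulses. MIDI clock sends 24 pulses per quarter note, so
// bpm = 60000 / (averageGapMs * 24).
function estimateBpm(timestampsMs) {
  if (timestampsMs.length < 2) return null; // need at least one gap
  let total = 0;
  for (let i = 1; i < timestampsMs.length; i++) {
    total += timestampsMs[i] - timestampsMs[i - 1];
  }
  const avgGapMs = total / (timestampsMs.length - 1);
  return 60000 / (avgGapMs * 24);
}
```

Averaging over a window of recent pulses, rather than using only the last gap, also smooths out the jitter that USB MIDI inevitably introduces.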

The most common way of doing this is by using requestAnimationFrame (rAF) in the renderer process and setTimeout in the main process. The renderer process uses rAF to get notified of screen refreshes, whereas the main process uses setTimeout and throttles audio updates accordingly. However, there are many problems with this approach:

– Both browsers and Electron put arbitrary limits on how often callbacks can fire: rAF is capped at the display refresh rate (typically 60 fps), and setTimeout has a minimum delay of about 1 ms (clamped to 4 ms for nested timeouts in browsers). This leads to situations where audio updates cannot happen as often, or as precisely, as they should.
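A common workaround, sketched below under the assumption of a steady tick interval, is a lookahead scheduler: wake up on a coarse, jittery timer, but compute exact timestamps for every event that falls inside the next lookahead window, so precision comes from arithmetic rather than from timer resolution:

```javascript
// Lookahead scheduler sketch: a coarse timer (e.g. every 25 ms)
// calls pump(), which enqueues every tick whose ideal time falls
// within the next `lookaheadMs` window. Timing precision comes from
// the computed timestamps, not from when the timer actually fires.
function makeScheduler(intervalMs, lookaheadMs, onTick, now = () => Date.now()) {
  let nextTickAt = now();
  function pump() {
    const horizon = now() + lookaheadMs;
    while (nextTickAt < horizon) {
      onTick(nextTickAt);       // hand the event its exact timestamp
      nextTickAt += intervalMs; // advance by the musical interval
    }
  }
  return pump;
}
```

Calling `pump()` from a 25 ms `setInterval` is then accurate enough in practice, because each tick carries its own computed timestamp regardless of when the timer callback actually ran.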

If you’re a musician that makes music in the digital realm, you strive for a seamless workflow when it comes to production and performance. That’s why you choose Ableton Live or any other Digital Audio Workstation (DAW) as your music-making software.

Nowadays, there are countless software options for creating music on your computer. Every DAW has its pros and cons. But if you want to step up your game and enhance your live performance as a musician, Ableton Live is the best choice you can make. This software gives you full control of when and how you want to play your tracks, samples, effects, and synths during a live performance. It is really deep, but that also means it’s not always easy to get started with it.

In this article I will walk you through my workflow for making electronica with Ableton Live and my favorite open source tool, Electron. Electron is a framework that lets you build native applications with web technologies like JavaScript, HTML, and CSS. These apps run on Windows, macOS, and Linux, which is great because I develop on macOS and test on Windows and Linux before deployment.

I will take you through the steps required to get Electron and your DAW talking to each other.

“If the cars are slow and the traffic is heavy, it’s better to have more lanes.”

“Right, but in a traffic jam, you can only go as fast as the car in front of you.”

“That’s why it’s best to have one lane where everyone goes really fast.”

“Sure, but what if there are more cars than can fit in that one lane?”

“Well, then you’ll have to get rid of some of them.”
