Making Music With Your Web Browser
March 14th, 2017 | By Niels Klom | 7 min read
Notice: While editing this article, we discovered a bug in Firefox 51 that caused the sounds to play only once. Firefox has since released version 52, which no longer has this bug. If you are experiencing this issue, we suggest updating Firefox.
Recently, Google announced that it would be discontinuing support for Chrome Apps(1), since similar capabilities are increasingly available in the browser itself. So we thought: why not try to build something truly desktop-grade with web technology? In this tutorial, we will show you how to make a very simple Digital Audio Workstation (DAW) using HTML5. Ironically, as of writing, it only works in Firefox.
For tutorial purposes, we've already created a UI for you to use, along with the audio samples we'll use in the DAW. You can find it on GitHub, or clone it from the terminal:
git clone https://github.com/JscramblerBlog/simple-daw-template.git && cd simple-daw-template
You will find two empty JS files, drumpad.js and recorder.js; we'll code the drumpad first. You should also spin up a simple local server to serve the files, as not all of the features we use work over the file:// protocol.
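Any static file server will do. If you have Node installed, a throwaway script along these lines works (the file name serve.js, the port 8000 and the index.html fallback are just assumptions for this sketch):
// serve.js: a minimal static file server, assuming Node.js is installed.
// (Any static server works; npx http-server is an equally good option.)
var http = require('http');
var fs = require('fs');
var path = require('path');

var mimeTypes = {
  '.html': 'text/html',
  '.js': 'application/javascript',
  '.css': 'text/css',
  '.wav': 'audio/wav',
  '.ogg': 'audio/ogg',
  '.mp3': 'audio/mpeg'
};

http.createServer(function(req, res) {
  // Map "/" to index.html, everything else to the matching file on disk.
  var file = req.url === '/' ? '/index.html' : req.url;
  fs.readFile(path.join(__dirname, file), function(err, data) {
    if (err) {
      res.writeHead(404);
      res.end('Not found');
      return;
    }
    var type = mimeTypes[path.extname(file)] || 'application/octet-stream';
    res.writeHead(200, { 'Content-Type': type });
    res.end(data);
  });
}).listen(8000, function() {
  console.log('Serving on http://localhost:8000');
});
Run it with node serve.js from the template folder and open http://localhost:8000 in Firefox.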
A DAW can have any number of features to help you make music digitally. Ours will let you play four drum samples and record different tracks using them. Each drum sample is nothing more than an audio element. To make the samples easier to play, we'll make them playable from the keyboard, and the user will also be able to change their volume. Let's declare some variables first; remember, this goes in the drumpad.js file.
var kick = document.getElementById('kick'),
hihat = document.getElementById('hihat'),
snare = document.getElementById('snare'),
clap = document.getElementById('clap'),
controls = document.querySelectorAll('.inst-controls');
These are references to each audio element containing a sample, and to all four of the volume bars, which are just input[type="range"] elements.
Side note: should you add your own samples instead, make sure they're in a format Firefox supports and aren't loaded from another origin, as that will cause issues later on. (Audio loaded from another origin is muted in Firefox when captured as a MediaStream.)
Next, we're going to create the function that handles keydown events and plays the correct sample. I've chosen the keys based on the first letter of each sample, but if you want to change them you can easily look up keycodes here.
We're not doing anything too complicated. If the keycode from the keydown event matches one we want, we play the corresponding sample. Also, notice how we set the sample back to the start every time; this matters because otherwise pressing a key again while the sample is still playing would do nothing.
function playInst (event) {
var code = event.keyCode;
if (code === 83) { // 83 = s
snare.currentTime = 0;
snare.play();
return;
} else if (code === 67) { // 67 = c
clap.currentTime = 0;
clap.play();
return;
} else if (code === 72) { // 72 = h
hihat.currentTime = 0;
hihat.play();
return;
} else if (code === 75) { // 75 = k
kick.currentTime = 0;
kick.play();
return;
} else {
return;
}
}
Now that we've created this function we're going to add an event listener to the window to listen for keydown events.
window.addEventListener('keydown', playInst);
The samples will play now, but we still can't change the volume through the interface. To do this, we'll create a function that handles the change event on a range input and sets the sample's volume accordingly, mapping the range's 1-to-10 value onto the element's 0-to-1 volume.
The function below gets the value from the range, which will be between 1 and 10, and sets the corresponding sample's volume. To do this, the value is turned into a decimal fraction by prefixing it with '0.' and parsing it as a float, so 7 becomes 0.7; the only exception is 10 (two characters long), in which case the volume is simply set to 1.
function changeVolume(event) {
var val = this.value;
var valString = '0.' + val;
var valFloat = parseFloat(valString);
if (val.length === 1) {
this.parentElement.nextElementSibling.volume = valFloat;
} else if (val.length === 2) {
this.parentElement.nextElementSibling.volume = 1;
}
}
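As a side note, since the range runs from 1 to 10, the same mapping can be written a bit more directly by dividing the value by ten. A minimal equivalent (we'll stick with the version above for the rest of the tutorial) would be:
function changeVolume() {
  // "10" becomes 1, "7" becomes 0.7, and so on.
  this.parentElement.nextElementSibling.volume = Number(this.value) / 10;
}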
We've made the function, but we can't simply attach it to the controls variable, because it contains more than one element. We could have used a for loop, but we decided to use forEach instead. We can't call forEach directly on controls because document.querySelectorAll does not return an Array but a NodeList, so we borrow Array.prototype.forEach.
Array.prototype.forEach.call(controls, function(control) {
var volumebar = control.children[1];
volumebar.value = 10;
control.nextElementSibling.volume = 1;
volumebar.addEventListener('change', changeVolume);
});
Each input gets an event listener. The volume of each sample is set to 1 and the value of its range to 10, so that they correspond from the start. That was our drumpad. Right now we can make simple beats using the keyboard, but we can't actually record anything yet; we'll do that next.
To keep the code separated, switch to the recorder.js file. We'll be using MediaStreams to record our samples and put them together. Once again, we'll declare a bunch of variables first.
var trackContainer = document.getElementById('tracks'),
trackTemplate = document.getElementById('track-template'),
track = [],
feedbackElement = document.getElementById('feedback'),
recordButton = document.getElementById('record'),
stopButton = document.getElementById('stop'),
audioContext = new AudioContext(),
audioContextStreamDest = audioContext.createMediaStreamDestination(),
kickStream = kick.mozCaptureStream(),
snareStream = snare.mozCaptureStream(),
hihatStream = hihat.mozCaptureStream(),
clapStream = clap.mozCaptureStream(),
kickSource = audioContext.createMediaStreamSource(kickStream),
snareSource = audioContext.createMediaStreamSource(snareStream),
hihatSource = audioContext.createMediaStreamSource(hihatStream),
clapSource = audioContext.createMediaStreamSource(clapStream);
A few of these are just DOM references. The track variable is an empty Array in which the chunks of our recording will later be saved. We also create a new AudioContext.
An AudioContext is basically a digital mixer to which we can hook up different inputs, outputs and effects. Immediately after creating the AudioContext, we add a MediaStream destination node to it. This node exposes whatever audio is routed into it as a MediaStream, which is exactly what we will record later.
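To make the mixer analogy concrete, here is a tiny standalone example, not part of the DAW code, that wires an input through an effect node to the speakers:
// Illustration only: play a quiet 440 Hz beep for half a second.
var ctx = new AudioContext();
var osc = ctx.createOscillator(); // an input: a 440 Hz sine wave by default
var gain = ctx.createGain();      // an "effect": a simple volume control
gain.gain.value = 0.2;
osc.connect(gain);                // input -> effect
gain.connect(ctx.destination);    // effect -> output (your speakers)
osc.start();
osc.stop(ctx.currentTime + 0.5);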
Back in recorder.js, each sample (they were originally defined in drumpad.js) is captured as a MediaStream using the prefixed mozCaptureStream function, which returns a MediaStream. Each stream is then made AudioContext-friendly using the createMediaStreamSource function. Even after all of this, the code doesn't do anything audible yet, and you might have noticed that nothing above mentions recording. We'll get to that now.
var Recorder = new MediaRecorder(audioContextStreamDest.stream);
Recorder.ondataavailable = function(event) {
track.push(event.data); // save each chunk of recorded audio
}
Recorder.onstop = function() {
var trackBlob = new Blob(track, {
'type': 'audio/ogg; codecs=opus'
});
var trackURL = URL.createObjectURL(trackBlob);
addTrack(trackURL);
track = [];
}
Here we create a MediaRecorder. A MediaRecorder can simply record MediaStreams; these can come from audio or video elements, but also from webcams and microphones, so, yes, technically you could also record your voice!
We've set the MediaRecorder up to collect data chunks and to handle them when it's done, but it's not actually recording anything yet. Once recording stops, we create a new Blob from the chunks and pass its object URL along to a function that updates the UI.
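Since a MediaRecorder accepts any MediaStream, recording the microphone mentioned above looks almost identical. This sketch is not part of our DAW, and getUserMedia will ask for microphone permission:
// Standalone sketch: record the microphone for three seconds.
navigator.mediaDevices.getUserMedia({ audio: true }).then(function(micStream) {
  var micRecorder = new MediaRecorder(micStream);
  var chunks = [];
  micRecorder.ondataavailable = function(event) {
    chunks.push(event.data);
  };
  micRecorder.onstop = function() {
    var blob = new Blob(chunks, { 'type': 'audio/ogg; codecs=opus' });
    console.log('Recorded voice clip:', URL.createObjectURL(blob));
  };
  micRecorder.start();
  setTimeout(function() { micRecorder.stop(); }, 3000);
});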
Back to our DAW: now we need to connect the samples both to the stream destination and to the system's main audio output.
kickSource.connect(audioContextStreamDest);
kickSource.connect(audioContext.destination);
snareSource.connect(audioContextStreamDest);
snareSource.connect(audioContext.destination);
hihatSource.connect(audioContextStreamDest);
hihatSource.connect(audioContext.destination);
clapSource.connect(audioContextStreamDest);
clapSource.connect(audioContext.destination);
You may have noticed that, for a moment there, you couldn't hear the samples anymore; connecting them to audioContext.destination brings the sound back through the speakers. The other destination is the MediaStream-capable one that the MediaRecorder records. So all of our samples are now hooked up to one AudioContext, which in turn gets recorded by a MediaRecorder.
function record() {
Recorder.start();
recordButton.children[0].classList.add('recording');
recordButton.removeEventListener('click', record);
stopButton.addEventListener('click', stop);
}
function stop() {
Recorder.stop();
recordButton.children[0].classList.remove('recording')
recordButton.addEventListener('click', record);
stopButton.removeEventListener('click', stop);
}
recordButton.addEventListener('click', record);
All this code does is start the MediaRecorder when the user clicks record, stop it when the user clicks stop, and make each button usable only when it's needed.
The recorder will work now, but the UI won't be updated. For that, we'll make a function that handles the Blob URL created when the MediaRecorder stops.
function addTrack(blobURL) {
var name = trackTemplate.content.children[0].children[0];
var audio = trackTemplate.content.children[0].children[1].querySelector('audio');
var dl = trackTemplate.content.children[0].children[2].querySelector('a');
name.textContent = 'Track' + trackContainer.childElementCount;
audio.src = blobURL;
dl.href = blobURL;
trackContainer.appendChild(trackTemplate.content.cloneNode(true));
trackContainer.children[trackContainer.childElementCount - 1].children[1].addEventListener('click', playTrack);
}
This code uses the HTML5 template tag to create a new element for the recording and adds it to the DOM. It also adds an event listener to the new element's play button; all we have to do now is create that callback function.
function playTrack(event) {
var audio = this.querySelector('audio');
var icon = this.querySelector('i');
if (audio.paused) {
audio.play();
icon.innerHTML = 'stop';
} else {
audio.pause();
icon.innerHTML = 'play_arrow';
}
audio.onended = function() {
icon.innerHTML = 'play_arrow';
}
audio.currentTime = 0;
}
Again, this isn't anything too complicated. All it does is play and pause the track and show the correct icon; the track always plays from the start. That's it: you should now have a simple but functional DAW built with HTML5.
Of course, this is nothing compared to the desktop-grade software out there, but it definitely shows that the web is a more than fertile place for building state-of-the-art creative applications. You can find the finished code on GitHub and a demo here.
You might have noticed the record button on newly created tracks. The idea is that if a user toggles that button, the selected track will also play back, and therefore be recorded, during the next recording, letting you layer more complex sounds. For example, you could add hats over a snare and kick pattern. But there are many more things you could add:
Add support for vocal (microphone) recording.
Let the user add new samples.
Add full instruments (using an oscillator); there's a small sketch of this after the list.
Make the samples programmable using a BPM system.
Add effects like pitch or distortion.
Add MIDI support (Web MIDI is currently not supported in FF).
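To give an idea of the oscillator suggestion, here is a rough, hypothetical sketch of a one-note synth, dropped at the end of recorder.js: it plays while the "o" key is held and, because it connects to audioContextStreamDest, it would end up in your recordings as well.
// Hypothetical starting point for an oscillator instrument (not in the demo code).
var synthGain = audioContext.createGain();
synthGain.gain.value = 0.5;
synthGain.connect(audioContext.destination); // so you can hear it
synthGain.connect(audioContextStreamDest);   // so it gets recorded
var synthOsc = null;
window.addEventListener('keydown', function(event) {
  if (event.keyCode !== 79 || synthOsc) return; // 79 = o; ignore key repeats
  synthOsc = audioContext.createOscillator();
  synthOsc.frequency.value = 220; // A3
  synthOsc.connect(synthGain);
  synthOsc.start();
});
window.addEventListener('keyup', function(event) {
  if (event.keyCode !== 79 || !synthOsc) return;
  synthOsc.stop();
  synthOsc.disconnect();
  synthOsc = null;
});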
Good luck!