Make some noise using the p5 sound library.
p5.js + p5.sound
In addition to the basic drawing API, p5.js includes add-on libraries. The parameters chapter introduces the p5.dom library, and the examples in this chapter use the p5.sound library. The p5.sound library builds on the Web Audio API and provides functions for generating tones, playing recorded sounds, and visualizing the waveform and spectrum of sounds. I highly suggest taking a look at all of the sound examples to get an idea of what the sound library can do.
This chapter will focus on playing and visualizing pre-recorded sound assets. The Comp Music chapter talks more about sound synthesis.
Compare a basic p5.js drawing sketch with a p5.sound sketch. What do they have in common? How are they different?
let myImg;

function preload() {
  myImg = loadImage("images/world.jpg");
}

function setup() {
  createCanvas(500, 500);
}

function draw() {
  image(myImg, 0, 0);
}
let mySound;

function preload() {
  mySound = loadSound("sounds/hack-comp.wav");
}

function setup() {
  // loop(startTime, rate, amp, cueStart, duration)
  mySound.loop(0, 1, 1, 0, 4);
}
Light and sound flow through our environment as electromagnetic and air pressure waves. Our eyes and our ears collect data about these waves and our visual and auditory cortexes process that data to create information. Though they work in different ways, both of these sensory systems are powerful. We can take advantage of these systems by choosing the types of forms we create.
p5.js makes it pretty easy to play sound assets, like this audio clip from Hackers (1995).
Chrome won't start playing sound on a page until the user performs a "gesture" on that page, so starting a sound in setup() won't work consistently. These examples start audio on a button press.
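Here is a minimal sketch of that pattern. The file path is a placeholder, and userStartAudio() is just one way to resume the audio context on a gesture:

let mySound;

function preload() {
  mySound = loadSound("sounds/hack-comp.wav"); // placeholder path
}

function setup() {
  createCanvas(500, 500);

  // wait for a user gesture before making any sound
  const button = createButton("play");
  button.mousePressed(() => {
    userStartAudio(); // resume the audio context on this gesture
    mySound.play();
  });
}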
The second parameter to play() controls the rate of playback. If you play a sound faster, the pitch goes up; play it slower and the pitch goes down. p5 cannot change the pitch of a sound without changing its length.
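For instance, building on the mySound loaded above, something like this plays the clip at half and double speed (the key choices are arbitrary):

function keyPressed() {
  if (key === "1") {
    mySound.play(0, 0.5); // half speed: longer and lower pitched
  }
  if (key === "2") {
    mySound.play(0, 2); // double speed: shorter and higher pitched
  }
}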
This example loads an audio file twice and then plays two loops at once. The looping end time is slightly different so the playback falls out of sync.
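A rough sketch of that idea might look like this; the file path and loop lengths are placeholders:

let loopA;
let loopB;

function preload() {
  // load the same file twice so the two copies can loop independently
  loopA = loadSound("sounds/hack-comp.wav"); // placeholder path
  loopB = loadSound("sounds/hack-comp.wav");
}

function mousePressed() {
  // loop(startTime, rate, amp, cueStart, duration)
  // the second copy loops a tenth of a second shorter,
  // so the two slowly drift out of phase
  loopA.loop(0, 1, 1, 0, 4.0);
  loopB.loop(0, 1, 1, 0, 3.9);
}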
This example uses frameCount to keep time and play a beat with drum samples. This works for a quick sketch, but frameCount isn't a great timekeeping source, so you might notice hitches in your rhythm. The Tone.js library lets you precisely schedule when samples are played, and might be a better choice if you want to make music.
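A rough version of that approach, assuming a couple of drum samples at placeholder paths, might look like this:

let kick;
let snare;
let started = false;

function preload() {
  kick = loadSound("sounds/kick.wav"); // placeholder paths
  snare = loadSound("sounds/snare.wav");
}

function setup() {
  createCanvas(500, 500);
  frameRate(60);
}

function mousePressed() {
  userStartAudio(); // wait for a gesture before making sound
  started = true;
}

function draw() {
  if (!started) return;

  // at 60 frames per second, 30 frames is half a second per beat
  if (frameCount % 30 === 0) {
    kick.play();
  }
  if (frameCount % 60 === 30) {
    snare.play();
  }
}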
The next example doesn't have any sound in it. Add sound as part of the coding challenges below.
p5.sound provides methods for recording the sound generated by your program and saving it to a file. The following function records length milliseconds of audio and saves it as output.wav.
Your audio may sound fine at run time but clipped when you play back the .wav. If this happens, try reducing the volume of the sounds you generate.
If you can't get good quality audio capture from p5.SoundRecorder, you might get better results using a screen recording program like QuickTime Player.
// uses the p5.SoundRecorder and p5.SoundFile classes to record the audio output.
// begins recording when called. records for _length_ time in milliseconds.
function record(length) {
  const soundRecorder = new p5.SoundRecorder();
  const soundFile = new p5.SoundFile();

  soundRecorder.record(soundFile);

  setTimeout(function () {
    console.log("Recording Complete");
    soundRecorder.stop();
    save(soundFile, "output.wav");
  }, length);
}
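One way to trigger it, assuming it is called from a user gesture so the audio context is already running:

// record five seconds of output when the user presses the "r" key
function keyPressed() {
  if (key === "r") {
    record(5000);
  }
}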
Explore this chapter’s example code by completing the following challenges.
•
••
••
•••
This example adds a single cue point to the sound at 1.7 seconds. When playback passes that specific time, cueBig() is called.
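A sketch of that setup might look like this, with the file path as a placeholder:

let mySound;

function preload() {
  mySound = loadSound("sounds/hack-comp.wav"); // placeholder path
}

function setup() {
  createCanvas(500, 500);
  // call cueBig() when playback passes 1.7 seconds
  mySound.addCue(1.7, cueBig);
}

function mousePressed() {
  mySound.play();
}

function cueBig() {
  // react when playback passes the cue point
  console.log("big!");
}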
This example uses p5.Amplitude() to track and visualize the volume/loudness of the entire sound.
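In rough form, that kind of analysis might look like this (the file path is a placeholder):

let mySound;
let amplitude;

function preload() {
  mySound = loadSound("sounds/hack-comp.wav"); // placeholder path
}

function setup() {
  createCanvas(500, 500);
  // p5.Amplitude listens to the sketch's audio output by default
  amplitude = new p5.Amplitude();
}

function mousePressed() {
  mySound.play();
}

function draw() {
  background(0);

  // getLevel() returns the current loudness, roughly 0 to 1
  const level = amplitude.getLevel();

  // draw a circle whose size follows the loudness
  noStroke();
  fill(255);
  ellipse(width / 2, height / 2, level * 400);
}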
This example uses p5.FFT() to track and visualize the waveform of the sound.
The Fast Fourier Transform transforms a signal from the time domain to the frequency domain. For audio analysis that means the FFT can tell the strength of different frequencies—bass, mids, treble—in an audio buffer.
For a visual exploration of FFT, see 3Blue1Brown's excellent video "But what is the Fourier Transform? A visual introduction."
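A bare-bones p5.FFT sketch, drawing both the waveform and the spectrum, might look something like this (again, the file path is a placeholder):

let mySound;
let fft;

function preload() {
  mySound = loadSound("sounds/hack-comp.wav"); // placeholder path
}

function setup() {
  createCanvas(500, 500);
  fft = new p5.FFT(); // analyzes the sketch's audio output by default
}

function mousePressed() {
  mySound.play();
}

function draw() {
  background(0);

  // waveform(): time-domain samples, values between -1 and 1
  const waveform = fft.waveform();
  stroke(255);
  noFill();
  beginShape();
  for (let i = 0; i < waveform.length; i++) {
    const x = map(i, 0, waveform.length, 0, width);
    const y = map(waveform[i], -1, 1, height * 0.25, height * 0.5);
    vertex(x, y);
  }
  endShape();

  // analyze(): energy per frequency bin, values between 0 and 255
  const spectrum = fft.analyze();
  noStroke();
  fill(255);
  for (let i = 0; i < spectrum.length; i++) {
    const x = map(i, 0, spectrum.length, 0, width);
    const h = map(spectrum[i], 0, 255, 0, height * 0.25);
    rect(x, height - h, width / spectrum.length, h);
  }
}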
Explore this chapter’s example code by completing the following challenges.
•
••
You can create an empty SoundFile object with new p5.SoundFile() and generate the sound data yourself with JavaScript. To do so, you create and fill a Float32Array and then attach it to the SoundFile object with setBuffer().
Creating sounds this way lets you work at the lowest possible level: individual samples. This can be fun, and it gives you complete control, but you will probably need to reinvent some basics.
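As a sketch of that workflow, here is one second of a 440 Hz sine tone built sample by sample, using the rate reported by sampleRate():

let mySound;

function setup() {
  createCanvas(500, 500);

  const sr = sampleRate(); // samples per second of the audio context
  const samples = new Float32Array(sr); // one second of audio

  // fill the buffer with a 440 Hz sine wave, one sample at a time
  for (let i = 0; i < samples.length; i++) {
    samples[i] = 0.5 * sin((TWO_PI * 440 * i) / sr);
  }

  mySound = new p5.SoundFile();
  mySound.setBuffer([samples]); // mono: one Float32Array in the array
}

function mousePressed() {
  mySound.play();
}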
p5.js has functions for working with oscillators, envelopes, and effects if you want to work at a slightly higher level. You might also consider using a dedicated JavaScript sound synthesis library like Tone.js, which is a little more powerful and better documented.
Explore this chapter’s example code by completing the following challenges.
•
••
Keep sketching. Make a bunch of noise!
Choose a 15-second video clip. Use p5.sound to create a new soundtrack for your clip. Combine sound and video.
Choose a 15-second audio clip. Use p5.sound to generate graphics driven by the sound. Combine sound and image.
Create a 15-second procedurally generated audio and visual form. The audio and video should be generated from the same process.
Color from Hexcodes to Eyeballs Technical Essay Jamie Wong describes each step of how color is expressed and interpreted.
Compform '16 Comp Music Lecture Notes Compform '16 week on computational music.
Compform '16 Comp Music Examples Lecture Notes Examples from Compform '16 week on computational music.
WebMidiAPIShim Library A JavaScript shim for accessing MIDI devices.
Fourier Transform 3Blue1Brown A great visual introduction to the Fourier transform.
An Interactive Introduction to Fourier Transforms Demo Visual and interactive demos of Fourier Transforms for both sound and drawing.