Drawing waveforms using AudioKit and CoreGraphics
A couple of notes to get started:
- For the audio file, I am going to use a song by Scott Holmes Music from the Free Music Archive. I have renamed the file to `Upbeat Funk Pop.mp3`.
- I have added the file to the bundle in my sample project. This could be done just as easily for something loaded from a `URLRequest` or even recorded locally.
- I am making some assumptions about using screen width and height. These numbers can be adjusted as needed.
- There is some UIKit code here but I am running it in a Catalyst app, so the screenshots are from macOS.
- Each code block below is intended to run in the same function. I have a completed example at the bottom.
First, let's get the file from the bundle and get an AVAudioFile with it:
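The original code block is not shown here, but a minimal sketch of this step might look like the following (the `guard`/`try?` error handling is my own choice; the original may handle failures differently):

```swift
import AVFoundation

// Load the bundled MP3 (renamed as in the notes above) and wrap it in an AVAudioFile.
guard let url = Bundle.main.url(forResource: "Upbeat Funk Pop", withExtension: "mp3"),
      let audioFile = try? AVAudioFile(forReading: url) else {
    return
}
```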
Next, we want to get the amplitude samples from that AVAudioFile. This is where AudioKit comes into play. Note that `getData` returns two arrays, one for the left channel and one for the right; we are only interested in the first channel here. Additionally, we use `UIScreen.main.bounds.size.width` as the width of the image we will generate, and pass it to `getData` so that we get one sample value per point on the screen. We could generate an image wider than the screen if we needed to scroll, but as it is, this yields exactly one sample per point across the width of the screen.
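The `getData` call itself is not shown in this chunk. As a stand-in, here is a hypothetical helper with the shape the text describes (one array of averaged amplitudes per channel, one value per point of width), built on plain AVFoundation rather than whatever AudioKit helper the original used:

```swift
import AVFoundation
import UIKit

// Hypothetical stand-in for the getData call described above:
// returns one array per channel, with one averaged amplitude per point of `width`.
func getData(from file: AVAudioFile, width: Int) -> [[Float]] {
    let frameCount = AVAudioFrameCount(file.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: frameCount),
          (try? file.read(into: buffer)) != nil,
          let channelData = buffer.floatChannelData else { return [] }

    let channels = Int(file.processingFormat.channelCount)
    let samplesPerPoint = max(1, Int(frameCount) / width)
    var result: [[Float]] = []

    for channel in 0..<channels {
        let channelSamples = channelData[channel]
        var downsampled: [Float] = []
        for point in 0..<width {
            // Average the absolute amplitude over this point's slice of samples.
            let start = point * samplesPerPoint
            let end = min(start + samplesPerPoint, Int(frameCount))
            guard start < end else { break }
            var sum: Float = 0
            for i in start..<end { sum += abs(channelSamples[i]) }
            downsampled.append(sum / Float(end - start))
        }
        result.append(downsampled)
    }
    return result
}

// One sample value per point across the screen; keep only the first channel.
let width = UIScreen.main.bounds.size.width
let samples = getData(from: audioFile, width: Int(width)).first ?? []
```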
Now we can begin to set up our CGContext to handle drawing the actual waveform.
The `drawAreaHeight` here represents half of the area we will use to draw. This is because we will draw each amplitude both above and below a middle point in the image, essentially mirroring the waveform into the bottom half to fill it out.
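A sketch of that setup, assuming the `drawAreaHeight` and `yCenter` names from the text (the stroke color and line width are my own choices):

```swift
import UIKit

// Full-screen image; drawAreaHeight is half of it, since each sample is
// drawn both above and below the vertical center (yCenter).
let height = UIScreen.main.bounds.size.height
let drawAreaHeight = height / 2
let yCenter = height / 2

UIGraphicsBeginImageContextWithOptions(CGSize(width: width, height: height), false, 0)
guard let context = UIGraphicsGetCurrentContext() else { return }
context.setStrokeColor(UIColor.systemBlue.cgColor) // color is an assumption
context.setLineWidth(1)
```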
Here is the fun part, where we begin to draw the waveform. We will move forward on the x-axis one point per sample (the array index) and draw the desired height of each sample both above and below the vertical center.
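A minimal version of that loop, continuing with the `samples`, `drawAreaHeight`, `yCenter`, and `context` names used above:

```swift
// One vertical line per sample: x advances one point per array index,
// and each line extends the sample's height above and below yCenter.
for (index, sample) in samples.enumerated() {
    let x = CGFloat(index)
    let sampleHeight = CGFloat(sample) * drawAreaHeight
    context.move(to: CGPoint(x: x, y: yCenter - sampleHeight))
    context.addLine(to: CGPoint(x: x, y: yCenter + sampleHeight))
}
```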
Here is a video demonstrating the direction we are drawing, intentionally slowed down; in real use the drawing happens nearly instantly. It just shows how we draw one vertical line at a time, each line matching one of the amplitude samples.
Once we have drawn our samples, we need to stroke the path and then we can grab an image from the CGContext and end the context.
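That final step might look like this:

```swift
// Stroke everything we queued up, grab the image, and end the context.
context.strokePath()
let waveformImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
```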
Here is the completed waveform:
Circling back to the idea of `yCenter` and how we mirrored the lines around that point in the image: we could have stopped at the vertical center and only drawn the top half of the waveform:
Or we could have started at `yCenter` and only drawn the bottom half:
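As a sketch, both variants are just one-endpoint changes to the drawing loop above:

```swift
// Top half only: each line ends at the vertical center.
for (index, sample) in samples.enumerated() {
    let x = CGFloat(index)
    let sampleHeight = CGFloat(sample) * drawAreaHeight
    context.move(to: CGPoint(x: x, y: yCenter - sampleHeight))
    context.addLine(to: CGPoint(x: x, y: yCenter))
}

// Bottom half only: each line starts at the vertical center instead.
for (index, sample) in samples.enumerated() {
    let x = CGFloat(index)
    let sampleHeight = CGFloat(sample) * drawAreaHeight
    context.move(to: CGPoint(x: x, y: yCenter))
    context.addLine(to: CGPoint(x: x, y: yCenter + sampleHeight))
}
```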
One last note: there are use cases for scaling the sample values if all of the samples are very low. I have not included an example of that here, but I am open to adding it.