# A Phase-Aligned Oscilloscope for Web Audio

A music synthesizer should produce nicely periodic waveforms when a note is played. We should be able to see that regularity when we visualize the sound pressure with an oscilloscope, demonstrated here with the Yamaha DX7 emulator:

## The Problem

We can see the regularity, but hold on. The waveform is jumping around, flickering left and right. It doesn't appear fixed in one spot. That makes it awfully hard to see how the waveform evolves as we hold a note down.

The basic problem is that the visualization update is not synchronized to the wave period. The waveform is drawn by taking a snapshot of audio data – say, 1024 samples – at successive instants. The snapshots are taken in this case by a Web Audio API `AnalyserNode`, ideally 60 times per second. The position (phase) of the periodic wave will not appear aligned in successive snapshots (unless we're playing E5, which at 659.25 Hz is nearly 11 × 60 Hz, a near multiple of the frame rate). Hmm!
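To put a number on the jumping, we can compute the per-frame phase drift: at 60 frames per second each snapshot starts `44100 / 60 = 735` samples after the previous one, and the leftover after dividing by the wave period is how far the waveform appears to shift between frames. A quick sketch (assuming 44100 Hz audio and 60 fps drawing; the helper name is mine, not from the demo):

```javascript
// Per-frame phase drift for a note, assuming 44100 Hz audio and 60 fps drawing.
var SAMPLE_RATE = 44100;
var SAMPLES_PER_FRAME = SAMPLE_RATE / 60; // 735 samples between snapshots

function phaseDriftPerFrame(noteFrequency) {
  var period = SAMPLE_RATE / noteFrequency; // wave period in samples
  var drift = SAMPLES_PER_FRAME % period;   // leftover phase per frame
  // A drift just below a full period is really a small backwards shift.
  return Math.min(drift, period - drift);
}

console.log(phaseDriftPerFrame(440));    // ~33.4 samples per frame: visible jumping
console.log(phaseDriftPerFrame(659.25)); // under 1 sample per frame: nearly steady
```

This is why that particular E looks almost stable while most other notes flicker.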

## The Solution

We need two ingredients to do this right.

1. When we get some audio data to draw, we need to know the exact moment in time the data corresponds to. The Web Audio API provides this in the form of `AudioContext.currentTime`.
2. We need to know the frequency of the note we're interested in drawing. Let's say whatever note was pressed last.
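For the second ingredient, the code further down calls a `frequencyFromNoteNumber` helper to turn the last note pressed into a frequency. One plausible implementation is the standard equal-temperament formula, with A4 = MIDI note 69 = 440 Hz (this particular body is my sketch, not necessarily the demo's):

```javascript
// Equal-temperament conversion: MIDI note 69 (A4) is 440 Hz,
// and each semitone multiplies the frequency by 2^(1/12).
function frequencyFromNoteNumber(noteNumber) {
  return 440 * Math.pow(2, (noteNumber - 69) / 12);
}

console.log(frequencyFromNoteNumber(69)); // 440
console.log(frequencyFromNoteNumber(81)); // 880, one octave up
```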

Every time we want to draw a frame of audio data, we divide the `sampleTime` by the wave `period` and call the remainder `sampleOffset`. The units are in audio samples, running at 44100 samples per second.

Let's say we're drawing two successive frames of audio data. For these two frames, `sampleTime` might be 10000 and 10705. The note held down is A above middle C at 440 Hz, generating a waveform that repeats every `44100 / 440 ≈ 100.2` samples. So we get a `sampleOffset` of `10000 % 100.2 = 80.2` and `10705 % 100.2 = 83.8`. We need to draw the first frame shifted 80.2 samples to the left, and the second frame shifted 83.8 samples to the left. And so on. Ah, much better! The little wobble at the end of this animation shows a pitch vibrato.
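The arithmetic above can be checked directly, using the rounded period of 100.2 samples from the example (the `sampleOffset` helper here just names the modulo from the previous paragraph):

```javascript
// Phase offset of a snapshot, in samples, for a wave of the given period.
function sampleOffset(sampleTime, period) {
  return sampleTime % period;
}

var period = 100.2; // rounded period of a 440 Hz wave at 44100 samples/second
console.log(sampleOffset(10000, period)); // ~80.2
console.log(sampleOffset(10705, period)); // ~83.8
```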

Here are the important parts in code. When we get a new note down, we update the periodicity for the visualizer:

```
var noteFrequency = frequencyFromNoteNumber(synth.getLatestNoteDown());
visualizer.setPeriod(sampleRate / noteFrequency);
```

and then in our draw loop, subtract the `sampleOffset` from the x-position:

```
analyzerNode.getFloatTimeDomainData(data);
var sampleTime = sampleRate * analyzerNode.context.currentTime;
var sampleOffset = sampleTime % this.period;
...
for (var i = 0, l = data.length; i < l; i++) {
  var x = (i - sampleOffset) * WAVE_PIXELS_PER_SAMPLE;
  var y = data[i];
  graphics.lineTo(x, y);
  ...
}
```

This doesn't work so well for polyphonic synthesis, since multiple notes with different wave periods are sounding at once. It works nicely if you hold a high note, and then play an octave or a fifth lower. You can see the consonance (and dissonance) in the waveform as you play various intervals.