In the last post we looked at point and line sources. We discussed how a line array, although inspired by the concept of a line source, is in fact made up of an array of point sources.

We are surrounded by point sources all the time; however, an array like this is a special case, as the sources it is assembled from are what we call *coherent sources*, by which we mean that they are producing identical waveforms, in time and in phase with one another. Coherent sources interact with one another in some unique ways, which we will now explore.

**Phase**

Let’s begin by considering an individual source, producing a single frequency. The sound wave being emitted has three important properties:

- Amplitude
- Frequency
- Phase

I’m going to assume for now that readers are familiar with amplitude and frequency, but let’s take a deeper look at phase.

Phase describes the position in the wave cycle at any given snapshot in time and space. A single frequency waveform is described by a sine wave, so we identify phase as an angle corresponding to the current position in the wave cycle.

If we assume a fixed listener position, phase becomes a function of time alone. When listening to a single source like this, phase itself does not really matter: it is constantly changing at a fixed pace (determined by the frequency), so whilst the rate of change is important, the actual value of the phase at any given moment is entirely arbitrary as far as our perception is concerned. The best way to think of this is as shifting the origin point on a graph – the values change but the shape of the waveform does not.

This all changes when we add a second coherent source. Individually the phase of the two sources does not matter, but when listeners are exposed to both sources at the same time any relative phase difference between them becomes very important.

Phase differences like this will occur due to the listener position. Although the two sources may be operating in phase, if the listener is not equidistant from the two sources there will be a time delay (and thus a corresponding phase delay) between one and the other.

We can calculate this fairly simply:

t_{delay} = X/c

Where *t_{delay}* is the time delay between the source generating the wavefront and it arriving at the listener,

*X* is the distance from the source to the listener, and

*c* is the speed of sound.

The phase angle (relative to the phase of the source) is then:

Φ = 2πft_{delay}

Where Φ is the phase angle in radians and *f* is the frequency. Note the importance of the frequency term: phase is directly related to frequency, so at a different frequency the phase angle will also be different.

If we measure Φ for both sources and then subtract one from the other we get the relative phase angle between the two.
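The two equations above can be sketched directly in a few lines of Python. The speed of sound, the 1 kHz tone, and the listener distances below are illustrative values of my own choosing:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C; an assumed round figure

def phase_at_listener(distance_m, freq_hz, c=SPEED_OF_SOUND):
    """Phase angle (radians) accumulated over the source-to-listener path."""
    t_delay = distance_m / c                 # t_delay = X / c
    return 2 * math.pi * freq_hz * t_delay   # Φ = 2π f t_delay

# Illustrative numbers: a 1 kHz tone, with the listener 3.0 m from one
# source and 3.1715 m from the other (half a wavelength further away).
f = 1000.0
phi_1 = phase_at_listener(3.0, f)
phi_2 = phase_at_listener(3.1715, f)
relative_phase = phi_2 - phi_1  # half a wavelength of path difference gives π
```

Because the path difference here is exactly half of the 0.343 m wavelength, the relative phase angle comes out at π: the sources arrive in anti-phase at this position.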

Depending on the relative distances between the listener and the two sources, the relative phase angle could be either small or large. One point to note is that, as the phase angle describes a position on a sine wave, the value of the wave is cyclical with a period of 2π. The value at Φ = 0 is the same as the value at Φ = 2π, which is the same as the value at Φ = 4π, and so on. We can therefore simplify matters by treating a large relative phase angle as equivalent to the corresponding angle within the range -π ≤ Φ ≤ π.
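This wrap-around can be expressed with a single modulo operation; a minimal sketch (the function name is mine, and it maps onto the half-open interval [-π, π)):

```python
import math

def wrap_phase(phi):
    """Map any phase angle to the equivalent angle in [-π, π)."""
    return (phi + math.pi) % (2 * math.pi) - math.pi
```

For example, `wrap_phase(4 * math.pi)` comes back as (approximately) 0, and `wrap_phase(2.5 * math.pi)` as π/2, reflecting the equivalences described above.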

**Interference**

The relative phase angle is crucial to determining how the two sources interact. The sound waves from the two sources will be superimposed at the listener position, interfering either constructively or destructively depending upon the relative phase angle.

For a phase angle of Φ = 0 the two sources are “in phase” with one another and the interference is constructive. The waveform from one source lines up exactly with the waveform from the second source, and the superposition produces a resultant waveform of the same frequency but double the amplitude of either of the individual waves. This is an increase in level of +6dB over the level of either of the individual sources.

For a phase angle of Φ = ±π the two sources are in “anti-phase” with one another and the interference is destructive. The waveform from one source is exactly the opposite of the waveform from the second source, and the superposition produces a null waveform as they completely cancel one another out: a level of -∞dB.

For other phase angles the effects are less pronounced. The superposition will produce a waveform of the same frequency, but different amplitude and phase depending upon the relative phase angle between the two sources. For all -2π/3 < Φ < 2π/3 interference will be constructive. For all Φ < -2π/3 or Φ > 2π/3 interference will be destructive.

The case of Φ = ±2π/3 is a particularly interesting scenario in which the superposition produces a waveform of the same amplitude as the individual sources, with a phase angle half way between the two. In this scenario interference is neither constructive nor destructive: adding the second source makes no difference to the amplitude.
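One way to check these figures: for two equal-amplitude waves, phasor addition (which we come to below) gives a resultant amplitude of |2·cos(Φ/2)| times the individual amplitude. A minimal sketch, with a function name of my own invention:

```python
import math

def level_change_db(phi):
    """Level change in dB when two equal-amplitude coherent waves with
    relative phase phi are superimposed, compared to one wave alone."""
    gain = abs(2 * math.cos(phi / 2))  # resultant amplitude / single-wave amplitude
    if gain == 0.0:
        return float("-inf")  # total cancellation
    return 20 * math.log10(gain)
```

In phase (Φ = 0) this gives approximately +6.02 dB, at Φ = ±2π/3 it gives 0 dB, and at Φ = ±π it heads towards -∞ (floating-point rounding stops it getting exactly there).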

**Waves of Different Amplitude**

So far we have been making one major simplification: we have assumed that the waveforms arriving from the two sources are of the same amplitude. In reality this is unlikely to be the case. The two sources will be outputting the same level, but the amplitude of each source decays with distance – sound intensity follows the inverse square law, so pressure amplitude falls in inverse proportion to distance. In a scenario where the two sources are not equidistant from the listener we therefore need to factor in their relative levels.

If p_{source} is the amplitude of the waveform measured at 1m from the source then p at the listener position will be:

p = p_{source}/X
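As a one-line sketch of this inverse-distance decay (the function name is mine):

```python
def amplitude_at(p_source, distance_m):
    """Pressure amplitude at the listener: p = p_source / X,
    where p_source is the amplitude measured at 1 m."""
    return p_source / distance_m

# Doubling the distance halves the amplitude (a drop of roughly 6 dB).
```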

Where the amplitudes of the two waveforms differ, calculating the resultant waveform becomes slightly more complicated and the relationships we’ve described above break down. They still conform to the same general pattern of constructive and destructive interference, but the magnitude of the interference is less significant.

We can still determine the phase and amplitude of the resultant waveform, but the math becomes a little more complex: we use a process called phasor addition. A *phasor* (or *phase vector*) is a vectorised representation of the wave, described in polar co-ordinates.

**Polar Plots**

We are all used to cartesian plots, where we define the position of a point in terms of its value on two axes (usually *x* and *y*). Position can be defined in absolute terms, relative to the origin at (0,0), or in relative terms, with respect to the position of another known point.

A polar plot differs as we instead define the position in terms of an angle and a magnitude. Again, this can be absolute or relative.

Our single frequency waveform can be defined on this landscape as a vector using polar co-ordinates. Superimposing a second waveform of the same frequency then becomes a simple matter of vector addition. As we know the angle and magnitude of each vector, we can calculate the resultant vector via Pythagoras and trigonometry to give us the phase and amplitude of the superimposed waveform.

p_{total} = √[(p_{1}cosΦ_{1} + p_{2}cosΦ_{2})² + (p_{1}sinΦ_{1} + p_{2}sinΦ_{2})²]

Φ_{total} = tan⁻¹[(p_{1}sinΦ_{1} + p_{2}sinΦ_{2}) / (p_{1}cosΦ_{1} + p_{2}cosΦ_{2})]

Where p_{total} and Φ_{total} are the resultant amplitude and phase angle of the superimposed waveform, and p_{1}, p_{2} and Φ_{1}, Φ_{2} are the amplitudes and phases of the original waveforms.
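Phasor addition of two waves translates directly into code. A minimal sketch (the function name is my own):

```python
import math

def phasor_sum(p1, phi1, p2, phi2):
    """Superimpose two equal-frequency waves by phasor (vector) addition.
    Returns (p_total, phi_total)."""
    x = p1 * math.cos(phi1) + p2 * math.cos(phi2)  # sum of in-phase components
    y = p1 * math.sin(phi1) + p2 * math.sin(phi2)  # sum of quadrature components
    return math.hypot(x, y), math.atan2(y, x)
```

This reproduces the earlier special cases: `phasor_sum(1, 0, 1, 0)` gives amplitude 2 (the +6dB case), `phasor_sum(1, 0, 1, math.pi)` gives amplitude 0 (full cancellation), and `phasor_sum(1, 0, 1, 2 * math.pi / 3)` gives amplitude 1 with phase π/3 – half way between the two sources, as described above.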

We can modify these equations for scenarios where we have more than two sources:

p_{total} = √[(Σ p_{n}cosΦ_{n})² + (Σ p_{n}sinΦ_{n})²]

Φ_{total} = tan⁻¹[(Σ p_{n}sinΦ_{n}) / (Σ p_{n}cosΦ_{n})]

Where the sums run from n = 1 to n_{total}, n_{total} is the total number of sources, and p_{n} and Φ_{n} refer to the amplitude and phase of each individual source.
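The generalisation to any number of sources is just the same component sums taken over a list. A sketch (again, the function name is mine):

```python
import math

def phasor_sum_n(sources):
    """Phasor addition for any number of equal-frequency waves.
    `sources` is an iterable of (amplitude, phase) pairs;
    returns (p_total, phi_total)."""
    x = sum(p * math.cos(phi) for p, phi in sources)
    y = sum(p * math.sin(phi) for p, phi in sources)
    return math.hypot(x, y), math.atan2(y, x)
```

Three unit-amplitude sources in phase give a resultant amplitude of 3, while four unit sources spaced evenly around the phase circle (0, π/2, π, 3π/2) cancel completely.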

**Conclusions**

So far we’ve developed equations to determine the resultant waveform at fixed positions and frequencies. In my next post we are going to go on to explore how these effects translate when we put multiple coherent sources into an environment involving variable frequency and listener positions.