Audio: What Are The Differences Between Polarity And Phase?

The terms “phase” and “polarity” are often used interchangeably in audio, though they technically mean different things. If you've ever wondered about the difference between phase and polarity, you've come to the right place to find out.

What are the differences between polarity and phase in audio? Polarity references an audio signal's position above or below the reference value/voltage. Inverting a signal's polarity will swap positive and negative voltages. Phase references a point along an audio signal's waveform. In repeating waveforms, each complete cycle will have 360° of phase.

In this article, we'll define polarity and phase more clearly to better understand the differences between the two terms.


What Is Polarity?

Polarity is perhaps the simpler of the two terms to understand, so let's start with it. Any electrical signal or waveform (including analog audio signals and their digital representations) has polarity.

The polarity of an audio signal references its position above (positive) or below (negative) the reference value/voltage. This reference line or median line is at “zero amplitude” regardless of the median/reference signal value (this takes into account bias voltage and other offsets).

So, a signal's polarity is essentially a function of its positive and negative points along its waveform.

By itself, polarity isn't that interesting to us in audio.

However, when we have multiple signals, polarity is worth considering. This is especially true if the signals are copies of each other.

If we had two identical signals, we could sum them together to achieve twice the signal level. This can be visualized below:

Now, if we were to flip the polarity of the second (blue) signal, the two identical signals would completely cancel each other out. The points of positive amplitude in one signal would destructively interfere with the negative amplitudes of the other and vice versa. The result would look like this:

The benefit of polarity, especially when dealing with identical signals, is that we're not shifting the phase by delaying any signals. Rather, we're simply flipping or inverting the polarity so that all positive values become negative and all negative values become positive.
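The summing and cancellation described above can be sketched in a few lines of plain Python (the sine burst here is just a stand-in for an audio signal; no particular audio library is assumed):

```python
import math

# A short sine burst standing in for an audio signal (values are sample amplitudes).
signal = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(256)]

# Summing two identical copies doubles the level at every sample.
doubled = [s + s for s in signal]

# Inverting the polarity (negating every sample) and summing with the original
# yields complete cancellation: every sample of the mix is zero.
inverted = [-s for s in signal]
cancelled = [a + b for a, b in zip(signal, inverted)]

print(max(abs(s) for s in cancelled))  # 0.0
```

Note that no samples are delayed anywhere in this sketch; the inversion is purely a sign flip.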

Polarity is certainly important in audio technology. Let's consider a few instances where polarity should be considered:

Polarity And Speaker Connections

Loudspeakers will typically have positive and negative input terminals. The positive speaker wire connects the amplifier's positive output terminal to the speaker's positive input terminal. The negative speaker wire, conversely, connects the amplifier's negative output terminal to the speaker's negative input terminal.

Swapping these two wires, in theory, would cause the speaker to pull air when it's supposed to push air and vice versa.

If there's only one speaker in a system, the effect of mixing up the speaker wires will be minimal and imperceptible.

The issues happen when multiple speakers are used, like in a stereo system, and one or more speakers are in reverse polarity to the other speakers. This will cause significant issues of destructive interference within the acoustic environment.

If we consider a pair of speakers wired in reverse polarity, we'll have lacklustre results, though we likely won't have absolute cancellation.

First, if the audio fed to the speakers is in stereo, each speaker will have a different audio signal to begin with, though any commonality (centre-panned elements) will suffer. Second, the distances between the speakers and the listener(s) within the acoustic environment will allow some sound to be heard, even if the speakers are wired in reverse polarity to one another.

That being said, it's always best to avoid such wiring issues.

Polarity And Balanced Audio

Balanced audio is a system that carries a mono analog signal with three conductive wires (positive polarity audio, negative polarity audio, and common ground).

This type of audio transfer is common in mic level devices (microphones and mic inputs) as well as low-level instrument level devices (keyboards, etc.) and helps minimize signal degradation over long cable runs.

The two signal wires in balanced audio carry copies of the signal in opposite polarity to one another. A balanced input, therefore, requires a differential amplifier. The differential amp effectively takes the difference between the two signal wires, summing the audio while eliminating the induced noise that is common to both wires. This process is known as common-mode rejection.
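This differential trick can be sketched in plain Python (the sample values and noise level here are made up purely for illustration):

```python
import math
import random

random.seed(0)
n_samples = 512
signal = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(n_samples)]

# The two conductors carry the same audio in opposite polarity.
hot = [s for s in signal]    # positive-polarity leg
cold = [-s for s in signal]  # negative-polarity leg

# Noise induced along the cable run appears identically on both legs.
noise = [random.uniform(-0.1, 0.1) for _ in range(n_samples)]
hot_noisy = [h + z for h, z in zip(hot, noise)]
cold_noisy = [c + z for c, z in zip(cold, noise)]

# A differential input takes hot minus cold: the common-mode noise cancels,
# while the wanted signal sums to twice its level (halved here to normalize).
recovered = [(h - c) / 2 for h, c in zip(hot_noisy, cold_noisy)]
```

The recovered samples match the original signal essentially exactly, because the induced noise is identical on both legs and subtracts away.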

Related Article On Balanced Audio

To learn more about balanced audio and microphones, check out my article Do Microphones Output Balanced Or Unbalanced Audio?

Polarity And Recording/Mixing

When recording and mixing audio, we should be aware of polarity and phase, though, for this section, we're concerned with polarity.

You'll likely find polarity inversion or “phase flip” switches on preamps, consoles, digital audio workstations and other audio equipment. The phrase “phase flip” is actually a misnomer, as the actual action of these options has to do with inverting the polarity (perhaps this is where some of the confusion comes from).

Polarity inversions can be essential in recording and mixing sessions to better align the various tracks of the session. We know how ill-aligned signals interfere destructively when summed/mixed together, so it's in our best interest to have the best relationships between the positive and negative amplitudes of our signals.

When two microphones are positioned so that they face each other (a common occurrence when miking the top and bottom of a snare drum), the likelihood of one signal being in the positive while the other is in the negative (and vice versa) is high. This is also the case when two microphones set to capture the same source at varying distances aren't positioned properly to account for the delay in the sound waves.

For more information on mic positioning, check out my article Top 23 Tips For Better Microphone Placement.

We may also want to invert the polarity of samples to better match with the other audio tracks in the session.

Note that, in these instances, we're not dealing with identical waveforms. However, flipping one signal's polarity can help it align with the other(s) better.


What Is Phase?

Now that we know what polarity is, let's discuss phase. We'll begin our discussion on phase by likening it to polarity before delving deeper into phase itself and how it's used in audio.

Phase refers to the position of a point in time on a wave cycle, measured in degrees. In a periodic waveform, such as a repeating audio signal (like a sine wave), the waveform begins at 0° and repeats every 360°.

Phase is a function of time, and higher frequencies (having shorter wavelengths) take less time to complete their full 360° cycle.
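Because phase is a function of time, the same time offset represents a different amount of phase at different frequencies. A small hypothetical helper (the function name and values are mine, for illustration only) makes the relationship concrete:

```python
def delay_to_phase_degrees(delay_seconds, frequency_hz):
    """Phase shift (in degrees) that a time delay represents at a given frequency."""
    period = 1.0 / frequency_hz  # one full 360-degree cycle takes one period
    return (delay_seconds / period) * 360.0

# The same 0.5 ms delay is a larger slice of the cycle at higher frequencies:
print(delay_to_phase_degrees(0.0005, 100))   # about 18 degrees at 100 Hz
print(delay_to_phase_degrees(0.0005, 1000))  # about 180 degrees at 1 kHz
```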

This cycling and phase can be visualized in the following graph:

Inverting the polarity of this waveform would look like this:

Note that, in this very specific case, inverting the polarity is the same as shifting the phase by 180° (assuming the sine wave repeats at the same amplitude infinitely).

However, altering or “shifting” the phase effectively means moving the signal in time, whereas polarity inversion does not.

If we were to take another basic waveform like the sawtooth wave, we'd have much different results between inverting the polarity and shifting the phase by 180°.

Here's an overlay of an original sawtooth wave and an inverted copy:

Here's an overlay of an original sawtooth wave and a copy shifted 180°:
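We can verify this distinction numerically with a quick sketch in plain Python, comparing a polarity flip against a 180° shift for both waveforms (the waveform functions here are simple idealized definitions):

```python
import math

def sine(phase_deg):
    return math.sin(math.radians(phase_deg))

def sawtooth(phase_deg):
    # Rises linearly from -1 to +1 over each 360-degree cycle.
    return 2.0 * ((phase_deg % 360) / 360.0) - 1.0

matches = {}
for wave in (sine, sawtooth):
    inverted = [-wave(p) for p in range(0, 360, 45)]      # polarity flip
    shifted = [wave(p + 180) for p in range(0, 360, 45)]  # 180-degree phase shift
    matches[wave.__name__] = all(
        abs(a - b) < 1e-9 for a, b in zip(inverted, shifted)
    )

print(matches)  # inversion equals a 180-degree shift for the sine, not the sawtooth
```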

Things get even more dissimilar once we start dealing with complex audio waveforms (which comprise the vast majority of audio).

In fact, phase in its truest sense applies to repeating waveforms, which isn't quite what we're dealing with in audio, since most audio consists of highly complex, non-repeating waveforms. However, the idea of phase as the time-dependent position of an audio waveform is a valuable one to understand and is used throughout the world of audio and music production.

So, polarity inversion and phase shifting are not the same thing.

Let's now consider how phase is used in audio:

Phase And Mixing

While polarity flipping can help us to align the different tracks in our recording/mixing sessions, we're limited by how much we can truly align each track. Moving on to phase shifting, we achieve much greater control over the timing of each track, affording us a more clinical tool for aligning our tracks for better “phase relationships”.

Beyond the phase relationships between individual tracks, we should be aware of the phase relationships between our left and right stereo channels. It's the differences between the left and right channels that effectively give us the sense of stereo width. However, taken too far, these phase differences can destroy our mix and ruin any chance of good mono compatibility.

Phase correlation meters span continuously from -1 to +1, or from 180° to 0°. They can be put on stereo tracks or the stereo mix bus to meter the phase relationship between the left and right stereo waveforms.

At +1, we have a 100% correlation between the channels (they are exactly the same).

At 0, we have the “widest permissible left/right divergence” or the widest permissible stereo image.

Having the mix bus correlation meter moving between 0 and 1 is ideal. Smaller variations mean smaller differences in width.

At -1, our left and right channels are completely out of phase and will completely cancel each other out.

Mix bus correlation meter values between -1 and 0 mean that significant phase issues are present that will interfere with the stereo audio and definitely with the summed-to-mono audio.

It's best to hover between 0 and +1. However, we may want our mixes a bit closer to +1 to ensure better phase relationships between the left and right channels and, therefore, better mono compatibility.
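The reading of such a meter is essentially a normalized correlation between the two channels. A rough sketch in plain Python (assuming the channels are simple lists of sample values; real meters also smooth the reading over time):

```python
import math

def correlation(left, right):
    """Phase correlation between two channels: +1 identical, -1 polarity-inverted."""
    dot = sum(l * r for l, r in zip(left, right))
    norm = math.sqrt(sum(l * l for l in left) * sum(r * r for r in right))
    return dot / norm if norm else 0.0

mono = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(1024)]

print(correlation(mono, mono))                 # close to +1.0: identical channels
print(correlation(mono, [-s for s in mono]))   # close to -1.0: cancels when summed to mono
```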

Phase And Audio Effects

Phase is used in several audio effects.

The first that comes to mind is the phaser, which uses a series of all-pass filters (which shift phase as a side effect) to create various modulated notches and peaks across the frequency spectrum.

Speaking of filters, EQ has an inherent side effect of frequency-dependent phase shift at and around the corner or centre frequencies. The greater/steeper the EQ move/filter, the greater the phase shift (positive or negative). Though this isn't the primary use for the process, I figure it'd be worth elaborating on after discussing phaser.

We also have linear phase EQ, which avoids the frequency-dependent phase shifting of standard EQ designs in the signal being processed (at the cost of added latency).

Effects like chorus, flanger and vibrato all work with a modulated delay circuit that effectively modulates the phase of a signal to produce the effect (the delay time of the delay circuit is modulated).
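The modulated-delay idea behind these effects can be sketched as a toy chorus in plain Python (parameter names and values here are my own illustrative choices, not any particular plugin's design):

```python
import math

def chorus(samples, rate_hz=0.5, depth_samples=20, base_delay=30, sr=48000):
    """Toy modulated delay: mix each sample with a copy whose delay sweeps with an LFO."""
    out = []
    for n, x in enumerate(samples):
        lfo = math.sin(2 * math.pi * rate_hz * n / sr)  # slow low-frequency sweep
        delay = int(base_delay + depth_samples * lfo)   # time-varying delay in samples
        delayed = samples[n - delay] if n - delay >= 0 else 0.0
        out.append(0.5 * (x + delayed))                 # equal dry/wet mix
    return out
```

As the delay time sweeps, the phase relationship between the dry and delayed copies continuously changes, producing the moving comb-filter character these effects share.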

Call To Action!

Select a mono audio track in one of your sessions. Duplicate it, and flip the polarity of the copy. Note what happens to the output (it's completely cancelled).

Flip the polarity of the copy back to the original, and instead nudge it forward so that the initial positive part of the transient now lines up with the initial negative part of the transient. Note that the sound changes versus the completely-in-phase duplication.

Leave A Comment

Have any thoughts, questions or concerns? I invite you to add them to the comment section at the bottom of the page! I'd love to hear your insights and inquiries and will do my best to add to the conversation. Thanks!
