MIDI has become so commonplace in music recording and production that it can sometimes be confused with audio information. Though the two are certainly related, MIDI and audio are indeed two separate subjects and should be understood as such.
What are the differences between audio and MIDI? Audio is a representation of sound. It is stored mechanically, magnetically or digitally, and its signals travel digitally or as AC electricity. MIDI is a digital interface that stores performance data (notation, pitch, volume and velocity) for sampled or synthesized instruments and contains no audio information itself.
In this article, we’ll discuss audio and MIDI in greater detail to better understand the difference between these two essential elements of audio production.
What Is Audio?
Audio is most simply described as electrical energy (active or potential) that represents sound. Though sound has been omnipresent in the Earth’s atmosphere since the beginning of time, audio was only conceived and realized in 1876 with the invention of the telephone.
Since this time, audio technology has evolved rather significantly. Recording and playback of audio were first done via mechanical means, starting with the phonograph cylinder (invented in 1877). Electrical audio recording and what would become known as analog audio came about in the 1920s. Digital audio made its debut in the 1970s.
Today, we break audio down into two main camps:
- Analog audio: represents sound as an electric AC voltage (whether active or potential)
- Digital audio: represents sound as a series of binary numbers
Audio frequencies, like sound, are in the audible range (for humans) of 20 Hz – 20,000 Hz. Unlike sound, there are no “infra” or “ultra” audio frequencies.
An audio waveform is shaped the same as the sound waves it represents.
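To illustrate what "a series of binary numbers" means in practice, here is a minimal Python sketch of digital audio: it samples a 440 Hz sine wave at the CD sample rate and quantizes the values to 16-bit integers, roughly as an ADC would. The constants are illustrative assumptions, not a real recording chain.

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD quality)
FREQUENCY = 440.0    # A4, a common tuning reference
DURATION = 0.001     # one millisecond, just to keep the list short

# Digital audio stores the waveform as numbers: the amplitude of the
# sound wave, measured SAMPLE_RATE times per second.
samples = [
    math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
    for n in range(int(SAMPLE_RATE * DURATION))
]

# Quantize each amplitude to a 16-bit integer, as an uncompressed
# WAV file would store it.
pcm = [round(s * 32767) for s in samples]
print(pcm[:5])
```

Play those numbers back through a DAC and a speaker at the same rate, and you get the original tone back as sound.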
Audio can be recorded with microphones, which are transducers that convert sound waves into analog audio signals (AC electrical signals). Analog audio can also be recorded via synthesizers and even digital sources so long as there is a digital-to-analog converter (DAC) between the digital audio device and the analog recording device.
To learn more about how microphones work, check out my article How Do Microphones Work? (The Ultimate Illustrated Guide).
Digital audio can be recorded in DAWs or digital recorders. It’s often the case that analog audio (especially from microphones) is converted to digital via an ADC for this to happen. However, digital instruments, which can often be programmed to perform with MIDI information, are also commonplace in digital audio recording.
Audio is played back and monitored using speakers. There are plenty of speaker types (including headphones) available that allow us to hear audio as sound. Note that speakers are transducers that convert electrical energy (specifically analog audio) into mechanical wave energy (sound waves). Therefore, digital audio must be converted to analog audio before it’s ever heard by our ears and brains.
For more information about how speakers and headphones work, check out my article How Do Speakers & Headphones Work As Transducers?
To relate audio and MIDI, then, we can state that audio files can be recorded from MIDI information.
To learn more about audio and its direct relationship to sound, check out my article What Is The Difference Between Sound And Audio?
What Is MIDI?
MIDI (Musical Instrument Digital Interface) is much newer than audio, first introduced in 1981 and standardized by 1983. MIDI is a technical standard for connections, communication, and general interfacing between electronic musical instruments, computers and other related digital audio devices/software.
MIDI includes notation, pitch, volume, and velocity information. MIDI files contain this MIDI data, which can be used to control MIDI-compatible hardware electronic musical instruments and virtual instruments. This impactful technical innovation effectively controls the sequencing of musical notes for MIDI-compatible instruments.
MIDI does not contain any inherent audio information and never transmits audio signals. However, unlike audio files/signals, which only represent sound, MIDI is much more flexible. MIDI files can be used to control a variety of instruments; can be altered in tempo, notation, velocity and more; and take up much less space than audio files.
MIDI appears in the piano rolls of digital audio workstations, where it triggers software instruments. It is also used to play compatible hardware and software instruments and samplers in real-time via MIDI controllers.
One stream of MIDI data has a total of 16 independent channels for messages and events.
MIDI messages are either System messages or Channel messages:
- Channel messages carry one or two data bytes of information, such as:
- Note On and Off: which notes are depressed and released along with each note’s velocity
- Aftertouch: the pressure applied to a key after it’s depressed (often used to control vibrato, volume or other parameters)
- Control Change: changes a parameter value on the device
- Program Change: changes the patch number on the device
- Channel Pressure: the largest pressure value across all depressed keys
- Pitch Bend Change: change in the pitch bend wheel/lever
- System messages control essential data to all devices included in the MIDI network, such as:
- Timing clock: to synchronize all devices with the master clock
- Transport: to tell the device(s) to start, stop, or continue
- System exclusive (sysex): allows for manufacturer-specific messages
A MIDI event is simply a MIDI message that occurs at a specific time. Events are essential for MIDI sequencers and music production.
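To make the channel/data-byte structure concrete, here is a minimal Python sketch of how raw Channel messages are packed into bytes. The helper function names are my own for illustration; the byte layout (a status byte whose high nibble is the message type and low nibble is the channel, followed by one or two data bytes) follows the MIDI 1.0 standard.

```python
# Status byte high nibbles for two common Channel messages.
NOTE_ON = 0x90
NOTE_OFF = 0x80

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a raw Note On message (channel 0-15, note/velocity 0-127)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    # Status byte: message type nibble OR'd with the channel nibble,
    # then two 7-bit data bytes (note number and velocity).
    return bytes([NOTE_ON | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Build a raw Note Off message (release velocity set to 0 here)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127
    return bytes([NOTE_OFF | channel, note, 0])

# Middle C (note 60) at velocity 100 on channel 1 (index 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # -> '903c64'
```

Note how small this is: a whole played note is just three bytes on the way in and three more on the way out, which is a big part of why MIDI files are so compact.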
Though we’re only just scratching the surface of the technical details of MIDI, there’s still a lot of information in the above paragraphs. Fortunately for us, the MIDI standard takes care of all the interface details so we can focus on what matters most: creating music.
Using MIDI, we can easily change from instrument to instrument to audition new sounds without committing to any printed audio. We can also quickly change the pre-performed or pre-programmed MIDI notation to change speed, velocity, musical key and more. It’s truly a powerful tool for creating music.
What Are The Differences Between Audio & MIDI?
Though audio and MIDI are definitely related, we can tell from the paragraphs above that we’re not exactly comparing apples with apples, so to speak. Though the differences between the two were largely discussed in the previous sections, let’s try to list out key differences for the fun of it.
Audio was first conceived in 1876. MIDI was invented in 1981.
Audio can be digital or analog. MIDI is only digital, hence the name.
Audio is a direct representation of sound. MIDI has no inherent audio and is rather a method of controlling compatible instruments to produce audio and, ultimately, sound.
Audio can be manipulated by altering its waveform but can be difficult to change drastically without negative consequences (artifacts, distortion, etc.). MIDI data can be changed easily, and the new information will trigger compatible instruments with the same quality.
Audio files require less CPU to play back. MIDI files, which trigger instruments, require more CPU because the connected virtual instruments must synthesize their sound in real-time.
Given the same length, digital audio files are larger than MIDI files.
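Some back-of-the-envelope arithmetic shows just how large that size gap is. The note count and bytes-per-note figures below are rough assumptions (real MIDI files also store timing and header bytes), but the orders of magnitude hold:

```python
# One minute of uncompressed CD-quality audio:
sample_rate = 44100        # samples per second
bit_depth = 16             # bits per sample
channels = 2               # stereo
seconds = 60

audio_bytes = sample_rate * (bit_depth // 8) * channels * seconds
print(f"Audio: {audio_bytes / 1_000_000:.1f} MB")  # roughly 10.6 MB

# The same minute as MIDI: assume 120 BPM in 4/4 (30 bars per minute),
# 8 notes per bar, Note On + Note Off = 6 bytes per played note.
beats_per_minute = 120
bars = beats_per_minute / 4
notes = int(bars * 8)
midi_bytes = notes * 6
print(f"MIDI:  {midi_bytes} bytes")  # well under 2 KB
```

Even after adding timing data and headers, the MIDI version of that minute stays in the kilobyte range while the audio version runs to megabytes.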
Let’s investigate and expand on these last few points as we discuss the advantages audio has over MIDI and, conversely, the advantages MIDI has over audio.
Advantages Of Audio Over MIDI
Once a sound has been recorded and converted to audio, playback becomes much easier and more fluid. Unlike MIDI, audio requires little processing power, which is essential during production and mixing playback. Producers and mixing engineers prefer audio because it is not CPU intensive, allowing for a smoother, uninterrupted workflow.
Audio is easy to manipulate, especially during sampling. Producers can change the pitch and speed of audio files easily with the tools built into most DAWs and samplers. Producers and mixing engineers can chop bits of audio files and arrange them in various orders to improve a production or use them as effects.
A mix engineer or producer does not require extensive musical knowledge when chopping, stretching, or manipulating audio files. Many bedroom producers with little formal music theory manage to produce great beats without knowing the key or mode of their samples. Working with audio is straightforward, especially during production.
Audio is not device-dependent. Regardless of the playback type, audio remains the same, unlike MIDI, which is device-dependent and may alter the sound on various platforms. For example, the same MIDI file controlling ‘Synth Preset A’ will sound different than when it controls ‘Synth Preset B’.
Audio’s permanent nature means that producers and engineers can share files for further processing without worrying about device dependency, like having the correct virtual instruments and sound libraries.
Finally, MIDI has a tendency to mis-trigger when bouncing larger projects with lots of MIDI information. When working in a digital audio workstation, I always suggest converting all MIDI instruments to audio before bouncing the project to disk.
Audio is much more stable when bouncing. Relying on the DAW to trigger each and every MIDI note during a bounce down could lead to important information being left out due to improper triggering or even non-triggering of MIDI notes.
Advantages Of MIDI Over Audio
One of the biggest advantages of MIDI files over audio files is their small size. When sharing songs, audio files occupy large amounts of space, making them harder to send over the internet. Sharing production files as MIDI is therefore more practical because the files are far smaller and more compact.
Note that the collaborators, in this case, need the same virtual instruments, presets and/or sample libraries to obtain the same audible results from the MIDI.
MIDI files are device-dependent, but this cuts both ways: it also makes them highly flexible. Producers can audition different instruments before committing to any particular sound.
Switching from a synth to a violin, for example, is made super-easy by switching the virtual instrument the MIDI is controlling. This task would require re-tracking (along with knowing how to play both instruments or hiring different musicians) if we only had audio to work with.
MIDI files provide producers with flexibility, which is very important when selecting the right sounds.
MIDI files are highly editable. It is easy to return to a project, change a few MIDI notes, and improve the melody or harmony. Again, this is unlike audio files, which would require pitch-manipulation software (which tends to sound robotic and unnatural) even to attempt to change the performance.
MIDI files allow users to change the notes, speed and velocities to really home in on the perfect performance.
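As a sketch of how those edits work under the hood, here is a toy Python example using a simplified note list (not an actual MIDI file format). Transposing is just shifting note numbers; softening a performance is just scaling velocities:

```python
# A toy MIDI-style note list: (start_beat, note_number, velocity).
# This is a simplified sketch, not a real MIDI file structure.
melody = [(0.0, 60, 96), (1.0, 64, 90), (2.0, 67, 100)]  # C, E, G

def transpose(notes, semitones):
    """Shift every note up or down: effectively a key change."""
    return [(t, n + semitones, v) for t, n, v in notes]

def scale_velocity(notes, factor):
    """Soften or harden the performance without re-recording it."""
    return [(t, n, min(127, round(v * factor))) for t, n, v in notes]

# Move the melody up a whole tone (C major to D major) and soften it:
new_melody = scale_velocity(transpose(melody, 2), 0.8)
print(new_melody)  # [(0.0, 62, 77), (1.0, 66, 72), (2.0, 69, 80)]
```

Doing the equivalent to an audio recording would mean pitch-shifting and re-balancing the actual waveform, with all the artifacts that brings.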
As a personal example from my work as a composer, MIDI is a huge time saver. I’ve had clients come back wanting key and tempo changes without timbre or instrumental changes.
Had I recorded audio myself or with hired musicians, I would have had to recreate the recording environment perfectly and re-hire the musicians to perform the same piece at a different tempo and in a different key.
With MIDI, however, I simply changed the tempo within my DAW, selected the MIDI for all non-percussion virtual instruments and changed the key. What would have taken a week took 15 minutes and saved hundreds if not thousands of dollars.
Of course, I bounced each instrument’s MIDI in place as audio (I use Logic Pro) before bouncing the project for the final product.
Audio & MIDI Together
Many musicians and music producers work with both MIDI files and audio files. MIDI files and audio files are part of the music creation process, and producers use them jointly to create fantastic music. Of course, the final product of a song is audio, though some producers market MIDI packs for other musicians to incorporate into their songs.
MIDI, as discussed, is perfect for auditioning various virtual instruments or samples before committing to any particular element of the mix. We can easily audition and fine-tune these instruments, along with the MIDI information itself, to achieve the desired performance.
Speaking of samples, audio files can be sampled into a sampler and then controlled via MIDI information. How’s that for using audio and MIDI together?
Though MIDI is great for production, mixing engineers generally prefer working with audio files. For one, audio files are not device-dependent; audio engineers can use them on any platform without requiring the specific MIDI instruments needed to reproduce the sound as the musician originally intended.
Audio files also offer mixing and mastering engineers an opportunity to work seamlessly without CPU overloads. Engineers prefer working with stable file formats that allow them to focus on the process, resulting in increased productivity.
MIDI files and audio files are all essential parts of the music creation process. There are plenty of creative ways of working with both MIDI and audio files in music projects, resulting in a more enjoyable process and better overall productions.
What are the best digital audio workstations? The top 7 best digital audio workstations in the world, in my opinion, are:
- Avid Pro Tools
- PreSonus Studio One
- Apple Logic Pro
- Ableton Live
- Image-Line FL Studio
- Steinberg Cubase
- Cockos Reaper
Continue reading: Top 7 Best Digital Audio Workstations (DAWs) On The Market
What are the best virtual instrument brands? The top 11 best virtual/software instruments plugin brands, overall, on the market today are:
- Native Instruments
- Spitfire Audio
- IK Multimedia
- Vienna Symphonic Library
Continue reading: Top 11 Best Virtual/Software Instrument Plugin Brands