Chowning, Stanford, in 1966), cmusic (created by F. Richard Moore, University of California
San Diego in the 1990s), and pcmusic (also created by F. Richard Moore).
In the early 1980s, led by Dave Smith from Sequential Circuits and Ikutaro Kakehashi
from Roland, a group of the major synthesizer manufacturers decided that it was in their mutual
interest to find a common language for their devices. Their collaboration resulted in the 1983
release of the MIDI 1.0 Detailed Specification. The original document defined only basic
instructions, things like how to play notes and control volume. Later revisions added messages
for greater control of synthesizers and branched out to messages controlling stage lighting.
General MIDI (1991) attempted to standardize the association between program numbers and
the instruments synthesized. Over time, the standard also expanded to new connection types
(e.g., USB, FireWire, and wireless) and to new platforms such as mobile phones and video games.
This short history lays the groundwork for the two main topics covered in this chapter:
the symbolic encoding of music and sound information (in particular, MIDI) and how this
encoding is translated into sound by digital sound synthesis. We begin with a definition of MIDI
and an explanation of how it differs from digital audio, after which we take a closer look at
how MIDI commands are interpreted via sound synthesis.
6.1.2 MIDI Components
MIDI (Musical Instrument Digital Interface) is a term that actually refers to a number of things:
- A symbolic language of event-based messages frequently used to represent music
- A standard interpretation of messages, including what instrument sounds and notes are intended upon playback (although the messages can be interpreted to mean other things, at the user's discretion)
- A type of physical connection between one digital device and another
- Input and output ports that accommodate the MIDI connections, translating back and forth between digital data and electrical voltages according to the MIDI protocol
- A transmission protocol that specifies the order and type of data to be transferred from one digital device to another
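To make the "event-based messages" and "transmission protocol" points concrete, here is a minimal sketch in Python of how a single Note On event is laid out as bytes. The byte layout comes from the MIDI 1.0 specification; the helper function itself is ours, for illustration only:

def note_on(channel, key, velocity):
    """Encode a MIDI 1.0 Note On message as three bytes.

    Status byte: 0x90 ORed with the channel number (0-15);
    data bytes:  key number and velocity, each in the range 0-127.
    """
    assert 0 <= channel <= 15 and 0 <= key <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, key, velocity])

# Middle C (key 60) played at moderate velocity (64) on channel 0:
print(note_on(0, 60, 64).hex())  # -> 903c40

Every channel message follows this pattern: one status byte identifying the event type and channel, followed by one or two data bytes, which is what makes MIDI a compact symbolic encoding rather than a recording of sound.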
Let’s look at all of these associations in the context of a simple real-world example. (Refer to
the Preface for an overview of your DAW and MIDI setup.) A setup for recording and editing
MIDI on a computer commonly has these five components:
- A means to input MIDI messages: a MIDI input device, such as a MIDI keyboard or MIDI controller. This could be something that looks like a piano keyboard, except that it doesn't generate sound itself. MIDI keyboards often have controller functions as well, such as knobs, faders, and buttons, as shown in Figure 1.5 in Chapter 1. It's also possible to use your computer keyboard as an input device if you don't have any other controller. The MIDI input program on your computer may give you an interface on the screen that looks like a piano keyboard, as shown in Figure 6.2. (A sketch of reading messages from such a device in software appears below.)
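As one illustration of the input side, here is a minimal sketch in Python, assuming the third-party mido library (not part of the chapter's toolset) together with a backend such as python-rtmidi, that opens a MIDI input port and prints each key press received from a controller:

# Requires: pip install mido python-rtmidi
import mido

# Show which MIDI input ports the operating system currently exposes.
print(mido.get_input_names())

# Open the default input port and print key presses as they arrive.
with mido.open_input() as port:
    for msg in port:  # blocks, yielding one message per MIDI event
        # Some keyboards signal "key released" as note_on with velocity 0,
        # so only velocities above 0 are treated as presses here.
        if msg.type == 'note_on' and msg.velocity > 0:
            print(f"key {msg.note} down, velocity {msg.velocity}")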