Multidimensional Polyphonic Expression (MPE) MIDI is an emerging standard that describes how expressive performance controllers convey their intent to synthesizers, sound modules, and of course music software such as DAWs. Current examples of expressive controllers are the LinnStrument, Soundplane, Roli Seaboard Rise, Eigenharp, and Continuum Fingerboard. MPE is not yet an official standard: it is an agreement negotiated amongst the interested parties that has been submitted to the MIDI Manufacturers Association (MMA) for consideration and approval.
Why do we need this? Expressive controllers provide the musician a means to control various aspects of the sound on a note-by-note basis throughout the lifetime of the note. What this means in practice is that you, the performer, can shape each note while the note is playing. One example of an instrument that affords such control is a violin. Contrast that with a basic electric organ, where control is largely static: you press a key and it sounds until you release.
Conventional MIDI controllers already provide some of this with pitch bend, aftertouch, mod wheel, and so on. But these facilities have a major limitation: they apply to all notes currently playing. That works very well for monophonic playing, not so much for polyphonic playing. Pitch bend is particularly problematic because bending notes and adding vibrato does not quite sound right when applied equally to all sustained notes.
MIDI does have two facilities to impart expression on a per-note basis: note velocity and polyphonic aftertouch. Note velocity is constant throughout the life of the note, or at least until the note is released, when a second velocity can be specified; that feature is rarely supported or used. Polyphonic aftertouch does fit the bill, but historically it has been an exotic capability rarely found in keyboard controllers, and many synthesizers do not fully support it. Furthermore, a number of popular DAWs, like Ableton Live and Reason, do not support it.
Which brings us back to MPE. MPE builds upon MIDI to facilitate communicating note-by-note expressive control. It's not really a true addition to MIDI but rather a way to use MIDI's existing commands to accomplish this goal. In fact the central concept behind MPE is not even new. It has been in use for guitar synthesizers and controllers since the dawn of MIDI, and the original MIDI specification contained an explicit mode just for that purpose. Furthermore, Haken Audio's Continuum has used MIDI in a way substantially like MPE since 2001.
Handling pitch, including note bends, is a primary goal of MPE. MIDI uses two separate command types to control pitch. The first, the NoteOn and NoteOff commands, sets pitch for the lifetime of the note. The second is Pitchbend. Pitchbend, though, is global: all currently sounding notes are changed by the same degree.
Guitar synthesizers and controllers faced this same problem. Their solution was to assign each string to its own MIDI channel. So instead of a guitar controller playing on MIDI channel 1, it would play on MIDI channels 1 through 6, one channel per string. Assigning each note to its own monophonic MIDI channel means that the full capabilities of MIDI can be utilized just for that one note. Pitchbend for one channel (note) does not affect another channel. Each note can have its own ModWheel, its own aftertouch, even other parameters assigned to MIDI Continuous Control (CC) commands. Imagine controlling important synthesis parameters separately with each finger!
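To make the channel-per-string idea concrete, here is a minimal sketch that builds raw MIDI bytes. The helper names are hypothetical; the status bytes (0x90 for NoteOn, 0xE0 for Pitchbend, with the channel in the low nibble) are standard MIDI.

```python
# Sketch: one MIDI channel per guitar string (hypothetical helpers, not a real API).
# MIDI status bytes: 0x90 = NoteOn, 0xE0 = Pitchbend; low nibble = channel (0-15).

def note_on(string, note, velocity):
    """Build a NoteOn for one string; strings 1-6 map to channels 1-6 (0-indexed)."""
    channel = string - 1
    return bytes([0x90 | channel, note, velocity])

def pitch_bend(string, bend14):
    """Build a Pitchbend (14-bit value, 0x2000 = center) for one string only."""
    channel = string - 1
    return bytes([0xE0 | channel, bend14 & 0x7F, (bend14 >> 7) & 0x7F])

# Bending string 3 leaves strings 1, 2, 4, 5, and 6 untouched,
# because the bend message carries string 3's channel in its status byte:
msg = pitch_bend(3, 0x2000 + 1024)
```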
Buried in the original MIDI 1.0 specification is a channel mode, Mode 4. When a controller sends notes to a sound module while in Mode 4, notes are each sent on their own channel: the first on the lowest channel, the second on the next, and so on. MPE does not explicitly use Mode 4, but instead defines a new interpretation of one of the other modes, Mode 3. Mode 3 means "polyphonic operation on a single channel" and is historically the default mode for many synthesizers and sound modules. MPE, though, does not use it in this fashion. Instead it uses it much like Mode 4: each note event on its own MIDI channel.
The following diagram illustrates how three note events, plus some note expression, are output by an MPE controller:
Looking at this from the perspective of a string instrument, like a guitar, this is intuitive: one channel per string. But what about a keyboard instrument? Instead of thinking about strings or notes, think instead about touch events. Each touch event starts a note, and the subsequent actions of the finger responsible for that touch are interpreted as expressive gestures. For example, how deeply the touch presses may represent note volume, the finger's left-to-right position the pitch, and its front-to-back position some timbral aspect.
Each touch event produces a stream of MIDI information that describes that event over the touch's lifetime. All of that MIDI data is sent on the MIDI channel assigned to that touch event: the first note event to the first channel, the second to the second channel, and so on. If there are fewer notes than available channels then each note has exclusive use of its channel, so pitch bend commands and MIDI CC commands can alter the sound of that note without affecting other playing notes.
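The assignment scheme just described can be sketched as a small allocator. This is an illustration of the "first note to the first free channel" behavior, not code from any MPE implementation; the class and its names are invented for this example.

```python
# Sketch: assign each new touch the lowest free note channel, as described above.
# Assumes the default layout: note channels 2-16, master channel 1 excluded.

class ChannelAllocator:
    def __init__(self, first=2, last=16):
        self.free = list(range(first, last + 1))
        self.held = {}                      # touch id -> assigned channel

    def note_on(self, touch_id):
        channel = self.free.pop(0)          # lowest available channel
        self.held[touch_id] = channel
        return channel

    def note_off(self, touch_id):
        channel = self.held.pop(touch_id)   # return the channel to the pool
        self.free.append(channel)
        self.free.sort()

alloc = ChannelAllocator()
a = alloc.note_on("touch-A")    # first touch gets channel 2
b = alloc.note_on("touch-B")    # second touch gets channel 3
alloc.note_off("touch-A")
c = alloc.note_on("touch-C")    # freed channel 2 is reused
```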
This is the normal way MPE works. There is even a way for an MPE controller to specify its assumed polyphony to the sound module. If there are more simultaneous note events than channels, then an MPE controller is expected to send the extra notes on existing channels without forcing sounding notes off. In this case pitch bend and CCs no longer affect only a single note but all notes assigned to that channel. Frankly, this situation is not likely to arise when playing an MPE controller, but it could well happen when playing back multiple sequencer tracks containing MPE data.
MPE has the concept of a "zone master channel". This channel is used to send MIDI commands that affect the overall operation of all currently sounding notes. For example, it is possible to have a "global pitch bend" control that would bend all sounding notes. How this works in conjunction with individual note bends is a decision left to the sound engine. MPE supports multiple zones (think of these like keyboard splits), and each has its own zone master channel. The zone master channel sets the base of the zone, with the first note channel one above it. The default setup has a single zone covering all channels, with the zone master on channel 1 and the first note channel set to channel 2. So the maximum polyphony for independently controllable note events is fifteen.
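The arithmetic behind that layout is simple enough to show directly. A small sketch (the helper is hypothetical, derived only from the layout described above):

```python
# Sketch: derive a zone's channels from its master channel.
# Per the layout above, note channels start one above the master.

def zone_channels(master=1, size=15):
    """Return (master channel, list of note channels) for a zone."""
    first_note = master + 1
    return master, list(range(first_note, first_note + size))

# The default single-zone setup: master on 1, notes on 2 through 16.
master, notes = zone_channels()
```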
Limiting the number of simultaneous touch events to fifteen turns out to not be a practical problem for a number of reasons. A single musical part rarely has that number of simultaneous notes held, let alone all expressively played by the performer. There are also technical limitations, like MIDI bandwidth, that come into play before the channel limit is hit.
MPE has additional commands that are used to establish the working relationship between the controller and the sound source it is controlling. These commands specify things like maximum polyphony (how many channels are used), zones, the assumed pitch bend range, and so on. Most controllers, though, wake up in a simple one-zone configuration starting at channel 1, with notes playing starting on channel 2.
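As an example of such a setup command, the MPE proposal defines a configuration message sent as a Registered Parameter Number (RPN 6) on the zone master channel, whose data value is the number of note channels in the zone. A sketch of the raw bytes (the function name is invented; the RPN mechanism is standard MIDI, and RPN 6 is the number assigned by the MPE proposal):

```python
# Sketch: the MPE Configuration Message as three Control Change pairs.
# 0xB0 = Control Change status; channel numbers are 0-based in the raw bytes.

def mpe_config(master_channel, member_channels):
    ch = master_channel - 1
    cc = 0xB0 | ch
    return bytes([cc, 101, 0,                # CC101: RPN MSB = 0
                  cc, 100, 6,                # CC100: RPN LSB = 6 (MPE Configuration)
                  cc, 6, member_channels])   # CC6: Data Entry = zone size

# A zone with master channel 1 and fifteen note channels (2-16):
msg = mpe_config(1, 15)
```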
How exactly are note events (touches) represented? MPE defines three primary dimensions of control: pitch, pressure, and the "third axis" (MPE does not actually name this). MPE controllers are free to define additional per-note controls. How any of these are "mapped" to the controller depends on the controller. For example, the Continuum Fingerboard defines left-to-right position as pitch, downward pressure as pressure, and front-to-back position as the third axis. The LinnStrument and the Roli Seaboard do something similar. But these are just one way of designing an expressive polyphonic controller.
Pitch is sent using a combination of MIDI note commands and MIDI pitch bend commands. The MIDI note establishes the idealized pitch, and pitch bend the variation from that ideal. This allows not only note bending and vibrato, but also "playing in the cracks" between keys.
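In other words, the sounding pitch is the note number plus a fractional offset decoded from the 14-bit bend value. A sketch, assuming a symmetric bend range (MPE commonly defaults to a wide per-note range such as ±48 semitones, though controllers can negotiate a different one):

```python
# Sketch: decode note number + 14-bit pitch bend into a pitch in semitones.
# 0x2000 is the bend center ("no bend"); bend_range is in semitones.

def pitch_in_semitones(note, bend14, bend_range=48.0):
    offset = (bend14 - 0x2000) / 0x2000 * bend_range
    return note + offset

# Approximately a quarter tone above middle C (note 60),
# i.e. "in the cracks" between two keys:
p = pitch_in_semitones(60, 0x2000 + round(0.5 / 48 * 0x2000))
```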
MPE pressure is sent with the MIDI channel aftertouch command. This has only 7-bit values, so there is an optional way to use two MIDI CC commands to send 14-bit pressure information. Since there are very few MPE-compatible sound engines around, the default aftertouch command is most often used.
The third axis is sent using a CC74 message, and like pressure, there is an alternative 14-bit mode using two other MIDI CC commands. Many of you will recognize CC74 as commonly used for a synthesizer filter's cutoff frequency. At first this may seem like an excellent choice, but in the real world of hardware and software synthesizers it can be a problem. Many synthesizers use this MIDI CC to set the absolute value of the filter cutoff, and that is rarely what you want to happen with an expressive controller. This will make many existing synthesizers difficult to use with an MPE controller.
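Putting the three dimensions together, a single touch's ongoing expression is just three kinds of message on that touch's channel. A sketch using standard status bytes (0xE0 Pitchbend, 0xD0 channel aftertouch, 0xB0 Control Change); the function itself is invented for illustration:

```python
# Sketch: one touch's three MPE dimensions as raw MIDI on its own channel.
# The channel argument is 1-based; the status byte's channel nibble is 0-based.

def touch_expression(channel, bend14, pressure7, third7):
    ch = channel - 1
    return (bytes([0xE0 | ch, bend14 & 0x7F, (bend14 >> 7) & 0x7F])  # pitch
            + bytes([0xD0 | ch, pressure7])                          # pressure (aftertouch)
            + bytes([0xB0 | ch, 74, third7]))                        # third axis (CC74)

# Centered pitch, full pressure, mid third-axis, on note channel 2:
msgs = touch_expression(2, 0x2000, 127, 64)
```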
MPE is still evolving, but it's good that it's "out there" and implementations are sprouting up on both the controller and sound engine sides. Some DAWs already embrace it, others kind of work with it, and a few require creative workarounds. It takes a certain pioneering spirit to dive into this at the moment, but there are great rewards for the brave!