What's the Deal with MPE? (MIDI Polyphonic Expression)


Community Manager Luke Graves recently sat down with Artiphon's inventor and founder, Dr. Mike Butera, for a brief conversation on MPE (MIDI Polyphonic Expression) and the future of music-making. For additional reading, check out the MIDI Manufacturers Association's MPE press release on midi.org.



Luke Graves: To start, could you explain what MPE is and a bit of the history behind it?

Mike Butera: The original MIDI standard was developed and launched in 1983, and the creators' goal was to allow for a general musical language between electronic devices. They imagined it would be used for many different types of instruments; the keyboard, of course, was the dominant one, but drum machines and grid-type instruments, as well as string-like and wind-like instruments, were all possibilities.

One of the big debates they had at the time was: "Should electronic instruments take advantage of the full range of continuous expression that was possible? How many parameters should we give the musician to control versus automating them?" On a keyboard, like a Moog synth, you can set up whatever sound you want, but when you press the key, it just does that one thing.

So a few years ago, a group of people in the MIDI community came together and said: "We want to figure out a way within the current MIDI spec to have more expression per note." This wasn't necessarily driven by strings, but it was driven by this desire to think in terms of giving players more freedom to modify parts of a synthesizer or sampler beyond the screen or knobs. To make it more performative rather than automated.

The goal with what is now MPE (or MIDI Polyphonic Expression) is that every note is going to have a whole channel’s worth of information.

The spec also means that, with the right controller – like an Artiphon INSTRUMENT 1 – you can have, for every note that you play, all the different controls for just that one note. Every note that goes in is going to have its own channel, and that means it's going to have its own pitch, its own velocity, and any other control that you want.

Now, you have all of this information available and all this expression that previously hadn’t been possible.
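
To make that "whole channel per note" idea concrete, here is a minimal Python sketch of how an MPE controller might hand out channels in MPE's lower zone (channel 1 for zone-wide messages, channels 2–16 for individual notes). The MPENoteAllocator class and its method names are made up for illustration, not taken from any library; it builds raw MIDI bytes, so it runs without hardware.

```python
# Minimal sketch of MPE's per-note channel idea (lower zone): channel 1
# carries zone-wide messages, channels 2-16 each carry one sounding note.
# Channels are 0-indexed here, as they are in raw MIDI status bytes.

class MPENoteAllocator:
    MASTER = 0                          # channel 1: zone-wide (global) messages

    def __init__(self):
        self.note_to_channel = {}       # active note -> its member channel
        self.free = list(range(1, 16))  # channels 2-16, one note each
                                        # (sketch ignores voice stealing)

    def note_on(self, note, velocity):
        channel = self.free.pop(0)      # this note gets a channel to itself
        self.note_to_channel[note] = channel
        return bytes([0x90 | channel, note, velocity])

    def pitch_bend(self, note, bend):
        """Bend one note while the others stay put (0..16383, 8192 = center)."""
        channel = self.note_to_channel[note]
        return bytes([0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F])

    def pressure(self, note, amount):
        """Per-note pressure: channel aftertouch on the note's own channel."""
        channel = self.note_to_channel[note]
        return bytes([0xD0 | channel, amount])

    def note_off(self, note):
        channel = self.note_to_channel.pop(note)
        self.free.append(channel)       # channel is free for the next note
        return bytes([0x80 | channel, note, 0])


# Two notes sound together, but only the second one bends:
alloc = MPENoteAllocator()
on_c = alloc.note_on(60, 100)        # C4 lands on channel 2
on_e = alloc.note_on(64, 100)        # E4 lands on channel 3
bend = alloc.pitch_bend(64, 12288)   # bends E4 only; C4 is untouched
```

On a conventional single-channel setup, that last pitch-bend message would drag every sounding note with it; in the MPE sketch it travels on E4's channel alone.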

Luke Graves: Are there other configurations that people have attempted to create over the years?

Mike Butera: MIDI HD was in development for a decade, but it wasn't necessarily backward-compatible and never got full support. It was just going to be a newer, higher-resolution MIDI. MPE allows for higher resolutions, but it's more about saying, "We're going to have all these different controls per note rather than channel-wide."

From our angle, what's really exciting about MPE is that it lets the Artiphon INSTRUMENT 1 provide string-like expression that just has never been done before. You get pressure sensitivity, pitch bend, and tilt – string-bending and bow expression – all at the same time, in a way that lets string players (guitarists, bassists, banjo players, violinists, etc.) use their muscle memory. That's really exciting.

Finally, we've achieved a level of string-like expression that is intuitive, bringing string players into a fully digital space.

Luke Graves: What are some practical applications for MPE?

Mike Butera: Think about a normal guitar wah pedal. Any sound that comes out of the guitar goes through this one pedal. Same with distortion, same with delay or reverb. Now you can have all of those effects happen separately for each note. You can imagine, for a Hendrix solo, keeping some strings without wah and having them drone, and then riding on top of that with a pressure-sensitive wah effect. Or a violin or cello technique where the more you press on the bridge, the bigger the resonance or the reverb is going to be. You can do a really light touch and keep it really close up, and then as you press harder, you're actually opening up the sound space.
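
As a rough sketch of that per-note wah, the snippet below maps each note's pressure to CC 74 (the controller MPE synths conventionally read as timbre/brightness) on the note's own channel, so one note can open up under your finger while another keeps droning. The function name and the cutoff range are made up for the example.

```python
# Hypothetical per-note "wah": scale a note's pressure (0-127) into a
# CC 74 (timbre/brightness) message on that note's own channel.

def pressure_to_wah(channel, pressure, lo=20, hi=120):
    value = lo + (hi - lo) * pressure // 127   # light touch stays closed
    return bytes([0xB0 | channel, 74, value])  # control change on this channel

assert pressure_to_wah(1, 0)   == bytes([0xB1, 74, 20])   # barely open
assert pressure_to_wah(1, 127) == bytes([0xB1, 74, 120])  # wide open
```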

The more literal stuff with string bending includes vibrato as well. Violinists spend years trying to figure out how to get their hand to move that way; the INSTRUMENT 1 actually lets you do real vibrato on the string, and the MIDI registers the minute detail. There are so many string-like behaviors and gestures, and we've built an instrument that's flexible enough to capture all of them.
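
In the vibrato case, that minute detail amounts to a steady stream of small pitch-bend messages on the note's own channel. A toy generator, with illustrative rate and depth values, could look like this:

```python
import math

# Toy per-note vibrato: a sine of pitch-bend messages on one member
# channel. depth is in 14-bit pitch-bend units (8192 = center, no bend).

def vibrato_bends(channel, rate_hz=5.0, depth=600, updates_per_sec=100, seconds=1.0):
    for i in range(int(seconds * updates_per_sec)):
        t = i / updates_per_sec
        bend = 8192 + int(depth * math.sin(2 * math.pi * rate_hz * t))
        yield bytes([0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F])

# One second of 5 Hz vibrato on channel 2, leaving every other note steady:
messages = list(vibrato_bends(channel=1))
```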

Luke Graves: Outside of traditional sounds and techniques, how does MPE fit into emerging sonic landscapes and new methods of digital music production?

Mike Butera: A lot of synthesizers, like modular synths, are based around knobs, and some of those knobs control just one note while others are universal. With MPE, you have the choice of whether you want a certain control to turn a universal knob, or whether you want it to be as if every note has its own knob (and to build a modular synth that big would be really expensive).

Then you have apps like Moog’s Model 15. You can just turn on MPE and suddenly it’s as if you have eight different modular setups all duplicated. But you don't have to mirror everything eight times; you do it once, and then every note you play gets that kind of control.

Also, it's been great to see Apple jumping on board with this – they've really been a pioneer with this stuff, and most of the synths in Logic and GarageBand are MPE-compatible.

It's really exciting to think about what the synthesizer can do, but now also what you can do to the synth with your hands. Going beyond keyboards and pads and thinking about strings and other dimensions of expression like tilting – really customizing your expression, not just customizing the sound in the box. I think that's the big difference.

Luke Graves: Where do MPE, MIDI, and digital music-making go next?

Mike Butera: We're at a kind of breaking point between traditional and electronic instruments, and there's a question of how much people want to emulate the experience of analog or physics-based instruments within the electronic realm, where you can do whatever you want.

One of the central questions for us is "How important is the physicality?" The actual ergonomics of expression versus just the capability of making sound. Because at this point, if you have an iPhone, you can make any sound you want. But regarding MPE and electronic instruments from an interface point of view, there's this question of human-centered design: what does your body want to do to make music?

Where it goes next, we're already seeing some early examples of virtual reality and mixed reality, which are rethinking space. You know, waving your arms in the air and making sound. Now you have virtual instruments that you can interact with in embodied but not physical ways. I think creating interfaces that can match people's ideas of what expression looks like – that's the challenge. And what we're trying to do is create an instrument that captures all of those things. MPE, synths, everything is circling around to enable that kind of flexibility. Now the question is, "What do people actually want to do?"
