Luke Graves: Outside of traditional sounds and techniques, how does MPE fit into emerging sonic landscapes and new methods of digital music production?
Mike Butera: A lot of synthesizers, like modular synths, are built around knobs – some of those knobs control just one note, and others are universal. With MPE, you have the choice of whether you want a certain control to turn a universal knob, or whether you want it to be as if every note has its own knob (and to build a modular synth that big would be really expensive).
Then you have apps like Moog’s Model 15. You can just turn on MPE and suddenly it’s as if you have eight different modular setups all duplicated. But you don't have to mirror everything eight times; you do it once, and then every note you play gets that kind of control.
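[Editor's note: the "every note gets its own knob" idea above can be sketched in code. This is an illustrative toy, not a real MIDI library: MPE reserves one master channel for universal controls and rotates each sounding note onto its own member channel, so per-note messages affect only that note. The class and message format here are hypothetical.]

```python
# Hypothetical sketch of MPE-style channel allocation (not a real MIDI API).
# Channel 1 is the master channel for universal ("one big knob") controls;
# each sounding note is assigned its own member channel (2-16), so a pitch
# bend sent on that channel moves only that note.

class MPEChannelAllocator:
    MASTER_CHANNEL = 1                   # universal controls live here
    MEMBER_CHANNELS = range(2, 17)       # per-note channels

    def __init__(self):
        self.note_to_channel = {}        # currently sounding notes
        self.free = list(self.MEMBER_CHANNELS)

    def note_on(self, note):
        """Give the new note its own channel – its own 'knob'."""
        channel = self.free.pop(0)
        self.note_to_channel[note] = channel
        return channel

    def note_off(self, note):
        """Release the note's channel back to the pool."""
        channel = self.note_to_channel.pop(note)
        self.free.append(channel)
        return channel

    def per_note_bend(self, note, amount):
        """A bend scoped to one note's channel – only that note moves."""
        return ("pitch_bend", self.note_to_channel[note], amount)

    def global_bend(self, amount):
        """A universal control on the master channel – affects every note."""
        return ("pitch_bend", self.MASTER_CHANNEL, amount)
```

For example, with two notes held, `per_note_bend(64, 500)` targets only the second note's channel, while `global_bend(500)` goes out on channel 1 and sweeps everything at once – the same choice Butera describes between a universal knob and a knob per note.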
Also, it's been great to see Apple jumping on board with this – they've really been a pioneer with this stuff, and most of the synths in Logic and GarageBand are MPE-compatible. It's really exciting to think about what the synthesizer can do, but now also what you can do to the synth with your hands. Going beyond keyboards and pads and thinking about strings and other dimensions of expression like tilting – really customizing your expression, not just customizing the sound in the box. I think that's the big difference.
Luke Graves: Where do MPE, MIDI, and digital music-making go next?
Mike Butera: We're at a kind of breaking point between traditional and electronic instruments, and there's a question of how much people want to emulate the experience of analog or physics-based instruments within the electronic realm, where you can do whatever you want.
One of the central questions for us is "How important is the physicality?" The actual ergonomics of expression versus just the capability of making sound. Because at this point, if you have an iPhone you can make any sound you want. But regarding MPE and electronic instruments from an interface point of view, there's this question of human-centered design: what does your body want to do to make music?
Where it goes next, we're already seeing some early examples of virtual reality and mixed reality, which are rethinking space. You know, waving your arms in the air and making sound. Now you have virtual instruments that you can interact with in embodied but not physical ways. I think creating interfaces that can match people's ideas of what expression looks like – that's the challenge. And what we're trying to do is create an instrument that captures all of those things. MPE, synths, everything is circling around to enable that kind of flexibility. Now the question is, "What do people actually want to do?"