I think I'd put that like this, just to be clearer about what sort of MIDI implementation is required at both ends to get the full benefits:

BONES wrote: ↑Sun Jan 22, 2023 1:39 am
I don't think any of the others are doing that. Roli, for example, have had to make their "5 dimensions of touch" work within the existing MIDI standard, it's not something explicitly enabled by MPE, nor do you need MPE to take advantage of it. In fact, I almost never run anything in MPE mode, yet I still get to use the 5D touch features.
If you are playing monophonically, or don't need every dimension to be independently controlled per voice, then completely ordinary single-channel MIDI is enough. What that mode is missing is per-voice pitch bend and per-voice CCs.
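To make that concrete, here's a rough Python sketch of the raw bytes involved (the helper names are just mine for illustration, not anything from a real MIDI library). The point is that a pitch bend message carries a channel but no note number, so it cannot single out one note:

```python
# Raw MIDI channel messages as byte lists: (status | channel, data1, data2).

def note_on(channel, note, velocity):
    """Note On: status 0x90 + channel (0-15)."""
    return [0x90 | channel, note, velocity]

def pitch_bend(channel, value):
    """Pitch Bend: 14-bit value (0-16383), centre = 8192, sent as two 7-bit bytes."""
    return [0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F]

# Two notes held on the same channel...
msgs = note_on(0, 60, 100) + note_on(0, 64, 100)
# ...then one pitch bend. It names a channel, not a note, so the synth
# applies it to BOTH sounding notes on channel 1 - there is no way
# to bend only the C and leave the E alone.
msgs += pitch_bend(0, 10000)
```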
If you want per-voice pitch bend and per-voice CCs, you use multiple MIDI channels. The original MIDI spec even has a specific mode describing this sort of multi-channel application, counter-intuitively called 'Mono Mode' (AKA guitar mode). In practice, various multi-timbral synths offer the same thing for a different purpose: loading multiple patches and playing each one via a different MIDI channel. With those synths you can achieve the same result as MIDI Mono Mode by loading the very same patch into each timbre slot.
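The same sketch, guitar-style: one note per channel, so each note gets its own bend (again, the helpers are just illustrative, not a real library API):

```python
def note_on(channel, note, velocity):
    """Note On: status 0x90 + channel (0-15)."""
    return [0x90 | channel, note, velocity]

def pitch_bend(channel, value):
    """Pitch Bend: 14-bit value, centre = 8192, as two 7-bit bytes."""
    return [0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F]

# One note per channel, like one guitar string per channel:
stream  = note_on(0, 60, 100)   # C on MIDI channel 1
stream += note_on(1, 64, 100)   # E on MIDI channel 2
# Because pitch bend is per-channel, each note can now bend independently:
stream += pitch_bend(0, 9000)   # bend the C up slightly...
stream += pitch_bend(1, 8192)   # ...while the E stays at centre
```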
MPE is basically a further refinement of that classic MIDI Mono Mode, built on top of the same principles. It introduces the concept of one MIDI channel acting as a global channel for messages that are supposed to affect all the notes, and adds a couple of zones to that picture. It encourages a much wider default pitch bend range, along with methods for detecting and setting that range. It provides a standard for which CC to use per voice (CC74 = MPE Y), and specifies which aftertouch message should be used: channel pressure, not poly aftertouch (although plenty of MPE controllers and synths still allow both, despite what the spec says). And it offers some rules for the circumstances under which a synth should continue to have its voices affected by those signals, e.g. what happens to the pitch of a bent note once you lift your finger.
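Here's a rough sketch of how those pieces fit together on the wire, assuming a lower zone with MIDI channel 1 as the master channel and channel 2 as the first member channel (the helper function and variable names are mine, purely for illustration):

```python
def msg3(status, channel, d1, d2):
    """Generic three-byte channel message."""
    return [status | channel, d1, d2]

MASTER = 0   # lower-zone master channel (MIDI channel 1)
MEMBER = 1   # a member channel carrying one note (MIDI channel 2)

# A message on the master channel is "global": a sustain pedal here
# is meant to affect every note in the zone.
stream  = msg3(0xB0, MASTER, 64, 127)       # CC64 sustain, zone-wide

# One note's expressive gesture lives entirely on its member channel:
stream += msg3(0x90, MEMBER, 60, 100)       # note on
stream += msg3(0xE0, MEMBER, 0x00, 0x48)    # per-note pitch bend ("X")
stream += msg3(0xB0, MEMBER, 74, 90)        # CC74 = MPE "Y" (timbre)
stream += [0xD0 | MEMBER, 110]              # channel pressure ("Z"), two-byte message
```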
MPE does represent a narrowed subset of what is actually possible using multiple MIDI channels. For example, it only covers a certain number of dimensions of per-note expression, i.e. only CC74 under the label MPE Y, rather than formally describing a controller-and-synth pairing where numerous other CCs are used per note in addition to CC74. There's nothing in the broader MIDI spec that would prevent more being used, but users can't rely on an MPE synth supporting this (although some do in practice).
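A sketch of that last point: the wire format happily carries several CCs per member channel, it's just that only CC74 has a formal per-note meaning under MPE. The extra CC numbers below (1 and 11) are arbitrary picks of mine for illustration, not anything the MPE spec names:

```python
def cc(channel, number, value):
    """Control Change: status 0xB0 + channel."""
    return [0xB0 | channel, number, value]

MEMBER = 1   # a member channel carrying one note

# The one per-note controller MPE formally defines:
stream  = cc(MEMBER, 74, 96)   # CC74 = MPE "Y"
# Perfectly legal extra per-note CCs on the same channel - a synth just
# isn't obliged by the MPE spec to interpret them per-note:
stream += cc(MEMBER, 1, 40)    # e.g. mod wheel, scoped to this note's channel
stream += cc(MEMBER, 11, 80)   # e.g. expression, ditto
```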
MIDI 2.0 is much broader. It doesn't involve some of those arbitrary decisions about how many expressive signals a controller can send per note, and it bumps up the resolution of messages and the number of channels that can exist within a system. It also includes a rather fancy system of two-way communication and discovery that would let different parts of your setup understand each other's capabilities without a lot of tedious manual configuration on the user's part. But obviously, unless/until we have a world where various MIDI 2.0 devices actually exist, this is just an 'on paper' discussion, not something we can experience ourselves.