VST3 Note Expression vs MPE?

Official support for: bitwig.com

Post

So now that Bitwig will support VST3 and note expression, I'm curious how note expression is different from MPE. Will MPE-type instruments like the LinnStrument be able to record their different axes of expression to note expression?

And does VST3 note expression store the MIDI channel per note? Or does it not even need to split the notes across different MIDI channels?

Cheers!

Post

Yes... good questions... I'm also interested in a clearer understanding of this.

Post

Would like to get some info here, too.

Post

It's a good question... with quite a few subtle consequences.
(some of the following is a generalisation, and also based on my understanding)

It's not really 'vs.', as they are different things with different goals that can potentially be combined.

MPE is a MIDI-based wire protocol. It's built on the idea of a MIDI generator (controller) and a sound generator (synth); it requires no middleman, and is applicable to both hardware and software (v. important!).
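As a rough illustration of what 'MIDI-based wire protocol' means in practice, here is a Python sketch of MPE-style messages. The channel numbering follows the common MPE lower-zone convention (channel 1 as master, channels 2-16 as member channels), and the helper functions are purely illustrative:

```python
# Rough sketch of MPE's wire format: each touch gets its own MIDI channel,
# so per-channel messages (pitch bend, CC74, channel pressure) become per-note.

def note_on(channel, note, velocity):
    # MIDI note-on: status byte 0x90 | (channel - 1), then note and velocity
    return bytes([0x90 | (channel - 1), note, velocity])

def pitch_bend(channel, value):
    # 14-bit pitch bend sent as two 7-bit bytes; 8192 = centre (no bend)
    return bytes([0xE0 | (channel - 1), value & 0x7F, (value >> 7) & 0x7F])

# Two simultaneous touches land on different member channels, so each one
# can bend independently without affecting the other.
touch_1 = note_on(2, 60, 100) + pitch_bend(2, 8192 + 1000)  # C4, bent up
touch_2 = note_on(3, 60, 90)                                # another C4, no bend
```

The point of the sketch: the "per-note" quality comes entirely from the channel assignment; the messages themselves are ordinary MIDI 1.0 channel messages.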

VST note expression is part of the VST API and not related to MIDI; it's better to think of it as 'per-note automation'. Being part of VST, it is obviously software only, and relates to a VST host.

From this come the subtle consequences.

What does this mean in a software environment?
(but again, remember MPE can be hardware synths/controllers connected directly via wires)

MPE requires very little of the 'host' (or even no host at all), just that it passes (and records) MIDI through 'untouched' (this is the bit hosts struggle with ;)). It is then up to the sound generator (synth = VST/AU) to interpret the MIDI stream as it wishes.

VST note expression (VST NE) exposes per-note parameters; it has no way to interpret what a controller is sending. It is up to the HOST to decide what expression is added to which notes
(e.g. on a score or piano roll, the way we use automation).
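By contrast, here is a conceptual sketch of 'per-note automation': the host targets a note by an ID it assigned, not by a MIDI channel. The field names are only loosely modelled on the VST3 SDK's NoteExpressionValueEvent; this is an illustration, not the actual C++ API:

```python
from dataclasses import dataclass

# Conceptual sketch of VST3 note expression: the host addresses a *note*
# (via an ID assigned at note-on), not a channel. Field names loosely
# modelled on the SDK's NoteExpressionValueEvent; values are normalized.

@dataclass
class NoteExpressionValueEvent:
    note_id: int      # host-assigned ID tying the event to one sounding note
    type_id: int      # which expression type (e.g. tuning, volume, custom)
    value: float      # normalized 0.0 .. 1.0

# Each sounding note can carry its own expression stream, with no channel
# juggling involved: the host simply emits events against note IDs.
ev_a = NoteExpressionValueEvent(note_id=1, type_id=0, value=0.25)
ev_b = NoteExpressionValueEvent(note_id=2, type_id=0, value=0.75)
```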

This is the crux: MPE has the synth doing the mapping; VST NE has the host doing it.

Of course this partly comes from their backgrounds...
MPE (formerly VpC MIDI) originates from 'player expression', i.e. controllers. VST NE, I believe, originates more from scoring, i.e. composers wanting to be able to annotate scores with expressions (it's a big area in Cubase)... also, given it's Steinberg, the host is important.

Of course they can work together: a DAW could accept MPE from a controller, and then allow mapping of that data to VST note expression...

So why do VST developers do MPE, when they could do VST3 NE?
Well, VST3 NE is only relevant to VST, not AU/AAX etc.; MPE works across all of them. And VST NE requires host support.
So no surprise that the main support for VST NE is Cubase (though in fairness, Cubase is the best for MPE too :))

Back to Bitwig: my hope is it will allow, for VST NE enabled VSTs, the ability to do the mapping in the host, and to see the mapping clearly once it's recorded... (as Bitwig doesn't provide scoring, I can't really see much other use)

Hope that's helpful.

Post

Very. Thank you.

Post

Thanks Technobear!

So on a practical level, one of the issues with Bitwig seems to be that, unlike Logic and Cubase, the MIDI channel of the played note does not seem to be stored in the MIDI sequence. It seems like it only records the notes and then rotates the overlapping ones across channels as it plays back the MIDI. So for example, if you're using something like the LinnStrument in channel-per-row mode in order to take advantage of, say, hammer-ons on the same channel/row, once you play it back it won't necessarily sound the same, since the channel wasn't stored. Do you know if there is anything in the MPE spec about this as guidance for a host? It seems to me that Bitwig hasn't fully supported MPE if it doesn't store the MIDI channel of a given note. Unless I'm wrong about that and I just don't see where that info is.

Another problem it causes: if you release a note that has a long release and then play another note and send expression or pitch bend on that, it will also be applied to the previous note that is still sounding, because Bitwig puts it on the same channel (since they don't overlap). Have you come across this issue? And I guess note expression wouldn't have this kind of issue, since it doesn't need to do it by MIDI channels but actually stores data per note, right?

Post

Echoes in the Attic wrote:Thanks Technobear!
So on a practical level, one of the issues with Bitwig seems to be that unlike Logic and Cubase, the MIDI channel of the note played does not seem to be stored in the MIDI sequence, so it seems like it only records the notes and then rotates the overlapping ones as it plays back the MIDI. So for example if you're using something like the LinnStrument in channel-per-row mode in order to take advantage of say, hammer-ons on the same channel/row, once you play it back it won't necessarily be the same since it didn't store the channel. Do you know if there is anything in the MPE spec about this as guidance for a host? It seems to me that Bitwig hasn't fully supported MPE if it doesn't store the MIDI channel of a given note. Unless I'm wrong about that and I just don't see where that info is.
MPE doesn't dictate how a host stores its track data, and I think it's reasonable that a host would not tie itself directly to MIDI. And whilst I think it's reasonable for a host to allocate the controller's 'touches' to any of the synth's 'voices', it seems 'obvious' to me that it should do this consistently and repeatably...

but there are quite a few 'gotchas' in this area
- one common 'mistake' (in the name of simplicity and conventional editing) is the idea that the same note can only be played once. This is incorrect: as on a guitar, many of these controllers allow C3 to be played multiple times simultaneously with different timbre/velocity, which can produce a nice musical effect.
(note: MPE allows for this; VST NE, I don't believe, does)
- how do you deal with touch count > synth voice count, and voice stealing
- voice allocation strategies like LRU
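One of the allocation strategies mentioned above (LRU) might look roughly like this. The class and its behaviour are illustrative, not any particular host's or synth's implementation:

```python
from collections import OrderedDict

# Sketch of LRU voice stealing: when touches exceed the synth's voice
# count, the voice belonging to the least-recently-started touch is stolen.

class LRUVoiceAllocator:
    def __init__(self, num_voices):
        self.free = list(range(num_voices))   # unused voice indices
        self.active = OrderedDict()           # touch ID -> voice, oldest first

    def note_on(self, touch_id):
        if self.free:
            voice = self.free.pop(0)
        else:
            # no free voices: steal from the oldest still-active touch
            _, voice = self.active.popitem(last=False)
        self.active[touch_id] = voice
        return voice

    def note_off(self, touch_id):
        self.free.append(self.active.pop(touch_id))

alloc = LRUVoiceAllocator(num_voices=2)
v1 = alloc.note_on("touch-a")   # gets voice 0
v2 = alloc.note_on("touch-b")   # gets voice 1
v3 = alloc.note_on("touch-c")   # no free voices: steals touch-a's voice 0
```

Real synths layer more policy on top (e.g. preferring to steal released notes first), but the gotcha is the same: whoever does the allocation, controller touches and synth voices are not the same thing.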
Another problem it causes is that if you release a note that has a long release and then play another note and send expression or pitch bend on that, it will also be applied to the previous note that is still heard because Bitwig puts it on the same channel (since they don't overlap). Have you come across this issue? And I guess note expression wouldn't have this kind of issue since it doesn't need to do it by MIDI channels, but actually stores data per note, right?
That sounds like a bug, which might be in BWS or in the LinnStrument.
What should happen is that the LinnStrument sends a pitch bend reset immediately BEFORE the new note-on. The reason: after the note-on it's too late, and after the note-off you'd have to wait for the release tail to finish.
I've seen a bug in one host which ignored pitch bend messages when there were no active notes, which would result in the behaviour you mention.
(I reported the bug to the developer at the time, but I can't remember which host it was; perhaps it was BWS...)
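As a sketch, the ordering being described (message names and tuple format are just illustrative):

```python
# Sketch of the correct message ordering: the pitch bend reset goes out
# immediately BEFORE the new note-on. Resetting after the note-on would
# yank the fresh note; resetting right after the note-off would yank the
# old note's release tail.

CENTRE = 8192  # 14-bit pitch bend centre = no bend

def correct_sequence(channel, note, velocity):
    return [
        ("pitch_bend", channel, CENTRE),      # reset while nothing new sounds
        ("note_on", channel, note, velocity), # then start the new note clean
    ]

seq = correct_sequence(channel=2, note=60, velocity=100)
```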

To be fair, this is new for host developers, and they may not have that many musicians using it yet, so it's going to take a while for it to stabilise.

I'm looking forward to beta testing BWS 2.0 to see if the MPE/VST NE implementation is good :)
(btw: IIRC, I don't think Cubase supports this 'tie-up' yet; at least the last time I tried, in 9.5, it didn't work)

I think 2017 will be an important year for MPE/note expression. MPE controllers are gaining in popularity, and importantly the 'entry price' is coming down, which will further this.
This in turn will drive developers to improve support, e.g. more synths/VSTs and better DAW integration.
I hope BWS, as an 'early adopter', will try to push this to maintain their advantage...
(I'd be quite surprised/disappointed if Live 10 does not support NE and MPE!)

Post

I'm not actually using the LinnStrument, and I have no reason to believe that it sends anything incorrectly. I'm using the Bitwig touch-screen grid keyboard, or simply drawing MIDI data into clips.

I didn't really follow why you thought it was a bug. This is pretty much the same behaviour with any VST I've tried, and it makes sense given the way Bitwig assigns MIDI channels when notes are played back. It only rotates the channel when another note is played over an existing note (the note itself, not including e.g. a synth's release time). So if you play one note and release it, but it's still heard through its long release, and then you play a new note that has pitch automation, both notes will bend, because the instrument receives the second note on the same channel, since the notes don't overlap in the MIDI data.

There is a workaround for instruments where you use multiple copies of an instrument on different channels, like Kontakt or Omnisphere: set each copy to mono, so that when this scenario happens, it at least cuts off the previous note, and new expression only affects the new note. But it would be preferable if the old note could fully release and stay on an independent channel. From what I can tell, this is really up to the host: either store the separate MIDI channels per note (e.g. as received from hardware), or rotate channels continuously regardless of whether MIDI notes overlap.
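The 'rotate continuously' option could be sketched like this. This is a hypothetical policy, not Bitwig's actual behaviour; the channel numbers follow the MPE lower-zone convention (member channels 2-16):

```python
import itertools

# Hypothetical rotation policy: advance through member channels for EVERY
# new note instead of reusing a channel as soon as its note-off arrives.
# That way a note's release tail keeps its channel to itself, and the next
# note's pitch bend can't reach it.

member_channels = itertools.cycle(range(2, 17))  # MPE lower-zone members

def next_channel():
    return next(member_channels)

c1 = next_channel()   # first note -> channel 2
c2 = next_channel()   # second note -> channel 3, even if note 1 has already
                      # been released, so note 1's release tail is untouched
```

With 15 member channels, a stale channel is only reused after 15 further notes, which is usually longer than any release tail; a real host would also have to handle more touches than channels.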

Oddly, the issue doesn't happen with Bitwig's own instruments. I don't know how they communicate internally, but they don't seem to care if the same MIDI channel is used for consecutive notes; they still get separate expression.

Post

So now that Bitwig 2 has been out for a while, has anyone figured out how to map an incoming modulation like the timbre/slide/Y axis to a parameter of a VST3 synth supporting note expression? I've been demoing HALion 6 and just can't figure out how one would modulate a parameter in HALion and record the movements from a LinnStrument into Bitwig.

Cheers
