tony tony chopper wrote:
I too said that both should coexist; what I wrote is: "As a composer with classical training I have the level of abstraction needed to understand both the 'improvisatory' approach that is apparently defended by Admiral and the more rational approach defended by Gol. Both can and should coexist."

"music can & is also made by crafting on a screen, using a mouse."

And that's the ONLY way your host will let it be made. Restrictive where you could have been inclusive. Just like Steinberg.
Why does the controller need to know the pitch bend range of the instrument? It doesn't (unless it's a guitar controller). Again, you're trying to restrict, limit and disable the user when you should be enabling them. Just how some of you guys roll, I guess.
"...and the even more esoteric RPN and NRPN, which allow basically EVERYTHING."

You can make anything out of NRPNs when you -agree- on a protocol. And that is then exactly like... a new API... but with all the hassles of 7-bit values and limited messages.
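To make the 7-bit hassle concrete, here is a minimal sketch (Python; the helper names are mine, not from any real API) of what a single NRPN write actually costs on the wire: one 14-bit parameter number plus one 14-bit value has to be split across four separate Control Change messages, per the MIDI 1.0 spec (CC 99/98 select the NRPN, CC 6/38 carry the data).

```python
def cc(channel, controller, value):
    # Control Change status byte is 0xB0 OR'd with the channel (0-15)
    return (0xB0 | channel, controller, value)

def nrpn_messages(channel, param, value):
    """Encode one 14-bit NRPN parameter write as four 7-bit CC messages."""
    return [
        cc(channel, 99, (param >> 7) & 0x7F),  # CC 99: NRPN MSB
        cc(channel, 98, param & 0x7F),         # CC 98: NRPN LSB
        cc(channel, 6,  (value >> 7) & 0x7F),  # CC 6:  Data Entry MSB
        cc(channel, 38, value & 0x7F),         # CC 38: Data Entry LSB
    ]
```

Four messages for one parameter change, and both sides still have to agree out-of-band on what parameter 258 (or any other number) means: exactly the "new API, but worse" situation described above.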
Even what IS standard in MIDI is very poorly adopted by software. Want to send a pitch bend range RPN to VSTs? Count how many will interpret it properly (none?) versus how many will treat it as just CCs automating something else.
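For reference, the pitch bend range message in question is RPN 0,0 (Pitch Bend Sensitivity) in the MIDI 1.0 spec. A sketch of the CC sequence a controller has to emit (the function name is mine) shows why a receiver that isn't RPN-aware sees nothing but unrelated CC traffic:

```python
def pitch_bend_range_rpn(channel, semitones, cents=0):
    """Set Pitch Bend Sensitivity via RPN 0,0 (MIDI 1.0 spec)."""
    status = 0xB0 | channel      # Control Change on this channel
    return [
        (status, 101, 0),        # CC 101: RPN MSB = 0
        (status, 100, 0),        # CC 100: RPN LSB = 0 -> Pitch Bend Sensitivity
        (status, 6, semitones),  # CC 6:   Data Entry MSB = range in semitones
        (status, 38, cents),     # CC 38:  Data Entry LSB = additional cents
        (status, 101, 127),      # CC 101/100 = 127: "null" RPN, deselects
        (status, 100, 127),      #   the parameter to avoid stray data entry
    ]
```

A plug-in that ignores the RPN select messages will just see CC 6 and CC 38 wiggling and may map them onto whatever those CCs happen to automate, which is the misinterpretation complained about above.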
Besides, things have changed a lot. There are many peripherals out there now, and that should be the #1 reason for plugins not to know about the controllers attached to the sequencer; the shared abstraction shouldn't be MIDI, because MIDI is clearly not the best abstraction protocol.
Other non-MIDI instrument protocols, mice, joysticks, Leap Motion, touchscreens... they're all valid peripherals, and plugins shouldn't have to know about any of them (but that's an old discussion).
Look, if a host wants to translate MIDI into automation events, I think that would be great. As long as it still allows access TO THE ORIGINAL MIDI for PLUG-INS THAT WANT IT. But when we leave it up to Steinberg to decide what attributes our instruments are allowed to have, we all lose.
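The "translate, but keep the original" idea can be sketched in a few lines. This is purely illustrative (a hypothetical host, not any real SDK): the host mirrors incoming Control Change messages as normalized automation events for plug-ins that want abstraction, while the raw bytes stay available for plug-ins that want the MIDI itself.

```python
def translate_cc(status, data1, data2):
    """Mirror a 7-bit Control Change as a 0.0-1.0 automation event.

    Returns None for non-CC messages, which a host would pass through
    untouched as raw MIDI; CC messages are ALSO still forwarded raw,
    so nothing is lost in translation.
    """
    if status & 0xF0 != 0xB0:       # 0xB0-0xBF = Control Change
        return None
    return {
        "channel": status & 0x0F,   # low nibble of the status byte
        "param": data1,             # CC number
        "value": data2 / 127.0,     # 7-bit value normalized to 0.0-1.0
    }
```

The point of the design is in the docstring: translation is additive, not a replacement, so the host gains an abstraction layer without deciding on the plug-in's behalf which attributes the instrument is allowed to have.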