Steinberg: No more VST2 Development

DSP, Plugin and Host development discussion.

Post

sonigen wrote:
mystran wrote:is there some technical objections against GMPI?
I'd object to XML, I think XML is a bloated pile of verbose crap.
LV2 chose 'Turtle', I think because of objections to XML. I don't know it, but it serves the same purpose as XML.
I'm not 100% sure this is accurate, since I only quickly glanced at Jeff's SDK, but it looks like it's all C++: the API appears to be tied to C++ interfaces/classes. This is unacceptable to me; it should be plain C with C++ as an overlay. If you build the API on C++, you limit future options, and you make life harder for people who are not using C++.
Agreed. Although my plugins use C++, GMPI is actually mandated to be binary-compatible with C (which it is). The C++ is, for me, just a convenient 'wrapping' of the underlying C interface.
Also, I suspect you have a point about the XML. Although it's very quick and easy to type in your parameters etc. in XML, it would be more straightforward for some people to have a programmatic specification as an additional option, i.e. the host would call an actual function to get the plugin's number of inputs/outputs/parameters etc. There's no reason the XML couldn't be an optional layer built on top of a more traditional mechanism. What do you think?
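For illustration, a programmatic specification along those lines could look like the following plain-C sketch. All names here (param_info, plugin_descriptor, query_descriptor) are invented for this example and are not taken from the GMPI SDK:

```c
/* Hypothetical plain-C descriptor interface a host could call
   instead of parsing XML. Names are invented for illustration. */
#include <stdint.h>

typedef struct {
    const char *name;     /* e.g. "Cutoff" */
    double      min, max; /* value range */
    double      def;      /* default value */
} param_info;

typedef struct {
    int32_t num_inputs;
    int32_t num_outputs;
    int32_t num_params;
    /* host calls this per parameter instead of reading XML */
    const param_info *(*get_param)(int32_t index);
} plugin_descriptor;

/* Example plugin-side implementation */
static const param_info k_params[] = {
    { "Cutoff",    20.0, 20000.0, 1000.0 },
    { "Resonance",  0.0,     1.0,    0.1 },
};

static const param_info *get_param(int32_t index) {
    if (index < 0 || index >= 2) return 0;
    return &k_params[index];
}

/* The single C entry point the host would resolve from the binary. */
const plugin_descriptor *query_descriptor(void) {
    static const plugin_descriptor d = { 2, 2, 2, get_param };
    return &d;
}
```

Because everything goes through one C entry point and plain structs, any language that can speak the C ABI can both implement and consume it.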

Post

AdmiralQuality wrote:I think we already have this interface everyone is looking for. It's VST 2.4.
Has anyone considered making an 'open' VST 2.4?

Post

Vanilla C (as far as I know) is fairly easy to access from many languages.

Post

Jeff McClintock wrote:
AdmiralQuality wrote:I think we already have this interface everyone is looking for. It's VST 2.4.
Has anyone considered making an 'open' VST2.4 ?
Well, that's what aciddose is talking about, isn't it? But it's really quite unnecessary. We've all already got our copies, and our agreement with Steinberg didn't say anything about us needing to stop supporting it just because they do.

VST2.4 is the most stable, best-tested, and best-understood audio plug-in API, and should continue to be for years to come.

Post

JCJR wrote:I'm very ignorant, but XML might be the "as good as it gets" platform-agnostic way to substitute for the various resource files of yesteryear, unless there is some better way to get it done.
JSON is an interesting alternative. It's object serialization versus extensible tree-form representations of documents, and it's less work for something like storing and restoring a big hash table (there's some ability to nest, it just gets cumbersome several levels deep). Unmarshalling (a fancy new word for "parsing text into objects" that I picked up recently) from JSON is, I think, cognitively uncomplicated, and library routines for dealing with JSON are readily available.

I think JSON's pretty reasonable here ... what'd be the point of having extensible document mark-up? (Ultimately one can get that out of JSON too, so even some limited cases don't necessarily generalize)

Post

Jeff McClintock wrote:Also, I suspect you have a point about the XML. Although it's very quick and easy to type in your parameters etc. in XML, it would be more straightforward for some people to have a programmatic specification as an additional option, i.e. the host would call an actual function to get the plugin's number of inputs/outputs/parameters etc. There's no reason the XML couldn't be an optional layer built on top of a more traditional mechanism. What do you think?
Yes. Procedural interface to query the same info would probably be ideal.

That said, as long as the average plugin doesn't need to actually parse XML, it's probably not that bad if it can generate it at run-time (doing a few loops of sprintf is quick enough, even if rather pointless if it's going to be parsed again immediately afterwards). Having to embed static XML, though, would be very annoying, as it essentially adds an additional build step (to generate the XML that would have to be embedded), and additional build steps are painful. I have no idea how GMPI uses it, though; I really should check.
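For what it's worth, the "few loops of sprintf" approach might look something like this sketch (parameter names and XML fields are invented for illustration):

```c
/* Sketch: generating parameter XML at run-time instead of embedding
   a static file. The schema here is made up for the example. */
#include <stdio.h>
#include <string.h>

typedef struct { const char *name; double def; } param;

static const param k_params[] = {
    { "Gain", 0.5 }, { "Pan", 0.0 },
};

/* Writes <parameters>...</parameters> into buf; returns chars written. */
int params_to_xml(char *buf, size_t cap) {
    int count = (int)(sizeof k_params / sizeof k_params[0]);
    size_t n = 0;
    n += snprintf(buf + n, cap - n, "<parameters>");
    for (int i = 0; i < count; ++i)
        n += snprintf(buf + n, cap - n,
                      "<param name=\"%s\" default=\"%g\"/>",
                      k_params[i].name, k_params[i].def);
    n += snprintf(buf + n, cap - n, "</parameters>");
    return (int)n;
}
```

No extra build step is needed: the same static tables the plugin already uses internally become the XML on demand.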

[edit: also, being able to figure out the inputs/outputs/parameters etc. at runtime is more or less essential for writing binary wrappers that can load any arbitrary plugin]

As for JSON: I don't see how that makes much difference if you still intend to store it as text. You'll still need a parser in either case. Arguably JSON might be slightly easier to parse with an ad-hoc parser, but in practice the difference isn't that huge, and you'd probably still want a library to do it from C.
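As a rough illustration of what an ad-hoc approach looks like from C, here's a sketch that pulls one numeric field out of a flat JSON preset. A real plugin would still be better off with a proper library; this only shows that the text-format choice doesn't change the basic work much:

```c
/* Ad-hoc extraction of a number from a flat JSON object like
   {"cutoff": 0.5}. No nesting, no escapes: illustration only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Returns the number following "key": in json, or fallback if absent. */
double json_get_number(const char *json, const char *key, double fallback) {
    char pat[64];
    snprintf(pat, sizeof pat, "\"%s\"", key);   /* quoted key pattern */
    const char *p = strstr(json, pat);
    if (!p) return fallback;
    p = strchr(p + strlen(pat), ':');           /* skip to the value */
    if (!p) return fallback;
    return strtod(p + 1, NULL);
}
```

The equivalent for XML is a near-identical strstr/strtod dance, which is the point: either way you want a real parser once the data gets non-trivial.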

Post

AdmiralQuality wrote:I think we already have this interface everyone is looking for. It's VST 2.4.
Only if you don't want/need to support RTAS, AAX or VST3. I don't think there's anything like Symbiosis for AAX/RTAS, so you'd need to rewrite a lot of code.

For sidechain support in Cubase, a plugin needs to be VST3. It might somehow be possible with VST2, but it's nothing the average user wants to deal with. Of course, this can be considered a Cubase bug, but that doesn't help much: customers expect solutions, not problems.

Post

mahaya wrote:
AdmiralQuality wrote:I think we already have this interface everyone is looking for. It's VST 2.4.
Only if you don't want/need to support RTAS, AAX or VST3. I don't think there's anything like Symbiosis for AAX/RTAS, so you'd need to rewrite a lot of code.

For sidechain support in Cubase, a plugin needs to be VST3. It might somehow be possible with VST2, but it's nothing the average user wants to deal with. Of course, this can be considered a Cubase bug, but that doesn't help much: customers expect solutions, not problems.
How old are you, son?

Post

He's right about customers expecting solutions.

That is why we need a VST 2.5 with proper support for parameter change events, basic interpolation of those events, and a better system for grouping channels to replace the lame functions (input/output properties) that VST 2.4 has. It would also be nice to have better parameter-property support to specify min/max ranges, default value, whether the parameter is an integer (switch), and stricter documentation to ensure hosts implement all this correctly.
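A sketch of what "basic interpolation of parameter events" could mean in practice: given the value at the start of a block and a target value arriving at a sample offset, ramp linearly instead of jumping. Names are illustrative, not from any actual VST header:

```c
/* Linear ramp for a sample-accurate parameter event.
   Assumes 0 <= offset < nframes. Illustration only. */
static void ramp_param(float *out, int nframes,
                       float start, float target, int offset) {
    for (int i = 0; i < nframes; ++i) {
        if (i >= offset) {
            /* ramp from the event offset, reaching target at block end */
            float t = (float)(i - offset + 1) / (float)(nframes - offset);
            out[i] = start + (target - start) * t;
        } else {
            out[i] = start;  /* hold the old value before the event */
        }
    }
}
```

The host-side contract would be the important part: events sorted by offset, delivered before the process call, with the ramp behaviour strictly documented so every host does the same thing.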

There are a few more features that could be added, and when you start to see every other host except Cubase supporting a 2.5 "side-chain", I think that will turn things around pretty quickly.

People will switch from "why not VST3? I want my side-chain! WAAAH" to "Wow, Cubase sucks. Why doesn't it have proper channel routing support?"

edit: Actually, the thing I'd most like to see is a strict definition for the order of events, threading and timing. For example: editor->idle called at a 16 ms rate, weighting applied to unruly plugins to maintain this rate for other plugins, and only called from the main messaging/graphics thread. FLS8 was calling editor->idle from two threads at once! Really: it was in the same function, at the same instruction, in two different threads at the same time. I'd like to see that sort of thing never happen.

Idle is probably a bad name; what I'm pointing to is a more centralized system where the host is fully in control of timing, events, redraws and so on. It already is, unless your plugin launches its own timing thread and disobeys the OS.

I've heard arguments about this already: that you should be triggering WM_PAINT messages with SendMessage() and so on rather than updating directly. The only problem is that when you call SendMessage(), it immediately calls your window proc through the dispatcher, and you end up in the same place you would have been if you'd just drawn directly, but with a whole load of overhead tacked on. Try it: step through and you'll see the rects get sent on to WM_PAINT just as if you had jumped right in and done the blit yourself rather than calling through the messaging system. I've tested this on XP, Vista and Win7, though not Win8.
Free plug-ins for Windows, MacOS and Linux. Xhip Synthesizer v8.0 and Xhip Effects Bundle v6.7.
The coder's credo: We believe our work is neither clever nor difficult; it is done because we thought it would be easy.
Work less; get more done.

Post

aciddose wrote:There are a few more features that could be added, and when you start to see every other host supporting 2.5 "side-chain" except cubase I think that will turn things around pretty quickly.
I think almost every other host already supports sidechains via 4in/2out for VST2.
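For illustration, the 4-in/2-out convention could be processed like this sketch, where inputs 0-1 are the main stereo pair and inputs 2-3 the sidechain. This is a crude ducker, not actual VST2 SDK code, and all names are invented:

```c
/* 4-in/2-out sidechain convention: the main stereo signal on
   inputs 0-1 is ducked by the level on sidechain inputs 2-3. */
#include <math.h>

static void process_replacing_sidechain(float **in, float **out, int n) {
    for (int i = 0; i < n; ++i) {
        /* crude sidechain level: rectified mono sum of inputs 2-3 */
        float sc = 0.5f * (fabsf(in[2][i]) + fabsf(in[3][i]));
        if (sc > 1.0f) sc = 1.0f;
        float gain = 1.0f - sc;          /* duck main by sidechain level */
        out[0][i] = in[0][i] * gain;
        out[1][i] = in[1][i] * gain;
    }
}
```

From the host's point of view this is just a plugin that declares four inputs; the "sidechain" meaning of channels 2-3 is pure convention, which is exactly why hosts differ in how they expose it.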

Post

mahaya wrote:
aciddose wrote:There are a few more features that could be added, and when you start to see every other host supporting 2.5 "side-chain" except cubase I think that will turn things around pretty quickly.
I think almost every other host already supports sidechains via 4in/2out for VST2.
Well, a lot of them do, but not every one.

The real issue, though, is the implementation the user can see: users claim (personally I don't see the difference, but...) that the way it's done in VST3 is better for them.

So obviously we as developers/programmers/whatever need to wrap our heads around that and figure out how to shut those customers up by shoving something better in their mouths. :hihi:

Post

Jeff McClintock wrote: LV2 chose 'Turtle', I think because of objections to XML. I don't know it, but it serves the same purpose as XML.
I don't think we should have an external text file, full stop. It should just be done via the API with enumerations and strings.
Chris Jones
www.sonigen.com

Post

aciddose wrote: edit: Actually the thing I'd most like to see is a strict definition for order of events, threading and timing. For example editor->idle called at a 16ms rate, weighting applied to unruly plugins to maintain this rate for other plugins, only called from the main messaging/graphics thread. FLS8 was calling editor->idle from two threads at once! Really, it was in the same function at the same instruction in two different threads at the same time. I'd like to see that sort of thing never happen.
I think it's enough to have either GUI calls or audio calls, with a guarantee that calls of the same type will not be made concurrently. But you can get audio and GUI calls concurrently. Essentially, there's one GUI thread and one audio thread.

That said, I've been thinking for a while that there could be a multithreaded Process() call, which the host could enable so a plugin can process different elements with different threads, for example one thread per voice.

Mind you, I haven't thought it through at all. It would be an optional function, as very few plugins would need it; i.e. the regular Process() must always be implemented, and the multithreaded one is optional on both sides.
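A rough sketch of the per-voice idea, using pthreads for illustration. A real host would own a persistent thread pool rather than spawning threads every block, and all names here are invented:

```c
/* One worker thread per voice, mixed after a join barrier. */
#include <pthread.h>

#define NVOICES 4
#define NFRAMES 8

typedef struct { float buf[NFRAMES]; float freq; } voice;

static void *render_voice(void *arg) {
    voice *v = (voice *)arg;
    for (int i = 0; i < NFRAMES; ++i)
        v->buf[i] = v->freq;   /* stand-in for real per-voice synthesis */
    return 0;
}

/* Host-enabled multithreaded process: render voices in parallel,
   then mix once all workers have joined. */
static void process_voices(voice *voices, float *mix) {
    pthread_t th[NVOICES];
    for (int i = 0; i < NVOICES; ++i)
        pthread_create(&th[i], 0, render_voice, &voices[i]);
    for (int i = 0; i < NVOICES; ++i)
        pthread_join(th[i], 0);
    for (int i = 0; i < NFRAMES; ++i) {
        mix[i] = 0;
        for (int v = 0; v < NVOICES; ++v)
            mix[i] += voices[v].buf[i];
    }
}
```

The interesting API question is who owns the threads; letting the host provide them (and the plugin just expose parallelizable units) avoids every plugin spinning up its own pool.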
Idle is probably a bad name, what I'm pointing to is a more centralized system where the host is fully in control of timing and events, redraws and so on. It already is unless your plugin launches its own timing thread and disobeys the OS. I heard argument about this already - that you should be triggering WM_PAINT messages with sendmessage() and so on rather than updating directly. Only problem is when you call sendmessage() it immediately calls your window proc through the dispatcher and you end up in the same place you would have been if you just drew directly, but with a whole load of overhead tacked on. Try it, step through and you'll see the rects get sent on to WM_PAINT just as if you had jumped right in and done the blit yourself rather than calling through the messaging system. I've tested this on xp, vista and win7, not win8 though.
There's:

RedrawWindow, which causes a WM_PAINT immediately.
InvalidateRect, which causes a WM_PAINT to be put in the message queue; it will be processed when there's nothing more important to do.

You should really only be calling the latter, and you should not be sending a WM_PAINT yourself, IMO.

A plugin SDK should not be concerning itself with this, I think.

Post

sonigen wrote:There's:

RedrawWindow, which causes a WM_PAINT immediately.
InvalidateRect, which causes a WM_PAINT to be put in the message queue; it will be processed when there's nothing more important to do.

You should really only be calling the latter, and you should not be sending a WM_PAINT yourself, IMO.

A plugin SDK should not be concerning itself with this, I think.
Actually, those are what I was referring to. Both will lead to a WM_PAINT immediately; I was surprised to find this with InvalidateRect.

Also, you need to keep in mind who is controlling the message queue. Oh, the host is... only it isn't. You depend on the host calling DispatchMessage() to get things like WM_TIMER and WM_PAINT, but it can't control exactly what is happening unless it picks apart all the OS-specific junk.

You might say that's up to the host and so leave it be, but then you're moving things that should be defined by the API into the OS-specific domain, which sucks.

The API should definitely control GUI updates just as it controls parameter updates, audio processing and everything else.

The API should not make the programmer have to create OS-specific code and "hope" that it works depending upon the internals of the host.

Look at it this way: does the same logic apply to having the audio stuff managed by the plugin itself? No? Is the GUI not a core part of the plugin? It either is or isn't, and if it is, the API should control/provide it.

It already provides the window handles to editor->open, it just doesn't provide any timing or update mechanism except editor->idle, which I think suits this task.
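To make the idea concrete, here is a minimal sketch (all names invented) of a host-controlled update scheme: the plugin merges dirty rectangles into a pending region, and the host triggers at most one paint per idle tick, instead of the plugin painting whenever it likes:

```c
/* Host-driven redraw: plugin invalidates, host coalesces and paints. */
typedef struct { int x0, y0, x1, y1; int dirty; } invalid_region;

static invalid_region g_region;

/* plugin side: merge a rect into the pending region */
void invalidate(int x0, int y0, int x1, int y1) {
    if (!g_region.dirty) {
        g_region = (invalid_region){ x0, y0, x1, y1, 1 };
        return;
    }
    if (x0 < g_region.x0) g_region.x0 = x0;   /* grow to the union */
    if (y0 < g_region.y0) g_region.y0 = y0;
    if (x1 > g_region.x1) g_region.x1 = x1;
    if (y1 > g_region.y1) g_region.y1 = y1;
}

/* host side: called once per ~16 ms idle tick.
   Returns 1 and fills *out if a paint should happen, else 0. */
int host_idle_paint(invalid_region *out) {
    if (!g_region.dirty) return 0;
    *out = g_region;           /* one paint for the whole union */
    g_region.dirty = 0;
    return 1;
}
```

Because the host owns the tick, it can throttle unruly plugins or spread paints across plugins, which is exactly the kind of control the OS message queue doesn't give it.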

Post

mystran wrote:I have no idea how GMPI uses it though, I really should check.

[edit: also being able to figure out the inputs/outputs/parameters etc at runtime only, is more or less essential for writing binary wrappers that can just load any arbitrary plugin]

As for JSON: I don't see how that makes much difference, if you still intend to store it as text. You'll still need a parser in either case. Arguably JSON might be slightly easier to parse with an ad-hoc parser, but in practice the difference isn't that huge and you probably still want a library to do it from C.
XML is suggested in GMPI around presets and storing descriptions of GUI stuff. The latter seems a little out of scope here (like MIDI mapping, which I wish plug-in standards addressed a bit; it probably generalizes to any kind of control interface). But presets seem like the place to think about parsing text into objects and vice versa (heh, it's kind of a control interface as well). I'm not sure where else.

The formal differences between JSON and XML show up in APIs and tooling. JSON maps really naturally to atomic variables, structs, or objects (anything idiomatically suited to object-literal syntax in JavaScript). XML doesn't fail here (some people think it's ugly or verbose, but it can do the job); the problem is that everything else that comes along with document models means anyone depending on it has to think about document models as well.

Probably as you say it doesn't make much difference :hihi:
