Our Community Plugin Format

DSP, Plugin and Host development discussion.

Poll 1 - Let's give it a name (Acronym)

HOT Plugins - no votes
WAR Plugins - no votes
TOP Plugins - no votes
OUR Plugins - no votes
PRO Plugins (+1 Point) - 1 vote (3%)
EVE Plugins - 3 votes (10%)
ION Plugins - 3 votes (10%)
IVY Plugins - 2 votes (7%)
MAN Plugins (+1 Point) - no votes
WTF Plugins (+2 Points) - 1 vote (3%)
KVR Plugins (permission issue?) - 2 votes (7%)
DIY Plugins - 1 vote (3%)
COP Plugins - 1 vote (3%)
API Plugins (Amazing Plugin Interface) - no votes
TPIA Plugins (This Plugin is Amazing) - no votes
OPI Plugins (Open Plugin Interface) - 8 votes (27%)
OPS Plugins (Open Plugin Standard) - 8 votes (27%)

Total votes: 30

Post

i think, generally, vst2.x was actually quite good and a new standard should probably be modeled after it - and it should be lightweight, maybe something like a single .h file < 50 kb. looking at vst2, the plain c interface consists of the two files aeffect.h (17 kb) and aeffectx.h (63 kb), and there already seems to be some bloat in it - for example, what has all this speaker-arrangement stuff to do with interfacing to a dsp process? (edit: hmm...well, ok - i never did any surround plugins - maybe in those, the plugin does need to know such things.)

also, all the midi program- and bank-handling seems to impose a very old-schoolish way of handling plugin states and should be excluded. i think, from the plugin interface's point of view, a plugin state should just be a binary blob of data without any assumed structure or organization - how state data is structured is the plugin's business, not the interface's
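a minimal sketch of what "state as an opaque blob" could look like, assuming a generic dispatcher along the lines discussed further down - the opcode and struct names here are invented for illustration, not taken from any existing api:

```c
/* hypothetical sketch: plugin state as an opaque blob, no structure imposed */
#include <stddef.h>

enum {
    OP_GET_STATE = 1,   /* host asks the plugin to hand over its current state */
    OP_SET_STATE = 2    /* host hands a previously saved state back             */
};

typedef struct {
    const void *data;   /* raw bytes - their meaning is known only to the plugin */
    size_t      size;   /* size of the blob in bytes                             */
} StateBlob;

/* usage idea (host side), assuming a plugin-side dispatcher "request":
 *   StateBlob blob;
 *   if (plugin->request(plugin, OP_GET_STATE, &blob))
 *       saveToProject(blob.data, blob.size);
 *   ...
 *   plugin->request(plugin, OP_SET_STATE, &blob);   // on project load
 */
```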

one thing that was not done well in vst2 was the identification of plugins by a "unique id" that had to be registered at steinberg. i think a better scheme would be the following (a rough sketch follows below):

plugins are identified by a combination of:
- vendor name (string)
- plugin name (string)
- version number (unsigned int)

there should also be clear rules for what happens when a project was saved with a different version than the one installed, and/or when several versions of the same plugin are installed (maybe: take the installed version number that is closest to and >= the saved version number).
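for illustration, a minimal sketch of such an identification record, with hypothetical names (nothing here is part of any existing standard):

```c
/* hypothetical sketch: identifying a plugin without a central id registry */
typedef struct {
    const char  *vendorName;   /* e.g. "SomeVendor"                          */
    const char  *pluginName;   /* e.g. "SomeSynth"                           */
    unsigned int version;      /* monotonically increasing release number    */
} PluginId;

/* one possible host-side rule when loading a saved project:
 * among installed plugins whose vendorName/pluginName match, pick the
 * smallest version that is >= the saved version; if none exists,
 * fall back to the highest installed version.                              */
```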

just an idea (maybe crap - dunno, i'll just throw it in the ring for discussion):

for extensibility, maybe there should be some completely general way of issuing requests from plugin to host and from host to plugin - maybe some sort of generalized callback pair hostToPluginRequest(int type, void *data) and pluginToHostRequest(int type, void *data). the type would be an enum value (which can easily be extended later) and the data could be any data structure suitable to specify the input and output of the request. the receiver of the request would have to do an appropriate type-cast of the void pointer - which cast is "appropriate" is dictated by the type. this pair of functions would constitute a super-general and easily extensible two-way communication channel. in the extreme case, even the audio processing callbacks could be handled that way and you would not need anything else - but that's probably not a good idea; there should probably be specific callback functions for that, for efficiency reasons, etc.

maybe the functions should return an int, informing whether the request was handled (it should be possible to ignore requests, i.e. not implement a particular feature). and maybe input and output data should be separate (or maybe not). the pluginToHostRequest could be a function pointer that is passed to the plugin on instantiation, and the hostToPluginRequest could be a function-pointer field in a data structure similar to AEffect that has to be filled out by the plugin during instantiation. there could also be a similar data structure for hosts, and the pluginToHostRequest could be a field of that - then that "host" data structure would be passed to the plugin on instantiation.
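a rough sketch of what that handshake could look like in plain c - all names are made up for the sake of discussion:

```c
/* hypothetical sketch of a generic, extensible two-way request channel */
typedef struct Host   Host;
typedef struct Plugin Plugin;

/* both directions share the same shape: an extensible request code plus a
 * request-specific payload; return nonzero if the request was handled      */
typedef int (*PluginToHostRequest)(Host *host, int type, void *data);
typedef int (*HostToPluginRequest)(Plugin *plugin, int type, void *data);

struct Host {
    PluginToHostRequest request;   /* filled out by the host                 */
    /* ... other host-side fields ... */
};

struct Plugin {
    HostToPluginRequest request;   /* filled out by the plugin               */
    void               *pluginData;
    /* ... other plugin-side fields ... */
};

/* single exported entry point the host resolves from the shared library;
 * the host passes its struct in, the plugin returns its filled-out struct  */
Plugin *instantiatePlugin(Host *host);
```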
On August 1st, we will make it happen. We will release a new free and open community plugin format.
i'd say, that's a tight schedule for such an endeavor - to say the least :wink:
Last edited by Music Engineer on Mon Jul 06, 2020 11:14 pm, edited 3 times in total.
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

Music Engineer wrote: Mon Jul 06, 2020 4:30 pm i think, generally, vst2.x was actually quite good and a new standard should probably be modeled after it - and it should be lightweight, maybe something like a single .h file < 50 kb. looking at vst2, the plain c interface consists of the two files aeffect.h (17 kb) and aeffectx.h (63 kb), and there already seems to be some bloat in it - for example, what has all this speaker-arrangement stuff to do with interfacing to a dsp process?
One thing to note is that VST2 also kinda has two different methods for host->plugin communication: there is the dispatcher that handles most of the stuff, but there are also direct function pointers for a few operations (i.e. processing and get/set parameter). While this sort of makes sense to avoid the dispatcher for real-time operations, in practice the separate processEvents still goes through the dispatcher anyway.

I've written quite a few wrappers around quite a few APIs, not just plugin APIs, and for what it's worth I feel like anything that requires a bunch of function pointers tends to result in a lot of pointless boilerplate. So what I'd like to see would be an API where literally everything goes through a common dispatcher that just gets the operation code and a void* to an operation-specific struct that contains the relevant parameters. In my humble opinion, this is by far the easiest type of binary interface to work with, even though it's not popular these days because it doesn't follow the latest design-pattern fads.

This sort of scheme is also trivially extensible: you just need a new opcode and possibly a new struct definition for a new set of parameters, and that's it. Preferably, operation-specific results should also be stored through the struct, so that the return value can be reserved for error reporting and you can then have a "default: return 0;" type of thing to indicate that the operation is not supported. That ensures that hosts can work around older plugins without issues (and plugins can work around older hosts, if you use the same scheme for plugin->host requests as well).
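A minimal sketch of that pattern from the plugin side, with invented opcode and struct names (not from any existing API):

```c
/* hypothetical sketch: everything goes through one dispatcher,
 * opcode + pointer to an operation-specific struct                         */
enum { OP_SETUP = 1, OP_PROCESS = 2 /* , ... new opcodes appended later */ };

typedef struct {
    double sampleRate;
    int    maxBlockSize;
} SetupParams;

typedef struct {
    const float **inputs;
    float       **outputs;
    int           numFrames;
} ProcessParams;

/* returning 0 means "not supported", so older plugins/hosts degrade gracefully */
static int myPluginDispatch(void *self, int opcode, void *data)
{
    (void)self;
    switch (opcode) {
    case OP_SETUP: {
        SetupParams *p = (SetupParams *)data;
        /* ... remember p->sampleRate, allocate internal buffers ... */
        (void)p;
        return 1;
    }
    case OP_PROCESS: {
        ProcessParams *p = (ProcessParams *)data;
        /* ... render p->numFrames into p->outputs ... */
        (void)p;
        return 1;
    }
    default:
        return 0;   /* unknown opcode: ignored, not an error */
    }
}
```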

Post

Another point of view; why use a dispatcher at all? The interface could be assumed to look for symbols that are exported. The specification would be a list of those exported symbols.

Post

mystran wrote: Mon Jul 06, 2020 5:25 pm I feel like anything that requires a bunch of function pointers tends to result in a lot of pointless boilerplate. So what I'd like to see would be an API where literally everything goes through a common dispatcher that just gets the operation code and a void* to an operation-specific struct
yes, indeed. for host-to-plugin requests, i would vote for one dispatcher function plus one process32() and one process64() function for single and double precision. if the host allocates the data structure*, it could pre-fill it with its own "do-nothing" or "clear-buffer" functions, and the plugin, inside its instantiation code, could replace those function pointers with its own processing functions. that way the plugin doesn't need to write boilerplate code for precisions it doesn't want to support (of course, it will be a quite useless plugin if it leaves both unassigned). whether single and/or double precision processing is supported could then be detected by the host by looking at the function pointers after instantiation - if they have changed, the plugin supports them.

the processEvents call before the process call also seems like a weird decision to me. away with that! i'd rather just pass the buffer of events along as another parameter to the audio-processing function. also, we don't need separate "processReplacing" functions: if the host chooses so, it could just pass identical pointers for input and output buffers if it wants "replacing" mode. of course, it then needs to be spelled out in the standard that the plugin must not assume the input and output buffers to be distinct

(*) if i'm not mistaken, in vst, it was the plugin that allocated the AEffect data-structure and the host deallocated it, right? you said earlier in this thread that the only scheme that works is that whoever allocates also deallocates - so would you say the host should allocate the struct?
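a sketch of the "host allocates and pre-fills" idea, with events passed straight into the processing call - all names are hypothetical:

```c
/* hypothetical sketch: host-allocated struct, pre-filled with do-nothing
 * defaults that the plugin overwrites for the precisions it supports       */
typedef struct { int type; int frameOffset; unsigned char bytes[4]; } Event;

typedef struct PluginSlot PluginSlot;
struct PluginSlot {
    void (*process32)(PluginSlot *p, float  **in, float  **out,
                      int numFrames, const Event *events, int numEvents);
    void (*process64)(PluginSlot *p, double **in, double **out,
                      int numFrames, const Event *events, int numEvents);
    void  *pluginData;
};

/* host-provided default: do nothing (or clear the output buffers) */
static void noProcess32(PluginSlot *p, float **in, float **out,
                        int numFrames, const Event *events, int numEvents)
{
    (void)p; (void)in; (void)out; (void)numFrames; (void)events; (void)numEvents;
}

/* after instantiation the host checks:
 *   slot->process32 != noProcess32   =>  32-bit processing is supported
 * "replacing" mode is just in == out; the standard would have to state
 * that the plugin must not assume the buffers are distinct                 */
```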
Last edited by Music Engineer on Tue Jul 07, 2020 12:01 am, edited 3 times in total.
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

camsr wrote: Mon Jul 06, 2020 5:45 pm why use a dispatcher at all? The interface could be assumed to look for symbols that are exported.
because we don't want a proliferation of callback functions for the open-ended number of different possible kinds of requests. for the realtime audio callbacks, it probably makes sense not to route them through a dispatcher, for efficiency reasons...maybe for automation as well - but perhaps automation should be handled as events that are passed directly to the audio processing callback anyway
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

camsr wrote: Mon Jul 06, 2020 5:45 pm Another point of view; why use a dispatcher at all? The interface could be assumed to look for symbols that are exported. The specification would be a list of those exported symbols.
Because having a bunch of functions is more copy-pasta than using a dispatcher.

Post

Music Engineer wrote: Mon Jul 06, 2020 6:32 pm for the realtime audio callbacks, it probably makes sense to have them not have to go through a dispatcher - for efficiency reasons...maybe for automation as well - but perhaps automation should be handled as events that are passed directly to the audio processing callback anyway
I would beg to differ. The overhead of using a dispatcher for real-time stuff is so tiny it's not even funny, and if you seriously don't trust your compiler's ability to generate efficient switches (e.g. if the opcodes in use are more or less contiguous, you would generally expect a jump table whose overhead is about the same as a function pointer and probably less than a virtual function), you can always check for the relevant opcode explicitly before the main switch. Trying to optimize these kinds of things is pure ABI bloat, IMHO.
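For illustration, a sketch of that explicit check, reusing the invented opcode/struct names from the earlier sketch (doSetup/doProcess are assumed helpers, not real API):

```c
/* reusing the hypothetical opcodes/structs from the earlier sketch */
enum { OP_SETUP = 1, OP_PROCESS = 2 };
typedef struct { double sampleRate; int maxBlockSize; } SetupParams;
typedef struct { const float **inputs; float **outputs; int numFrames; } ProcessParams;

static int doSetup(void *self, SetupParams *p)     { (void)self; (void)p; return 1; }
static int doProcess(void *self, ProcessParams *p) { (void)self; (void)p; return 1; }

/* the hot opcode gets checked first; everything else falls through to the
 * switch, which the compiler will typically turn into a jump table         */
static int dispatch(void *self, int opcode, void *data)
{
    if (opcode == OP_PROCESS)
        return doProcess(self, (ProcessParams *)data);

    switch (opcode) {
    case OP_SETUP: return doSetup(self, (SetupParams *)data);
    default:       return 0;   /* unsupported */
    }
}
```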

Post

Music Engineer wrote: Mon Jul 06, 2020 6:32 pm
why use a dispatcher at all? The interface could be assumed to look for symbols that are exported.
because we don't want a proliferation of various different callback functions for the open-ended number of all the different possible kinds of requests. for the realtime audio callbacks, it probably makes sense to have them not have to go through a dispatcher - for efficiency reasons...maybe for automation as well - but perhaps automation should be handled as events that are passed directly to the audio processing callback anyway
In the early development stages, opting for exported symbols is open-ended. There's no reason that, at a later date, a dispatcher could not be added, calling those same exported functions. At that point, symbols could remain exported if there is a need. Starting without a dispatcher also helps to discover what said dispatcher may require in the future.

I am talking from the plugin side, BTW. The host will probably need to use a dispatcher.
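A sketch of that exported-symbols approach from the host side, assuming POSIX dlopen/dlsym (Windows would use LoadLibrary/GetProcAddress); the symbol names are invented for illustration:

```c
/* hypothetical sketch: the spec is just a list of symbol names a plugin may
 * export; a missing symbol simply means "feature not supported"            */
#include <dlfcn.h>
#include <stddef.h>

typedef void (*Process32Fn)(float **in, float **out, int numFrames);
typedef int  (*SetStateFn)(const void *data, size_t size);

int hostLoadPlugin(const char *path)
{
    void *dll = dlopen(path, RTLD_NOW);
    if (!dll)
        return 0;

    /* resolve whatever the plugin chose to export (invented names) */
    Process32Fn process32 = (Process32Fn)dlsym(dll, "plug_process32");
    SetStateFn  setState  = (SetStateFn) dlsym(dll, "plug_set_state");

    /* a dispatcher could still be layered on top later, simply
     * forwarding opcodes to these same exported functions          */
    (void)process32; (void)setState;
    return 1;
}
```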

Post

I once designed and implemented a humble API between a host and plugin, using C++ with something similar to COM. The final design I decided to go for has NO dispatcher. I can't remember exactly why, but one of the reasons is the clarity and ease I get when defining parameters for each callback and callforward function, i.e. no need for void *. Only two exports were really necessary; the rest is either predefined or communicated through the initial setup, to support any multitude of functions.
www.solostuff.net
Advice is heavy. So don’t send it like a mountain.

Post

one of the reasons is the clarity and ease I get when defining parameters for each callback and callforward function. i.e no need for void *
i think that would be the job of the c++ wrapper convenience class - like AudioEffect(X) in vst(2) - where you had to override virtual functions like setSampleRate, suspend, resume, etc., all with specific parameters
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

from the host we need to pipe->

midi
audio
timing
automation
window state
control state

in and out

we need sample/bit rate in

is there anything else?
the host could be designed around these

I am not adept, a simple soul

perhaps the design could have pins for these, and for whatever else we need to communicate, and that's all -
the design should ideally allow any other coding technique, method, language or call to be usable in the plug
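As a sketch only, the pipes listed above could be written down as a flat set of stream types, roughly like this (all names invented):

```c
/* hypothetical sketch: the pipes listed above as a flat set of stream types
 * a host and plugin could agree on (names invented for illustration)       */
enum StreamType {
    STREAM_MIDI_IN,       STREAM_MIDI_OUT,
    STREAM_AUDIO_IN,      STREAM_AUDIO_OUT,
    STREAM_TIMING,                            /* tempo, transport position  */
    STREAM_AUTOMATION_IN, STREAM_AUTOMATION_OUT,
    STREAM_WINDOW_STATE,                      /* editor open/closed, size   */
    STREAM_CONTROL_STATE                      /* current control values     */
};

typedef struct {
    double sampleRate;    /* pushed from host to plugin before processing   */
    int    bitDepth;      /* if fixed-point i/o were ever supported         */
} StreamSetup;
```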

Post

mystran wrote: Mon Jul 06, 2020 7:43 pm I would beg to differ. The overhead of using a dispatcher for real-time stuff is so tiny it's not even funny. ... Trying to optimize these kinds of things is pure ABI bloat, IMHO.
hmm - ok - even better. then we would literally only need one function for host -> plugin requests and another for plugin -> host requests. not even a struct for storing the handful of function-pointer fields would be needed anymore. that's pretty nice - because having thought about it some more, i could imagine that we may want some more variants of realtime processing facilities (besides the 2 standard process32/64 functions): for example, for variable I/O (for time-stretchers -> numOutputSamples != numInputSamples) or sample-by-sample I/O (to allow modular hosts to build feedback loops with only one sample delay in the feedback path) - if all of this could be handled in a unified way, good.
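a sketch of how those extra processing modes could ride on the same dispatcher - just more opcodes and parameter structs (all names hypothetical):

```c
/* hypothetical sketch: extra processing modes as additional opcodes, so one
 * dispatcher covers block, variable-i/o and single-sample processing        */
enum {
    OP_PROCESS_BLOCK32  = 10,   /* fixed block, numOut == numIn               */
    OP_PROCESS_VARIO32  = 11,   /* time-stretchers: numOut may differ         */
    OP_PROCESS_SAMPLE32 = 12    /* one sample at a time, for tight feedback   */
};

typedef struct {
    const float **inputs;
    float       **outputs;
    int           numInputFrames;
    int           maxOutputFrames;
    int           numOutputFramesProduced;   /* filled in by the plugin       */
} VariableIoParams;

typedef struct {
    const float *inputFrame;    /* one sample per input channel               */
    float       *outputFrame;   /* one sample per output channel              */
} SingleSampleParams;
```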
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

i also tend to think that instruments, audio effects and midi effects should be treated in a uniform way - especially because the line between instruments and audio effects is rather blurry anyway, and i found the enforced distinction (and the different handling in a daw depending on the type of plugin) often inconvenient. you can have midi-controlled effects - so an effect may want midi input (for example, comb filters controlled by the note(s) played) - and audio-controlled synthesizers - so an instrument may want audio input.
My website: rs-met.com, My presences on: YouTube, GitHub, Facebook

Post

Music Engineer wrote: Mon Jul 06, 2020 9:06 pm
one of the reasons is the clarity and ease I get when defining parameters for each callback and callforward function. i.e no need for void *
i think, that would be the job of the c++ wrapper convenience class - like AudioEffect(X) in vst(2) - where you had to override virtual functions like setSampleRate, suspend, resume, etc. - all with specific parameters
Yeah, I used that. But it will not work for static functions, global functions, or plugin-to-host communication. I needed explicit pointers for those.

Basically, I think one had better go either all dispatcher or no dispatcher at all. Combining both would probably add more confusion than benefit. Yet for the sake of VST compatibility and possible future wrappers, maybe it's the opposite :ud:
www.solostuff.net
Advice is heavy. So don’t send it like a mountain.

Post

An opcode system only has a bit of extra overhead compared to direct calls and is easily extensible. As previously stated, jump tables are your friend. The header would be a couple of enums/structs, a function to accept opcodes and one to send them. Maybe a few extra lines for a C++ wrapper.

I got bored and just stripped all the garbage out of the VST 2.4 files and now have a single header/cpp pair that adds up to only 1200 LOC, down from the original 4500. My executable lost about 50 KB of ugly fat. There's no reason a well-designed plugin header couldn't come in under a couple hundred lines and be very easy to read and understand.
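For a sense of scale, a sketch of roughly what such a public header could boil down to - every name here is invented and not part of any existing standard:

```c
/* hypothetical sketch of an entire minimal public header */
#ifndef COMMUNITY_PLUGIN_H
#define COMMUNITY_PLUGIN_H

#include <stdint.h>

/* opcodes: new ones are simply appended in later revisions */
enum { CP_OP_SETUP = 1, CP_OP_PROCESS32, CP_OP_GET_STATE, CP_OP_SET_STATE };

typedef struct CpHost   CpHost;
typedef struct CpPlugin CpPlugin;

/* one function to accept opcodes (host->plugin) and one to send them
 * (plugin->host); both return nonzero if the request was handled      */
struct CpHost   { int  (*request)(CpHost *h,   int32_t opcode, void *data); };
struct CpPlugin { int  (*request)(CpPlugin *p, int32_t opcode, void *data);
                  void *privateData; };

/* the single symbol a host resolves from the plugin's shared library */
typedef CpPlugin *(*CpEntryPoint)(CpHost *host);

#endif /* COMMUNITY_PLUGIN_H */
```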
I started on Logic 5 with a PowerBook G4 550Mhz. I now have a MacBook Air M1 and it's ~165x faster! So, why is my music not proportionally better? :(
