About CAT

DSP, Plugin and Host development discussion.
Post

beginzone wrote: Sat Jul 11, 2020 6:10 pm :D

I tried to use the units of the different things so that they match somehow. And "key" clashes with keyboard keys, so much that at some point I started saying "pc_keyboard_key".

Scala uses cents in some way. I believe that when there is a dot in the pitch description, it's cents, like MIDI key * 100, with additional decimal digits after the point. And everything that has no dot in the string is a ratio. So I just used that, yesterday, somehow.
Forgot about Scala. What a gawdawful standard that is! Probably why I forgot... :hihi:

I'd like to put the Scala and WAV formats into Thunderdome and hope neither comes out. Both are just ugly to program.
I started on Logic 5 with a PowerBook G4 550 MHz. I now have a MacBook Air M1 and it's ~165x faster! So why is my music not proportionally better? :(

Post

:lol:

... there is something old-fashioned about it.


Scales are cool in rap music that contains riffs. I mean, I believe some people, maybe, only reuse recorded things because the tuning of the instruments was so dark and special. And the scales for piano are cool, too. Things sound so special with them.

Post

Music Engineer wrote: Sat Jul 11, 2020 2:35 pm
I just would find it cool to see something in Opi that forces everyone to do the best interpretation
i think the plugin format should give the plugin the number of transition/smoothing samples - as mystran's proposal does - but not specify the transition curve/method. it should be up to the plugin implementor to choose the appropriate curve, because only the plugin implementor can possibly know what is most appropriate. imagine a simple gain plugin that receives an automation event to transition from its current gain value of 0.25 (in normalized units) to 0.75 within 100 samples. if the format specifies that there should be a linear transition between *normalized* parameter values, it might not be the best thing to do. instead, it may be better to convert the current and target values to dB, do a linear transition in the dB domain, and convert back - the rationale is that this should give a "perceptually linear" transition. similar considerations apply to frequencies: a linear transition in the pitch (i.e. log-frequency) domain may be more meaningful. ...i'm assuming here that the plugin interface uses normalized parameters in the range 0..1. of course, it could also use another scheme, supporting arbitrary ranges by reporting min/max values to the host and possibly even mapping functions - but i think that would be overly complicated
I think it's all a great idea, actually. I have previously made a gain plugin (for personal use) that allows changing the transfer function on demand, and I found it useful. The same principle can be applied to other parameter modulations. But the real questions are related to how the plugin interface would facilitate these things.

Post

camsr wrote: Sat Jul 11, 2020 8:11 pm But the real questions are related to how the plugin interface would facilitate these things.
Could it be a header file in C++? (or C, if inline is no problem)

It would make things easier for me as a plugin developer anyway. Maybe it could help me process the event structure.

I do
#include "automation_helper.h"

it defines
float signal_or_nan(state_t*); // just to tell me delta times of events that want immediate change
float signal_and_hold(state_t*);
float signal_as_line_segments(state_t*);
float signal_as_filter_of_certain_type(state_t*);
float signal_as_filter_of_another_certain_type(state_t*);

Maybe there should be a version of each that advances the frame, and one that does not. And a helper for mystran's heuristic system.

And I could choose any of these for my transformation and further processing.

void produce_sound(plugin_t* plugin) {
    state_t state;
    setup_state(&state);

    for (int i = 0; i < num_frames; i++) {
        float gain = transform_from_0_1_to_my_plugins_scale(
            signal_as_filter_of_certain_type(&state));
        ...
    }
}

If another language replaces C++, I could try to translate the automation helper into the new language. And from Go I kind of plan to call into C or C++ anyway.

Although, I don't know about the corner cases in problematic scenarios. Or the heuristics. I thought it was something to do with live performance, but I lost track. And I never made any plugin that was actually used, so I can't know.

So if someone could explain whether the current situation,

setParameter,
with note-aligned slicing of numFrames for the VST processing function,
and with a custom, special filter system,

can or cannot be achieved with any such helper function, that would be helpful. Would the list of helpers be too long? I know only two smoothing methods, and I'm unskilled at math.

Post

mystran wrote: Sat Jul 11, 2020 12:04 pm edit: The nice thing about this kind of scheme is that it can deal with inconsistent data. If the user is turning a MIDI knob or adjusting a GUI parameter on the fly, you don't really know when the next event is going to arrive, but you can try to make some sort of a heuristic estimate
Why does the plugin look into the future?
And what kind of issue causes the inconsistent data?

That is what I tried to refer to before.

Post

beginzone wrote: Sat Jul 11, 2020 10:38 pm
mystran wrote: Sat Jul 11, 2020 12:04 pm edit: The nice thing about this kind of scheme is that it can deal with inconsistent data. If the user is turning a MIDI knob or adjusting a GUI parameter on the fly, you don't really know when the next event is going to arrive, but you can try to make some sort of a heuristic estimate
Why does the plugin look into the future?
And what kind of issue causes the inconsistent data?
The main problem is that when the user is tweaking something manually, the intervals between events are unpredictable and the best you can do is try to guess how much to smooth. More often than not your guess is going to be somewhat wrong. This is particularly nasty with the usual 7-bit MIDI controllers, where if the user is slowly turning a knob, the actual hardware might not even send a new event for every block you are processing, yet you would still prefer to smooth over it. The host can't really help you here either, because it doesn't have any more information either.
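As a sketch of what such a heuristic might look like (an example only, not a recommendation, and all names invented): smooth each new value over the time elapsed since the previous event, on the assumption that the knob keeps moving at roughly the same rate.

```c
/* Heuristic smoother sketch: ramp to each new value over a time equal
   to the gap since the previous event. */
typedef struct {
    float value;       /* current (output) value        */
    float target;      /* most recent event value       */
    float step;        /* per-sample ramp increment     */
    int   frames_left; /* remaining frames of the ramp  */
    int   since_event; /* frames since the last event   */
} smoother_t;

static void smoother_event(smoother_t *s, float new_target)
{
    int est = (s->since_event > 0) ? s->since_event : 1; /* estimated gap */
    s->target = new_target;
    s->frames_left = est;
    s->step = (new_target - s->value) / (float)est;
    s->since_event = 0;
}

static float smoother_tick(smoother_t *s) /* call once per sample */
{
    s->since_event++;
    if (s->frames_left > 0) {
        s->value += s->step;
        if (--s->frames_left == 0)
            s->value = s->target; /* land exactly on the target */
    }
    return s->value;
}
```

If the knob slows down, the estimate overshoots the actual gap and the ramp is still in flight when the next event arrives; you then just re-aim from wherever the ramp currently is.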

Post

mystran wrote: Sun Jul 12, 2020 12:55 am The host can't really help you here either, because it doesn't have any more information either.
My first thought was: a very, very light runtime (public domain) in C/C++ could be a "community audio technology" thing that makes plugin development easy, which was one of the initial ideas Vertion had.

But the live situation needs clarification. What information would help, and why does the host not have it?
Can we divide into parts [mouse usage] and [midi device input]?

Maybe, if the host gets MIDI input data which is "too sketchy", transferring that too-sketchy data to everyone's plugin, to be handled differently by each, might not make for an "easy development" situation. It would be more like a competition over who handles problems best, wouldn't it?

I mean, maybe there should be something like "error handling versus musical intention". The runtime would do the error handling and provide the plugin developer with every type of data that is important for implementing musical intentions. Fast, and in its original form, if possible. But without "device errors", if possible.

Hm.. I can only say how I once tried to handle MIDI input, without heuristics unfortunately.

Microsoft's documentation mentions a 32-bit timestamp, which I believe I tried to use, but somehow never did. Do you know how reliable the timestamp in dwParam2 in the MIDI callback is?

I'm not a good host developer. In my host, I get MIDI input via CALLBACK_FUNCTION, but did not use dwParam2 for some reason.

I did/do use QueryPerformanceCounter. I have read that in rare cases it's not monotonic, when the CPU is clocked down by Intel SpeedStep and Windows doesn't correct for it. Is that one of the problems?

How did you get a MIDI timestamp in your host, or how do the best hosts do this?
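For what it's worth, here is how I imagine a dwParam2-style timestamp (the docs say it's milliseconds since midiInStart for MIM_DATA) could be mapped to a block-relative sample offset. The clamping policy and all names are my own guesses:

```c
#include <stdint.h>

/* Map a MIDI timestamp (ms since midiInStart, as dwParam2 delivers for
   MIM_DATA) to an absolute sample position, assuming the host records
   the stream start in the same timebase. */
static int64_t midi_ms_to_sample(uint32_t ms, double sample_rate)
{
    return (int64_t)((double)ms * sample_rate / 1000.0);
}

/* Block-relative delta for an event, clamped into the current block. */
static uint32_t event_delta(uint32_t ms, double sample_rate,
                            int64_t block_start, uint32_t block_size)
{
    int64_t pos = midi_ms_to_sample(ms, sample_rate) - block_start;
    if (pos < 0)
        pos = 0;                     /* event from the past: snap to start */
    if (pos >= (int64_t)block_size)
        pos = block_size - 1;        /* beyond this block: clamp (or defer) */
    return (uint32_t)pos;
}
```

Whether clamping or deferring to the next block is right probably depends on the host's latency model; that is exactly the part I am unsure about.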

I don't want to force the idea of a "plugin format runtime library" upon anyone, but what do you think about such a possible thing, in general?

Sorry if I ask a bit much.

Post

mystran wrote: Sun Jul 12, 2020 12:55 am the actual hardware might not even send a new event for every block you are processing
Do the MIDI input events "arrive with jitter"?
Or did you mean the event was dropped and lost?
Or did you mean the events just happen too seldom if the block size is very small? <-- That's it.

I'll stop asking questions for now. It was all a bit unclear to me, and with more knowledge I wouldn't make possibly wrong suggestions for the plugin format. :oops:


Just in case you mean the host would have to correct a jitter problem:

Hopefully you don't mind if I try to write it down in a way that, in the end, I can understand. Sorry if I write so much text.

There is a situation. I assume three facts.

0) There have been no jitter errors, and everything has gone fine so far. There is a certain *constant* latency between MIDI input and the perceived audio from my speakers, mainly because of the block size.

1) Block 1 is over. All possible MIDI input events for it have been sent to the plugin.

2) Block 2 is about to be requested. There is a new event that can be sent to the plugin.

Now, is there a reason that should make a host assume that the new event actually happened in block 0 instead of block 1?

In my mind this would have to be a timestamp that the host did not create itself.

E.g. the dwParam2 timestamp that I never used for some reason. Or some measurement that tells you the event took place earlier than one might think.
Last edited by beginzone on Sun Jul 12, 2020 7:08 am, edited 6 times in total.

Post

beginzone wrote: Sat Jul 11, 2020 10:38 pm
mystran wrote: Sat Jul 11, 2020 12:04 pm edit: The nice thing about this kind of scheme is that it can deal with inconsistent data. If the user is turning a MIDI knob or adjusting a GUI parameter on the fly, you don't really know when the next event is going to arrive, but you can try to make some sort of a heuristic estimate
Why does the plugin look into the future?
And what kind of issue causes the inconsistent data?

That is what I tried to refer before.
I went for a walk outside. Now I suddenly understand it?

It's not really about the future here, right? It's only a prediction: there is no event at all for a certain block, but the plugin estimates that the user is still turning the knob.

E.g. with 32 samples block size:

block 4 has event 1
block 5 has none
block 6 has event 2

The plugin extrapolates the slope that event 1 caused, so there is now estimated data for block 5, too.
And *that* estimate could overlap with event 2 (from block 6), and your system handles the overlap with the extrapolated guess. That is why there is any overlapping at all, right?

I had totally failed to understand this. Hopefully I get it now? I did not think your linear thing would be a parameter smoother on the plugin side.

I almost always compose and never turn knobs, so I only ever thought about my wish for a perfectly transmitted signal (for note-expression vibrato etc.).

Now I think that, although there might in theory be a transmission system (the runtime library idea) that cares for and allows both scenarios... the two things we were talking about are really different.

That is why it's so hard to combine. Maybe it's wrong to try to combine a [planning thing for note expressions] with a [live-performance thing]. Maybe it should be two separate sets of functions.

// LIVE PARAMETERS
// *plugin* does its own smoothing, maybe with heuristics

a_float = plugin_api::get_next_live_signal_float_if_existing_otherwise_nan()
plugin.parameter_live_performance_smoother(a_float)

versus

// NOTE EXPRESSION
// note expression *helper* helps to reconstruct a perfect signal

a_float = plugin_api::get_next_note_expression_signal_float()
plugin.note_pitch_input(a_float)

I just wonder whether the transmitted raw data would be the same.
And if the transmission structures (event structures) for the different functions would be the same.
And what the plugin and host would handshake via opcodes so that they agree on whether it's about live parameters or note expression.

Post

I'll stop spamming this thread, and will keep editing this post until one of you is so kind as to answer. :D

Please, pick any completely different topic you wish and post it here (or in the DSP forum, where I'm quieter). I didn't intend to take over or hijack any topic or ongoing discussion. Also, maybe the whole automation thing is either already exactly right in Opi, or not that problematic at all.

There may be many errors and flaws in the following text, but I typed up an absolutely basic idea of a set of helpers to be used as inlined functions, for those plugin developers who don't wish to use an event structure directly. If it's a slow approach (code path issues?), please tell me. I'm only acting as a non-expert programmer here, more like an enthusiast.

Just to keep some discussion going on (please say something :oops:):

--------

Helper methods. Only ONE of them should be used, depending on the situation. There will not be so many in the end; no helper hell intended.

Some helpers might make no sense (e.g. helping to feed the plugin for heuristics). And others might need an opcode handshake between plugin and host, to use a different event structure.

automation_helper.h

In the base API there is everything needed to access OpiEvent directly.
This is a helper file for those who don't want to do that themselves.

"Planned" == Better for note expression
"Live" == Better for knob tweaking
"Planned & Live" == Better for per-key-aftertouch
"Raw" == Your plugin gets a raw full-rate automation buffer of floats. Can be slow.

Helper_1: Live.
Plugin does smoothing heuristically. The plugin calls this helper to get a [FLOAT or NAN], and the helper advances to the next frame.

Helper_2: Live.
Helper does smoothing heuristically. The plugin calls this helper to get a heuristically improved signal, one [VALID FLOAT] per frame, with advance.

Helper_3: Planned.
Helper does smoothing heuristically. The plugin calls this helper to get a heuristically improved signal, one [VALID FLOAT] per frame, with advance.

Helper_4: Planned & Live.
A per-key aftertouch situation. It's meant to be used for an almost perfect signal, but the helper also *supports* heuristics.

Helper_5: Planned & Live. A per-key aftertouch situation. It's meant to be used for an almost perfect signal, but the helper also *does* the heuristics.

Helper_6: Live.
Helper does smoothing with latency and a filter of the form input*x + old_value*(1-x).

Helper_7: Raw. Full-rate automation signal floats buffer.
Plugin does smoothing itself.

// These will get renamed when the list of helpers is finished.
float helper_1(state_t*);
float helper_2(state_t*);
float helper_3(state_t*);
float helper_4(state_t*);
float helper_5(state_t*);
float helper_6(state_t*);
float helper_7(state_t*);
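And just to show what I mean by Helper_6: the filter input*x + old_value*(1-x) is the usual one-pole smoother. A minimal sketch (names are mine):

```c
/* One-pole smoother: y = x*input + (1-x)*y, as in Helper_6.
   x in (0, 1]: larger x reacts faster, smaller x smooths more. */
typedef struct {
    float y; /* smoothed output (state) */
    float x; /* smoothing coefficient   */
} onepole_t;

static float onepole_tick(onepole_t *f, float input)
{
    f->y = f->x * input + (1.0f - f->x) * f->y;
    return f->y;
}
```

The "latency" I mentioned would come from the helper deliberately lagging the input so the filter settles; the filter itself adds no buffering.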

Post

What problem exactly do you want to solve with this overengineering? It makes no sense to me at all.

Post

Vokbuz wrote: Sun Jul 12, 2020 3:54 pm What problem exactly do you want to solve with this overengineering? It makes no sense to me at all.
The problem, as it appeared to me, was that I didn't know how to use

struct OpiEventAutomation
{
uint32_t type; // OpiEventType
uint32_t delta; // delta frames from start of block

uint32_t paramIndex;
float targetValue;

uint32_t smoothFrames; // frames to interpolate over (0 = snap)
};

to automate e.g. the pitch of a note, to get vibrato. Or, maybe, to have greater control over the resulting signal in general.

May I ask: do you know what events to generate to transmit a signal like pitch or cutoff, or something like this, with high precision to the plugin?

I honestly had problems grasping what the host has to send and what the plugin has to do with it. Maybe I was trying to move away from VST2 for no good reason. I'm more into composing, and my view of the plugin format was different.

Post

I should be more precise.

I didn't know the guarantees:
Is targetValue guaranteed to be reached?
Is a snap high on CPU? Or, how many snaps are okay.
Is it a problem that, for "starting an envelope segment", you need two events?
Is smoothFrames allowed to be greater than block_size-delta? Edit: I guess so.
Is the plugin allowed to do a heuristic approach with the events, or can this be disabled? (As it is not needed in certain situations).

I'm not against the OpiEventAutomation. I just thought, I'll tell you what's in my mind, and how I "might abuse" the structure. And I'm indeed a bit of a "wants a safe place" programmer, but just because in the past I had some problems with VST when making music.

Post

beginzone wrote: Sun Jul 12, 2020 5:26 pm Is targetValue guaranteed to be reached?
No. Target value is simply the desired target value. If the plugin smooths the parameter and another event happens before target value is reached, then one should continue smoothing towards the new value instead.
Is a snap high on CPU? Or, how many snaps are okay.
By "snap" I just mean "don't interpolate" which is really as cheap as parameter changes can get.
Is it a problem that, for "starting an envelope segment", you need two events?
You don't. You start from whatever the previous value in the plugin was.
Is smoothFrames allowed to be greater than block_size-delta? Edit: I guess so.
Yes.
Is the plugin allowed to do a heuristic approach with the events, or can this be disabled? (As it is not needed in certain situations).
The plugin is allowed to do whatever it wants. The question, though, is whether there should be some hint for the host to indicate that a heuristic should be applied.
I'm not against the OpiEventAutomation. I just thought, I'll tell you what's in my mind, and how I "might abuse" the structure. And I'm indeed a bit of a "wants a safe place" programmer, but just because in the past I had some problems with VST when making music.
The "safe place" should probably not be on the ABI level, but rather at the level of a higher level wrapper or (preferably) framework code.

Post

mystran wrote: Sun Jul 12, 2020 5:40 pm
beginzone wrote: Sun Jul 12, 2020 5:26 pm Is targetValue guaranteed to be reached?
No. Target value is simply the desired target value. If the plugin smooths the parameter and another event happens before target value is reached, then one should continue smoothing towards the new value instead.
Hi mystran :)
There's another thing I'm not sure about. Is a plugin allowed to not reach the targetValue, even when it seemingly should? E.g. block_size=100, smoothFrames=50, at delta 20.

In this case, every plugin should have reached targetValue at frame 70, right? Or may it use slower smoothing for some reason?
