Key press stolen in FL / Studio One

DSP, Plugin and Host development discussion.

Post

Tale wrote:
mystran wrote:There's a "vendorSpecific" (not really, it's common to all hosts) in VST to handle mouse wheel: check for index == 0x73744341 && value == 0x57686565, then "opt" parameter will be -1 or 1 depending on direction.

edit:
Since I'm not sure if I've changed the parameter names, here's the prototype:

VstIntPtr MyPlug::vendorSpecific(VstInt32 index, VstIntPtr value, void* ptr, float opt);
Well, all hosts... REAPER doesn't seem to support it. But indeed FL Studio and VSTHost do, so thanks for the info. :)
Well "all hosts" in the sense of "not really vendor specific".

Post

mystran wrote:Well "all hosts" in the sense of "not really vendor specific".
I found this, and now I understand what you mean:
http://www.asseca.com/vst-24-specs/efVe ... cific.html

Do we know which hosts do/don't support this? Here is my shortlist so far:

Hosts that support effVendorSpecific mouse wheel
  • FL Studio
  • VSTHost
  • Studio One
Hosts that don't
  • REAPER*
  • Mixcraft
* In REAPER the plug-in window does seem to receive WM_MOUSEWHEEL messages without setting keyboard focus

Post

What I personally do is synthesize WM_MOUSEWHEEL messages upon receiving the effVendorSpecific mouse-wheel events, so if a host such as Reaper sends WM_MOUSEWHEEL messages instead, I won't notice any difference. :P
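
For illustration, a minimal sketch of that approach (editorHwnd and handleVendorWheel are hypothetical names, not code from this thread): on the effVendorSpecific wheel callback, post an equivalent WM_MOUSEWHEEL to the editor window so both delivery paths end up in the same handler.

Code:

// Sketch only: turn the effVendorSpecific wheel callback into a real
// WM_MOUSEWHEEL aimed at our own editor window, so hosts that use the
// vendor-specific call and hosts that send WM_MOUSEWHEEL look identical.
// 'editorHwnd' is assumed to be the plugin editor's top-level HWND.
void handleVendorWheel(HWND editorHwnd, float direction) // direction = the 'opt' value, +1 or -1
{
    POINT pt;
    GetCursorPos(&pt); // WM_MOUSEWHEEL carries screen coordinates
    WPARAM wParam = MAKEWPARAM(0, (short)(direction * WHEEL_DELTA));
    LPARAM lParam = MAKELPARAM(pt.x, pt.y);
    PostMessage(editorHwnd, WM_MOUSEWHEEL, wParam, lParam);
}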

But anyway, the situation kinda sucks.

Post

mystran wrote:There's a "vendorSpecific" (not really, it's common to all hosts) in VST to handle mouse wheel: check for index == 0x73744341 && value == 0x57686565, then "opt" parameter will be -1 or 1 depending on direction.
The reason for the funny numbers comes out better if you use the constants 'stCA' and 'Whee' instead.
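
For reference, a sketch of that check written with the readable constants (handleMouseWheel is a hypothetical plugin method; the helper macro is spelled out here rather than assumed from the SDK):

Code:

// Sketch: the same effVendorSpecific check as above, using multi-character
// constants so the magic numbers become readable.
#ifndef CCONST // builds 'stCA' / 'Whee' as 32-bit values
#define CCONST(a, b, c, d) \
    ((((VstInt32)(a)) << 24) | (((VstInt32)(b)) << 16) | \
     (((VstInt32)(c)) << 8)  |  ((VstInt32)(d)))
#endif

VstIntPtr MyPlug::vendorSpecific(VstInt32 index, VstIntPtr value, void* ptr, float opt)
{
    if (index == CCONST('s', 't', 'C', 'A') &&  // 0x73744341
        value == CCONST('W', 'h', 'e', 'e'))    // 0x57686565
    {
        handleMouseWheel(opt); // opt is +1 or -1, the wheel direction
        return 1;              // report the event as handled
    }
    return AudioEffectX::vendorSpecific(index, value, ptr, opt);
}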
"Until you spread your wings, you'll have no idea how far you can walk." Image

Post

Waking up this topic.
I stumbled upon the same issue, from the host perspective: when the plugin editor takes focus, pressing e.g. spacebar to start/stop the host doesn't work. It should work like this: when the VST editor has focus (e.g. for mouse wheel or certain VST-editor-specific key actions) and the user presses e.g. spacebar, and spacebar is not consumed by the VST editor, then DefWindowProc should forward that key event to the parent window, so it reaches the host window where the host can process it properly.
But as far as I could see, DefWindowProc does not forward unprocessed key events to the parent window. That's the essential issue, and I think it's a conceptual mistake in Windows. Please correct me if I'm wrong about this.
To solve this, I think each plugin on Windows should do this quite simple trick in its window procedure (pseudo code):

Code:

case WM_KEYDOWN:
case WM_KEYUP:
  if (ProcessKey(Msg, wParam, lParam)) {
    // The plugin editor consumed the key itself.
    return 0;
  } else {
    // Not consumed: hand the key message to the host-provided window
    // (the VST editor's 'systemWindow') by invoking its window procedure,
    // so the host can still act on it (e.g. spacebar for start/stop).
    HWND host_hwnd = (HWND)AEffEditor->systemWindow;
    return CallWindowProc((WNDPROC)GetWindowLongPtr(host_hwnd, GWLP_WNDPROC),
                          host_hwnd, Msg, wParam, lParam);
  }
Right?

Post

mutools wrote:When the VST editor has focus (e.g. for mouse wheel or certain VST-editor-specific key actions) and the user presses e.g. spacebar, and spacebar is not consumed by the VST editor, then DefWindowProc should forward that key event to the parent window, so it reaches the host window where the host can process it properly.
But as far as I could see, DefWindowProc does not forward unprocessed key events to the parent window. That's the essential issue, and I think it's a conceptual mistake in Windows.
Unfortunately it's not very common for hosts to properly forward keyboard events to where they actually handle them (because Windows is not really designed to do this normally, which is obviously a mistake, but what can you do) if you simply pass them to the plugin editor's parent, so this doesn't work most of the time.

You also won't ever see messages that the host has set as "shortcuts" using the normal accelerator API, as these get intercepted from the message loop before normal dispatch. In other words, if your plugin needs reliable full keyboard handling (when focused), you're stuck using a keyboard hook (ugly) or running the editor in a separate thread (which I wouldn't recommend trying).
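
For completeness, a minimal sketch of the keyboard-hook option (PluginWantsKey is a hypothetical check; this assumes a per-thread WH_KEYBOARD hook installed on the GUI thread):

Code:

// Sketch of the "keyboard hook" route: a thread-local WH_KEYBOARD hook sees
// key messages before the host's TranslateAccelerator can swallow them.
static HHOOK g_keyHook = NULL;

static LRESULT CALLBACK KeyHookProc(int code, WPARAM wParam, LPARAM lParam)
{
    // wParam = virtual-key code, lParam = repeat count / scan code / flags.
    if (code == HC_ACTION && PluginWantsKey((UINT)wParam, lParam)) // hypothetical focus check
        return 1; // non-zero eats the key so it never reaches the host
    return CallNextHookEx(g_keyHook, code, wParam, lParam);
}

void InstallEditorKeyHook(void)
{
    // Hook only the current (GUI) thread, not the whole system.
    g_keyHook = SetWindowsHookEx(WH_KEYBOARD, KeyHookProc, NULL, GetCurrentThreadId());
}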

Post

mystran wrote:Unfortunately it's not very common for hosts to properly forward keyboard events to where they actually handle them
I meant forwarding from the plugin to the host. I'm looking at this from the host's perspective.
if you simply pass them to the plugin editor's parent, so this doesn't work most of the time.
The little code section I added is not simply forwarding it to the plugin editor's parent window, but explicitly to the plugin editor's host window, which guarantees that the host will receive any unprocessed key messages. I noticed that some plugins attach their editor directly to the host window, some create a child window and do the GUI in that child window, and some use a more complex windowing approach, e.g. with multiple windows alongside the main plugin window. Using the above code (i.e. the 'else' section), even unprocessed key messages from the separate plugin windows are properly forwarded to the host window. And as far as I can see, it's a clean trick.

So I'd suggest that all Windows VST plugins and GUI frameworks like JUCE, iPlug, WDL, VstGui, ... integrate this trick if that's not yet the case. (I didn't check, but it's clear that there are many Windows VST plugins that don't properly forward unused key messages.)

What do you think?

Post

mutools wrote:
mystran wrote:Unfortunately it's not very common for hosts to properly forward keyboard events to where they actually handle them
I meant forwarding from the plugin to the host. I'm looking at this from the host's perspective.
Right, and I was talking about the situation where the plugin passes the events to the host-owned parent window, which then (typically) just ignores the event, because the host would normally do its keyboard processing in some other window.

By "editor's parent window" I was specifically referring to the host-provided system window that would be the parent of the top-most editor window (ie. the window where you propose the events to be sent). Obviously if the plugin/toolkit uses "heavy-weight" controls (ie. creates additional system-level window hierarchy) then it'll need to forward further than the immediate parent of some child window with focus (or bubble up through the whole hierarchy), but that's not the point.

Rather my point is that in addition to convincing plugin developers to pass such events up to parent (owned by the host), you should then also convince host developers to expect these events there (eg. rather than intercept them directly from the message loop or handle them in a specific window somewhere that normally holds focus, or whatever else any given host happens to do). Otherwise you're essentially asking people to special case for your host.

ps. Don't get me wrong, I think it's really a nice proposal for handling this issue, but I'm not convinced it's going to work everywhere without additional support from both sides (ie. not just plugins) of the interface.

Post

mystran wrote:Right, and I was talking about the situation where the plugin passes the events to the host-owned parent window, which then (typically) just ignores the event, because the host would normally do its keyboard processing in some other window.
Really? That would amaze me from a theoretical point of view, but I didn't check it, so you may be right in practice.
Rather my point is that in addition to convincing plugin developers to pass such events up to parent (owned by the host),
I propose to call it "the plugin's host window" then, just to avoid confusion.
Each VST plugin editor has a single host window, i.e. the VST 'systemWindow'.
That's the important UI event gate towards the host.
you should then also convince host developers to expect these events there (eg. rather than intercept them directly from the message loop or handle them in a specific window somewhere that normally holds focus, or whatever else any given host happens to do).
Sure, indeed!

Personally I think that working with hooks is not the right way to solve this key-messaging issue; it's a hacky way. The neat, structured way to do key message processing is that key messages are delivered to the focused window/control and then travel upwards in the hierarchy as long as they're not processed, aka consumed. And that's where Windows lacks something. And that's why I propose to fill that gap with the 'else' clause in the above code section.

But you're right that everyone (hosts + plugins) should use a correct, uniform way.

And even then there is still a potential user issue where both the plugin editor and the host use the same key. But the good thing about the structured way (hierarchic event processing) is that the user can solve that by choosing the right focus, whereas with a hacky hook the user has no choice; it's an unsolvable conflict then. And I assume there could also be multiple hooks fighting each other... all disadvantages of a non-structured method.

Please correct me if I'm missing something.

Post

mutools wrote: And even then there is still a potential user issue where both the plugin editor and the host use the same key. But the good thing about the structured way (hierarchic event processing) is that the user can solve that by choosing the right focus, whereas with a hacky hook the user has no choice; it's an unsolvable conflict then. And I assume there could also be multiple hooks fighting each other... all disadvantages of a non-structured method.
Multiple (correctly written) hooks can generally co-exist just fine (they just get processed one by one), but I certainly agree that it'd be nice to do without (although you still need it if the host uses TranslateAccelerator or similar and you insist on getting the affected messages as well).

Either way, the whole keyboard handling in WinAPI is kinda messed up in general. Another example I can think of is how a normal "text-book" message loop would call TranslateMessage to synthesise WM_CHAR (or WM_UNICHAR) messages, but you can't really rely on these in a plugin either, so WM_KEYDOWN needs to do something like GetKeyboardState into MapVirtualKey into ToUnicode just to get basic text input (which appears to be the minimum to get dead-key composites to work; no idea what one would need to do in order to also support IME properly).

ps. the same dance is also required for WM_SYSKEYDOWN (so layouts with "Alt Gr" can work) except here you must first special case VK_F4 and VK_SPACE because these need to go to DefWindowProc instead... :P
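
To illustrate the GetKeyboardState / MapVirtualKey / ToUnicode dance described above, a sketch (not code from this thread):

Code:

// Sketch: translate a WM_KEYDOWN into Unicode text without relying on the
// host's message loop calling TranslateMessage for the plugin window.
// Returns the number of WCHARs written to buf (0 if the key produces no text).
static int KeyDownToText(WPARAM wParam, WCHAR* buf, int bufLen)
{
    BYTE state[256];
    if (!GetKeyboardState(state))
        return 0;
    UINT scan = MapVirtualKey((UINT)wParam, MAPVK_VK_TO_VSC);
    // ToUnicode also folds in any pending dead key, so composed characters work.
    int n = ToUnicode((UINT)wParam, scan, state, buf, bufLen, 0);
    return n > 0 ? n : 0;
}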

Post

mutools wrote:So I'd suggest that all Windows VST plugins and GUI frameworks like JUCE, iPlug, WDL, VstGui, ... integrate this trick if that's not yet the case. (I didn't check, but it's clear that there are many Windows VST plugins that don't properly forward unused key messages.)

What do you think?
iPlug/WDL is doing this trick. Copied the behaviour because users complained immediately. I think you can count on most plugins doing it.

Post

Thanks for your replies.
Unfortunately, many of the plugins I try do not properly forward unused key events.
The biggest victim is the user.
I don't understand why we as an industry can't close this Windows gap with a common standard rule :?
