Polyphonic Guitar to MIDI VST/AU "MIDI Guitar" - BETA TEST

VST, AU, AAX, CLAP, etc. Plugin Virtual Instruments Discussion
Post

tboneUS wrote:
I too look forward to strumming. Strumming has been the one aspect of midi guitar systems where technology has failed.
Happy New Year!
Yes, it is impressive software, especially on iPhone, where it does the same job as Roland's boxes.

It has low latency; however, not all notes are detected. As I recall, Sonuus (makers of the G2M) stated that latency is easily observable but less important than dynamic range and precise detection of notes.

My experience so far is that 6-, 5-, and 4-note chords do not register all notes most of the time. The demos I saw hide this by playing monophonic pieces and by layering the primary guitar sound over the MIDI sounds.

It would be a good usability test if somebody could record MIDI notes while strumming chords and compare the result with a hexaphonic system. Or simply play chords without the primary guitar sound.

Post

Reaper Ver 4.31 /i386 rev 7d61bf (Nov 23 2012)
Midi Guitar 0.5.2 BETA VST (32 bit version - I have not tried the 64 bit version)

Pitch Prediction - Not Saving when Default recognition selected
Sensitivity - Saves fine (saved after setting one or more of the Sensitivity sliders on the sensitivity page; note it loads the default value of 25 if the recognition preset was set to Electric Guitar, Default)
Pitch Shift - Saves Fine
Fixed Velocity - Saves Fine
Output Volume - Saves Fine
Output Type - Saves Fine
Midi Status - not saving the chosen status when closing the project. On further testing I noticed that even just closing the window reverted the setting back to the enabled state.
Instrument - saves fine



Sensitivity Page - All save Fine

At first I did not notice that Sensitivity in the recognition section is linked to the main sensitivity slider on the sensitivity page, so it was defaulting to the value set by the default preset.

-------------

I primarily use Ableton, so I carried out a few more tests than I did for Reaper.

Ableton Live Ver 8.3.4 Build 2012-08-30 b858ac4acd 32 bit

Pitch Prediction - Not Saving when Default recognition selected
Sensitivity - Saves fine (saved after setting one or more of the Sensitivity sliders on the sensitivity page; note it loads the default value of 25 if the recognition preset was set to Electric Guitar, Default)
Pitch Shift - Saves Fine
Fixed Velocity - Saves Fine
Output Volume - Saves Fine
Output Type - Saves Fine
Midi Status - not saving the chosen status when closing the project. On further testing I noticed that even just closing the window reverted the setting back to the enabled state.
Instrument - saves the status fine (i.e. no instrument or test piano)



Sensitivity Page - All save Fine

At first I did not notice that Sensitivity in the recognition section is linked to the main sensitivity slider on the sensitivity page, so it was defaulting to the value set by the default preset.

Not sure what happened here; it appears to be fine, yet projects I created over the last few days did not appear to be saving the settings… After further investigation I found that changing the settings in Midi Guitar alone was not enough to save them; an additional action in Live, even just clicking mute on a channel, was needed for the values to be recalled correctly.

OK, so after much testing… it appears to save fine when using a custom recognition sensitivity setting, but when using Electric Guitar, Default, the plugin reverts to default values for Mono/Poly and Pitch Prediction on load.

Also, when saving a project that has a VST loaded as the instrument, the plugin does not recall the VST used and instead defaults to No Instrument.

In Ableton I saved the plugin as an Audio Effects Rack. The first time I called up a new instance of it, the settings were set to the default values and the instrument text box read "Loading Test Piano"; closing and then reopening the VST window displayed the correct values in the plugin.

I noticed I could not rename a recognition preset: when clicking Rename, the text box does not take focus, so the user cannot type a new name.

Also, maybe something had to initialize before saving and recall worked… or maybe it was just me, lol.

Are there any plans to expose the plugin's parameters so they can be used as macro controls in Ableton? Currently no values are shown when clicking Unfold Device Parameters.

Hope this is of some use, keep up the good work
"It's alright to say things can only get better,
You haven't lost your brand new sweater.
Pure new wool, and perfect stitches,
Not the type of jumper that makes you itches."

Sultans of Ping

Post

JamOrigin wrote:
tboneUS wrote: Another thing I found was that loading Kontakt into the plugin, rather than outputting to the host and loading Kontakt there, seemed to reduce the latency even further... Any thoughts or feedback?
I have heard others report the same feeling, but I'm still puzzled about this. MIDI Guitar processes its hosted synth directly in the audio thread with no delay frames, so that's optimal for latency, but I would think that any DAW would do that. Does anybody understand why a DAW would introduce artificial latency here?
This is normal and should be true for any instrument plug-in. It's because live MIDI events are always sent too late, because the host is already rendering the current buffer when your note comes in, so it has to wait until at least the next process call cycle to send your MIDI event to the instrument(s). But if you host the instrument in the same plug-in that's doing the detection, you save (at least) one buffer's worth of latency because you can render onto the output buffer in the same process call that did the detection of the input event.

MIDI is never in the same time as you play it when playing a virtual instrument live on a DAW. However, on playback the host may choose to correct the timing of the MIDI events back to when they actually happened. (Each event has a deltaFrames value, describing how far in sample time into the NEXT buffer process cycle the note should go on/off at. So pre-recorded notes can happen anywhere. Live notes however can never get out in time because it's too late to add them to the vstEvents list for the buffer that currently already has process call(s) rendering into it.)

This is true with all MIDI processors on DAWs. They always add one buffer of latency. So yes, for best real time performance, using MIDI-Guitar as a plug-in host itself will provide the lowest latency.
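A minimal sketch of the buffer arithmetic described above, using hypothetical function names (this is a toy model, not the actual VST API):

```cpp
#include <cassert>

// Toy model of the scheduling described above (hypothetical function
// names, not the VST API). Buffers are numbered process cycles.

// A live MIDI event arriving while buffer N is being rendered can only be
// handed to a separately hosted instrument at buffer N + 1, because
// processEvents() for buffer N has already been called.
int bufferHeardViaHost(int arrivalBuffer) {
    return arrivalBuffer + 1;  // queued for the next process cycle
}

// A synth hosted inside the detecting plug-in runs in the same process()
// call, so the detected note can be rendered into the current buffer.
int bufferHeardViaInternalHosting(int arrivalBuffer) {
    return arrivalBuffer;
}
```

In other words, hosting the synth inside the detecting plug-in saves exactly one buffer of scheduling delay, on top of whatever driver latency both paths share.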

Post

AdmiralQuality wrote:
This is true with all MIDI processors on DAWs. They always add one buffer of latency. So yes, for best real time performance, using MIDI-Guitar as a plug-in host itself will provide the lowest latency.
Interesting, thanks Admiral

Post

I have done some more testing while playing with the plugin, and it looks like if you group the plugin as an effects rack in Live, it will use the settings from the recognition profile. Therefore, if you add a new instance of the effects rack just created and alter the settings, Ableton will set the first effects rack's values to the same as this second instance.

Ideally, I think it would be nice to just set the values in the Ableton effects rack, so that you could create multiple instances that are easily recalled using Live's browser rather than having to store a separate recognition template for different setups. This would be especially useful during testing of the plugin, where it is good to compare one set of settings to another.

Post

skiffcapt wrote:Happy New Year to all!

When I first joined the MIDI Guitar beta several weeks ago I was using it with a beautiful cello from a popular orchestral VSTi, and my wife burst into the room and commented that I sounded like Beethoven! Best compliment ever! :oops:

Just wanted to report in that the x64 version of the MIDI Guitar 0.5.2 VST is working VERY well for me within Sonar X2 x64.

The simple Sonar specific configuration detailed in this thread and on the MIDI Guitar website worked perfectly, by the way.

I have so much more room on my desk now that I got rid of that cursed MIDI keyboard spanning across it. I was doing my best pretending but really struggled with the keys. I am a guitarist!

MIDI Guitar is a dream come true for me. Thanks JamOrigin and continued success in the new year!
Awesome! Thanks, skiffcapt!
We didn't expect to hear from so many users that wiping out keyboards, guitar converter boxes, and cables is a selling point for MIDI Guitar. Maybe we should rebrand it as "MIDI Guitar - The Desktop Cleanup Utility" :)

AdmiralQuality wrote:
This is true with all MIDI processors on DAWs. They always add one buffer of latency. So yes, for best real time performance, using MIDI-Guitar as a plug-in host itself will provide the lowest latency.
Yes, we thought of this, and it's part of the motivation for hosting synths inside MIDI Guitar. It's clearly true for keyboard events etc., which come from an external process, but I fail to understand how DAWs can delay MIDI in a plugin chain where MIDI Guitar produces MIDI in the audio thread and the synth consumes it in the same thread.

I mean, if MIDI Guitar can avoid this latency, other DAWs can surely do it as well. After all, you can see MIDI Guitar as a tiny DAW hosting one plugin.

tboneUS wrote: [extensive settings-saving test report for Reaper 4.31 and Ableton Live 8.3.4, quoted in full above]
Thanks for the extensive testing, tboneUS!

Part of the confusion comes from the fact that we thought people should not modify/overwrite the default presets, because we have seen some users get lost with bad settings. I realize this is not a good choice, and that we should save all settings that are modified. We can still prevent users from modifying the advanced pitch sensitivity settings of the factory presets.

We will go carefully through this in the next update. Thanks for reporting.

We want to expose the parameters and also add MIDI Learn functionality.

ironhead wrote: [chord-detection and strumming observations, quoted in full above]
Please bear in mind that the mobile app version is much inferior to the VST/AU/standalone at this point.
Last edited by JamOrigin on Wed Jan 02, 2013 10:26 pm, edited 1 time in total.
JamOrigin.com

Like us on Facebook.com/JamOrigin and follow us on Twitter @JamOrigin

Post

JamOrigin wrote:
Yes, we thought of this, and it's part of the motivation for hosting synths inside MIDI Guitar. It's clearly true for keyboard events etc., which come from an external process, but I fail to understand how DAWs can delay MIDI in a plugin chain where MIDI Guitar produces MIDI in the audio thread and the synth consumes it in the same thread.
Because real-time is an illusion in computer DAWs with perceptible latency. So it's not really the MIDI that's delayed, rather, the audio. A host under load is spending most of its CPU time running all the various processes (its own as well as plug-ins) on various audio buffers. These need to be called in the order they're patched. So chances are, when a MIDI event (or any other real-world event input) comes in, the processes are already running producing the next buffer that you're going to hear. (You're currently hearing the previous buffer that was already rendered a few milliseconds ago, right?) And as these processes are already going, that means that processEvents has ALREADY been called for them, handing them the lists of events that are supposed to occur during this buffer, and what deltaFrames sample offset into this buffer they happen on.

So, as what you're hearing NOW was, in fact, constructed a short time ago, it's ALWAYS too late to apply any real-time input.

And that problem is compounded when a plug-in outputs MIDI events, because any events it outputs with sendVstEventsToHost() will not get sent by the host calling the other plugins' processEvents() until the current chain of process calls is complete. (Because once process() is running, processEvents() can't be called.)

But because a plug-in hosted within another plug-in doesn't need to wait for the host to send the events (because it implements the call to processEvents() itself rather than routing through the host), it can avoid this extra latency. Of course it is still subject to the audio input latency, but at least the "effect" can come out in the same buffer process that detected the "cause", rather than having to be scheduled for the next round of process() calls.

I mean if MIDI Guitar can avoid this latency other DAWs can surely do it as well. After all you can see MIDI Guitar as a tiny DAW hosting one plugin.
Hosting one plug-in is a far simpler problem than hosting many that can be patched in all kinds of different topologies. (Feedback loops are another classic issue where this stuff becomes exposed.) And very few hosts have any audio-to-MIDI functionality anyway, so it's not that MIDI-Guitar is doing a better job than ALL other hosts (which all have to deal with the audio driver's latency using roughly the same strategy); it's just doing a different and simpler job.

But also note that MIDI only needs to be late when it's played LIVE. Once recorded, the deltaFrames can be adjusted to push events back to the actual time they were recorded. This can be a good or a bad thing, depending on how you look at it and how bad your latency is. Some hosts (like Reaper) let you control this behavior. Does MIDI play back as you HEARD it, or as you PLAYED it? Neither answer is really correct. Most hosts don't even give you the choice.

But yes, you might want to consider sending the MIDI events to the host with NEGATIVE deltaFrames values, so it can correct their timing on playback. (Are you sending deltaFrames at all or does every event conceptually happen only at the start of each buffer?) But again, that can only help make playback more accurate. Events created LIVE, from a keyboard controller, key or mouse click, etc, HAVE to be late. Events generated from analyzing audio have this same problem, PLUS the input audio latency.
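A rough sketch of the timestamp bookkeeping behind that suggestion; the struct and names here are hypothetical, and the real VstMidiEvent layout differs:

```cpp
#include <cassert>

// Hypothetical sketch of the timing bookkeeping described above. An event
// detected at absolute sample `detectedAt` is delivered in the buffer
// starting at `deliveryBufferStart`; its deltaFrames offset is relative to
// that buffer, so restoring the played timing means allowing a negative value.
struct MidiEvent {
    int deltaFrames;  // sample offset into the delivery buffer
};

MidiEvent stampEvent(long detectedAt, long deliveryBufferStart) {
    MidiEvent e;
    // Negative when the note actually happened before the buffer it is
    // delivered in; a host that honors this can correct playback timing.
    e.deltaFrames = static_cast<int>(detectedAt - deliveryBufferStart);
    return e;
}

long playbackSample(const MidiEvent& e, long deliveryBufferStart) {
    return deliveryBufferStart + e.deltaFrames;  // recovers detectedAt
}
```

For example, an event detected at sample 1000 but only deliverable in the buffer starting at sample 1024 would carry deltaFrames = -24, letting a cooperating host put it back at sample 1000 on playback.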

Post

I did a little video (my first one! LoL) of MidiGuitar. Enjoy, and have patience with my English and some sloppy playing :-)

http://youtu.be/u8EjROwrqAY

Mavid

Post

AdmiralQuality wrote:
[detailed latency explanation, quoted in full above]
Thanks AdmiralQuality for another deep and insightful post!

My view is based on seeing a network of VST plugins in a DAW patch (without feedback loops) as a canonical example of a topologically sorted directed acyclic graph, where each plugin can be processed in sequence based on its dependencies. I.e., this is very simple to schedule, and it is processable in one audio frame, in one thread, regardless of whether the plugins produce/consume audio or MIDI.

I can only think of two minor problems with this view:
1. External messages come in one frame late, as you describe. MIDI Guitar's MIDI messages are internal to the graph, so they are not late in this sense.
2. If there are feedback loops it gets hairy, and obviously some messages will have to be delayed, but most often there aren't any, and it's easy enough for the DAW to determine when there are loops.

I must admit I'm not into the VST API enough to say whether processEvents() or sendVstEventsToHost() cause delays. I'm under the impression they are called in the audio thread just before and after the main process() call, in which case they would be "extensions" of the main process() and could be chained optimally.

Does anybody know of DAWs that are particularly optimized for low-latency processing, or DAWs that are known to be bad and cause delays?
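The DAG view above can be sketched with a topological sort (Kahn's algorithm); the graph and names below are illustrative, not taken from any real host:

```cpp
#include <cassert>
#include <queue>
#include <utility>
#include <vector>

// Sketch of the scheduling view above: a feedback-free plug-in patch is a
// DAG, so the host can topologically sort it (Kahn's algorithm) and run
// every plug-in once per audio frame, MIDI producers before their consumers.
std::vector<int> processingOrder(int n,
                                 const std::vector<std::pair<int,int>>& edges) {
    std::vector<std::vector<int>> adj(n);
    std::vector<int> indeg(n, 0);
    for (const auto& e : edges) {
        adj[e.first].push_back(e.second);
        ++indeg[e.second];
    }
    std::queue<int> ready;
    for (int i = 0; i < n; ++i)
        if (indeg[i] == 0) ready.push(i);
    std::vector<int> order;
    while (!ready.empty()) {
        int p = ready.front(); ready.pop();
        order.push_back(p);  // process() this plug-in now
        for (int q : adj[p])
            if (--indeg[q] == 0) ready.push(q);
    }
    return order;  // shorter than n if the patch contains a feedback loop
}
```

With nodes 0 = MIDI Guitar, 1 = synth, 2 = mixer and edges 0→1→2, the order comes out 0, 1, 2, so MIDI produced by node 0 can be consumed by node 1 within the same frame. A returned order shorter than the node count signals a feedback loop, the hairy case noted above.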

Post

Mavid wrote: I did a little video (my first one! LoL) of MidiGuitar. Enjoy, and have patience with my English and some sloppy playing :-)

http://www.youtube.com/watch?v=u8EjROwrqAY

Mavid
Thanks for posting this video, Mavid!
I think it's one of the best introductions to MIDI Guitar so far :)

Post

JamOrigin wrote: I must admit I'm not into the VST API enough to say whether processEvents() or sendVstEventsToHost() cause delays. I'm under the impression they are called in the audio thread just before and after the main process() call, in which case they would be "extensions" of the main process() and could be chained optimally.
It's really the same in all plug-in APIs. It kind of has to be; it's the only way to deal with the latency imposed by the audio drivers. (Though I do look forward to the day when we can achieve 1-sample latency! A dedicated real-time OS could do this no problem, but Windows and OS X aren't designed for real-time applications, though we can come close enough for rock 'n' roll.)

And yes, processEvents() should be called immediately before process(). I suppose it could be possible to make a host that analyzed the current patching topology, particularly in terms of the MIDI event inputs and outputs, and gave downstream effects the chance to receive events generated by upstream effects within the same render cycle. But that's complicated and wouldn't work in every case (and again, I doubt they were thinking about audio analysis to MIDI event processors like this one). So, as most MIDI events come from hardware controllers, it makes sense if all the processEvents() calls to every plug-in, regardless of patching order, represent the same snapshot in time. Even though we know they are processed sequentially, we pretend they're all happening at the same time.

And even after years of doing this, I'm not sure of the peculiarities of each host (and each version of each host) in this regard. You can figure it out through experimentation, but it's a rather tedious process. (Some behaviors can be revealed by adjusting your hardware driver settings to use a long buffer; then you might be able to clearly hear what the host is doing to compensate.)
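The "same snapshot in time" convention described above can be sketched as a toy render cycle (hypothetical types, not the real VST API):

```cpp
#include <cassert>
#include <vector>

// Toy render cycle illustrating the "same snapshot in time" convention
// described above (hypothetical types, not the real VST API): every
// plug-in receives the SAME pending-event list for a buffer cycle, then
// process() runs in patch order. Events a plug-in emits during process()
// are collected by the host and only delivered in the next cycle.
struct Plugin {
    std::vector<int> received;  // events seen this cycle
    std::vector<int> emitted;   // events this plug-in produces in process()
    void processEvents(const std::vector<int>& ev) { received = ev; }
    std::vector<int> process() { return emitted; }
};

// Runs one cycle; returns the events to be delivered in the NEXT cycle.
std::vector<int> renderCycle(std::vector<Plugin>& chain,
                             const std::vector<int>& pending) {
    for (auto& p : chain) p.processEvents(pending);  // same snapshot for all
    std::vector<int> next;
    for (auto& p : chain)
        for (int e : p.process()) next.push_back(e);  // held until next cycle
    return next;
}
```

Even a plug-in patched directly after an event-emitting one sees only last cycle's snapshot, which is exactly the one-buffer delay for host-routed MIDI discussed earlier in the thread.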

Post

JamOrigin wrote:
Mavid wrote:I did a little video (my first one! LoL) of the MidiGuitar. Enjoy and have patience with my english and some sloppy playing :-)

http://www.youtube.com/watch?v=u8EjROwrqAY

Mavid
Thanks for posting this video, Mavid!
I think it's one of the best introductions to MIDI Guitar so far :)

Wow! Well done!

Post

Thanks :-)

Post

I just took the plunge on this. Every time I think I have found a flaw, I either find out how to deal with it or find out it is in hand for the next update.

All I can say is: Fishman... thanks for piggin' about so I had time to find this, and thanks for giving me a reason to sell my Roland in the first place.

I have always found piano to be the hardest thing to play on synth guitar, and for that this works better than my GK3 + Axon AX50, Roland GI-20, or Roland GR-33.

Post

I've really been enjoying this plug-in, though I do have one issue with the latest version. For some reason the MIDI Guitar plug-in GUI will always cover the GUI of the plug-in it's hosting. I use it with Omnisphere in Ableton Live 9 (beta), and the Omni GUI is always covered up by MIDI Guitar's. There's no way to force it to the front of the open windows, and if I close the MIDI Guitar GUI, Omni's closes as well.

Other than that, the only real issue I have is the weird glitches and distortion spikes at the pitch detection 3 setting. That one works the best by far, but I still get the odd glitch at any latency setting (Lynx Hilo).

Still, this has revolutionized how I work with my DAW; after 20+ years of playing guitar, there's no more needing to fake playing piano just to play software synths :)
