Why can't even modern VST synths sound like 20-year-old hardware VA synths?

DSP, Plugin and Host development discussion.

Post

Hmm, latency does not have a sound. It's easy to compensate for. Then again, if one owned all this famous old hardware, it would still be up to their skills what to produce with it.

No matter the production, it is a process of crafting sound in a certain way. I have about 100 tracks from the 90s as a reference. All these tracks sound completely different, even though many of them use the same hardware.

So there's "Jump" and "The Final Countdown", but there are also thousands of other songs with a synth riff that used the same synths. It did not come down to a specific synth. It's the riffs that stand out, not the synth.

Post

excuse me please wrote: Tue Jun 02, 2020 7:46 am Hmm, latency does not have a sound.
No: if you are in the feedback loop playing in real time, latency has a feel. Some gifted players can play in real time with about any latency (pipe organs in cathedrals, or stiff, clunky tape Mellotron keys).

I am not so gifted, and latency above a certain threshold bugs the crap out of me. There is also note-on timing jitter. An uncomfortable realtime player won't play the same and will be demoralized, not real motivated to keep at it. You record a track as tight as possible, then listen back and say, "Holy hell, I'm not a flash, but I know I didn't play it that shitty!" :)

After the notes are properly in a DAW, the DAW can completely compensate and render sample-accurate timing from the MIDI track data. But that is later on, after the realtime player is no longer in the feedback loop.

Post

@JCJR
Ok, but I assume that has to do with live performances. Anyway, creating "that sound" is a very subtle process, afaik.

Post

excuse me please wrote: Tue Jun 02, 2020 10:10 am @JCJR
Ok, but I assume that has to do with live performances. Anyway, creating "that sound" is a very subtle process, afaik.
I already acknowledged that some methods of work do not rely on realtime performance, but in my way of thinking, "recording" is "live performance after you clicked the record button." So it seems a false dichotomy.

And what is wrong with live performance anyway? Would a person buy a synth that is good for some kind of quantized, laboriously after-the-fact-corrected "recording" but useless or unsatisfying for live playing? A non-trivial track would need practice: playing the thing many times, perfecting the piece before you are ready to click the record button.

For much of history, perfecting "that sound" was working out how you play the instrument, not mucking with the timbre of the instrument. Synths are great fun but ought to be musical instruments. You can make all kinds of different music with basically the same available sound palette.

Post

Hi JCJR, I'm not sure I buy that playing soft synths necessarily incurs too great a latency/jitter hit (I must add I play guitar through virtual amps all the time, but I only have limited skills on the keys, so I wouldn't come into contact with potential MIDI jitter issues so much). If it is true, then sure, hardware is the preferable option in that regard.

The article you linked to used a Windows 98 machine as the test system and refers to USB 2.0 as upcoming technology, so I'd be interested if anyone has accurate, up-to-date info on MIDI jitter on current systems.

Post

@JCJR
Oh sorry, my bad; usually I jump to this forum after a mix session, so I was completely blank when writing my comment. Of course recording a VSTi introduces latency. I usually plug notes into my piano roll, so I forgot what it's like to actually play a keyboard :oops:

That said, Mozart wrote music in his head, so I wonder if he even would use a keyboard nowadays.

Post

matt42 wrote: Wed Jun 03, 2020 8:44 am Hi JCJR, I'm not sure I buy that playing soft synths necessarily incurs too great a latency/jitter hit (I must add I play guitar through virtual amps all the time, but I only have limited skills on the keys, so I wouldn't come into contact with potential MIDI jitter issues so much). If it is true, then sure, hardware is the preferable option in that regard.

The article you linked to used a Windows 98 machine as the test system and refers to USB 2.0 as upcoming technology, so I'd be interested if anyone has accurate, up-to-date info on MIDI jitter on current systems.
Thanks Matt42. Yeah, the only reason I linked that article was the little bit about the measurement method for hardware MIDI latency and the reality-check figures of (fairly low) hardware synth latency measurements, using a simple method which should be immune to computer latency issues, and which should be just as easy to use today as at any time in the past. Dunno "generalities" about modern computer DAW MIDI jitter, though there are so many hardware/software combinations that maybe it would be impossible to generalize.
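One simple measurement method along those lines can be sketched like this (an assumption on my part, since the article isn't quoted here): record the key's mechanical click and the synth's audio output on two channels of the same recorder, then measure the offset between the two onsets. Both channels share one clock, so computer latency drops out of the measurement entirely.

```python
import numpy as np

SR = 44100  # recorder sample rate

def onset_sample(signal: np.ndarray, threshold: float = 0.5) -> int:
    """Index of the first sample reaching threshold * peak amplitude."""
    peak = np.max(np.abs(signal))
    return int(np.argmax(np.abs(signal) >= threshold * peak))

def latency_ms(click_channel: np.ndarray, synth_channel: np.ndarray) -> float:
    """Delay between the key's mechanical click and the synth's first output."""
    return 1000.0 * (onset_sample(synth_channel) - onset_sample(click_channel)) / SR

# Fabricated signals standing in for a real two-channel recording:
# key click at sample 1000, synth note starting 300 samples (~6.8 ms) later.
click = np.zeros(SR); click[1000] = 1.0
synth = np.zeros(SR); synth[1300:1500] = 0.8
print(f"measured latency: {latency_ms(click, synth):.1f} ms")
```

A simple threshold onset detector is enough here; a real measurement on noisy recordings would want something more robust, but the principle is the same.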

Maybe I wasn't clear enough, but I would generally consider synth plugins "innocent bystanders" in whatever latency and jitter issues might plague hardware/driver/DAW combinations. Unless a synth plugin has been written with a threading design which might encourage race conditions with tiny audio buffers, or maybe some softsynth designs can't really have ultra-low latency because of big FIR filters or FFT/IFFT processes. But unless the plugin is somehow shooting itself in the foot, it can't have any lower jitter or latency than allowed by the MIDI interface, MIDI and audio drivers, operating system, and however the DAW attempts to make the mess work as well as possible.

I'm mainly bitching about play-thru latency and jitter. Let's posit, for example, that a computer/interface/DAW system is capable of perfect jitter-free timestamping of every received MIDI event, and on playback is capable of sample-accurate rendering of that data. OK fine, but the playthru that the guy hears during record might have 20 ms latency and substantial jitter. If the fella is "superhuman" and can ignore that horrible timing coming out of the playthru, then maybe he could play a perfect part, which on playback would be perfectly rendered by a softsynth. But that latency and jitter "messes with the mind" of all but the most pig-headed players. The player keeps trying to adjust his timing so that the playthru notes come out with the correct feel along with the other tracks.

That "chasing the latency and jitter" during recording causes a MIDI track with genuinely imperfect timing to get recorded. And then when you hit play it sounds like shat, and the player says, "Damn, I can't believe I played it that bad." So then he will typically open up a piano roll editor and quantize and mouse-adjust all the screw-ups. But the cause of the screw-ups was the time-distortion in the playthru during recording. It causes needless frustration and demoralization, an attitude that one can't even play the simplest part without having to quantize afterwards.

Surely there are combinations of MIDI interface, audio interface, computer hardware and DAW where you can crank the audio buffers down to a dropout-free 32 or 64 samples at 44.1k or whatever. But it is an iffy enterprise. It is possible to spend the big bucks and do all the tweaks and still get stuck with more playthru latency than you wanted. A lot of trouble.
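For reference, the raw math behind those buffer sizes: one audio buffer's worth of one-way latency at 44.1 kHz. Real systems add extra driver and converter buffering on top of this, so treat these figures as lower bounds.

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int = 44100) -> float:
    """One-way latency contributed by a single audio buffer, in milliseconds."""
    return 1000.0 * buffer_samples / sample_rate_hz

for buf in (32, 64, 128, 256, 512):
    print(f"{buf:>4} samples -> {buffer_latency_ms(buf):5.2f} ms")
# 32 -> 0.73 ms, 64 -> 1.45 ms, 128 -> 2.90 ms, 256 -> 5.80 ms, 512 -> 11.61 ms
```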

OTOH it is purt easy to get everything EXCEPT playthru working GREAT. I've had "entirely satisfactory multitrack systems" for everything except playthru since the late 1990s, with a PPC Mac, MOTU audio and MIDI interfaces, running Digital Performer, on a computer maybe 1/100 or 1/1000 the power of a modern one.

My current setup works great except for playthru. But even my hardware MIDI playthru, DIN in to DIN out, on my combination of HW and SW, is worse than it was MIDI-only on a toaster Mac running Opcode Vision or MOTU Performer (or such MIDI software as I wrote, so far as that goes). At that time Macs didn't really do multitasking, and a program could take complete low-level ownership of the serial and timer chips. They were pitifully slow contraptions by modern standards and couldn't do audio at all, but could have purt tight hardware MIDI.

I could probably hunt around and find some combination of HW and software that has tighter playthru than I'm currently getting, but damn, it all works real nice except playthru. Seems like a lot of trouble. I ain't got the energy. :)

Post

It is 2020. With any half-decent setup, latency lower than 10 ms is not a problem at all. With a good setup, less than 5 ms is possible, and 5 ms is not noticeable.
my music:
soundcloud.com/septimon-band
blend.io/septimon

Post

JCJR wrote: Tue Jun 02, 2020 8:39 am
excuse me please wrote: Tue Jun 02, 2020 7:46 am Hmm, latency does not have a sound.
No: if you are in the feedback loop playing in real time, latency has a feel. Some gifted players can play in real time with about any latency (pipe organs in cathedrals, or stiff, clunky tape Mellotron keys).

I am not so gifted, and latency above a certain threshold bugs the crap out of me. There is also note-on timing jitter. An uncomfortable realtime player won't play the same and will be demoralized, not real motivated to keep at it. You record a track as tight as possible, then listen back and say, "Holy hell, I'm not a flash, but I know I didn't play it that shitty!" :)

After the notes are properly in a DAW, the DAW can completely compensate and render sample-accurate timing from the MIDI track data. But that is later on, after the realtime player is no longer in the feedback loop.
If you’re having those kinds of latency/jitter issues then there is something wrong with your setup or settings.

Musicians have been dealing with latency since the dawn of time. Think about the latency one would deal with when sitting around the tribe's fire, beating on logs and singing. Often visual cues are used (à la a conductor) when groups get too large, because light is really low latency. ;) The "chorus" effect is called chorus because of the effect you get with a bunch of instruments being slightly out of tune as well as coming from different physical locations. Anyway, we all instinctively deal with latency. When it starts getting too high it does become an issue, but if you're using a basic modern computer from the last... 15 years maybe, and a good audio interface, you should not have to worry. Just keep your buffer at 128 samples or lower. I run mine at 64 with no problems and move it up when dealing with mix-down or mastering.

Jitter is a bigger issue, but again, if you're using pretty good gear on a modern computer, you shouldn't have a lot of jitter, or any in a noticeable range. Definitely less than the random timing variation you'd have because you're human. Maybe you don't care enough to figure out what's wrong, but if you are reading this and worrying that using a computer must be plagued with such issues, know that it doesn't have to be.
Zerocrossing Media

4th Law of Robotics: When turning evil, display a red indicator light. ~[ ●_● ]~

Post

Septimon wrote: Wed Jun 03, 2020 4:08 pm It is 2020. With any half-decent setup, latency lower than 10 ms is not a problem at all. With a good setup, less than 5 ms is possible, and 5 ms is not noticeable.
I play instruments that range from acoustic guitar to plugin synthesizers, and unless there's something that's actually wrong (like I've introduced a plugin that increases latency too much) I can instantly compensate without even thinking about it. I can even go from my MicroFreak to my traditional-key MIDI keyboard! I never think, "oh man, the time between when my finger hits that key and the point where the key triggers the note is throwing me off!" :lol: Could it be that I spent 40 years playing in bands? What person would say, "I've got to sit on the drummer's lap or I can't play in time!"

Post

Thanx and congrats that yous guys are such fab adaptable musicians not an ignorant complaining wimp like me. :)

The desktop is a few years old: i7 4790 @ 3.6 GHz, 16 GB RAM, dual 1 TB SSDs. Audio is a Behringer XR18 digital mixer at 44.1k, 512 samples; the control panel claims input latency of 710 samples (16.1 ms) and output latency of 551 samples (12.49 ms). MOTU MIDI Express 128 MIDI interface. I usually use Reaper.
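Those control-panel figures are just samples divided by the sample rate. A quick sanity check, using only the numbers quoted above, shows what the total monitoring round trip works out to:

```python
SAMPLE_RATE = 44100  # the 44.1k setting from the post

def samples_to_ms(samples: int) -> float:
    """Convert a latency reported in samples to milliseconds."""
    return 1000.0 * samples / SAMPLE_RATE

input_ms = samples_to_ms(710)   # ~16.1 ms, matching the control panel
output_ms = samples_to_ms(551)  # ~12.5 ms
print(f"in {input_ms:.1f} ms + out {output_ms:.1f} ms = "
      f"{input_ms + output_ms:.1f} ms round trip")
```

Nearly 29 ms in-to-out is well above the few milliseconds other posters report, which is consistent with playthru feeling sluggish at this buffer setting.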

Hey, at this setting I never get glitches, even on big projects. At lower settings this is not always the case. I like the XR18 fine; it sounds good, with minimal noise and hum. I suspect Reaper may have non-optimal DIN-in-to-DIN-out MIDI thru handling, but that may be a wrong suspicion. If so, Reaper probably can't be blamed for optimizing for softsynths rather than DIN-to-DIN playthru nowadays.

I got no problems with the setup except playthru latency. I don't care about softsynths but wish it did a little better on DIN MIDI playthru. However, DIN MIDI playback seems solid. I suspect Reaper might be doing DIN-to-DIN MIDI thru only on audio buffer boundaries, which might explain what I observe. Of course that would not be necessary, but it might be logical if one's customers mainly use softsynths.
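If that suspicion were right (purely hypothetical, I have no knowledge of Reaper's internals), the effect is easy to model: a thru event arriving at a random moment waits until the next buffer boundary, so the added delay varies from near zero up to a full buffer.

```python
import random

SR, BUF = 44100, 512  # sample rate and buffer size from the post
BUF_S = BUF / SR      # one buffer in seconds (~11.6 ms)

def thru_delay_ms(event_time_s: float) -> float:
    """Extra delay if MIDI thru is only serviced at audio-buffer boundaries."""
    next_boundary = (event_time_s // BUF_S + 1) * BUF_S
    return 1000.0 * (next_boundary - event_time_s)

random.seed(1)
delays = [thru_delay_ms(random.uniform(0.0, 10.0)) for _ in range(10_000)]
print(f"delay ranges from {min(delays):.2f} to {max(delays):.2f} ms")
# jitter spans roughly the whole 0-11.6 ms buffer, on top of any fixed latency
```

That random spread is jitter rather than a constant offset, which would match "playthru goes to complete crap" as the buffer grows while recorded playback stays solid.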

Post

Anyone else noticed that the OP left this thread after three posts or so? :D

Post

zerocrossing wrote: Wed Jun 03, 2020 4:34 pm Could it be that I spent 40 years playing in bands? What person would say, “I’ve got to sit on the drummer’s lap or I can’t play in time!”
Well, what ya gotta put up with doesn't always equate to the ideal if you can get it. :) Just because you know how to drive a klunker doesn't mean you wouldn't rather drive a nice car.

My last steady gig before I started programming full time for a living ran from 1982 to 1995: 13 years of 6 nights a week, except for 7 nights a week in the summer tourist season. It was a medium-volume trio gig at a touristy supper club. Drums, geetar/bass, me on keys/keybass.

Stage was maybe 25 feet wide: drums stage left, keys stage right, frets guy in the middle. For instrumental stage monitors we used a pair of those old EV 18" three-way cabs with the Thiele 8" midrange speakers and T35 tweeters. Drums were routed into my monitor, keys were routed into the drummer's monitor, and the guitar/bassist was routed thru both monitors.

So on a 25-foot stage we were all basically sitting in each other's laps, and it was quite pleasant working conditions. The drums were about a meter away from my ears, mixed with my keys and the guitar/bassist on my EV stage monitor. Everybody heard everybody else with about a 3 ms delay. Sure, we could have got used to a max 25 ms stage delay, but why do that if you don't have to?
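Those delay figures are just distance divided by the speed of sound. A sketch of the arithmetic (343 m/s is my assumed room-temperature figure; the post rounds the stage-width delay up to 25 ms):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C
FEET_TO_METERS = 0.3048

def acoustic_delay_ms(distance_m: float) -> float:
    """Time for sound to travel distance_m, in milliseconds."""
    return 1000.0 * distance_m / SPEED_OF_SOUND

print(f"1 m (drums to ears):     {acoustic_delay_ms(1.0):.1f} ms")
print(f"25 ft (across the stage): {acoustic_delay_ms(25 * FEET_TO_METERS):.1f} ms")
# ~2.9 ms and ~22 ms respectively
```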

Post

JCJR wrote: Wed Jun 03, 2020 5:20 pm Thanx and congrats that yous guys are such fab adaptable musicians not an ignorant complaining wimp like me. :)

The desktop is a few years old: i7 4790 @ 3.6 GHz, 16 GB RAM, dual 1 TB SSDs. Audio is a Behringer XR18 digital mixer at 44.1k, 512 samples; the control panel claims input latency of 710 samples (16.1 ms) and output latency of 551 samples (12.49 ms). MOTU MIDI Express 128 MIDI interface. I usually use Reaper.

Hey, at this setting I never get glitches, even on big projects. At lower settings this is not always the case. I like the XR18 fine; it sounds good, with minimal noise and hum. I suspect Reaper may have non-optimal DIN-in-to-DIN-out MIDI thru handling, but that may be a wrong suspicion. If so, Reaper probably can't be blamed for optimizing for softsynths rather than DIN-to-DIN playthru nowadays.

I got no problems with the setup except playthru latency. I don't care about softsynths but wish it did a little better on DIN MIDI playthru. However, DIN MIDI playback seems solid. I suspect Reaper might be doing DIN-to-DIN MIDI thru only on audio buffer boundaries, which might explain what I observe. Of course that would not be necessary, but it might be logical if one's customers mainly use softsynths.
Well, 512 samples is way too high. I'm not sure the Behringer mixer is made for low-latency tasks. Maybe you want to try a simple 2-channel audio interface (MOTU, Steinberg, PreSonus, Audient...) for playing synths.

Post

Septimon wrote: Fri Jun 05, 2020 12:23 pm Well, 512 samples is way too high. I'm not sure the Behringer mixer is made for low-latency tasks. Maybe you want to try a simple 2-channel audio interface (MOTU, Steinberg, PreSonus, Audient...) for playing synths.
Thanks, Septimon. Yeah, tradeoffs. From 1995 to retirement circa 2012-2014 I always had at least one MIDI-and-audio Mac and one MIDI-and-audio PC set up for programming and such, but was too busy to record. After retirement I spent a couple of years writing some Reaper JSFX and remastering a bunch of old songs; I only got around to trying to record new stuff in the last few years. I didn't worry about audio latency because I have all the sounds I need in MIDI hardware, and ASSUMED that MIDI would be tight even with big audio latency, merely because that is the way it worked for a long time in the past. DIN MIDI I/O timing just didn't have anything to do with the audio; it was a separate thang running "in parallel" with the audio tempo timing.

So it was kinda rude to discover DIN MIDI timing problems possibly traceable to audio latency, when DIN MIDI shouldn't have any reliance at all on audio latency.

Before the XR18 I had a couple of 16-channel analog mixers, wired with lots of snakes thru patch bays, and a Focusrite Pro24 FireWire interface. So far as I recall, the "never glitches" buffer size for the Pro24 was 256 samples. Maybe it would run stable at 128 samples, but if so I think I would have remembered it. That Pro24 handles up to 14 inputs, I think, but dunno if it always sends 14 down the pipe if you are not using all of them.

One of the old analog mixers got too flakey to use and I tore it out; then the other old 16-channel mixer got too flakey. For a while I was looking around for a 16- or 24-channel analog mixer with a real tiny footprint and a low price tag but couldn't find any. The XR18 wasn't expensive, is only 3 rack spaces with no knobs at all, and allowed removal of the Pro24, the patch bays, and the numerous 8-channel snakes going from everywhere to everywhere else.

So far as I can tell, the XR18 sounds overall better than the Pro24. I know for sure the XR18 mic inputs are cleaner and lower noise; the Pro24 only had 2 built-in, not-all-that-great mic/instrument inputs, while the XR18 has 16 mic inputs, 2 instrument inputs, and 18 line inputs. Not all at the same time: channels 1 and 2 are mic/instrument, channels 1 thru 16 work as mic/line, and there are two more line-only inputs at 17 and 18.

So yeah, the latency is a little weird; that is the tradeoff. Audio latency is auto-compensated, no problem. It is just that DIN MIDI weirdness. Nothing is ever perfect. :) If I got rid of the XR18 I would have to replace it with some other brand of digital mixer that might have just as bad latency issues, for all I know. For people recording a live band with a digital mixer, huge latency hasn't been a problem for a couple of decades. The DAW compensates.

My eyes have gotten bad; I can only see the computer a few hours a day, which is why I don't want to get into any long debug science projects eating up limited "somewhat clear vision" time.

Probably will do some more debugging on it. I really like Reaper; I don't want to get chased off Reaper if it turns out that's the only way to get decent DIN-in-to-DIN-out latency. I had a copy of Tracktion I'd never tried, so I installed and lightly tested it today. Set it to a 1024-sample buffer, big latency, and the DIN-in-to-DIN-out latency seemed "instant", about like just a MIDI cable from controller to synth. Then the test recording got a little weird. Playing real simple whole-note chords on audio metronome downbeats, during record the piano was hitting right on top of the audio metronome. But then the playback had the DIN MIDI running consistently late, flamming behind the audio metronome. Setting a parameter in Tracktion to apply about a -40 ms (!!!!) offset to the recorded MIDI tracks seemed to get back the same thing I was hearing while recording the track. Weird.

If that would "always work" then it would be OK, I guess. Maybe buried somewhere in Reaper are settings to offset the MIDI, but I'm purt sure Reaper seems to be doing something else to it as well. If I set real big audio buffers, DIN playthru goes to complete crap, like it is saving up DIN MIDI thru for audio buffer boundaries for some silly reason. But maybe I'm misdiagnosing the problem, or maybe there is a setting buried in there somewhere that would fix it. :)
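For what it's worth, that kind of recorded-MIDI offset parameter presumably amounts to something like this (a hypothetical sketch; the names and data layout are mine, not Tracktion's): shift every recorded event earlier by a fixed amount to cancel a constant playthru delay. Note it can only correct a constant latency, never jitter.

```python
def apply_record_offset(events, offset_ms):
    """Shift recorded MIDI events by offset_ms (negative = earlier).

    events: list of (time_ms, note_number) pairs; times are clamped at zero.
    """
    return [(max(0.0, t + offset_ms), note) for t, note in events]

# Whole-note chords played on the beat but recorded ~40 ms late:
recorded = [(540.0, 60), (1040.0, 64), (1540.0, 67)]
corrected = apply_record_offset(recorded, -40.0)
print(corrected)  # [(500.0, 60), (1000.0, 64), (1500.0, 67)]
```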

