@myk: I haven't used SplineEQ, but as I understand it, EVERY linear-phase EQ introduces latency because of the way linear-phase adjustments are computed (the signal must be delayed a bit, which is kind of obvious if you think about it). That's why they're typically used for mastering and final mixing, as opposed to tracking. Seems kind of unfair (if not dead wrong?) to mark it down for something that's inherent to the way it works.
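[Editor's note] The inherent delay is easy to demonstrate with a generic symmetric FIR filter; this is a minimal sketch of the general principle, not SplineEQ's actual internals. A filter has linear phase when its impulse response is symmetric, and that symmetry forces a constant delay of (N - 1) / 2 samples for an N-tap filter:

```python
import numpy as np
from scipy.signal import firwin

# A symmetric (linear-phase) FIR filter delays every frequency
# by the same (N - 1) / 2 samples -- the latency is structural.
N = 101                          # filter length in taps
h = firwin(N, 0.25)              # linear-phase low-pass
assert np.allclose(h, h[::-1])   # symmetric taps = linear phase

# Filter a unit impulse: the output peaks (N - 1) / 2 samples late.
x = np.zeros(512)
x[0] = 1.0
y = np.convolve(x, h)[:512]
print(np.argmax(y))              # 50, i.e. (N - 1) // 2
```

At 44.1 kHz a 101-tap filter only costs ~1 ms, but the long filters a mastering EQ needs can cost tens of milliseconds, which is why the plugin has to report latency to the host.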
@myk Just locate the latency knob and adjust it for lower latency; you should be fine.
The reason you want to be able to change the latency is for rendering. If you're really picky about getting the best possible sound, you should increase the latency when you render, and that will give you an even sharper result. As you might have noticed, the dotted "real processing" line that shows up at the lower frequencies becomes more accurate at higher latency settings. This is something you can't escape when processing audio the way linear-phase EQs do.
It's not fixable; it's just something you have to learn to live with, with ALL linear-phase EQs :)
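[Editor's note] The trade-off described above (more latency = sharper low end) falls straight out of FIR filter length. A hedged sketch with generic scipy filters, not SplineEQ's own algorithm: two low-pass filters with the same cutoff, where the longer (higher-latency) one attenuates a frequency just past the cutoff far more strongly.

```python
import numpy as np
from scipy.signal import firwin, freqz

# Longer linear-phase FIRs (i.e. more latency) resolve low
# frequencies more sharply: same cutoff, different filter lengths.
for taps in (101, 1001):
    h = firwin(taps, 0.05)            # narrow low-pass
    w, H = freqz(h, worN=4096)
    # measure attenuation just above the cutoff (0.08 * Nyquist)
    idx = np.argmin(np.abs(w / np.pi - 0.08))
    print(taps, "taps:", 20 * np.log10(np.abs(H[idx])), "dB")
```

The 1001-tap filter is still mid-transition where the 101-tap filter would be, only at a ten times narrower band; the cost is ten times the delay. That's the "can't escape it" part.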
Just a little note to say I tried this out on an audio cleaning project I had, and it kept the tone of the voice but got rid of the noise, I mean a whistling right next to the vocal frequencies. No other EQ I had was doing it right, and this baby kicked ass.
You know what's awesome: set it to minimal delay to hunt down the problem, say there's an annoying hum, cut the hell out of it, then switch back to max mode and render. You'll be amazed at how high the quality of the EQ is; it's like you never did anything, except the hum is gone. It's very non-destructive.
Well, I'm currently using a narrow bandpass (rather than bandstop) to isolate a locust's stridulation at an almost precise frequency, then rendering. Not much of a different technique, as you know; that's what amazed me about SplineEQ.