Dynamic Range is not (in short mathematical terms: DR != LRA) Loudness Range.
The Loudness Range (LRA) is (simply speaking) a measured "range" between the average loudness and the maximum loudness (ITU-R BS.1770-x specs, SLk ballistics, gated).
The Dynamic Range in the statistics sheet is the (mathematical) range between the unweighted RMS max (300 ms window) and the digital true peak max - which is actually closer to a realtime readout than an offline measurement.
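To make that concrete, here is a minimal sketch of how such a realtime-style readout could be computed. Everything here is an assumption for illustration: a mono float signal, non-overlapping 300 ms windows, and a plain sample peak standing in for a proper (oversampled) dBTP measurement per ITU-R BS.1770.

```python
import numpy as np

def dr_realtime_style(signal, sr, window_s=0.3):
    """Sketch: max 300 ms unweighted RMS vs. peak, both in dBFS.
    Real dBTP needs oversampling (ITU-R BS.1770); sample peak is a stand-in."""
    win = int(sr * window_s)
    # hop by one full window for simplicity; a real meter would overlap
    rms_vals = [np.sqrt(np.mean(signal[i:i + win] ** 2))
                for i in range(0, len(signal) - win + 1, win)]
    rms_max_db = 20 * np.log10(max(rms_vals))
    peak_db = 20 * np.log10(np.max(np.abs(signal)))
    return peak_db - rms_max_db  # "DR" = distance from RMS max up to peak

# usage: a steady sine has its RMS about 3 dB below its peak
sr = 48000
t = np.arange(sr) / sr
sine = 0.5 * np.sin(2 * np.pi * 1000 * t)
print(round(dr_realtime_style(sine, sr), 1))  # 3.0
```

A steady test tone is the boring best case - real program material with strong transients pushes the peak well above the RMS max, which is exactly why the readout is so program dependent.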
The original DR offline meter is actually a bit more complicated. In short, it takes the "average" signal strength - or better said, it averages only the loudest 20% of the over 10k RMS measurement points (max average values, by the way!) and takes the difference to the digital max peak to arrive at the DR value. I had to refresh my memory on that with the official tech paper, and now I know why it never succeeded: it was just too complicated to understand, it was too program dependent, and you could never reproduce your measurements (since the 20% were picked somewhat arbitrarily).
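The core of that offline scheme can be sketched in a few lines. This is my simplified reading of the idea, not the official algorithm: block size, channel handling and the exact selection of the 20% are all glossed over here.

```python
import numpy as np

def dr_offline_style(rms_blocks_db, peak_db, top_fraction=0.2):
    """Sketch of the offline DR idea: average only the loudest 20%
    of the per-block RMS readings (in dB), then take the distance
    to the digital max peak. Simplified for illustration."""
    blocks = np.sort(np.asarray(rms_blocks_db, dtype=float))[::-1]  # loudest first
    n = max(1, int(len(blocks) * top_fraction))                     # top 20%
    loud_avg_db = float(np.mean(blocks[:n]))
    return peak_db - loud_avg_db

# usage: five RMS blocks, peak at -1.0 dB; only the loudest block counts
print(dr_offline_style([-20, -18, -12, -11, -10], -1.0))  # 9.0
```

You can already see the reproducibility problem: the result hinges entirely on which blocks end up in that top slice.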
Anyway - I just revisited the Statistics sheet and went for "RMS avg - dBTP = DR value", and suddenly the readouts look way different.
Here is the original Statistics Sheet (RMS max to dBTP max = DR value)
And this is the overhauled Statistics Sheet (RMS avg to dBTP max = DR value)
Night and day - but still revealing (I'll get to that in a minute).
So what does that all mean?
Most of you participants have DR values between 8 and 11 (if we go by RMS max to digital peak max from the original statistics sheet). DR values are very dependent on how dense a track was mixed - or in other words: program dependent. Chances are you can reach a very high DR if you have strong transients while your "average" signal is just a perceived sound sausage. Therefore it's not an objective measurement (that meter never was, as too many parameters needed to be taken into consideration, and each genre needed its own "acceptable parameters"). The value I am offering (RMS max - dBTP max = DR value) can be seen as using the old (realtime) TT DR Meter - from either the Pleasurize Music Foundation or the re-issued plugin version by Brainworx themselves - not the offline meter (which is more like RMS avg - dBTP max = DR value).
It's a bonus piece of information for those still transitioning - or those who want to compare metering values. By today's standards, DR meters are imprecise, fairly random and pretty much outdated.
Now - what is really important?
In our example, the mix challenge (emphasis: mixing - not mastering!), what actually counts is a good and healthy peak headroom without(!) using a limiter on the master bus. That way you're automatically also working at a lower average signal strength - or even a suitable reference level. Both the "Loudness Range" (LRA) and the "Dynamic Range" (DR) show you how dense your mix is (again, simply speaking). There are no ideal values, but they give you indications when things are just too much.
For example: a DR of 5 tells me that the mix was either already pre-mastered, or heavy compression/limiting was applied (remember: in this case, the range between the maximum average signal strength and the maximum digital peak). On the other hand, a DR value beyond 14 can mean "wow - this track must be uber-'dynamic' - absolutely not squashed at all", but this can be terribly misleading (see the "sound sausage with strong transients" example above).
An LRA of 3 LU or 5 LU might show me that the track doesn't really "breathe", while an LRA of 15 LU shows me that it might fluctuate too much. Keep in mind, we're constantly talking about measuring music, not a broadcast stream with voice-overs, FX and music. So these values are also program dependent, but... they still might tell you "wait a minute - something is wrong".
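For the curious, the LRA statistic itself is just a percentile spread. Here's a rough sketch along the lines of EBU R128 / TECH 3342 - with the loud caveat that the real spec also applies absolute (-70 LUFS) and relative gating, which this toy version skips, and the input here is an assumed list of already-measured short-term (3 s) loudness values.

```python
import numpy as np

def lra_sketch(short_term_lufs):
    """Rough sketch of loudness range: the distance between the
    10th and 95th percentile of the short-term loudness distribution.
    Gating (per EBU R128) is deliberately omitted for brevity."""
    vals = np.asarray(short_term_lufs, dtype=float)
    low, high = np.percentile(vals, [10, 95])
    return high - low  # in LU

# usage: short-term loudness drifting evenly from -20 to -14 LUFS
print(round(lra_sketch(np.linspace(-20, -14, 100)), 1))  # 5.1
```

So a track sitting in a narrow loudness corridor yields a small LRA ("doesn't breathe"), while one that swings wildly between quiet and loud passages yields a large one.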
While these values are definitely interesting for a mastering engineer, musicians or plain "mix engineers" usually don't need to bother with these readouts on the meter. Their focus should be: is there a healthy headroom? Is my average signal strength within boundaries?
These statistics are part of a continuing learning process.
Now on to the question by psycho43142:

psycho43142 wrote: I just sent Uncle E my votes. There were some really good mixes in there. I was also curious why you felt my track had been premastered as it had one of the higher DR numbers?

I just listened to the track again, and took a dive into your statistics.
I wrote in the statistics that your track is either pre-mastered or used a high reference level. And you wonder why.
Taking a closer look at the values from the original statistics sheet alone (-0.19 dBTP, -11.12 dB RMS max, 5.9 LU LRA, 10.9 DR), this gives me an indication that you worked at really hot levels and have a dense mix, while heavy limiting might have been used. A short listen to the track confirms it: various instruments are pumping, so heavy compression was involved (the vocals are the most prominent example).
Why does it still have a DR of 10.9? Because the RMS max was -11.12 dB and the max digital peak was -0.19 dBTP - that distance is the mathematical readout. If we go by the statistics revision (again: RMS avg - dBTP max = DR value), it's even 14.2 DR. Does that mean this track is "uber-dynamic"? Well yeah - at least according to this value readout, and looking at the waveform. But if we look at the Loudness Range (LRA) and have a listen to the mixdown - maybe a bit too much compression was used.
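The arithmetic behind that 10.9 readout is nothing more than the distance between the two values from the sheet:

```python
dbtp_max = -0.19   # digital true peak max (dBTP)
rms_max = -11.12   # unweighted RMS max, 300 ms (dB)
dr = dbtp_max - rms_max
print(round(dr, 1))  # 10.9
```

The revised 14.2 value comes out of the exact same subtraction, just with the (lower) RMS avg instead of the RMS max.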
Long story short:
As you can see, analyzing a track in either scenario (post mixing/pre mastering, or post mastering i.e. with a regular release) is definitely not simple. Especially not with one value alone.
Meters can be stupid sometimes (for example: unweighted ones respond more strongly to bass, and the original DR meter's manual even states that measurements can't be exactly reproduced!). You have to put the readouts into perspective and work with that information.
Does that automatically make you a better mixer if you improve on that?
It's really hard to tell - but at least you're thinking about overusing compression and limiting. Read: your awareness is definitely heightened if you know what's going on and what your metering tool shows you on the GUI.
Updated the "voting" post on page 10 with the revised statistics.