The last time I made an audio compressor (a few years ago; it was never released and wasn't a plugin), the level detection algorithm was based on the fact that, unlike floating-point arithmetic, a running integer summation is exact. Say you want the sum of the last 100 samples: on each iteration, add the absolute value of the current sample and subtract the absolute value of the [t-100]'th sample. Divide that running sum by the window length (100) and you have a voltage average of the last 100 samples. Alternatively, square the sample values instead and you get an energy average (a power estimate). I didn't build this as a GR meter, because I didn't have any meters or fancy GUI; it was used as an input for deciding the current gain needed to achieve the compression ratio. But the same algorithm should work for a GR meter: measure both input and output with the same method, and the gain reduction is the ratio between the two converted to decibels. You'll need to convert the floats to integers, and make sure the conversion (and the running summation) neither overflows the integer range nor loses too much precision.
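A minimal sketch of that running-sum idea (the function names and the 1<<15 quantization scale are my choices for illustration, not from the original): quantize each sample to an integer, keep an exact integer sum of the window, and update it in O(1) by adding the newest value and subtracting the oldest.

```python
from collections import deque

def make_running_avg(window=100, scale=1 << 15):
    """Sliding-window average of |x| using exact integer arithmetic.

    Float samples (assumed in [-1, 1]) are quantized to integers, so the
    running sum never accumulates rounding drift the way a float sum can.
    Each update is O(1) regardless of window length.
    """
    buf = deque([0] * window)  # last `window` quantized |samples|
    state = {"sum": 0}         # exact integer running sum

    def update(x):
        q = int(round(abs(x) * scale))  # rectify and quantize
        state["sum"] += q - buf[0]      # add newest, drop oldest: O(1)
        buf.popleft()
        buf.append(q)
        # convert back to a float voltage average of the window
        return state["sum"] / (window * scale)

    return update
```

Run one of these on the input and one on the output; 20*log10(out_avg / in_avg) then gives the gain reduction in dB for the meter. Squaring q instead of taking its absolute value would give the energy (power) variant, at the cost of needing a wider integer range.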
Not that I'm an expert on the subject, but it's an idea you could try.
p.s. You don't need to delay the audio by 100 samples just to do this.
p.p.s. You can try very large summation window lengths if you want, because each update runs in O(1) time regardless of the window size.
Alternatively, you can rectify both the input and the output, lowpass filter the results, and take the ratio between the two; that's the same idea done in floating-point arithmetic.
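That rectify-and-lowpass variant can be sketched with a one-pole smoother (the coefficient value and helper names here are illustrative assumptions, not from the original):

```python
import math

def make_envelope(coeff=0.999):
    """Envelope follower: rectify, then one-pole lowpass.

    y += (|x| - y) * (1 - coeff); `coeff` closer to 1 means slower,
    smoother tracking (an assumed smoothing constant, tune to taste).
    """
    state = {"y": 0.0}

    def update(x):
        state["y"] += (abs(x) - state["y"]) * (1.0 - coeff)
        return state["y"]

    return update

def gain_reduction_db(env_in, env_out, eps=1e-12):
    """Gain reduction in dB from the two smoothed envelopes.

    eps guards against log(0) during silence.
    """
    return 20.0 * math.log10((env_out + eps) / (env_in + eps))
```

Feed the input signal to one envelope follower and the compressor's output to another, then display gain_reduction_db of the two results on the meter.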