One-Synth-Challenge: General discussion thread


Post

Hey everyone!

So I've been observing a few things since I started participating in this competition:

1. Since non-participants are allowed to vote, there is a lot of random/biased voting from non-participants, which really skews the final rankings. If you want proof, analyze the final voting sheets for the past six months - you'll inevitably see a trend there.
I know this is a friendly challenge and a lot of us (including me) treat it as exactly that, but a lot of people do care about the final rankings (whether they admit it or not). So I request that voting be restricted to participants only, as they have some skin in the challenge and try their best to judge/rate the tracks fairly.

2. There is a huge variation in loudness across the tracks, and it is quite jarring when listening to them on SoundCloud. Some tracks are extremely loud - in the range of -6 to -7 LUFS integrated - and this is quite painful. A lot of us don't go to the trouble of downloading and normalizing the tracks. So I'd request a rule addition restricting the loudness of submissions to approximately -14 LUFS, give or take a few dB, which is the standard loudness across all major streaming and distribution platforms.

Looking forward to everyone's views on the above.

Post

Hey, thanks for sharing your thoughts, Kaustav - very interesting! Here's my (highly subjective) take:

1. Rankings are far removed from my mind - I'd say non-existent - whenever I make music. But if I'm perfectly honest, I hate everything but a victory once we get to the voting process. A bit embarrassing to admit it, but even when I get all the way to #2, there's a part of me that goes "argh, should've had this!"

But at the same time I have to say that I'm willing to accept the camaraderie and what sometimes may seem like slightly absurd voting (I've never actually peeked at the scorecards, so I'm making no assumptions here), and keep the vote open for everyone. Humans are irrational beings, driven by all sorts of prejudices and incentives, and no matter how the "electorate" is composed, we'll bump into these weaknesses somehow - if, indeed, that's what they are! So if people want to snub my entries because I'm annoying (or just too damn perfect :-D ), then so be it!

2. I think this is a good idea. We can still mix and master our tracks for -6 LUFS if we want to - but it would be nice if everyone adjusted the volume to approx. the same level to improve the listening experience.

Easily done, no real sacrifice - unless you want to compete with SoundCloud tracks outside of the competition, and SoundCloud tracks are usually, uniformly, very LOUD! But in that case we could keep our track at -14 for OSC purposes and replace it with the louder one once the results are in. At least if you have a Pro account - or if you are willing to lose your listener count. Okay, I do see some flaws... still, I support the idea in principle.
All Ted Mountainé's Songs on Spotify | Soundcloud | Twitter | His Latest Videos
The Byte Hop, the virtual home of Ted Mountainé – news as they might have happened.

Post

I'm not normalizing for SoundCloud publishing. There is no LUFS target there, so a quiet master is shooting your own music in the foot if it comes up in a mix. If a release targeted another platform that actually cared about that, I would.

I'm fine with ratings open to all. The more 'realistic' the perception of your music is, the better for you imo. If people are concerned about ratings pollution, just look to see who rated you what. If you are impressing people who understand or connect with your vision, I would say you are getting the ratings that matter.

Post

-14 LUFS is fairly low, really. I've never aimed for high accuracy, but I do typically aim to get my mixes to measure below -12 dB RMS, which is approximately equal to -14 LUFS for "normal" ("sounds like music") mixes.

In my opinion it should be very hard to get anything typical to measure above -12 dB RMS... you'd be looking at continuous peaking (square waves, etc.) or similar to get much above that. For example, -12 dB RMS is a pure square wave peaking continuously at -15 dB.
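As a sanity check on those figures, here's a quick pure-Python sketch. My assumption (not stated in the post) is that the square-wave number was read off a sine-referenced, AES17-style meter, which reads 3.01 dB above plain RMS so that a full-scale sine shows 0 dB:

```python
import math

def rms_db(samples):
    """Plain RMS level in dB relative to full scale (1.0)."""
    mean_sq = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(mean_sq)

N = 48000
# Full-scale sine: its RMS sits 3.01 dB (half a bit) below its peak.
sine = [math.sin(2 * math.pi * 100 * n / N) for n in range(N)]
# Square wave peaking at -15 dBFS: its RMS equals its peak level.
peak = 10 ** (-15 / 20)
square = [peak if (n // 240) % 2 == 0 else -peak for n in range(N)]

print(round(rms_db(sine), 2))    # -3.01
print(round(rms_db(square), 2))  # -15.0
# On a sine-referenced meter (+3.01 dB offset), that square reads about -12 dB RMS:
print(round(rms_db(square) + 3.01, 2))  # -11.99
```

So the "-12 dB RMS = square peaking at -15 dB" arithmetic checks out under that metering convention, and the 3.01 dB gap between a sine's peak and its RMS is where the half-bit headroom figure below comes from.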

Generally speaking, the easy rule to follow is to leave half a bit (3.01 dB) of headroom for peaks. Keep the main part of the signal, including most of the peaks, under that level. The RED or AMBER lamps (the top 3 dB) should illuminate rarely if ever - only barely visible, instantaneous blinking - and never turn on fully.

After mixing, you can measure the RMS or LUFS level with the right tool after rendering. The tool gets applied to the rendered recording, not in real-time! If the RMS/LUFS level is below your target, you can apply either a compressor or a simple gain to the render to make up the difference. Likewise, you can decrease the gain slightly if you are just over the target level (e.g. measuring -13.5 LUFS against a -14 target means applying -0.5 dB of gain).
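A minimal sketch of that last step - measure offline, then nudge with plain gain - assuming you already have the integrated reading from your meter of choice (the function names and sample figures here are just for illustration):

```python
import math

def gain_to_target_db(measured_db, target_db):
    """dB of trim (or make-up) gain needed to move a render to the target level."""
    return target_db - measured_db

def apply_gain_db(samples, gain_db):
    """Apply a plain linear gain, specified in dB, to a rendered buffer."""
    g = 10 ** (gain_db / 20)
    return [s * g for s in samples]

# A render that measured -13.5 LUFS, aiming for the proposed -14 LUFS target:
trim = gain_to_target_db(-13.5, -14.0)
print(trim)  # -0.5 (turn it down half a dB)

# A quieter render at -17.2 LUFS would need make-up gain instead:
print(round(gain_to_target_db(-17.2, -14.0), 1))  # 3.2
```

Note that plain gain only works upward if you have the peak headroom; otherwise a limiter or compressor comes into play, as described above.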

I agree that such mixing advice (if not a strict RMS/LUFS rule) should be offered to participants, and I also agree that only competitors should have their votes counted. One option would be to split rewards between a "popular vote" and a "participants' vote". Unfortunately, it is so easy to get "friends" to vote for you when prizes are involved... allowing such a mechanism for cheating to exist is only asking for it to be used.

It would be interesting to see the split between participant and popular (listener) votes too, since our tastes as producers likely differ quite a bit from the norm. What a producer finds fascinating (perhaps something in a track recognized as difficult to achieve in practice) might not always line up with what a casual listener would enjoy.
Last edited by aciddose on Sat May 11, 2019 6:23 am, edited 1 time in total.
Free plug-ins for Windows, MacOS and Linux. Xhip Synthesizer v8.0 and Xhip Effects Bundle v6.7.
The coder's credo: We believe our work is neither clever nor difficult; it is done because we thought it would be easy.
Work less; get more done.

Post

There are also participants who, in the end, don't publish their track because it doesn't meet their own standards. If only publishing participants vote, it could lock into a one-genre competition... I'd keep it open...

Post

Splitting the vote (and rewards), or at least publishing the results that way, would provide more useful information without having any notable impact.
exponent1 wrote: Fri May 10, 2019 4:06 pm 1. Since non-participants are allowed to vote, there is a lot of random/biased voting that happens from non-participants which really skews the final rankings a lot. If you want proof, please analyze the final voting sheets for the past 6 months. You'll inevitably see a trend there.
Are such results published normally? I've never seen them and haven't been able to find where they're published if they are.

Post

aciddose wrote: Sat May 11, 2019 6:29 am The idea of splitting the vote (and rewards) or in the least publishing the results in such a way would provide more useful information without having any notable impact.
exponent1 wrote: Fri May 10, 2019 4:06 pm 1. Since non-participants are allowed to vote, there is a lot of random/biased voting that happens from non-participants which really skews the final rankings a lot. If you want proof, please analyze the final voting sheets for the past 6 months. You'll inevitably see a trend there.
Are such results published normally? I've never seen them and haven't been able to find where they're published if they are.
Yes, they are. I don't know if you've participated in OSC before, but every OSC results post has a Google spreadsheet published for everyone to see. As an example, this is the final voting sheet from the last OSC: https://docs.google.com/spreadsheets/d/ ... I0jKvGoorc
Tj Shredder wrote: Sat May 11, 2019 6:23 am There are also participants who finally do not publish their track as it doesn’t fit their own standards. If only publishing participants vote, it could lock into a one genre competition... I’d keep it open...
In EVERY OSC, there are diverse genres from the participants as well. I'm sure you'd know that, since you vote in almost every OSC.

Post

exponent1 wrote: Fri May 10, 2019 4:06 pm 1. Since non-participants are allowed to vote, there is a lot of random/biased voting that happens from non-participants which really skews the final rankings a lot. If you want proof, please analyze the final voting sheets for the past 6 months. You'll inevitably see a trend there.
I know that this is a friendly challenge and a lot of us(including me) participate and take this as a challenge, but a lot of people do care about final rankings(whether they admit it or not). So I request that voting be restricted to only the participants as they have some skin in the challenge and try their best to judge/rate the tracks in a fairer way.
That's an interesting perception. I do look at the sheet out of curiosity each month. I've never really seen a pattern like that (not to say it doesn't exist - I don't know :neutral: ). I do notice that there aren't really many non-participants voting in a typical month; my unscientific impression is about 1 or 2 per month, but that could be wrong of course.
I would like to know more about this trend I am supposed to see!
To be clear - you are saying the trend you are seeing is coming from *non-participant* votes, specifically?
2. There is a huge variation in the loudness across the tracks and it is quite jarring while listening to the tracks on SoundCloud. Some tracks are extremely loud in the range of -6 to -7 LUFS integrated and this is quite painful. A lot of us don't take the pain to download and normalize the tracks. So I'd request you to make a rule addition to restrict the loudness of the submissions to approximately -14 LUFS, give or take a few dBs which is the standard loudness across all major streaming and distribution platforms.
That's a really interesting idea. Listening at a more consistent volume would be nice. But there are some potential issues, maybe:
1) I wouldn't want to make participation more technically challenging or off-putting to people who don't know how to measure LUFS or what to do about it
2) Different genres have different standards, and how do we say what is 'correct'?
3) Getting a good level is part of the skill that can be learned and developed; it's part of what is evaluated in the challenge itself, as I understand it.

Interesting points, both of them!

Post

A more consistent volume would be nice.
I suspect you have to be more technically advanced to achieve -7 LUFS. If, as a beginner, you only watch the meters and their green/yellow/red, you would end up at around -12 to -14 LUFS.

Post

exponent1 wrote: Fri May 10, 2019 4:06 pm 1. Since non-participants are allowed to vote, there is a lot of random/biased voting that happens from non-participants which really skews the final rankings a lot. If you want proof, please analyze the final voting sheets for the past 6 months. You'll inevitably see a trend there.
I know that this is a friendly challenge and a lot of us(including me) participate and take this as a challenge, but a lot of people do care about final rankings(whether they admit it or not). So I request that voting be restricted to only the participants as they have some skin in the challenge and try their best to judge/rate the tracks in a fairer way.
Restricted voting will not solve anything. You should analyze more than 6 months to see some "terrorists" here - and most of them are participants. :lol:
exponent1 wrote: Fri May 10, 2019 4:06 pm 2. There is a huge variation in the loudness across the tracks and it is quite jarring while listening to the tracks on SoundCloud. Some tracks are extremely loud in the range of -6 to -7 LUFS integrated and this is quite painful. A lot of us don't take the pain to download and normalize the tracks. So I'd request you to make a rule addition to restrict the loudness of the submissions to approximately -14 LUFS, give or take a few dBs which is the standard loudness across all major streaming and distribution platforms.
Yes, some loudness standard is a good idea. As I already showed in OSC 121, even SoundCloud, which is used for this competition, changes the overall loudness of a track. Going through all the entries is a pain because every track has a very different loudness.
zarf wrote: Sat May 11, 2019 11:16 am1) I wouldn't want to make participation more technically challenging or off putting to people who don't know how to measure LUFS or what to do about it
Everyone can use Google and find useful information and free tools to measure LUFS and overall loudness statistics. It is only up to people whether they want to learn new things to improve their skills or not.
zarf wrote: Sat May 11, 2019 11:16 am2) Different genres have different standards and how do we say what is 'correct'?
Can you provide any material to support this statement? Because what I've learned is that target levels depend on the destination platform, not the music genre.

Post

TrojakEW wrote: Sat May 11, 2019 4:53 pm Restricted voting will not solve anything. [...] Yes, some loudness standard is a good idea. [...]
Totally agree man.

Post

TrojakEW wrote: Sat May 11, 2019 4:53 pm
zarf wrote: Sat May 11, 2019 11:16 am 1) I wouldn't want to make participation more technically challenging or off-putting to people who don't know how to measure LUFS or what to do about it
Everyone can use Google and find useful information and free tools to measure LUFS and overall loudness statistics. It is only up to people whether they want to learn new things to improve their skills or not.
That is certainly true. Still, it can be a bit overwhelming when you are starting out. For myself, when I think back to when I first joined OSC, if there had been a rule like that it might just have made me think it was too technical for me. I suppose it is more about perception than actual difficulty.
Is the suggestion that a track outside the right LUFS range would be disqualified?
That seems a bit harsh, maybe - in the spirit of inviting participation from new creators.
TrojakEW wrote: Sat May 11, 2019 4:53 pm
zarf wrote: Sat May 11, 2019 11:16 am 2) Different genres have different standards, and how do we say what is 'correct'?
Can you provide any material to support this statement? Because what I've learned is that target levels depend on the destination platform, not the music genre.
No, maybe I'm talking nonsense :D
As is becoming apparent, I'm not exactly an expert in these matters!
Perhaps I was confusing LUFS with dynamic range. Or, perhaps I'm more confused than that :)
I sit advised (which I find a more accurate way of saying 'I stand corrected').

Perhaps I'm just so busy with other things that the idea of *having* to get my head around all this properly in order for me to continue to submit tracks to the OSC seems like a burden :? I don't really understand LUFS, I don't know how to measure it inside my DAW, but of course it is on my 'to do' list to figure this out and do it right.

Anyway - thanks for clarifying and correcting my misunderstanding :tu:

Post

zarf wrote: Sat May 11, 2019 5:22 pm
TrojakEW wrote: Sat May 11, 2019 4:53 pm
zarf wrote: Sat May 11, 2019 11:16 am 1) I wouldn't want to make participation more technically challenging or off-putting to people who don't know how to measure LUFS or what to do about it
Everyone can use Google and find useful information and free tools to measure LUFS and overall loudness statistics. It is only up to people whether they want to learn new things to improve their skills or not.
That is certainly true. Still, it can be a bit overwhelming when you are starting out. For myself, when I think back to when I first joined OSC, if there had been a rule like that it might just have made me think it was too technical for me. I suppose it is more about perception than actual difficulty.
Is the suggestion that a track outside the right LUFS range would be disqualified?
That seems a bit harsh, maybe - in the spirit of inviting participation from new creators.
A lot of folks (including past winners) used or did plenty of stuff worthy of disqualification, yet they still won :D Having this rule should be more of a guideline than grounds for disqualification.
TrojakEW wrote: Sat May 11, 2019 4:53 pm
zarf wrote: Sat May 11, 2019 11:16 am 2) Different genres have different standards, and how do we say what is 'correct'?
Can you provide any material to support this statement? Because what I've learned is that target levels depend on the destination platform, not the music genre.
zarf wrote: No, maybe I'm talking nonsense :D
As is becoming apparent, I'm not exactly an expert in these matters!
Perhaps I was confusing LUFS with dynamic range. Or, perhaps I'm more confused than that :)
I sit advised (which I find a more accurate way of saying 'I stand corrected').

Perhaps I'm just so busy with other things that the idea of *having* to get my head around all this properly in order for me to continue to submit tracks to the OSC seems like a burden :? I don't really understand LUFS, I don't know how to measure it inside my DAW, but of course it is on my 'to do' list to figure this out and do it right.

Anyway - thanks for clarifying and correcting my misunderstanding :tu:
It's about the average loudness of the track. -14 to -12 LUFS is the standard range used by Spotify, Apple Music, etc. It's very easy to bring the loudness down with a limiter, but I do understand that it takes skill to make things loud yet still clear and professional-sounding.

Post

TrojakEW wrote: Sat May 11, 2019 4:53 pm
zarf wrote: Sat May 11, 2019 11:16 am2) Different genres have different standards and how do we say what is 'correct'?
Can you provide any material related to this that will support your statement? Because what I learn is that target levels depends on the destination platform and not music genre.
Oh, I'd say zarf is definitely right. I'm moving outside the pure scope of OSC here, but consider classical music, with its far more dynamic expression - I guess that would be the most obvious example (https://www.gearslutz.com/board/remote- ... dness.html).

But this is also relevant for a lot of pieces submitted to OSC with a more progressive touch and long buildups.

For a lot of modern music in the -6 LUFS bracket, there's so little loudness leeway that you *have* to take out the bass to make room for the kick. And the expression basically becomes a part of the musical style.

Needless to say, this would sound totally out of place on a Steely Dan record. And the more you squeeze the drums and the peaks during a remaster, the more upfront, in-your-face and evenly matched the vocals, guitars and pianos become - creating a completely unnatural feel for this kind of music.

Of course, the way most things are mixed/mastered today, all sounds seem to be as "close to the ear," or mic, as the engineer can get them - but listen to something like "Babylon Sisters" from 1980, which I suppose is about -18 LUFS on the original LP version - so much space! And I won't even get into old Nelson Riddle sessions done with a handful of room mikes in the late 1950s.

So yeah, I'd say musical genre definitely plays a role.

Another question entirely is whether you are actually moving into, I don't know, "historically informed mastering" if you go down this road.

After all, newer recordings have kept pushing the envelope, whether jazz, classical or pop. But in my opinion this changes the fundamental feel of the music, and in many cases it loses some of its original appeal. But I digress - that's for another thread!

Post

schiing wrote: Sat May 11, 2019 5:58 pm Oh, I'd say zarf is definitely speaking the truth. I'm moving outside the pure scope of OSC here, but consider classical music, with a far more dynamic expression - I guess that would be the most obvious example (https://www.gearslutz.com/board/remote- ... dness.html).
Well, you are talking about loudness range (the loudness variation across the whole track, measured in LU), while exponent1 and I are talking about overall loudness (the average loudness measured over the entire duration of the material, in LUFS). :P
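The distinction can be shown with a toy calculation. This is only a sketch: it uses plain RMS over one-second blocks as a stand-in for true K-weighted, gated LUFS/LU measurement, and the track here is just a quiet sine followed by a loud one:

```python
import math

def block_rms_db(samples, block):
    """Short-term level (dB) of each non-overlapping block of samples."""
    out = []
    for i in range(0, len(samples) - block + 1, block):
        chunk = samples[i:i + block]
        mean_sq = sum(s * s for s in chunk) / block
        out.append(10 * math.log10(mean_sq))
    return out

rate = 1000  # toy sample rate
quiet = [0.05 * math.sin(2 * math.pi * 50 * n / rate) for n in range(4 * rate)]
loud = [0.5 * math.sin(2 * math.pi * 50 * n / rate) for n in range(4 * rate)]
track = quiet + loud

levels = block_rms_db(track, rate)
# "Integrated" loudness: power-average of all block levels over the whole track.
integrated = 10 * math.log10(sum(10 ** (l / 10) for l in levels) / len(levels))
# "Loudness range": spread between the quietest and loudest sections, in dB (LU-like).
loudness_range = max(levels) - min(levels)

print(round(integrated, 1))      # -12.0
print(round(loudness_range, 1))  # 20.0
```

So a single overall number (the -14 LUFS style target) says nothing about the spread: this toy track has a 20 dB range between its quiet and loud halves, which is the quantity the classical-music discussion above is really about.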
