Yup, your $700 cables are BS

Anything about MUSIC that doesn't fit into the forums above.

Post

I wouldn't throw it away
Outstanding USB Cable

I have used many USB cables to connect my laptop to my DAC for playback of lossless and high rez music files. This cable is the best I've experienced. It is detailed, with crisp, yet not sharp highs, natural midrange and extended bass. It lets the soundstage extend beyond the speakers, with great depth. The imaging is extremely precise. I can highly recommend this cable.
https://www.amazon.com/gp/product/B00C6 ... bw_c_x_4_w


It's the fine differences that make life interesting :wink:
|\/| _ o _ |\ |__ o
| |__> |(_ | \(_/_|

Post

rifftrax wrote: Sat Nov 17, 2018 12:29 am
farlukar wrote: Fri Nov 16, 2018 7:15 pm Fair enough, but anything with sturdy connectors and semi-decent shielding should suffice.
No. That's literally not the case at all for high-frequency signals. Again, I think you vastly underestimate what goes into building cables that handle >GHz-frequency data/clock/sync signals.

For example, you now have to aggressively account for things like skin effect, signal reflection, line impedance, cable LCR properties, insertion loss, cross-talk, propagation delay, and interference. Everything from quality of the soldered joints to the physical bend radius of every part of the signal path to the materials at every juncture becomes relatively important.
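To put rough numbers on the ">GHz" point, here's a back-of-the-envelope sketch (the 1.485 Gb/s figure is the standard per-lane TMDS rate for 1080p60, and the skin-depth formula is the usual first-order approximation for copper; Python is just doing the arithmetic):

import math

# First-order skin depth: delta = sqrt(rho / (pi * f * mu))
RHO_CU = 1.68e-8           # resistivity of copper, ohm*m
MU_0 = 4 * math.pi * 1e-7  # permeability of free space, H/m (copper is essentially non-magnetic)

def skin_depth_m(freq_hz):
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

# 1080p60 HDMI: 148.5 MHz pixel clock x 10 bits per TMDS symbol = 1.485 Gb/s per lane.
# The fundamental of a worst-case 0101... pattern sits at half the bit rate.
bit_rate_hz = 148.5e6 * 10
fundamental_hz = bit_rate_hz / 2

for f in (20e3, 1e6, 100e6, fundamental_hz):
    print(f"{f/1e6:10.3f} MHz -> skin depth ~ {skin_depth_m(f)*1e6:7.2f} um")

At audio rates the skin depth is a healthy fraction of a millimeter, so it's a non-issue for analog interconnects; at the rates digital video runs at, the current is riding in the outer couple of microns of the conductor, which is part of why plating, joints, and geometry start to matter.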

Maybe you want to do some light reading:
https://www.edn.com/design/automotive-d ... Cs-Part-1-

Again...many many people have the deep and abiding misconception that "digital is digital". This simply could not be farther from the truth.
farlukar wrote: Mon Nov 12, 2018 5:21 pm HDMI provides perfect signal reproduction or no signal at all, with nothing in between. Any $10 cable is exactly as good as this one.
Here is a great example of this. It's a bit mystifying to try to track down where this particular ridiculous idea originated, but a ton of people share it even though it's completely false. It betrays a fundamental lack of understanding of how the chips decoding the incoming video signal deal with a real-time transmission "failure" at the bit level.
So, where are the visual and listening examples?
Cats are intended to teach us that not everything in nature has a function | http://soundcloud.com/bmoorebeats

Post

rifftrax wrote: Sat Nov 17, 2018 12:29 am
farlukar wrote: Mon Nov 12, 2018 5:21 pm HDMI provides perfect signal reproduction or no signal at all, with nothing in between. Any $10 cable is exactly as good as this one.
Here is a great example of this. It's a bit mystifying to try to track down where this particular ridiculous idea originated, but a ton of people share it even though it's completely false. It betrays a fundamental lack of understanding of how the chips decoding the incoming video signal deal with a real-time transmission "failure" at the bit level.
Yes, but once we get to a reasonable degree of reliable transmission, where "reasonable" means errors that are imperceptible by some objective standard, the value of additional "quality" is moot. Moreover, how the protocol deals with errors is key to how lower quality cables will be perceived.

The claims made about USB 2.0 cables as expressed above, for example, are complete bullshit. I think it's hard for a consumer to know which cables are going to be susceptible and at what point those issues will arise. Hence we get ad hoc rules of thumb no different from tin foil on rabbit ears, for those old enough to remember.

I think what leads to "digital is digital" is the idea that we are starting with a base level of imperceptibly reliable transmission, and if we're there, the statement holds. With HDMI there seem to be more environmental factors that have an impact on this; for example, people more often want longer cables than they might for, say, their audio interface over USB.

It's also not really "completely false." Errors in transmission manifest in a certain way, but what you see at the output is not the same thing as what a high-frequency scope probe shows, because there we're concerned with visualizing the integrity of the analog signal. So, yes, the cable itself is carrying a high-speed analog signal, and all of its properties affect how well it can transmit that signal, but the information being carried is encoded, and what we see is the digital representation of the original signal, modulo how well the protocol copes with errors. But I'm repeating myself.
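A toy way to see the "base level of reliable transmission" idea (a minimal sketch with made-up voltage and noise numbers, not a model of any real receiver): the receiver only decides above/below a threshold at each sample instant, so moderate analog degradation produces literally identical output bits, and only past some margin do errors appear at all.

import random

random.seed(0)

def recovered_bits(bits, noise_amp):
    """Transmit bits as +/-1 V, add uniform noise, slice at 0 V."""
    out = []
    for b in bits:
        v = (1.0 if b else -1.0) + random.uniform(-noise_amp, noise_amp)
        out.append(1 if v > 0.0 else 0)
    return out

bits = [random.randint(0, 1) for _ in range(100_000)]

for noise in (0.2, 0.6, 0.95, 1.2, 2.0):
    errors = sum(a != b for a, b in zip(bits, recovered_bits(bits, noise)))
    print(f"noise +/-{noise:4.2f} V -> bit errors: {errors}")

Real links fail more gradually (Gaussian noise, jitter, ISI), but the shape is the same: bit-perfect over a wide range, then a fairly steep cliff.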
Last edited by ghettosynth on Sat Nov 17, 2018 1:49 am, edited 1 time in total.

Post

BMoore wrote: Sat Nov 17, 2018 1:08 am
rifftrax wrote: Sat Nov 17, 2018 12:29 am
farlukar wrote: Fri Nov 16, 2018 7:15 pm Fair enough, but anything with sturdy connectors and semi-decent shielding should suffice.
No. That's literally not the case at all for high-frequency signals. Again, I think you vastly underestimate what goes into building cables that handle >GHz-frequency data/clock/sync signals.

For example, you now have to aggressively account for things like skin effect, signal reflection, line impedance, cable LCR properties, insertion loss, cross-talk, propagation delay, and interference. Everything from quality of the soldered joints to the physical bend radius of every part of the signal path to the materials at every juncture becomes relatively important.

Maybe you want to do some light reading:
https://www.edn.com/design/automotive-d ... Cs-Part-1-

Again...many many people have the deep and abiding misconception that "digital is digital". This simply could not be farther from the truth.
farlukar wrote: Mon Nov 12, 2018 5:21 pm HDMI provides perfect signal reproduction or no signal at all, with nothing in between. Any $10 cable is exactly as good as this one.
Here is a great example of this. It's a bit mystifying to try to track down where this particular ridiculous idea originated, but a ton of people share it even though it's completely false. It betrays a fundamental lack of understanding of how the chips decoding the incoming video signal deal with a real-time transmission "failure" at the bit level.
So, where are the visual and listening examples?
There are no sound differences, but here you go.

https://www.cnet.com/news/why-all-hdmi- ... -the-same/

Post

There is a difference between cheapo cables and good cables, but a $650 difference between good cables and ‘audiophile’ cables isn’t likely to be real (or audible to a human).
It also depends what the cable is used for: is it a high-impedance/low-voltage source (guitar or bass)? Or high-voltage/low-impedance (line level)? Or very high voltage or current (amp to speakers)? Or very high frequency (digital)?
All these uses make different demands on a cable. The first and last examples are more sensitive than the middle two. The connectors alone can make a difference. In the guitar example, a high-capacitance cable will make a noticeable difference, as will cheapo connectors, as will even just the flexibility of the cable. I’ve seen (on a scope) the signal degradation a crap cable will do in the digital realm. Speaker cables, as long as they can handle the load, are way less of an issue. Even so, I wouldn’t trust my amp and speakers to a crap set of cables, which are way more likely to fail.
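For the guitar case, a very rough sketch of why cable capacitance is audible there but not at line level (the per-meter capacitance and source impedances are ballpark assumptions, and this ignores the pickup's inductance and resonance, which in practice make it worse):

import math

def cutoff_hz(source_impedance_ohm, cable_capacitance_f):
    """-3 dB corner of the simple RC low-pass formed by source impedance and cable capacitance."""
    return 1.0 / (2 * math.pi * source_impedance_ohm * cable_capacitance_f)

cap_per_m = 100e-12          # ~100 pF/m is typical of ordinary instrument cable
cable = 6 * cap_per_m        # a 6 m guitar lead

# Passive pickup into a typical rig: effective source impedance on the order of tens of kohm.
print(f"guitar, 6 m:  ~{cutoff_hz(40_000, cable)/1e3:.1f} kHz corner")
# Line-level output: a few hundred ohms of source impedance.
print(f"line,   6 m:  ~{cutoff_hz(300, cable)/1e6:.1f} MHz corner")

A corner around 6-7 kHz is right where the "sparkle" lives, which is why a long, high-capacitance lead audibly dulls a passive guitar while the same cable is a non-issue for line-level gear.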

Yes, cable quality makes a difference up to a point. And yes, your $700 cables are BS (unless they’re gold, inside and out.)
gadgets an gizmos..make noise https://soundcloud.com/crystalawareness Restocked: 3/24
old stuff http://ww.dancingbearaudioresearch.com/
if this post is edited -it was for punctuation, grammar, or to make it coherent (or make me seem coherent).

Post

BMoore wrote: Sat Nov 17, 2018 1:08 am So, where are the visual and listening examples?
This isn't pertinent for really low-bandwidth signals (e.g. 2-channel Red Book PCM), so "listening" examples are pretty much moot.

What I'm talking about mainly covers video, but it can also come up when pushing the limits of the USB 2.0 spec, say when recording 10+ channels of >44.1 kHz audio. Looking at how eye-diagram tests work gives a visual sense of what digital signal degradation looks like, so these are a good start (there's also a rough simulation sketch after the links):

https://www.youtube.com/watch?v=opotHMXKefA
https://www.youtube.com/watch?v=o8DPlqWVmzk
https://www.youtube.com/watch?v=vMR1r0nDK7s
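
If you'd rather generate an eye diagram than just watch one, here's a minimal simulation sketch (NumPy/Matplotlib assumed; the bit rate, filtering, and noise levels are arbitrary, chosen only so the "eye" visibly closes as the channel gets worse):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

def eye(ax, bandwidth_fraction, noise, title):
    """Random NRZ bits through a crude one-pole low-pass 'cable', folded into 2-UI traces."""
    spb = 32                                   # samples per bit (unit interval)
    bits = rng.integers(0, 2, 400)
    tx = np.repeat(bits * 2.0 - 1.0, spb)      # +/-1 V NRZ waveform

    # One-pole IIR low-pass as a stand-in for cable loss / limited bandwidth.
    alpha = bandwidth_fraction
    rx = np.empty_like(tx)
    acc = 0.0
    for i, v in enumerate(tx):
        acc += alpha * (v - acc)
        rx[i] = acc
    rx += rng.normal(0.0, noise, rx.shape)

    # Overlay 2-UI slices on top of each other: that's the eye diagram.
    span = 2 * spb
    for start in range(spb, len(rx) - span, spb):
        ax.plot(rx[start:start + span], color="tab:blue", alpha=0.05)
    ax.set_title(title)
    ax.set_ylim(-1.8, 1.8)

fig, axes = plt.subplots(1, 3, figsize=(12, 3), sharey=True)
eye(axes[0], 0.9, 0.02, "good channel: open eye")
eye(axes[1], 0.3, 0.10, "marginal: eye closing")
eye(axes[2], 0.12, 0.25, "bad: eye nearly shut")
plt.tight_layout()
plt.show()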

Also, Blue Jeans Cable has a REALLY good rundown on the basics of what "digital" really means in a real-world application sense:
http://www.bluejeanscable.com/articles/ ... analog.htm

Now, the thing to remember is that digital degradation can actually "look" a lot like analog degradation. If you have an intact or intermittent clock signal that the signal-processing chip is still locking onto, while rising/falling edge transitions on the data lines are being skewed far enough to "flip" certain bits, you can get all kinds of crazy stuff: signals are still being transmitted and decoded, with a portion of the video or audio left "intact," but your output suddenly has a mix of most-significant to least-significant bits being flipped all over the place.

In video this could look like anything from a soft layer of random added "noise" or HSV/RGB value shifts, up to harsh, obvious patterns of "dots" or lines overlaid on the screen. At some point, yes, your clock signal will become so corrupted that the decode-side chip will basically give up and say it can't read the signal at all, but better implementations on the embedded/MCU side will attempt clock recovery and at least try to present a signal with some corruption (which is of course better than no signal at all). Going beyond this, you have to remember that any real-time signal protocol (i.e. anything dealing with audio/video streaming) can't have error correction, so the side taking in the signal has no idea whether what it is decoding has "errors" or not. As long as the clock is there in some form, there is the potential for measurable distortion to creep into the data lines as well...

In audio this could translate to anything from very subtle distortion to sudden impulse "pops" (say if a bunch of MSB values are flipped).
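A quick sketch of why the bit position matters so much for those "pops" (this just flips bits in a 16-bit PCM sample value; nothing protocol-specific):

import math

def flip_bit(sample, bit):
    """Flip one bit of a signed 16-bit PCM sample, keeping the result in 16-bit range."""
    raw = (sample & 0xFFFF) ^ (1 << bit)
    return raw - 0x10000 if raw & 0x8000 else raw

sample = 1000                       # a quiet-ish sample, roughly -30 dBFS
full_scale = 32768.0

for bit in (0, 7, 14, 15):
    corrupted = flip_bit(sample, bit)
    err_db = 20 * math.log10(abs(corrupted - sample) / full_scale)
    print(f"bit {bit:2d} flipped: {sample} -> {corrupted:6d}   error level ~{err_db:6.1f} dBFS")

An LSB flip lands around -90 dBFS and is buried; a sign-bit flip is a full-scale click, which is exactly the impulse-pop case.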
Snare drums samples: the new and improved "dither algo"

Post

ghettosynth
There are no sound differences, but here you go.

https://www.cnet.com/news/why-all-hdmi- ... -the-same/
There's no difference between a $3.50 HDMI cable and a $20 or $100 cable? Haha, only if you got really lucky and got a great deal, or got unlucky and bought a high-priced crapper cable (which obviously happens).

farlukar
Fair enough, but anything with sturdy connectors and semi-decent shielding should suffice.
That’s part of the definition of a good cable. You aren’t likely to get good connectors on cheap cables.
gadgets an gizmos..make noise https://soundcloud.com/crystalawareness Restocked: 3/24
old stuff http://ww.dancingbearaudioresearch.com/
if this post is edited -it was for punctuation, grammar, or to make it coherent (or make me seem coherent).

Post

ghettosynth wrote: Sat Nov 17, 2018 1:36 am
There are no sound differences, but here you go.

https://www.cnet.com/news/why-all-hdmi- ... -the-same/
OK, for one... the guy's analysis is a joke. You can obviously see from the picture he himself supplies that every single one of those errant pixels has a slightly different HSV/RGB value (that's a far cry from "on" or "off"). And FFS, differential signaling only eliminates noise induced along the run of cable itself; it doesn't mitigate issues from propagation delay, signal reflection, impedance mismatching, poor LCR properties, or bad driver/transmitter performance. Trying to say 10b encoding is some kind of ultimate panacea that eliminates all possibility of digital degradation is total nonsense as well. It's a mitigation technique that reduces the effective bandwidth requirement for transmission.
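To make the differential-signaling caveat concrete, a toy numeric sketch (NumPy assumed; the signal and noise values are invented): noise that hits both legs of the pair equally cancels at the receiver, while anything that affects the legs unequally, or smears the edges themselves, comes straight through the subtraction.

import numpy as np

rng = np.random.default_rng(0)

bits = rng.integers(0, 2, 64)
signal = np.repeat(bits * 2.0 - 1.0, 16)          # ideal +/-1 V NRZ

common_mode = 0.5 * np.sin(np.linspace(0, 40, signal.size))   # pickup hitting both legs equally
differential_junk = rng.normal(0.0, 0.2, signal.size)         # e.g. reflections/ISI, not equal on both legs

pos_leg = +signal + common_mode + differential_junk
neg_leg = -signal + common_mode

received = (pos_leg - neg_leg) / 2.0               # what the differential receiver sees

print("residual common-mode after subtraction:", np.max(np.abs((received - signal) - differential_junk / 2.0)))
print("residual differential junk:            ", np.max(np.abs(received - signal)))

So the pair buys rejection of common-mode pickup, not immunity to reflections, impedance mismatch, or edge distortion.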

It's fair to say that digital degradation is often "easier" to detect, and it is definitely possible to transmit a bit-for-bit perfect signal in the majority of constrained cases, like HDMI video with runs under 25 ft, but to say it "doesn't exist" is simply inaccurate.
Last edited by rifftrax on Sat Nov 17, 2018 2:05 am, edited 1 time in total.
Snare drums samples: the new and improved "dither algo"

Post

rifftrax wrote: Sat Nov 17, 2018 1:42 am
In video this could look like anything from a soft layer of random added "noise" or HSV/RGB value shifts, up to harsh, obvious patterns of "dots" or lines overlaid on the screen. At some point, yes, your clock signal will become so corrupted that the decode-side chip will basically give up and say it can't read the signal at all, but better implementations on the embedded/MCU side will attempt clock recovery and at least try to present a signal with some corruption (which is of course better than no signal at all). Going beyond this, you have to remember that any real-time signal protocol (i.e. anything dealing with audio/video streaming) can't have error correction, so the side taking in the signal has no idea whether what it is decoding has "errors" or not. As long as the clock is there in some form, there is the potential for measurable distortion to creep into the data lines as well...

In audio this could translate to anything from very subtle distortion to sudden impulse "pops" (say if a bunch of MSB values are flipped).
Yes, but none of this will translate to the kinds of positive effects that are often claimed for expensive digital cables: that the colors are brighter, or, my favorite, that the "soundstage opens up." That shit is nonsense, and it's what leads people to claim "digital is digital."

Post

rifftrax wrote: Sat Nov 17, 2018 2:02 am
ghettosynth wrote: Sat Nov 17, 2018 1:36 am
There are no sound differences, but here you go.

https://www.cnet.com/news/why-all-hdmi- ... -the-same/
OK, for one... the guy's analysis is a joke. You can obviously see from the picture he himself supplies that every single one of those errant pixels has a slightly different HSV/RGB value (that's a far cry from "on" or "off"). And FFS, differential signaling only eliminates noise induced along the run of cable itself; it doesn't mitigate issues from propagation delay, signal reflection, impedance mismatching, poor LCR properties, or bad driver/transmitter performance.

It's fair to say that digital degradation is often "easier" to detect, and it is definitely possible to transmit a bit-for-bit perfect signal in the majority of constrained cases, like HDMI video with runs under 25 ft, but to say it "doesn't exist" is simply inaccurate.
It was probably a bad choice of link; it was just the first thing I found with actual pictures.

Post

ghettosynth wrote: Sat Nov 17, 2018 2:03 am Yes, but none of this will translate to the kinds of positive effects that are often claimed for expensive digital cables: that the colors are brighter, or, my favorite, that the "soundstage opens up." That shit is nonsense, and it's what leads people to claim "digital is digital."
This is absolutely true. I'm not arguing against that. The only goal of real-time digital signal transmission is to create a bit-for-bit perfect copy on the other side, with as little reliance on error correction as possible.
Snare drums samples: the new and improved "dither algo"

Post

CrystalWizard wrote: Sat Nov 17, 2018 1:55 am ghettosynth
There are no sound differences, but here you go.

https://www.cnet.com/news/why-all-hdmi- ... -the-same/
There's no difference between a $3.50 HDMI cable and a $20 or $100 cable? Haha, only if you got really lucky and got a great deal, or got unlucky and bought a high-priced crapper cable (which obviously happens).
I didn't say that; I just posted a link with pictures of how degradation happens. That said, I buy $40-ish HDMI cables on sale for $15 for anything longer than about six feet. For six feet and under, I buy the absolute cheapest cables I can get from Fry's. I have not experienced any reliability issues, for the most part. Recently my Xbox started dropping out occasionally and I've been meaning to track that down. It might be the cable, it might be my receiver, I don't know.

I have some cheap cables that are over 25 ft, and those vary from not working at all in some cases to showing some of the issues that rifftrax points out above. It's very much device-dependent on both ends.

I doubt very seriously for short runs that a $100 cable is worth the price. The longer the run, the more important it is to buy better cable.

Keep in mind, I'm not a hardcore video consumer. I use a 4K TV as a computer monitor, and I have a short run of about 20 ft between rooms. That connection has been sensitive to cable choice. My TV is just 1080p and not even a very good TV.

Post

ghettosynth wrote: Sat Nov 17, 2018 1:31 am Moreover, how the protocol deals with errors is key to how lower quality cables will be perceived.
One more nit-pick here. You realize the issue with error "correction" for any real-time protocol is that it...effectively doesn't exist. How do you re-transmit a time-critical packet that needed to arrive before another packet without some insane buffer?

Answer: you don't

That's the point. With any real-time protocol, you have only error detection/mitigation. True error correction is effectively impossible.
ghettosynth wrote: Sat Nov 17, 2018 1:31 am It's also not really "completely false." Errors in transmission manifest in a certain way, but what you see at the output is not the same thing as what a high-frequency scope probe shows, because there we're concerned with visualizing the integrity of the analog signal. So, yes, the cable itself is carrying a high-speed analog signal, and all of its properties affect how well it can transmit that signal, but the information being carried is encoded, and what we see is the digital representation of the original signal, modulo how well the protocol copes with errors. But I'm repeating myself.
And that encoding is prone to transmit-edge distortion like any other analog voltage, even more so because it's at a very high frequency. See above for the problem with any assumption about "error" handling.

My point with all of this is just to point out that digital is not some automatic guarantee of signal integrity...especially when you are dealing with real-time transmission.

If we're talking about a non-realtime protocol like TCP, where you have multiple checksums happening at various OSI layers and it can take its sweet time to get a proper payload through, then great. Otherwise you have no guarantees.
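To illustrate the detection-versus-correction distinction in the simplest possible terms (a toy checksum check, not the actual CRC/FEC used by any of these protocols): the receiver can tell a payload is damaged, but with no retransmission and no added redundancy it has no way to know which bits to repair.

import zlib

payload = bytes(range(32))                   # stand-in for one real-time packet
sent_crc = zlib.crc32(payload)

# Single bit flipped somewhere in flight.
damaged = bytearray(payload)
damaged[7] ^= 0x10
damaged = bytes(damaged)

ok = zlib.crc32(damaged) == sent_crc
print("checksum matches:", ok)               # False: the error is *detected*
print("but the receiver only knows 'bad packet';")
print("without retransmission or forward error correction it cannot recover the original bytes.")

TCP closes that gap by retransmitting; a fixed-latency link either has to ship redundancy up front (forward error correction) or just present whatever arrived.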
Snare drums samples: the new and improved "dither algo"

Post

rifftrax wrote: Sat Nov 17, 2018 6:48 am How do you re-transmit a time-critical packet that needed to arrive before another packet without some insane buffer?

Answer: you don't
It depends on your definition of sanity. If sanity means the data must arrive, and it must arrive in order and error-free, then a protocol that accounts for transmission errors is the only sane option. Where there's no need to worry about delay, like receiving video over a distance where a few seconds of latency almost certainly isn't an issue, a protocol like TCP makes far more sense than an insanely expensive cable.
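For what it's worth, the buffer doesn't have to be all that insane in the cases where delay is tolerable; a rough sizing sketch (the bit rates are ballpark assumptions):

def buffer_mb(bitrate_mbps, seconds):
    """Memory needed to hold `seconds` of a stream at `bitrate_mbps` megabits per second."""
    return bitrate_mbps * seconds / 8.0       # megabits -> megabytes

print(f"3 s of CD-quality PCM (~1.4 Mb/s):   {buffer_mb(1.4, 3):5.1f} MB")
print(f"3 s of 1080p streaming (~8 Mb/s):    {buffer_mb(8, 3):5.1f} MB")
print(f"3 s of 4K streaming (~25 Mb/s):      {buffer_mb(25, 3):5.1f} MB")

A few megabytes buys TCP all the retransmission time it needs, which is why "take your time and get it right" protocols win whenever a couple of seconds of delay is acceptable.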

Post

rifftrax wrote: Sat Nov 17, 2018 6:48 am My point with all of this is just to point out that digital is not some automatic guarantee of signal integrity...especially when you are dealing with real-time transmission.
Yes, but you were exaggerating a bit. It's not "completely false"; that's an extreme response, not unlike "digital is digital." My point was that there is a difference in how analog and digital signal degradation drive perception.

How a protocol deals with errors, as you say, varies but the consumer generally is not aware of these differences. So, reliable, quasi-realtime protocols help drive the perception of "digital is digital."

All I'm saying is that there are a number of factors that drive this perception, as I've discussed, and it's not completely false. It's more fair to say that it's not completely true.
