So then. HDMI cables...
#1
Fucking superstar........
Thread Starter
Join Date: May 2004
Location: Argyll.... It's lonely...
Posts: 13,240
Likes: 0
Received 0 Likes on 0 Posts
So then. HDMI cables...
Now, call me stupid if you like, but surely a digital cable either works or does not work?
So, a £4.99 cable will do exactly the same job as a £45 cable?? As it's not analogue, there can't possibly be any signal degradation....
Sooooo, why are there expensive cables... The woman in the telly shop was trying to get me to buy a cable at 45 quid. I said, why, when you have another one there for £4.99... Am I just being Scottish and tight? Or would the expensive one be £40.01 well spent?
#3
Professional Waffler
The back-to-back test I saw put a Tesco Value HDMI against a branded posh one at about £50. There was a difference in quality, but you needed posh analysing equipment to see/hear it. Use cheapies, imho.
#5
Too many posts.. I need a life!!
iTrader: (1)
Join Date: Aug 2006
Location: Newcastle upon tyne
Posts: 553
Likes: 0
Received 0 Likes on 0 Posts
Same here, I just paid £4.99 for an HDMI cable. No point in buying stupidly expensive cables when you can't see any difference from a cheap one... unless you're going to be really anal about having the expensive cable, telling everyone "ooo, my cable cost me £50 and it's gold plated"... pmsl
#6
If in Doubt Flat Out...
It more or less depends on your equipment. If you have a normal-ish TV you won't notice a massive difference, to be honest; it's only when you start running top-of-the-range Bravias and so on. Better connections (i.e. gold plating) can make a difference, and I'm colour-blind, so for it to make a difference it has to be a fair bit. For everyday use and connections I don't see that it matters.
#7
Chasing Radders
No difference....... Fact!!!!
I did find a quote on another thread from a top geek bloke saying there is no difference at all..... Like has already been said, it's a binary signal, so it's 0s and 1s.
#8
PassionFord Post Troll
pmsl
How can a gold-plated interface make "0" and "1" any better quality...
So, facts...
The lead itself makes no difference...
The quality of the manufacture will make the lead last...
£4 one from Asda - ends come off
£10 one from Asda - perfect...
The only difference is the quality of the materials, not the quality of the picture, as it's digital LOL
#9
Too many posts.. I need a life!!
iTrader: (1)
Join Date: Aug 2006
Location: Newcastle upon tyne
Posts: 553
Likes: 0
Received 0 Likes on 0 Posts
pmsl
How can a gold-plated interface make "0" and "1" any better quality...
So, facts...
The lead itself makes no difference...
The quality of the manufacture will make the lead last...
£4 one from Asda - ends come off
£10 one from Asda - perfect...
The only difference is the quality of the materials, not the quality of the picture, as it's digital LOL
#13
If in Doubt Flat Out...
pmsl
How can a gold-plated interface make "0" and "1" any better quality...
So, facts...
The lead itself makes no difference...
The quality of the manufacture will make the lead last...
£4 one from Asda - ends come off
£10 one from Asda - perfect...
The only difference is the quality of the materials, not the quality of the picture, as it's digital LOL
#14
Advanced PassionFord User
Join Date: Feb 2005
Location: Solihull
Posts: 2,090
Likes: 0
Received 0 Likes on 0 Posts
The difference becomes evident as cable length increases - capacitance and impedance increase with cable length. Capacitance causes 'smearing' of the digital signal, rounding off the edges and resulting in data corruption. Impedance is proportional to frequency as well as cable length, so the higher the data rate, the more the signal is attenuated. This is why cheaper cables may be fine at 720p/1080i but not at 1080p - more data, higher data rate, greater attenuation.
Now that said, there's no guarantee that a £45 cable has more appropriate capacitance and impedance than a £10 one, and furthermore there is the issue of impedance matching between cable and connectors - if a connector has a significantly (10% or more as a guide) higher impedance than the cable at the desired frequency, it can result in reflections within the cable, which obviously corrupts the data.
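The length-and-frequency scaling described above can be put into a quick toy calculation. Everything numeric here is assumed for illustration - the loss coefficient and the receiver budget are made-up round numbers, not specs for any real cable - but the shape of the result is the point: raising the data rate or the length eats the margin.

```python
import math

# Toy model of cable loss: skin-effect-style attenuation grows with
# length and with the square root of frequency. Both constants below
# are assumed, illustrative values - not measurements.
K_DB = 0.6           # dB per metre at 100 MHz (assumed)
BUDGET_DB = 21.0     # assumed allowable end-to-end attenuation

def loss_db(length_m, freq_hz):
    """Attenuation in dB: proportional to length, sqrt of frequency."""
    return K_DB * length_m * math.sqrt(freq_hz / 100e6)

# TMDS serialises 10 bits per pixel clock; the fundamental of the
# bit stream sits at half the bit rate.
for name, pixel_clock in (("720p/1080i", 74.25e6), ("1080p", 148.5e6)):
    nyquist = pixel_clock * 10 / 2
    for length in (2, 5, 10, 15):
        db = loss_db(length, nyquist)
        verdict = "fine" if db < BUDGET_DB else "marginal"
        print(f"{name:>10} over {length:>2} m: {db:5.1f} dB ({verdict})")
```

With these made-up coefficients, the same 15 m lead that comes out "fine" at 720p/1080i goes marginal at 1080p, which is exactly the pattern described above.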
FWIW, by default I specify Kramer cables for short runs - they are a pro-av manufacturer and have done the maths. They are available from CPC at very reasonable prices.
At 5 - 10m, I would look at QED or Chord Company leads - not cheap but they have specifically designed them for longer runs and use larger conductors. Chord also do one with an in-built repeater.
Beyond 10m, I prefer to use HDMI-Over-CAT6 converters. It's not cheap, but guaranteed to work and future proof (HDMI 1.4? stick new converters on the end). I have used the £50/60 ebay ones with varying success so now specify the Cyan series by Magenta Research which are about £180 a set.
HTH
Chris
#15
PassionFord Post Troll
Firstly it depends on run length. Just because it's 0s and 1s doesn't mean data can't get corrupted or lost. Better quality connections and materials improve the chances of a good signal reaching the output. A £4.99 cable and a £50 cable might perform identically over a short run, but once you get onto longer runs it can make a difference.
Signal strength can and will be lost over distance, but that wasn't really the question.
OK, so a rule of thumb...
Say 2 m plus, then look at the cable specs, as it might need better quality cabling...
Having said that, it's still 0s and 1s, and Cat 5 seems to manage it OK for 200 m lol
#16
If in Doubt Flat Out...
Agreed that length will play a part, but most TVs are next to the AV receiver in the home...
Signal strength can and will be lost over distance, but that wasn't really the question.
OK, so a rule of thumb...
Say 2 m plus, then look at the cable specs, as it might need better quality cabling...
Having said that, it's still 0s and 1s, and Cat 5 seems to manage it OK for 200 m lol
It depends what signal you're sending down the cable. We send 0s and 1s down Cat 5 for our equipment, and it needs a booster if you go over 119 m.
Whenever I buy a lead I don't look at the price, I look at the ratings and tech spec. It's very rare that I'll buy something that isn't Cambridge Audio, because I know what I'm getting. Plus I tend to always be moving my AV kit, so I always buy longer than I need.
Best thing: if you don't care, get a crap one. If you do, don't go out and buy an expensive one just because. Look online for reviews, or speak to a specialist shop like Sevenoaks.
#20
PassionFord Post Troll
I got a Cambridge Audio one myself, way before I'd tested this, but it had a bendy connection for wall mounting - never wall mounted it, and I've tested a few others...
To my eyes and ears they look and sound exactly the same..
#24
PassionFord Post Troll
I use these ones for my Xbox (living room & bedroom) and they have been fine so far.
http://cgi.ebay.co.uk/ws/eBayISAPI.d...=STRK:MEWAX:IT
#25
Chasing Radders
"Question: Is there any difference between a cheap (i.e. $10 HDMI cable) and an expensive (i.e. $150 HDMI cable)???"
I have an EE degree. I work as a broadcast engineer. I live and breathe digital and analog signals every day. So yes, you could say I'm qualified to give the answer to this question...
That answer is, "No, an expensive HDMI cable will make NO difference in the quality of your picture OR sound"
I'll give you the more complex reason first, then an analogy... Hopefully one will make sense... If you don't want all the real technical stuff, just skip down to B for a real simple explanation...
A) Wires send electrical signals... Plain and simple. Anything sent over a wire is ultimately just a voltage/current applied to that cable. Let's say we're talking about an analog video signal that's 1 volt peak to peak... In other words, measuring from the LOWEST voltage to the HIGHEST voltage will give a result of 1 volt... With an analog signal you have "slices" of time that are "lines" of signal... It's too complex to go into here, but basically you have a "front porch" which is known as the "setup"... This is what helps your TV "lock onto" the signal and sets the "black level" for it. After that you've got each line of the image (455 half cycles per line). Again I won't go into how chrominance (color information) and luminance (picture or brightness information) is combined, separated, etc... It's too complex for this discussion, but regardless, just know that following that porch you've got all the lines of the picture (and some that don't show up on the picture... these carry closed captioning, test signals, etc...). All of these "lines" of information when you look at them on a scope look like this...
That waveform is all of that information in analog form... In other words, if you look at one VERY SMALL timeslice of that waveform, the EXACT position of the form (i.e. what voltage is present) represents what information is at that position...
Because of this, it's VERY EASY for other radiated signals to get "mixed in" with that information. When this happens, the more "noise" you get mixed into the signal, the more degraded the picture will be... You'll start to get snow, lines, weird colors, etc... Because "information" is getting into the waveform that doesn't belong there...
With digital however (i.e. the signal sent over an HDMI cable), the information is encoded differently... At its lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON... See on the right side here, the "square wave" pattern?
That's what a digital signal looks like... For each "slice" of the signal, the "bit" is either on (if the signal is high) or off (if it's low)...
Because of that, even if you mix some noise, or even a LOT of noise into the signal, the bit will STILL be on or off... It doesn't matter...
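That thresholding argument is easy to demonstrate numerically. A minimal sketch, assuming idealised 0 V / 1 V logic levels and uniform noise (not the actual TMDS signalling HDMI uses, which is differential and more involved):

```python
import random

random.seed(0)

# 1000 known bits, sent as idealised 0 V / 1 V levels.
bits = [1, 0, 1, 1, 0, 0, 1, 0] * 125
NOISE = 0.2   # peak noise amplitude - large, but under half a logic level

received = [b + random.uniform(-NOISE, NOISE) for b in bits]

# Analog view: every bit of noise is error in the recovered value.
analog_error = sum(abs(r - b) for r, b in zip(received, bits)) / len(bits)

# Digital view: slice at the midpoint and the bits come back exactly.
decoded = [1 if r > 0.5 else 0 for r in received]
bit_errors = sum(d != b for d, b in zip(decoded, bits))

print(f"mean analog error: {analog_error:.3f} V")      # clearly nonzero
print(f"bit errors after thresholding: {bit_errors}")  # 0
```

The same noise that would visibly degrade an analog level produces zero bit errors once you slice at the midpoint, because the noise never pushes a level past the decision threshold.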
Now, for a slightly easier to understand analogy...
B) Think of it this way... Let's say you have a ladder with 200 steps on it... An "analog" signal represents information by WHICH step the person is on at a certain time. As you move further and further away (get "noise" or interference in the signal), it's very easy to start making mistakes... For example, if the person is on the 101st step, you might say he's on the 102nd, or as you get further away, you might start making more and more mistakes... At some point you won't know if the person is on the 13th step or the 50th step....
NOW... In a digital signal, we don't care if he's on the 13th or 14th or 15th step... All we care about is whether he's at the TOP or the BOTTOM... So now, as we back you up further and further (introduce more noise), you might have no idea what STEP he's on, but you'll STILL be able to tell if he's a "1" or a "0"...
THIS is why digital signals aren't affected by cheaper cables, etc... Now eventually if you keep moving further and further back, there may come a point where you can no longer tell if he's up or down... But the good news is, digital signals don't "guess"... If they SEE the signal, they work... If they DON'T, they DON'T.. LOL
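That "works or doesn't" behaviour - the digital cliff - can be sketched too. Again a toy model rather than real TMDS: Gaussian noise on ideal 0/1 levels, with errors counted after slicing at the midpoint.

```python
import random

random.seed(1)

bits = [random.randint(0, 1) for _ in range(2000)]

def bit_error_rate(noise_sigma):
    """Add Gaussian noise to ideal 0/1 levels, slice at 0.5, count errors."""
    errors = 0
    for b in bits:
        noisy = b + random.gauss(0, noise_sigma)
        errors += (1 if noisy > 0.5 else 0) != b
    return errors / len(bits)

# The error rate stays at (or near) zero, then climbs fast once the
# noise becomes comparable to half a logic level - a cliff, not the
# gradual fade you get with analog.
for sigma in (0.05, 0.10, 0.15, 0.20, 0.30, 0.40):
    print(f"noise sigma {sigma:.2f}: BER = {bit_error_rate(sigma):.4f}")
```

At low noise the link is perfect; past the cliff it falls apart quickly, which matches the "if they SEE the signal, they work" description above.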
So if anyone ever tells you they can "see the difference" between HDMI cables, etc... You can knowingly laugh to yourself and think about how much money the poor soul wasted on something that was pointless.
Now, I've seen others say that they make a difference in audio... ALL audio carried over HDMI is STILL in digital format... So again, since it's a digital signal, it will not make ANY difference at all....
I've also seen various posts in regards to things like "Make sure you get a v1.3 cable"... The various HDMI versions determine the capabilities of the DEVICES on either end of that cable (most of the HDMI versions, other than 1.0 to 1.1, have to do with AUDIO and how many channels / what type of audio are carried...). Because of this, the cable itself is NO DIFFERENT... It's just marketing that some companies charge more for a "v1.3" cable than a "v1.1" cable, etc... The cables themselves will work now and WELL into the future for any other HDMI versions that come along the way....
So there you have it... Hopefully it's clear enough to understand and hopefully it will help prevent a few posts...
#27
Professional Waffler
Seriously, nice write up mate
#28
PassionFord Post Troll
Join Date: May 2003
Location: DERBYSHIRE
Posts: 2,767
Likes: 0
Received 0 Likes on 0 Posts
So in a nutshell, an expensive cable is better than a cheap one then
#33
Advanced PassionFord User
Join Date: Feb 2005
Location: Solihull
Posts: 2,090
Likes: 0
Received 0 Likes
on
0 Posts
With respect to the guy writing Tiff's post, broadcast engineers don't use HDMI cables. They use SDI/HD-SDI, AES/EBU and other professional formats, which are designed with long runs in mind - they have strong line drivers, and the cables are simple single twisted pairs or coax/triax, so they have intrinsically lower capacitance and impedance than a multi-pair, screened cable that also carries a DC signal!
Cable length DOES matter for digital signals - here are some images to demonstrate what happens:
- Digital pulse waveform, 0 V to 5 V @ 100 MHz
- 1st image: no capacitance
- 2nd image: with capacitance, as introduced by long cables
As you can see, the signal is somewhat distorted. Now, yes, 1 nF is an exaggerated capacitance, but it's there to illustrate the point - that signal is totally corrupted. The analyzer is so confused it's 'tristated' / given up. What that translates to in the real world is incorrect edge/level detection, corrupt data and hence a corrupt picture.
Now, you would need a pretty damn crap cable to affect the signal to that extent, but even slight smearing can result in a noticeable bit error rate, which in HDMI/DVI signals manifests itself as 'sparkles' on the screen.
If you don't believe me, buy a cheap 20m HDMI cable and try your luck. Denying that the effect is real and measurable is to deny basic electrical principles!
FWIW I've been installing for about 3 years, was a professional sound engineer for 5/6 years beforehand, and worked for BBC OBs before that.
Chris
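Chris's capacitance point can be sketched the same way. The snippet below is my own toy model, not his actual scope captures: it pushes an ideal square wave through a first-order RC low-pass (a crude stand-in for cable capacitance) and samples mid-bit, the way a receiver would. With a small RC every decision is correct; make RC large relative to the bit period and the smeared edges start producing exactly the bit errors he describes.

```python
def rc_filter(samples, dt, rc):
    """First-order RC low-pass: each step moves the output a fraction
    dt/(rc+dt) of the way toward the input."""
    out, y = [], 0.0
    alpha = dt / (rc + dt)
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def bit_errors(bits, samples_per_bit, rc, dt=1e-10):
    """Drive an ideal 0/1 square wave through the RC model, sample
    mid-bit against a 0.5 threshold, and count wrong decisions."""
    wave = [float(b) for b in bits for _ in range(samples_per_bit)]
    filtered = rc_filter(wave, dt, rc)
    errors = 0
    for i, b in enumerate(bits):
        mid = i * samples_per_bit + samples_per_bit // 2
        decided = 1 if filtered[mid] > 0.5 else 0
        errors += decided != b
    return errors

bits = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
print(bit_errors(bits, samples_per_bit=20, rc=1e-10))  # small RC: edges settle, decisions clean
print(bit_errors(bits, samples_per_bit=20, rc=5e-9))   # RC >> bit period: smeared edges, errors
```

The numbers here are illustrative only (a 2 ns bit period, arbitrary RC values), but the shape of the result matches his argument: digital is robust up to a cliff, and a long or poorly made cable pushes you toward that cliff.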
#34
Professional Waffler
iTrader: (6)
I'm glad someone explained it properly.
My mate lives and breathes this stuff and works for Pinewood Studios testing DVD/Blu-ray quality - the type of cable you run DOES make a difference to your output quality, and obviously distance multiplies the effect.
#35
Guys, stop trying to outdo each other and answer the very simple question!
The OP needs an HDMI cable. Chances are, as he hasn't said he needs 5000 metres of it, that he just needs a stock 1m cable to connect his DVD/BRD/Sky/Virgin/whatever box to his TV.
Answer is: no, an expensive cable will make fuck all difference over one a tenth of the price. I have two HDMI cables - a Philips branded one from Tesco that cost £6.50 and a freebie HDMI cable a Sky fitter gave me. Both work perfectly.
So make your choice - pay for the name or don't. Cos you ain't paying extra for "better picture quality".
#37
Professional Waffler
iTrader: (6)
Thrush, the Sky cable is wank - I'm fooking blind in comparison to my mate and even I can see the difference. I spent 50 quid on a 1m cable at the weekend - actually make that 100 quid, as I bought 2.
The Sky cable isn't even gold plated - which you want at the very least.
#39
Professional Waffler
iTrader: (6)
Tiff - to be honest I'm probably the worst person to speak to about household stereo/AV equipment, as it's never bothered me - I even run a cheap ten quid eBay special HDMI lead on my PS3.
But downstairs in the lounge we have spent a fair bit of money on the equipment, and it's still not finished as we want a decent set of rear speakers and a sub - but the cable quality does affect it, especially when you remove the shitty Sky HDMI lead.
Personally, you need "HDMI eyes" to see HD - as my mate says - and according to him I don't have them, which I guess is the same for the majority on here.
For example, my mate can spot a tiny pixel flicker in a movie, so he has sharp eyes - me personally, I'd be oblivious to it and never notice.
#40
Chasing Radders
I know what you mean, I've spent a bit on my TV set-up/HD etc....
But there comes a time when you need to just sit and enjoy the film instead of picking fault with the set-up you have... the picture isn't "million pound" perfect, the bass isn't deep enough, etc etc etc.......