Page 3 of 3 FirstFirst 123
Results 21 to 26 of 26
  1. #21
    Join Date
    Aug 2004
    Location
    Essex
    Posts
    23,585
    Tokens
    9,258

    Latest Awards:

    Default

    Quote Originally Posted by Android View Post
    Instead of being ignorant, why don't you read the bit where I said 'I am not talking about the HDMI cable.'

    I repeat for you, 'I am not talking about the HDMI cable'
    To be fair though, the thread is about hooking up an HDMI cable between a Sky HD box and an HD TV, which is probably why they are discussing HDMI cables while you were talking about fibre optics, as far as I can tell at least. What they say about HDMI cables is true, though. I've not heard of fibre optics having much to do with televisions, so I'm in no place to judge, but gold-plated SCARTs, for example, do seem to make a difference. A gold-plated one should never cost more than about £20 depending on length; any more and you're being ripped off (though other factors like cable thickness, noise reduction and length may bump up prices).

  2. #22
    Join Date
    May 2005
    Location
    /etc/passwd
    Posts
    19,110
    Tokens
    1,139

    Latest Awards:

    Default

    The only thing I can think of for this "fibre-optic" he's talking about is audio??

    If it was Virgin or something it would go from the wall to the provided box... but I can't see how spending £150 on a fibre cable would really make any difference; it's still just bouncing light down a long strip of glass/plastic, and the signal is still digital AFAIK.
    Last edited by Recursion; 28-03-2010 at 04:15 PM.
    Quote Originally Posted by Chippiewill View Post
    e-rebel forum moderator
    :8

  3. #23
    Join Date
    Jan 2007
    Posts
    16,195
    Tokens
    3,454

    Latest Awards:

    Default

    Quote Originally Posted by GommeInc View Post
    To be fair though, the thread is about hooking up an HDMI cable between a Sky HD box and an HD TV, which is probably why they are discussing HDMI cables while you were talking about fibre optics, as far as I can tell at least. What they say about HDMI cables is true, though. I've not heard of fibre optics having much to do with televisions, so I'm in no place to judge, but gold-plated SCARTs, for example, do seem to make a difference. A gold-plated one should never cost more than about £20 depending on length; any more and you're being ripped off (though other factors like cable thickness, noise reduction and length may bump up prices).
    I went on about the other cables after he said he cannot see the difference with Sky+HD. I will try to remember to get a pic later.


  4. #24
    Join Date
    Jun 2005
    Posts
    4,795
    Tokens
    0

    Latest Awards:

    Default

    So, what cable are you talking about, since you quoted Recursion's post about HDMI cables and the general topic of this thread is HDMI cables? Also, what cable were you expecting them to hook up to the TV in the electronics shop you said we should go to?

    Even if you are not talking about HDMI cables, any digital transmission works in the same way. It is simple science to understand why a digital signal is not affected by the same problems as an analogue signal.

    Let's take a really simple example. In a digital signal there are two states: on and off (high and low, 0 and 1, whatever you want to call them). An analogue signal can have a theoretically infinite number of states. Expressed on a time/intensity graph you get:


    [Image: analogue signal graph]

    [Image: digital signal graph]

    In the analogue graph, each point represents some piece of information encoded as a certain voltage (it can be other things, but let's keep it simple). The digital graph is either on or off depending on the voltage.

    Let's introduce some interference due to a poor-quality or poorly shielded cable.

    [Image: both signals with noise added]
    As you can see, in the analogue signal we've completely lost the original wave shape and can no longer recover the information that was originally sent: the image has lost quality. The digital signal, despite having significant noise, can clearly be reconstructed, since we only need to decide whether it is on or off, which is possible at every point on the graph (above the dashed line = on, below the dashed line = off). The original digital signal can still be reconstructed perfectly, and it has therefore lost no quality despite being exposed to significant interference.
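    The reconstruction step described above can be sketched in a few lines of Python. This is purely a toy illustration (not how a real HDMI receiver is implemented): bits are sent as two voltage levels, noise is added, and thresholding at the midpoint recovers the original bits exactly.

```python
import random

def transmit_digital(bits, noise_amplitude, high=1.0, low=0.0):
    """Encode bits as high/low voltages and add uniform random noise."""
    random.seed(42)  # reproducible demo
    return [(high if b else low) + random.uniform(-noise_amplitude, noise_amplitude)
            for b in bits]

def reconstruct(voltages, threshold=0.5):
    """Recover bits by comparing each sample to the dashed-line threshold."""
    return [1 if v > threshold else 0 for v in voltages]

original = [1, 0, 1, 1, 0, 0, 1, 0]

# Significant noise, but still less than half the high/low gap:
noisy = transmit_digital(original, noise_amplitude=0.4)
assert reconstruct(noisy) == original  # perfect recovery despite the noise
```

    Once the noise amplitude exceeds half the gap between the two levels, samples can land on the wrong side of the threshold and recovery fails, which is exactly the on/off cliff the post goes on to describe.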

    Now, if the interference is so great that it is no longer possible to differentiate between on and off on the digital graph, then the signal is useless and nothing will be displayed, as no (usable) data is available. So with digital signals you get the original, perfect binary data up to a certain amount of interference. Beyond that threshold, where it becomes impossible to tell on from off, there is no data available at all, zilch.

    However, with analogue signals, unless you have absolutely no interference whatsoever, you are never going to get the same data that was originally transmitted, as the number of different values a point could have been is infinite (rendering it impossible to reconstruct the original signal), therefore causing a loss in signal quality.

    It also applies to both types of signal that if there is not enough strength to get the signal to wherever it needs to go, then there is obviously going to be no signal received.

    So, to summarise:

    • Digital signals can either be differentiated between on and off, resulting in a perfect signal exactly the same as the original, or, if the interference is too great, cannot be differentiated at all, resulting in no data/signal.
    • Analogue signals, unless exposed to zero interference (near impossible), are never the same as the original signal once transmitted, and their perceived quality is determined by the level of interference.


    So what does this mean for video?

    Well, let's start with a digital signal transmitted down a cable. Since we can always reconstruct the original signal up to a certain level of noise (interference), we either get the perfect, exact video, affected only by the quality of the display, or none at all. Regardless of what cable I use, there are only two possible outcomes: the original, perfect signal displaying on my TV, or nothing displaying on my TV. However, in transmissions like Freeview/Sky that are wireless and digital, there are more conditions that can introduce noise into the signal. If there is a prolonged period of interference for whatever reason, you'll completely lose the picture for your TV channel. It is more likely, though, that you'll get short spikes of significant noise, causing a temporary inability to reconstruct the original signal. That produces stuttering video or graphical anomalies, as the decoder for whatever codec the stream is encoded in attempts to decode video despite being starved of the data to do its job.

    Moving on to analogue, and once again starting with signals transmitted along a cable. Since each possible value of an analogue signal represents something different in what we see, then as the signal is exposed to more and more noise along the cable, we move further away from seeing what was originally transmitted (i.e. lower quality). Also, because there are an infinite number of possibilities for the value of a point in an analogue signal, we can't guess what that value could have been. Since I can get different cables with varying levels of quality, it is true that, typically, the more expensive and therefore higher-quality cables result in less noise in the signal, a signal truer to what was originally sent, and therefore a higher-quality image. If I attach a low-quality analogue cable to a TV and then swap it for a high-quality analogue cable, there will be a difference.
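    The contrast above can be mimicked numerically with another toy sketch (illustrative only; the numbers and functions are made up for the demo): the average error of a noisy analogue value grows smoothly with the noise level, while a thresholded digital stream stays bit-perfect as long as the noise is below the on/off threshold.

```python
import random

random.seed(0)  # reproducible demo

def analogue_error(value, noise_amplitude, samples=1000):
    """Average absolute error of a noisy analogue value; the receiver
    cannot tell what part of each sample is signal and what is noise."""
    total = 0.0
    for _ in range(samples):
        received = value + random.uniform(-noise_amplitude, noise_amplitude)
        total += abs(received - value)
    return total / samples

def digital_errors(bits, noise_amplitude, threshold=0.5):
    """Count bit errors after thresholding noisy high/low voltages."""
    errors = 0
    for b in bits:
        received = (1.0 if b else 0.0) + random.uniform(-noise_amplitude, noise_amplitude)
        if (1 if received > threshold else 0) != b:
            errors += 1
    return errors

bits = [1, 0] * 50

for noise in (0.1, 0.3, 0.45):
    # Analogue quality degrades continuously with cable quality (noise level)...
    assert analogue_error(0.7, noise) > 0
    # ...while the digital stream is still bit-perfect below the threshold.
    assert digital_errors(bits, noise) == 0
```

    Swapping a cheap analogue cable for a good one lowers `noise_amplitude` and so lowers the error; swapping digital cables changes nothing as long as both stay under the threshold.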

    End wall of text.

    Quote Originally Posted by Android View Post
    Instead of being ignorant, why don't you read the bit where I said 'I am not talking about the HDMI cable.'

    I repeat for you, 'I am not talking about the HDMI cable'

  5. #25
    Join Date
    Jun 2005
    Location
    /dev/null
    Posts
    4,918
    Tokens
    126

    Latest Awards:

    Default

    The cable does make a tiny difference.

    There are always some bits (as in digital bits) lost in every cable. The devices at each end do error correction to recreate the lost bits. Essentially the signal either works or it doesn't: either there's enough data remaining to do error correction or there isn't. Better cables will lose fewer bits over longer distances, so error correction is more likely to succeed, but if you're getting a "good enough" signal from a cheap cable it should give exactly the same end result as a "good enough" signal from an expensive cable.
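    Error correction of the kind described here can be illustrated with the simplest possible scheme, a 3x repetition code with majority voting. Real digital video links use different, far more elaborate coding, so treat this purely as a toy example of "enough data remaining to correct, or not":

```python
def encode(bits):
    """Repeat every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three: one flipped copy per
    group is corrected; two or more and the vote goes wrong."""
    out = []
    for i in range(0, len(received), 3):
        group = received[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

message = [1, 0, 1, 1]
sent = encode(message)

# A "good enough" cable: one corrupted copy in a group is repaired...
damaged = sent[:]
damaged[0] ^= 1
assert decode(damaged) == message

# ...but past the threshold (two copies of one bit flipped) the data is wrong.
badly_damaged = sent[:]
badly_damaged[0] ^= 1
badly_damaged[1] ^= 1
assert decode(badly_damaged) != message
```

    This is why a cheap cable and an expensive one give identical results while losses stay within what the correction scheme can repair.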

  6. #26
    Join Date
    Oct 2006
    Posts
    12,405
    Tokens
    0

    Latest Awards:

    Default

    Quote Originally Posted by preposterous View Post
    When we got it, the guy who installed it said we'd have to buy our own HDMI cable as they don't provide them. So if you didn't have one already, or buy one, then this could be one of the problems?
    Our Sky+ came with an HDMI cable (this was only about a month ago).

    I notice more of a difference in the audio than in the picture, but there is an improvement in the picture too. Overall, if you are using an HDMI cable, the HD channels will look better, require a higher volume on the TV, and will have a slight delay compared to the non-HD channels.
    Last edited by Black_Apalachi; 29-03-2010 at 03:42 AM.

