What the consumer doesn’t know…

Can’t hurt, right? I beg to differ. This article explores the hidden mazes within HDMI, and electronic communications cabling in general, and sheds light wherever possible.

Rather than diving into deep technical detail, we will take a critical, practical view of the information, and misinformation, so kindly provided to the consumer by salesmen peddling goods with specious arguments in this important arena. The author confesses to being a practicing engineer working on advanced circuits, systems and intellectual property for electronic communications.

So what are these secretive aspects? Have you come across DVD players at $29, or even as low as $19? And shopped around for cables to connect these handy, high-performance devices to your television sets? And wondered why the cables are 10 times as expensive, or at their cheapest, just as expensive as the compact DVD players?

As an engineer, I’ll admit to being shocked when I first came across a DVD player at $19. A Digital Versatile Disc player delivers high-quality real-time video, and packs sophisticated mechatronics, optoelectronics, motors, sensors, lasers, a remote control, high-performance integrated and discrete electronics, and the power conversion and delivery that enables its operation.

All this for $19, to me as an engineer, is magic. But this makes sense; it is the magic of integration and volume production that is the beauty of commerce, benefiting the simple consumer. What doesn’t make sense to me is the cost of the cable to connect such a wondrous device to a television. It is akin to having to pay 10 times what you pay for a soft drink for the straw you use to sip your drink. Would you, kind reader, pay $10 for a plastic straw?

Blogs and online articles [see 2] make admirable attempts to educate the consumer. Following that example, I’ll share this engineer’s insights into HDMI, and electronic communications cables in general.

It all began with USB

Or perhaps with FireWire, the IEEE 1394 standard (an industry-wide standard), which has also been adopted by the High-Definition Audio-Video Network Alliance (HANA) as its standard connection interface for A/V component communications and control.

USB came along about a year later, in 1996, with similar user-friendly features but lower speed. What I want to talk about, though, is the marketing of such cables. Recall the advertisements suggesting that if you bought the ‘better’ USB cable, the print quality out of your printer would be better?

Let’s dissect that argument a bit. These communications standards deal with digital data transmission over serial links: put simply, information is transmitted bit by bit in sequence, at relatively high speeds, and USB started out at 12 million bits per second (12 Mbps). Some of the transmitted bits can be recognized incorrectly at the far end, even though they are black-and-white symbols (digital, binary, true-or-false).

These errors (corresponding to a Bit Error Rate, BER, of the cable) relate to picture quality, and presumably, the ‘better’ cables offered lower BER.
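The idea of a bit error rate can be made concrete with a small simulation. The sketch below is purely illustrative (the BER values are chosen for demonstration, not taken from any cable specification): it flips each transmitted bit with a given probability and counts the errors in a million-bit frame.

```python
import random

def transmit(bits, ber):
    """Simulate a serial link: each bit is flipped with probability `ber`."""
    return [b ^ 1 if random.random() < ber else b for b in bits]

def count_errors(sent, received):
    return sum(s != r for s, r in zip(sent, received))

random.seed(42)
frame = [random.randint(0, 1) for _ in range(1_000_000)]  # a ~1 Mbit frame

# A link at BER 1e-9 almost never corrupts a frame of this size;
# a link at BER 1e-4 corrupts roughly 100 bits of it.
good = count_errors(frame, transmit(frame, 1e-9))
bad = count_errors(frame, transmit(frame, 1e-4))
print(good, bad)
```

The point of the exercise: what separates a ‘good’ link from a ‘bad’ one is not visible in any single bit, only in the statistics of errors over many millions of bits.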

When the next version rolled along as USB 2.0, the standards group settled on 480 million bits per second, a 40X increase in bit rate. More interesting than the puzzling 40X increase was the fact that the very same cables meant to carry 12 Mbps could be employed to carry 480 Mbps without failing the standard!

One would rightly think that increasing the data throughput by 40X should amplify the BER of the so-called ‘worse’ cable. BER is related to signal loss, and signal loss is strongly correlated with data rate and the corresponding frequency content. Figure 1 shows signal loss vs. frequency on a 2-meter, 50-ohm signal interconnect.

Figure 1: Dielectric, skin-effect and cumulative signal loss vs. log frequency (Hz) on a 2 m, 50-ohm line [7]
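The shape of the curves in Figure 1 follows from two well-known scaling laws: skin-effect (conductor) loss grows roughly as the square root of frequency, while dielectric loss grows roughly linearly with frequency. The toy model below illustrates this; the coefficients are hypothetical, chosen only to show the crossover, and do not come from the figure or any real cable datasheet.

```python
import math

# Toy loss model for a short interconnect. The coefficients are
# hypothetical, for illustration only; real values depend on the
# cable's geometry and materials.
K_SKIN = 0.02        # dB per sqrt(GHz)  (hypothetical)
K_DIELECTRIC = 0.05  # dB per GHz        (hypothetical)

def loss_db(f_ghz):
    skin = K_SKIN * math.sqrt(f_ghz)   # skin-effect loss ~ sqrt(f)
    dielectric = K_DIELECTRIC * f_ghz  # dielectric loss  ~ f
    return skin, dielectric, skin + dielectric

for f in (0.01, 0.1, 1.0, 10.0):
    s, d, t = loss_db(f)
    print(f"{f:6.2f} GHz  skin={s:.4f} dB  dielectric={d:.4f} dB  total={t:.4f} dB")
```

At low frequencies the skin-effect term dominates; at high frequencies the dielectric term takes over, which matters later when we examine silver-plated cables.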

Other impairments affecting BER, such as signal reflections from interconnect impedance discontinuities, should also increase BER substantially at the 40X data rate. The fact that the very same cable qualified for 12 Mbps also qualifies for 480 Mbps indicates good quality in the original cable design.

In other words, any cable good enough (per the standard) for 12 Mbps serial data that is also good for 480 Mbps probably lacked nothing in quality in the first place, and a ‘better’ cable couldn’t have given the consumer any discernible added value.

I started to see the picture – the role specious marketing plays – but only in hindsight. Luckily, I hadn’t spent the money on the so-called ‘better’ USB cables!


High Definition Multimedia Interface (HDMI), described nicely in the “Connected” May/Jun 2004 article [3], is an industry development enabling high-definition television and audio. While there is no industry-wide (IEEE) standard for such an interface at this time, many entertainment devices, such as DVD players, set-top boxes, and high-definition televisions, include HDMI as their connectivity interface of choice. And as the Christmas article in the New York Times [1] indicates, DVD players are taking over consumer households and bringing HDMI with them.

HDMI evolved from another interface termed Digital Visual Interface (DVI) developed primarily as a connectivity link between personal computers and digital displays such as LCD displays. DVI and HDMI are both connectivity interfaces promoted by select groups of companies.

There is perhaps no consumer representation in the development of these interfaces, nor any regulation controlling the development of yet another consumer digital serial interface. Companies claim to know what consumers want, which is perhaps why we have HDMI cables costing $199.95 for a 16-foot length.

The lack of industry-wide standardization of such interfaces, and more specifically, the lack of cable/interconnect performance standards for such interfaces creates confusion, and opportunities to profit at the consumer’s expense in the free market economy. Let us look at some interesting aspects, examples and comparisons to learn more.


The DVI specification, DVI 1.0, was created by the Digital Display Working Group (DDWG) in 1999. The document included only transmitter and receiver signal quality requirements, thus indirectly indicating acceptable cable performance.

The signaling method specified effectively eliminated pre-emphasis and de-emphasis, restricting transmitted signal quality while requiring the links to carry data rates up to 1.65 Gbps per channel. In other words, the specification expressly prevented compensating for signal loss through transmitted signal enhancement, while placing the burden of high-data-rate transmission over four channels (red, green, blue and clock) on the cable.

Pre-emphasis has long been a known technique in the radio industry, which also deals with information transmission over distance, yet it was expressly excluded from the DVI 1.0 specification. Perhaps this was an oversight – but the engineer in me knows that digital data transmission (black-and-white symbols) has maximum contrast, i.e. high spectral content, and is therefore greatly prone to degradation by a bandwidth-limited channel such as a relatively long cable.

Prior video cables carried analogue signals that required far less bandwidth, and therefore no special signal enhancement techniques. It makes little sense to transition to digital signaling without considering pre-conditioning schemes for the inevitable loss of high-frequency spectral content along interconnect links expected to run anywhere from 1 to 15 meters in length.
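For readers unfamiliar with pre-emphasis, here is a minimal sketch of the idea, the kind of transmitted-signal conditioning the DVI specification excluded. It is a one-tap filter that boosts transitions (the high-frequency content the cable attenuates most) relative to repeated symbols; the coefficient `alpha` is hypothetical, where a real transmitter would derive it from the channel’s loss.

```python
def pre_emphasize(symbols, alpha=0.25):
    """One-tap pre-emphasis: subtract a fraction of the previous symbol.
    `alpha` is a hypothetical coefficient chosen for illustration."""
    out = []
    prev = 0.0
    for s in symbols:
        out.append(s - alpha * prev)
        prev = s
    return out

# NRZ levels: +1 / -1. Bits that follow a transition come out larger
# (emphasized); repeated bits come out smaller (de-emphasized).
bits = [1, 1, -1, -1, 1, -1]
print(pre_emphasize(bits))
```

After the cable rolls off the high frequencies, the emphasized transitions arrive with roughly the same amplitude as the repeated bits, which is exactly the compensation the DVI signaling method ruled out.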

So how was the consumer affected? A broad variety of cables appeared, spanning a ten-fold range in cost, all claiming conformance to the specification and many claiming to be ‘better’. Cable manufacturers claimed the need for sophisticated materials and manufacturing, driven by a specification that placed the burden on the interconnect.

The hidden HDMI reprieve

With the above realization sinking into the industry, at least among those who studied the specification closely, a correction was quietly included in versions of HDMI. This was done by ‘relaxing signal voltage levels’ at the transmitter, allowing wider fluctuations in the voltage representing the signal. What was termed a signal quality issue (overshoot, undershoot) in the DVI specification has been ‘relaxed’ in the HDMI version.

Cleverly done! Was this an oversight in the DVI specification, or was it intentional?

Interestingly, source (transmitter) termination for higher transmission speeds has been cited as the reason for the relaxed voltage levels. Signal pre-conditioning continues to be excluded from the specification; one wonders why. There are other aspects of the signaling and data transmission architecture defined in the DVI and HDMI specifications that merit investigation for optimality to the desired function.

Cable architecture

Figure 2: Cable assembly architecture for a) HDMI at 1/2 size, and b) Ethernet

Companies have developed and very effectively marketed their coaxial, twin-axial, dual-shielded and triple-shielded cables to the confused DVI/HDMI consumer. Various dire consequences can result from the use of ‘cheap’ cables… or so the simple consumer is educated by such companies.

Figure 2 illustrates an HDMI cable, shown at approximately one-half scale, and Ethernet cables [9] that follow industry standards, shown at approximately actual size. The HDMI cable is substantially thicker and less flexible, presumably because of the shielding architecture incorporated within, which naturally increases the cost of the cable assembly.

But do HDMI cables have to be double- and triple-shielded in this manner, when data is transmitted through wire pairs in a low-swing, differential, low-EMI manner? Or is such cable architecture necessary for the high data rates required for high-definition video transmission?

Studies conducted recently at ComLSI, disclosed in [4, 5], show that Cat-5e cabling can be just as good, from a signal transmission and reception perspective, over very significant lengths (25 m+) as any advanced cable architecture. A prior paper goes further, disclosing SXGA video transmission over 300 meters of Cat-5 cabling [8].

Some companies tout silver linings to the consumer: signal wires coated with silver within the cable. What they tell you is that silver makes the signal flow better. What they do not tell you, referring back to Figure 1, is that silver plating, and only if done with careful control of surface roughness, can reduce the skin-effect component of signal loss, the middle line in the plot of loss versus frequency.

At very high frequencies, though, the dominant loss component is dielectric loss, which is proportional to frequency, whereas skin-effect loss is proportional to the square root of frequency and plays a much smaller role. In short, silver is only marginally helpful, if at all, at the 3.4 Gbps-per-channel rate of HDMI 1.3.
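The scaling laws make the silver argument easy to quantify. The quick check below compares how the two loss mechanisms grow as the signaling frequency rises from an illustrative 0.1 GHz toward roughly 1.7 GHz, the Nyquist frequency for a 3.4 Gbps channel (the frequency pair is an illustrative assumption, not taken from the figure):

```python
import math

# Relative growth of the two loss mechanisms as frequency scales up.
# Skin-effect loss ~ sqrt(f); dielectric loss ~ f.
f_low, f_high = 0.1, 1.7   # GHz; 1.7 GHz ~ Nyquist for 3.4 Gbps per channel

skin_growth = math.sqrt(f_high / f_low)   # ~4.1x
dielectric_growth = f_high / f_low        # ~17x
print(skin_growth, dielectric_growth)
```

Over this range the dielectric term grows roughly four times faster than the skin-effect term, so improving the conductor with silver plating attacks the component of loss that matters progressively less at HDMI 1.3 rates.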

Companies have also developed and recommended ‘Active Cables’ as detailed in the Wikipedia page [10] for HDMI. These include optical fiber based links and dual Cat-5 links for distances of 100m and more.

On the flip side…

Beyond all the fatuous efforts discussed thus far, a recent development goes as far as stuffing sophisticated electronic circuits into the assembly of an HDMI cable. A video of this “innovation” may be viewed at http://www.ceprotv.com/.

This company would have you believe that HDMI-qualified cables from established vendors generate hundreds of pixel errors, which it incorrectly calls ‘pixelation’, continually visible on screen. Sorry: video images with hundreds of consistent pixel errors correspond to an error rate of, at best, 100 in roughly 2 million pixels (1080p resolution), or 50 in 1 million, at least three orders of magnitude worse than the DVI target of 1 error in 1 billion.
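The arithmetic behind that claim is worth laying out. A 1080p frame has 1920 × 1080 ≈ 2.07 million pixels, so 100 errors per frame gives an error rate near 5 × 10⁻⁵ per pixel, which the sketch below compares against the 10⁻⁹ target cited above (the comparison is per-pixel versus per-bit, so it is an upper bound on the mismatch, as each pixel carries roughly 24 bits):

```python
pixels_per_frame = 1920 * 1080   # 1080p: ~2.07 million pixels
claimed_errors = 100             # errors per frame, as marketed

pixel_error_rate = claimed_errors / pixels_per_frame   # ~4.8e-5
spec_ber = 1e-9                  # DVI target: 1 error in 1e9 bits

# Even after dividing by ~24 bits per pixel, the claimed error rate
# remains more than three orders of magnitude worse than the target.
ratio = pixel_error_rate / spec_ber
print(pixel_error_rate, ratio)
```

No cable that actually passed compliance testing should behave this way, which is the point of the paragraph that follows.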

Could a manufacturer have supplied an HDMI-certified cable with this performance, or this absence of quality checking? And no, I do not think that removing sophisticated electronic circuits, easily integrated into chips at either end of a typical HDMI link, and stuffing them into connectors in the cable makes for lower cost, higher reliability, easier use, or better quality! Do you?


The HDMI specification has come a long way from its DVI roots, and now includes necessary circuit techniques, such as source termination and sink equalization, that can effectively compensate for cable inadequacies at HDMI 1.3’s maximum data rates.

There is therefore no need to believe cable vendors touting strange advancements in their cables… simple Cat-5 cables can do the job! In fact, Cat-5 and Cat-6 cables have the standardization and design maturity that ensure optimality for high-speed data transfer.

As a practical matter, such Ethernet cables have even addressed mechanical locking to the system chassis, as well as snag-free operation when pulling cables through, aspects as yet unaddressed by HDMI.

10GBase-T developments [6] indicate the feasibility of HDMI data rates over 100 m of Cat-5e, supporting the argument that HDMI cables should be cheap! Recent cable/system studies conducted at ComLSI [4, 5] support this conclusion. Rather than a DVD player free with an HDMI cable, every HDTV system ought to come with a free HDMI cable!