
Talk:HDMI/Archive 2

From Wikipedia, the free encyclopedia


Specification defines an HDMI cable as having only HDMI connectors on the ends

Adaptor cables contravene the current HDMI spec, and may not be "allowed" to be sold? http://www.techradar.com/news/home-cinema/thousands-of-apple-hdmi-cables-must-be-withdrawn-976455 http://www.pcmag.com/article2/0,2817,2388289,00.asp http://mobile.pcauthority.com.au/Article.aspx?CIID=263280&type=News It looks like old news, but when I was trying to find out why HDMI to DVI cables are currently nearly impossible to obtain (in rural Australia, at least), I could find no useful information anywhere. If anyone is up to date on this, perhaps the subject would be worth a sentence or two? HuwG 203.208.123.81 (talk) 07:43, 29 October 2012 (UTC)

The requirement is that certified HDMI cables must have an HDMI plug connection on both ends of the cable. The press section of the HDMI website gives an explanation of that requirement, and it states that dongles that convert from a different cable type to an HDMI receptacle connection are allowed. While DVI to HDMI cables can't be certified, I still see them for sale on the internet and at retail stores, so I don't think that HDMI Licensing is trying that hard to get rid of them. --GrandDrake (talk) 00:09, 25 May 2013 (UTC)

What is RedMere and how does it work?

I've heard of a technology called RedMere it is supposed to allow "up to 65 feet (20 meters) at the full 10.2 Gbps data throughput" (http://www.monoprice.com/products/product.asp?c_id=102&cp_id=10240&cs_id=1025501&p_id=9167&seq=1&format=2). What is it? How does it work? Does anyone know? — Preceding unsigned comment added by 68.228.41.185 (talk) 22:01, 19 December 2012 (UTC)

RedMere HDMI cables use active amplification, which allows them to have much longer cable lengths than passive HDMI cables. Active HDMI cables contain a small chip that boosts the HDMI signal. Active HDMI cables are usually more expensive than passive HDMI cables, but are useful if you need an HDMI cable longer than 5 meters. --GrandDrake (talk) 00:14, 25 May 2013 (UTC)

Critical omission in not mentioning severe limitations

—Headline altered from original "Critial" to "Critical" by 70.66.87.2 on 02:56, 3 July 2015 (UTC). This note made by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)

This article comes off as a sales piece, since it is essentially silent on what makes HDMI so costly and difficult for consumers to use, and practically impossible to use over many of the distances needed in typical home use. For example, HDMI is crippled in its ability to transmit over anything longer than just a few meters because it uses a parallel signal transmission method, instead of the serial communications used in every other modern transmission protocol.

Until this article addresses all such significant consumer issues, I will continue to assume that firms with a financial interest in HDMI will continue to control the article content. — Preceding unsigned comment added by 67.171.190.119 (talk) 23:23, 20 May 2013 (UTC)

The limits of HDMI are due to the high data rate that it uses combined with the low cost signalling method. HDMI at full bandwidth works with passive cables at up to about 5 meters in length. DisplayPort has a higher data rate but at full bandwidth works with passive cables that are up to about 3 meters in length. Neither standard uses serial data transmission since that would either reduce the data rate or increase the price. --GrandDrake (talk) 04:44, 24 May 2013 (UTC)
You may want to have a word with the manufacturer of the cable that I use to connect my PC's HDMI port to my AV unit in my living room. The cable is 20 metres long. It seems that both the manufacturer and the cable are unaware of this limitation. DieSwartzPunkt (talk) 08:53, 24 April 2014 (UTC)
The maximum HDMI cable length depends on bandwidth, cable quality, and whether the cable includes a signal amplifier. --GrandDrake (talk) 19:17, 27 April 2014 (UTC)
I thought the same thing (article is a sales piece). The critical omissions are how much HDMI is prone to power surges, and bad image quality with the smallest interference around. Many articles around the web about those two points. Atriel (talk) 03:23, 19 March 2015 (UTC) —Timestamp fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)
Have to agree with the above, in that whenever I've tried to use HDMI to interface a device with a display there have always been problems. Like so many new designs, it's all features and no functionality. Issues range from loss of display resolution each time the lead is reconnected, to no signal at all. Also, you find nonstandard sub-miniature sockets that are misleadingly labelled 'HDMI', but after buying a costly HDMI lead you find it will not fit the 'HDMI' socket. DVI connections almost always work first time; HDMI is a disaster. Maybe reliability reports are outside the scope of WP (not sure), but then this article makes it sound far better than it is. --Anteaus (talk) 09:04, 30 May 2015 (UTC)

My hope was to learn some technical details of the specification. It appears, but is not 100% clear from this article, that to even read the specification you have to be a member of the forum ($15 grand sign up?!). It would be helpful if this was spelled out up front one way or the other, so that people didn't waste their time looking for information that does not exist. — Preceding unsigned comment added by 184.71.25.218 (talk) 21:20, 23 October 2013 (UTC)

Error in version comparison table (HDMI 2.0 refresh rates)

I am pretty sure that there is an error in the table listing the maximum resolution for HDMI 2.0 at different color depths. Specifically, I think anything more than 24 bits per pixel cannot be shown at 60 frames per second at 4K resolutions (e.g. 4096×2160p60 at 48 bits per pixel would take 4096*2160*60*48/1e9 = 25.5 Gbit/s excluding overhead, well in excess of what HDMI 2.0 can deliver).

News articles I have seen have said that 8 bit color (24 bpp) can be shown at 4K and 60 Hz, and 48 bit color can be shown at 4K (but with an unspecified refresh rate). (e.g. see http://www.theregister.co.uk/2013/09/04/hdmi_20_spec_published/ )

I am not knowledgeable enough about this topic to make the appropriate changes, and it might be that more info will be forthcoming in the next few days. — Preceding unsigned comment added by 67.82.65.131 (talk) 00:51, 5 September 2013 (UTC)
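The bandwidth arithmetic in the comment above can be checked with a short script. The 4096×2160, 60 Hz, and 24/48 bpp figures are taken from the comment itself; the sketch simply multiplies them out, ignoring blanking intervals and encoding overhead:

```python
# Raw (pre-blanking, pre-encoding) video data rate, a minimal sketch of
# the arithmetic used in the comment above.

def data_rate_gbps(h, v, fps, bpp):
    """Uncompressed video data rate in Gbit/s, ignoring blanking."""
    return h * v * fps * bpp / 1e9

rate_48bpp = data_rate_gbps(4096, 2160, 60, 48)  # 16 bpc deep colour
rate_24bpp = data_rate_gbps(4096, 2160, 60, 24)  # ordinary 8 bpc

print(f"{rate_48bpp:.1f} Gbit/s")  # 25.5 -- exceeds HDMI 2.0's 18 Gbit/s
print(f"{rate_24bpp:.1f} Gbit/s")  # 12.7 -- fits within HDMI 2.0
```

This matches the comment's conclusion: 4K 60 Hz at 24 bpp fits within HDMI 2.0, but 48 bpp does not.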

I removed some unverifiable information from the table. --GrandDrake (talk) 01:14, 5 September 2013 (UTC)

Will HDMI 1.4 cables work with HDMI 2.0 devices?

Will HDMI 1.4 cables work with HDMI 2.0 devices? Thanks — Preceding unsigned comment added by 71.131.3.194 (talk) 02:57, 28 September 2013 (UTC)

This isn't that clear. You simply cannot put 600 MHz where before only 300 MHz would work. However, you may not need to. Tafinho (talk) 21:42, 19 June 2014 (UTC) —Timestamp fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)

HDMI 1.4 and CEA-861-D vs CEA-861-E

The article says HDMI 1.4 uses the CEA-861-E video standard while referencing a SiI9389 datasheet that mentions HDMI 1.4 and CEA-861-E compliance. However, I'm reading the actual HDMI 1.4b specification and it makes no mention of CEA-861-E, but talks about CEA-861-D... Anssi (talk) 00:02, 31 October 2013 (UTC)

It requires support for all formats defined in CEA-861-D, and requires support for some but not all formats listed in CEA-861-E. The specific formats are listed in section 6.3 of the HDMI 1.4/a/b Specification, headed with "Detailed timing is found in CEA-861-D or a later version of CEA-861 for the following video format timings:". Since some of the formats in the list that follows (such as 1080p 120 Hz in sec. 6.3.2) do not have timings defined in CEA-861-D, only in CEA-861-E and later, it does implicitly require the CEA-861-E standard to be referenced even though it is not mentioned by name. I suspect this line was simply overlooked when modified from the HDMI 1.3a document, and probably should have said "...found in CEA-861-E or later" explicitly, but it still works as-is. GlenwingKyros (talk) 16:54, 19 August 2017 (UTC)

Accessibility

This article needs to address accessibility of the HDMI standards. It seems like audio description for the blind can be included on one of the audio channels.

The HDMI standard does not include communication access, though, for people with hearing loss or non-native speakers of the primary language (no streams for closed captioning data, whether for deaf and hard of hearing viewers or anime/foreign language), so in order to use HDMI with captions, the originating equipment must decode the data and pass the generated pictures of text to the end equipment. This limits accessibility for persons with hearing loss, especially those with visual impairments.

Televisions in the United States are required to have closed caption decoder chips, and the visibility can be changed to the user's liking for over-the-air or video/audio inputs, but this does not work for HDMI inputs since they do not pass the caption data through. This limits the ease with which physically disabled persons can use captions, since they cannot just turn their television caption decoder on once and be done; they must turn on captions on each piece of sending equipment when using HDMI. This makes it harder for children and the aged to gain ease of use for captions. It also increases costs for everyone, as with HDMI all of the sending and receiving units must have decoder chips (receiving units/televisions for over-the-air caption data). — Preceding unsigned comment added by 2602:30A:C049:EA80:4885:EA70:59A9:82AB (talk) 18:30, 9 June 2014 (UTC)

As far as I know it, this isn't part of HDMI. HDMI moves the signal from source to destination. Audio signals will be supplied by the source, selected from those available. For example, a DVD player will select from the available audio tracks and send one out. I believe CC is also selected and decoded inside a DVD player, and the resultant video sent out. CC decoders in television sets will decode ATSC (or NTSC) closed-caption signals. An external tuner, connected through HDMI, will also do that. Gah4 (talk) 22:09, 25 February 2021 (UTC)

600 MHz source and existing cables compatibility

There is no source for the 600 MHz TMDS frequency value. Also, there is no independent source for the compatibility of existing 1.4 cat 2 cables, apart from the HDMI Forum. Tafinho (talk) 22:03, 19 June 2014 (UTC) —Timestamp fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)

Several sources mention 18 Gbps for HDMI 2.0 so based on how HDMI works that would mean a TMDS clock rate of 600 MHz. I have added two references for the TMDS clock rate. As for compatibility with older cables it would depend on the cable but almost all HDMI cables are made out of copper cable. As long as the bits are correctly transferred over the copper cable it doesn't matter if the cable was designed for it. --GrandDrake (talk) 21:25, 28 June 2014 (UTC) —Alterations made 21:31 (UTC). Timestamp fixed and comment made by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)
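The derivation in the reply above can be sketched in a few lines. The 3-lane, 10-bits-per-TMDS-character figures are standard HDMI parameters; the 1/4 clock-to-character ratio above 340 Mcsc comes from later comments in this archive:

```python
# TMDS arithmetic for HDMI 2.0, following the figures in this thread:
# 18 Gbit/s aggregate over 3 data lanes, 10-bit TMDS characters.

TOTAL_BPS = 18e9
LANES = 3
BITS_PER_CHAR = 10

per_lane = TOTAL_BPS / LANES            # 6 Gbit/s per lane
char_rate = per_lane / BITS_PER_CHAR    # 600 Mcsc (characters per second)

# Above 340 Mcsc, HDMI 2.0 runs the clock line at 1/4 the character
# rate (as noted later in this archive).
clock = char_rate if char_rate <= 340e6 else char_rate / 4

print(char_rate / 1e6, "Mcsc,", clock / 1e6, "MHz")  # 600.0 Mcsc, 150.0 MHz
```

So the "600 MHz" figure is the per-lane character rate; the physical clock line runs slower.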

To this topic: the article says "Category 2-certified cables, which have been tested at 340 MHz" and that HDMI 2.0 introduces 600 MHz. So how should one go about purchasing a cable? Blindly test many until one works? Is there an unofficial label used by cable manufacturers? --193.169.48.48 (talk) 13:51, 19 February 2015 (UTC) (the above was by me)--Xerces8 (talk) 09:44, 22 February 2015 (UTC)

The HDMI FAQ explicitly states that "HDMI 2.0 specification defined a new, more efficient signaling method, for speeds above 1.4b limits (10.2Gbps), to allow higher bandwidths (up to 18Gbps) over existing High Speed HDMI Wire Cables." and (further down on the page) "HDMI 2.0 features will work with existing HDMI cables. Higher bandwidth features, such as 4K@50/60 (2160p) video formats, will require existing High Speed HDMI cables (Category 2 cables)."
In other words, "premium high speed" is the same category-2 ("high speed") cable that HDMI 1.4 uses. The name is for a marketing/certification program, not a new cable specification. HDMI 2.0 is not sending 600MHz but is encoding more data (18Gbps) onto the same frequencies (up to 340MHz).
Category-3 cable is the "Ultra High Speed" cable introduced as a part of HDMI 2.1:
Q: What is an Ultra High Speed HDMI Cable?
A: The Ultra High Speed HDMI Cable is the first cable defined by the HDMI Forum. Ultra High Speed HDMI Cables comply with stringent specifications designed to ensure support for high resolution video modes such as 4Kp50/60/100/120 and 8Kp50/60 as well as new features such as eARC and VRR. Ultra High Speed HDMI Cables exceed the requirements of the latest international EMI standards to significantly reduce the probability of interference with wireless services such as Wi-Fi.
Q: Is the Ultra High Speed HDMI Cable a Category 3 cable?
A: Yes
Shamino (talk) 17:57, 26 December 2017 (UTC)
I also found that the HDMI 2.0 spec states that the TMDS clock rate is 1/4 the TMDS character rate for character rates greater than 340 Mcsc (they are equal for rates of 340 Mcsc or less). This is how they pack 18 Gbps onto a 340 MHz TMDS clock. Shamino (talk) 18:40, 26 December 2017 (UTC)
That is only a technicality, as the clock signal does not carry the data; that's not where the 18 Gbit/s resides. The data channels still operate at 6 GHz as opposed to 3.4 GHz, which creates compatibility concerns, and whether the clock rate runs at 600 MHz or 150 MHz only changes whether the clock-to-data signal ratio is 10:1 or 40:1. Premium High Speed cables are certified at 18 Gbit/s; High Speed cables are only tested at 10.2 Gbit/s. They are different certifications. The HDMI FAQ stating that all certified High Speed cables will work at 18 Gbit/s was written before the Premium High Speed cert existed, and was found to be wrong (that's why the Premium High Speed cert had to be created in the first place).
http://www.bluejeanscable.com/articles/note-about-hdmi-2.htm
http://www.bluejeanscable.com/articles/premium-hdmi-cable.htm
GlenwingKyros (talk) 19:46, 29 December 2017 (UTC)

HDMI 2.0 references

Back when I added a section about HDMI 2.0, I added some references to convince everyone it was real. But since equipment with HDMI 2.0 is actually on the market right now, do we really need four references per statement in the HDMI 2.0 section? PizzaMan (talk) 18:40, 16 July 2014 (UTC)

There are only two secondary sources. I have marked primary sources and raw press releases. Perhaps those should be removed. ~KvnG 03:07, 23 July 2014 (UTC)
Sorry for the late response, but I agree, so I went ahead and removed them. This article has a *lot* of sources. Seems unnecessary to me. It's not like the contents of this article will likely come under such debate that they require multiple refs for each sentence. PizzaMan (♨♨) 22:20, 8 December 2014 (UTC)
On a sidenote: is it too lazy of me to rely on bots for handling orphaned references or, for example, dating a [citation needed] tag? PizzaMan (♨♨) 11:12, 9 December 2014 (UTC)

1536kHz Audio

Although the press release indicates a sampling rate of 1536 kHz, this must be an error. The highest sampling rate I've ever heard of is 192 kHz, and that's "audiophile" crazy-high quality that's 4x as high as the human ear can even deal with. 1536 kHz is an order of magnitude higher than that! — Preceding unsigned comment added by 199.20.36.1 (talk) 16:53, 11 August 2014 (UTC)

That's 7.1 channels of 192 kHz each. PizzaMan (♨♨) 22:21, 8 December 2014 (UTC)
Then it's 1536 kHz "symbol rate", not "sample rate" - which we don't really add up for separate signals. A clear case of carefully crafted "marketese language" from the HDMI Association. --128.68.48.32 (talk) 23:05, 13 January 2016 (UTC)
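The arithmetic behind the reply above is just channel count times per-channel rate ("7.1" audio means 8 discrete channels), so the 1536 kHz figure is an aggregate, not a per-channel sampling rate:

```python
# Aggregate sample rate for 7.1 audio at 192 kHz per channel,
# as explained in the reply above.
channels = 8                 # "7.1" = 8 discrete channels
rate_per_channel_khz = 192   # per-channel sampling rate
aggregate = channels * rate_per_channel_khz
print(aggregate, "kHz")  # 1536 kHz
```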

Connector inconsistency

In the lead it says "all use same cable and connector."

Later in connectors all the different connectors are described.

Surely both can't be true :-)

Rpm13 (talk) 12:00, 5 January 2015 (UTC)

Type A receptacle should be female?

In the box at the top right of the article it shows an HDMI pin-out with the label "Type A receptacle HDMI (male)" connector. Shouldn't that be "Type A receptacle HDMI (female)"? — Preceding unsigned comment added by 71.81.180.160 (talk) 20:08, 19 January 2015 (UTC)

Royalties?

As perfectly acceptable HDMI cables can be obtained from Pound Shops, it would appear that the manufacturers of these cables are not paying any royalties. Should I be concerned about this? — Preceding unsigned comment added by 89.243.167.3 (talk) 22:20, 2 March 2015 (UTC)

CEC

In the article: "CEC wiring is mandatory, although implementation of CEC in a product is optional.[49]"

In the HDMI 1.3 specification: "HDMI includes three separate communications channels: TMDS, DDC, and the optional CEC. "

Some cables don't have a CEC wire. — Preceding unsigned comment added by Vagonoboev (talkcontribs) 07:43, 8 April 2015 (UTC)

You are referencing Section 8 of the HDMI specification, which is not defining cable specifications. As defined in Section 4 (physical layer and cable specification), the physical CEC wire is not optional. The implementation of the CEC channel in a product is optional (i.e. not all products are required to make use of the CEC wire, though all cables have it). GlenwingKyros (talk) 17:00, 19 August 2017 (UTC)

Expanded table

The feature table counts 6 columns, explaining differences between consecutive HDMI versions, but does not distinguish 1.4 from 1.4a or 2.0 from 2.0a. Shouldn't this table be expanded to, say, 14 columns? The Seventh Taylor (talk) 17:52, 26 May 2015 (UTC)

Cost

> HDMI manufacturers pay an annual fee of US$10,000 plus a royalty rate of $0.15 per unit, reduced to $0.05 if the HDMI logo is used, and further reduced to $0.04 if HDCP is also implemented.

no one should pay asia/europe a dime for going lone star unless they first confirm "HDMI" is making payments to VGA, DVI, VESA et al., who they copied the bulk of their work from

Making several incompatible connectors and versions: all they have is a cable style and a suggested use, which is no better or worse than DVI-D beyond that of designating certain uses via "packet headers", such as when data will be considered audio — Preceding unsigned comment added by 72.209.223.190 (talk) 18:19, 16 July 2015 (UTC)

History of Digital Audio/Video

I removed a section which was added by an unknown user, which was completely misplaced in the article, containing no references at all, and was of overall low quality. Soulhack (talk) 10:12, 21 July 2015 (UTC)

Hello fellow Wikipedians,

I have just added archive links to 2 external links on HDMI. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 22:39, 6 January 2016 (UTC)

What connector for 2.0 and 2.0a cable?

The subtitle of this section is self-explanatory. The 2.0 cable has existed (for customers anyway) since 2013, but in the section on connectors nothing is said about which connector it will connect to. 145.132.75.218 (talk) 21:13, 15 January 2016 (UTC)

HDMI versions and connectors are not related, any HDMI version may be transmitted over any connector type. GlenwingKyros (talk) 17:02, 19 August 2017 (UTC)

Hello fellow Wikipedians,

I have just added archive links to one external link on HDMI. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 13:46, 9 February 2016 (UTC)

Consoles

Do some consoles use HDMI? — Preceding unsigned comment added by 172.56.41.196 (talk) 15:47, 19 February 2016 (UTC)

What kind of console? Misty MH (talk) 11:07, 27 July 2016 (UTC)

input/output bi-directional usage

What "reverse channel" features does HDMI offer? Can an HDMI display also include microphone audio input to a computer, for example? Can the display send back touch/multitouch data via HDMI? If such is possible, what are most common current examples and standards?-71.174.184.36 (talk) 13:43, 16 March 2016 (UTC)

Seems 2.0b is launched, but what's going on with 2.1?

Nvidia GP104/GeForce 1080 launch sheet data (click VIEW FULL SPEC) shows support for HDMI 2.0b with DP 1.4, and the 2.0b page from HDMI LLC says "Bandwidth up to 18Gbps". So, is 2.0b the same as the (former) 2.1? --Hong620 (talk) 05:15, 7 May 2016 (UTC)

good question, what is going on here? — Preceding unsigned comment added by 193.107.156.62 (talk) 13:43, 6 June 2016 (UTC)

Official 2.0b?

If there is an official HDMI 2.0b, can someone with access and savvy add the features that differentiate it to the article? Thank you! (HDMI 2.0b) Misty MH (talk) 11:06, 27 July 2016 (UTC)

USB / HDMI Alt Mode

There is a new USB Type-C to HDMI spec. Although it's "not backwards compatible with existing devices" and "only supports the older HDMI 1.4b standard" so "4K (UHD) video will work, but only at 30FPS" 40.129.236.30 (talk) 06:32, 6 September 2016 (UTC)

Hello fellow Wikipedians,

I have just modified 6 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 08:12, 10 November 2016 (UTC)

Digital or analog?

Among the very first words in the lead is "is a proprietary audio/video interface", without even a mention of whether it's a digital or analog interface. While I _think_ it's a digital interface, suggesting "is a proprietary digital"... would be correct, I don't know. I'd appreciate it if someone knowing HDMI could fix the lead. 83.252.234.134 (talk) 08:02, 2 December 2016 (UTC)

HDMI 2.1 packetization and embedded clock

Regarding, "The sources suggest that the new mode isn't really TMDS any more; we will need to wait until the official specification to clarify the details. (The packet-based format is to facilitate the embedded clock, it's not a separate point.)" [1] by C0nanPayne (talk):

I guess by 'packetization' you are mentioning these references:

Reference 127, "HDTV Expert - HDMI 2.1: The Need For Speed Continues". hdtvmagazine.com: "But HDMI 2.1 adds another lane for TMDS data (although it’s not really TMDS anymore) by taking over the clock lane and embedding clock data within the existing signal, much the same way it’s done with packet-based signaling systems."

Reference 128, "HDMI 2.1 To Bring Robust Home Theater Experience". hdguru.com: "The connector includes three twisted pairs and a clock – which translates to four twisted pairs but sending basically RGB or Y and Cb and Cr. HDMI 2.1 can be run in inverted clock mode, which uses all four lanes and is packetized – This is said to be similar to though not the same as DisplayPort."

DisplayPort and MHL use certain techniques to derive the data clock from the data stream itself. They don't use a separate clock lane (except in MHL 1.3); there are only data lanes. Numerous digital coding methods -- on data -- enable them to do this.

Until HDMI 2.0, each of the 3 data lanes carried one of the 3 colour components (e.g. Blue on the TMDS D0 channel) and this was a fixed mapping. But when you have fewer than 3 data lanes or more than 3 data lanes, the fixed mapping cannot be used any more and data has to be sent as fragments over the lanes. They use packetization to flexibly partition and distribute the AV data over the data lanes. This kind of packetization is used in MHL/superMHL and DisplayPort.

They probably use both embedded clock and packetization in HDMI 2.1. But embedding the clock signal in the data lanes is a separate point from the packetized structure of the data stream.

louisnells (talk) 05:32, 12 February 2017 (UTC)

My point was to aid the reader in understanding that the packet-based format is how the clock is able to be embedded (in addition to being a means to distribute the three streams over a non equal number of lanes). Otherwise the reader is left with the impression that packetization is just another unrelated change, something like 16b/18b. (Both references talk about the embedded clock in the context of packetization.)
Given that DisplayPort explicitly makes use of data packets (called micro packets) to embed the clock signal within the data stream, how does making it a standalone point help explain it? C0nanPayne (talk) 17:27, 12 February 2017 (UTC)
Packetization does not always imply embedding of the clock signal. For example, with superMHL there are 6 data lanes and they use packetization, but they still have a separate clock line (they use eCBUS as the clock line). Packetization and embedded clock are just two separate techniques; one does not imply the other. Packetization works at the data level and embedding of the clock signal works at the line coding level (like they are in 2 different layers).
louisnells (talk) 08:30, 13 February 2017 (UTC)

Specifications

The link to HDMI 1.3a Specifications is redirected to Uncompressed video. 64.47.214.68 (talk) 08:24, 7 March 2017 (UTC)

HDMI Alternate Mode Support

It seems that HDMI is listed along with USB-C as one of the supported display interfaces for the Qualcomm Snapdragon 835. HDMI was supported on many of the earlier Qualcomm chips. Also, Qualcomm recently officially confirmed support for USB-C DisplayPort Alt Mode on the same chip. Doesn't this mean USB-C HDMI Alt Mode is also supported? Does anybody have any more info on this?

File:Qualcomm Snapdragon 835 HDMI Alt Mode.png
Qualcomm Snapdragon 835 HDMI Alt Mode

louisnells (talk) 09:07, 6 June 2017 (UTC)

Hello fellow Wikipedians,

I have just modified 6 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 22:06, 18 June 2017 (UTC)

Table on "Refresh Frequency Limits at Various Resolutions"

In the table is the column "Data Rate Required[a]", but with no explanation of how this was calculated. Second, it was calculated for 8 bpc (24 bit/px), but the future standard will be HDR, which has 10 bpc. Could somebody correct this or add a second column? Thanks! -- 84.157.27.64 (talk) 18:46, 1 September 2017 (UTC)

Yesterday I reverted a change related to refresh rates, as HDMI 1.0 allows only specific rates, and so those should be stated. But data rates are a different question. Seems to me that in any case including compression, it isn't easy to give a meaningful data rate. Data rate is supposed to be an information rate, but is it before or after compression? More important is the required bandwidth for the signals, but even that isn't easy to define. Gah4 (talk) 18:58, 1 September 2017 (UTC)
More specifically, HDMI uses TMDS. That is only useful if, in fact, the bits are not random, such that the information rate is lower than the data rate. Specifying the clock rate for the TMDS signal might be useful, though, but that is somewhat different from the data rate. Gah4 (talk) 19:05, 1 September 2017 (UTC)
Calculation of datarate (bits per second) is simply the number of bits per pixel (24 in this case) times the number of pixels per second (which is number of pixels per frame times the number of frames per second). This must also include blank pixels padded around the image (or in actuality, pauses between sending pixel data between each frame and line, during which audio and control signals are sent, which is accounted for by pretending there are extra pixels which correspond to the amount of time spent on this "auxiliary data" period; these are referred to as "blanking intervals"). Standard video is uncompressed so there is no "before or after compression". HDMI 2.1 does introduce a compression option but since these data rates span all versions it should be clear that they do not have compression applied. I will edit the footnote to make this clear when I get home though.
As noted, the size of the blanking intervals used for calculating the numbers in the table are determined by CVT-R2, which is defined in the VESA CVT 1.2 standard, which is publicly available for download on the VESA website (on the Free Standards page). Though for simplicity you can use the calculator here, which can walk you through all the calculations step by step:
https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=1920&V=1080&F=60&calculations=show&formulas=show
It is not technically exact since it does not account for pixel clock rounding, but that's well below the precision level used on the table (10⁴). The formula shown on the page is also somewhat simplified although internally it uses the more complex exact formula for the actual results.
Data rate is used here and not bandwidth, since bandwidth requirements depend on the interface; different interfaces may need different amounts of bandwidth to send the same data. It would not make sense to list the bandwidth requirements since HDMI 1.0–2.0 use 8b/10b and HDMI 2.1 uses 16b/18b, so you would need two different bandwidth numbers for every format. It makes more sense to simply list the data rate, which can be compared to each interface's data rate (bits of data per second, which excludes bits used for encoding purposes, as they don't represent part of the data set being transmitted) as listed in the table header, and in more detail in the table above.
I am adding another table for HDR formats soon, but am out of town at the moment. I believe putting them both in the same table would make it more difficult for readers to find the information they are looking for, so a separate table should be used in my opinion. GlenwingKyros (talk) 03:17, 3 September 2017 (UTC)
Also, higher resolutions are achieved using 4:2:2 chroma subsampling, as some initial UHD TVs used, and it can also be used to support 4K HDR10. Tafinho (talk) 16:10, 19 September 2017 (UTC)
Yes, although as mentioned in the footnotes already the table is only considering 4:4:4 formats. If we are going to include subsampling then we may as well also start saying HDMI 1.4 supports 4K 60 Hz in the previous table (because of YCbCr 4:2:0) etc., but this I think would only lead to confusion. GlenwingKyros (talk) 18:16, 19 September 2017 (UTC)
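For reference, the effect of chroma subsampling on these numbers can be sketched as follows. The 4K CVT-R2 totals (3920 × 2222) and the 8.16 Gbit/s HDMI 1.4 maximum data rate (10.2 Gbit/s bandwidth less 8b/10b overhead) are assumed values used only for illustration:

```python
# Effective bits per pixel at 8 bpc under chroma subsampling
bpp = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

# 3840x2160 @ 60 Hz; CVT-R2 totals assumed as 3920 x 2222
rates = {fmt: 3920 * 2222 * 60 * b / 1e9 for fmt, b in bpp.items()}

# Compare against an assumed HDMI 1.4 max data rate of 8.16 Gbit/s
fits_hdmi_14 = {fmt: r <= 8.16 for fmt, r in rates.items()}
```

Under these assumptions only the 4:2:0 format fits within HDMI 1.4, which is the "HDMI 1.4 supports 4K 60 Hz" caveat mentioned above.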

—Indents fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)

HEC verification

I have marked verification issues in HDMI#HEC. See further discussion at Talk:Ethernet_physical_layer#HDMI_Ethernet_Channel.3F. ~Kvng (talk) 15:39, 17 September 2017 (UTC)

HDMI is being superseded by DisplayPort ?

HDMI is owned and maintained by the HDMI Founders and the HDMI Forum together, and neither of these groups has ever agreed with VESA (which maintains DisplayPort) to obsolete HDMI, nor on it being superseded by DisplayPort. Does anyone have any official info on this?

HDMI and DisplayPort are rather two competing display interface standards, with each one going ahead with its own road map. louisnells (talk) 16:53, 2 October 2017 (UTC)

"HDMI is the de-facto connection in the home theatre and is used widely on HDTVs as an A/V interface. Some PCs and monitors include HDMI to enable connectivity with HDTVs and other consumer electronics gear. While DisplayPort has a rich consumer electronics feature set, it is expected to complement and not necessarily replace, HDMI. DisplayPort is focused on PC, monitor, and projector usages as a replacement for DVI and VGA where high performance and backwards and forwards compatibility over standard cables are valued." https://www.displayport.org/faq/faq-archive/ GlenwingKyros (talk) 17:57, 2 October 2017 (UTC)
I think I agree that HDMI is not being superseded by DisplayPort, but then again, am I sure that HDMI has superseded DVI and VGA? Did they ask/tell the DVI and VGA people about that? How far along the superseding does it need to be, before we state it here? Gah4 (talk) 19:49, 2 October 2017 (UTC)
Further development of VGA and DVI has ceased, and efforts have moved on to HDMI and DisplayPort. So, HDMI and DisplayPort supersede DVI and VGA, because active development has switched from the latter to the former. DisplayPort does not supersede HDMI, because HDMI development is still ongoing (HDMI 2.1). Likewise, HDMI does not supersede DisplayPort, because development of future DP versions is still ongoing. They are instead competing standards, rather than one superseding the other. It seems as though someone thinks "supersedes" means "is newer and better" though. Overall I would say "supersede" is not really the right word here anyway, as that really should be reserved for successions of one's own standards/products (i.e. HDMI 1.4 supersedes HDMI 1.3). HDMI/DisplayPort "replace" DVI/VGA is probably a better wording; "supersedes" has a more particular usage, generally speaking. GlenwingKyros (talk) 20:46, 2 October 2017 (UTC)—Original post, but with a strike, reinstated by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)
(Edited) Now that I think about it, "supersede" isn't really an applicable term for any of these situations, since the usage of the term means for one thing to have the "final say" over the other; for example HDMI 1.4 supersedes HDMI 1.3, meaning that for any differences between them, whatever is written in the HDMI 1.4 spec takes precedence over 1.3. Neither DisplayPort nor HDMI can supersede the other, because they don't cover the same domain; the DisplayPort standard does not define how to create HDMI devices, and the HDMI specification does not define how to create DisplayPort devices, so neither of them has any material which can supersede what the other says. I think the word people are looking for is "HDMI/DP is the successor to DVI/VGA" or whatever. As for DP being the successor to HDMI, I do not agree with this either, because development of HDMI has not been abandoned in favor of DP, or vice versa; both of them are still actively publishing new revisions. Contrast that with DVI, whose creating organization has disbanded, or VGA, which is certainly not going to have new revisions published any time soon either. These have been succeeded by HDMI and DP. GlenwingKyros (talk) 07:59, 3 October 2017 (UTC) —Timestamp fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)
When you look at it, competing standards are not much different from competing companies: they may either more or less co-exist or eventually one wins over the other, pushing it from the market. Also, a standard may be improved to a new functionality, obsoleting the former version. Only the latter two can be called supersession. Even though HDMI may have lost share in the computer market it's still competing and it's very strong in the media market and very unlikely to be pushed from the market. --Zac67 (talk) 09:15, 3 October 2017 (UTC)
Many of the technologies (e.g. TMDS) which made DVI/HDMI/MHL possible came from -- and are even owned by -- Silicon Image Inc. They were also part of the DVI working group (DDWG). Even HDMI had electrical compatibility with DVI defined as part of its core specification -- both are TMDS based anyway. So, probably they were OK with HDMI superseding DVI.
But I'm not sure of the current ownership of VGA, or whether it has been formally superseded. In any case the technology is essentially obsolete, and newer standards satisfy the newer display interface requirements. louisnells (talk) 17:21, 4 October 2017 (UTC)
Seems to me that once an improved standard (superseder or successor) starts to gain market share, it will be used for new sources. Sinks will start using it, but also stay compatible with previous systems while those are still around. Also, the older ones stay around a while for cases that don't need the advantages of the new one. Low resolution video often still uses composite video (for TV inputs) and VGA is still common for video projectors, such as used by seminar presentations. (The latter also because of the complications of long distances for HDMI.) For all these reasons, old technology stays around for a long time. Gah4 (talk) 18:08, 4 October 2017 (UTC)

Hello fellow Wikipedians,

I have just modified 12 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 06:02, 27 October 2017 (UTC)

Maximum Limits Tables

I have reverted edits which added a second column of data rates for CEA-861 timings and removed some formats.

The purpose of the maximum limits table is to list the maximum that HDMI is capable of, within the best of our ability to estimate (since timings of course can be customized by the manufacturer to an extent which differs with every display, so no exact numbers are possible). CVT-R2, which is the lowest overhead standardized timing formula, gives the closest estimate of what HDMI is capable of. The point of the data rate is to give a general idea of how close it comes to the maximum data rate of each HDMI version. They aren't listed simply for the sake of informing people about data rates, so listing additional data rates with larger timings like CEA-861 serves no purpose and only adds confusion.

30 Hz format listings are relevant for computer monitors to inform people that HDMI version X is limited to 30 Hz at that resolution, rather than leaving it ambiguous. 30 Hz formats are encountered with high resolution computer monitors such as 1440p monitors with 165-MHz-limited HDMI ports, or 4K monitors with only HDMI 1.4 ports, etc. These are problems that people encounter plenty often with real world computer monitors when they don't research their connection interfaces enough. The point of listing them here is that this article is often a source for such research, so providing the information here will help people do research and avoid these sorts of problems.

HDMI 2.0 only supports 4K 60 Hz RGB/4:4:4 when using 8 bpc color depth. When using 10 bpc color depth (which is required by HDR) it can no longer achieve 60 Hz, and the maximum is 50 Hz. The reason 4K 50 Hz is listed in the HDMI table is so that people may read this page wondering if HDMI 2.0 can do 4K 60 Hz HDR, which is a subject of common interest, and be able to immediately identify that the maximum is 50 Hz unless subsampling is used. However I do agree that 5K 50 Hz and 8K 50 Hz are largely irrelevant at the moment and so I have not reinstated those. GlenwingKyros (talk) 04:42, 2 November 2017 (UTC)

But why doesn't HDMI 2.0 support 4K 60 Hz RGB 4:4:4 10 bpc? 15.x Gbit/s < 18 Gbit/s. [1] 2003:E5:B718:18ED:2170:82C0:4772:C6AB (talk) 20:50, 30 November 2018 (UTC)
15 Gbit/s exceeds the maximum data rate of HDMI 2.0, which is 14.4 Gbit/s. The *bandwidth* is 18.0 Gbit/s, but not all of the bandwidth can be used for data; 20% of it is used for 8b/10b encoding overhead. This is explained in the footnotes of the version comparison table (and should have been in the other table footnotes as well; I'll add it later). GlenwingKyros (talk) 02:26, 1 December 2018 (UTC)
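As a quick sketch of that arithmetic (the 3920 × 2222 CVT-R2 totals assumed for 4K are illustrative values, not taken from the spec):

```python
# HDMI 2.0: 18.0 Gbit/s bandwidth with 8b/10b encoding
bandwidth = 18.0
max_data_rate = bandwidth * 8 / 10  # 8 data bits per 10 transmitted -> 14.4 Gbit/s

# 3840x2160 @ 60 Hz at 30 bpp (10 bpc RGB); CVT-R2 totals assumed as 3920 x 2222
fmt_rate = 3920 * 2222 * 60 * 30 / 1e9  # ~15.7 Gbit/s

fits = fmt_rate <= max_data_rate  # False: the format exceeds 14.4 Gbit/s
```

The ~15.7 Gbit/s figure is the "15.x" quoted in the question above: it is below the 18 Gbit/s bandwidth but above the 14.4 Gbit/s usable data rate.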

References

  1. ^ Q: Does Dynamic HDR require the new Ultra High Speed HDMI Cable? A: No, but it will be necessary to enable 4K120 and 8K60 video with HDR due to the high bandwidth required by these resolutions and refresh rates. https://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

—Indents fixed and reflist inserted by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC).

Ultra Wide resolution support on HDMI

Can somebody please clarify on the ultra-wide support of HDMI.

HDMI 2.0 seems to have official support for ultra-wide resolutions (e.g. 2560x1080, 3440x1440, ...). But can someone clarify: 1) in which versions (1.4?) ultra-wide timings were officially added, and 2) which resolutions were supported (1080p, 1440p, ...) in each version. louisnells (talk) 14:32, 3 November 2017 (UTC)

Explicitly defined ultrawide formats were added in HDMI 2.0 (via CTA-861-F). Prior to that all ultrawide formats were implemented as vendor-specific formats. Keep in mind that HDMI "adding support" for formats does not mean adding the capability as if it were incapable of displaying those formats before; HDMI can run at any arbitrary resolution and refresh rate combination within the bandwidth limit, which is how new custom formats such as the first 2560×1080 monitors can be created with existing HDMI standards. "Adding support" for 21:9 ratio means adding supporting material (explicitly defined timings and formats) to promote interoperability, but HDMI does not require "support" to be added to be able to display a particular format. (Based on your wording you may already understand this; I write it just in case.) GlenwingKyros (talk) 21:53, 3 November 2017 (UTC)

obscure formats

As noted in the article, HDMI 1.0 and 1.1 allow specific video formats. These should stay in the table. Later versions allow for any format compatible with the frequency limits. There is, then, no need to add specific obscure formats. Entries in the table at close to the maximum frequency for each version are probably useful. Gah4 (talk) 00:06, 14 November 2017 (UTC)

Ultra High Speed HDMI TMDS clock

Does anyone have a source for the maximum TMDS clock rate for HDMI 2.1? The existing article says 1.2 GHz, but doesn't provide a source for the number. The text assumes that the TMDS clock rate equals the TMDS character rate (1.2 GHz for 12 Gbit/s), but this is probably not correct given the fact that HDMI 2.0 uses a TMDS clock rate of 1/4 the TMDS character rate for character rates over 340 Mcsc. It is highly likely that HDMI 2.1 will either use the same divider or possibly a higher ratio. Until someone can check the HDMI 2.1 spec, I don't think we can make any assumptions here. Shamino (talk) 18:52, 26 December 2017 (UTC)

Hi; thanks for adding the HDMI 2.0 source.
The information of interest is really the amount of data per channel (3.4 Gbit/s per channel in 1.4, 6.0 Gbit/s per channel in 2.0, etc.); the only reason we use TMDS clock is because that's what the HDMI people use, and it helps the page be understandable to readers if we discuss things in terms of the same specifications as other sources. The assumption is based on the HDMI Forum's own statements, the same as it was for HDMI 2.0's 600 MHz TMDS clock, but it appears they have just been giving an "equivalent" figure so people don't get confused. Since this is the case I think it would be better to abandon the TMDS clock notation going forward and just use data rate per channel anyway. TMDS clock seems to carry very little meaning now.
Also in regards to the "Mcsc" thing, this is the HDMI Forum's own custom abbreviation, but it shouldn't be used here; the correct unit for character rate is baud ("Mbaud", "Gbaud", etc.). If you want to say we should use "Mcsc" because that's what the HDMI Forum uses, I would also point out the HDMI Forum uses "msec" for milliseconds, but we don't use that either (EDIT: Well, HDMI Forum seems to have switched over to the conventional "ms", but HDMI Licensing used "msec" and we ignore it). Making up their own abbreviations is something they do a lot; Mcsc is just the latest example. This page doesn't strictly follow the terminology established by the HDMI people, we use more standard notation as again it helps it be relatable to other pages, and more understandable.

I'm wrong, was taking "character rate" as meaning "symbol rate". Either way, considering that "Mcsc" is terminology only introduced with HDMI 2.0, which is a confidential document not really available to the public (even the original source no longer exists, the cited one is to the internet archive) I'm not a fan of the special notation. I gather it's to distinguish TMDS clock from the rate at which 10 bit characters are transmitted, which is no longer 10 bits per TMDS clock, but since TMDS clock in MHz is the conventional notation (and even HDMI Licensing describes HDMI 2.0 as "600 MHz pixel clock" everywhere outside the specification) in the interest of clarity it would probably be better to use the common notation. Perhaps rename the table heading to "Effective TMDS clock" or something like that. Like I said, it's not the first time the HDMI Specification has come up with its own special notation which has not been replicated here. GlenwingKyros (talk) 19:56, 29 December 2017 (UTC)
I understand. Given all of this, perhaps it makes sense to restrict discussion of TMDS clock to the section discussing TMDS, and remove it from other locations (as you did from the table), since it is really unrelated to cable bandwidth - it's only for the devices to sync to each other. The character rate is also probably superfluous, since it is effectively a derived value - bit-rate * 10 / channels. Discussion about analog frequencies that must be carried by cables may be of interest to readers, but for now (at least until HDMI decides to use something like QAM) this is equal to the bit rate for a single channel. It is telling that HDMI certification (at least the docs I've seen) seems to talk entirely in terms of bit-rate and not frequencies. Shamino (talk) 15:15, 2 January 2018 (UTC)
Agreed, I considered doing that when I created the tables, the only reason I decided to leave things in terms of TMDS clock was because that's the specification that the HDMI people use to describe speeds, and I wanted to maintain that consistency to make it more understandable. I would prefer to leave the "340 MHz" and "600 MHz" listed somewhere on the table for that reason; it may be confusing to people if the spec that the HDMI people use is not listed on the table. A table row for bitrate per channel or data channel signal frequency could be a helpful intermediary step to make it more clear for people who are familiar with the 340 MHz number, to see how that translates to 3.4 GHz per channel (via the 10 data channel signals per one clock signal) and then from there to 10.2 Gbit/s bandwidth (via x3 channels). Of course, it's a bit of a mess using character rate since the 600 MHz number doesn't really have any physical significance, it's just an "equivalent conversion" to make it easier to compare to other versions... Would make more sense to just use bitrate instead as you say, but that's not what the HDMI people did, so... here we are :) GlenwingKyros (talk) 18:13, 2 January 2018 (UTC)
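The relationships discussed above (3 data channels, 10 bits per TMDS character, and the 1/4 clock divider that HDMI 2.0 introduces above 340 Mcsc) can be sketched like this; this is an illustration of the conversions described in this thread, not text from any specification:

```python
def aggregate_bandwidth_gbps(char_rate_mbaud):
    # 3 TMDS data channels, 10 bits per TMDS character
    return 3 * char_rate_mbaud * 10 / 1000

def tmds_clock_mhz(char_rate_mbaud):
    # Per the HDMI 2.0 behavior described above, the clock channel runs at
    # 1/4 the character rate once the character rate exceeds 340 Mcsc
    if char_rate_mbaud <= 340:
        return char_rate_mbaud
    return char_rate_mbaud / 4

print(aggregate_bandwidth_gbps(340))  # 10.2 Gbit/s (HDMI 1.4)
print(aggregate_bandwidth_gbps(600))  # 18.0 Gbit/s (HDMI 2.0)
print(tmds_clock_mhz(600))            # 150.0 MHz actual clock signal
```

This shows why the "600 MHz TMDS clock" figure for HDMI 2.0 is an equivalent rather than a physical clock frequency.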

—Indents fixed by Mungo fraans ïttrë rumden (talk) 22:59, 31 July 2024 (UTC)

"4K" is the term coined by the DCI to describe a horizontal resolution of 4096 pixels.

The term refers to that standard.

Television marketing folks decided that they liked the term so much that they incorrectly applied it to their own sub-4K televisions, creating ambiguity, and more importantly, making it much harder for the competing DCI to market their products.

As Wikipedia is an encyclopedia, and not an advertising brochure, it's inappropriate to promote advertising terminology, especially when it's ambiguous and misleading. The precise term for the specification of these higher-resolution televisions is UHD-1, which refers specifically to the 3840 x 2160 pixel resolution.

InternetMeme (talk) 12:55, 1 February 2018 (UTC)

No. That is simply factually inaccurate. "4K" is not a term that was coined by DCI. It is not, and has never been, used solely to refer to that standard. That is something completely made up by consumer "tech" journalists when 4K TVs were first coming to the market, and they were scrambling to be the first to write a "4K EXPLAINED" article with a few minutes of Google research to back it up. Needless to say, they got it completely wrong.
In reality, "4K" and "2K" are used (and have always been used) as generic terms in the cinema industry, to refer to resolutions within a certain class, sometimes referred to in a longer form such as "4K x 2K" or "4K x 1K". I'm sure you've heard that before. They do not refer to exact resolutions, and "4K" is not some "official name" for some specific resolution. Here are some examples:
So no, "4K" is not a term created by DCI. And television marketing folks did not decide on their own to refer to 3840×2160 as "4K", it's called that in the official press release from ITU when the standard defining UHDTV resolutions was published: https://www.itu.int/net/pressoffice/press_releases/2012/31.aspx#.WAJplugrKCp
So as I said, the whole "3840×2160 isn't 4K, that's "UHD"! REAL 4K is 4096×2160!" is completely made-up by consumer journalists, and has no basis in reality. And such misinformation certainly has no place being proliferated here.
(EDIT: Also, UHD-1 is the name of DVB's plan to roll out 4K resolution to broadcast television. It isn't a name for 3840×2160 established by ITU in the definition of UHDTV, they simply refer to it (them) as the 3840×2160 and 7680×4320 systems of UHDTV, or more colloquially as the 4K and 8K UHDTV systems, as noted above.
In regards to being vague, there are also numerous references to 5K, 8K, 10K, etc, which apparently you have no problem with, so I don't see why 4K alone would need to be changed. Exact resolutions aren't needed at all times. When the specific resolution is important, the resolution is listed in parentheses next to the term 4K, in most other places I don't see a particular need to be specific about the resolution, the general class of resolution is all that is needed to make the point.)
GlenwingKyros (talk) 18:41, 1 February 2018 (UTC)

Citations do not support the claim

Article claims: "Conversion to dual-link DVI and component video (VGA/YPbPr) requires active powered adapters.[182][189]"

But notes 182 & 189 say nothing about Component at all. They concern Display Port. Can the editor provide proof of his claim? I received with a device I bought a cable which has HDMI on one end & 3 component cables on the other end, which I believe is intended to change HDMI to Component. But from where would power come? I don't see any circuit imposed in the adaptor; I suppose it could be tiny. (PeacePeace (talk) 01:19, 27 July 2018 (UTC))

HDMI pin 18 provides +5V which could be used to power conversion circuitry. However, this does not mean that the claim is correct. Verbcatcher (talk) 03:17, 27 July 2018 (UTC)
Which claim are you challenging, that HDMI -> Component requires active adapters in general (i.e. a signal conversion process), or just the claim that the adapters always require power? The claim that a signal conversion from HDMI is necessary is difficult to source; it is true simply as a consequence of the fact that the HDMI specification doesn't define any such capability; devices that are designed to meet the HDMI specification will not have the ability to output analog component signals. Of course though, there is no specific passage we can cite which talks about what isn't in the HDMI specification. The claim that adapters require power is technically true at all times, though trivially so; any DAC circuit will require some amount of power to operate. However, to most people, "powered adapters" implies an external power connector, so that statement may be misleading in that sense. HDMI provides inline 5 V power, but only in very small amounts (max 0.055 A of current, or 0.275 W), but that's intended for reading the EDID of a powered-down sink device. I have no idea if that would be enough for powering a simple DAC circuit, but if it can power an EDID chip, maybe it can power a DAC chip. GlenwingKyros (talk) 17:47, 30 July 2018 (UTC)
I don't know if I have one, but I think there are HDMI to VGA adapters with built-in DAC, and powered by the HDMI source.[1] Gah4 (talk) 18:45, 30 July 2018 (UTC)

References

  1. ^ "HDMI to VGA adapter". amazon.com. Retrieved 30 July 2018.