
Talk:HDMI/Archive 1

From Wikipedia, the free encyclopedia

CEC

Could we elaborate on the CEC section? Or create a separate CEC page? It seems like a pretty involved topic. Some basic questions on CEC: Does it allow a device (like a receiver) to essentially handle much of the functionality of a universal remote by sending control signals over HDMI instead of IR? What functionality is in the core CEC specification? And what kinds of functionality are added by manufacturer-specific proprietary extensions (Bravia Link, NetCommand, etc.)? Interoperability and limitations when using components from different manufacturers (a Pioneer DVD player and a Sony receiver)? —Preceding unsigned comment added by Gthiruva (talkcontribs) 05:51, 26 September 2008 (UTC)

DRM

Is it related to DRM/encryption? I keep hearing about it...

Yes. HDMI supports HDCP, which is a form of DRM. HTH. --Heron 20:36, 30 January 2006 (UTC)
Yes. It's called HDMI-CP, and will be required to view HD-DVD/Blu-ray movies in Windows Vista.
Wulf 02:49, 22 March 2006 (UTC)

This should be (and now, 1 Sept 2007, is) clearly stated. The wording has a lot of pro-DRM indirect propaganda. —Preceding unsigned comment added by 81.38.118.205 (talk) 05:12, 1 September 2007 (UTC)

HDCP is technically copy protection and it is used by DRM systems such as CSS, CPPM, and AACS. Also Blu-ray movies can be viewed on computers using VGA or component video though in the future they might be downconverted over analog because of the Image Constraint Token. --GrandDrake (talk) 00:42, 2 July 2008 (UTC)

Extent of Backwards Compatibility

Since DVI connections can support analog and digital signals, it has been unclear to me whether or not one could use a VGA-DVI adapter and a DVI-HDMI adapter in serial in order to connect a VGA source with an HDMI receiver. It now seems to me that this would not work at all, given that HDMI is apparently digital-only. I think that this article would be a good place to put a clarification of this for people like me. I don't want to do the editing myself, though, because I don't really know anything for sure. --Jmacaulay 15:15, 15 March 2006 (UTC)


ANSWER: HDMI is digital only, so no, a VGA source cannot be connected to a digital display via HDMI. As for a DVI source, as there are 2 types of DVI connectors - digital only (DVI-D) and digital+analog (DVI-I), either will drive an HDMI display using a DVI-to-HDMI cable if the source is digital. For further reference, try http://www.hardwarebook.net/connector/av/dvi.html

Next-Generation consoles

The PS3 will have one HDMI connection; I hope Microsoft will add it to their console soon. (201.145.144.97 22:14, 20 April 2006 (UTC)).

The port on the back of the Xbox 360 is proprietary, so it is conceivable that it can connect to a wire with an HDMI output. However, HDMI is NOT necessary for any video game, under the realization that you can't just watch a video game like you can watch a movie, and that letting people use a fully HD signal with component cables or a monitor's cable would actually boost publicity, and thus profits, by advertising for your games. One great example of this is Red vs. Blue. - 68.228.33.74 04:51, 9 October 2006 (UTC)

I don't know about the Xbox, but the PS3 will support HDMI 1.3 (both consoles) --sin-man 03:00, 30 October 2006 (UTC)

Clarification

It is independent of the various DTV standards such as ATSC, DVB(-T,-S,-C), as these are encapsulations of the MPEG data streams, which are passed off to a decoder, and output as uncompressed video data, which can be high-definition.

Is this how the other standards work, or HDMI?


A reference for the Image Constraint Token is lacking, but I may have gone over the article too quickly. [1]

Is HDMI Type B equivalent to double cabled DVI

Could someone please provide a photo of the new HDMI Type B connector? HDMI Type A is equivalent to DVI Dual Link; is HDMI Type B equivalent to the double-cabled connections used for large digital monitors, i.e. Apple and Dell's 30" widescreen models? Will we see latched HDMI (part of UDI (Unified Display Initiative)) replace DVI on computers?

  • According to the article, Type A HDMI is compatible with single link DVI-D and Type B HDMI is compatible with dual link DVI-D. If this isn't true, the article needs to be fixed. Herorev 07:59, 9 August 2006 (UTC)

Licensing

I think there should be some discussion of licensing, and how that has triggered the DisplayPort and UDI efforts. --Belltower

That would be a fairly complex issue and some of it is less than clear cut. For instance, DisplayPort is currently license/royalty free, but at any time the companies that made it could assert their patents on it. Furthermore, HDMI licensing is $10,000 yearly with only a 4-cent royalty per HDMI device (regardless of the number of HDMI connections), so except for the smallest of companies it is not even an issue. Also, as of May 2008 the UDI website has been down for several months and looks to have been dead for at least the last year. UDI was basically just a stripped-down version of HDMI 1.3 that tried to remove some of the features to reduce the licensing costs. When several of the computer companies decided they wanted it completely license/royalty free and joined up with DisplayPort, I think all development on it died. Personally I don't see any problem with that, since HDMI 1.3 is perfectly capable of being a computer interface. Whether it will end up being widely used as one, though, is another matter. --GrandDrake (talk) 16:30, 6 May 2008 (UTC)
Is it not relevant to HDMI that "HDMI licensing is $10,000 yearly with a 4 cents royalty per HDMI device"? I think it is. I recommend adding (or returning) that information to the article. ---Ransom (--67.91.216.67 (talk) 18:20, 29 September 2008 (UTC))
Agree - I accidentally stumbled upon the licensing issue, and was surprised not to see it in the article. --Yurik (talk) 20:16, 16 October 2008 (UTC)

Improvement over other Interfaces

I need a technical explanation for why any other interface (specifically HDMI) on the DTE side of a cable box would be better than the standard coax connection. The cable feed arrives through a coax connection, so how can a different interface from the cable box to the HDTV provide a better quality picture and sound?

ANSWER: The cable box is tuning digital cable, not analog; this is the key to why these interfaces are better than standard coax. With new display technologies such as DLP, LCD, LCoS, and plasma able to receive digital signals, and DVI and HDMI able to transmit uncompressed digital signals, the TV is receiving the most error-free signal possible. If you were to use the RF output, or even the S-Video or baseband outputs, of a cable box, the box must first receive and decode the digital signal, then do a digital-to-analog conversion and remodulate the signal for use on an analog input on a television. As you can guess, you lose information in doing that, not to mention these outputs are not capable of high definition. Even though the YPbPr outputs are capable of 1080i output, there is a D-to-A conversion that has inherent signal degradation. With HDMI able to transmit multi-channel digital audio as well as HD video, not to mention the use of EDID handshaking, I think I would use it over remodulated coax. —The preceding unsigned comment was added by 129.188.33.222 (talkcontribs) .
Even a cable box which takes in video content in analog form (the cheaper subscription to cable TV!) will still benefit the user with a better picture on the screen. The analog signal is converted one time, in the set-top box, and then sent to the TV in digital form over HDMI (as the previous append explains). The analog-to-digital conversion in the set-top box is quite good, in most cases, even though the incoming analog content is limited to SDTV, and not HDTV. Calbookaddict 05:38, 30 March 2007 (UTC)
This is true; however, it needs to be said that SD content on an HD TV looks terrible. This is because the TV or STB is forced to interpolate (intelligently make up) pixels that don't exist in the video signal. In reality, analog services look the best on 480i analog TVs. I have seen too many people spend $2,000 on a TV and refuse to get HD cable service; this is an enormous waste of money. —Preceding unsigned comment added by 129.188.69.145 (talk) 15:32, 14 December 2007 (UTC)
With regard to "SD content on an HDTV looks terrible": "Everything looks better on an HDTV. Even non-HD broadcast TV looks better, thanks to a technical marvel called 'upscaling,' plus several other technical innovations." -Bill Machrone (columnist/contributing editor, PC Magazine), writing in The Wall Street Journal, Jan 7, 2008, page s1. -Dawn McGatney, Jan 7, 2008. —Preceding unsigned comment added by McGatney (talkcontribs) 23:02, 7 January 2008 (UTC)

Compatibility

Will an HDMI 1.3 cable work to its fullest using an HDMI 1.2 female connector? --StealthHit06 (talk)

HDMI is limited to the lowest common denominator, so if an HDMI 1.3 device is connected to an HDMI 1.2 device, the capabilities of the connection are limited to HDMI 1.2. If you're asking whether the cable itself will work, then it will, though the bandwidth it can handle depends on whether it is a Category 1 or Category 2 cable. --GrandDrake (talk) 00:49, 2 July 2008 (UTC)

Merge with Unified Display Interface

Why? --Heron 17:10, 1 August 2006 (UTC)

First version

Was 1.1 the first version, or was there a 1.0? -- Mattbrundage 19:35, 5 September 2006 (UTC)

Added 1.0 specs -- Mattbrundage 15:19, 21 November 2006 (UTC)

HDCP

Any knowledgeable person knows that HDMI is just a part of the CARROT (High Definition) that the ones in the business (Hollywood, MPAA, "content providers" and others) are putting in front of the ASS (the consumer), to lure it to the PATH (enforcing draconian DRMs and limiting the consumers' freedom) they want.

The other parts would be, of course, Blu-ray, HD-DVD or whatever other stupid format with a bazillion times more storage capacity, while the bitrate of the content (movie) is also raised to a bazillion times more (to keep one movie per disk), and all the HD paraphernalia (TVs, BD players...). And, of course, they'll come with DRMs built-in.

To me, HDCP should be much more prominently mentioned in the article. The whole point of the HD "revolution" is to enforce DRMs (and to sell in Blu-ray format movies that were already sold on DVD, and before that on VHS). At least the VHS->DVD step WAS revolutionary. Blu-ray and its kind ONLY offer higher resolution, at the price of unacceptable DRMs. This should be made prominent! Isilanes 20:12, 6 September 2006 (UTC)

It is useful to point out that, as of the HDMI 1.3a Specification, with its Compliance Test Spec, there is a requirement that any HDMI system which implements HDCP must do so in a way fully-compliant with the HDCP Specification. Earlier versions of the HDMI Spec did not explicitly require that HDCP implementations be tested to be fully compliant, in order to attain compliance to the HDMI Spec itself. This enhancement in HDMI 1.3a will greatly improve interoperability among HDMI systems (those tested to HDMI 1.3a), so that the end user's experience improves. Ideally, the end user should never see HDCP turning on or off, nor any visual or aural effects of HDCP.Calbookaddict 05:44, 30 March 2007 (UTC)

It should also be noted that there are no commercially available Compliance Test Equipment devices. There is one company, Simplay Labs LLC (a subsidiary of Silicon Image, which holds the market for HDMI chips), that offers HDCP compliance testing based upon their own SimplayHD Compliance Test Specification (EETimes article, "Consumer testing hijacked?"). The SimplayHD Compliance Test Specification is not available for free or for purchase. The downside that I see is that without having the SimplayHD CTS, there is no assurance that a device is compliant to the HDCP Compliance Test Specification. --IlliniFlag 17:17, 9 November 2007 (UTC)

Note that the HDCP Compliance Test Specification used by the authorized test centers worldwide is available on the web at http://www.digital-cp.com/specs/HDCPSpecificationComplianceTestSpecification1_1.pdf. Intel wrote the original HDCP Spec, and continues to maintain it. They have provided this compliance test spec in recent months. It is true, as stated earlier, that test equipment to perform the tests in the CTS is not available. But the definitions of the test in the referenced PDF file (133 pages) go a long way to showing how compliance is checked.Calbookaddict 02:41, 10 November 2007 (UTC)


HDCP (High-bandwidth Digital Content Protection) is an encryption system for enforcing digital rights management (DRM) over DVI and HDMI interfaces. The copy protection system (DRM) resides in the computer, DVD player or set-top box. If it determines the video material must be protected, it encrypts the signal via HDCP and transmits it to the display system, which decrypts it. HDCP enforces copyrights of content that flows through HDMI/DVI connections. HDCP is essentially a way to get device makers to follow the wishes of content providers. Device makers must obtain licenses that equip their devices with licensed keys. These keys enable them to receive and display encrypted content. To get the license, they agree to honor flags in the content that generally limit the storage and re-transmission of content. HDCP was originally developed by Intel Corporation. It is now published and maintained by Digital Content Protection. There is an incompatibility between HDCP-enforcing transmitters and receivers that are not HDCP enabled. As the standard evolves, each generation will continue to see the same compatibility problem. —Preceding unsigned comment added by 202.162.160.31 (talk) 19:22, 3 October 2007 (UTC)

More HDCP

It's also worth noting that HDMI doesn't require that content be encrypted. The content provider (Blu-Ray disc, cable company, etc.) has to set a flag to instruct the source to encrypt content. For instance, plain old DVDs do not need to be copy protected. I've come across several HDMI-enabled up-converting DVD players that don't support/require HDCP. Then again, there are other players out there that just take the "safe" route and HDCP-protect everything.

TIME CODE

Unlike HD-SDI, HDMI does not provide time coding within the protocol. One could time the transfer of video data by counting frames (of course, also knowing the frame rate). The InfoFrame packet structures specified in HDMI do include a generic InfoFrame, which could be loaded with time-stamp data, but this would be a vendor-specific solution and not compatible among different, disassociated vendors. Calbookaddict 00:04, 23 March 2007 (UTC)
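The vendor-specific InfoFrame idea above can be sketched in code. This is a hypothetical illustration, not anything from the HDMI spec itself: only the packet framing (Vendor-Specific InfoFrame type 0x81, version, length, zero-sum checksum, 3-byte IEEE OUI) follows the CEA-861 layout; the OUI value and the 32-bit frame-counter "time stamp" payload are invented for the example.

```python
import struct

def vendor_infoframe(oui: bytes, payload: bytes) -> bytes:
    """Pack a hypothetical vendor-specific InfoFrame.

    Framing follows CEA-861: type 0x81, version 0x01, payload length,
    checksum, then a 3-byte IEEE OUI and the vendor payload."""
    assert len(oui) == 3
    body = oui + payload
    header = bytes([0x81, 0x01, len(body)])
    # Checksum is chosen so that all packet bytes sum to 0 mod 256.
    checksum = (-sum(header + body)) & 0xFF
    return header + bytes([checksum]) + body

# Hypothetical time stamp: a big-endian 32-bit frame count, under a
# made-up OUI. A receiver from another vendor would have no idea what
# this payload means, which is exactly the interoperability problem
# described above.
frame = vendor_infoframe(b'\x00\x0c\x03', struct.pack('>I', 123456))
assert sum(frame) % 256 == 0  # checksum rule holds
```

A receiver would validate the checksum the same way (sum of all bytes is 0 mod 256) before trusting the payload, which is why the sketch asserts it.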



Cable Costs

I keep hearing from technically inclined friends that since HDMI uses a digital signal, your TV will either work perfectly or not at all. They say that a $5 cable will work as well as the argon-filled, platinum plated, precision balanced, fairy dust sprinkled $80 ones. Maybe some one should comment on that in the article. —The preceding unsigned comment was added by 209.217.75.174 (talk) 20:49, 22 January 2007 (UTC).

Since the article is not a buyer's guide, it seems inappropriate to make a generic claim, even though it might be helpful to people considering buying equipment. It's worth noting the article does already mention that any cable which can meet the transmission specs is appropriate (it just doesn't explicitly say "regardless of the cost"). - Davandron | Talk 22:34, 25 January 2007 (UTC)
I added a couple of sentences to the criticisms section addressing the cost issue. The difference in cable prices is extreme enough that it's made it to some major blogs. If you google for "hdmi cables", 3 of the top 10 links talk about the cost issue. Searching for "hdmi cable scam" gives 300K results. I think it belongs in the article. Being helpful to buyers is just a little bonus.68.8.110.219 06:38, 16 March 2007 (UTC)

HDMI versus Component

I think it would be helpful if there could be a section that compares HDMI to component. I think this is something that people, including myself, often wonder about. While I realize HDMI is digital while component is analog, it's not clear what kind of difference this makes to image quality. Attila226

I came here to ask the same question. -Indolences 16:01, 1 May 2007 (UTC)
Think about it. Your picture is in digital format on your DVD or Blu-ray disc; it then has to be converted to analog component parts to be sent up a component cable. Assuming you have a digital panel, you then need to recompile the picture and resample it back into a digital picture to display on your digital panel. Lots of conversion and messing around with signals. For HDMI, it's a pure digital signal path. —The preceding unsigned comment was added by Mgillespie (talkcontribs) 09:47, 14 May 2007 (UTC).
I have verified with Sony that, at least on their Bravia LCD HDTVs, Closed Captioning cannot be interpreted by the HDTV if the signal is supplied through an HDMI input. Their Knowledge Articles referencing this lack are C352674 and C83284. Unless the set-top box or the DVD player interprets Closed Captioning itself and passes it on in the video output, the HDTV will not perform this function with HDMI as input. The suggestion is that if the set-top box or the DVD player cannot do this, one must use a Component, S-Video, or Composite connection. I have seen little discussion of this problem elsewhere in HDMI discussions, and since Closed Captioning can be very important to the hearing-impaired, I'm passing what I've learned along. Perhaps someone else can comment further on this or discuss when and if an HDMI Closed Caption standard and implementation will be forthcoming. This may be one reason to choose a Component connection instead of HDMI presently. Jwdening (talk) 22:17, 24 February 2008 (UTC)

Picture request

Can you add a close-up picture of the end connector to the article along with something to compare its size to like a USB or (a context appropriate) DVI end connector or even just a ruler? Even better would be the inclusion of a picture of a HDMI female port. Just saying because based on the pictures included in the article right now I would be hard pressed to recognize it instantly in the wild if I were to come across it in a context free situation. --70.51.229.95 23:08, 18 May 2007 (UTC)

Sorry about the lack of focus - a limitation on my camera - but here's a picture with a ruler. The HDMI plug is on the left, the USB plug is on the right.Calbookaddict 01:12, 19 May 2007 (UTC)

Even out of focus, the image looks right as a thumbnail. I've added it to the article. That was really quick response! Great job! --70.51.229.95 01:51, 19 May 2007 (UTC)
What units is the ruler measuring in? Looks too big to be cm, must be some archaic unit :-) TiffaF 06:17, 21 May 2007 (UTC)
The units are inches. I would have thought that the finer divisions of length, being into eighths and sixteenths, would be a dead giveaway that it's not centimeters and millimeters. Sorry, American-centric habit, I suppose. Can someone explain to me why IC packages have dimensions in metric units, while PCB dimensions are in English units? Go figure.Calbookaddict 05:47, 22 May 2007 (UTC)

Note that there is also a Type C connector (and plug) defined in the HDMI 1.3 Specification. It's smaller than the Type A shown in the picture. I don't have one here at home, but they're available already.Calbookaddict 05:52, 22 May 2007 (UTC)

FYI, the original 1970's Texas Instruments 7400 series chips had 0.1 inch pin spacing, and it stuck as a standard, though it is now quoted as "2.54 mm". Newer smaller chip designs are in metric.
BTW, I looked up this article because I noticed all the flat screen TVs and DVD players in my local shops suddenly now have HDMI connections, and at least one DVD player did not have a Scart output. Looks like analogue Scart is being phased out in favour of digital HDMI. TiffaF 07:05, 22 May 2007 (UTC)

Manier?

Just for clarification, in the criticism section is "manier" supposed to be "more"? This is not an English word, and I can find no reference to this through an online search (except that it appears to be a word in other languages).

He's just trying to say "more many", since "many" is obviously an adjective (note the -y ending). Points for trying, I guess. "plugfest events (i.e. manufacturer conferences)" seems like a false appositive.

Confusion?

I have an HDMI labelled Toshiba progressive-scan DVD (and a bajillion other formats) player. I do NOT have any HDMI capable receivers or television. The DVD player has HDMI, component (RGB), S-Video, and A/V (yellow) video outputs, as well as RCA, Digital and Optical audio outputs. The RCA and Optical audio outputs work normally, but I can't seem to get a signal out of the video to save my life. Am I out of luck without an HD tv or receiver? --Snicker|¥°| 03:01, 4 July 2007 (UTC)

This question is not related to the article itself. Ask the question in an A/V forum (such as this one). -- Mattbrundage 17:56, 6 August 2007 (UTC)

New table problems

The max color depth information is inaccurate; it is true that RGB is limited to 24 bits on earlier versions, but AFAIK all versions of HDMI can do 30- and 36-bit 4:2:2 component. --Ray andrew 04:43, 31 July 2007 (UTC)

This is consistent with what I have read, and also some of the text elsewhere in the post: "Pixel encodings: RGB 4:4:4, YCbCr 4:4:4 (8-16 bits per component); YCbCr 4:2:2 (12 bits per component) " but someone changed my table.

The General Notes state that "8-channel uncompressed digital audio at 192 kHz sample rate with 24 bits/sample". I interpret this as 8 ch, 192kHz, 24 bits is available at the same time. Is this true? In that case, the table could be simplified.

The table originally was supposed to display why and when anyone would need HDMI 1.3 with all exceptions, but now that clarity is lost.

You are correct about the audio: 8 ch of 192kHz @ 24-bit is available with any video mode. I agree that the table has been made too complicated, and I would suggest, if there are no objections, that we trim it to just the major revisions (1.0, 1.1, 1.2, 1.3). --Ray andrew 18:48, 31 July 2007 (UTC)
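As a quick sanity check on the figures discussed in this thread, the raw bit rate of that maximum audio configuration is simple arithmetic (the numbers come from the "8-channel, 192 kHz, 24 bits/sample" statement above, not from any additional spec reading):

```python
# Raw PCM bit rate for the audio configuration quoted above:
# 8 channels x 192,000 samples/s x 24 bits/sample.
channels = 8
sample_rate_hz = 192_000
bits_per_sample = 24

audio_bps = channels * sample_rate_hz * bits_per_sample
print(audio_bps)        # 36864000 bits/s, i.e. ~36.9 Mbit/s
```

At roughly 37 Mbit/s, the full 8-channel payload is a small fraction of HDMI's multi-gigabit video bandwidth, which is consistent with it being available alongside any video mode.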

________________________________________________________________________________________

New problem...

I have an HD-ready TV and a DVD player that can support high definition. If I connect them with an HDMI cable, does that mean I can watch DVDs in high definition if I put a high-definition DVD in? Also, if I can, is it true that I can only play HD movies on Blu-ray? Finally, if I cannot, do I have to buy a dish for it?

Thanks

_____________________________________________________________________________________________ —Preceding unsigned comment added by 79.73.214.192 (talk) 16:51, 17 February 2008 (UTC)

hdmi versions

The HDMI versions in the article don't make any sense to me. Hope this helps...


Specifications, Versions, and Capabilities

HDMI version 1.0 met the goals of the HDMI Working Group and provided a true one-cable solution for uncompressed HD video and multi-channel audio including Dolby Digital and DTS bit streams (more on format support later).


Version 1.0

HDMI v1.0 was the original format, released in December 2002. It took DVI's video signal format and added the ability to carry a Dolby Digital or DTS bitstream or only two channels of PCM audio (48kHz, 24-bit). The two-channel PCM restriction worked fine for connections between cable/satellite receivers or DVD players and a stand-alone HDTV (which only supported two channels of audio), but it wouldn't be able to support the new audio formats that were slated to accompany HD optical discs (HD-DVD and Blu-ray). Adoption of HDMI v1.0 was sluggish, as DVI-HDCP had a head start in the market. It didn't help that HDMI shares DVI's cable length restriction – anything more than about 15 meters violates the specification and is likely to require either a booster or a conversion to fiber optic.


Version 1.1

It was with Version 1.1 (released in May 2004) that HDMI was finally able to make a compelling argument for superseding DVI-HDCP. HDMI could now carry multichannel PCM audio (eight channels at 192kHz, 24-bit) in addition to Dolby Digital and DTS compressed bitstreams. Version 1.1 also added support for passing the bitstream data from DVD-Audio discs, which previously had to be decoded inside the player and output as six channels of analog, or passed as a bitstream through IEEE-1394 (also called FireWire or iLink, a connection type that never saw widespread adoption). HDMI 1.1 was a relatively minor update. The primary feature was to add some packets of audio-related content protection information. These packets were required by DVD-Audio in order to permit DVD-Audio content transmission on HDMI. HDMI 1.0 had the audio and video bandwidth and capabilities, and HDCP already had the content protection capabilities, but there was some data that the DVD-Audio folks wanted to send to HDMI/HDCP sinks to tell them not to send the DVD-Audio content elsewhere.


Version 1.2

HDMI v1.2 was adopted in August 2005 (v1.2a was adopted in December 2005 and added some testing and certification language). The only notable difference between it and v1.1 is support for a DSD (one bit audio) digital bitstream. This means that a player can now send the raw digital signal from an SACD over HDMI to a receiver or processor, eliminating the need for decoding of the DSD signal at the player. As for HDMI 1.2, several companies have requested enhancements to the HDMI spec that are being considered by the HDMI Founders, but these items are, by agreement, not permitted to be discussed publicly until the specification is released. The HDMI Founders designed the HDMI specification to be dynamic. As such, HDMI has plenty of extra bandwidth to accommodate future audio and video requirements, and the Founders are committed to evaluating and updating the specification to accommodate new audio and video formats that may be introduced in the foreseeable future.


The HDMI 1.2 specifications are:
• Support for One Bit Audio format, such as SuperAudio CD's DSD (Direct Stream Digital)
• Changes to offer better support for current and future PCs with HDMI outputs, including:
  o Availability of the widely-used HDMI Type A connector for PC sources and displays with full support for PC video formats
  o Ability for PC sources to use their native RGB color-space while retaining the option to support the YCbCr CE color-space
  o Requirement for HDMI 1.2 and later displays to support future low-voltage (i.e., AC-coupled) sources, such as those based on PCI Express I/O technology



Version 1.3

The HDMI 1.3 specification more than doubles HDMI's bandwidth and adds support for Deep Color technology, a broader color space, new digital audio formats, automatic audio/video synching capability ("lip sync"), and an optional smaller connector for use with personal photo and video devices. The update reflects the determination of the HDMI founders to ensure HDMI continues evolving ahead of future consumer demands. New HDMI 1.3 capabilities include:

• Higher speed: HDMI 1.3 increases its single-link bandwidth from 165 MHz (4.95 gigabits per second) found on Version 1.1 to 340 MHz (10.2 Gbps) to support the demands of future high-definition display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
• Deep Color: HDMI 1.3 supports 30-bit, 36-bit and 48-bit (RGB or YCbCr) color depths, up from the 24-bit depths in previous versions of the HDMI specification.
  o Lets HDTVs and other displays go from millions of colors to billions of colors
  o Eliminates on-screen color banding, for smooth tonal transitions and subtle gradations between colors
  o Enables increased contrast ratio
  o Can represent many times more shades of gray between black and white. At 30-bit pixel depth, four times more shades of gray would be the minimum, and the typical improvement would be eight times or more
• Broader color space: HDMI 1.3 removes virtually all limits on color selection.
  o Next-generation "xvYCC" color space supports 1.8 times as many colors as existing HDTV signals
  o Lets HDTVs display colors more accurately
  o Enables displays with more natural and vivid colors
• New mini connector: With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
• Lip Sync: Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates an automatic audio/video synching capability that allows devices to perform this synchronization automatically with accuracy.
• New lossless audio formats: In addition to HDMI's current ability to support high-bandwidth uncompressed digital audio and currently-available compressed formats (such as Dolby® Digital and DTS), HDMI 1.3 adds additional support for the new, lossless compressed digital audio formats Dolby® TrueHD and DTS-HD Master Audio™.

Its bandwidth will be upgraded from 165 MHz to 225 MHz (but can go up to 450 MHz if necessary). The increased bandwidth enables displays to handle 1080i at 60 Hz with 36-bit RGB color, or 1080p at a 90 Hz refresh rate with 36-bit color. The new HDMI 1.3 will also support Dolby HD and DTS-HD audio standards (v1.2 only supports Dolby Digital 5.1 and DTS standards). —Preceding unsigned comment added by 202.162.160.31 (talk) 19:30, 3 October 2007 (UTC)

I would mention that two of the statements in that post are wrong. The statement that HDMI version 1.0 was limited to "only two channels of PCM audio" is a common misconception and HDMI has always been capable of up to 8 channels of PCM audio at 24-bits/192-kHz. The statement that HDMI cables over "15 meters violates the specification" is also wrong since there is no fixed limit for HDMI cables. --GrandDrake (talk) 16:46, 6 May 2008 (UTC)
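For what it's worth, the Gbit/s figures quoted in the version rundown above follow directly from the TMDS link structure: three data channels, each carrying 10 bits per pixel clock (8 data bits are encoded to 10 on the wire). A minimal arithmetic sketch, using only those two facts:

```python
def tmds_gbps(pixel_clock_mhz: float) -> float:
    """Raw TMDS link rate in Gbit/s: 3 data channels x 10 bits per
    channel per pixel clock (8 data bits expand to 10 on the wire)."""
    return pixel_clock_mhz * 3 * 10 / 1000  # MHz -> Gbit/s

assert tmds_gbps(165) == 4.95   # single link, HDMI 1.0 through 1.2
assert tmds_gbps(340) == 10.2   # HDMI 1.3 maximum
```

This also shows why "165 MHz" and "4.95 Gbps" are the same statement, just quoted in different units, which may help untangle the conflicting MHz figures in the pasted text above.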

Common connecting examples...

I would like to know when I need this kind of connection and what connects to what... now that I know what HDMI is...

Example... I have a Comcast digital cable box with a HDMI connector on it. I have a LCD TV that has a HDMI connector on it. Both devices also have coax connections. The signal arrives via the coax to the cable box. So, do I need the HDMI to get the best picture or is the coax connection from the cable box to the TV as equal to/better than/worst than the HDMI connection? If equal to or better than HDMI, then why is there a HDMI connection on the cable box in the first place?

Would the cable box HDMI be used to connect to the DVD-R to record movies in HD?

(long winded - sorry) So HDMI is used to connect: DVD/bluray player to TV only? cable box to anything?? DVD camera to TV?? —Preceding unsigned comment added by 209.98.246.21 (talk) 19:15, 15 October 2007 (UTC)

Your coax cable is carrying the "cable" content, either analog or digital, to the set-top box. I'm not sure about the purpose of the output coax cable from the cable box. But the HDMI output allows direct connection to the TV, to an A/V Receiver, to an HDMI switch, etc. A single HDMI cable carries the video and the audio, as well as control information.Calbookaddict 15:56, 7 November 2007 (UTC)

Vista?

It is mentioned in the article "a PC running Windows Vista". As I understand it, that is not really true. Since I am a Linux user myself, and interested in buying a laptop, I googled to find out if this is true. A mailing list claims that, for example, a Linux box could also use HDMI. It is probably being confused with HDCP, which runs over HDMI/DVI and is 'supposed' to provide an encrypted link between the monitor and the signal source. So, an article on HDMI as an interface itself should be unrelated to HDCP; thus I find the quote "a PC running Windows Vista" misleading. Any PC could use it without HDCP. Am I wrong? - Ioannis Gyftos 146.124.141.250 11:41, 5 November 2007 (UTC) My source: http://groups.google.com/group/comp.os.linux.hardware/browse_thread/thread/ac75fd4c2a0bec2e/41bb694d8cb7f876

HDCP is not a requirement of HDMI. Content protection is a requirement placed by the content owners, and may apply only to certain video modes (e.g., high-definition). One's own video content can be sent without content protection.
PCs can provide HDMI outputs, with or without HDCP support. The mainline graphics card providers - ATi, nVidia and Intel - all provide graphics cards or motherboards with HDMI outputs. I have not kept up on the details, but googling "HDMI graphics card" resulted in this hit: http://www.tweaktown.com/articles/972/three_hdmi_graphics_cards_tested_on_lcd_tv/index.html
Of course, it must be said that these mainline providers may supply only Windows drivers. A search, or inquiries to those companies, should reveal if there are Linux drivers for their graphics solutions. Calbookaddict 15:54, 7 November 2007 (UTC)
The Linux kernel easily detects an HDMI output as DVI. And since my graphics card apparently isn't responsible for audio, I don't believe that HDMI through Windows would be any different. Yesterday I bought that laptop and it works, displaying on my LCD TV through HDMI. So I'll go ahead and remove the Vista restriction. I could provide photos, but I don't know if that would be convincing. - Ioannis Gyftos 146.124.141.250 09:52, 8 November 2007 (UTC)

Unclear how DVI signal is helped by using HDMI cables

In the HDMI page it mentions that HDMI cables can carry a DVI signal, via an adapter. Does using the HDMI cable overcome the DVI distance limitation, or are boosters still needed? Does using an HDMI cable have any other effects on a DVI signal? I am curious, but would also find this useful for the articles in question. --Alphastream 03:33, 3 November 2007 (UTC)

HDMI has the same limitations, but realistically it's not much of a limitation. I have a good 50 foot run that works perfectly with no signal boosters or anything. --Ray andrew (talk) 21:09, 17 November 2007 (UTC)

1440P Confusion

So what is 1440p and what is it used for? This page mentions it and has a link for it, but it just redirects you to the HDTV page. Seems kind of pointless to do that considering 1440p is not even mentioned on that page. A page needs to be created specifically for 1440p just like all of the other resolutions already have. Dvferret (talk) 13:40, 10 January 2008 (UTC)

First time I heard about 1440p was back in 2006 when Chi Mei Optoelectronics, a Chinese CE company, said they would come out with a 47" LCD in the second quarter of 2007 that would have it for the native resolution. I don't think it was ever released to consumers and I can't even find a price for it. Currently there is no indication that it will be used in either consumer displays or computer displays. The most recent 1440p news I can think of was when Gateway made a bit of a stink by stating that their WQXGA (2560x1600) computer display was the world's first "Quad-HD" display, which was a rather silly claim to make considering how many other WQXGA displays had been released. Also I agree with you that the redirection of 1440p to the HDTV page doesn't make any sense and that 1440p needs to be made into its own page. --GrandDrake (talk) 17:24, 6 May 2008 (UTC)

Cable costs - explanation?

At my local electronics store the other day I noticed that one brand of cables was separated by bandwidth - the cheaper cables only let you send low resolutions like 720p, but the more expensive ones went up to and past 1080p. Maybe this explains some of the cable cost differences? 138.38.154.9 (talk) 14:15, 14 January 2008 (UTC)

Sometimes it does but not always. Dvferret (talk) 00:38, 14 February 2008 (UTC)

HDMI Flashing Picture Problem

Digital Rights Management is partially achieved by detecting attempts to circumvent security established for a high-definition session between two devices. For example, an HDMI-jack-equipped DVD player sending a signal to an HDMI-jack-equipped TV over an HDMI cable involves a session during which the two devices continually exchange a security token, thereby permitting detection of the case where a session is hijacked and bits are diverted to an unsecured device. Well, it turns out sometimes this token does not arrive when or as expected. The DVD player and TV combination causes the picture to blink from picture-to-black-to-picture rapidly and continuously. This makes the program material unwatchable. The attempt by the electronics industry, working hand-in-hand with the large media companies, to protect copyright produces an effect that should only be seen by those who are attempting to steal high definition content. Probably because of different interpretations of the HDMI standard, incompatibilities arise that cause the all-or-nothing system to break down. Perhaps this issue deserves coverage in the wiki. Many users encounter this problem and do not know that its root cause is poor management of HDMI and DRM deployment. —Preceding unsigned comment added by 141.156.240.13 (talk) 12:06, 30 March 2008 (UTC)

GA Review

This review is transcluded from Talk:High-Definition Multimedia Interface/GA. The edit link for this section can be used to add comments to the review.

This article has met the Good Article criteria and has therefore been passed. Gary King (talk) 21:07, 27 June 2008 (UTC)

HDMI Switches

I think we also need a section that explains what an HDMI switch is and how it compares and relates to a receiver when plugged into the receiver's HDMI input port. —Preceding unsigned comment added by Gthiruva (talkcontribs) 05:54, 26 September 2008 (UTC)

HDMI CAble

Please can anybody confirm that an HD TV can be connected to a laptop through an HDMI cable?


Thanks SIMRAN SINGH —Preceding unsigned comment added by 59.178.48.69 (talk) 13:16, 1 November 2008 (UTC)

Specification defines an HDMI cable as having only HDMI connectors on the ends

Adaptor cables contravene the current HDMI spec, and may not be "allowed" to be sold? http://www.techradar.com/news/home-cinema/thousands-of-apple-hdmi-cables-must-be-withdrawn-976455 http://www.pcmag.com/article2/0,2817,2388289,00.asp http://mobile.pcauthority.com.au/Article.aspx?CIID=263280&type=News It looks like old news, but when I was trying to find out why HDMI-to-DVI cables are currently nearly impossible to obtain (in rural Australia, at least), I could find no useful information anywhere. If anyone is up to date on this, perhaps the subject would be worth a sentence or two? HuwG 203.208.123.81 (talk) 07:43, 29 October 2012 (UTC)

The requirement is that certified HDMI cables must have an HDMI plug connection on both ends of the cable. The press section of the HDMI website gives an explanation of that requirement, and it states that dongles that convert from a different cable type to an HDMI receptacle connection are allowed. While DVI-to-HDMI cables can't be certified, I still see them for sale on the internet, and at retail stores, so I don't think that HDMI Licensing is trying that hard to get rid of them. --GrandDrake (talk) 00:09, 25 May 2013 (UTC)

What is RedMere and how does it work?

I've heard of a technology called RedMere it is supposed to allow "up to 65 feet (20 meters) at the full 10.2 Gbps data throughput" (http://www.monoprice.com/products/product.asp?c_id=102&cp_id=10240&cs_id=1025501&p_id=9167&seq=1&format=2). What is it? How does it work? Does anyone know? — Preceding unsigned comment added by 68.228.41.185 (talk) 22:01, 19 December 2012 (UTC)

RedMere HDMI cables use active amplification, which allows them to have much longer cable lengths than passive HDMI cables. Active HDMI cables contain a small chip that boosts the HDMI signal. Active HDMI cables are usually more expensive than passive HDMI cables but are useful if you need an HDMI cable longer than 5 meters. --GrandDrake (talk) 00:14, 25 May 2013 (UTC)

Error in version comparison table (HDMI 2.0 refresh rates)

I am pretty sure that there is an error in the table listing the maximum resolution for HDMI 2.0 at different color depths. Specifically, I think anything more than 24 bits per pixel cannot be shown at 60 frames per second at 4K resolutions (e.g. 4096×2160p60 at 48 bits per pixel would take 4096*2160*60*48/1e9 = 25.5 Gbits/second excluding overhead, well in excess of what HDMI 2.0 can deliver).

News articles I have seen have said that 8 bit color (24 bpp) can be shown at 4k and 60 hz, and 48 bit color can be shown at 4k (but with an unspecified refresh rate). (e.g. see http://www.theregister.co.uk/2013/09/04/hdmi_20_spec_published/ )

I am not knowledgeable enough about this topic to make the appropriate changes, and it might be that more info will be forthcoming in the next few days. — Preceding unsigned comment added by 67.82.65.131 (talk) 00:51, 5 September 2013 (UTC)
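The back-of-envelope arithmetic in the comment above can be reproduced directly; a minimal sketch (the 14.4 Gbit/s figure assumes HDMI 2.0's nominal 18 Gbit/s bandwidth with 8b/10b encoding overhead removed, and blanking intervals are ignored for simplicity):

```python
# Uncompressed video data rate: pixels per frame x frames per second x bits per pixel.
def data_rate_gbps(h, v, fps, bpp):
    return h * v * fps * bpp / 1e9

rate = data_rate_gbps(4096, 2160, 60, 48)
print(round(rate, 1))  # 25.5, matching the comment's figure

# HDMI 2.0's 18 Gbit/s total bandwidth carries 8b/10b-encoded symbols,
# so only 18 * 8/10 = 14.4 Gbit/s is available for actual video data.
hdmi20_data_rate = 18 * 8 / 10
print(rate > hdmi20_data_rate)  # True: 48 bpp at 4K 60 Hz exceeds HDMI 2.0
```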

I removed some unverifiable information from the table. --GrandDrake (talk) 01:14, 5 September 2013 (UTC)

Will HDMI 1.4 cables work with HDMI 2.0 devices?

Will HDMI 1.4 cables work with HDMI 2.0 devices? Thanks — Preceding unsigned comment added by 71.131.3.194 (talk) 02:57, 28 September 2013 (UTC)

This isn't that clear. You simply cannot put 600 MHz where before only 300 MHz would work. However, you may not need to. Tafinho (talk) 22:05, 19 June 2014 (UTC)

HDMI 2.0 references

Back when I added a section about HDMI 2.0, I added some references to convince everyone it was real. But since equipment with HDMI 2.0 is actually on the market right now, do we really need four references per statement in the HDMI 2.0 section? PizzaMan (talk) 18:40, 16 July 2014 (UTC)

There are only two secondary sources. I have marked primary sources and raw press releases. Perhaps those should be removed. ~KvnG 03:07, 23 July 2014 (UTC)
Sorry for the late response, but I agree, so I went ahead and removed them. This article has a *lot* of sources. Seems unnecessary to me. It's not like the contents of this article are likely to come under such debate that they require multiple refs for each sentence. PizzaMan (♨♨) 22:20, 8 December 2014 (UTC)
On a sidenote: is it too lazy of me to rely on bots for handling orphaned references or, for example, dating a [citation needed] tag? PizzaMan (♨♨) 11:12, 9 December 2014 (UTC)

Connector inconsistency

In the lead it says "all use same cable and connector."

Later in connectors all the different connectors are described.

Surely both can't be true :-)

Rpm13 (talk) 12:00, 5 January 2015 (UTC)

Type A receptacle should be female?

In the box at the top right of the article it shows an HDMI pin-out with the label "Type A receptacle HDMI (male)" connector. Shouldn't that be "Type A receptacle HDMI (female)"? — Preceding unsigned comment added by 71.81.180.160 (talk) 20:08, 19 January 2015 (UTC)

Royalties?

As perfectly acceptable HDMI cables can be obtained from Pound Shops, it would appear that the manufacturers of these cables are not paying any royalties. Should I be concerned about this? — Preceding unsigned comment added by 89.243.167.3 (talk) 22:20, 2 March 2015 (UTC)

Expanded table

The feature table counts 6 columns, explaining differences between consecutive HDMI versions, but does not distinguish 1.4 from 1.4a or 2.0 from 2.0a. Shouldn't this table be expanded to, say, 14 columns? The Seventh Taylor (talk) 17:52, 26 May 2015 (UTC)

Critical omission in not mentioning severe limitations

This article comes off as a sales piece, since it is essentially silent on what makes HDMI so costly and difficult for consumers to use, and practically impossible to use over many distances needed in typical home uses. For example, HDMI is crippled in its ability to transmit over anything longer than just a few meters because it uses a parallel signal transmission method, instead of serial communications used in every other modern transmission protocol.

Until this article addresses all such significant consumer issues, I will continue to assume that firms with a financial interest in HDMI will continue to control the article content. — Preceding unsigned comment added by 67.171.190.119 (talk) 23:23, 20 May 2013 (UTC)

The limits of HDMI are due to the high data rate that it uses combined with the low cost signalling method. HDMI at full bandwidth works with passive cables at up to about 5 meters in length. DisplayPort has a higher data rate but at full bandwidth works with passive cables that are up to about 3 meters in length. Neither standard uses serial data transmission since that would either reduce the data rate or increase the price. --GrandDrake (talk) 04:44, 24 May 2013 (UTC)
You may want to have a word with the manufacturer of the cable that I use to connect my PC's HDMI port to my AV unit in my living room. The cable is 20 metres long. It seems that both the manufacturer and the cable are unaware of this limitation. DieSwartzPunkt (talk) 08:53, 24 April 2014 (UTC)
The maximum HDMI cable length depends on bandwidth, cable quality, and whether the cable includes a signal amplifier. --GrandDrake (talk) 19:17, 27 April 2014 (UTC)
I thought the same thing (article is a sales piece). The critical omissions are how much HDMI is prone to power surges, and bad image quality with the smallest interference around. Many articles around the web about those two points. Atriel (talk) 03:24, 19 March 2015 (UTC)
Have to agree with above, in that whenever I've tried to use HDMI to interface a device with a display there have always been problems. Like so many new designs, it's all features and no functionality. Issues range from loss of display resolution each time the lead is reconnected, to no signal at all. Also, you find nonstandard submin sockets that are misleadingly labelled 'HDMI' but after buying a costly HDMI lead you find it will not fit the 'HDMI' socket. DVI connections almost always work first time, HDMI is a disaster. Maybe reliability reports are outside the scope of WP (not sure) but then this article makes it sound far better than it is. --Anteaus (talk) 09:04, 30 May 2015 (UTC)

My hope was to learn some technical details of the specification. It appears, but is not 100% clear from this article, that to even read the specification you have to be a member of the forum ($15 grand sign up?!). It would be helpful if this was spelled out up front one way or the other, so that people didn't waste their time looking for information that does not exist. — Preceding unsigned comment added by 184.71.25.218 (talk) 21:20, 23 October 2013 (UTC)

Cost

> HDMI manufacturers pay an annual fee of US$10,000 plus a royalty rate of $0.15 per unit, reduced to $0.05 if the HDMI logo is used, and further reduced to $0.04 if HDCP is also implemented.

No one should pay Asia/Europe a dime for going lone star unless they first confirm "HDMI" is making payments to VGA, DVI, VESA et al., who they copied the bulk of their work from.

Making several incompatible connectors and versions: all they have is a cable style and a suggested use which is no better or worse than DVI-D, beyond that of designating certain uses "packet headers", such as when data will be considered audio. — Preceding unsigned comment added by 72.209.223.190 (talk) 18:19, 16 July 2015 (UTC)

History of Digital Audio/Video

I removed a section which was added by an unknown user, which was completely misplaced in the article, containing no references at all, and was of overall low quality. Soulhack (talk) 10:12, 21 July 2015 (UTC)

External links modified

Hello fellow Wikipedians,

I have just added archive links to 2 external links on HDMI. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 22:39, 6 January 2016 (UTC)

1536kHz Audio

Although the press release indicates a sampling rate of 1536 kHz, this must be an error. The highest sampling rate I've ever heard of is 192 kHz, and that's "audiophile" crazy-high quality that's 4x as high as the human ear can even deal with. 1536 kHz is an order of magnitude higher than that! — Preceding unsigned comment added by 199.20.36.1 (talk) 16:53, 11 August 2014 (UTC)

That's 7.1 channels of 192 kHz each. PizzaMan (♨♨) 22:21, 8 December 2014 (UTC)
Then it's 1536 kHz "symbol rate", not "sample rate" - which we don't really add up for separate signals. A clear case of carefully crafted "marketese language" from the HDMI Association. --128.68.48.32 (talk) 23:05, 13 January 2016 (UTC)
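The decomposition PizzaMan describes is trivial to check (assuming "7.1" means 8 discrete channels, each sampled at 192 kHz):

```python
# Aggregate sample throughput quoted in the press release is the
# per-channel sampling rate times the channel count, not a per-channel rate.
channels = 8      # 7.1 audio = 7 full-range channels + 1 LFE channel
rate_khz = 192    # per-channel sampling rate in kHz
print(channels * rate_khz)  # 1536
```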

External links modified

Hello fellow Wikipedians,

I have just added archive links to one external link on HDMI. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 13:46, 9 February 2016 (UTC)

input/output bi-directional usage

What "reverse channel" features does HDMI offer? Can an HDMI display also include microphone audio input to a computer, for example? Can the display send back touch/multitouch data via HDMI? If such is possible, what are most common current examples and standards?-71.174.184.36 (talk) 13:43, 16 March 2016 (UTC)

seems 2.0b is launched, but what's going beside 2.1?

Nvidia GP104/GeForce 1080 launch sheet data (click VIEW FULL SPEC) shows support for HDMI 2.0b with DP 1.4, and the 2.0b page from HDMI LLC says "Bandwidth up to 18Gbps". So, is 2.0b the same as the (former) 2.1? --Hong620 (talk) 05:15, 7 May 2016 (UTC)

good question, what is going on here? — Preceding unsigned comment added by 193.107.156.62 (talk) 13:43, 6 June 2016 (UTC)

Official 2.0b?

If there is an official HDMI 2.0b, can someone with access and savvy add the features that differentiate it to the article? Thank you! (HDMI 2.0b) Misty MH (talk) 11:06, 27 July 2016 (UTC)

Consoles

Do some consoles use HDMI? — Preceding unsigned comment added by 172.56.41.196 (talk) 15:47, 19 February 2016 (UTC)

What kind of console? Misty MH (talk) 11:07, 27 July 2016 (UTC)

USB / HDMI Alt Mode

There is a new USB Type-C to HDMI spec. Although it's "not backwards compatible with existing devices" and "only supports the older HDMI 1.4b standard" so "4K (UHD) video will work, but only at 30FPS" 40.129.236.30 (talk) 06:32, 6 September 2016 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified 6 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 08:12, 10 November 2016 (UTC)

Digital or analog?

Among the very first words in the lead is "is a proprietary audio/video interface", without even a mention of whether it's a digital or analog interface. While I _think_ it's a digital interface, suggesting "is a proprietary digital"... would be correct, I don't know. I'd appreciate it if someone knowing HDMI could fix the lead. 83.252.234.134 (talk) 08:02, 2 December 2016 (UTC)

HDMI 2.1 packetization and embedded clock

Regarding, "The sources suggest that the new mode isn't really TMDS any more; we will need to wait until the official specification to clarify the details. (The packet-based format is to facilitate the embedded clock; it's not a separate point.)" [2] by C0nanPayne (talk):

I guess by 'packetization' you are mentioning these references:

Reference 127, "HDTV Expert - HDMI 2.1: The Need For Speed Continues". hdtvmagazine.com: "But HDMI 2.1 adds another lane for TMDS data (although it’s not really TMDS anymore) by taking over the clock lane and embedding clock data within the existing signal, much the same way it’s done with packet-based signaling systems."

Reference 128, "HDMI 2.1 To Bring Robust Home Theater Experience". hdguru.com: "The connector includes three twisted pairs and a clock – which translates to four twisted pairs but sending basically RGB or Y and Cb and Cr. HDMI 2.1 can be run in inverted clock mode, which uses all four lanes and is packetized – This is said to be similar to though not the same as DisplayPort."

DisplayPort and MHL use certain techniques to derive the data clock from the data stream itself. They don't use a separate clock lane (except in MHL 1.3); there are only data lanes. Numerous digital coding methods -- on data -- enable them to do this.

Until HDMI 2.0, each of the 3 data lanes carried one of the 3 colour components (e.g. Blue on the TMDS D0 channel) and this was a fixed mapping. But when you have fewer than 3 data lanes or more than 3 data lanes the fixed mapping cannot be used any more and data has to be sent as fragments over the lanes. They use packetization to flexibly partition and distribute the AV data over the data lanes. This kind of packetization is used in MHL/superMHL and DisplayPort.

They probably use both embedded clock and packetization on HDMI 2.1. But, embedding of clock signal in data lane is a separate point from packetized structure of data stream.

louisnells (talk) 05:32, 12 February 2017 (UTC)

My point was to aid the reader in understanding that the packet-based format is how the clock is able to be embedded (in addition to being a means to distribute the three streams over a non equal number of lanes). Otherwise the reader is left with the impression that packetization is just another unrelated change, something like 16b/18b. (Both references talk about the embedded clock in the context of packetization.)
Given that DisplayPort explicitly makes use of data packets (called micro packets) to embed the clock signal within the data stream, how does making it a standalone point help explain it? C0nanPayne (talk) 17:27, 12 February 2017 (UTC)
Packetization does not always imply embedding of the clock signal. For example, with superMHL there are 6 data lanes and they use packetization, but they still have a separate clock line (they use eCBUS as the clock line). Packetization and embedded clock are just two separate techniques. One does not imply the other. Packetization works at the data level and embedding of the clock signal works at the line-coding level (like they are in 2 different layers).
louisnells (talk) 08:30, 13 February 2017 (UTC)

Specifications

The link to HDMI 1.3a Specifications is redirected to Uncompressed video. 64.47.214.68 (talk) 08:24, 7 March 2017 (UTC)

HDMI Alternate Mode Support

It seems that HDMI is listed along with USB-C as one of the supported display interfaces for the Qualcomm Snapdragon 835. HDMI was supported on many of the earlier Qualcomm chips. Also, Qualcomm recently officially confirmed support for USB-C DisplayPort Alt Mode on the same chip. Doesn't this mean USB-C HDMI Alt Mode is also supported? Does anybody have any more info on this?

File:Qualcomm Snapdragon 835 HDMI Alt Mode.png
Qualcomm Snapdragon 835 HDMI Alt Mode

louisnells (talk) 09:07, 6 June 2017 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified 6 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 22:06, 18 June 2017 (UTC)

HDMI 1.4 and CEA-861-D vs CEA-861-E

The article says HDMI 1.4 uses the CEA-861-E video standard while referencing a SiI9389 datasheet that mentions HDMI 1.4 and CEA-861-E compliance. However, I'm reading the actual HDMI 1.4b specification and it makes no mention of CEA-861-E, but talks about CEA-861-D... Anssi (talk) 00:02, 31 October 2013 (UTC)

It requires support for all formats defined in CEA-861-D, and requires support for some but not all formats listed in CEA-861-E. The specific formats are listed in section 6.3 of the HDMI 1.4/a/b Specification, headed with "Detailed timing is found in CEA-861-D or a later version of CEA-861 for the following video format timings:". Since some of the formats in the list that follows (such as 1080p 120 Hz in sec. 6.3.2) do not have timings defined in CEA-861-D, only in CEA-861-E and later, it does implicitly require the CEA-861-E standard to be referenced even though it is not mentioned by name. I suspect this line was simply overlooked when modified from the HDMI 1.3a document, and probably should have said "...found in CEA-861-E or later" explicitly, but it still works as-is. GlenwingKyros (talk) 16:54, 19 August 2017 (UTC)

CEC

In the article: "CEC wiring is mandatory, although implementation of CEC in a product is optional.[49]"

In the HDMI 1.3 specification: "HDMI includes three separate communications channels: TMDS, DDC, and the optional CEC."

Some cables don't have a CEC wire. — Preceding unsigned comment added by Vagonoboev (talkcontribs) 07:43, 8 April 2015 (UTC)

You are referencing Section 8 of the HDMI specification, which is not defining cable specifications. As defined in Section 4 (physical layer and cable specification), the physical CEC wire is not optional. The implementation of the CEC channel in a product is optional (i.e. not all products are required to make use of the CEC wire, though all cables have it). GlenwingKyros (talk) 17:00, 19 August 2017 (UTC)

What connector for 2.0 and 2.0a cable?

The subtitle of this section is self-explanatory. The 2.0 cable has existed (for customers, anyway) since 2013, but in the section on connectors nothing is said about which connector it will connect to. 145.132.75.218 (talk) 21:13, 15 January 2016 (UTC)

HDMI versions and connectors are not related, any HDMI version may be transmitted over any connector type. GlenwingKyros (talk) 17:02, 19 August 2017 (UTC)

HEC verification

I have marked verification issues in HDMI#HEC. See further discussion at Talk:Ethernet_physical_layer#HDMI_Ethernet_Channel.3F. ~Kvng (talk) 15:39, 17 September 2017 (UTC)

Table on "Refresh Frequency Limits at Various Resolutions"

In the table there is the column "Data Rate Required[a]", but with no explanation of how this was calculated. Second, it was calculated for 8 bpc (24 bit/px), but the future standard will be HDR, which has 10 bpc. Could somebody correct this or add a second column? Thanks! -- 84.157.27.64 (talk) 18:46, 1 September 2017 (UTC)

Yesterday I reverted a change related to refresh rates, as HDMI 1.0 allows only specific rates, and so those should be stated. But data rates are a different question. Seems to me that in any case including compression, it isn't easy to give a meaningful data rate. Data rate is supposed to be an information rate, but is it before or after compression? More important is the required bandwidth for the signals, but even that isn't easy to define. Gah4 (talk) 18:58, 1 September 2017 (UTC)
More specifically, HDMI uses TMDS. That is only useful if, in fact, the bits are not random, such that the information rate is lower than the data rate. Specifying the clock rate for the TMDS signal might be useful, though, but that is somewhat different from the data rate. Gah4 (talk) 19:05, 1 September 2017 (UTC)

Calculation of datarate (bits per second) is simply the number of bits per pixel (24 in this case) times the number of pixels per second (which is number of pixels per frame times the number of frames per second). This must also include blank pixels padded around the image (or in actuality, pauses between sending pixel data between each frame and line, during which audio and control signals are sent, which is accounted for by pretending there are extra pixels which correspond to the amount of time spent on this "auxiliary data" period; these are referred to as "blanking intervals"). Standard video is uncompressed so there is no "before or after compression". HDMI 2.1 does introduce a compression option but since these data rates span all versions it should be clear that they do not have compression applied. I will edit the footnote to make this clear when I get home though.

As noted, the size of the blanking intervals used for calculating the numbers in the table are determined by CVT-R2, which is defined in the VESA CVT 1.2 standard, which is publicly available for download on the VESA website (on the Free Standards page). Though for simplicity you can use the calculator here, which can walk you through all the calculations step by step:

https://linustechtips.com/main/topic/729232-guide-to-display-cables-adapters-v2/?section=calc&H=1920&V=1080&F=60&calculations=show&formulas=show

It is not technically exact since it does not account for pixel clock rounding, but that's well below the precision level used on the table (10⁴). The formula shown on the page is also somewhat simplified although internally it uses the more complex exact formula for the actual results.

Datarate is used here and not bandwidth, since bandwidth requirements are dependent on the interface as different interfaces may need different amounts of bandwidth to send the same data. It would not make sense to list the bandwidth requirements since HDMI 1.0–2.0 use 8b/10b and HDMI 2.1 uses 16b/18b, so you would need two different bandwidth numbers for every format. It makes more sense to simply list the data rate, and it can be compared to each interface's datarate (bits of data per second, which excludes bits used for encoding purposes, as they don't represent part of the data set being transmitted) as listed in the table header, and in more detail in the table above.

I am adding another table for HDR formats soon, but am out of town at the moment. I believe putting them both in the same table would make it more difficult for readers to find the information they are looking for, so a separate table should be used in my opinion. GlenwingKyros (talk) 03:17, 3 September 2017 (UTC)

Also, higher resolutions are achieved using 4:2:2 chroma subsampling, as some initial UHD TVs did, and it can also be used to support 4K HDR10. Tafinho (talk) 16:10, 19 September 2017 (UTC)

Yes, although as mentioned in the footnotes already the table is only considering 4:4:4 formats. If we are going to include subsampling then we may as well also start saying HDMI 1.4 supports 4K 60 Hz in the previous table (because of YCbCr 4:2:0) etc., but this I think would only lead to confusion. GlenwingKyros (talk) 18:16, 19 September 2017 (UTC)

HDMI is being superseded by DisplayPort?

HDMI is owned and maintained by the HDMI Founders and the HDMI Forum together, and neither of these groups has ever agreed with VESA (which maintains DisplayPort) to obsolete HDMI or to have it superseded by DisplayPort. Does anyone have any official info on this?

HDMI and DisplayPort are rather two competing display interface standards, with each one going ahead with its own road map. louisnells (talk) 16:53, 2 October 2017 (UTC)

"HDMI is the de-facto connection in the home theatre and is used widely on HDTVs as an A/V interface. Some PCs and monitors include HDMI to enable connectivity with HDTVs and other consumer electronics gear. While DisplayPort has a rich consumer electronics feature set, it is expected to complement and not necessarily replace, HDMI. DisplayPort is focused on PC, monitor, and projector usages as a replacement for DVI and VGA where high performance and backwards and forwards compatibility over standard cables are valued." https://www.displayport.org/faq/faq-archive/ GlenwingKyros (talk) 17:57, 2 October 2017 (UTC)
I think I agree that HDMI is not being superseded by DisplayPort, but then again, am I sure that HDMI has superseded DVI and VGA? Did they ask/tell the DVI and VGA people about that? How far along the superseding does it need to be, before we state it here? Gah4 (talk) 19:49, 2 October 2017 (UTC)
(Edited) Now that I think about it, "supersede" isn't really an applicable term for any of these situations, since the usage of the term means for one thing to have the "final say" over the other; for example HDMI 1.4 supersedes HDMI 1.3, meaning that for any differences between them, whatever is written in the HDMI 1.4 spec takes precedence over 1.3. Neither DisplayPort nor HDMI can supersede the other, because they don't cover the same domain; the DisplayPort standard does not define how to create HDMI devices, and the HDMI specification does not define how to create DisplayPort devices, so neither of them has any material which can supersede what the other says. I think the phrase people are looking for is "HDMI/DP is the successor to DVI/VGA" or similar. As for DP being the successor to HDMI, I do not agree with this either, because development of HDMI has not been abandoned in favor of DP, or vice versa; both of them are still actively publishing new revisions. Contrast that with DVI, whose creating organization has disbanded, or VGA, which is certainly not going to have new revisions published any time soon either. These have been succeeded by HDMI and DP. GlenwingKyros (talk) 20:46, 2 October 2017 (UTC)
When you look at it, competing standards are not much different from competing companies: they may either more or less co-exist or eventually one wins over the other, pushing it from the market. Also, a standard may be improved to a new functionality, obsoleting the former version. Only the latter two can be called supersession. Even though HDMI may have lost share in the computer market it's still competing and it's very strong in the media market and very unlikely to be pushed from the market. --Zac67 (talk) 09:15, 3 October 2017 (UTC)
Many of the technologies (e.g. TMDS) which made DVI/HDMI/MHL possible came from -- and are even owned by -- Silicon Image Inc. They were also part of the DVI working group (DDWG). Even HDMI had electrical compatibility with DVI defined as part of its core specification -- both are TMDS-based anyway. So, probably they were OK with HDMI superseding DVI.
But I'm not sure of the current ownership of VGA, or whether it has been superseded. In any case the technology is largely obsolete, and newer standards satisfy the newer display interface requirements. louisnells (talk) 17:21, 4 October 2017 (UTC)
Seems to me that once an improved standard (superseder or successor) starts to gain market share, it will be used for new sources. Sinks will start using it, but also stay compatible with previous systems while those are still around. Also, the older ones stay around a while for cases that don't need the advantages of the new one. Low-resolution video often still uses composite video (for TV inputs), and VGA is still common for video projectors, such as those used for seminar presentations. (The latter also because of the complications of long distances for HDMI.) For all these reasons, old technology stays around for a long time. Gah4 (talk) 18:08, 4 October 2017 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified 12 external links on HDMI. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 06:02, 27 October 2017 (UTC)

Ultra Wide resolution support on HDMI

Can somebody please clarify the ultra-wide support of HDMI?

HDMI 2.0 seems to have official support for ultra-wide resolutions (e.g. 2560x1080, 3440x1440, ...). But can someone clarify: 1) In which version (1.4?) ultra-wide timings were officially added. 2) Which resolutions were supported (1080p, 1440p, ...) in each version. louisnells (talk) 14:32, 3 November 2017 (UTC)

Explicitly defined ultrawide formats were added in HDMI 2.0 (via CTA-861-F). Prior to that, all ultrawide formats were implemented as vendor-specific formats. Keep in mind that HDMI "adding support" for formats does not mean adding the capability as if it were incapable of displaying those formats before; HDMI can run at any arbitrary resolution and refresh rate combination within the bandwidth limit, which is how new custom formats such as the first 2560×1080 monitors can be created with existing HDMI standards. "Adding support" for 21:9 ratio means adding supporting material (explicitly defined timings and formats) to promote interoperability, but HDMI does not require "support" to be added to be able to display a particular format. (Based on your wording you may already understand this; I write it just in case.) GlenwingKyros (talk) 21:53, 3 November 2017 (UTC)
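As an arithmetic illustration of that point, here is a quick check of a custom 2560×1080 format against HDMI 1.4's limit (the 2640×1111 total frame size is an assumed CVT-RB v2-style timing, for illustration only):

```python
# A custom 2560x1080 @ 60 Hz format with CVT-RB v2-style blanking
# (2640x1111 total pixels per frame, assumed here for illustration):
pixel_clock = 2640 * 1111 * 60        # ~176.0 MHz
data_rate = pixel_clock * 24          # bit/s at 8 bpc, ~4.22 Gbit/s

# HDMI 1.4's 8.16 Gbit/s data rate accommodates it, which is why such
# monitors could ship before CTA-861-F explicitly defined the format.
print(data_rate < 8.16e9)  # True
```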

obscure formats

As noted in the article, HDMI 1.0 and 1.1 allow specific video formats. These should stay in the table. Later versions allow for any format compatible with the frequency limits. There is, then, no need to add specific obscure formats. Entries in the table at close to the maximum frequency for each version are probably useful. Gah4 (talk) 00:06, 14 November 2017 (UTC)

600 MHz Source and existing cables compatibility

There is no source for the 600 MHz TMDS frequency value. Also, there is no independent source for the compatibility of existing 1.4 cat 2 cables, apart from the HDMI Forum. Tafinho (talk) 22:06, 19 June 2014 (UTC)

Several sources mention 18 Gbps for HDMI 2.0 so based on how HDMI works that would mean a TMDS clock rate of 600 MHz. I have added two references for the TMDS clock rate. As for compatibility with older cables it would depend on the cable but almost all HDMI cables are made out of copper cable. As long as the bits are correctly transferred over the copper cable it doesn't matter if the cable was designed for it. --GrandDrake (talk) 21:31, 28 June 2014 (UTC)

To this topic: the article says "Category 2-certified cables, which have been tested at 340 MHz" and that HDMI 2.0 introduces 600 MHz. So how should one go about purchasing a cable? Blindly test many until one works? Is there an unofficial label used by cable manufacturers? --193.169.48.48 (talk) 13:51, 19 February 2015 (UTC) (the above was by me)--Xerces8 (talk) 09:44, 22 February 2015 (UTC)

The HDMI FAQ explicitly states that "HDMI 2.0 specification defined a new, more efficient signaling method, for speeds above 1.4b limits (10.2Gbps), to allow higher bandwidths (up to 18Gbps) over existing High Speed HDMI Wire Cables." and (further down on the page) "HDMI 2.0 features will work with existing HDMI cables. Higher bandwidth features, such as 4K@50/60 (2160p) video formats, will require existing High Speed HDMI cables (Category 2 cables)."
In other words, "premium high speed" is the same category-2 ("high speed") cable that HDMI 1.4 uses. The name is for a marketing/certification program, not a new cable specification. HDMI 2.0 is not sending 600 MHz but is encoding more data (18 Gbps) onto the same frequencies (up to 340 MHz).
Category-3 cable is the "Ultra High Speed" cable introduced as a part of HDMI 2.1:
Q: What is an Ultra High Speed HDMI Cable?
A: The Ultra High Speed HDMI Cable is the first cable defined by the HDMI Forum. Ultra High Speed HDMI Cables comply with stringent specifications designed to ensure support for high resolution video modes such as 4Kp50/60/100/120 and 8Kp50/60 as well as new features such as eARC and VRR. Ultra High Speed HDMI Cables exceed the requirements of the latest international EMI standards to significantly reduce the probability of interference with wireless services such as Wi-Fi.
Q: Is the Ultra High Speed HDMI Cable a Category 3 cable?
A: Yes
Shamino (talk) 17:57, 26 December 2017 (UTC)
I also found that the HDMI 2.0 spec states that the TMDS clock rate is 1/4 the TMDS character rate for character rates greater than 340 Mcsc (they are equal for rates of 340 Mcsc or less). This is how they pack 18 Gbps onto a 340 MHz TMDS clock. Shamino (talk) 18:40, 26 December 2017 (UTC)
That is only a technicality, as the clock signal does not carry the data; that's not where the 18 Gbit/s resides. The data channels still operate at 6 GHz as opposed to 3.4 GHz, which creates compatibility concerns, and whether the clock runs at 600 MHz or 150 MHz only changes whether the clock-to-data signal ratio is 10:1 or 40:1. Premium High Speed cables are certified at 18 Gbit/s; High Speed cables are only tested at 10.2 Gbit/s. They are different certifications. The HDMI FAQ stating that all certified High Speed cables will work at 18 Gbit/s was written before the Premium High Speed cert existed, and was found to be wrong (that's why the Premium High Speed cert had to be created in the first place).
http://www.bluejeanscable.com/articles/note-about-hdmi-2.htm
http://www.bluejeanscable.com/articles/premium-hdmi-cable.htm
GlenwingKyros (talk) 19:46, 29 December 2017 (UTC)
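The relationship between bandwidth, character rate, and clock discussed in this thread can be sketched as follows, assuming 3 data channels and 10-bit TMDS characters, plus the HDMI 2.0 clock rule Shamino quotes:

```python
def tmds_rates(bandwidth_bps, channels=3, bits_per_char=10):
    """Per-channel TMDS character rate and the corresponding clock rate.

    HDMI 2.0 rule: the clock equals the character rate up to 340 Mcsc,
    and 1/4 of the character rate above that."""
    char_rate = bandwidth_bps / channels / bits_per_char  # characters/s
    clock = char_rate if char_rate <= 340e6 else char_rate / 4
    return char_rate, clock

print(tmds_rates(10.2e9))  # HDMI 1.4: 340 Mcsc character rate, 340 MHz clock
print(tmds_rates(18.0e9))  # HDMI 2.0: 600 Mcsc character rate, 150 MHz clock
```

The 150 MHz clock against data channels signaling at 6 GHz is the 40:1 clock-to-data ratio mentioned above.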

Ultra High Speed HDMI TMDS clock

Does anyone have a source for the maximum TMDS clock rate for HDMI 2.1? The existing article says 1.2 GHz, but doesn't provide a source for the number. The text assumes that the TMDS clock rate equals the TMDS character rate (1.2 GHz for 12 Gbps), but this is probably not correct given the fact that HDMI 2.0 uses a TMDS clock rate of 1/4 the TMDS character rate for character rates over 340 Mcsc. It is highly likely that HDMI 2.1 will either use the same divider or possibly a higher ratio. Until someone can check the HDMI 2.1 spec, I don't think we can make any assumptions here. Shamino (talk) 18:52, 26 December 2017 (UTC)

Hi; thanks for adding the HDMI 2.0 source.
The information of interest is really the amount of data per channel (3.4 Gbit/s per channel in 1.4, 6.0 Gbit/s per channel in 2.0, etc.); the only reason we use TMDS clock is because that's what the HDMI people use, and it helps the page be understandable to readers if we discuss things in terms of the same specifications as other sources. The assumption is based on the HDMI Forum's own statements, the same as it was for HDMI 2.0's 600 MHz TMDS clock, but it appears they have just been giving an "equivalent" figure so people don't get confused. Since this is the case I think it would be better to abandon the TMDS clock notation going forward and just use data rate per channel anyway. TMDS clock seems to carry very little meaning now.
Also in regards to the "Mcsc" thing, this is the HDMI Forum's own custom abbreviation, but it shouldn't be used here; the correct unit for character rate is baud ("Mbaud", "Gbaud", etc.). If you want to say we should use "Mcsc" because that's what the HDMI Forum uses, I would also point out the HDMI Forum uses "msec" for milliseconds, but we don't use that either (EDIT: Well, HDMI Forums seems to have switched over to the conventional "ms", but HDMI Licensing used "msec" and we ignore it). Making up their own abbreviations is something they do a lot, Mcsc is just the latest example. This page doesn't strictly follow the terminology established by the HDMI people, we use more standard notation as again it helps it be relatable to other pages, and more understandable. I'm wrong, was taking "character rate" as meaning "symbol rate". Either way, considering that "Mcsc" is terminology only introduced with HDMI 2.0, which is a confidential document not really available to the public (even the original source no longer exists, the cited one is to the internet archive) I'm not a fan of the special notation. I gather it's to distinguish TMDS clock from the rate at which 10 bit characters are transmitted which is no longer 10 bits per TMDS clock, but since TMDS clock in MHz is the conventional notation (and even HDMI Licensing describes HDMI 2.0 as "600 MHz pixel clock" everywhere outside the specification) in the interest of clarity it would probably be better to use the common notation. Perhaps rename the table heading to "Effective TMDS clock" or something like that. Like I said, it's not the first time the HDMI Specification has come up with its own special notation which has not been replicated here. GlenwingKyros (talk) 19:56, 29 December 2017 (UTC)

I understand. Given all of this, perhaps it makes sense to restrict discussion of TMDS clock to the section discussing TMDS, and remove it from other locations (as you did from the table), since it is really unrelated to cable bandwidth - it's only for the devices to sync to each other. The character rate is also probably superfluous, since it is effectively a derived value - bit-rate * 10 / channels. Discussion about analog frequencies that must be carried by cables may be of interest to readers, but for now (at least until HDMI decides to use something like QAM) this is equal to the bit rate for a single channel. It is telling that HDMI certification (at least the docs I've seen) seems to talk entirely in terms of bit-rate and not frequencies. Shamino (talk) 15:15, 2 January 2018 (UTC)

Agreed, I considered doing that when I created the tables, the only reason I decided to leave things in terms of TMDS clock was because that's the specification that the HDMI people use to describe speeds, and I wanted to maintain that consistency to make it more understandable. I would prefer to leave the "340 MHz" and "600 MHz" listed somewhere on the table for that reason; it may be confusing to people if the spec that the HDMI people use is not listed on the table. A table row for bitrate per channel or data channel signal frequency could be a helpful intermediary step to make it more clear for people who are familiar with the 340 MHz number, to see how that translates to 3.4 GHz per channel (via the 10 data channel signals per one clock signal) and then from there to 10.2 Gbit/s bandwidth (via x3 channels). Of course, it's a bit of a mess using character rate since the 600 MHz number doesn't really have any physical significance, it's just an "equivalent conversion" to make it easier to compare to other versions... Would make more sense to just use bitrate instead as you say, but that's not what the HDMI people did, so... here we are :) GlenwingKyros (talk) 18:13, 2 January 2018 (UTC)

"4K" is the term coined by the DCI to describe a horizontal resolution of 4096 pixels.

The term refers to that standard.

Television marketing folks decided that they liked the term so much that they incorrectly applied it to their own sub-4K televisions, creating ambiguity, and more importantly, making it much harder for the competing DCI to market their products.

As Wikipedia is an encyclopedia, and not an advertising brochure, it's inappropriate to promote advertising terminology, especially when it's ambiguous and misleading. The precise term for the specification of these higher-resolution televisions is UHD-1, which refers specifically to the 3840 x 2160 pixel resolution.

InternetMeme (talk) 12:55, 1 February 2018 (UTC)

No. That is simply factually inaccurate. "4K" is not a term that was coined by DCI. It is not, and has never been, used solely to refer to that standard. That is something completely made up by consumer "tech" journalists when 4K TVs were first coming to the market, and they were scrambling to be the first to write a "4K EXPLAINED" article with a few minutes of Google research to back it up. Needless to say, they got it completely wrong.
In reality, "4K" and "2K" are used (and have always been used) as generic terms in the cinema industry, to refer to resolutions within a certain class, sometimes referred to in a longer form such as "4K x 2K" or "4K x 1K". I'm sure you've heard that before. They do not refer to exact resolutions, and "4K" is not some "official name" for some specific resolution. Here are some examples:
So no, "4K" is not a term created by DCI. And television marketing folks did not decide on their own to refer to 3840×2160 as "4K", it's called that in the official press release from ITU when the standard defining UHDTV resolutions was published: https://www.itu.int/net/pressoffice/press_releases/2012/31.aspx#.WAJplugrKCp
So as I said, the whole "3840×2160 isn't 4K, that's "UHD"! REAL 4K is 4096×2160!" is completely made-up by consumer journalists, and has no basis in reality. And such misinformation certainly has no place being proliferated here.
(EDIT: Also, UHD-1 is the name of DVB's plan to roll out 4K resolution to broadcast television. It isn't a name for 3840×2160 established by ITU in the definition of UHDTV, they simply refer to it (them) as the 3840×2160 and 7680×4320 systems of UHDTV, or more colloquially as the 4K and 8K UHDTV systems, as noted above.
In regards to being vague, there are also numerous references to 5K, 8K, 10K, etc, which apparently you have no problem with, so I don't see why 4K alone would need to be changed. Exact resolutions aren't needed at all times. When the specific resolution is important, the resolution is listed in parentheses next to the term 4K, in most other places I don't see a particular need to be specific about the resolution, the general class of resolution is all that is needed to make the point.)
GlenwingKyros (talk) 18:41, 1 February 2018 (UTC)

Citations do not support the claim

Article claims: "Conversion to dual-link DVI and component video (VGA/YPbPr) requires active powered adapters.[182][189]"

But notes 182 & 189 say nothing about Component at all. They concern DisplayPort. Can the editor provide proof of this claim? I received with a device I bought a cable which has HDMI on one end & 3 component cables on the other end, which I believe is intended to change HDMI to Component. But from where would power come? I don't see any circuitry inside the adapter; I suppose it could be tiny. (PeacePeace (talk) 01:19, 27 July 2018 (UTC))

HDMI pin 18 provides +5V which could be used to power conversion circuitry. However, this does not mean that the claim is correct. Verbcatcher (talk) 03:17, 27 July 2018 (UTC)
Which claim are you challenging, that HDMI -> Component requires active adapters in general (i.e. a signal conversion process), or just the claim that the adapters always require power? The claim that a signal conversion from HDMI is necessary is difficult to source; it is true simply as a consequence of the fact that the HDMI specification doesn't define any such capability; devices that are designed to meet the HDMI specification will not have the ability to output analog component signals. Of course though, there is no specific passage we can cite which talks about what isn't in the HDMI specification. The claim that adapters require power is technically true at all times, though trivially so; any DAC circuit will require some amount of power to operate. However, to most people, "powered adapters" implies an external power connector, so that statement may be misleading in that sense. HDMI provides inline 5 V power, but only in very small amounts (max 0.055 A of current, or 0.275 W), but that's intended for reading the EDID of a powered-down sink device. I have no idea if that would be enough for powering a simple DAC circuit, but if it can power an EDID chip, maybe it can power a DAC chip. GlenwingKyros (talk) 17:47, 30 July 2018 (UTC)
I don't know if I have one, but I think there are HDMI to VGA adapters with built-in DAC, and powered by the HDMI source.[1] Gah4 (talk) 18:45, 30 July 2018 (UTC)

References

  1. ^ "HDMI to VGA adapter". amazon.com. Retrieved 30 July 2018.

New Section Needed on Circuitry

After looking through this article (lots of words), I did not see any explanation of the circuitry; schematics. Is the HDMI cable itself a passive device -- just wires with some impedance running from connector to connector? Or is there circuitry embedded inside the cable which is fed with a power source & does things?

Where is the HDCP circuitry? Is it in the device which puts out an HDMI signal? Where is the response circuit? -- in the device which receives the signal? I note that I can run an HDMI signal out of a DVR to a component converter box which changes HDMI to component, then run component cables to a monitor & see a picture on the monitor. How does the handshake take place? But if I run the same HDMI signal to the same converter box, then run component cables from the converter box to another DVR that has component input, the component input DVR receives no signal apparently. Can someone clarify this mystery? What circuits located where are running the HDCP? (PeacePeace (talk) 02:18, 10 August 2018 (UTC))

Maximum Limits Tables

I have reverted edits which added a second column of data rates for CEA-861 timings and removed some formats.

The purpose of the maximum limits table is to list the maximum that HDMI is capable of, within the best of our ability to estimate (since timings can of course be customized by the manufacturer to an extent which differs with every display, so no exact numbers are possible). CVT-R2, which is the lowest-overhead standardized timing formula, gives the closest estimate of what HDMI is capable of. The point of the data rate is to give a general idea of how close each format comes to the maximum data rate of each HDMI version. The data rates aren't listed simply for the sake of informing people about data rates, so listing additional data rates with larger timings like CEA-861 serves no purpose and only adds confusion.

30 Hz format listings are relevant for computer monitors to inform people that HDMI version X is limited to 30 Hz at that resolution, rather than leaving it ambiguous. 30 Hz formats are encountered with high resolution computer monitors such as 1440p monitors with 165-MHz-limited HDMI ports, or 4K monitors with only HDMI 1.4 ports, etc. These are problems that people encounter plenty often with real world computer monitors when they don't research their connection interfaces enough. The point of listing them here is that this article is often a source for such research, so providing the information here will help people do research and avoid these sorts of problems.

HDMI 2.0 only supports 4K 60 Hz RGB/4:4:4 when using 8 bpc color depth. When using 10 bpc color depth (which is required by HDR) it can no longer achieve 60 Hz, and the maximum is 50 Hz. The reason 4K 50 Hz is listed in the HDMI table is so that people may read this page wondering if HDMI 2.0 can do 4K 60 Hz HDR, which is a subject of common interest, and be able to immediately identify that the maximum is 50 Hz unless subsampling is used. However I do agree that 5K 50 Hz and 8K 50 Hz are largely irrelevant at the moment and so I have not reinstated those.GlenwingKyros (talk) 04:42, 2 November 2017 (UTC)

But why doesn't HDMI 2.0 support 4K 60 Hz RGB 4:4:4 10 bpc? 15.x Gbit/s < 18 Gbit/s. [1] 2003:E5:B718:18ED:2170:82C0:4772:C6AB (talk) 20:50, 30 November 2018 (UTC)

15 Gbit/s exceeds the maximum data rate of HDMI 2.0, which is 14.4 Gbit/s. The *bandwidth* is 18.0 Gbit/s, but not all of the bandwidth can be used for data; 20% of it is used for 8b/10b encoding overhead. This is explained in the footnotes of the version comparison table (and should have been in the other table footnotes as well; I'll add it later). GlenwingKyros (talk) 02:26, 1 December 2018 (UTC)
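The arithmetic can be checked directly. Here I assume a CVT-R2 pixel clock of roughly 522.6 MHz for 3840×2160 at 60 Hz, the figure behind the tables' ~12.54 Gbit/s entry:

```python
max_data = 18e9 * 8 / 10        # HDMI 2.0 usable data rate: 14.4 Gbit/s

pixel_clock = 522.6e6           # ~CVT-R2 timing for 3840x2160 @ 60 Hz
rate_8bpc = pixel_clock * 24    # ~12.5 Gbit/s -> fits within 14.4
rate_10bpc = pixel_clock * 30   # ~15.7 Gbit/s -> exceeds 14.4

print(rate_8bpc <= max_data, rate_10bpc <= max_data)  # True False
```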

This is an article by and for nerds. Useless for the rest of us.

I need to buy a new laptop. Wanted to know a bit about the standard connections I'm being offered, at a practical user level. This article is useless. Full of technical stuff. Useless for the masses. Is there a chance it will ever change to suit the vast majority of us, rather than just the geeks of the world? HiLo48 (talk) 20:50, 2 October 2018 (UTC)

I tend to agree with this. Much of the article is not well-written, containing irrelevant or needlessly technical statements, and is not in line with presenting information people will be looking for, particularly in the opening sections.
I've already re-written the version comparison tables entirely, which I think are the most-linked section of the page. I was planning on doing the Cables and Connectors section next, but I'm very busy so that's a long way off.
Rewriting is a very slow process. GlenwingKyros (talk) 16:23, 5 October 2018 (UTC)
Thanks for the response. I agree that tasks like making tech stuff friendly to the masses can be a challenging task. As an old techo I wish I could help, but this isn't one of my strong areas. HiLo48 (talk) 04:04, 6 October 2018 (UTC)
I wouldn't want to see the technical information removed, but probably we could make the article more user-friendly by getting more specific feedback about what's missing. What question were you trying to answer and couldn't? Krubo (talk) 06:03, 2 May 2019 (UTC)
The first paragraphs seem to describe HDMI in a mostly non-technical way, given how technical the subject is. For most uses, about all you need to know is that the signal comes from one device and goes into another. But yes, the nerds do need a reference for the more complicated details, and the later sections are where this appears. As above, what did you want to know that it didn't say? Gah4 (talk) 06:23, 2 May 2019 (UTC)

«DisplayPort has a royalty rate of US$0.20 per unit (from patents licensed by MPEG LA),»

It's only a claim. Whether it's correct or patent trolling is up for debate and court rulings. DisplayPort is meant to be royalty-free. See https://en.wikipedia.org/wiki/DisplayPort#Cost

How could this be reworded to be more accurate? (if that's a correct understanding)

--Tuxayo (talk) 16:41, 4 August 2020 (UTC)

Accessibility

This article needs to address accessibility of the HDMI standards. It seems that audio description for the blind can be included on one of the audio channels. The HDMI standard does not, however, include communication access for people with hearing loss or non-native speakers of the primary language (no streams for closed captioning data, whether for deaf and hard-of-hearing viewers or for anime/foreign-language subtitles), so in order to use HDMI with captions, the originating equipment must decode the data and pass the generated pictures of text to the end equipment. This limits accessibility for persons with hearing loss, especially those with visual impairments. Televisions in the United States are required to have closed caption decoder chips, and the visibility can be changed to the user's liking for over-the-air or video/audio inputs, but this does not work for HDMI inputs, since they do not pass the caption data through. This limits the ease with which physically disabled persons can use captions, since they cannot just turn their television's caption decoder on once and be done; they must turn on captions on each piece of sending equipment when using HDMI. This makes it harder for children and the aged to gain ease of use for captions. It also increases costs for everyone, as with HDMI all of the sending and receiving units must have decoder chips (receiving units/televisions for over-the-air caption data). — Preceding unsigned comment added by 2602:30A:C049:EA80:4885:EA70:59A9:82AB (talk) 18:30, 9 June 2014 (UTC)

As far as I know, this isn't part of HDMI. HDMI moves the signal from source to destination. Audio signals will be supplied by the source, selected from those available. For example, a DVD player will select from the available audio tracks and send one out. I believe CC is also selected and decoded inside a DVD player, and the resultant video sent out. CC decoders in television sets will decode ATSC (or NTSC) closed-caption signals. An external tuner, connected through HDMI, will also do that. Gah4 (talk) 22:09, 25 February 2021 (UTC)

DVI-D to HDMI adapters do not appear to be single-link only

While the article states that such adapters are single-link only, page 139 of the linked PDF lists a "Type B to DVI Adapter Cable," which appears to have all the wiring assignments required for dual-link. This is preceded by "Type A-to-DVI-D" on page 138, which is indeed single-link. Did someone misread the document, or am I confused? Thanks. — Preceding unsigned comment added by 73.88.59.121 (talk) 21:46, 18 March 2021 (UTC)

A Type B to DVI adapter could in fact be dual link - theoretically. I don't think they exist. --Zac67 (talk) 21:51, 18 March 2021 (UTC)
I suppose that info should be added to article? — Preceding unsigned comment added by 73.88.59.121 (talk) 22:07, 18 March 2021 (UTC)
The mechanical specifications for an HDMI Type B connector were defined in the specification for later use, but the connector was not permitted for use in products. The intent was to allow compatibility with dual-link DVI as you describe, but ultimately they decided not to use dual-link operation in HDMI, so the Type-B design was never used and remains to this day a theoretical concept only. GlenwingKyros (talk) 22:21, 18 March 2021 (UTC)
I think we should mention that in the article, I can't be the only person disbelieving of single-link only, going to the linked PDF, and then being confused. — Preceding unsigned comment added by 73.88.59.121 (talk) 22:29, 20 March 2021 (UTC)

Change all 48 Gbit/s to 48.11 Gbit/s

People are really confused online, see https://images.idgesg.net/images/article/2017/12/formatdataratetable-100743742-orig.jpg Also, 40.1 instead of 40 and 32.08 instead of 32 (LG CX and PS5). 109.252.90.119 (talk) 22:09, 4 November 2021 (UTC)

That table shows bit rates required for various video formats. Not the bit rate of the HDMI interface. GlenwingKyros (talk) 22:29, 4 November 2021 (UTC)
Isn't that only correct for DSC 1.2 formats? What is the math there? 109.252.90.119 (talk) 22:37, 4 November 2021 (UTC)
I don't understand your question. What are you referring to? Is what only correct for DSC? GlenwingKyros (talk) 22:43, 4 November 2021 (UTC)
Only for DSC 1.2 do the data rate and uncompressed data rate differ, is it not? Not counting 16b/18b, since that is 42.6666.../16*18 = 48.000. 42.666... is mentioned in the article, is that correct? 109.252.90.119 (talk) 02:30, 5 November 2021 (UTC)
If DSC compression is not used, the uncompressed data rate is the data rate. None of the bit rates listed on the table you linked include compression in the calculation, and neither do any of the tables in this article, so I don't know how it is related to anything. Can you clarify which part of the article you think needs to be changed, and why? GlenwingKyros (talk) 02:36, 5 November 2021 (UTC)
I think all mention of 48 Gbit/s should be 48.11 Gbit/s. 109.252.90.119 (talk) 14:39, 5 November 2021 (UTC)

The bit rate of HDMI 2.1 is 48.000 Gbit/s, not 48.11 Gbit/s. GlenwingKyros (talk) 18:05, 5 November 2021 (UTC)

Then what is 48.11 Gbit/s? Valery Zapolodov (talk) 07:51, 6 November 2021 (UTC)
The bit rate of the video format on the table? GlenwingKyros (talk) 08:15, 6 November 2021 (UTC)
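For readers following the numbers in this thread, the arithmetic can be checked directly. This is a sketch using only the figures quoted above (four FRL lanes at 12 Gbit/s each and 16b/18b line coding); the 48.11 Gbit/s figure is a video-format requirement from the linked table, not a property of the link itself:

```python
# HDMI 2.1 FRL link-rate arithmetic as quoted in this thread.
# Assumes the figures given above: 4 lanes x 12 Gbit/s, 16b/18b line coding.
lanes = 4
lane_rate_gbps = 12.0                   # maximum FRL lane rate, Gbit/s

raw_link_rate = lanes * lane_rate_gbps  # the nominal "48 Gbit/s" of HDMI 2.1
data_rate = raw_link_rate * 16 / 18     # usable rate after 16b/18b overhead

print(raw_link_rate)        # 48.0
print(round(data_rate, 3))  # 42.667

# Reverse check, as written earlier in the thread: 42.666... / 16 * 18 = 48.000
assert abs(data_rate / 16 * 18 - raw_link_rate) < 1e-9
```

In other words, the interface's nominal bit rate is exactly 48.0 Gbit/s, the usable data rate after encoding overhead is about 42.667 Gbit/s, and any other number (such as 48.11) describes the bandwidth a particular video format consumes, not the link.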

After 2.1?

So what is being developed as v2.2 or v3.0 (or whatever number they choose to call it)? 2.1 is effectively being put in everything now, so it is largely the release standard in products, but surely R+D work is being done on the next standard... yet there is nothing in the article about what it needs to be able to do, or what's being planned/aimed for (e.g. 16K, higher refresh rates, etc.)? (Yes, I know some things remain business secrets, but a lot of this is likely to be obvious expected capabilities that can be mentioned.) Jimthing (talk) 15:54, 13 December 2021 (UTC)

Audio

Should not the audio simply be limited to the tuner's decoder? HDMI is as passive in data signals as any other wire. macOS doesn't support 5.1-in-2ch DTS, nor does VLC, which outputs the raw 2-channel stream, but my tuner picks it up just fine and decodes it before sending 5.1 to the speakers. Lostinlodos (talk) 11:29, 23 December 2021 (UTC)

Additional features of HDMI 2.1

This statement is factually incorrect, since all products licensed under the deprecated 2.0 standard are now licensed as 2.1. Since no tiers exist, it is untruthful to call these additional features, since all of them are optional. Previous HDMI standards tried to tie the availability of any feature to the standard. There is no other improvement possible but calling them optional. Engaging in corporate PR is not the purpose of this community. — Preceding unsigned comment added by 2003:C9:C714:2000:B6:261D:4C02:AE91 (talk) 02:30, 30 December 2021 (UTC)

I have moved your comment to a new thread here, since it did not appear to have anything to do with the discussion it was placed under. "Additional features" is correct phrasing. If you read the previous revision of the HDMI Specification, version 2.0, these features were not there. In version 2.1 of the HDMI Specification, they are. They weren't there before, now they are there, hence they were added to the HDMI Specification. I believe you are confusing the HDMI Specification with HDMI devices. These features were added to the HDMI Specification. However, you are correct they were not "added" to devices (as new features very rarely are added to existing devices), nor are they required to be present in all new HDMI devices moving forward. Indeed, they are optional, as every new feature of every previous new version of the HDMI Specification has been. There is nothing particularly special about the release of the HDMI 2.1 Specification with all new capabilities being optional. It would make very little sense to require every single new HDMI device to have new advanced capabilities, and to have previous HDMI designs which were perfectly compliant when the previous version was in force suddenly no longer be spec-compliant and no longer allowed to be manufactured, which is what would happen if the latest version of the HDMI Specification made a new addition mandatory which didn't exist in the previous version it is replacing.
Right now, this article does a poor job of explaining the structure of the HDMI Specification and HDMI compliance testing, and explaining the meaning and proper usage of version numbers. This is primarily due to a lack of sources that can be referenced, as most related material is kept confidential. But even so, there are many improvements that could be made to the article in this regard. However, the edits you have made here are not correct either, and only add to the misinformation. For example, changing the table of features to say "optional" for all the features introduced in version 2.1. All features in every version of the HDMI Specification are optional, as mentioned before, so it makes very little sense to make this change while leaving the other versions and features saying "Yes", as this implies that the other features are mandatory in those respective versions, which is untrue. Version 2.0 of the HDMI Specification for example does not require HDR, nor does it require dual view support (and in fact I've never seen it implemented before). These features are all "optional" as well, so in fact you would need to change the entire table to say "optional". But I believe this would be more effectively explained in text rather than trying to use the table. The table (both in its former state and in your edited version) promotes the idea that "version numbers" are a method of indicating what features are supported by devices, when it is really just indicating which version of the HDMI Specification was the first to allow each feature, and the entire table probably needs to be reformatted to avoid this sort of misunderstanding.
As far as the claim of "HDMI 2.0 has effectively been renamed HDMI 2.1", this is not correct. Otherwise devices that implemented the new capabilities introduced in version 2.1 would not be able to be certified, as the older version 2.0 of the Specification contains no provisions regarding implementation of those features. Devices implementing the new capabilities (FRL or eARC for example) rely on the new information contained in version 2.1 of the Specification, which was not contained in version 2.0. But version 2.1 also still contains all the instructions for building lower-spec HDMI devices, otherwise such devices would no longer be permissible. Different versions of the HDMI Specification are not concurrent standards. Version 2.1 is an edited replacement for version 2.0, and it still contains all the same information and instructions that version 2.0 did, plus some new things that were added. Version 2.1 replaces version 2.0 as the current in-force edition of the HDMI Specification, and it governs the entirety of the HDMI ecosystem, from 1080p 60 Hz TVs to 4K 120 Hz monitors, just as every previous version of the HDMI Specification has done. Just like the 2020 edition of the National Electrical Code replaces the 2017 edition, it isn't a new "tier" of electrical feature support or something like that.
The real issue here is that people believe "version numbers" are a method of indicating features support, and this article currently does not do a good job of explaining the reality of the situation (in many cases, the phrasing encourages this misunderstanding). There are many improvements that can be made in this regard.
Lastly, please do not reinstate your edits for the time being. Your edits are disputed, and therefore the page should be left in its original state while we discuss the matter here, hopefully with other editors. Once a WP:CONSENSUS is reached, the agreed changes can be made to the live article. If you continue to reinstate your edits, it will be considered a WP:EDITWAR, and what will most likely happen is that your edits will be reverted by an administrator and you will be banned. The reason for the ban will be failure to follow Wikipedia editing policy after it has been clearly explained to you. This would not help anyone, and I would prefer that you stay and continue the discussion so that we can clear up any misconceptions that anyone might have, and walk away with a better understanding and a more accurate article. Disputes on Wikipedia are resolved through reasoned and calm discussion, not by making the same edits over and over in an attempt to "shout louder" than the other person while making accusations about the other side and avoiding their discussion points. If you are really correct, then you should have no problem making your case in a discussion and convincing me and/or other editors that you are correct, and your edits should be accepted. But trying to reinstate your edits without waiting for WP:CONSENSUS from others will never result in your edits sticking permanently. The only way forward is to make a convincing argument here. GlenwingKyros (talk) 07:11, 30 December 2021 (UTC)

No the issue is you are trying to avert the conversation from the actual topic to benefit your narrative. The article is easily changed to be comprehensible and represent the implementation and licensing of the current 2.1 standard. Implementation and Licensing cannot be viewed separately, but this is what you demand. This is untruthful, since all devices, even with production prior to the definition of 2.1, are now licensed under 2.1. There is an official statement of the HDMI licensing Administration in regards to this practice, hence your comment of lack of sources is untruthful as well. — Preceding unsigned comment added by 2003:C9:C714:2000:B6:261D:4C02:AE91 (talk) 11:27, 30 December 2021 (UTC)

I'm discussing the topic in great detail; I'm not sure what you mean by averting from the "actual topic". The actual topic is the meaning of "HDMI 2.1" and version numbers.
The article is easily changed to be comprehensible and represent the implementation and licensing of the current 2.1 standard. Implementation and Licensing cannot be viewed separately, but this is what you demand.
Implementation and licensing are separate, that is precisely the reason for your confusion. All products are licensed under whatever the most recent version of the HDMI Specification is, which is version 2.1. Every device can have a different implementation, as they can choose which features of the HDMI Specification to follow. The official statement of HDMI Licensing basically says what I'm saying. Version 2.0 is deprecated, and all products are certified under the version 2.1 standard now. This does not indicate what features are or aren't supported. Saying "HDMI 2.0 has been renamed to 2.1" is incorrect (unless you want to say that every version is the previous version renamed, and we're still on "HDMI 1.0" right now). My statement about "lack of sources" refers to the sentence that came right before it, not to HDMI Licensing's statement.
Please stop editing the page to restore your version without achieving WP:CONSENSUS. You are restoring content that is incorrect, and I have told you is incorrect and explained why, and you have not responded to this but continue to restore it. Following the procedures here is a requirement, not an option. GlenwingKyros (talk) 18:34, 30 December 2021 (UTC)

"HDMI®" listed at Redirects for discussion

An editor has identified a potential problem with the redirect HDMI® and has thus listed it for discussion. This discussion will occur at Wikipedia:Redirects for discussion/Log/2022 April 15#HDMI® until a consensus is reached, and readers of this page are welcome to contribute to the discussion. BD2412 T 05:00, 15 April 2022 (UTC)

  1. ^ Q: Does Dynamic HDR require the new Ultra High Speed HDMI Cable? A: No, but it will be necessary to enable 4K120 and 8K60 video with HDR due to the high bandwidth required by these resolutions and refresh rates. https://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx