What are the requirements for a graphics card to support HDCP?

A complete HDTV playback system involves three stages: source acquisition, decoding, and output to the display. NVIDIA and ATI cards have largely broken through the bottleneck of HDTV relying solely on the CPU for decoding: the HDTV acceleration circuitry integrated into the GPU core can smoothly accelerate all three mainstream HDTV formats on a current mainstream system. (Because the GPU takes over only some stages of the decoding work, this can only be called decode acceleration; it would be full hardware decoding only if the CPU were freed from the process entirely.) But decoding is just one third of the system, and the output stage inevitably runs into the HDCP bottleneck. Previously, a graphics card had to be retrofitted with a control chip from Silicon Image, TI, or another vendor to support HDCP. In general, a graphics card must meet four conditions to support HDCP:

The capacity of the card's EEPROM can be adjusted.
The contents of the EEPROM can be erased and rewritten.
The GPU supports real-time encryption of the video data.
The manufacturer has purchased a license from the HDCP licensing body (Digital Content Protection, LLC).

On the computer platform, content protected by HDCP is handled by the operating system's COPP driver (Certified Output Protection Protocol). COPP first verifies the graphics card's BIOS; only a card with a legitimate BIOS can output the content. It then authenticates the display device's 40-bit key; only a display that meets the HDCP requirements will finally show the content sent by the graphics card.
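For context, the 40-bit value exchanged during HDCP 1.x authentication is the key selection vector (KSV), which by specification contains exactly twenty 1 bits. Below is a minimal Python sketch of that sanity check; the function names are illustrative, not a real COPP or driver API.

    # Minimal sketch of the KSV check in HDCP 1.x link authentication.
    # Function names are illustrative; this is not a real COPP/driver API.

    def ksv_is_valid(ksv: int) -> bool:
        # A KSV is a 40-bit value containing exactly twenty 1 bits.
        return 0 <= ksv < (1 << 40) and bin(ksv).count("1") == 20

    def start_authentication(transmitter_ksv: int, receiver_ksv: int) -> bool:
        # Both ends must present well-formed KSVs before the shared
        # secret (Km) is derived and the link is encrypted.
        return ksv_is_valid(transmitter_ksv) and ksv_is_valid(receiver_ksv)

    print(ksv_is_valid(int("1" * 20 + "0" * 20, 2)))  # True: twenty 1 bits
    print(ksv_is_valid(0xFFFFFFFFFF))                 # False: forty 1 bits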

Note: The ATMEL AT89C51 is one type of HDCP key chip. It is a 4KB PEROM part with a three-level program memory lock, has respectable processing power and its own on-chip memory, and is mostly used as a microcontroller; here it mainly stores the HDCP protection keys for the HDMI transmitter chip to read.
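As a rough illustration of what such a key chip holds: an HDCP 1.x device key set consists of a 40-bit KSV plus forty 56-bit private device keys, i.e. 280 bytes of key material. The byte layout in the Python sketch below is an assumption made for the example, not the actual chip map.

    # Hypothetical layout of an HDCP 1.x key set in a key-chip dump.
    # The offsets are assumptions for illustration; only the sizes
    # (40-bit KSV, forty 56-bit device keys) come from the HDCP spec.

    KSV_BYTES = 5        # 40-bit key selection vector
    DEVICE_KEYS = 40     # forty keys; the peer's KSV bits select which to sum
    KEY_BYTES = 7        # 56 bits each

    def parse_key_set(rom: bytes):
        # Split a raw dump into the KSV and the 40 device keys.
        assert len(rom) >= KSV_BYTES + DEVICE_KEYS * KEY_BYTES
        ksv = int.from_bytes(rom[:KSV_BYTES], "big")
        keys = [rom[KSV_BYTES + i * KEY_BYTES : KSV_BYTES + (i + 1) * KEY_BYTES]
                for i in range(DEVICE_KEYS)]
        return ksv, keys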

If any part of the software or hardware chain does not support HDCP, the digital content cannot be read. Since the next-generation Blu-ray and HD-DVD formats will implement the HDCP standard, a system must support HDCP to play movies at 1920*1080; otherwise the output is limited to one quarter of that resolution. In that case even a high-resolution source may only play back at a "blurry" resolution. (Note: HDMI has the HDCP content protection mechanism built in, so a graphics card with an HDMI interface also supports HDCP.) Microsoft has repeatedly stressed that display devices and drivers supporting Vista must support HDCP, and both ATI and NVIDIA have been working hard to meet this requirement.
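To put the "one quarter" figure in numbers: halving both dimensions of 1920*1080 yields 960*540, which keeps exactly a quarter of the pixels. A quick sketch:

    # Constrained output at half width and half height keeps 1/4 of the pixels.
    full_w, full_h = 1920, 1080
    down_w, down_h = full_w // 2, full_h // 2   # 960 x 540
    ratio = (down_w * down_h) / (full_w * full_h)
    print(f"{down_w}x{down_h} keeps {ratio:.0%} of 1920x1080")  # 25%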

Although NVIDIA long ago claimed to have completed the corresponding BIOS design, and its graphics cards from the GeForce FX onward have been labeled "HDCP Compliant", they did not truly support HDCP: previously, even GF7-series cards often needed a third-party chip to do so. That situation has now changed: genuine HDCP support has been introduced in the latest G71 cores, so the newest GF7900GS, GF7950GT, and GF7950GX2 all support HDCP.

All new revisions of the G73 with HDCP support are distinguished by an "-H" suffix: cores numbered with "-H" can support HDCP. But core support is only one requirement; in addition to the "-H" core code, the card must also carry an HDCP key chip to fully support HDCP. Such cards integrate a licensed ROM chip on the back of the PCB.

The cost of the ROM chip itself is negligible (some cards write the HDCP key into the BIOS instead), but a manufacturer must pay up to $15,000 in certification fees to put the HDCP logo on a product, so the costs add up. NVIDIA claims, however, that with the coming move to the 80nm process the GPU will integrate the video data encryption engine directly, which will significantly reduce the cost of HDCP graphics cards for manufacturers. Compared with NVIDIA, ATI seems a step ahead on the road to HDCP. For example, ATI already prepared for HDCP when it released the Radeon X1000 series, which no longer requires chips from third-party manufacturers but provides support through the AVIVO platform; ATI had previously marked these cards clearly as "HDCP Ready". ATI has also integrated HDMI modules into the RV560, RV570, and RV55 chips to implement HDCP support directly. (Note: earlier X1000-series cards still required a third-party chip for HDMI support, such as the X1600PRO HDMI graphics card introduced by Sapphire.)

It should be noted that HDCP-certified products carry the words "HDCP Ready" or an "HDCP Ready" label. Because HDCP support has to be addressed at the hardware level, a card bought today without HDCP/HDMI support is unlikely to gain it later unless the card itself is replaced. And since a card without an HDMI interface does not work well with digital HD display devices, ATI and NVIDIA have placed great emphasis on HDCP and HDMI in their new generation of graphics cards.

