Video connections: HDMI, DVI, and VGA. What's the difference between them, and which one should you use?

In the past, computers used only VGA ports to connect to good old tube (CRT) monitors. Over time this started to change, because the VGA connection is purely analog: it sends color signals plus horizontal and vertical synchronization, which suits analog displays.

With the evolution of LCD, LED, plasma, and similar displays, it became necessary to convert all that analog information to digital, so that each pixel lands exactly in its place in the digital image area.

An analog signal is a continuous wave that can take any value within a wide voltage range. To keep that wave accurate, good decoders are needed to reject noise, correct unforeseen errors, and even filter out common fluctuations from the electrical grid.

So-called "digital" cables actually carry voltages in much the same way, but the range is broken into discrete bands, so a small oscillation still delivers the same "value" instead of becoming interference or a problem.

In an analog link, suppose the signal on a cable travels at 1.21564 V within an allowed range of 1.21500 V to 1.23400 V: whatever voltage value arrives at the destination is treated as real, valid information.

In a digital link, that same range is broken into two (or more) equal parts. For example: 1.23400 - 1.21564 = 0.01836 V, which divided in two gives bands of 0.00918 V each. Band 1: anything from 1.22482 V to 1.23400 V is read as the value 1, and anything from 1.21564 V to 1.22481 V is read as zero.
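The two-band decoding above can be sketched in a few lines of code. This is a minimal illustration using the article's own example voltages; the values and the single threshold are from the example, not from any real cable standard:

```python
# Decode a sampled voltage into a bit using the two bands from the
# example: the range 1.21564 V .. 1.23400 V is split in half, the
# upper band reads as 1 and the lower band reads as 0.
V_MIN = 1.21564
V_MAX = 1.23400
THRESHOLD = V_MIN + (V_MAX - V_MIN) / 2  # 1.22482 V

def decode_bit(voltage: float) -> int:
    """Map a received voltage to 0 or 1; noise inside a band is absorbed."""
    return 1 if voltage >= THRESHOLD else 0

# A small oscillation around the transmitted level still decodes correctly:
print(decode_bit(1.23100))  # near the top of the range -> 1
print(decode_bit(1.21900))  # near the bottom of the range -> 0
```

This is exactly why the text says a small oscillation "still continues to deliver a value": any voltage inside a band decodes to the same bit.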

A digital system does not always split the range into just two parts: there can be several segments and several bands, each mapped to a different value, not just zeros and ones. This is where many digital systems gain an advantage using the same cables as before, with more precise controllers and voltage regulators delivering data by switching between more levels, at lower voltages and higher speeds.

A VGA cable, by default, carries only analog data, but current monitors perform an analog-to-digital conversion internally, which can affect the displayed image as well as the response time in milliseconds.

However, if you use a cable that carries the data digitally, with both ends optimized for it, a voltage regulator that always emits exactly the same voltage is no longer essential, as long as there is error correction, as is the case with HDMI.


The HDMI standard arose from the need to deliver digital multimedia content with protection against unauthorized recording and reproduction. It is an evolution of the DRM of DVD players, which were connected by RCA and whose output could be copied by capture cards, or even by computer applications that broke the encryption keys to copy the disc.

Inside HDMI there are more security and control protocols than image data itself, especially when the content is protected video: the data is transmitted encrypted between the computer and the monitor using HDCP technology. It is a standard for the secure transmission of images, designed for use in a residential environment, in the homes of the general public.

With each passing year, new technologies come to HDMI: new versions of the standard incorporate better features, encryption, error correction, audio transport, higher data rates, and new capabilities such as 4K support, depending on the version.


The DVI standard is, in short, the simplest digital transport, sitting between VGA and HDMI: it carries the digital image from the computer to the monitor, without the extra functions, such as HDCP, that we find in HDMI.

It is common to find DVI ports on video cards; manufacturers always include them because DVI support is cheaper than HDMI. HDMI is also backwards compatible with DVI through a built-in protocol, so it is possible to carry a DVI signal over an HDMI connection.

Some video cards state that HDCP is supported over DVI. This happens because an HDMI connection has been negotiated over the DVI pins: with a DVI-to-HDMI adapter at one end, the video card is able to detect which protocol is in use on the port.


VGA is the most traditional and simplest image-carrying standard, designed to transmit the image between the computer case and the tube monitor. It operates at the same frequency in hertz at which the monitor refreshes its image; that is, the computer was primarily responsible for sending the information to be rendered on the screen tube in step with the refresh, and those old monitors operated in the range of 50 Hz, or 50 updates per second.

The signal is completely analog, and totally susceptible to interference if the cable is not shielded against electromagnetic noise: radio waves and the like, plus interference from motors, ballasts, and transformers in general.

The connector has 15 pins, but in fact the image needs only about 8 wires to be transmitted: the color pins plus horizontal and vertical synchronization, which together form the picture.

On LCD monitors, there is an analog-to-digital converter so the signal can be reproduced on screens that are entirely digital. To get higher FPS in some games, it is recommended not to use VGA: DVI can deliver more frames per second, since VGA is limited to around 60 Hz, a limit that DVI exceeds at some resolutions.
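A back-of-the-envelope calculation shows why resolution and refresh rate together hit a link's limit. The sketch below ignores blanking intervals, so the real required pixel clock is somewhat higher; the 165 MHz figure is the specified single-link DVI pixel-clock ceiling:

```python
# Rough pixel-clock estimate: how many millions of pixels per second
# a given resolution and refresh rate require (blanking ignored).
def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    return width * height * refresh_hz / 1e6

# Single-link DVI is specified up to a 165 MHz pixel clock:
# 1920x1200 at 60 Hz fits, but 120 Hz at the same resolution
# would need dual-link DVI or a suitable HDMI version.
print(pixel_clock_mhz(1920, 1200, 60))   # ~138 MHz, fits single-link
print(pixel_clock_mhz(1920, 1200, 120))  # ~276 MHz, does not fit
```

The same arithmetic explains the VGA case: pushing more frames per second multiplies the signal rate, and an analog link tolerates that far less gracefully than a digital one.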
