Discussion:
Why does VGA fit a screen but HDMI does not?
James Harris
2021-02-20 09:52:03 UTC
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?

I always find the old 15-pin VGA system to be pixel perfect. For
example, on a 1920 x 1200 flat panel every pixel output by the graphics
processor lands on a pixel of the panel - which is quite remarkable in
itself, IMO, and I guess it comes down to precise timing.

But with the 'more advanced' HDMI system the graphics processor doesn't
seem to know where the pixels are, usually leading to overscan.

Even more surprising, this article on Wikipedia,

https://en.wikipedia.org/wiki/Overscan#Modern_video_displays

says that HDMI will be pixel perfect and VGA will not. My experience is
the exact opposite.

That said, I am just starting to switch over to HDMI so maybe there's
some special approach that I am missing.

What is your experience of VGA and HDMI? Does pixel-perfect HDMI depend
on some sort of magic incantation or cables or settings etc?
--
James Harris
wolfgang kern
2021-02-20 11:44:17 UTC
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?
I always find the old 15-pin VGA system to be pixel perfect. For
example, on a 1920 x 1200 flat panel every pixel output by the graphics
processor lands on a pixel of the panel - which is quite remarkable in
itself, IMO, and I guess it comes down to precise timing.
But with the 'more advanced' HDMI system the graphics processor doesn't
seem to know where the pixels are, usually leading to overscan.
Even more surprising, this article on Wikipedia,
  https://en.wikipedia.org/wiki/Overscan#Modern_video_displays
says that HDMI will be pixel perfect and VGA will not. My experience is
the exact opposite.
That said, I am just starting to switch over to HDMI so maybe there's
some special approach that I am missing.
What is your experience of VGA and HDMI? Does pixel-perfect HDMI depend
on some sort of magic incantation or cables or settings etc?
A VGA connector? Haven't seen one in a while ... don't mix HDMI with VGA.

The chosen screen resolution must match the monitor's capability to be
pixel perfect (whether HDMI, VGA, or RGB32).
__
wolfgang
James Harris
2021-02-20 16:30:43 UTC
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?
...
Post by James Harris
What is your experience of VGA and HDMI? Does pixel-perfect HDMI
depend on some sort of magic incantation or cables or settings etc?
A VGA connector? Haven't seen one in a while ... don't mix HDMI with VGA.
The chosen screen resolution must match the monitor's capability to be
pixel perfect (whether HDMI, VGA, or RGB32).
Shouldn't the display driver find out the display's resolution via DDC?

https://en.wikipedia.org/wiki/Display_Data_Channel

It works over VGA but, certainly in my experience, not over HDMI.
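
For reference: a display's native mode is advertised in its EDID block,
which the DDC channel carries over VGA and HDMI alike, so in principle
the source always knows where the pixels are. A minimal sketch of
reading it - assuming Linux, where the DRM subsystem exposes the raw
EDID in sysfs (the connector name below is just an example); the byte
offsets are from the EDID 1.3 spec:

  # Print the native (preferred) resolution from the first detailed
  # timing descriptor of a monitor's EDID.
  with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
      edid = f.read()

  assert edid[0:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"  # EDID header

  dtd = edid[54:72]                      # first 18-byte timing descriptor
  h = dtd[2] | ((dtd[4] & 0xF0) << 4)    # horizontal active pixels
  v = dtd[5] | ((dtd[7] & 0xF0) << 4)    # vertical active lines
  print(f"native mode: {h} x {v}")

If that works over HDMI too - and it should, since HDMI carries the
same DDC wires - then the overscan must be introduced later, by scaling
inside the display, not by the source picking a wrong mode.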

Maybe it'll be different when I have an OS installed but as an example I
have a brand new computer here which, when I boot it and it's still in
the firmware, shows boot messages in overscan. The computer's idea of
the top-left corner of the display is somewhere slightly above and to
the left of the visible screen. As a result the top and left of what
should be displayed are missing.

But why? How can it be so hard for modern firmware and modern displays
to get their act together and agree on where the pixels are? AFAICS the
mismatch makes no sense.
--
James Harris
Bernhard Schornak
2021-02-20 18:28:54 UTC
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's output to the size of the
display? Anyone know?
...
What is your experience of VGA and HDMI? Does pixel-perfect HDMI depend on some sort of magic
incantation or cables or settings etc?
A VGA connector? Haven't seen one in a while ... don't mix HDMI with VGA.
The chosen screen resolution must match the monitor's capability to be pixel perfect (whether HDMI,
VGA, or RGB32).
Shouldn't the display driver find out the display's resolution via DDC?
  https://en.wikipedia.org/wiki/Display_Data_Channel
It works over VGA but, certainly in my experience, not over HDMI.
Maybe it'll be different when I have an OS installed but as an example I have a brand new computer
here which, when I boot it and it's still in the firmware, shows boot messages in overscan. The
computer's idea of the top-left corner of the display is somewhere slightly above and to the left of
the visible screen. As a result the top and left of what should be displayed are missing.
But why? How can it be so hard for modern firmware and modern displays to get their act together and
agree on where the pixels are? AFAICS the mismatch makes no sense.
Why HDMI? The standard connection between a graphics adapter and
a monitor is DisplayPort, DP (most cards have three DP ports but
only one HDMI). HDMI below V2.0 cannot deliver 2160p @ 60 Hz, the
standard frame rate for modern LCD monitors - DP has been able to
do that since V1.4.
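
The arithmetic behind that, as a rough sanity check (using the CTA-861
timing for 2160p60 - 4400 x 2250 total pixels including blanking, so a
594 MHz pixel clock - and HDMI's 8b/10b TMDS coding on three channels):

  # Why 3840x2160 @ 60 Hz does not fit into HDMI 1.4.
  pixel_clock = 4400 * 2250 * 60       # 594 MHz, blanking included
  wire_rate = pixel_clock * 10 * 3     # 8b/10b x 3 channels: ~17.8 Gbit/s

  hdmi_1_4 = 340e6 * 10 * 3            # 340 MHz TMDS limit: 10.2 Gbit/s
  hdmi_2_0 = 600e6 * 10 * 3            # 600 MHz TMDS limit: 18.0 Gbit/s
  print(wire_rate <= hdmi_1_4)         # False - HDMI 1.4 cannot carry it
  print(wire_rate <= hdmi_2_0)         # True  - HDMI 2.0 just can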

My Philips BDM 4350 works fine with DP 1.4, and it reports its
technical data back to Windows 7 without problems. Maybe there
is a problem with your graphics adapter or the cable you are using?


Enjoy the weekend!

Bernhard Schornak
James Harris
2021-02-22 09:24:30 UTC
Post by Bernhard Schornak
Post by James Harris
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?
...
Post by James Harris
What is your experience of VGA and HDMI? Does pixel-perfect HDMI
depend on some sort of magic incantation or cables or settings etc?
A VGA connector? Haven't seen one in a while ... don't mix HDMI with VGA.
The chosen screen resolution must match the monitor's capability to be
pixel perfect (whether HDMI, VGA, or RGB32).
Shouldn't the display driver find out the display's resolution via DDC?
   https://en.wikipedia.org/wiki/Display_Data_Channel
It works over VGA but, certainly in my experience, not over HDMI.
Maybe it'll be different when I have an OS installed but as an example
I have a brand new computer here which, when I boot it and it's still
in the firmware, shows boot messages in overscan. The computer's idea
of the top-left corner of the display is somewhere slightly above and
to the left of the visible screen. As a result the top and left of
what should be displayed are missing.
But why? How can it be so hard for modern firmware and modern displays
to get their act together and agree on where the pixels are? AFAICS
the mismatch makes no sense.
Why HDMI? The standard connection between a graphics adapter and
a monitor is DisplayPort, DP (most cards have three DP ports but
only one HDMI). HDMI below V2.0 cannot deliver 2160p @ 60 Hz, the
standard frame rate for modern LCD monitors - DP has been able to
do that since V1.4.
When you say "The standard" I would suggest that that's in your
experience. AISI there are many standards: VGA, DisplayPort, HDMI - and
others, I believe.

HDMI is standard across lots of hardware. For example, I just bought two
NUC machines. Both have HDMI. One has VGA as well. Neither has Display
Port.
Post by Bernhard Schornak
My Philips BDM 4350 works fine with DP 1.4, and it reports its
technical data back to Windows 7 without problems. Maybe there
is a problem with your graphics adapter or the cable you are using?
The cable in this case is brand new, the latest standard HDMI 2.1, and
wired for Ethernet and ARC so it's not that. But I've seen the overscan
problem almost any time I have used HDMI over the years.

Just thinking about it, I wonder if it could be that the HDMI overscan
I've seen has been because I was displaying the output on TVs. I get the
impression that TVs expect HDMI output to be from a DVD player or
suchlike and are /designed/ to rescale such outputs which leads to them
trimming the edges off the screen image.
--
James Harris
Bernhard Schornak
2021-02-22 15:51:25 UTC
Post by Bernhard Schornak
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's output to the size of the
display? Anyone know?
...
What is your experience of VGA and HDMI? Does pixel-perfect HDMI depend on some sort of magic
incantation or cables or settings etc?
A VGA connector? Haven't seen one in a while ... don't mix HDMI with VGA.
The chosen screen resolution must match the monitor's capability to be pixel perfect (whether HDMI,
VGA, or RGB32).
Shouldn't the display driver find out the display's resolution via DDC?
   https://en.wikipedia.org/wiki/Display_Data_Channel
It works over VGA but, certainly in my experience, not over HDMI.
Maybe it'll be different when I have an OS installed but as an example I have a brand new
computer here which, when I boot it and it's still in the firmware, shows boot messages in
overscan. The computer's idea of the top-left corner of the display is somewhere slightly above
and to the left of the visible screen. As a result the top and left of what should be displayed
are missing.
But why? How can it be so hard for modern firmware and modern displays to get their act together
and agree on where the pixels are? AFAICS the mismatch makes no sense.
Why HDMI? The standard connection between a graphics adapter and
a monitor is DisplayPort, DP (most cards have three DP ports but
only one HDMI). HDMI below V2.0 cannot deliver 2160p @ 60 Hz, the
standard frame rate for modern LCD monitors - DP has been able to
do that since V1.4.
When you say "The standard" I would suggest that that's in your experience. AISI there are many
standards: VGA, DisplayPort, HDMI - and others, I believe.
HDMI is standard across lots of hardware. For example, I just bought two NUC machines. Both have
HDMI. One has VGA as well. Neither has Display Port.
Read "the standard" as "the usual connector for PCs". As I told,
this has to do with the limitation for the maximum frequency you
can transfer via the connector standard:

https://www.tomshardware.com/features/displayport-vs-hdmi-better-for-gaming

The tables show, why DP is preferred for PC monitors for highest
resolutions (4k and 8k).
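
In round numbers, the usable (post-encoding) link rates behind those
tables - a back-of-the-envelope comparison only:

  # Approximate usable bandwidth per link standard, in Gbit/s.
  dp_1_4   = 4 * 8.1 * 8 / 10      # 4 lanes x HBR3, 8b/10b  -> ~25.9
  hdmi_2_0 = 3 * 6.0 * 8 / 10      # 3 channels x 6 Gbit/s   -> ~14.4
  hdmi_2_1 = 4 * 12.0 * 16 / 18    # 4 FRL lanes, 16b/18b    -> ~42.7

  # 24 bpp payloads, blanking overhead not included:
  need_4k60 = 3840 * 2160 * 60 * 24 / 1e9   # ~11.9
  need_8k60 = 7680 * 4320 * 60 * 24 / 1e9   # ~47.8 - needs DSC or 4:2:0
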
Post by Bernhard Schornak
My Philips BDM 4350 works fine with DP 1.4, and it reports its
technical data back to Windows 7 without problems. Maybe there
is a problem with your graphics adapter or the used cable?
The cable in this case is brand new, the latest standard HDMI 2.1, and wired for Ethernet and ARC so
it's not that. But I've seen the overscan problem almost any time I have used HDMI over the years.
To work properly with HDMI 2.1, both the sending and the receiving
device must comply with that standard.
Just thinking about it, I wonder if it could be that the HDMI overscan I've seen has been because I
was displaying the output on TVs. I get the impression that TVs expect HDMI output to be from a DVD
player or suchlike and are /designed/ to rescale such outputs which leads to them trimming the edges
off the screen image.
I have no experience with TVs (yet). For monitors built for PCs,
it is recommended to use one of the DP connectors. With HDMI, it
might even cause problems if a higher-version graphics card sends
signals with too high a frequency/FPS to the receiving device. But
that's just speculation on my side. To find the culprit, it is
probably necessary to use the old "trial and error" method... ;)


Greetings from Augsburg

Bernhard Schornak


P.S. 1: If I remember correctly, there is a scaling option
        somewhere in the bowels of the settings menu of AMD /
        Nvidia cards, where you can set the overscan percentage.
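
What such a slider does is simple geometry: shrink the frame by the
overscan percentage and centre it. A toy sketch, purely to show the
numbers involved (the function and its interface are made up):

  def underscan(width, height, percent):
      # Shrink by `percent` and centre the result, as a driver-side
      # overscan-compensation setting would.
      w = round(width * (100 - percent) / 100)
      h = round(height * (100 - percent) / 100)
      return w, h, (width - w) // 2, (height - h) // 2

  # A typical 5% TV overscan on 1080p costs 96 columns and 54 rows:
  print(underscan(1920, 1080, 5))    # -> (1824, 1026, 48, 27)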

P.S. 2: Congratulations if you managed to get your hands on one
        of the rare graphics cards capable of handling HDMI 2.1!

Rod Pemberton
2021-02-21 03:58:27 UTC
On Sat, 20 Feb 2021 09:52:03 +0000
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?
I always find the old 15-pin VGA system to be pixel perfect. For
example, on a 1920 x 1200 flat panel every pixel output by the
graphics processor lands on a pixel of the panel - which is quite
remarkable in itself, IMO, and I guess it comes down to precise
timing.
But with the 'more advanced' HDMI system the graphics processor
doesn't seem to know where the pixels are, usually leading to
overscan.
Even more surprising, this article on Wikipedia,
https://en.wikipedia.org/wiki/Overscan#Modern_video_displays
says that HDMI will be pixel perfect and VGA will not. My experience
is the exact opposite.
That said, I am just starting to switch over to HDMI so maybe there's
some special approach that I am missing.
What is your experience of VGA and HDMI? Does pixel-perfect HDMI
depend on some sort of magic incantation or cables or settings etc?
The only time I remember VGA/SVGA being "pixel perfect" was for high
resolution Sony Trinitron CRT monitors, when the video output was also
set to the highest resolution of the monitor. These CRTs used an
aperture grille (vertical stripes) instead of a dot-based shadow mask.

I've not seen any overscan of the BIOS startup text screen on a
computer, but this computer is a decade or so old. This computer has
both DVI-D and HDMI output, but the monitor is DVI-D only. At
1680x1050, I don't even see the pixels, normally. I've got to be about
3 inches from the screen to see them, and they're really small. Any
closer and I can't see them (vision out of focus), but they seem to be
about 3 pixels per mm or so.

Our cable television program guide has overscan issues when the
cable tuner is connected to the television via HDMI, but not for either
RF coax video or composite video. I'm not sure about S-video or
component video. Normally, here in the U.S., you have to pay extra for
cable channels that are broadcast in high-def resolutions, i.e., ones
that fit an HDMI-native resolution. Typically, you don't buy the
high-def channels; you connect the television and tuner via HDMI and
set the television to expand/zoom the non-high-def resolutions to fit
the screen. The video quality is slightly sharper for non-high-def
channels if the video is sent to the television via HDMI cable rather
than via RF coax or composite video cable.

--
James Harris
2021-02-22 09:33:01 UTC
Post by Rod Pemberton
On Sat, 20 Feb 2021 09:52:03 +0000
Post by James Harris
What is it about HDMI that makes it so hard to fit the computer's
output to the size of the display? Anyone know?
...
Post by Rod Pemberton
Our cable television program guide has overscan issues when the
cable tuner is connected to the television via HDMI, but not for either
RF coax video or composite video.
That fits what I just guessed at when replying to Bernhard. Could it be
that TVs are designed to rescale what they receive over HDMI? IOW:

1. TVs generally rescale HDMI.

2. TVs don't tend to rescale what they receive over other input types.

3. Unlike TVs, monitors tend not to rescale any inputs, including HDMI.

If so, the problem would only be present when connecting a computer to a
TV via HDMI. Speculation, I know, but that could explain why I have seen
overscan on HDMI but others here have not: I was using a TV; perhaps
they were talking about monitors.
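
For what it's worth, HDMI does define a way for the source to say which
behaviour it wants: the CTA-861 AVI InfoFrame has a two-bit "scan
information" field (S1:S0 in data byte 1) marking the picture as
composed for an overscanned or an underscanned display. A sketch of
just that field, with the surrounding InfoFrame plumbing omitted:

  # CTA-861 AVI InfoFrame data byte 1, bits S1:S0 ("scan information").
  SCAN_NO_DATA   = 0b00   # sink decides - TVs typically overscan
  SCAN_OVERSCAN  = 0b01   # composed for overscanned display (TV content)
  SCAN_UNDERSCAN = 0b10   # composed for 1:1 display (PC content)

  def scan_bits(avi_data_byte_1):
      return avi_data_byte_1 & 0b11    # extract S1:S0

A source that never sends the underscan flag - boot firmware, say -
leaves a TV to its default, which for video-style inputs is to
overscan; most PC monitors ignore the field and always map 1:1.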
--
James Harris