Will computer graphics migrate to HDMI?

rjn

I'm typing this on a 1920x1200@60Hz LCD monitor
connected via single-link DVI. That's the max that
single-link DVI can do, and even then the timing spec
had to be bent (adding CVT reduced blanking) to make
it fit. The 165 MHz single-link rate of DVI was
short-sighted.
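
A quick back-of-the-envelope check (a Python sketch;
the timing totals are the published VESA CVT/CVT-RB
figures for this mode, quoted from memory, so treat
them as approximate):

    DVI_SINGLE_LINK_MHZ = 165  # max TMDS pixel clock per DVI link

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        # Pixel clock = total pixels per frame x frames per second.
        return h_total * v_total * refresh_hz / 1e6

    # 1920x1200@60 with conventional CVT blanking: ~2592 x 1245 total.
    print(pixel_clock_mhz(2592, 1245, 60))  # ~193.6 MHz - over the limit
    # Same mode with CVT reduced blanking: 2080 x 1235 total.
    print(pixel_clock_mhz(2080, 1235, 60))  # ~154.1 MHz - squeaks under 165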

Monitors with more pixels today require dual-link,
which is likely seen as a real customer satisfaction
hazard, because the card, the cable and the monitor
all have to be dual-link - and that's not common.

Not surprisingly, there are few dual-link monitors
(mostly the 2560x1600 30-inchers).

There is also a technical argument for raising the
frame-buffer-to-pixel rate of existing 1920 monitors
above 60 Hz, and for supporting more than 8 bits
per color. That too requires dual-link, and I
wouldn't be surprised if no products offer that
today.
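
To put rough numbers on that (same sketch style and
CVT-RB totals as above; treating deeper color as a
proportionally higher clock is a simplification of
how DVI would actually carry it):

    LINK_MHZ = 165
    H_TOTAL, V_TOTAL = 2080, 1235  # CVT-RB totals for 1920x1200

    for refresh in (60, 75, 85):
        for bpc in (8, 10):
            clock = H_TOTAL * V_TOTAL * refresh * bpc / 8 / 1e6
            verdict = "single-link OK" if clock <= LINK_MHZ else "needs dual-link"
            print(f"{refresh} Hz at {bpc} bpc: {clock:6.1f} MHz -> {verdict}")

Only 60 Hz at 8 bits per color fits; every other
combination spills over into dual-link territory.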

If the market logistics of dual-link DVI stand in the
way of larger/faster/deeper monitors, I'm wondering
if HDMI might be a solution (it might introduce new
issues too, like what to do with the HDMI audio).

HDMI 1.3 has a link rate of up to 340 MHz, or twice
DVI's single-link rate, enough for 2560x1600@80Hz
(as long as you don't go for color too deep :).
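
A rough check with the CVT-RB timing totals for
2560x1600 (approximate, and vertical blanking shrinks
a bit at higher refresh, so this is slightly
conservative) lands in the same ballpark:

    HDMI13_MHZ = 340
    H_TOTAL, V_TOTAL = 2720, 1646  # CVT-RB totals for 2560x1600@60

    max_hz = HDMI13_MHZ * 1e6 / (H_TOTAL * V_TOTAL)
    print(f"max refresh at 24 bpp: ~{max_hz:.0f} Hz")           # ~76 Hz
    print(f"max refresh at 30 bpp: ~{max_hz * 8 / 10:.0f} Hz")  # ~61 Hz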

I see that some graphics cards now sport HDMI ports,
although that might be aimed at TV connections. And
some monitors have HDMI, but they seem to be TV or
dual-use TV/PC items.

Is there any drift in the computer industry to move
to HDMI for the monitor connection?
 
the dog from that film you saw

rjn said:
Is there any drift in the computer industry to move
to HDMI for the monitor connection?



Yes there is - to keep the makers of HD DVD and
Blu-ray happy. They want your HDMI graphics card
connected to an HDMI monitor when you watch movies
on your PC - scum that they are.
 
rjn

Bob Myers said:
... the interface which seems best poised right now
to replace DVI (and eventually VGA) is the new
DisplayPort standard from VESA. See, for instance:

"Some OEMs have chafed at the four cent per
port royalties and $10,000 annual fee
charged for HDMI licenses."

Could be the deal-maker. My impression is that
Apple's per-port royalties on FireWire helped
it stay second fiddle to USB 2.0.
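
To put numbers on the royalty question (a sketch; the
volumes are invented for illustration, and I'm taking
the four-cents-plus-$10,000 terms from the quote above
at face value):

    ROYALTY_PER_PORT = 0.04  # dollars, per the quoted HDMI terms
    ANNUAL_FEE = 10_000      # dollars per year

    for ports in (10_000, 1_000_000, 50_000_000):
        total = ANNUAL_FEE + ROYALTY_PER_PORT * ports
        print(f"{ports:>10,} ports/yr: ${total:>12,.2f} (${total / ports:.4f}/port)")

The annual fee washes out at volume, but four cents on
every port shipped never does - real money to an OEM
moving tens of millions of units.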

DP info also at wiki:
<http://en.wikipedia.org/wiki/Displayport>
The normal wiki caveat applies (it's been changed
since you read it, and it may not have been true
then either :)
... (and eventually VGA) ...

DP doesn't seem to have any analog pinouts,
unlike DVI-I, so why is it likely to kill off
VGA/HD15/Dsub15 any faster than market forces
are already doing as CRTs dry up and die?
 
Bob Myers

rjn said:
Is there any drift in the computer industry to move
to HDMI for the monitor connection?

In short, no.

There will be some use of HDMI on PCs for television connectivity,
but the interface which seems best poised right now to replace
DVI (and eventually VGA) is the new DisplayPort standard from
VESA. See, for instance:

http://www.eetimes.com/showArticle.jhtml?articleID=196802386

DisplayPort is a packetized digital interface with a maximum
capacity (in the initial release) of 10.8 Gbit/sec, and is expected
to see an increase in capacity in the planned 2.0 release some
time next year (without breaking compatibility with the
1.1 specification).
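
As a rough sanity check of what that buys (a sketch;
the 8b/10b coding overhead and the CVT-RB pixel clock
for 2560x1600@60 are my assumptions, not figures from
Bob's post):

    RAW_GBPS = 10.8                # DisplayPort 1.1: 4 lanes x 2.7 Gbit/s
    payload_gbps = RAW_GBPS * 0.8  # 8b/10b coding leaves 80% for pixels

    pixel_clock_hz = 268.5e6       # 2560x1600@60 with CVT-RB blanking
    for bpp in (24, 30):
        need_gbps = pixel_clock_hz * bpp / 1e9
        fits = "fits" if need_gbps <= payload_gbps else "too much"
        print(f"{bpp} bpp needs {need_gbps:.2f} Gbit/s -> {fits}")

So even 30-bit color at 2560x1600@60 fits with room
to spare.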

Bob M.
 
rjn

"the dog from that film you saw"
yes there is - to keep the makers of HD dvd and
blu ray happy.

HD DVD and Blu-ray may both fail. This isn't the VHS vs. Beta
fight for a mass market. It's two deliberately,
maliciously, incompatible formats fighting over a
niche market (videophiles, a 1% market during the
LaserDisc era) using horrifically expensive hardware
and abusive DRM that punishes the early adopters that
HD needs to evangelize the technology (people with
analog and early-HDCP HDTV displays only get 720p).

But even if these formats fail, the Hollywood
lawyers still want a credit card slot in the display.

They want your HDMI graphics card connected to
an HDMI monitor when you watch movies on your
PC - scum that they are.

You won't be happy to learn that the contenders
for future LCD connections - DVI, HDMI, DP and
UDI - *ALL* include HDCP.
 
Mike Ruskai

rjn said:
I'm typing this on a 1920x1200@60Hz LCD monitor
connected via single-link DVI. That's the max that
single-link DVI can do, and even then the timing spec
had to be bent (adding CVT reduced blanking) to make
it fit. The 165 MHz single-link rate of DVI was
short-sighted.

Monitors with more pixels today require dual-link,
which is likely seen as a real customer satisfaction
hazard, because the card, the cable and the monitor
all have to be dual-link - and that's not common.

Not surprisingly, there are few dual-link monitors
(mostly the 2560x1600 30-inchers).

All new adapters based on ATI and nVidia chips have dual-link DVI
connectors.

I'm currently using a GeForce 8800GTX with an HP LP3065 30-inch panel
at 2560x1600, but I could also drive the panel with an $80 ATI X1300.

The panel comes with the cables, so all you need to worry about is the
card, since any panel requiring dual-link obviously has both the
connectors and the cables.

There is also a technical argument for raising the
frame-buffer-to-pixel rate of existing 1920 monitors
above 60 Hz, and for supporting more than 8 bits
per color. That too requires dual-link, and I
wouldn't be surprised if no products offer that
today.

More than 24 bits of color is really outside normal consumer usage,
and outside what existing panels are capable of displaying.

If the market logistics of dual-link DVI stand in the
way of larger/faster/deeper monitors, I'm wondering
if HDMI might be a solution (it might introduce new
issues too, like what to do with the HDMI audio).

HDMI 1.3 has a link rate of up to 340 MHz, or twice
DVI's single-link rate, enough for 2560x1600@80Hz
(as long as you don't go for color too deep :).

I see that some graphics cards now sport HDMI ports,
although that might be aimed at TV connections. And
some monitors have HDMI, but they seem to be TV or
dual-use TV/PC items.

Is there any drift in the computer industry to move
to HDMI for the monitor connection?

I don't know the answer to that, but as I said, dual-link DVI is an
adequate solution, supported by all new graphics hardware, and
obviously supported by those panels which require it.
 
Bob Myers

rjn said:
DP doesn't seem to have any analog pinouts,
unlike DVI-I, so why is it likely to kill off
VGA/HD15/Dsub15 any faster than market forces
are already doing as CRTs dry up and die?

Oh, it's not - it's just that, until now, there wasn't much in
the way of anything that really stood a chance of taking
over from VGA as the standard, on-every-PC-out-the-door
video interface. DVI couldn't do that, for various
reasons that I doubt most here are interested in.

Bob M.
 
rjn

Mike Ruskai said:
All new adapters based on ATI and nVidia chips
have dual-link DVI connectors.

But do they all support interesting pixel geometries?
(i.e. above 1920x1200)
This is not a rhetorical question; I don't know, and
a browse of newegg on the topic a few days ago was
not encouraging.

The panel comes with the cables, ...

Cables? Plural? Why? Dual-link DVI can use a
single cable and connector.
... dual-link DVI is an adequate solution, supported by
all new graphics hardware ...

That's a bigger reach than the claim above, and it's
not strictly true. I'm a Matrox user, and their
dual-link offerings are pretty thin (and expensive) -
but then Matrox is busy niching itself toward
obscurity.
 
