DVI digital has a thing called the "dot clock". By specification,
single-link DVI maxes out at about 165MHz, but if you examine what
is on the cable with a scope, you see signals at 1650 megabaud,
because TMDS serializes each 8-bit color channel as 10 bits on the
wire. Which is damn fast. That extremely high speed is what helps
limit the cable lengths supported.
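To make the arithmetic concrete, here is a little sketch (my own
illustration, not from any spec text) of how a mode's dot clock is
computed and how it relates to the 10x TMDS bit rate. The 2080x1235
blanking totals for 1920x1200 @ 60Hz reduced-blanking are quoted from
memory, so treat them as approximate.

```python
# A video mode needs a pixel (dot) clock of roughly
# h_total * v_total * refresh, where the totals include blanking.
# TMDS then sends 10 bits per pixel per channel, so the wire rate
# is 10x the dot clock.

def dot_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode with the given blanking totals."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1200 @ 60 Hz, reduced-blanking totals of about 2080 x 1235
# (figures from memory; approximate):
clk = dot_clock_mhz(2080, 1235, 60)
print(f"dot clock : {clk:.1f} MHz")         # ~154 MHz, under 165 MHz
print(f"TMDS rate : {clk * 10:.0f} Mbaud")  # 10 bits per dot-clock tick
```

So 1920x1200 @ 60Hz just squeaks in under the single-link limit, which
is why it is about the top resolution for ordinary DVI.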
There are a few video cards where the DVI output is not capable
of reaching full speed. The driver caps the resolution to keep
the DVI output from appearing "snowy" to the user. That could be
a reason for restricting the output.
When I was looking for cards, I read a comment about the 9250 chip
saying that the second DVI connector would only do 1024x768. That
is really restrictive, compared to the 1920x1200 that single-link
DVI should be able to do.
http://en.wikipedia.org/wiki/Dvi
The VGA side is limited by the bandwidth of the video DAC (digital
to analog converter). The DAC's bandwidth rating determines the
resolutions and refresh rates that can be supported.
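As a rough sketch of that limit (the 200MHz DAC rating and the
blanking totals here are my own illustrative assumptions, not numbers
from this thread), you can check whether a mode's pixel clock fits
under the DAC rating:

```python
# A VGA DAC rated at some MHz can only drive modes whose pixel
# clock (including blanking) stays under that rating.

def fits_dac(h_total, v_total, refresh_hz, dac_mhz):
    """True if the mode's pixel clock is within the DAC's rating."""
    return h_total * v_total * refresh_hz / 1e6 <= dac_mhz

# Hypothetical 200 MHz DAC, with rough GTF-style blanking totals:
print(fits_dac(1728, 1072, 85, 200))  # 1280x1024 @ 85: ~157 MHz, fits
print(fits_dac(2160, 1250, 85, 200))  # 1600x1200 @ 85: ~230 MHz, too fast
```

Same arithmetic as the digital side, just bounded by the analog
converter instead of the TMDS link.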
One reason for the grainy look could be a resolution mismatch,
with resampling happening in the monitor. That tells me you aren't
really running at 1360x768. The display should look better if you
can manage to drive it at its "native" resolution. And DVI should
look a bit better than VGA, because with VGA any imperfections
in the cable (electrical reflections) are visible on the screen.
DVI is perfect until bit corruption flips individual bits, and
then you see "snow" start to appear. If transmission is really
bad, eventually you may lose sync.
With the goofy 1366x768 thing, monitors have displayed a number
of strange-looking symptoms. On some of them, it is virtually
impossible to find a video output resolution that the monitor
likes. I believe the more modern ones are a bit more tolerant
of the 1360/1366/1368 variations. For example, if the monitor
is sent 1360 and is 1366 internally, it should just use black
bars of 3 pixels on either side of the picture. But some of the
old ones would try to resample the image and scale it, which
looks stupid.
There are apparently some video cards with resolution adjustable
to the nearest pixel (so you could dial up 1366 if you needed it),
but there is no way, using the available advertising material,
to find such cards. And since your monitor is supposed to accept
1360, an ordinary card (horizontal resolution divisible by 8)
should suffice anyway.
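The black-bar and divisible-by-8 arithmetic above is trivial, but
here it is spelled out as a quick sketch (my own illustration):

```python
# If the panel is wider than the signal and the image is centered,
# the spare pixels get split between the two sides as black bars.

def side_bars(panel_width, signal_width):
    """Pixels of black bar on each side when the image is centered."""
    spare = panel_width - signal_width
    return spare // 2, spare - spare // 2

left, right = side_bars(1366, 1360)
print(f"bars: {left} left, {right} right")       # 3 pixels each side
print(f"1360 divisible by 8? {1360 % 8 == 0}")   # True - ordinary card OK
print(f"1366 divisible by 8? {1366 % 8 == 0}")   # False - the odd one out
```

Which is why 1360 is the value an ordinary card can actually
generate, and 1366 is the misfit.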
Paul