Just said:
Got no help from the guy on the chat line at Sony, who sits there
telling me that the DVI-to-HDMI adapter won't work for using my HDTV
as a monitor for viewing Netflix and YouTube, etc.
BUT IT DOES work, because I get a display on the HDTV as the computer
begins to boot up. Only when Windows XP starts to load do I lose the
display.
Any tips as to how I can get that DVI interface up and running? Would
I have to go into Setup before Windows boots and enable it there? If
so, which function key do I have to hit to get in there? It's been so
long I can't remember.
Also, which screen resolution is best?
Hardware details? Computer make and model? What about your
video output options? Video connectors in the I/O plate
area of the computer? Does the machine use an add-in style video card,
with video connectors in the "PCI slot area"?
Generally, for full control, you want two video connectors
on the computer. Preferably, on the same video card, for
convenience.
If it's a single connector, and you're swapping monitors,
it's possible you'll end up with a black screen. Try reconnecting
the original monitor (both VGA and DVI can be installed hot,
and I've done that hundreds and hundreds of times here). The
black screen might result from Plug and Play (somehow) using
the wrong resolution.
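If Plug and Play has picked a mode the HDTV can't display, one
workaround is to force a mode you know the set accepts. Here is a
rough C sketch (my own illustration, not a polished tool) using the
stock Win32 ChangeDisplaySettings call that Windows XP supports. The
1024x768 @ 60 Hz numbers are only placeholders; substitute a mode your
HDTV actually lists in its manual. Build with something like:
gcc force_mode.c -o force_mode.exe

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Placeholder mode -- pick one the HDTV is known to accept. */
    dm.dmPelsWidth        = 1024;
    dm.dmPelsHeight       = 768;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* Ask the driver first (CDS_TEST) so a bad guess doesn't make
       the black-screen situation worse. */
    rc = ChangeDisplaySettings(&dm, CDS_TEST);
    if (rc != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver rejected that mode (code %ld)\n", rc);
        return 1;
    }

    /* Apply the mode and store it in the registry for the next boot. */
    rc = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    printf("ChangeDisplaySettings returned %ld\n", rc);
    return 0;
}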
It's easier if the two devices (old monitor and new monitor)
are connected at the same time, as then there is less chance
of losing control.
Plug in the regular monitor to one connector, the HDTV to the
other connector. Windows may currently be using only the regular
monitor and not know about the HDTV at all.
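Windows keeps its own list of display devices, and you can ask it for
that list directly. The sketch below (again, just an illustration)
uses the standard Win32 EnumDisplayDevices and EnumDisplaySettings
calls, which are available on XP. If the HDTV output never shows up
here, the "dual head" step described next is what's missing. Build
with something like: gcc listdisp.c -o listdisp.exe

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICE dd;
    DEVMODE dm;
    DWORD i;

    for (i = 0; ; i++) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevices(NULL, i, &dd, 0))
            break;                          /* no more outputs */

        printf("%lu: %s (%s)\n", (unsigned long)i,
               dd.DeviceName, dd.DeviceString);
        printf("   attached to desktop: %s   primary: %s\n",
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? "yes" : "no",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? "yes" : "no");

        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        if (EnumDisplaySettings(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
            printf("   current mode: %lux%lu @ %lu Hz\n",
                   (unsigned long)dm.dmPelsWidth,
                   (unsigned long)dm.dmPelsHeight,
                   (unsigned long)dm.dmDisplayFrequency);
    }
    return 0;
}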
You then need to change video card modes, to a "dual head"
configuration. Or, otherwise get Windows to recognize the second
display. Then, change the HDTV to be the primary display,
if your intention is to run with nothing but the HDTV in the
future.
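For the curious, the Win32 call underneath the "make it primary" step
is ChangeDisplaySettingsEx. The sketch below shows only that one step
and makes two assumptions: the device name \\.\DISPLAY2 is a guess
(take the real name from the listing program above), and a real tool
would also reposition the other monitor, which the Display control
panel handles for you. Build with something like:
gcc setprimary.c -o setprimary.exe

#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char *dev = "\\\\.\\DISPLAY2";   /* assumed name of the HDTV output */
    DEVMODE dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    if (!EnumDisplaySettings(dev, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Could not read the current mode for %s\n", dev);
        return 1;
    }

    /* The primary display has to sit at desktop position (0,0). */
    dm.dmPosition.x = 0;
    dm.dmPosition.y = 0;
    dm.dmFields |= DM_POSITION;

    /* Stage the change (CDS_NORESET), then commit it with the
       final ChangeDisplaySettings(NULL, 0) call. */
    rc = ChangeDisplaySettingsEx(dev, &dm, NULL,
                                 CDS_SET_PRIMARY | CDS_UPDATEREGISTRY | CDS_NORESET,
                                 NULL);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        rc = ChangeDisplaySettings(NULL, 0);

    printf("Result code: %ld (0 means success)\n", rc);
    return 0;
}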
The BIOS has a setting for display card priority, and it can be used
to steer the BIOS screen output to a particular GPU. It sounds like
that is already happening (since you saw the BIOS on the HDTV screen).
But once Windows starts, it has its own configuration details, and
then the screen can go black if it wants.
On a laptop, there are Function keys defined in the user manual for
changing the video configuration on the fly. These allow a salesman
to switch his laptop to driving a projector while doing a sales
presentation. A desktop might not come with that same exact setup,
in which case you use the Display control panel to make changes.
Example for an ATI video card:
http://support.dell.com/support/edocs/video/p152390/en/displays.htm#wp1132831
Example for an Nvidia video card (Classic control panel, out of date...)
http://support.dell.com/support/edocs/video/p56135/en/usage.htm
Nvidia "New" control panel, picture of dual monitor setup
http://i358.photobucket.com/albums/oo21/Happy_Ocuk/Clipboard01-20.jpg
Intel probably has something too, if it's Intel built-in chipset
graphics. Example here:
http://support.gateway.com/s/Mobile/2007/Phantom/1014329Rfaq10.shtml
Paul