can a bad video card damage a monitor?

  • Thread starter Ministry of silly walks

Ministry of silly walks

Hi all,
and first of all please forgive my poor English; the Italian newsgroup
wasn't very helpful.
I'm a quite experienced user, but now I'm stuck!
My system: AMD 2400+, 512 MB RAM, GeForce 440 MX... no modding, no
overclocking (I'm not a gamer).
I have a Samsung SyncMaster 793s, connected to my PC since August 2004;
I never experienced a problem with it.

Last week, while I was working, the image suddenly disappeared (as if
going into standby), then reappeared, then disappeared again; now the
monitor seems to be stuck in standby mode (the green light blinks), even
if I turn it off and on or connect it to another PC.

The day after, I tried a second working monitor (Samsung, 1997); after
20 minutes it began flashing and "distorting", then it shut down. It's
dead, I cannot turn it on.

I tried a third monitor (Olidata, 1995) and it seems to work, even if
the image sometimes disappears for a couple of seconds (I'm using it
right now).

I should point out that I've never had electric shocks or magnetic
fields nearby; I can't believe it's simply a coincidence, since the two
monitors (one of which was quite new) were working perfectly.
So I wonder whether there could be a fault in my video card (e.g.
non-standard refresh rates, even though I've set 75 Hz) or something
wrong with my mobo (maybe the AGP slot).

I'm quite desperate! Any ideas?
Thanks in advance!!
M.
 

Ministry of silly walks

On Thu, 11 Aug 2005 20:21:32 GMT, Ministry of silly walks wrote:
> My system: AMD 2400+, 512 MB RAM, GeForce 440 MX... no modding, no
> overclocking (I'm not a gamer).

p.s.: my mobo is an Asus A7V8X
 

Joe

I would lean toward a bad video card. The best test would be to run the
original monitor, or the second one, on another computer. I just don't
think your video card has fried three monitors. I think you have three
good monitors and one bad video card, and the problem is taking 20
minutes or so to reveal itself.
Does your card have a cooling fan that has quit working? Some GeForce
440s do, and some have heatsinks. A new 440 card costs only about 40
dollars, which is far less than a new monitor, so I would start there.

Joe
 

digisol

Check the power settings, i.e. the auto-timed system standby, monitor
turn-off, and hard drive turn-off settings in your video card's power
option properties.

Doubtful that any 440 card will hurt any decent monitor unless your Hz
settings are set way too high for an older monitor; just don't go past
85 and it should be OK.

That said, if your Hz is too high you will simply get a total display
mess, and no blinking of the monitor LED showing it's a power setting
issue; a 440 card will be hard pressed to go past 90 Hz.

It is probably going into standby after 20 min; stopping it from going
into standby mode may well fix it.

BTW, the default settings for XP and other OSes will turn off your
monitor after 20 min. Time it!

Set all your power settings to "Never", especially standby and HDD. The
only useful setting is turning off the monitor, but that can be done
just as easily by switching the monitor off yourself; the system does
not need a monitor to run and work fine. Plus, some boards don't like
standby and sleep mode, as it can confuse the hardware, requiring a
restart at least.
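On Windows XP the timeouts described above can also be inspected and disabled from the command line with `powercfg`; a minimal sketch, assuming the default English scheme name "Home/Office Desk" (the scheme name and current timeouts vary per system):

```shell
REM Show the power schemes and the currently active one (XP syntax)
powercfg /query

REM Set the monitor turn-off timeout to "Never" (0 minutes) on AC power
powercfg /change "Home/Office Desk" /monitor-timeout-ac 0

REM Likewise disable system standby and hard-disk spin-down
powercfg /change "Home/Office Desk" /standby-timeout-ac 0
powercfg /change "Home/Office Desk" /disk-timeout-ac 0
```

The same settings are reachable through Control Panel → Power Options; the command line is just quicker to verify than clicking through the dialogs.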
 

JANA

By your description, you are using all old monitors to begin with.
Older monitors will be less reliable to start with.

As for a display card causing a monitor to fail, this can happen if the
card has a short in it and the supply voltage travels up the monitor
cable. The output section of the card is driven by a buffer IC that is
supposed to have some protection built into it.

One common cause of monitor failure is the user setting the refresh
rate too high. Other than that, there is not much else.
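To see why an excessive refresh rate stresses a CRT, note that the limiting figure is usually the horizontal scan frequency: roughly the refresh rate times the total number of scan lines (visible lines plus vertical blanking). A rough sketch in Python; the 5% blanking overhead is a simplifying assumption, not an exact VESA timing calculation:

```python
def horizontal_scan_khz(vertical_resolution, refresh_hz, blanking=0.05):
    """Approximate horizontal scan frequency in kHz for a CRT mode.

    Total scan lines = visible lines plus vertical blanking,
    assumed here to be ~5% of the visible lines.
    """
    total_lines = vertical_resolution * (1 + blanking)
    return refresh_hz * total_lines / 1000.0

# 1024x768 at 75 Hz needs roughly 60 kHz -- fine for most late-90s CRTs
print(round(horizontal_scan_khz(768, 75), 1))   # → 60.5

# The same resolution at 120 Hz would need nearly 97 kHz, beyond what
# many older CRTs can sync to
print(round(horizontal_scan_khz(768, 120), 1))  # → 96.8
```

A monitor driven past its rated horizontal frequency may display garbage, shut itself off, or (on older designs without protection circuitry) suffer real damage, which is the failure mode being discussed here.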

--

JANA
 

Ministry of silly walks

> I would lean toward a bad video card. The best test would be to run
> the original monitor, or the second one, on another computer. I just
> don't think your video card has fried three monitors. I think you have
> three good monitors and one bad video card, and the problem is taking
> 20 minutes or so to reveal itself.

Of course I've checked my monitors with other computers, but the first
one (the newer SyncMaster) remains in a "standby" mode or something
similar, from which it seems unrecoverable... E.g., if I turn it on I
can hear the typical "degauss" sound, but the screen remains completely
dark.

The second monitor doesn't turn on at all, as if it weren't connected
to the power line.
> Does your card have a cooling fan that has quit working? Some GeForce
> 440s do, and some have heatsinks. A new 440 card costs only about 40
> dollars, which is far less than a new monitor, so I would start there.

Yes, I'll get a new video card as soon as possible, but I fear the
fault could be in the motherboard...

thanks
M.
 

Ministry of silly walks

On Fri, 12 Aug 2005 02:31:39 GMT,
(e-mail address removed)-dot-au.no-spam.invalid (digisol) wrote:
> Check the power settings, i.e. the auto-timed system standby, monitor
> turn-off, and hard drive turn-off settings in your video card's power
> option properties.

The problem occurred while I was working and, by the way, the PC didn't
hang; it continued working normally, even though I couldn't see
anything.
> Doubtful that any 440 card will hurt any decent monitor unless your
> Hz settings are set way too high for an older monitor; just don't go
> past 85 and it should be OK.
> That said, if your Hz is too high you will simply get a total display
> mess, and no blinking of the monitor LED showing it's a power setting
> issue; a 440 card will be hard pressed to go past 90 Hz.

Yes, I know that, and other people have told me the same thing... and I
think it's the first time I've experienced anything like this... but as
I said, the monitors don't work even when connected to other machines...
so they must be damaged.
And of course I always used default refresh rates... maybe a fault in
the video card could produce non-standard frequencies overriding my
settings (a hardware problem rather than a software one)?
> It is probably going into standby after 20 min; stopping it from
> going into standby mode may well fix it.

OK, but at that moment I was working in Microsoft Word...
> BTW, the default settings for XP and other OSes will turn off your
> monitor after 20 min. Time it!

Yes, but I forgot to say that the second monitor got "fried" while
Windows was loading (during the logo screen).
> Set all your power settings to "Never", especially standby and HDD.
> The only useful setting is turning off the monitor, but that can be
> done just as easily by switching the monitor off yourself; the system
> does not need a monitor to run and work fine. Plus, some boards don't
> like standby and sleep mode, as it can confuse the hardware, requiring
> a restart at least.

Yes, I'm aware of that; but my system had been working perfectly for
months, and it's quite strange that two monitors got fried within a few
days of each other... :|

Plus, as I said, with my third monitor I can work, but the image often
disappears for a couple of seconds, as if the video card were
"randomly" sending non-standard signals.
But since it isn't a continuous problem, I'm not able to "isolate" it
or tell whether the monitor was receiving non-standard refresh rates...

thanks!
M.
 

Ministry of silly walks

> One common cause of monitor failure is the user setting the refresh
> rate too high. Other than that, there is not much else.

Well, the first fried monitor was a fairly new Samsung SyncMaster...
and I'm sure I had set a standard refresh rate (72 or 75 Hz).
Maybe a faulty video card could override my settings?

But as I wrote, everything seems to work now, so I cannot identify the
problem.
I'll try changing the monitor power cable and the video card, hoping
not to burn anything else.

Thanks!
M.
 

Schrodinger

Ministry of silly walks said:
> Of course I've checked my monitors with other computers, but the
> first one (the newer SyncMaster) remains in a "standby" mode or
> something similar, from which it seems unrecoverable... E.g., if I
> turn it on I can hear the typical "degauss" sound, but the screen
> remains completely dark.
>
> The second monitor doesn't turn on at all, as if it weren't connected
> to the power line.

I had exactly this happen to a monitor after a graphics card fried it.
Change the card. I can't see it being the motherboard; I would expect
BIOS beeps to signal a problem in that respect.
 
