A8N-SLI Deluxe Maxtor SATA Poll

edde

I have been using a Maxtor DiamondMax 9 160GB SATA hard drive on my A8N-SLI
Deluxe motherboard for over a month now, and it's been flawless. This is my
only hard drive, and it's connected to the nForce SATA controller. Everything
about this board has been perfect. I've used all the final BIOSes
(1002, 1003, 1004) and they all work great.

I've been reading that some people are having trouble with this board (and
nForce4 boards in general) and Maxtor SATA drives, saying XP loses the drive
intermittently.

I was just wondering if people out there with this combo can report whether
they are having this problem or their drives work fine.
Thanks

A8N-SLI Deluxe (1004)
A64 3500+ (stock)
2x 512MB OCZ PC3200EL Dual Channel (1T-2-3-6 CAS 2) (1000HT)
Maxtor 160GB SATA HDD (nForce SATA)
NEC 3500A DVDRW
Enermax 350W
Leadtek 6600GT PCI-E (66.93) (stock)
WinXP SP2 - nForce 6.39 drivers
 
Pete M Williams

edde said:
I was just wondering if people out there with this combo can report whether
they are having this problem or their drives work fine.


I have two 200GB Maxtor DiamondMax 10s on my A8N-SLI Deluxe (not in RAID),
and they have worked, and continue to work, flawlessly.
My system is otherwise not dissimilar to yours.

A8N-SLI Deluxe (1004)
A64 3500+
Corsair (TWINX1024-3200XLPRO) 1024MB
2 x Maxtor DiamondMax 10 200GB 7200rpm Serial ATA150
NEC ND-3500
Leadtek 6600GT PCI-E
Antec Neopower 480 Watt
WinXP Pro SP2
 
DarkElldar

I have had no problems with hard drives in this system.

Asus A8N-SLI Deluxe (1004)
Athlon64 3500+ (130 nm)
1GB Crucial PC3200 RAM (4 x 256MB)
ATI X700 Pro 256MB
2 x Maxtor DiamondMax 9 120GB SATA drives (non-RAID) on the NV controller
Sound Blaster Audigy 2 Value OEM
Enermax 350W PSU with 24-pin power
LG 16x dual-layer DVD burner
LG DVD drive
 
bsd107

I have been having problems with my two DiamondMax10 300GB hard drives
in my A8N-SLI. I have these mirrored on the Nvraid controller. I also
have a pair of 76GB Raptors in a stripe set, also on the Nvraid.

I have been getting intermittent dropouts of one or both of the
DiamondMax10 drives. What is maddening is that it will work fine for
weeks, then all of a sudden, on a reboot, the Nvraid BIOS can't find
one or both of the drives.

There is nothing wrong with the drives. I can swap the SATA cables,
and the BIOS will claim that the single detected DiamondMax10 is on the
same connector, even though I moved it.

If I remove an IDE DVD-ROM drive, one or both of the Raptors, other
IDE hard drives, etc., in different combinations, I can sometimes get
both DiamondMax10 drives visible again. This clearly demonstrates that
it is a BIOS/controller issue.

I only see any of these issues during POST - i.e., I have never had a
drive disappear while running Windows XP.
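
Since the drives vanish at POST rather than inside Windows, one low-effort way to collect evidence across reboots is a boot-time census that records which physical drives the OS can actually open. Here is a minimal sketch in Python, run from a Windows startup entry; the log path and the number of drives probed are assumptions:

```python
# Minimal sketch: log which physical drives Windows can open at each
# boot, so a POST-time disappearance leaves a paper trail.
import datetime

LOG = r"C:\drive_census.log"        # assumed log location
DRIVES_TO_PROBE = 8                 # assumed upper bound on drive count

with open(LOG, "a") as log:
    stamp = datetime.datetime.now().isoformat()
    for n in range(DRIVES_TO_PROBE):
        path = rf"\\.\PhysicalDrive{n}"
        try:
            # Opening the raw device only succeeds if the drive is present.
            with open(path, "rb"):
                status = "present"
        except OSError:
            status = "missing"
        log.write(f"{stamp} PhysicalDrive{n}: {status}\n")
```

Comparing the log across warm and cold boots would show whether a drive that POST lost ever reappears to Windows without a reflash.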

Previously, I had this problem after updating from BIOS 1002 to a beta
version of 1003. I spent a whole day trying different combinations of
other installed drives and constantly switching between different release
BIOS versions, resetting the CMOS, etc. (which, again, since this is a
bug, affects whether the DiamondMax10s can be seen), and somehow got the
drives back.

This morning I upgraded from release BIOS 1003 to release BIOS 1004.
The whole system worked fine through several warm and cold boots. Then
this evening I did a warm reboot and now my mirror set is gone again,
with only one DiamondMax10 visible.

After various BIOS reflashes, CMOS clears, and removals of other drives, I
was able to get both drives visible again, but now the Nvraid POST thinks
the mirrored drives are from two different sets, so I can't boot.

I have to say that overall, Nvraid on the A8N-SLI is a total disaster.
I only went with it because I thought that having my system drive be a
Mirrored RAID set would increase my reliability, but in reality it's
been the flakiest implementation of anything I've personally used on a PC
in about 10 years.
 
fsda

(e-mail address removed) wrote:
I have been having problems with my two DiamondMax10 300GB hard drives
in my A8N-SLI. I have these mirrored on the Nvraid controller. [snip]

As a followup, I did manage to get both of my DiamondMax10 drives to show
up again by going back to the 1002 BIOS and resetting the CMOS (I was then
able to go back to 1003 and still see the drives). BUT somehow NVRaid
then decided that the two drives were two different mirror sets. I
managed to fix this by deleting one of the drives (which made it a
spare) and adding it back to the other drive. It's rebuilding right
now (in WinXP) as I type.

I went to 1004 because I incorrectly assumed that the NVIDIA RAID BIOS
update might fix this issue. But in my case it made things worse. I'll
be waiting for 1005 and won't try 1004 again...
 
edde

It's weird how the vast majority have no problems running Maxtor SATA
drives on the Nvraid controller, but a few seem to get dropouts. Bizarre.
It has worked perfectly for me for over a month, with all BIOSes: 1002,
1003, and 1004.
I think the Nvraid controller is awesome! Totally fast and stable.
 
Oliver Daniel

My DiamondMax 10 200GB SATA drive was no longer detected when I
upgraded to BIOS version 1003. However, it is working again with 1004.

My DiamondMax 9 200GB is still problematic with versions 1002, 1003, and
1004. It is detected sometimes, but usually causes random system freezes
almost immediately after boot. It works fine in another machine.
 
zebak

edde said:
I have been using a Maxtor DiamondMax 9 160GB SATA hard drive on my
A8N-SLI Deluxe motherboard for over a month now, and it's been
flawless. [snip]
I was just wondering if people out there with this combo can report
whether they are having this problem or their drives work fine.

Same behavior for me. I've been struggling with all the different BIOSes
(1002, 1003, 1004, 1006) and drivers from NVIDIA. I've also tried
different OSes (XP SP2, XP SP2 x64 RC2, W2K3 x64 RC2) - same problem...

A8N-SLI Deluxe
BIOS 1004 (currently...)
NVIDIA nForce 6.39 drivers
AMD Athlon 64 3200+
Quantum FireBallP AS60.0 as boot drive
2 x Maxtor 6B300S0 (DiamondMax 10) configured as an Nvraid stripe set
(RAID 0)

When I have managed to get the DiamondMax stripe set accessible, I've done
some stress testing using Virtual Server 2005 (multiple VMs stressing
the hard drive), and I usually get either a blue screen in nvraid.sys
or a system hang (the system is configured to force a crash dump from
the keyboard, but that didn't work, so I suspect the hang is at the
hardware level...).
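
A load like that can also be reproduced without Virtual Server. Below is a minimal sketch of the same idea in Python; the target folder, file size, block size, and worker count are all made-up values. Several threads hammer the stripe set with writes and read-backs, so a controller dropout surfaces as an I/O error instead of silent weirdness:

```python
# Minimal disk stress sketch: concurrent write + read-back workers.
# TARGET should point at a folder on the volume under test.
import os
import threading

TARGET = r"D:\stress"            # hypothetical folder on the RAID 0 set
FILE_SIZE = 256 * 1024 * 1024    # 256 MB per worker
BLOCK = 64 * 1024                # 64 KB I/O blocks
WORKERS = 4

def worker(idx: int) -> None:
    path = os.path.join(TARGET, f"stress_{idx}.bin")
    data = os.urandom(BLOCK)
    with open(path, "wb") as f:
        for _ in range(FILE_SIZE // BLOCK):   # sequential write pass
            f.write(data)
        f.flush()
        os.fsync(f.fileno())                  # push writes through the cache
    with open(path, "rb") as f:               # read-back pass
        while f.read(BLOCK):
            pass

os.makedirs(TARGET, exist_ok=True)
threads = [threading.Thread(target=worker, args=(i,)) for i in range(WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("stress pass completed without I/O errors")
```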

I saw lots of similar reports and I'm really getting frustrated ...
 
fsda

edde said:
It's weird how the vast majority have no problems running Maxtor SATA
drives on the Nvraid controller, but a few seem to get dropouts.
Bizarre. [snip]
I think the Nvraid controller is awesome! Totally fast and stable.

BIOS 1005 caused dropouts of the DiamondMax10s. I could only fix it by
doing as described above - that is, going back to 1002 and then up to
1003, with a CMOS reset.

BIOS 1006 (NVRaid BIOS 4.81) also caused dropouts of the DiamondMax10s
after about two reboots. Again, I could only fix it by going back to
1002 (NVRaid 4.79) and then up to 1003 (NVRaid 4.78), with a CMOS
reset.

This problem is not getting fixed by ASUS.

This is clearly a bug in the Nvidia RAID BIOS.

In BIOS 1006, only one of the DiamondMax 10s would show up as connected.
However, I could physically remove either of the two DiamondMax 10s, and
no matter which one I disconnected, the RAID BIOS always showed the
remaining drive on the same connector.

In previous iterations of this problem, I have noticed that removal or
reconfiguration of other drives on either the Nvidia RAID SATA or the
NVidia IDE connectors would sometimes magically trigger the appearance or
disappearance of one or both of the DiamondMax10's.

This indicates that the issue is configuration dependent, which may
explain why some users never have a problem...

The other thing that bugs me is that downgrading the BIOS from, say, 1005
directly to 1003 will NOT fix this problem, even with a CMOS reset. I MUST
go down to 1002 first to get both DiamondMax10 drives back; I can then
upgrade to 1003 without losing the drives. This indicates that the board
has some "memory" - i.e., reflashing the BIOS and resetting the CMOS does
NOT completely return everything to a fresh factory state...
 
fsda


I'm the one who wrote the message immediately above. I emailed Maxtor
and Asus about this problem. Asus has not responded after two weeks.

Maxtor replied the next day and emailed me new firmware for the
DiamondMax10 300GB drive(s) that I have.

To make a long story short, I updated (or possibly downgraded - I'm not
sure what version they sent me) the firmware on both hard drives. I was
then able to upgrade my A8N-SLI BIOS to 1006. I have been running the
system for about three days now, and on every POST both of my
DiamondMax10 drives show up just fine. I'll feel safer after a few weeks
have gone by, since it previously worked for about a day before failing,
but I do believe this fixed the problem.
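
For anyone comparing notes, the firmware revision a drive reports can be read back without Maxtor's tools, for example with smartmontools. Here is a small sketch in Python wrapping smartctl (which must be installed and on the PATH; the device name is an assumption - substitute your own):

```python
# Sketch: print the model, serial, and firmware revision a drive reports,
# via smartmontools' smartctl ("-i" prints the drive identity block).
import subprocess

DEVICE = "/dev/sda"   # assumed device name; adjust for your system

info = subprocess.run(
    ["smartctl", "-i", DEVICE],
    capture_output=True, text=True, check=True,
).stdout

for line in info.splitlines():
    if line.startswith(("Device Model", "Serial Number", "Firmware Version")):
        print(line)
```

Comparing that output before and after a flash is a quick way to confirm the new firmware actually took.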

I have to say, though: woe to the user who has to upgrade Maxtor drive
firmware. I had a heck of a time. The drives were not visible to the
Maxtor utility while connected to the nForce4 controller (the readme file
says this, too).

I then connected the drives to the Silicon Image 3114 controller on the
motherboard, and although they were visible at POST, the Maxtor utility
could not find them.

I then installed an old PCI SATA card with an SI3112 controller on board
(luckily I had one as a spare). I disabled both the onboard SI3114 and
the nForce4 SATA and IDE controllers to make sure the SI3112 BIOS could
load. I connected the first DiamondMax10, and the Maxtor utility found
the drive. It also warned me that the firmware on the diskette did NOT
match the drive type and would ruin the drive. I had seen a posting
elsewhere on the web saying to ignore this warning, which I did, and
went ahead anyway. The burn completed and verified fine. After the
reboot, though, the Maxtor utility could NOT find the drive anymore, and
neither could the 3112 at POST.

I was worried at this point, but plowed ahead with the other drive.
I removed the just-flashed drive and replaced it on the 3112 with the
second DiamondMax10. This drive was identified at POST and by the
utility, and it burned and verified fine. After the reboot, POST saw the
drive, and the Maxtor utility could see it, but could NOT list the
firmware info on the drive.

I put both drives back on the nForce4 controller, re-enabled it in the
BIOS, and fired away. Windows XP loaded just fine, and NVRaid did not
complain of any problems (my DiamondMax10s are in a mirror set). I then
updated to BIOS 1006.

I have not had a problem since (3 days).

To anyone having trouble with DiamondMax10 drives on the A8N-SLI: I
encourage you to email Maxtor. And I wish you the best of luck actually
getting the firmware burnt onto the drive!
 
edde

fsda said:
To anyone having trouble with DiamondMax10 drives on the A8N-SLI: I
encourage you to email Maxtor. And I wish you the best of luck actually
getting the firmware burnt onto the drive!
Just goes to show that the motherboard wasn't at fault, as I suspected.
But wow, what an ordeal to upgrade the HD firmware!
 
