SP4 & Hotfix


Cary Shultz

-----Original Message-----
The Hotfix Installation & Deployment Guide leads me to
believe that I can integrate hotfixes into my SP4 setup.
The example given in the document shows how to integrate
Windows, SP4, and hotfixes together, but since I don't want
to include Windows in the setup, the example doesn't help
me.

Does anyone have any ideas?
John,

What I do - which I gather you do not want to do - is to
use RIS, create the WIN2000 SP4 image, create an $OEM$
folder at the appropriate level, place all of the
hotfixes - for SP4 there should be only five or so -
inside that $OEM$ folder, throw in QCHAIN and use
CMDLINES.TXT. Naturally, I have assimilated WIN2000 and
SP4 into the one I386 folder ( the image ). I also throw
in a few other goodies, like DirectX 9.0a.
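As a rough sketch of the layout Cary describes (the hotfix filenames here are placeholders - substitute the real Q-numbered packages for SP4), the distribution folder might look like:

```
i386\                      <- Win2000 with SP4 assimilated in
i386\$OEM$\cmdlines.txt
i386\$OEM$\Q000001.exe     <- placeholder hotfix names
i386\$OEM$\Q000002.exe
i386\$OEM$\qchain.exe
```

with CMDLINES.TXT chaining the hotfixes using the usual quiet/no-reboot/unattended switches and running QCHAIN last:

```
[Commands]
"Q000001.exe -q -z -m"
"Q000002.exe -q -z -m"
"qchain.exe"
```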

I gather that you want to somehow "integrate" the hot
fixes with the Service Pack ( because you stated that you
do not want to include Windows ). Not sure how you would
do this...Have never given that any thought.

Cary
 

Gerry Hickman

Hi,

I've been working on this most of the week. The important distinction is
the difference between creating a "new integrated build", as opposed to
"updating a workstation that already exists".

In some ways, it's easier to create a build from scratch than it is to
update 600 machines with the service pack and hotfixes.

<grumble mode>
For something that should be so simple and basic, the docs seem obtuse
beyond belief. The official SP4 deployment guide does not seem to cover
the basic need to update 600 workstations unless you have GPO and/or
SMS. I've never tried SMS for service pack deployment so I can't
comment, but I can't help thinking GPO would have similar limitations to
those I've experienced.
</grumble mode>

What you basically want to do is force an overnight service pack
roll-out (with post-sp4 patches) and then reboot every machine.

There are various issues with the various "remote command" systems. For
example, some of them won't allow you domain-level access to the share,
and some need a "client" on every machine. You then have the problem
that you don't want your password hard-coded in a text file and sent
over the network in plain text, which rules out scheduling. The AT
command appears to deny domain-level access (though I'm not totally sure
about this).

You then have things like SUS, but I don't know whether it can deploy
service packs; I didn't see it in the docs.

This is further compounded by the service pack EXE not being CMD
friendly. It's all or nothing: you can't view its progress, nor can you
query its HRESULTs as it runs each process.

In the end, I went back to the stone age. I used PsExec from SysInternals
to create remote processes in the context of a domain account and then
fired up a batch file on a network share that loaded the service pack in
silent mode (no reboot) followed by the patches. You can at least check
the return codes of each process and you can monitor them remotely with
PViewer from the Windows 2000 support tools. You can then query the
event logs with WMI to ensure all patches were installed correctly.
Ultimately, though, the only way you'll ever know the HRESULT of each SP4
process is by looking at the log file on every machine. Not a lot of fun.
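A minimal sketch of that PsExec approach - the server, share, account,
and hotfix names below are all placeholders, and the switches are the
usual unattended/quiet/no-reboot ones:

```bat
:: On the admin workstation: launch the batch file on each target in the
:: context of a domain account. PsExec prompts for the password when -p
:: is omitted, so it never sits in a file. targets.txt lists one machine
:: name per line.
for /f %%m in (targets.txt) do psexec \\%%m -u MYDOMAIN\deploy cmd /c \\server\deploy\sp4.cmd

:: \\server\deploy\sp4.cmd - runs on each target machine
@echo off
\\server\deploy\w2ksp4_en.exe -u -q -z
if errorlevel 1 echo SP4 failed on %COMPUTERNAME% >> \\server\deploy\logs\%COMPUTERNAME%.txt & exit /b 1
\\server\deploy\Q000001.exe -q -z -m
\\server\deploy\qchain.exe
```

Because the remote process runs under the domain account given to -u, it
can reach the network share; a process started without credentials could
not.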

I can only imagine most people on here have a far superior way of doing
this.

One thing I'd say at this point though, is that PsExec is an amazing
tool, better than anything Microsoft have, and it's also free!
 

Gerry Hickman

Hi Dave,

The short answer is "no". Applying service packs rarely fixes this kind
of thing. I'd strongly suggest you junk everything and start again.
Upgraded builds are absolutely horrible. If you've got roaming profiles,
just install the same software and log on as normal; nearly all the
settings will migrate.

On my own network, I've got about six builds that were upgrades from
NT4, all the rest are "clean". The upgrades get about 10 times more
support calls than the clean builds...
 

Cary Shultz

-----Original Message-----
<snip>
Gerry,

Your point is dead-on. The key difference is "new builds"
vs. "existing builds". While I have spent some time
getting my "new builds" procedure finalized, I knew that
this was but a small part of the total picture. Once
that "new build" hits the floor, it becomes an "existing
build".

This is where the fun begins....

Cary
 

Gerry Hickman

Hi Cary,
Your point is dead-on. The key difference is "new builds"
vs. "existing builds". While I have spent some time
getting my "new builds" procedure finalized, I knew that
this was but a small part of the total picture. Once
that "new build" hits the floor, it becomes an "existing
build".

This is where the fun begins....

Yes, I've outlined my own findings above; if you get further with this,
post back in the coming weeks. My current strategy allows for choosing
target workstations and times for the update. Some of the automatic
tools give me cause for concern. For example, if you tell GPO to apply
SP4 at midnight tomorrow to your OU, and there's a conflict with (say)
your NICs, you could wake up to a whole company of blue screens, with no
easy way to roll them all back!

I don't have GPO yet or SUS, so this is something I'll test in the
future, but I'm certainly surprised at the lack of docs and discussion
on service pack roll-out strategies.

I asked a couple of friends who look after big companies in the UK, and
they said it's basically junior techs going round at night doing them
manually! In my job, I have to be there during office hours and don't
get paid for evenings or weekends, so this kind of thing is always tricky.

One problem with schedulers, too, is how you get the password in there;
you certainly don't want it in a BAT file. There's probably a way round
this with Task Scheduler, getting it to call a script or BAT file in
the context of a domain account - I need to check this. Even if you do
that, you've still got the problem of how to deal with errors. It's no
good if the update.exe process has hung for some reason in the middle of
the night.
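For the "has update.exe hung" question, the Sysinternals tools already
mentioned can at least poll the targets remotely. A minimal sketch,
assuming targets.txt lists one machine per line and you're running under
an account with admin rights on the targets:

```bat
:: show any update.exe still running on each target
:: (pslist is from Sysinternals, like PsExec)
for /f %%m in (targets.txt) do (
    echo === %%m ===
    pslist \\%%m update
)
```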
 

Cary Shultz

-----Original Message-----
Hey Cary,

Just a note on SUS. We deploy it in our environment and it works well. If
you test it, the best thing I've found is to make an OU with just one pc in
it, then run your tests. Of course, there will be modifications to your
server configuration during setup. Rather than test on the same pc in the
test OU, flatten the pc and RIS it again so you have a fresh install (or
take a cd to it, whatever). Some of the changes made to the SUS GPO seem to
"stick" for some time and don't immediately change as you would expect,
which leads to a lot of frustration. Using a fresh Win2k/WinXP install on
your test machine for each configuration change will save many headaches.
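One thing worth trying between configuration changes (though, as Scott
says, a fresh install is the only clean baseline) is forcing an
immediate machine-policy refresh rather than waiting for the background
interval. On a Windows 2000 test box that's:

```
secedit /refreshpolicy machine_policy /enforce
```

On Windows XP the equivalent is `gpupdate /force`.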

There is a newsgroup for SUS; a Google search will turn up the most
common problems:

microsoft.public.softwareupdatesvcs

If you decide to deploy SUS, feel free to ask me any questions you want.

--
Scott Baldridge
Windows Server MVP, MCSE


Scott,

Many thanks for the tip and the offer. Be careful! I
might just take you up on it. I think that I am going
to start messing with RIS and this update stuff is
becoming a pain in the rear ( although, billable hours
are always a good thing! ).

Thanks again,

Cary
 

Joe Morris

Gerry said:
The short answer is "no". Applying service packs rarely fixes this
kind of thing. I'd strongly suggest you junk everything and start
again.
Upgraded builds are absolutely horrible. If you've got roaming
profiles, just install the same software and log on as normal; nearly
all the settings will migrate.

Microsoft has been pushing the idea of "upgrade in place" for many, many
versions of Windows -- at least when you talk to the salesdroids. It's
interesting that when Windows 2000 was released, my shop (~7000 Windows
desktop systems worldwide) contracted directly with Microsoft for
systems engineering support for our transition, and every one of the
Microsoft engineers (these are MS employees, not MCSEs) was adamant
that upgrading in place was a lousy idea.

When we upgraded from Win31 to WIN95 I reluctantly documented an
upgrade-in-place procedure which did work for most users, but for other
introductions of new Windows versions I've been able to enforce a policy
of supporting only clean installations. The users who have insisted on
doing it "their way" and using upgrade-in-place are the source of
more trouble tickets than the ones who followed the rules.

===

Speaking of installing clean images rather than using upgrade-in-place,
what experience does anyone have with tools that claim to migrate user
data and configuration settings? Some of our managers have discovered
Miramar's "Desktop DNA" product and we're looking at it; does anyone
have comments or suggestions?

Joe Morris
 
