A little off topic, but ...


Tim Gallivan

I'm a developer for a government ministry (I'll let you guys figure out the
rest). The IT "powers-that-be" have stipulated that all software development
be performed on a separate network that in no way connects to the "main"
network - ever. IT's blanket reason is "network security". These separate
networks cost a lot of money - a second database server, app server and a
couple of development machines, they are unpatchable (can't connect to the
web to get updates), you've got to burn a CD every time you bring out a new
version, and burn a DVD to move big database updates back and forth. OK,
enough griping!

Would any of you be able to point me at some resources dealing with software
development strategies, or even personal examples of how they do it where
you work? I need to come up with a strategy for my department.

Thanks in advance,
Tim Gallivan
I know I'm a great teacher because when I give a lesson, the person never
comes back.
 
We use a 4-tier development system for our enterprise software. We have a
production server, a quality assurance server (for final testing), a
development server and a sandbox server. Each of the last three is a 'clone'
of the production server that is refreshed periodically.

My advice would be to follow something similar. If you are maintaining your
dev servers separately, and patching them separately, that will weaken their
ability to be good 'test' servers for production, since you can't guarantee
that the environments are the same.

-James
 
Thanks, James.

I'm assuming your four servers are on the same network. And what exactly is
a sandbox server?
 
First off, there is no reason why the development network shouldn't connect
to the web.
It doesn't have to connect to the main network in order to connect to the
web.

Secondly, if you are using a Microsoft network (and I assume you are,
considering that you posted here :-),
then suggest to IT that they can set up the dev network so that the main
network does not trust it.

Then, code developed on the dev network cannot access resources on the main
network.

A firewall can prevent most TCP ports from passing through. Nice thing
about a firewall: you can set it up so that connections that originate in
the main network can get a response from the dev network. You can then use
terminal server from your desktop to drive the server, install software, do
other work...while the dev server cannot host code that can access the main
network. Firewalls are not very expensive, either.
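The one-way arrangement Nick describes boils down to a stateful rule: the
firewall remembers which side opened a connection and only lets reply traffic
through for sessions that originated on the main network. A toy sketch of that
policy (this is an illustrative model, not a real firewall configuration; the
network names and ports are made up):

```python
# Toy model of the one-way policy between the "main" and "dev" networks.
# Hypothetical sketch: real firewalls match packets against rule tables,
# but the policy reduces to "which side opened the connection?".

class OneWayFirewall:
    def __init__(self):
        self.sessions = set()  # connections opened from the main network

    def open_connection(self, src, dst, port):
        """Only connections originating on the main network may be opened."""
        if src == "main" and dst == "dev":
            self.sessions.add(("main", "dev", port))
            return True
        return False  # dev-originated traffic toward main is dropped

    def allow_packet(self, src, dst, port):
        """Packets pass only as part of an established main->dev session."""
        if (src, dst, port) in self.sessions:
            return True  # outbound half of an established session
        if (dst, src, port) in self.sessions:
            return True  # reply traffic from dev back to main
        return False

fw = OneWayFirewall()
fw.open_connection("main", "dev", 3389)      # e.g. Terminal Server from a desktop
print(fw.allow_packet("dev", "main", 3389))  # True: reply on established session
print(fw.open_connection("dev", "main", 445))  # False: dev cannot reach main
```

So a developer's desktop on the main network can drive the dev server, while
nothing hosted on the dev server can initiate a connection back.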

The "Network Security stick" has swung too far in your organization, my
friend. There are better ways to secure the network than to simply assume
that developers cannot be trusted.

--- Nick
 
Tim,

All of our systems are on the same network, but there really is no reason
that they would have to be. If you were using something like Symantec Ghost
to clone your machines, you could simply create a carbon copy of your
production server (off network) and then modify the network settings to fit
your test environment.

Our sandbox is used to play around and test things that have a high
potential of blowing something up. It's a nice luxury, but not critical.

I understand the theory of 'protection by isolation', keeping your dev
environment offline, but there are practical ways of ensuring security while
still allowing limited network and internet access. However, since you're
working for a governmental agency, I'm sure security is a very high concern.

-James
 
James,
Thanks again. A separate network is doable, but getting data on/off it
(remember it has to be physically separate) is a royal pain not to mention
expensive, and the developer can't easily look things up on the web.
Do your web developers develop locally (e.g. with IIS installed on their
workstations)?
 
For getting data on and off of the networks, in your situation, I'd clone
to a CD/DVD and handle it that way.
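If you're shuttling builds and database dumps on burned media, it's worth
verifying the copy on the other side. A minimal sketch (the file names and
manifest format here are invented for the example) that writes a SHA-256
manifest before burning and re-checks it after reading the disc back:

```python
# Sketch: hash files before burning to disc, verify after copying back.
# File names and the manifest layout are hypothetical examples.
import hashlib
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 so large DB dumps don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(files, manifest_path):
    """Record 'digest  filename' lines before burning the disc."""
    lines = [f"{sha256_of(p)}  {Path(p).name}" for p in files]
    Path(manifest_path).write_text("\n".join(lines))

def verify(files, manifest_path):
    """After copying off the disc, confirm every file still matches its digest."""
    expected = {}
    for line in Path(manifest_path).read_text().splitlines():
        digest, name = line.split("  ", 1)
        expected[name] = digest
    return all(sha256_of(p) == expected[Path(p).name] for p in files)
```

Burning the manifest alongside the files makes a bad burn or an incomplete
copy obvious before it reaches the other network.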

We do develop on the same network as our production systems, and debug
locally for most of our stuff, but we do have some fairly strict testing and
sign-off requirements before something makes it to the production server.
 
Hello James,

James Curran said:
But there may be practical considerations: to keep the networks
physically separate (as required by the OP), you'd need two separate internet
gateway machines, each with its own internet connection.

If you have a single ISP hookup (T3, I assume), you can put many IP address
ranges on that connection. A switch and two firewall machines will get you
to two networks, in total isolation. If you need proxy services, then you'd
need separate proxy services. After that, there's no distinction
whatsoever.

VLANs can easily be configured on the same switches, so you don't need more
internal switches to keep the networks separate. Routing tables can be
controlled using IPSec.

The practical cost of having 10 developers and testers, each of whom will
lose the equivalent of 30 minutes per week of productivity due to this
configuration, FAR outweighs the cost of the extra hardware.

To the OP:
The agency is right to be cautious about security. However, in Ontario,
there is a smart company called Sierra Systems Group. They have some
brilliant network engineers who can help the agency set up security without
compromising efficiency. They set up security at a number of provincial
agencies throughout Canada, and had a major hand in setting up the
healthcare EDI solutions used by the BC government. They are the largest
Canadian systems integration consulting company, and a former employer of
mine. I'd love it if you would give them a call and have someone come out
and discuss options for security with you.

You can also call Microsoft Consulting Services. No one knows how to secure
their products better than they do. Many of MCS's clients are far more
concerned about security than your agency is, and that says something. Think
Banks, Insurance companies, Healthcare organizations, etc.

I think you will find that you have more options than you think.

--- Nick
 
Personally I think that keeping the networks separate is a sound idea. That
way your production environment cannot be affected by the developers.
Ideally you should also have a separate test network that is a duplicate of
the production environment, so that when you test the software you know that
it's going to behave as it should in the production environment.

If the cost of the hardware is an issue then possibly using a virtual PC
could work. A number of companies offer software for this; a couple of
examples are MS Virtual PC/Server and VMware.

What you really need is a sound Configuration Management policy. This will
allow all of the networks to be kept in sync and manage any changes to the
networks and software. CM covers more than just having something like
SourceSafe to check the source code in and out. It applies to the software
you install on the computers and the patches that are applied, the way the
software is configured (which includes the operating system, networks, etc.),
all of the documentation that is relevant, and the way bugs are raised, fixed
and released and new changes are requested by the users.
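The "keep the environments in sync" part of such a policy can be made
mechanical: record what is installed on each server and diff the records. A
small sketch (the manifest contents and package names below are invented for
the example):

```python
# Sketch: detect configuration drift between two environments by comparing
# package -> version manifests. Manifest contents here are hypothetical.

def diff_manifests(prod, dev):
    """Return what dev is missing, what it has extra, and version mismatches."""
    missing = sorted(set(prod) - set(dev))
    extra = sorted(set(dev) - set(prod))
    mismatched = sorted(
        (name, prod[name], dev[name])
        for name in set(prod) & set(dev)
        if prod[name] != dev[name]
    )
    return missing, extra, mismatched

prod = {"os-sp": "SP4", "mdac": "2.8", "dotnet": "1.1"}
dev = {"os-sp": "SP3", "mdac": "2.8", "debug-tools": "1.0"}
missing, extra, mismatched = diff_manifests(prod, dev)
print(missing)     # ['dotnet']
print(extra)       # ['debug-tools']
print(mismatched)  # [('os-sp', 'SP4', 'SP3')]
```

Run against a dev server that is patched separately from production, a report
like this shows exactly why it can no longer be trusted as a test environment.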

Paul.
 
To be honest, Paul, I agree with the idea of separate hardware for dev,
test, and prod.

If anything, Virtual Server is even better for dev and maintenance because
you can keep around an entire machine setup that is particular to an
application without spending money and deskspace on a physical system.

My objection is not to having the hardware separate. My objection is to the
unnecessarily strict interpretation that prevents packets from being routed
from one network to another, preventing data transfers, install packages,
and software updates from passing electronically...

That is silly and inefficient. Developers are not the enemy. If anything,
they are far more aware of security issues than non-technical employees.

--- Nick
 