Challenging ADO.NET situation (long)

Erik Funkenbusch

I've implemented a solution that I don't think is very efficient, but I
can't think of a better way to do it. If anyone has any suggestions or
criticisms, I'd be grateful to hear them. .NET 2.0/SQL 2000 (not 2005).

I have two identical databases that need to be synchronized, separated by
the internet. I've implemented a web service on one server that returns a
dataset containing all the currently "active" records. I then need to
import (or update) those records into the local database. The records are
structured in a simple master/detail format with multiple detail tables,
selected by a "type" column in the master (i.e., some records use one
table for their detail, other records use another).

The complications are: a) I cannot change the schema; it's defined by an
off-the-shelf application. b) Records can be added to either database
independently, but data is always transferred from remote to local, never
the other way around. c) Because of b, I cannot depend on the primary key
of the master (an auto-increment identity value) being the same on both
sides. All data besides the primary key should be copied (or updated)
from the retrieved dataset to the local database.

What I've done is identify a combination of columns that I can use as a
secondary (though not "official") key. The database does not constrain
these values to be unique, but the application is designed in such a way
that it will not allow them to violate uniqueness.

The general flow is like this: I iterate over the master table, then look
up a row in the local database based on the unique data. I then issue an
insert if no record is returned, or an update if one is, tediously
copying each field from the DataRow to the SqlCommand.Parameters
collection.
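That upsert loop can be sketched roughly as follows. The thread's context is ADO.NET/C#, but the sketch below uses Python with sqlite3 purely to illustrate the algorithm; the table name, the "alt_key" secondary-key column, and the other column names are all invented for the example:

```python
import sqlite3

def upsert_master(local, remote_rows):
    """Insert or update each remote master row in the local DB,
    matching on the secondary (application-enforced) key.
    Returns a remote_id -> local_id map for the later detail pass."""
    id_map = {}
    for row in remote_rows:  # each row: dict of column -> value
        hit = local.execute(
            "SELECT id FROM master WHERE alt_key = ?",
            (row["alt_key"],)).fetchone()
        if hit is None:
            cur = local.execute(
                "INSERT INTO master (alt_key, type, status) VALUES (?, ?, ?)",
                (row["alt_key"], row["type"], row["status"]))
            local_id = cur.lastrowid  # local identity value, not the remote one
        else:
            local_id = hit[0]
            local.execute(
                "UPDATE master SET type = ?, status = ? WHERE id = ?",
                (row["type"], row["status"], local_id))
        id_map[row["id"]] = local_id  # remember remote -> local key
    return id_map
```

The important detail is that the remote identity value is never written locally; it is only recorded in the mapping so that detail rows can be re-keyed later.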

I then check my local database for any "active" records that do not exist
in the dataset returned by the remote host, because a record may have been
"closed" between updates. For each record not in my remote dataset, I have
to issue a separate web service call to the remote host to return that
single record. I then update the local record with the returned data
("closing" it).

Then I walk through each of the detail tables, doing a lookup in a
dictionary of remote:local record mappings that I built while walking the
master table, and insert or update any records there. These records do
not have to be updated if the master record has been "closed".
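The detail pass can reuse that remote:local mapping to translate the foreign key before each lookup. Again, this is a hypothetical Python/sqlite3 sketch of the algorithm only (the "detail_a" table and its "seq" natural-key column are invented names):

```python
import sqlite3

def sync_detail(local, remote_details, id_map):
    """Upsert detail rows, rewriting the remote master FK to the
    local master id via the map built during the master pass.
    Rows whose master was not mapped are skipped."""
    for row in remote_details:
        local_fk = id_map.get(row["master_id"])
        if local_fk is None:
            continue  # master not transferred; nothing to update
        hit = local.execute(
            "SELECT id FROM detail_a WHERE master_id = ? AND seq = ?",
            (local_fk, row["seq"])).fetchone()
        if hit is None:
            local.execute(
                "INSERT INTO detail_a (master_id, seq, value) VALUES (?, ?, ?)",
                (local_fk, row["seq"], row["value"]))
        else:
            local.execute(
                "UPDATE detail_a SET value = ? WHERE id = ?",
                (row["value"], hit[0]))
```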

This is a lot of work, and it doesn't seem very efficient. If both
databases were local (or even reachable directly via SQL) I could use a
DTS package to do much of this more easily (and do more of the work in
SQL), but because I have to retrieve the data via a web service,
everything is more difficult.

The key points that I see as inefficient (and am looking for some
thoughts on how to simplify) are:

1) Getting the data from my remote dataset into the local database.
Manually (and tediously) copying the columns from a DataRow to command
parameters. I tried using two datasets and copying the DataRow ItemArray
values, but had all sorts of trouble with the DataAdapter updates and
insertions.
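One way to cut down the tedium is to generate the parameter list from the column names rather than hand-coding each assignment; in ADO.NET the analogous move would be looping over DataTable.Columns and adding a parameter per column (e.g. with SqlParameterCollection.AddWithValue). A minimal Python sketch of the idea, with hypothetical table/column names:

```python
def build_insert(table, row, skip=("id",)):
    """Build a parameterized INSERT from a column->value dict,
    skipping identity columns that must not be copied across."""
    cols = [c for c in row if c not in skip]
    placeholders = ", ".join("?" for _ in cols)
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (
        table, ", ".join(cols), placeholders)
    return sql, tuple(row[c] for c in cols)
```

The same generation trick works for the UPDATE statement, so schema changes in the off-the-shelf application don't require touching the copy loop.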

2) Having to re-query the remote database once for each closed row. It
would be more efficient if I could send a list of keys to the remote
server and have it return all those rows in one call. Any suggestions
there? Is there an easy way to use something like an array of integers in
a SQL parameter to return a list of items in a WHERE clause? I know I
could do some stuff with cursors if I had to, but it doesn't seem like it
should be that hard.
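Since SQL 2000 has no table-valued parameters, one common workaround is to generate one placeholder per key and build a single IN-clause query. A Python sketch of the statement-building (table name hypothetical):

```python
def select_by_ids(ids):
    """Build one query that returns all the requested rows at once,
    with one parameter per key instead of one round trip per key."""
    placeholders = ", ".join("?" for _ in ids)
    sql = "SELECT * FROM master WHERE id IN (%s)" % placeholders
    return sql, tuple(ids)
```

The other approaches commonly suggested for SQL Server 2000 are passing a comma-delimited varchar and splitting it server-side with a small UDF, or passing the keys as XML and shredding them with OPENXML; either keeps the call count at one per batch rather than one per closed record.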

3) Any general thoughts on my algorithm, or suggestions for alternative
ways of achieving this? I'm not married to a web service, but it seems
the most efficient option. I don't want to "push" changes from the remote
server; I'd rather "pull".

Thanks.
 
Guest

I have a similar app, but with just one table. I ended up uploading the
data to a SQL table and doing all of the manipulation on the SQL server.
It was just going to be too much hassle doing it with DataTables, etc.
 
Guest

Is there any way you could implement this as a DTS package instead of a
custom service? It seems like it would be a good bit easier... and more
resource-friendly.
 
Erik Funkenbusch

Is there any way you could implement this as a DTS package instead of a
custom service? It seems like it would be a good bit easier... and more
resource-friendly.

Well, like I said, I don't have direct SQL access to the remote server. I
could use the web service to get the records, then upload them to a
scratch database and use DTS from there; however, I'd still have the
problem of going back to the remote database to get records that have
been closed since the last update.

That's a thought, though.
 
