SQL transferring data between servers


garther

Hi

I'm trying to get something like this to work:

An application calculates data and stores it in a local SQL server.
Once a week this data needs to be copied to DVD. Then I need a second
application which takes the data from the DVD and sends it to a global
SQL server.

Talking directly from the remote hosts to the main database is not an
option.

How can I dump data from the remote SQL servers in an easy way and then
put it into the main database server?

Is it possible to create a "dump" of the database, but only for the IDs
I want, save it to a file, and then just run it on the main database
server?

Thanks for any help with this!
 

Nicholas Paldino [.NET/C# MVP]

    I would say that you should just do a backup of the database, but a
backup would not allow you to selectively choose the rows in the tables
you want to back up.

    That being said, you might want to consider a Data Transformation
Services (DTS) package, which you could tweak to export the rows that
you want to another data format (Excel, comma-delimited file, etc.) and
then use a second DTS package on the main DB to import the data back
into your main DB.
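
    If you would rather drive the export from code instead of a DTS
package, the same idea is only a few lines of ADO.NET. A rough sketch
(untested; the connection string, the Results table and its columns are
all made up for the example):

using System.Data.SqlClient;
using System.IO;

class CsvExport
{
    static void Main()
    {
        const string connStr =
            "Data Source=.;Initial Catalog=LocalDb;Integrated Security=SSPI";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (StreamWriter writer = new StreamWriter("export.csv"))
        {
            conn.Open();

            // Pull only the rows you want to ship to the main server.
            SqlCommand cmd = new SqlCommand(
                "SELECT Id, Value, CreatedAt FROM Results " +
                "WHERE Id IN (1, 2, 3)", conn);

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Naive CSV output; real data needs proper
                    // quoting/escaping of commas and quotes.
                    writer.WriteLine("{0},{1},{2:yyyy-MM-dd HH:mm:ss}",
                        reader.GetInt32(0),
                        reader.GetString(1),
                        reader.GetDateTime(2));
                }
            }
        }
    }
}

    The import on the main server could then be another DTS package, or
a bulk insert as described below.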
 

DeveloperX

As low-tech as it sounds, one of the quickest ways I've found to do
this is to spool the rows to a text file with a little C# app and then
use bulk insert. I was trying to optimise a stored procedure that took
about 20 minutes to run, and I got it down to about 30-45 seconds. The
simplest way to maintain integrity is to timestamp the rows (with an
update time too if you're looking at updating records), record the last
time an export ran, and simply spool out the records created or
modified after that point, as in the sketch below. It gets a bit more
complicated if you're going to have multiple local instances which
could update the same data, but it's not impossible.
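
Roughly like this (untested, and the table and column names Results,
CreatedAt, ModifiedAt and ExportLog are invented for the example):

using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

class IncrementalExport
{
    static void Main()
    {
        const string connStr =
            "Data Source=.;Initial Catalog=LocalDb;Integrated Security=SSPI";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // When did the last export run? MAX over an empty log
            // comes back as NULL, so fall back to the earliest date
            // SQL Server's datetime can hold (1753-01-01).
            SqlCommand watermark = new SqlCommand(
                "SELECT MAX(RanAt) FROM ExportLog", conn);
            object last = watermark.ExecuteScalar();
            DateTime since = (last == null || last == DBNull.Value)
                ? SqlDateTime.MinValue.Value
                : (DateTime)last;

            // Spool everything created or modified since then.
            SqlCommand export = new SqlCommand(
                "SELECT Id, Value, ModifiedAt FROM Results " +
                "WHERE CreatedAt > @since OR ModifiedAt > @since", conn);
            export.Parameters.AddWithValue("@since", since);

            using (SqlDataReader reader = export.ExecuteReader())
            using (StreamWriter writer = new StreamWriter("delta.txt"))
            {
                while (reader.Read())
                    writer.WriteLine("{0}\t{1}\t{2:yyyy-MM-dd HH:mm:ss}",
                        reader.GetInt32(0),
                        reader.GetString(1),
                        reader.GetDateTime(2));
            }

            // Record this run so the next export picks up from here.
            SqlCommand log = new SqlCommand(
                "INSERT INTO ExportLog (RanAt) VALUES (@now)", conn);
            log.Parameters.AddWithValue("@now", DateTime.UtcNow);
            log.ExecuteNonQuery();
        }
    }
}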

You can handle deletes in the same way: retain the record, mark it as
deleted, update its timestamp, and then export as usual. If there's no
requirement to retain the record (for auditing perhaps), simply
schedule a job to delete anything exported and imported with the delete
flag set to true.
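
The marking step is just an ordinary update, something like this (again
the names are invented):

using System.Data.SqlClient;

class SoftDelete
{
    // Mark a row deleted instead of removing it, and bump its
    // timestamp so the next timestamp-based export picks the change up.
    static void MarkDeleted(SqlConnection conn, int rowId)
    {
        SqlCommand cmd = new SqlCommand(
            "UPDATE Results SET IsDeleted = 1, ModifiedAt = GETUTCDATE() " +
            "WHERE Id = @id", conn);
        cmd.Parameters.AddWithValue("@id", rowId);
        cmd.ExecuteNonQuery();
    }
}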

From memory, the bulk insert can also be done from the command line
using the bcp utility, though I can't remember the last time I used
that.
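
A sketch of the import side driven from code (untested; paths and names
are placeholders, and note that BULK INSERT reads the file from the
server's point of view, so it has to sit somewhere the SQL Server
service can see):

using System.Data.SqlClient;

class BulkImport
{
    // Load the spooled file into the main server in one shot. Roughly
    // the same thing from the command line would look something like:
    //   bcp MainDb.dbo.Results in delta.txt -c -T -S MAINSERVER
    static void Main()
    {
        const string connStr =
            "Data Source=MAINSERVER;Initial Catalog=MainDb;Integrated Security=SSPI";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                @"BULK INSERT dbo.Results FROM 'C:\import\delta.txt' " +
                @"WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')",
                conn);
            cmd.CommandTimeout = 0; // bulk loads can run long
            cmd.ExecuteNonQuery();
        }
    }
}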

The great thing about this solution is that you can stream the text
file through a cryptographic/compression stream. Here in the UK there
has been a bit of political outrage after the government lost 25
million personal records on some CDs, so encryption might be worth
considering if the data is sensitive.
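
For example, chaining a GZipStream into a CryptoStream (a sketch only;
key and IV handling is hand-waved, and they would need to be the proper
lengths, e.g. a 32-byte key and a 16-byte IV for Rijndael/AES):

using System.IO;
using System.IO.Compression;
using System.Security.Cryptography;

class ProtectExport
{
    // Compress, then encrypt, the export file before it goes onto the
    // DVD. Key/IV management is out of scope for this sketch; don't
    // hard-code them in real code.
    static void Protect(string inputPath, string outputPath,
                        byte[] key, byte[] iv)
    {
        using (RijndaelManaged aes = new RijndaelManaged())
        using (FileStream input = File.OpenRead(inputPath))
        using (FileStream output = File.Create(outputPath))
        using (CryptoStream crypto = new CryptoStream(
            output, aes.CreateEncryptor(key, iv), CryptoStreamMode.Write))
        using (GZipStream gzip = new GZipStream(crypto,
            CompressionMode.Compress))
        {
            // Bytes written here are compressed by gzip, then
            // encrypted by crypto, then land in the output file.
            byte[] buffer = new byte[8192];
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                gzip.Write(buffer, 0, read);
        }
    }
}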
 
