Any advice about large size database transfer?

Dancefire

Hi,


What I am trying to do is transfer a database table SRC into another
database table DEST. It is not a simple transfer: each row in DEST is
calculated from a row in SRC.
For example:

We have a database table SRC with columns A, B, and C. There are 5
million records in SRC, about 500 MB in total.
We have a database table DEST with columns AA, BB, CC, DD, and EE.
AA, BB, and CC are calculated from SRC.A and SRC.B in one pass; DD and
EE are calculated from SRC.B and SRC.C in one pass. You could run the
algorithm three times to get AA, BB, and CC separately, but that would
be three times slower, which should not happen. The algorithm is also
complicated, which means it should not be implemented as a stored
procedure.

We want to migrate the data from SRC to DEST. What is the best
solution with the smallest memory and disk cost?

I have tried the simple way: dataadapterSrc.Fill(datatableSrc), then
foreach row in datatableSrc, calculate the result and add it to the
datatableDest table. But the Fill alone cost me about 800 MB of
memory, and during the datatableDest.Rows.Add(row) operation it ran
out of memory and stopped.

I also tried to use a DataReader. That is better on the read side,
but the situation with datatableDest is the same.

I tried to use a pure SqlCommand to do the "INSERT INTO". But when I
use an Access database, the database size increases dramatically and
soon reaches the 2 GB size limit. Yet after I shrink the database, it
is actually only about 100 MB. Maybe there is too much logging in the
.mdb, or too much garbage space caused by the single-row insert
commands, but I don't know how to prevent this from happening.

I think the perfect solution would work like a pipeline, keeping no
old records in memory and not growing the database unreasonably: read
1 or 100 rows, save them, release the resources, and read more.

Is there a better solution to this problem?

Thanks.

Dancefire
---
CCNA
http://www.dancefires.com/
http://blog.csdn.net/dancefire/
MSN: (e-mail address removed)
I am interested in Operating Systems, Embedded Systems and Network Security.


 
Cor Ligthert [MVP]

Hi,

If you don't want to use complete SQL commands to copy the tables,
then try to do it using the DataReader and an "INSERT" via
command.ExecuteNonQuery into the target tables.

A DataSet gives you, in my opinion, only overhead in this case.

Cor
 
Dancefire

Thanks Cor,

I have tried using pure SQL commands, but failed. This time the
memory usage was perfect, but the database became very large. I tried
it with an Access .mdb file, and after a while the database had grown
too much, exceeded the 2 GB limit, and stopped. But the pure data is
actually not that big; I think there is too much logging or garbage
space in the database file. When I use the "compact and repair
database" function, the .mdb file is only about 150 MB. But of
course, I can't compact the database during the transfer. And I don't
think that watching the database file size and, once it reaches 1.5
GB, closing the connection, shrinking, reopening, and continuing is a
good solution. Is there any other solution?

Dancefire
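
One thing worth testing against the .mdb growth (an assumption on my
part, not a guaranteed cure for Jet's file behavior) is to stop giving
every INSERT its own implicit transaction: commit in batches through a
single OleDbTransaction and reuse one parameterized command. A sketch;
the AccessBatcher name, connection string, and batch size are all
hypothetical:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.OleDb;

    class AccessBatcher
    {
        // Insert precomputed rows into the Access DEST table, committing
        // every 1000 rows instead of once per INSERT statement.
        static void InsertRows(string mdbPath, IEnumerable<double[]> rows)
        {
            using (OleDbConnection conn = new OleDbConnection(
                "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + mdbPath))
            {
                conn.Open();
                OleDbCommand insert = new OleDbCommand(
                    "INSERT INTO DEST (AA, BB, CC, DD, EE) VALUES (?, ?, ?, ?, ?)",
                    conn);
                for (int i = 0; i < 5; i++)
                    insert.Parameters.Add("p" + i, OleDbType.Double);

                OleDbTransaction tx = conn.BeginTransaction();
                insert.Transaction = tx;
                int count = 0;
                foreach (double[] row in rows)
                {
                    for (int i = 0; i < 5; i++)
                        insert.Parameters[i].Value = row[i];
                    insert.ExecuteNonQuery();

                    if (++count % 1000 == 0)   // commit a batch, start the next
                    {
                        tx.Commit();
                        tx = conn.BeginTransaction();
                        insert.Transaction = tx;
                    }
                }
                tx.Commit();                   // final partial batch
            }
        }
    }

Even if the file still grows, batching typically speeds up the
inserts; whether it also curbs the growth is worth measuring.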

 
Cor Ligthert [MVP]

DanceFire,

I assume that your operation is too huge for Access; use another
database than Access. This one is, in my opinion, really too small.

Even SQL Server Express has a data limit of, AFAIK, 4 GB, although I
have also seen people here write 2 GB.

If another database server is not possible, then in my opinion your
only solution is:
-> create your own sequential data file (even an XML dataset will not
work because of that memory limit)
-> make a backup of your database
-> drop the table
-> create the table and columns again
-> read the records from disk again, streaming, and insert them into
the database using ExecuteNonQuery (a sketch follows below).

Just my thought,

Cor
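
A minimal sketch of the staging approach Cor outlines, assuming
tab-separated text as the on-disk format and the five DEST columns
from earlier; the SequentialStage name and file layout are
illustrative only:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.IO;

    class SequentialStage
    {
        // Write each computed row as one tab-separated line; only one
        // row is ever held in memory.
        public static void Write(string path, IEnumerable<double[]> rows)
        {
            using (StreamWriter w = new StreamWriter(path))
                foreach (double[] r in rows)
                    w.WriteLine(r[0] + "\t" + r[1] + "\t" + r[2] + "\t" + r[3] + "\t" + r[4]);
        }

        // Stream the staged rows back and insert them through a prepared,
        // parameterized command (SqlCommand or OleDbCommand both fit).
        public static void Load(string path, IDbCommand insert)
        {
            using (StreamReader rd = new StreamReader(path))
            {
                string line;
                while ((line = rd.ReadLine()) != null)
                {
                    string[] f = line.Split('\t');
                    for (int i = 0; i < f.Length; i++)
                        ((IDataParameter)insert.Parameters[i]).Value = double.Parse(f[i]);
                    insert.ExecuteNonQuery();
                }
            }
        }
    }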
 
