Q: Copying a DataTable

Thread starter: Geoff
Hi

I have a DataTable with thousands of rows. I want a quick way to insert the
rows into a table in an Access database with the same structure. I have been
using an update command but found it to be very slow i.e. there are about
100,000 rows of data.

Can anybody suggest a quick way to do it?

Geoff
 
I have a DataTable with thousands of rows. I want a quick way to
insert the rows into a table in an Access database with the same
structure. ...
there are about 100,000 rows of data

Personally, I'd dump it out into a suitable, flat file and use a
"bulk load" utility rather than trying to write it myself.

IIRC, Access uses something like "get external data" ...

HTH,
Phill W.
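For what it's worth, the dump step could be sketched roughly like this (the
method and module names are just illustrative, and it assumes the values
contain no commas, quotes or line breaks - real data may need escaping):

```vbnet
Imports System.IO
Imports System.Data

Module CsvDump
    ' Dump a DataTable to a comma-delimited text file that Access's
    ' "Get External Data" / TransferText can then bulk-import.
    Public Sub WriteTableToCsv(ByVal table As DataTable, ByVal path As String)
        Dim writer As New StreamWriter(path)
        Try
            Dim i As Integer
            ' Header row with the column names
            Dim names(table.Columns.Count - 1) As String
            For i = 0 To table.Columns.Count - 1
                names(i) = table.Columns(i).ColumnName
            Next
            writer.WriteLine(String.Join(",", names))
            ' One line per data row
            For Each row As DataRow In table.Rows
                Dim fields(table.Columns.Count - 1) As String
                For i = 0 To table.Columns.Count - 1
                    fields(i) = row(i).ToString()
                Next
                writer.WriteLine(String.Join(",", fields))
            Next
        Finally
            writer.Close()
        End Try
    End Sub
End Module
```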
 
Geoff,

If you want to do it using a program because it is not a one-time operation
(otherwise the way Phill suggests is for me as well the one to go), then I
would not go for the dataadapter however for a datareader and Inserts.

(If the Access table is offline, then the XML dataset and your approach can
of course be your friend.)

I hope this helps,

Cor
 
Hi Cor

I'm afraid I don't follow. What do you mean by "not go for the dataadapter
however for a datareader and Inserts"?

The Access table is off-line, i.e. it is on the computer that the application
is running on. I'm intrigued by the XML dataset you mention. Could you give a code
example?

Thanks in advance

Geoff
 
Geoff said:
Hi Cor

I'm afraid I don't follow. What do you mean by "not go for the
dataadapter however for a datareader and Inserts"?

The access table is off-line i.e. on the computer that the application is
running. I'm intrigued by the xml dataset you mention. Could you give a
code example?
Not today any more, however basically the key points are these.

While creating the dataset you need to set, on the reading dataadapter, the
property

AcceptChangesDuringFill = False

A second point: you have to know whether there are already rows with the same
key in the other dataset.

Very, very roughly typed, here it is.

Dim conn As New OleDbConnection(connectionString)
Dim da As New OleDbDataAdapter("SELECT * FROM MyTable", conn)
da.AcceptChangesDuringFill = False
Dim ds As New DataSet
da.Fill(ds, "MyTable")
ds.WriteXml(path, XmlWriteMode.WriteSchema)

' in the writing program -----------------------
Dim conn As New OleDbConnection(connectionString)
Dim da As New OleDbDataAdapter("SELECT * FROM MyTable", conn)
Dim ds As New DataSet
ds.ReadXml(path)
Dim cmb As New OleDbCommandBuilder(da)
da.Update(ds, "MyTable")

You can give it a try, however Friday evening is not the best time to take
samples from me.

:-)

Cor
 
Geoff - I wrote an article about this a while ago
http://www.knowdotnet.com/articles/datasetmerge.html and it's in VB.NET.
Essentially what you do is configure two data adapters, one for the source
table, one for the destination table. Then, set the AcceptChangesDuringFill
property of the first one to False. This will fill the datatable and set
the RowState of each row to Added. Then use another data adapter to point to
the destination table and call its Update method, passing in the exact same
datatable you just filled. This will cause the adapter to fire the Insert
command for each row, thereby adding the rows to the destination table.
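Very roughly, the idea looks like this (the connections and table names are
just placeholders):

```vbnet
' Sketch of the two-adapter copy: fill from the source with
' AcceptChangesDuringFill = False, then Update against the destination.
Dim srcDa As New OleDbDataAdapter("SELECT * FROM SourceTable", srcConn)
srcDa.AcceptChangesDuringFill = False   ' filled rows keep RowState = Added
Dim dt As New DataTable
srcDa.Fill(dt)

Dim dstDa As New OleDbDataAdapter("SELECT * FROM DestTable", dstConn)
Dim cb As New OleDbCommandBuilder(dstDa) ' generates the InsertCommand
dstDa.Update(dt)                         ' fires one INSERT per Added row
```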

However, this is terribly inefficient as far as data transfer technologies
go. Depending on what tools you have available, though, it may be a decent
choice. If you're dealing with large amounts of data, I'd definitely opt
for another approach - ideally DTS.

In the 2.0 Framework, there's a SqlBulkCopy object that does this amazingly
quickly - http://msmvps.com/williamryan/archive/2004/07/10/9890.aspx -
however it's only for SQL Server so probably won't serve your needs in this
case; I mention it only as a point of reference.

HTH,

Bill
 
Hi Cor

I don't blame you - I have a bottle of wine to open as well! :)

I'll try and understand the code you've sent.

Geoff
 
Cor - I'm not arguing with you - but why write the data to XML when you
already have it right there? Won't this just add file IO overhead?
 
Bill,
Cor - I'm not arguing with you - but why write the data to XML when you
already have it right there? Won't this just add file IO overhead?

I don't blame you, however obviously you did not read the complete thread.

:-)

I understood from Geoff, after telling him some alternatives, that it is to
transport to an off-line computer and that it is not a one-time operation.

:-)

Cor
 
Hi Ryan

Thank you so much for your insightful comments.

As I have indicated in previous posts, I am surprised that the Update
command is so "slow". I accept that when I say "slow" maybe others would say
it isn't but I will give the following example which may indicate the
reasons why I think I'm doing something wrong:

As I mentioned before, I have a large file which I'm attempting to transfer
to a table in an Access database. This data originally exists as a CSV file
and I've written a routine to load the data into a dataset, manipulate it,
and then transfer it to a table in the database using the Update command,
i.e.

MyDataAdaptor.Update(MyDataSet)

This works BUT it takes simply ages; we're talking hours here to transfer
around 100,000 rows of data.

However, if I use a built-in Access tool like TransferText it takes minutes!
Can you see my confusion? As far as I can see, the same job is being done,
but the difference is dramatic!

Either there is something very obvious I'm missing or the Update is very
inefficient (but I don't believe this - I'm more inclined to think that I'm
doing something wrong).
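For what it's worth, one thing I haven't tried yet - and this is pure
speculation on my part - is wrapping the whole Update in one explicit
transaction, so Jet doesn't commit each INSERT to disk separately. Roughly
(names are just placeholders for my adapter and dataset):

```vbnet
' Run the whole Update inside a single explicit transaction so Jet
' commits once at the end rather than once per row.
conn.Open()
Dim tx As OleDbTransaction = conn.BeginTransaction()
da.SelectCommand.Transaction = tx
Dim cb As New OleDbCommandBuilder(da)
da.InsertCommand = cb.GetInsertCommand()
da.InsertCommand.Transaction = tx
da.Update(ds)     ' all INSERTs now share the one transaction
tx.Commit()
conn.Close()
```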

Any comments would be gratefully received.

Geoff
 
Sorry Cor, I've misunderstood what you meant by "off line". What I meant was
that the database is not connected via an Ethernet or Internet link; rather,
it is simply on the computer that the application is running on.

However, the comments about XML were most useful - and have helped me with
another unrelated problem so not all is lost.

Geoff
 
Bill,
I would be curious if the Update would still be as slow with the new Batch
Update abilities of .NET 2.0.

http://msdn2.microsoft.com/en-us/library/aadf8fk2(en-US,VS.80).aspx

I would expect it to be slower than SqlBulkCopy but faster than .NET 1.1
updates.

Of course, based on the above link, batch updates do not appear to be
supported with Access...
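Against SQL Server it would look roughly like this (illustrative names; per
the docs, the command's UpdatedRowSource must be None for batching to work):

```vbnet
' .NET 2.0 batch update: group INSERTs into fewer round trips.
Dim da As New SqlDataAdapter("SELECT * FROM DestTable", conn)
Dim cb As New SqlCommandBuilder(da)
da.InsertCommand = cb.GetInsertCommand()
da.InsertCommand.UpdatedRowSource = UpdateRowSource.None ' required for batching
da.UpdateBatchSize = 100  ' send up to 100 statements per round trip
da.Update(dt)
```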

--
Hope this helps
Jay
T.S. Bradley - http://www.tsbradley.net


| Geoff - I wrote an article about this a while ago
| http://www.knowdotnet.com/articles/datasetmerge.html and it's in VB.NET.
| [snip]
 
