TransactionScope cannot process big files (too many connections open)

Guest

I use an OO application architecture. I have a data access object for every
database table, and each DataAccess object corresponds to one record in a
table. When I insert a record, I call the DataAccess object for that table
and run its Insert method. This method opens a connection, inserts the record
into the table, then closes the connection.

Now I want to read a disk file and insert records from the disk file into
this table. I have a loop that reads the file line by line, creates a
DataAccess object for each line, and runs its Insert method. The connection
object is created, closed, and then re-used. Everything works perfectly.

Now I want to use TransactionScope to make this file-insert process
transactional, but I have the problem that I can only process small files. If
I process big files with over 4000 records, I get various strange error
messages. I think the real problem is that TransactionScope holds all of the
connections opened in my DataAccess objects open, even after I explicitly
close them in code, and SQL Server cannot handle that many open connections.

Does anybody know how I can re-use those closed connections inside
TransactionScope if the connection string is the same?
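
The loop described above can be sketched roughly as follows. The DataAccess class here is an illustrative stand-in (the original post doesn't show its code), and the table name, file name, and connection string are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Transactions;

class DataAccess
{
    private readonly string _value;
    public DataAccess(string value) { _value = value; }

    public void Insert()
    {
        // Opens and "closes" a fresh connection per row. Inside a
        // TransactionScope each connection enlists in the ambient
        // transaction, so the physical connection stays reserved until
        // the scope completes -- with thousands of rows, the pool and
        // the server run out of connections, as described above.
        using (SqlConnection conn = new SqlConnection("...")) // connection string elided
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO MyTable (Col1) VALUES (@val)", conn))
            {
                cmd.Parameters.AddWithValue("@val", _value);
                cmd.ExecuteNonQuery();
            }
        }
    }
}

class FileImport
{
    static void Main()
    {
        using (TransactionScope scope = new TransactionScope())
        {
            using (StreamReader reader = new StreamReader("records.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    new DataAccess(line).Insert(); // one connection per record
                }
            }
            scope.Complete();
        }
    }
}
```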

Thanks a lot.

Sahil Malik [MVP C#]

BF,

Your best bet in this case is to reuse the same SqlConnection object: open
it once, process your 4000 rows, then close it.
That code, within a TransactionScope connected to SQL 2005, won't promote
the transaction or cause SQL Server to go nuts.
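
A minimal sketch of that single-connection approach, assuming a line-per-record file; the table name, file name, and connection string are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Transactions;

class FileImportFixed
{
    static void Main()
    {
        using (TransactionScope scope = new TransactionScope())
        using (SqlConnection conn = new SqlConnection("...")) // connection string elided
        {
            conn.Open(); // enlists in the ambient transaction exactly once

            using (StreamReader reader = new StreamReader("records.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    using (SqlCommand cmd = new SqlCommand(
                        "INSERT INTO MyTable (Col1) VALUES (@val)", conn))
                    {
                        cmd.Parameters.AddWithValue("@val", line);
                        cmd.ExecuteNonQuery();
                    }
                }
            }

            // Commit. If any insert threw, the scope disposes without
            // Complete() being called and every row is rolled back.
            scope.Complete();
        }
    }
}
```

Because only one connection ever enlists in the transaction, it stays a lightweight local transaction instead of being promoted to a distributed (MSDTC) transaction.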

BTW, you can run sp_who2 on your database to see the physically open
connections.
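
One way to do that check from code is to run sp_who2 through ADO.NET and print a few of its columns; the connection string below is a placeholder:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ShowConnections
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("...")) // connection string elided
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand("sp_who2", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    // Each row is one session/connection on the server.
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}\t{1}\t{2}",
                            reader["SPID"], reader["Login"], reader["ProgramName"]);
                    }
                }
            }
        }
    }
}
```

Running this while the big-file import is in flight should make the pile-up of open connections visible directly.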

- Sahil Malik [MVP]
ADO.NET 2.0 book -
http://codebetter.com/blogs/sahil.malik/archive/2005/05/13/63199.aspx
 
