C# not closing database connections(?)

MarkusJNZ

Hi all, I have just inherited a framework written in C# which is used
to store objects into a database via ADO.NET.

When I did a bulk insert, the database started to run slowly, so I logged
into MSSQL Enterprise Manager and looked at the running processes under
Management --> Current Activity --> Process Info. The resulting screenshot
is shown below (sorry about the popups from the image hosting page!)

http://img153.imageshack.us/my.php?image=errorxp3.jpg

If I view the last command on each of these, it is "COMMIT TRANSACTION".

What usually causes these sorts of problems?


TIA
Markus
 
Is it not just the standard connection pool?

Arne
 
Hi Arne, first of all, apologies to the group for the multiple posts;
Google News was timing out when I hit the submit button.

Re the connection pooling: these processes seem to stay around for a very
long time (read: > 13 hours). Also, as these appear, our MSSQL server seems
to get slower and slower :(
thanks again
Markus
 
Hi Markus, you could try reindexing the tables involved to speed them up
(or creating indexes if none exist). Here's the command:
DBCC DBREINDEX('TableName')
Also, check for existing locks on the table using sp_lock.
Best of luck,
Patrick

 
How many clients were involved here? 1? 100? This seems a large number
if it was only one client...

Connection pooling does (as suggested by the other poster) keep
connections hanging around... but it shouldn't cause them to sit there
forever. The idea is that a single connection (or a small number of
connections) is constantly re-used per client... but note that this can
only happen if each usage does it correctly.

I would be looking in the C# data code for some way in which:
a: the connection is not closed and disposed [contrary to popular
belief, closing/disposing merely returns the connection to the connection
pool for re-use; connections *should* be closed when complete]
b: the connection is somehow made accessible outside of that code block
c: do you use different connection string / identity [context] settings
on each call?

For b, I really mean things like (see the sketch after this list):
* does the data code give a reference of the connection to some (for
instance) static collection?
* does the connection reference get persisted on a form or similar?
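
To make point (b) concrete, here is a hypothetical sketch of that kind of
leak; the class names (ConnectionCache, LeakyRepository) and the SQL are
invented for illustration, not taken from your framework:

using System.Collections.Generic;
using System.Data.SqlClient;

// Any connection added here stays referenced (and open) for the life of
// the process, so the server keeps showing it under Process Info.
static class ConnectionCache
{
    public static readonly List<SqlConnection> OpenConnections =
        new List<SqlConnection>();
}

class LeakyRepository
{
    public void Insert(string connectionString, string sql)
    {
        SqlConnection conn = new SqlConnection(connectionString);
        conn.Open();
        new SqlCommand(sql, conn).ExecuteNonQuery();

        // BUG: the open connection is handed to a static collection
        // instead of being closed/disposed, so it can never be returned
        // to the pool or garbage collected.
        ConnectionCache.OpenConnections.Add(conn);
    }
}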

Basically, if the connection isn't closed, and a reference is left "in
scope", then it will never get garbage collected. So it will never get
closed. This means that every time your app needs a connection it will
have to negotiate a new one with the server. The better solution is to
explicitly close / dispose the connection after *every* usage; when you
next open a connection with the same connection string you will
*probably* get the same physical connection back. If you don't "close",
but do let the reference go out of scope, you can often find a handful
of connections - which is where the previous .NET connection hasn't
been garbage collected yet, so still appears "in use"; this would
explain a low number - maybe into the tens for aggressive usage
[although /possibly/ (at a stretch) higher in a tight loop].
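
As a minimal sketch of that close-after-every-use pattern, assuming plain
ADO.NET with SqlClient (the class, table, and column names below are
placeholders, not from the actual framework):

using System.Data.SqlClient;

class ObjectStore
{
    private readonly string _connectionString;

    public ObjectStore(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Save(int id, string payload)
    {
        // 'using' guarantees Close/Dispose even if the insert throws;
        // the physical connection goes back to the pool and is typically
        // handed out again on the next Open() with the same string.
        using (SqlConnection conn = new SqlConnection(_connectionString))
        using (SqlCommand cmd = new SqlCommand(
                   "INSERT INTO MyObjects (Id, Payload) VALUES (@id, @payload)",
                   conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@payload", payload);
            conn.Open();
            cmd.ExecuteNonQuery();
        }   // connection returned to the pool here
    }
}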

For hundreds of connections on a single client, I'd expect fubar.

Final comment: for a bulk insert, you can also use the dedicated bulk
insert tasks (descended from "bcp", if that means something...); this is
far more efficient. I would /generally/ use a dedicated bulk insert (of
sculpted, not raw, data), followed by a single stored procedure to merge
the bulk table with the core data. Of course, not always an option.
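
One possible shape for that route, assuming SqlBulkCopy (the ADO.NET class
descended from bcp) is available; the staging table and stored procedure
names here are invented for illustration:

using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    public void Load(string connectionString, DataTable stagedRows)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Bulk copy the sculpted rows into a staging table.
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.MyObjects_Staging";
                bulk.WriteToServer(stagedRows);
            }

            // 2. Merge the staging table into the core data with a single
            //    stored procedure call.
            using (SqlCommand merge =
                       new SqlCommand("dbo.usp_MergeMyObjects", conn))
            {
                merge.CommandType = CommandType.StoredProcedure;
                merge.ExecuteNonQuery();
            }
        }
    }
}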

Marc
 
