DataReader and record locking problem


Mike P

Hi,

I am using the ExecuteReader method from the Microsoft DAAB to return a
DataReader, which I iterate through mainly to read a binary value from one
of the returned columns. I use this value as the raw data to render a PDF
document. After this succeeds, I want to update the same record with a
timestamp to indicate that the render to PDF was successful. However, using
a totally separate method in my code (using ExecuteNonQuery of the DAAB) to
do this while the reader is still open (using the PK of the row I want to
update) throws an exception - the call times out, and if I look at the
processes in the database, the update of the timestamp is being blocked by
the DataReader's process. My code is the only code accessing this table, and
the column I am updating is not one of those returned by the DataReader.
So now on to the questions:
So now on to the questions:

1) Should I theoretically be able to read a row with a DataReader and update
a column in the same row (using a separate connection/call) before moving on
to the next record?
2) If I cannot do this, what is the best and most efficient way to perform
this kind of operation (I may have to do this for up to 30,000 rows)? Should
I get a list of all the rows I want to operate on (in an array or DataTable),
iterate through them, pull the relevant data back one row at a time in a
disconnected fashion, and then perform each update? Or should I just pull
back all the records in a disconnected fashion? The latter seems like it
would work for a small number of rows, but with 30,000 that's going to be
one very big DataSet/DataTable.

Any help would be greatly appreciated

Thanks in advance

Mike
 

Sahil Malik

Answer to #1 - Yes, you should be able to. The code below (although I'd
never write such code in production) will work:

// Open a reader on the first connection
SqlConnection conn = new SqlConnection(CONN_STR);
SqlCommand cmd = conn.CreateCommand();
cmd.CommandText = "SELECT * FROM X WHERE X = 1";
conn.Open();
SqlDataReader dread = cmd.ExecuteReader();
dread.Read();

// Update the same row over a second, independent connection
// while the reader is still open
SqlConnection conn2 = new SqlConnection(CONN_STR);
SqlCommand cmd2 = conn2.CreateCommand();
cmd2.CommandText = "UPDATE X SET Z = '12/12/04' WHERE X = 1";
conn2.Open();
int i = cmd2.ExecuteNonQuery(); // number of rows affected
Console.WriteLine(i);
conn.Close();
conn2.Close();

Answer to #2 - You shouldn't have to resort to a DataSet, though that would
certainly work. 30,000 rows in a DataSet is handleable, but it's generally
more rows than I'd like to see in one.

Since you must use two separate connection objects, the transaction object
will not be of much help.
If I had to diagnose your problem, I'd do the following, in order:
1. Try putting a NOLOCK hint on the SELECT statement for the DataReader.
See if that helps.
2. Try getting rid of the DAAB (I'm not sure which overloads you're using)
and write direct ADO.NET code similar to what is written above. If that
locks too, we have isolated the problem to your tables - maybe you are
locking more than you should, in which case you might have to address the
problem outside of ADO.NET (at the database level). If it doesn't lock,
move on to #3.
3. If the above code written in pure ADO.NET doesn't lock, dump the DAAB
and write pure ADO.NET code like the above. When you do, try abstracting it
into a data access layer, call ExecuteReader with
CommandBehavior.CloseConnection, and either close connections in the DAL or
always close them using "using" blocks.
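To make step 1 concrete, here is a minimal sketch of the SELECT with a NOLOCK table hint, reusing the hypothetical table X and CONN_STR from the code above. NOLOCK means the reader takes no shared locks (so it cannot block a concurrent UPDATE), at the cost of allowing dirty reads - only appropriate when stale or uncommitted data is acceptable:

```csharp
using (SqlConnection conn = new SqlConnection(CONN_STR))
{
    conn.Open();
    SqlCommand cmd = conn.CreateCommand();
    // WITH (NOLOCK) = read uncommitted: the reader takes no shared
    // locks, so a concurrent UPDATE on the same rows is not blocked.
    // Dirty reads are possible, so only use this when that is acceptable.
    cmd.CommandText = "SELECT * FROM X WITH (NOLOCK)";
    using (SqlDataReader dread = cmd.ExecuteReader())
    {
        while (dread.Read())
        {
            // process each row as it streams in
        }
    }
}
```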

PS: My table structure for X in the above code looks like this --
CREATE TABLE [dbo].[X](
[X] [int] NOT NULL,
[Y] [varchar](10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Z] [datetime] NULL,
PRIMARY KEY CLUSTERED
(
[X] ASC
) ON [PRIMARY]
) ON [PRIMARY]

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik
 

Mike P

Sahil,

Thanks for the replies - I'll certainly give your suggestions a try. I have
a suspicion it may be the DAAB: none of the overloads seems to allow you to
pass in a CommandBehavior. I guess with this kind of abstraction you gain in
some ways (less code, etc.) but lose in others (not as much control or
flexibility). Just to clarify, you said "although I'd never write such code
in production" - are you referring to the fact that you would never have an
open DataReader and then issue an update, or just that this code is meant to
illustrate the point and has no exception handling, etc.?

Thanks again for your help

Mike

 

Sahil Malik

"Although I'd never write such code in production"
<-- What I meant was: I'd try to abstract my data access into a data access
layer, so I'm sure my connections are ALWAYS closed, everyone uses only one
connection string (to ensure pooling), and should I ever have my own
connection, it would be in a using block for the same reasons (so Dispose is
called on it automatically).
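A sketch of what that data access layer might look like (the Dal class and its method are hypothetical names of mine, not part of any library): the DAL hands out a reader opened with CommandBehavior.CloseConnection, so disposing the reader also closes the underlying connection, and callers only need a using block:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class Dal
{
    // Single shared connection string so all calls hit one pool
    const string CONN_STR = "...";

    public static SqlDataReader ExecuteReader(string sql)
    {
        SqlConnection conn = new SqlConnection(CONN_STR);
        SqlCommand cmd = new SqlCommand(sql, conn);
        conn.Open();
        // CloseConnection: disposing the reader closes the connection too,
        // so the caller cannot leak it even though the connection
        // outlives this method.
        return cmd.ExecuteReader(CommandBehavior.CloseConnection);
    }
}

// Caller side - the using block guarantees Dispose, which closes
// both the reader and its connection:
//
// using (SqlDataReader reader = Dal.ExecuteReader("SELECT * FROM X"))
// {
//     while (reader.Read()) { /* ... */ }
// }
```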

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik




 

W.G. Ryan eMVP

Mike:

Looks like Sahil hit on everything, but let me mention one thing. When you
use a DataAdapter to fill a DataTable, it takes all of the data in the query
and then fills the DataTable with it.

A DataReader is a different beast. Let's say you had 100,000 records in a
query with no WHERE clause, hence returning all of the rows in a given
table. You call ExecuteReader and then throw in a Thread.Sleep(someNumber).
Now assume that, while someNumber had yet to elapse, another 100,000 rows
were added to the database. Then someNumber elapses and you call:

while (dataReader.Read())
{
    // process each row as it streams in
}

It's entirely possible that you will end up with 200,000 passes through that
Read() loop, even though there were only 100,000 rows at the onset. Since DB
operations are usually very fast, it would probably take a pretty controlled
environment to get exactly 200,000, but the point is that if data is added
while Read is being called, the number of records you were originally
expecting may well change.

This isn't true with the DataAdapter, because even though a reader is used
to fill it, you'll only get what was there when you fired the query.

If you're dealing with 30,000 records and you don't need them all at once,
I'd definitely try to avoid using a DataAdapter here.
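Putting the thread's advice together, here is a minimal sketch of the streaming approach for the original problem: read rows with a DataReader on one connection while stamping each row over a second connection. The table name X and columns Id, BinaryData, and Z are assumptions carried over from the earlier examples, not Mike's actual schema:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class RenderAndStamp
{
    const string CONN_STR = "..."; // your connection string

    static void Main()
    {
        using (SqlConnection readConn = new SqlConnection(CONN_STR))
        using (SqlConnection writeConn = new SqlConnection(CONN_STR))
        {
            readConn.Open();
            writeConn.Open();

            SqlCommand readCmd = readConn.CreateCommand();
            readCmd.CommandText =
                "SELECT Id, BinaryData FROM X WHERE Z IS NULL";

            // One parameterized update command, reused for every row
            SqlCommand writeCmd = writeConn.CreateCommand();
            writeCmd.CommandText =
                "UPDATE X SET Z = GETDATE() WHERE Id = @id";
            writeCmd.Parameters.Add("@id", SqlDbType.Int);

            using (SqlDataReader reader = readCmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    // ... render the PDF from the binary column here ...

                    // Stamp the row just processed, on the second connection
                    writeCmd.Parameters["@id"].Value = id;
                    writeCmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```

Only one row's binary data is in memory at a time, so this scales to 30,000 rows without building a giant DataSet.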

As usual, I agree with Sahil here.
 
