Timeout expired - on a simple SqlDataReader.Read() call


CuriousGeorge

I have a very simple .NET 1.1 app that I'm writing to upgrade our
application's database for a new version. This app has a pretty tight loop
where I'm using a SqlDataReader to walk through all records in a fairly
large table, manipulate the data, then write one of the columns back out.
The problem I'm having is that after a certain period of time I'm running
into a System.Data.SqlClient.SqlException, specifically:

System.Data.SqlClient.SqlException: Timeout expired. The timeout period
elapsed prior to completion of the operation or the server is not
responding.

The odd thing to me is that the error is happening partway through, after a
couple of thousand records have been processed, and on the Read() call of
the SqlDataReader. Through the logging capabilities in my app I can see each
iteration through the loop, and I'm getting about 25 iterations per second.
Why would a read from the SqlDataReader time out like this midway? It just
doesn't make sense to me why it would be looping fine and then suddenly
can't read from the data reader. There is no time limit on how long I can
have a reader open, is there? It can't be a physical communication issue,
as it happens every time, and usually after about the same number of
iterations.

SQL Server is running on my machine, as is the application hitting it. When
I start getting the timeouts there is no disk activity and almost no CPU
usage. I've tried increasing the CommandTimeout on the SqlCommand to no
avail (it just makes it take longer to fail, but it still fails on the same
basic iteration).

Here is a snippet of my logging. You can see it iterating and then within a
split second it fails on a timeout. Doesn't make sense to me.

2/13/2006 3:08:44 PM 3068 - ExtDocStat:
p-0aaba9ff-cd53-4656-b866-23e8362bcd2e
2/13/2006 3:08:44 PM 3069 - ExtDocStat:
p-37e842e5-c86a-4baf-a529-23eb82999d2c
2/13/2006 3:08:44 PM 3070 - ExtDocStat:
p-2df6cfea-04c5-467f-ae87-23ed60232a18
2/13/2006 3:08:44 PM Unexpected error in ExtractDocumentStatus:
System.Data.SqlClient.SqlException: Timeout expired. The timeout period
elapsed prior to completion of the operation or the server is not
responding.


Below is the code I'm using, simplified to show the important info (no
comments and reduced error handling). Am I doing something basic wrong
here?

-Brett-

public void ExtractDocumentStatus(string strConn)
{
    SqlConnection sqlConnRead = new SqlConnection(strConn);
    SqlConnection sqlConnWrite = new SqlConnection(strConn);

    try
    {
        sqlConnRead.Open();
        sqlConnWrite.Open();

        SqlCommand sqlCmd = new SqlCommand("SELECT * FROM Portfolio",
            sqlConnRead);
        SqlDataReader sqlRdr = sqlCmd.ExecuteReader();
        int ordID = sqlRdr.GetOrdinal("PortID");

        while (sqlRdr.Read())
        {
            SqlGuid id = sqlRdr.GetSqlGuid(ordID);
            string sql = string.Format(
                "UPDATE DocData SET Stat='OK' WHERE Pid='{0}'", id);
            SqlCommand sqlCmd2 = new SqlCommand(sql, sqlConnWrite);
            sqlCmd2.ExecuteNonQuery();
        }

        sqlRdr.Close();
    }
    finally
    {
        sqlConnRead.Close();
        sqlConnWrite.Close();
    }
}
 

Val Mazur (MVP)

Hi,

Why are you doing it in a loop? I believe you could use just one SQL
statement to do this in one batch, something like:

UPDATE DocData SET Stat='OK' WHERE Pid IN (SELECT PortID FROM Portfolio)
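In C#, that collapses the whole loop into a single ExecuteNonQuery call. A sketch only, reusing strConn and the table names from the original snippet (assumes using System.Data.SqlClient):

using (SqlConnection conn = new SqlConnection(strConn))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(
        "UPDATE DocData SET Stat='OK' WHERE Pid IN (SELECT PortID FROM Portfolio)",
        conn);
    cmd.CommandTimeout = 0; // one set-based statement can still run a while
    int rowsAffected = cmd.ExecuteNonQuery();
}

One statement means one round trip and no reader held open against the table while the updates run.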
 

Dan Guzman

The CommandTimeout property specifies how long a command can execute on the
server before the client API cancels the query. The timer starts when you
invoke ExecuteReader and is not affected by subsequent Read calls. Note
that you can still retrieve results via your data reader while the command
is executing on the back-end or consume cached results after the timeout.
You'll get the timeout exception when you invoke Read sometime after the
timeout occurs.

As Val suggested, a set-based UPDATE is the best approach if this is
possible in your actual application. You still need to set the
CommandTimeout property to allow for the maximum time the command will
execute on the server. Personally, I usually set CommandTimeout to zero and
keep transactions as short as possible. 25 rows/sec seems a bit slow, so
check for blocking and perform tuning.
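For illustration, a minimal sketch of what Dan describes: the timeout clock starts at ExecuteReader, not at each Read, and CommandTimeout = 0 disables it entirely. Table and column names are reused from the original snippet; this is not a recommendation to leave readers open longer than necessary.

// Assumes using System.Data.SqlClient
using (SqlConnection conn = new SqlConnection(strConn))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand("SELECT PortID FROM Portfolio", conn);
    cmd.CommandTimeout = 0; // 0 = no limit; the default is 30 seconds

    // The timeout period (if any) is measured from this call onward.
    using (SqlDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            // Read() pulls from already-buffered results; it can still
            // throw the timeout exception once the command's time is up.
        }
    }
}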
 

Dave Markle

Well, first, I would very much recommend trying to do this inside SQL
Server itself, if you can.

Second, it's important to understand what SqlDataReader is actually
doing here. When you call Read() on a DataReader, the ADO.NET framework
isn't actually doing this:

App to DB: Get me one line of data
DB to App: Here's your data
App to DB: Get me one line of data
DB to App: Here's your data
App to DB: Get me one line of data
DB to App: Here's your data

What it really is doing is this:
App to DB: Get me all of the data
DB to App: I'm sending it all now! Buffer my output in another thread
so you don't have to make another round trip!

and that's it. Read() is actually reading the next data row from the
connection's internal buffer, not actually making an extra call to the
database. This gives us an important clue: your query is still
returning data while you're reading! So you get through a number of
rows before the query's timeout actually passes, at which point you
get your exception. At least, that would be my guess. How many records
are you dealing with in that base table? If you have too many, that's
probably what's causing it to blow up.
 

CuriousGeorge

My sample code was a slimmed-down version of what I'm really doing. In
reality I am doing a bunch of parsing on each record from the SqlDataReader
and then doing an update. I don't have the T-SQL expertise to do the string
parsing in a single SQL statement, so I'm doing it in code.

-Brett-
 

CuriousGeorge

I have experimented with setting the CommandTimeout on the command used for
the SqlDataReader and it does not help. It really just inserts a delay
before the error happens.

-Brett-
 

CuriousGeorge

Yes, I am sifting through a lot of data with the SqlDataReader. It can't
possibly be caching the entire select results; otherwise, how would it be
any different in memory consumption than a DataSet? As I understood it, the
whole purpose of a DataReader was to efficiently access large amounts of
data and allow you access to it row by row.

-Brett-
 

William (Bill) Vaughn

When you execute a query with a DataReader, the backend executes the query
and begins finding rows that match the criteria. Once it finds enough rows
to fill its buffer (probably cache size), the server signals the client that
there are rows available and stops looking for rows. It does not cache the
entire rowset on the server or in the pipe. Once a buffer is available,
ADO.NET returns control to you (assuming you aren't using the async
BeginExecuteReader). Each time you do a Read, the DAL fetches the cache and
the server fills the next cache block. This means that until you Read the
last buffer, the server holds locks on portions of the server-side rowset
being processed. Your client is delivered a block of rows at a time via a
low-level TDS pipe (assuming we're talking about SQL Server).

Actually, the DataReader is used by the DataAdapter Fill method as well.
It's simply the low-level data access mechanism that's always been there
since the dawn of time; it's just been exposed directly for the first time
since DB-Library.

No, none of the data access interfaces (like DAO, OLE DB, ODBC, ADO or
ADO.NET) are designed to help you move large amounts of data. That's what
the DTS or BCP interfaces are for. And no, you should avoid application
designs that visit each row in a rowset for individual processing--that's
what stored procedures are for. Why move the data to the client for
processing when the server can do it in place?

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 

CuriousGeorge

That makes more sense to me Bill. Thanks for the detail.

So what I am trying to do is inefficient; I'll freely admit that. My
reasoning is purely lack of T-SQL expertise and a large amount of string
processing that I need to do on each record (which is simple in C#,
presumably not so simple in T-SQL).

Ignoring my inefficient design, there is nothing fundamentally wrong with
what I am trying to do, correct? So what might be causing my timeout
errors? Am I somehow hitting a deadlock between the DataReader and the
ExecuteNonQuery call on the same table? I really don't know where to go
from here.

-Brett-
 

William (Bill) Vaughn

Yes, it's not uncommon to lock oneself out of a data table doing precisely
what you're attempting to do.
Another approach you might consider if you have a lot of string handling to
do is to write a TSQL stored procedure that calls a CLR function to do the
string handling. I'm working on the CLR executables chapter for my new book
and my tests show that heavy string handling can be done faster (in some
cases a lot faster) than plain TSQL. If you factor in the overhead of moving
the data to the client and back...
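As a rough sketch of that SQLCLR approach (this requires SQL Server 2005's CLR integration, which a .NET 1.1 app's server may not have; the function name and string logic here are entirely hypothetical):

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class StringHelpers
{
    // Hypothetical CLR scalar function doing string work that would be
    // clumsy in plain T-SQL; runs inside the server process.
    [SqlFunction]
    public static SqlString NormalizeStatus(SqlString input)
    {
        if (input.IsNull)
            return SqlString.Null;

        return new SqlString(input.Value.Trim().ToUpperInvariant());
    }
}

After CREATE ASSEMBLY and CREATE FUNCTION register it, the whole job stays server-side, e.g. UPDATE DocData SET Stat = dbo.NormalizeStatus(Stat), with no data moving to the client at all.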

So, why are you timing out? Check sp_lock to see if there are any locks
(remember, they might be row, page or table locks) that are preventing you
from continuing. Another approach might be to create a local DataTable with
the rowset and post-process those rows. Again, I still think a SQL
Server-only solution would be the best alternative. Stretching your TSQL
skills is good for your career too... ;)
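A sketch of the local DataTable idea, assuming the tables from the original snippet: fill a disconnected table first, so no reader (and none of its server-side locks) is open while the updates run.

// Assumes using System; System.Data; System.Data.SqlClient
DataTable portfolio = new DataTable();
using (SqlConnection conn = new SqlConnection(strConn))
{
    SqlDataAdapter adapter =
        new SqlDataAdapter("SELECT PortID FROM Portfolio", conn);
    adapter.Fill(portfolio); // opens and closes the connection itself
}

using (SqlConnection conn = new SqlConnection(strConn))
{
    conn.Open();
    foreach (DataRow row in portfolio.Rows)
    {
        // Do the C# string processing here, then write the result back.
        SqlCommand cmd = new SqlCommand(
            "UPDATE DocData SET Stat='OK' WHERE Pid=@pid", conn);
        cmd.Parameters.Add("@pid", SqlDbType.UniqueIdentifier).Value =
            (Guid)row["PortID"];
        cmd.ExecuteNonQuery();
    }
}

The trade-off is memory: the whole rowset is held client-side, which is exactly what the DataReader was avoiding, so this only suits tables that fit comfortably in RAM.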


 
