ADO.NET query execution much slower than SQL Management Studio


dustbort

I have a stored procedure that when called from SQL 2005 Management Studio
takes less than one second to execute. The same SP, when called from .NET
code takes about 13 seconds. I am using a SqlCommand object with
CommandType set to StoredProcedure and I am passing arguments through the
parameters collection. I have tried using a SqlDataReader and a DataAdapter
to retrieve the data, but both are equally slow. From stepping through the
debugger, I know that the specific statement that takes a long time to
execute is either command.ExecuteReader() or dataAdapter.Fill(dataTable),
depending on which method I tried. I did a trace in the Profiler and got
nearly identical results for either method of the .NET SqlClient Data
Provider. Here is an example:

EventClass: RPC:Completed
CPU: 13390
Reads: 559475
Writes: 0
Duration: 13496
Binary Data: (a long hex value)

When I copied the TextData (SQL Statement being executed) from Profiler into
SQL Management Studio, I get the following trace:

EventClass: SQL:BatchCompleted
CPU: 437
Reads: 9998
Writes: 0
Duration: 440
BinaryData: (empty)

(Immediately prior to this there is a corresponding SQL:BatchStarting trace,
with empty CPU, Reads, Writes, and Duration columns.)

What could explain the orders of magnitude difference in reads and duration?
Is the problem due to RPC? What about the binary data? I have tried using
the overload of ExecuteReader(CommandBehavior.SingleResult) with no
improvement. What can I try to improve it?
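For reference, one way to make a Management Studio test closer to the RPC:Completed event that Profiler captures is to invoke the procedure with typed parameters through sp_executesql rather than pasting the batch text; the procedure and parameter names below are placeholders, not the actual ones from this thread:

```sql
-- Hypothetical procedure and parameters; substitute your own.
-- Calling through sp_executesql with declared parameter types is
-- closer to the parameterized RPC call the SqlClient provider issues
-- than running the copied TextData as an ad hoc batch.
EXEC sp_executesql
    N'EXEC dbo.MyStoredProc @StartDate, @CustomerId',
    N'@StartDate datetime, @CustomerId int',
    @StartDate = '2007-01-01',
    @CustomerId = 42;
```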

Thanks,
Dustin
 

Cowboy (Gregory A. Beamer)

Make sure you are testing properly. Run the query twice from ADO.NET and see
if you get a different read time. There are two reasons I can think of,
without much thought, why you might get different read times between the
first and second run:

1. JIT time
2. Stats being compiled on SQL Server

If you find a difference, it may be that simply preloading your app before
use will solve your issue. If not, here are a variety of things you can look
at:

1. Library used to connect to SQL Server - if you can use shared memory, it
is fastest; none should be slow overall, so this is just shaving a bit of
time, not a full solution
2. ADO.NET provider used (ODBC, OLE DB, or SqlClient)
3. Check the indexes
a) Any fragmented indexes need to be rebuilt
b) Do you have the right indexes for your query?
4. Would adding some locking hints, etc. help your query?
5. Do you need hints for statistics?
6. Can you improve the query? Look at the execution plan, as it will give
you a lot of information on straightening out your world
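A minimal T-SQL sketch of the index, statistics, and plan checks above (all object names here are placeholders) might look like this:

```sql
-- Rebuild a fragmented index (item 3a); table and index names are placeholders.
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD;
GO
-- Refresh statistics so the optimizer sees current data distribution.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
GO
-- Show the estimated plan for the statements that follow (item 6);
-- while SHOWPLAN_TEXT is ON, statements are compiled but not executed.
SET SHOWPLAN_TEXT ON;
GO
EXEC dbo.MyStoredProc @StartDate = '2007-01-01';
GO
SET SHOWPLAN_TEXT OFF;
GO
```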

If you are dynamically building statements in a sproc, you will find that it
ends up having to recompile each time and redo its stats. When you compare
this to the Profiler command (meaning what is actually run in the sproc
versus running the sproc statement from Profiler), you can see a huge
difference in time. This is why you need to make sure you are running the
sproc in each instance and not just the statement. Depending on how you
profile, you might only see the sproc call.
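As an illustration of the recompile point, assuming a hypothetical Orders table, the two styles of dynamic SQL behave quite differently:

```sql
-- Concatenated dynamic SQL: every distinct literal value produces a
-- different batch text, so each one compiles its own plan.
DECLARE @sql nvarchar(max), @id int;
SET @id = 42;
SET @sql = N'SELECT * FROM dbo.Orders WHERE CustomerId = '
         + CAST(@id AS nvarchar(12));
EXEC (@sql);

-- Parameterized form: the batch text is constant, so one plan is
-- compiled and then reused across parameter values.
EXEC sp_executesql
    N'SELECT * FROM dbo.Orders WHERE CustomerId = @id',
    N'@id int',
    @id = @id;
```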

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
 

dustbort

Gregory,

Thanks for your ideas. I went to test them and it seems that the problem
has miraculously vanished. I had just copied the production database to my
workstation and updated the stored procedures for testing, prior to having
the slow results. I wonder if the database does some initialization that
had not completed at the time of the first tests? Something that would
affect the .NET SqlClient but not SQL Mgmt Studio?
 

William Vaughn

A stored procedure uses a query plan that's created when the SP is first
executed, and it reuses that plan from that point forward--regardless of the
plan's suitability--until it's replaced. The plan is optimized based on the
indexes and the statistics that describe how the data is distributed in the
target indexes/tables (and many other factors). It's not a good idea to test
with VS for a number of reasons. Create an executable and run that to test
how well it performs.
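If a stale cached plan is the suspect, one way to force a fresh compile is sketched below; the procedure name is a placeholder:

```sql
-- Mark the procedure so its cached plan is discarded; a new plan is
-- compiled on the next execution.
EXEC sp_recompile N'dbo.MyStoredProc';

-- Or define the procedure to compile a fresh plan on every call,
-- trading compile time for plan suitability:
-- CREATE PROCEDURE dbo.MyStoredProc @CustomerId int
-- WITH RECOMPILE
-- AS ...
```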

hth

--
__________________________________________________________________________
William R. Vaughn
President and Founder Beta V Corporation
Author, Mentor, Dad, Grandpa
Microsoft MVP
(425) 556-9205 (Pacific time)
Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
____________________________________________________________________________________________
 

Cowboy (Gregory A. Beamer)

Definitely.

When you first start using a new database, SQL Server creates statistics.
Until they are created, the first few queries run slowly, as will any new
type of query, since it cannot hit existing stats.
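On a freshly copied database, a hedged sketch for bringing statistics up to date (the database name is a placeholder) would be:

```sql
-- Update statistics for all tables in the current database.
EXEC sp_updatestats;

-- Confirm SQL Server is allowed to create missing statistics
-- automatically (returns 1 if enabled).
SELECT DATABASEPROPERTYEX('MyDatabase', 'IsAutoCreateStatistics');
```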

 

google


I had a similar issue. I think it's due to fragmentation of the tables.
Look at the indexes on the tables.
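To check fragmentation on SQL Server 2005, a query along these lines will report it per index (the columns selected here are one reasonable choice):

```sql
-- Report fragmentation for all indexes in the current database
-- (sys.dm_db_index_physical_stats is available on SQL Server 2005+).
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
ORDER BY s.avg_fragmentation_in_percent DESC;
```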
 
