ADO.NET Query Execution time is 10 times slower than via Query Analyzer


Oleg

Hi,

I have a stored procedure which uses temporary tables
and does some calculations.
It executes in 3 sec when I use Query Analyzer.
When I execute it using an ADO.NET client it takes 30 sec.
I looked at the execution time using Profiler, so this is not a .NET
Framework bottleneck:
SQL Server executed the same stored procedure ten times slower.
I use SQL 2000 SP2.
Another interesting thing that I've found:
when I changed the stored procedure to use table variables (@tablename)
instead of temporary (#) tables, the Query Analyzer execution time became one
minute(!!!)
and the ADO.NET execution time remained the same (30 sec).

Does anyone know how to solve this problem? My application is written
in C#,
and I don't want my SP to execute for 30 seconds when only 3 sec is really
required.

Any help is appreciated,

Oleg
 

Frans Bouma [C# MVP]

Oleg said:
Hi,

I have a stored procedure which uses temporary tables
and does some calculations.
It executes in 3 sec when I use Query Analyzer.
When I execute it using an ADO.NET client it takes 30 sec.
I looked at the execution time using Profiler, so this is not a .NET
Framework bottleneck:
SQL Server executed the same stored procedure ten times slower.
I use SQL 2000 SP2.
Another interesting thing that I've found:
when I changed the stored procedure to use table variables (@tablename)
instead of temporary (#) tables, the Query Analyzer execution time became one
minute(!!!)
and the ADO.NET execution time remained the same (30 sec).

Does anyone know how to solve this problem? My application is written
in C#,
and I don't want my SP to execute for 30 seconds when only 3 sec is really
required.

ADO.NET uses: EXEC sp_executesql...
You might have used in QA: EXEC proc...

It shouldn't make a real difference, though. Do you use the same user
for connecting to the db in the ADO.NET example as in QA? Also, 30
seconds is the default timeout. Are you sure the stored proc is
finished after 30 seconds, or does your routine just return and swallow the
exception?
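
If the timeout is what you're hitting, you can raise it on the command; a rough,
untested sketch (connection string and proc name are placeholders):

using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection("Server=...;Database=...;Integrated Security=SSPI"))
using (SqlCommand cmd = new SqlCommand("dbo.MyProc", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 120;   // default is 30 seconds; 0 means wait indefinitely
    conn.Open();
    cmd.ExecuteNonQuery();
}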

Temp tables can be slow if tempdb is full and has to be expanded.
For the rest it should be fast. What's weird is that your memory tables
(table vars) are slower than temp tables.

Frans


--
 

William (Bill) Vaughn

We often see performance differences between OSQL/ISQL/SQLCMD and ADO. I
expect it has to do with rowset population issues.

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 

Oleg

Hi Frans,
It shouldn't make a real difference, though. Do you use the same user for
connecting to the db in the ADO.NET example as in QA? Also, 30 seconds is
the default timeout. Are you sure the stored proc is finished after
30 seconds, or does your routine just return and swallow the exception?

The stored proc does finish after 30 seconds and returns the result.
Temp tables can be slow if tempdb is full and has to be expanded. For
the rest it should be fast. What's weird is that your memory tables (table
vars) are slower than temp tables.

It's not weird... Many times I have found that table variables are much slower
than temp tables, especially when the number of processed rows is large.
My experience shows that for large amounts of data
temp tables are usually faster than @tables, despite BOL stating the opposite.

Oleg
 

Oleg

Hi Bill,
for me it does not have to do with rowset population issues,
because this is the execution time difference reported by SQL Profiler.
I did not measure it on the client machine.

Oleg
 

angelsbadillos[ms]

This is a topic that keeps repeating and I really would like to get to
the bottom of it. The problem I have had trying to understand this
behavior is that I _don't_ have a query that has a dramatic performance
difference _and_ does what it is intended to do in Query Analyzer.

The last time I investigated this issue I was told that Query Analyzer
was optimized for reading operations and would sacrifice correctness for
performance. Not buying too much into this, I looked at the SET
properties and could only find one difference: SET ARITHABORT is on in
one and off in the other (QA / SqlClient, sorry, I don't remember which is
which).

I guess what I want to ask is: 1) is it possible to share a query/table
schema that repros this problem in a standalone way?
2) Probably more realistic, can you try setting ARITHABORT on and off to
see if the problem is that your query throws an exception and QA just
returns immediately while SqlClient tries to do the right thing?

Thanks,
Angel
 

Oleg

Hi Angel,

Thank you for the information.
It makes the problem even more mysterious.
First, I don't get any exception
(at least SQL Profiler shows no exception from either QA or the .NET client).
The result sets returned by Query Analyzer and the .NET SQL client are
absolutely equal and correct.
Now it looks like this:

- QA - 3 sec, .NET client - 30 sec
- SET ARITHABORT ON for the .NET SQL client
- .NET client still takes 30 sec
but
- after I execute my stored proc in QA even once (having set ARITHABORT ON
for the .NET SQL client),
the .NET client execution time becomes 3 sec too!
So, with ARITHABORT ON, the .NET client executes for 30 sec only if QA has not
called the stored proc at all.
If QA has called the stored proc at least once,
then the .NET SQL client execution time also becomes 3 sec.
It looks like QA forces SQL Server to recompile the stored proc in a
more efficient way.

So, ARITHABORT ON helps only if I have a chance to call the stored proc from QA
before actually calling it from the .NET client...

Thanks,

Oleg
 

Oleg

Angel,

This is what SQL Profiler shows for the stored proc execution time:


- QA
Duration: 2753
Reads: 51275

- ARITHABORT set to ON, QA never executed the stored proc after it was compiled
- .NET client
Duration: 26170
Reads: 471275

- ARITHABORT set to ON, QA executed the stored proc after it was compiled
- .NET client
Duration: 2693
Reads: 50666

Oleg
 

Oleg

Angel,

I've found the difference.
It looks like for the .NET client,
SQL Server does not use temporary table
statistics when the stored proc execution plan is compiled.
QA execution plans (taken from the syscacheobjects table)
contain records like this:
SELECT statman([GroupLevel],[GroupID],@PSTATMAN)
FROM (SELECT TOP 100 PERCENT [GroupLevel],[GroupID] FROM
[dbo].[#temp1_______
.NET client execution plans do not contain such records.

The only question is why that is so. :)
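
To check whether the cached plan itself is the culprit, one thing worth trying
(it may or may not help here) is forcing the proc to be recompiled on its next
execution before the .NET call; a rough, untested sketch, assuming the proc is
named dbo.MyProc and conn is an open SqlConnection:

using System.Data.SqlClient;

// mark the proc for recompilation; its plan is rebuilt on the next execution
using (SqlCommand flush = new SqlCommand("EXEC sp_recompile N'dbo.MyProc'", conn))
{
    flush.ExecuteNonQuery();
}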


Oleg
 

Anton Sokolovsky

Try cleaning out the procedure cache and memory buffers in QA:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE

Doing so before you test your query in QA prevents the use of cached execution
plans and cached data pages.
 

bgundas

Hello, I need immediate help with this problem.
From C# code I am trying to call a stored procedure (an Oracle function) which is
in an Oracle 8i db.

First of all, here is the definition of the function (Oracle function):

//==================================================================

FUNCTION INSERT_COMMENTS( iv_acc_id  IN VARCHAR2,
                          iv_user_id IN VARCHAR2,
                          iv_comment IN VARCHAR2 )
RETURN BOOLEAN;

//==================================================================

My catch block catches this error coming from the Oracle side:

//==================================================================

In insertComments2 method => ORA-06550: line 1, column 7:
PLS-00306: wrong number or types of arguments in call to 'INSERT_COMMENTS'
ORA-06550: line 1, column 7:
PL/SQL: Statement ignored

//==================================================================

And here is my C# code that tries to make the call to Oracle:

//==================================================================

public static void insertComments2(string accID)
{
    try
    {
        string useridv = "DAYBREAK";
        string STR_COMMMENT = "Wellcome letter has been generated";

        // create an instance of the command object, giving the procedure name
        OleDbCommand sqlCmd2 = new OleDbCommand("ACCP50.INSERT_COMMENTS", myConn);

        // define the command type you are executing as a stored procedure
        sqlCmd2.CommandType = CommandType.StoredProcedure;

        sqlCmd2.Parameters.Add("iv_acc_id", OleDbType.VarChar, 20);
        sqlCmd2.Parameters["iv_acc_id"].Direction = ParameterDirection.Input;

        sqlCmd2.Parameters.Add("iv_user_id", OleDbType.VarChar, 20);
        sqlCmd2.Parameters["iv_user_id"].Direction = ParameterDirection.Input;

        sqlCmd2.Parameters.Add("iv_comment", OleDbType.VarChar, 40);
        sqlCmd2.Parameters["iv_comment"].Direction = ParameterDirection.Input;

        sqlCmd2.Parameters.Add("RETURN BOOLEAN", OleDbType.Boolean);
        sqlCmd2.Parameters["RETURN BOOLEAN"].Direction = ParameterDirection.ReturnValue;

        detailLog(accID, w);

        sqlCmd2.Parameters["iv_acc_id"].Value = accID;
        sqlCmd2.Parameters["iv_user_id"].Value = useridv;
        sqlCmd2.Parameters["iv_comment"].Value = STR_COMMMENT;

        // execute the stored procedure
        sqlCmd2.ExecuteNonQuery();

        // if ((string)(sqlCmd2.Parameters["RETURN BOOLEAN"].Value.ToString()) == "true")
        //     detailLog(" Success. Comments has been inserted successfully.", w);
        // else
        //     detailLog(" FAILED. Comment insertion failed.", w);
    }
    catch (Exception error)
    {
        detailLog("In insertComments2 method => " + error.Message, w);
        System.Console.Write(error.Message);
    }
}

//==================================================================
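
A note on what commonly causes PLS-00306 in a call like this: when a function is
invoked with CommandType.StoredProcedure, the return-value parameter generally has
to be the first parameter added to the collection, and PL/SQL BOOLEAN is not a
type the OLE DB provider can bind at all, so a function returning BOOLEAN usually
needs a small PL/SQL wrapper that returns a NUMBER instead. A rough, untested
sketch under those assumptions, reusing myConn and accID from the code above; the
wrapper name INSERT_COMMENTS_NUM is made up:

using System;
using System.Data;
using System.Data.OleDb;

// ACCP50.INSERT_COMMENTS_NUM is a hypothetical wrapper that returns NUMBER (1/0)
OleDbCommand cmd = new OleDbCommand("ACCP50.INSERT_COMMENTS_NUM", myConn);
cmd.CommandType = CommandType.StoredProcedure;

// the return value must be the first parameter in the collection
cmd.Parameters.Add("ret", OleDbType.Numeric).Direction = ParameterDirection.ReturnValue;
cmd.Parameters.Add("iv_acc_id", OleDbType.VarChar, 20).Value = accID;
cmd.Parameters.Add("iv_user_id", OleDbType.VarChar, 20).Value = "DAYBREAK";
cmd.Parameters.Add("iv_comment", OleDbType.VarChar, 40).Value = "Welcome letter has been generated";

cmd.ExecuteNonQuery();
bool inserted = Convert.ToInt32(cmd.Parameters["ret"].Value) == 1;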
 

Angel Saenz-Badillos[MS]

Oleg,
I don't really understand this; I am trying to get other people to look at
it.

Thank you for taking the time to investigate this,

--
Angel Saenz-Badillos [MS] Managed Providers
This posting is provided "AS IS", with no warranties, and confers no
rights. Please do not send email directly to this alias.
This alias is for newsgroup purposes only.
I am now blogging about ADO.NET: http://weblogs.asp.net/angelsb/




Oleg said:
Angel,

I've found the difference.
It looks like for the .NET client,
SQL Server does not use temporary table
statistics when the stored proc execution plan is compiled.
QA execution plans (taken from the syscacheobjects table)
contain records like this:
SELECT statman([GroupLevel],[GroupID],@PSTATMAN)
FROM (SELECT TOP 100 PERCENT [GroupLevel],[GroupID] FROM
[dbo].[#temp1_______
.NET client execution plans do not contain such records.

The only question is why that is so. :)


Oleg


 
