Re: Using Stored Procedures


Steven Blair

Hi,

I have a system which processes thousands of transactions per hour. Part of
each transaction process involves calling a stored procedure which
updates various tables.
What I want to know is: is a stored procedure like a function in terms
of performance? Each time the SP is about to be called, will memory be
allocated for it on the stack (including all the variables it requires)?
If so, this allocation would happen for however many transactions I was
processing.

If this is the case, is there a way in C# to cache the stored procedure
to prevent this from happening, so that after the first transaction has
been processed the SP already exists in memory?

I am using SQL Server.

Any help on this would be appreciated.

Regards,

Steven
 

David Browne

Steven Blair said:
Hi,

I have a system which processes thousands of transactions per hour. Part of
each transaction process involves calling a stored procedure which
updates various tables.
What I want to know is: is a stored procedure like a function in terms
of performance? Each time the SP is about to be called, will memory be
allocated for it on the stack (including all the variables it requires)?
If so, this allocation would happen for however many transactions I was
processing.

If this is the case, is there a way in C# to cache the stored procedure
to prevent this from happening, so that after the first transaction has
been processed the SP already exists in memory?

I am using SQL Server.

The SP will be cached in memory by Sql Server automatically. Sql Server may
allocate storage for the parameters and local variables, but that is so
cheap it isn't even worth considering.

David
 

Scott Allen

Hi Steven:

When SQL Server encounters a stored procedure it needs to generate an
optimized execution plan to run it. This plan is cached by SQL Server
and may be reused on subsequent calls.

There is really nothing you can do from C# to further optimize this
process, but you could monitor SQL Server with perfmon to measure how
many recompiles take place.

Sometimes a saved query plan will get kicked out of the cache, for
instance because it is not used frequently. Other times SQL Server might
force a recompile because the underlying data has changed and it might be
able to generate a better plan. The SQL you write can also affect how
cacheable the plan is: procedures that intermingle DDL and DML (to create
temp tables, for instance) may need to recompile.
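
One thing that does help from the client side is calling the procedure as a
parameterized command rather than building ad hoc SQL strings, so each call
maps onto the same cached plan. A rough sketch of what I mean (the procedure
name and parameters below are placeholders, not your actual schema):

using System.Data;
using System.Data.SqlClient;

class SpCallExample
{
    static void ProcessTransaction(string connectionString, int transactionId, decimal amount)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("usp_ProcessTransaction", conn))
        {
            // CommandType.StoredProcedure sends a parameterized RPC call rather than
            // ad hoc SQL text, so every invocation reuses the procedure's cached plan.
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@TransactionId", SqlDbType.Int).Value = transactionId;
            cmd.Parameters.Add("@Amount", SqlDbType.Decimal).Value = amount;

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}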
 

Nicholas Paldino [.NET/C# MVP]

Steven,

In addition to what David and Scott said, your transaction rate is
actually pretty low (even at 9999 an hour, that's a little under 3 a second,
which, depending on the stored procedure, really is nothing for SQL Server
to handle). It is good that you are using stored procedures, but with
throughput this low, I would say that you can't do better than stored
procedures.

However, this does make one wonder why you are asking the question. Are you
experiencing a bottleneck, and if so, what are you doing that could be
causing it?
 

Steven Blair

We are looking at ways to put through more transactions.
Studying the system, the time it takes to load the transactions is the
killer. The number I gave you (10,000 per hour) is actually a lot less
than we are aiming for. I think we should be doing 4 or 5 per second anyway.
It's quite a lot of hits to the database. What I might not have made clear
is the amount of work the SPs actually do. This isn't just a few lines; we
are talking pages of code.
 

David Browne

Steven Blair said:
We are looking at ways to put through more transactions.
Studying the system, the time it takes to load the transactions is the
killer. The number I gave you (10,000 per hour) is actually a lot less
than we are aiming for. I think we should be doing 4 or 5 per second anyway.
It's quite a lot of hits to the database. What I might not have made clear
is the amount of work the SPs actually do. This isn't just a few lines; we
are talking pages of code.

If you are invoking the SP many times in quick succession, you should wrap
multiple calls in one transaction. SQL must do a certain amount of physical
IO to commit each transaction. If you don't wrap your commands in a
transaction, then SQL must commit after each statement which updates data.
There may be many such statements in each procedure call, so this can be a
significant bottleneck.
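
Something along these lines is the idea (a rough sketch only;
usp_ProcessTransaction and its parameter are placeholders for whatever your
procedure actually takes):

using System.Data;
using System.Data.SqlClient;

class BatchExample
{
    static void ProcessBatch(string connectionString, int[] transactionIds)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tran = conn.BeginTransaction();
            try
            {
                foreach (int id in transactionIds)
                {
                    using (SqlCommand cmd = new SqlCommand("usp_ProcessTransaction", conn, tran))
                    {
                        cmd.CommandType = CommandType.StoredProcedure;
                        cmd.Parameters.Add("@TransactionId", SqlDbType.Int).Value = id;
                        cmd.ExecuteNonQuery(); // work is not durable until Commit
                    }
                }
                tran.Commit();   // one log flush for the whole batch
            }
            catch
            {
                tran.Rollback(); // undo the entire batch if any call fails
                throw;
            }
        }
    }
}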

David
 

Steven Blair

David,

Not sure I understand what you mean.
One transaction requires a lot of processing before being loaded. One SP
is responsible for this. In this SP are multiple statements: inserts and
updates across multiple tables.

As I said before, the SPs can be pages long.

Steven
 

Scott Allen

Hi Steven:

There is a lot of good information in the following document:

Improving .NET Application Performance and Scalability
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnpag/html/scalenet.asp

Particularly:

Chapter 12 - Improving ADO.NET Performance
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnpag/html/scalenetchapt12.asp

Chapter 14 - Improving SQL Server Performance
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnpag/html/scalenetchapt14.asp
 

David Browne

Steven Blair said:
David,

Not sure I understand what you mean.
One transaction requires a lot of processing before being loaded. One SP
is responsible for this. In this SP are multiple statements: inserts and
updates across multiple tables.

As I said before, the SPs can be pages long.

Your procedures may run significantly faster if you start a SqlTransaction.
If you don't start a transaction then after each separate insert, update or
delete, Sql Server must flush the transaction log to disk. If you start a
transaction, Sql Server can wait until you commit the transaction to flush
the log to disk.

David
 
