DataSets vs DataReaders

michaeltorus

I've got an n-tier web app consisting of ASP.NET, a business object layer (BOL) and a data access layer (DAL). Up to now we've always used DataSets, but we hardly use any of the functionality they provide.

One of the early reasons for not using DataReaders from the start was that a DataReader holds a connection open to the DB. Another reason was that we run all our ASP.NET pages on different machines from the BOL and DAL layers.

What I'd like to know is: does it really matter, when in 95% of all cases all we ever do with a DataSet is bind it to a data grid of some sort?

If we switched this to using a DataReader, what problems would there be?

Also, if in some business layer code I were to retrieve, say, 100,000 records, then loop through them and perform some kind of action on each iteration, whether a DB call, an asynchronous process, etc., would the database records be locked for the entire duration of the loop if it was done with a DataReader, or is a lock only applied to the row it is currently reading?

Thanks

Mike
 
Cor Ligthert [MVP]

Michael,

Maybe this simple answer makes your decision very easy.

A data adapter (used to fill a dataset or a datatable) internally uses a
data reader.
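A minimal sketch of what that means in practice (SQL Server assumed; the connection string and table here are placeholders, not anyone's actual code):

```csharp
using System.Data;
using System.Data.SqlClient;

string connString = "Server=.;Database=Northwind;Integrated Security=true;";

// What you write with a data adapter...
DataTable viaAdapter = new DataTable();
using (SqlDataAdapter da = new SqlDataAdapter("SELECT * FROM Customers", connString))
{
    da.Fill(viaAdapter); // opens the connection, streams rows, then closes it
}

// ...is roughly what the adapter does for you under the covers:
DataTable viaReader = new DataTable();
using (SqlConnection conn = new SqlConnection(connString))
using (SqlCommand cmd = new SqlCommand("SELECT * FROM Customers", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        viaReader.Load(reader); // the adapter streams rows through a reader just like this
    }
}
```

Either way a reader is doing the row-by-row streaming; the adapter just manages the connection and buffers the rows into a table for you.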

I hope this helps,

Cor
 
michaeltorus

Hi

That's not really what I'm asking. What I'm trying to find out is: by changing all my code to use a DataReader, are there any problems when binding to an ASP.NET server control, when the web box and the BOL/DAL machines are different, and separated by a firewall?

And when looping through a DataReader, what kind of locks are applied by SQL Server, especially if I am performing operations on each row returned by the DataReader? Does it lock the entire result set, or just the row currently being streamed?

Thanks
 
Sahil Malik [MVP]

Michael,

If you aren't using any DataSet features, then it may not be a bad idea to
switch to using DataReaders exclusively. But do consider that with the data
adapter and DataSet combination you are leveraging the fact that your
connections are closed as soon as possible (assuming you let the data adapter
handle your connections for you). Also, if you are updating data back, then
in single-table situations data adapters are very helpful.

It may be tempting to think that you can simply bind a data reader
directly to, say, a drop-down. While theoretically that may be a good idea,
you have to be careful not to leave open connections. You may want to
consider using a hybrid approach like this -
http://codebetter.com/blogs/sahil.malik/archive/2004/12/11/36078.aspx . In
other words, ensure that your architecture - the data layer or some closed
loop around it - is wholly responsible for keeping things clean, i.e. closing
connections. If you can achieve that, then I see no problems switching to
data readers.

But if there is any doubt, then that extra oomph of performance you are
trying to garner may be lost very easily.
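One way to sketch such a "closed loop" (the class name, connection string and query below are placeholders, not a real API) is a data-layer helper that uses CommandBehavior.CloseConnection, so that disposing the reader always closes the connection with it:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class Dal
{
    public static SqlDataReader ExecuteReader(string sql)
    {
        SqlConnection conn = new SqlConnection(
            "Server=.;Database=AppDb;Integrated Security=true;");
        try
        {
            SqlCommand cmd = new SqlCommand(sql, conn);
            conn.Open();
            // CloseConnection ties the connection's lifetime to the reader:
            // when the caller disposes the reader, the connection closes too.
            return cmd.ExecuteReader(CommandBehavior.CloseConnection);
        }
        catch
        {
            conn.Dispose(); // don't leak the connection if Open/ExecuteReader throws
            throw;
        }
    }
}
```

A caller then only has to dispose the reader - `using (SqlDataReader reader = Dal.ExecuteReader("SELECT Id, Name FROM Customers")) { grid.DataSource = reader; grid.DataBind(); }` - and the connection is guaranteed to be returned to the pool.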

Regarding your other questions ---

Firewall etc. shouldn't cause much of an issue. But I am personally not
comfortable saying that data reader databinding will behave the same as
dataset databinding. Maybe for read-only purposes, but I'd test it out in
every situation. If you are not worried about updating, you may consider
creating a dumb data bucket other than a DataSet.
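One possible shape for such a bucket (all names here are illustrative, assuming SQL Server): a plain serializable class, filled from the reader and then disconnected.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// A "dumb data bucket": serializable, holds no connection, and is far
// lighter than a DataSet.
[Serializable]
public class CustomerRow
{
    public int Id;
    public string Name;
}

public static class CustomerData
{
    public static List<CustomerRow> GetCustomers(string connString)
    {
        List<CustomerRow> rows = new List<CustomerRow>();
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("SELECT Id, Name FROM Customers", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    CustomerRow row = new CustomerRow();
                    row.Id = reader.GetInt32(0);
                    row.Name = reader.GetString(1);
                    rows.Add(row);
                }
            }
        } // connection is closed here, before any databinding happens
        return rows; // bind this list, or pass it across tiers
    }
}
```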

The locks applied by SQL Server are no different than in the case of a data
adapter filling a dataset (well, what gets locked is almost black magic, but
this general blanket statement is relatively accurate). The lock could be
row by row, page by page, or the entire table - it depends on a lot of
factors. The bigger deal here is to just assume that some rows will be
locked due to the implicit transaction; you do not really get behavior
different from a data adapter - with one exception. The data adapter ensures
that the locks are cleanly released, and that the locks are kept for the
minimum possible time (because the data adapter fills the dataset and
quits). Keeping open connections is ULTRA AWFUL. Leaving one open connection
is WAYYYY worse than all the huge memory-consuming datasets on your web
server.

HTH :)

- Sahil Malik [MVP]
ADO.NET 2.0 book -
http://codebetter.com/blogs/sahil.malik/archive/2005/05/13/63199.aspx
 
michaeltorus

Hi

Thanks for the reply.

I'm pretty confident that I'll be closing connections correctly and as soon as possible (i.e. as soon as I've looped through the DataReader). However, is there still anything inherently wrong with passing my DataReader reference across the wire, so to speak, to bind it to a server control?

Mike
 
Cor Ligthert [MVP]

Sahil,
I'm not challenging what you said, but as for the basis of the statement
that a data adapter internally uses a data reader: among other places, the
introduction to data adapters in the documentation gives, in my opinion, a
very simple answer.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/vbcon/html/vbconDataAdapters.asp

For instance this sentence (though there is more):

"The adapter creates a data reader object (SqlDataReader, OleDbDataReader,
OdbcDataReader, or OracleDataReader) to read the data into a dataset."

If I've read this wrong, then please explain; otherwise I hope this helps.

Cor
 
Mark Rae

If we switched this to using a data reader, what problems would there be?

Do you use DataGrids? If so, do you implement paging? If so, you'll get the
following error:

AllowCustomPaging must be true and VirtualItemCount must be set for a
DataGrid with ID <whatever> when AllowPaging is set to true and the selected
datasource does not implement ICollection.

If you then set the datagrid's AllowCustomPaging property to true and its
VirtualItemCount property to the number of records returned by the
SqlDataReader (you'll have to evaluate that separately), the error will
disappear and the datagrid's paging hyperlinks will show the correct number
of pages. However, all pages will contain the first page of records... :)
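The workaround described above might look like this in code-behind (sketch only; `grid` is assumed to be a DataGrid declared on the page, `reader` an open SqlDataReader, and `totalRows` a separately executed COUNT):

```csharp
// ASP.NET 1.x DataGrid bound to a SqlDataReader with paging enabled.
grid.AllowPaging = true;
grid.AllowCustomPaging = true;      // required when the source isn't ICollection
grid.VirtualItemCount = totalRows;  // you must count the rows yourself
grid.DataSource = reader;
grid.DataBind();
// The pager now renders the right number of page links, but every page shows
// the first page of records: "custom paging" means *you* are responsible for
// fetching the correct slice of rows for the current page.
```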
 
W.G. Ryan MVP

DataReaders need an open and available connection and aren't serializable,
so when you mention passing one over the wire: you can't pass them between
tiers. Hence if you're using Remoting or Web Services, DataReaders won't
work.

The other problem is that while you may be OK using them now, what if in the
future your app grows and you need to run some of the DLLs on an app server
(or multiple servers)? You can't just make your class MarshalByRef and then
remote it if you're using DataReaders, so you'd need to create a facade, use
the reader on one tier only, fill a serializable object and pass it back. If
this app needs to scale, that's a lot of risk for a really small benefit.
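A sketch of such a facade (class name, connection string and query are placeholders): the reader stays inside the data tier, hidden in the adapter, and only a serializable DataSet crosses the wire.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public class CustomerFacade : MarshalByRefObject
{
    public DataSet GetCustomers()
    {
        DataSet ds = new DataSet();
        using (SqlDataAdapter da = new SqlDataAdapter(
            "SELECT Id, Name FROM Customers",
            "Server=.;Database=AppDb;Integrated Security=true;"))
        {
            da.Fill(ds, "Customers"); // opens and closes the connection itself
        }
        return ds; // DataSet is serializable, so Remoting/Web Services can marshal it
    }
}
```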
 
Sahil Malik [MVP]

As Bill correctly mentioned, the only real problem is "architecture",
especially as your application grows.

One additional thing I'd like to mention: even if you close connections
immediately after looping through the data reader, databinding a data
reader rather than a dataset is actually slower in concurrent applications.
Why?

Because databinding is a damn slow process. Between each record you are
doing a lot of processing, thus keeping connections open for longer than
usual, which leads to more open concurrent connections on the database and
affects your application's performance negatively.

Given everything, I like the idea of creating an alternate dumb data bucket,
instead of databinding datareaders directly.

- Sahil Malik [MVP]
ADO.NET 2.0 book -
http://codebetter.com/blogs/sahil.malik/archive/2005/05/13/63199.aspx
 
William (Bill) Vaughn

I'm with Bill (the other one) and Sahil.
I'm currently working through a customer problem caused by (you guessed it)
a DataReader. I know of several shops all over the world that write some
pretty serious code and they force their developers to use Fill and
DataTables instead of a DataReader. Yes, the DataReader is slightly faster,
but the disconnected DataTable is far easier to code, debug, maintain and
(more importantly) USE. It's serializable (as has been mentioned). It's
bindable (everywhere), it's sortable, seekable, findable, alterable,
manageable and does NOT require an open connection to maintain.

Another point: I don't abide applications that bring 10,000 rows (or as
many as several million) to the client with a DataReader (or Fill), process
them row by row, and use another connection to send them back. Unless
you're working with a toy database like JET that has limited procedure
functionality, I recommend processing rows on the server. That's what stored
procedures are for. What can a client application do that a SP can't? Okay,
there are some complex algorithms that are tough to implement with PL/SQL or
TSQL, but these can be managed with CLR procedures in 2005 and Extended SPs
before that. So you want to manage data from several sources and merge them?
Use BCP or DTS to move the data to the server and do the work there. It's
silly to transport the database to a client for processing when the server
can do it all.
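As a sketch of that server-side approach (the stored procedure name and parameter below are hypothetical), the client makes one call and the set-based work happens entirely on the server:

```csharp
using System.Data;
using System.Data.SqlClient;

// Instead of pulling 100,000 rows to the client and updating them one by
// one, ask the server to do the set-based work in a stored procedure.
using (SqlConnection conn = new SqlConnection(
    "Server=.;Database=AppDb;Integrated Security=true;"))
using (SqlCommand cmd = new SqlCommand("usp_ApplyPriceIncrease", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@Percent", 5);
    conn.Open();
    int rowsAffected = cmd.ExecuteNonQuery(); // one round trip; no rows cross the wire
}
```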

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
W.G. Ryan MVP

Sahil:

I got confirmation on this, for what it's worth. A reader is used for all
of the Executexxx methods as well. One sure tip-off is the stack trace:
whenever something goes wrong, you typically see a reference to the
data reader.
 
