DataReader from DataSet

Tushar Karsan

My application uses a DataReader to get information from a database. For
testing purposes, I would like to use hardcoded dummy values instead of
reading anything from the database. As it is possible to store multiple
tables in a DataSet, is there a way to serialize the DataSet then
deserialize the stream and create a DataReader from it? Can a DataReader be
created any other way using hardcoded values?
 
Vibhu

For testing purposes, if you really don't want to read
values from the database, you can build a test harness
that fills your DataSet with dummy values instead of
going to the database.
You can easily do this in your data layer by filling the
DataSet yourself from dummy objects.
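
A minimal sketch of that approach (the table and column names here are
made up for illustration): the test harness builds the same table shape
the real query would return, filled with hardcoded rows.

Imports System.Data

Module TestHarness

    ' Returns a DataSet shaped like the real query results,
    ' populated with hardcoded dummy values.
    Public Function GetDummyOrders() As DataSet
        Dim ds As New DataSet()
        Dim t As DataTable = ds.Tables.Add("Orders")
        t.Columns.Add("OrderId", GetType(Integer))
        t.Columns.Add("ProductId", GetType(Integer))
        t.Columns.Add("Quantity", GetType(Integer))
        t.Rows.Add(New Object() {1, 11, 12})
        t.Rows.Add(New Object() {2, 42, 3})
        Return ds
    End Function

End Module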
 
David Browne

Tushar Karsan said:
My application uses a DataReader to get information from a database. For
testing purposes, I would like to use hardcoded dummy values instead of
reading anything from the database. As it is possible to store multiple
tables in a DataSet, is there a way to serialize the DataSet then
deserialize the stream and create a DataReader from it? Can a DataReader be
created any other way using hardcoded values?

No. This is one of the many reasons DataTables are more flexible than
DataReaders.

If your application code consumed DataTables instead of DataReaders, this
would be trivially easy.

So consider returning DataTables instead of DataReaders to your application.

Top 10 reasons to use DataTables instead of DataReaders:

10. Sorting
9. Rows.Count
8. Random Access
7. Using DataTables returns the connections to the pool sooner.
6. Serializable
5. DataTables can easily be switched from live data to canned data or
load-time data (see the sketch after this list)
4. Strong Typing Available
3. DataTables can be saved in Session state or a global variable and reused
and shared
2. DataTables don't block the connection (you can use the connection while
iterating the rows)
1. DataTables are idiot-proof (needn't be closed)
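
On points 5 and 6, here is a minimal sketch of one way to switch to canned
data (the file name is a placeholder): because a DataSet serializes to XML
out of the box, you can capture live results once and replay them in tests.

Imports System.Data

Module CannedData

    ' Run once against the real database to capture a snapshot.
    Public Sub SaveSnapshot(ByVal ds As DataSet)
        ds.WriteXml("orders.xml", XmlWriteMode.WriteSchema) ' schema + rows
    End Sub

    ' Tests load the snapshot instead of opening a connection.
    Public Function LoadSnapshot() As DataSet
        Dim ds As New DataSet()
        ds.ReadXml("orders.xml", XmlReadMode.ReadSchema)
        Return ds
    End Function

End Module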

David
 
Marina

There are a lot of reasons to use datasets.

But there are also reasons not to.

Datareaders provide much quicker access to the data with less overhead. They
are efficient at forward-only, read-only access to data. And though this
sounds limiting, for many operations this is exactly what's needed. And for
performance reasons, it makes much more sense to use a datareader.

Every object has its place and time. It is not fair to generalize and say
that a dataset is always better.
 
bruce barker

the common approach is to create your own IDataReader class that has this
functionality. It's not very hard to implement.
 
David Browne

Marina said:
There are a lot of reasons to use datasets.

But there are also reasons not to.

Datareaders provide much quicker access to the data with less overhead. They
are efficient at forward-only, read-only access to data. And though this
sounds limiting, for many operations this is exactly what's needed. And for
performance reasons, it makes much more sense to use a datareader.

Every object has its place and time. It is not fair to generalize and say
that a dataset is always better.

Which is why I didn't say a DataTable is always better; I just listed 10
advantages of DataTables, and suggested that the poster _consider_ using
one.

DataReaders definitely have their place. It's back in the bowels of the
data access layer, and in selected performance-sensitive applications.

Lots of people seem to think "DataReaders are faster so I should use them
if I can". This is really backwards. DataTables are easier and more
useful, so you should use them unless you can't. A DataTable's cost is
measured in Memory and CPU cycles. A DataReader's cost is measured in time
spent coding, testing, changing and fixing. Modern computers are incredibly
fast, and getting faster all the time. Developers' time is incredibly
expensive, and getting more expensive all the time. You just have to throw
it all into the mix, and build the cheapest application that does what it
needs to.


David
 
Marina

I don't find datareader development to be any more difficult than using a
dataset. I don't spend more time fixing problems with a datareader than I
do with anything else. How do you figure that it takes more work to maintain
a datareader? And how exactly are datatables easier? As far as I am
concerned, they require equal amounts of work.

Additionally, we are talking about the time spent on database access. If
you measure filling a datatable with 5000 rows, vs looping through a
datareader with 5000 rows, you will notice a big difference.
 
David Browne

Marina said:
I don't find datareader development to be any more difficult than using a
dataset. I don't spend more time fixing problems with a datareader than I
do with anything else. How do you figure that it takes more work to maintain
a datareader? And how exactly are datatables easier? As far as I am
concerned, they require equal amounts of work.

For some applications they do.
If you can make good use of the DataTable's features, it can save you tons
of time.
Additionally, we are talking about the time spent on database access. If
you measure filling a datatable with 5000 rows, vs looping through a
datareader with 5000 rows, you will notice a big difference.

No you won't. Filling a datatable with 5000 rows is almost exactly the same
as reading them from a datareader. The DataAdapter uses a datareader to
fill the datatable, and the only difference is the memory you need to
allocate to hold the results. Fetching rows across the network is _very_
slow, and DotNet allocates memory _very_ quickly, much more quickly than
previous languages did, so the difference in elapsed time is minimal. (Run
the test program at the end to prove it to yourself.)

The performance differences are only that the datatable occupies memory, and
that your fetching is separate from your application processing. After you
load the DataTable, you still have to go back and sort, filter, and iterate
the rows, whereas with a datareader you can do it in one shot.



David





Imports System
Imports System.Data

Module Module1

    Sub Main()
        Dim cn As New SqlClient.SqlConnection( _
            "Network Library=DBMSSOCN;Data Source=ressptdb;Initial Catalog=northwind;User ID=usr;Password=pass")

        cn.Open()

        Dim i As Integer
        Dim t As DateTime = Now

        Console.WriteLine(String.Format("Begin {0}", t))

        ' Time 100 passes with a DataReader.
        For i = 0 To 99
            ReadDataReader(cn)
        Next
        Console.WriteLine(String.Format("DataReader elapsed {0}", Now.Subtract(t)))

        ' Time 100 passes with a DataAdapter filling a DataSet.
        t = Now
        For i = 0 To 99
            LoadDataTable(cn)
        Next
        Console.WriteLine(String.Format("DataTable elapsed {0}", Now.Subtract(t)))

        Console.WriteLine(String.Format("End {0}", Now))
    End Sub

    Public Sub LoadDataTable(ByVal cn As SqlClient.SqlConnection)
        Dim cmd As New SqlClient.SqlCommand("select * from [order details]", cn)
        Dim ds As New DataSet()
        Dim da As New SqlClient.SqlDataAdapter(cmd)
        da.Fill(ds)
        cmd.Dispose()
        Console.WriteLine(String.Format("DataAdapter filled {0} rows", ds.Tables(0).Rows.Count))
    End Sub

    Public Sub ReadDataReader(ByVal cn As SqlClient.SqlConnection)
        Dim cmd As New SqlClient.SqlCommand("select * from [order details]", cn)
        Dim dr As SqlClient.SqlDataReader = cmd.ExecuteReader()

        Dim OrderId As Integer
        Dim ProductID As Integer
        Dim UnitPrice As Decimal
        Dim Quantity As Integer
        Dim Discount As Double
        Dim i As Integer

        ' Touch every column of every row, as application code would.
        Do While dr.Read()
            OrderId = CInt(dr("OrderId"))
            ProductID = CInt(dr("ProductID"))
            UnitPrice = CDec(dr("UnitPrice"))
            Quantity = CInt(dr("Quantity"))
            Discount = CDbl(dr("Discount"))
            i += 1
        Loop
        dr.Close()
        cmd.Dispose()
        Console.WriteLine(String.Format("DataReader read {0} rows", i))
    End Sub

End Module
 
Tushar Karsan

As I said, both IDataReader and IEnumerable need to be implemented; that
is what makes it hard work. Although implementing them is not a problem in
itself, it does become one if there are a lot of tables.
 
William (Bill) Vaughn

I'm with you, David. For typical queries, the difference in performance is
negligible. Don't throw up the "5000 rows" argument. If you're fetching 5000
rows, you're probably not thinking about performance anyway--you're thinking
about bulk transfers. I think (and lots of development managers agree) that
DataAdapter.Fill is cheaper. Since I don't have to worry about reading each
row and moving each column of data, it IS easier to code Fill. I would add
"Automatic Connection management", "DataViews" and several more items to
your list.
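
For what it's worth, the "easier to code" point is visible in how little
code Fill needs; a sketch (assumes Imports System.Data at the file level;
the query is the one from the test program above):

Public Sub LoadOrders(ByVal cn As SqlClient.SqlConnection)
    ' Fill opens the connection if it is closed and closes it again when
    ' done, so there is no reader loop and no Open/Close to forget.
    Dim da As New SqlClient.SqlDataAdapter("select * from [order details]", cn)
    Dim ds As New DataSet()
    da.Fill(ds)
End Sub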

--
____________________________________
Bill Vaughn
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
Kathleen Dollard

Tushar,

Can you add conditional compilation that skips the DataReader?

#If DebugSpecial Then
    MyDataRetrievalMethod()
#Else
    MyNormalDataRetrievalMethodViaSQLDataReader()
#End If
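
Fleshed out, a minimal sketch (the class, method, and column names are
hypothetical; DebugSpecial would be defined as a custom compilation
constant in the test build):

Imports System.Data

Public Class OrderData

    Public Function GetOrders() As DataTable
#If DebugSpecial Then
        ' Test build: hardcoded rows, no database.
        Dim t As New DataTable("Orders")
        t.Columns.Add("OrderId", GetType(Integer))
        t.Columns.Add("CustomerName", GetType(String))
        t.Rows.Add(New Object() {1, "Alpha"})
        t.Rows.Add(New Object() {2, "Beta"})
        Return t
#Else
        ' Normal build: the real query (connection string is a placeholder).
        Dim cn As New SqlClient.SqlConnection("...")
        Dim da As New SqlClient.SqlDataAdapter( _
            "select OrderId, CustomerName from Orders", cn)
        Dim t As New DataTable("Orders")
        da.Fill(t)
        Return t
#End If
    End Function

End Class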

Kathleen
 
Kathleen Dollard

Marina,

Well, I guess it's easy to go round if we are comparing apples and apple
blossoms. I happen to like both.

Many applications, almost all non-web applications and many web
applications, rely on data containers. The DataTable is a ready-made, free
data container.
If you need data container functionality you will spend expensive
development time creating a container unless you use the DataTable. If you
don't need a data container, then loading the DataTable is overhead you
don't need.

I agree with your assessment that looping through a DataReader is faster,
and for applications that just need to iterate through data they are great.
Anything other than a single pass through the data results in
programmer-written code that can compromise the architecture and really hurt
performance.

In terms of data containers, a friend of mine, Bill McCarthy, who's one of
the heavier perf geeks, works on high-performance data containers, and he
and I did some comparisons. When we ran a fully functional DataTable
(strongly typed with extensions - my own code gen) against a minimal version
of his (also strongly typed), the difference was greatest on very small
recordsets (single table, single record, retrieved via primary key on a
local database). The highest differences we were able to find were 2-4 fold
(we argued tons about his testing - the answer is less than 2, in case he
drops by) or 18-32 ms (18, btw <g>) per call. We never really ascertained
the perf of a "real world" implementation of his data container.
Specifically, his lightweight version did not manage relations, have sort
capabilities, autoincrement keys, serializability, or a slew of other
behaviors I consider critical.

Please note - we were measuring his implementation vs my implementation, NOT
the underlying technologies. In the real world, that 18-32 ms would quickly
be swallowed by a slew of other issues. This is all based on my memory of
stuff that happened more than a year ago.

So, anyway, that's all just to back my opinion. I have spent, and seen
spent, a ton of time on this issue, and I think that if you are using a data
container, you can't do better than about a twofold increase in the time it
takes to actually load the data, even if you pull out every stop and drop
every feature (other than strong typing). I stated all the above because I
realize my opinion differs from the MSDN article on the subject. For most
applications, there are better places to make perf gains.

And for all applications, there are better things to argue about. Whichever
you use, use it and get on with it. I actually use a lot of proprietary data
containers for a number of reasons. But perf is not a significant one.

BTW, Bill happens to be speaking on his data container work at Australian
user groups this week and next, if you get a chance to drop by and see him.

Kathleen
 
