What to go for in my scenario (RecordSet/DataTable)?

Guest

Hi SQL Gurus,

I have a scenario like following:

My DAL sits on a box with 4 GB of RAM and calls the SQL Server box over the
internal network. The result sets returned are huge, in the range of 30,000 to
300,000 records, with about 6 columns in each row.

I am generating XML in a SQL Server 2005 stored procedure using FOR XML PATH;
the result of this SELECT is a single row/line containing the entire XML.

Now, given the above numbers, I am unsure which specific ADO.NET feature to go
for:

1. DataTable: pulls the entire result set in one go (efficient in terms of
connection and network usage, but expensive in memory consumption).

OR

2. RecordSet (i.e., DataReader): keeps the connection tied up until all the
data has been read or the reader is closed. Given 30,000 to 300,000 records,
this again means that many network hops, but it is much lighter on memory
consumption. (Both options are sketched below.)
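In ADO.NET code, I believe the two options would look roughly like this (just
a sketch; the query and connection string below are placeholders, not my
actual code):

using System;
using System.Data;
using System.Data.SqlClient;

class TwoOptions
{
    const string ConnectionString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";

    // Option 1: pull the entire result set into a DataTable in one go.
    static DataTable LoadAll()
    {
        using (SqlConnection conn = new SqlConnection(ConnectionString))
        using (SqlDataAdapter adapter =
            new SqlDataAdapter("SELECT * FROM dbo.MyRecords", conn))
        {
            DataTable table = new DataTable();
            adapter.Fill(table);   // adapter opens and closes the connection
            return table;          // everything now lives in client memory
        }
    }

    // Option 2: stream the rows one at a time with a DataReader,
    // keeping the connection open for the duration of the read.
    static void StreamAll()
    {
        using (SqlConnection conn = new SqlConnection(ConnectionString))
        using (SqlCommand cmd =
            new SqlCommand("SELECT * FROM dbo.MyRecords", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process the current row; only one row is held at a time
                }
            }
        }
    }
}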

I want my application to be scalable and high performing. Which should I go
for in a scenario like the above?

I really appreciate any help on this.

Thanks,
Sachin R. Chavan.
 
Guest

Hi,
This snippet, taken from the link below, should give you some insight:
http://www.datadirect.com/developer/net/dot-net-optimizing/index.ssp
Selecting .NET Objects and Methods
The guidelines in this section will help you to optimize system performance
when selecting and using .NET objects and methods.
Choosing Between a DataSet and DataReader
Whether you should connect with a DataReader or DataSet depends on your
production environment. The DataReader uses more resources on the database
server while the DataSet ties up more resources on the client.
If you need to retrieve many records rapidly, use a DataReader. The
DataReader object is fast, returning a fire hose of read-only data from the
server, one record at a time. In addition, retrieving results with a
DataReader requires significantly less memory than creating a DataSet. The
DataReader does not allow random fetching, nor does it allow for updating the
data. However, ADO.NET data providers optimize their DataReaders for
efficiently fetching large amounts of data.
In contrast, the DataSet object is a cache of disconnected data stored in
memory on the client. In effect, it is a small database in itself. Because
the DataSet contains all of the data that has been retrieved, you have more
options in the way you can process the data. You can randomly choose records
from within the DataSet and update/insert/delete records at will. You can
also manipulate relational data as XML. This flexibility provides impressive
functionality for any application, but comes with a relatively high cost in
memory consumption. In addition to keeping the entire result set in memory,
the DataSet maintains both the original and the changed data, which leads to
even higher memory usage. Using DataSets with very large result sets
drastically reduces the scalability of the application.
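To put the DataSet side of that in concrete terms, here is a small sketch (the
query and connection string are only placeholders): the entire result set is
cached on the client, after which you can read, update and export it without
holding a connection.

using System;
using System.Data;
using System.Data.SqlClient;

class DataSetExample
{
    static void Main()
    {
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT Id, Name FROM dbo.MyRecords", conn))
        {
            adapter.Fill(ds, "MyRecords");   // whole result set cached on the client
        }

        // Random access and in-memory updates, no connection required.
        DataTable table = ds.Tables["MyRecords"];
        if (table.Rows.Count > 0)
        {
            table.Rows[0]["Name"] = "Updated offline";
        }

        // Relational data can also be handled as XML.
        ds.WriteXml("records.xml");
    }
}

That flexibility is exactly why the memory cost grows with the size of the
result set, as the article notes.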
--
Hope this helps.
Thanks and Regards.
Manish Bafna.
MCP and MCTS.
 
WenYuan Wang [MSFT]

Hello Sachin,

DataReader is a one-way, forward-only method of reading data.
DataSet is a database-independent, in-memory data store that enables the
developer to directly access all rows and columns of tables.
Which to use, DataReader or DataSet? This has been a common question in the
ADO.NET world. There is no clear-cut choice between them.

If you need read-only access to the data (e.g., in ASP.NET), using a
DataReader makes sense.
However, the connection must stay open while you are accessing the data. (As
you noted, this keeps the connection tied up until all the data has been read
or the DataReader is closed.)

If you will be doing lengthy processing with each row of returned data, a
DataSet might be a better idea.
By the way, a DataSet can also be filled from a DataReader, and then you can
use the data in the DataSet at your leisure without holding a connection open.
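For example (just a sketch; the query and connection string are placeholders),
you can fill a DataTable from a DataReader and release the connection
immediately:

using System;
using System.Data;
using System.Data.SqlClient;

class FillFromReader
{
    static DataTable LoadTable()
    {
        DataTable table = new DataTable();
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Id, Name FROM dbo.MyRecords", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                table.Load(reader);   // copy the rows into the table
            }
        }                             // connection is released here
        return table;                 // work with the cached data at your leisure
    }
}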

If you are building a Windows Forms application, I prefer the DataSet.
Since the application maintains its state for the entire time it is running,
and client-side state does not tie up server resources, using a DataSet can be
very convenient for Windows Forms applications.
However, with a DataSet the data remains in memory for as long as you use it.
If you are retrieving thousands or millions of rows (in your scenario: up to
300,000 records), a DataSet is probably not an ideal solution; you should
consider using a DataReader instead.

My opinion is to use a DataReader for ASP.NET and a DataSet for Windows Forms
applications.
For high performance, the DataReader is a good choice, provided you reliably
use a safe code pattern that ensures the underlying connection is closed in
time.
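One such pattern (a sketch only; the query and connection string are
placeholders) is to open the reader with CommandBehavior.CloseConnection, so
that disposing the reader also closes the underlying connection even if the
reader is handed off to other code:

using System;
using System.Data;
using System.Data.SqlClient;

class SafeReaderPattern
{
    static SqlDataReader ExecuteReader(string connectionString, string sql)
    {
        SqlConnection conn = new SqlConnection(connectionString);
        try
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(sql, conn);
            // Disposing the returned reader will also close the connection.
            return cmd.ExecuteReader(CommandBehavior.CloseConnection);
        }
        catch
        {
            conn.Dispose();   // don't leak the connection if Open/Execute fails
            throw;
        }
    }

    static void Main()
    {
        using (SqlDataReader reader = ExecuteReader(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI",
            "SELECT Id, Name FROM dbo.MyRecords"))
        {
            while (reader.Read())
            {
                // process rows; the using block guarantees cleanup
            }
        }
    }
}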

Hope this helps,
Sincerely,
Wen Yuan
Microsoft Online Community Support
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
 
Guest

Thanks, guys, for all your suggestions.

I have come up with a slightly different solution: I have made my stored
procedure paginated, so it extracts the records in chunks, and each chunk is
loaded into a DataTable.

Now I call this stored procedure in a loop, appending to the DataTable until I
have fetched all the rows in chunks.
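Roughly like this (the stored procedure name and its paging parameters below
are only illustrative, not my actual procedure):

using System;
using System.Data;
using System.Data.SqlClient;

class ChunkedLoad
{
    static DataTable LoadInChunks(string connectionString, int pageSize)
    {
        DataTable result = new DataTable();
        int pageIndex = 0;
        int rowsFetched;

        do
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand("dbo.usp_GetRecordsPaged", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
                cmd.Parameters.AddWithValue("@PageSize", pageSize);

                using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
                {
                    rowsFetched = adapter.Fill(result);   // appends this chunk
                }
            }
            pageIndex++;
        } while (rowsFetched == pageSize);   // a short (or empty) page ends the loop

        return result;
    }
}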

Thanks,
Sachin R. Chavan.
 
WenYuan Wang [MSFT]

You are welcome, Sachin.

A paging stored procedure is really a good solution in such a scenario.
Thanks for sharing :)
BTW, if there is any further issue on this, please don't hesitate to post here
again. We are glad to work with you.

Have a great day,
Sincerely,
Wen Yuan
Microsoft Online Community Support
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
 
