Large number of records

  • Thread starter: JeremiahOSullivan
Hi

I am looking at getting a large number of records (1.8m) into a
DataTable. The data resides in SQL Server, but can be moved to CSV or XML
if required (for performance).

Is this type of load, in one go, feasible? What sort of speed can I get
it down to?

This is for a Winforms application
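
Roughly, the single-shot load I have in mind is just a SqlDataAdapter.Fill
into a DataTable, along these lines (the connection string, table, and
query below are placeholders, not my real schema):

using System;
using System.Data;
using System.Data.SqlClient;

class LoadAllRecords
{
    static void Main()
    {
        // Placeholder connection string and query -- adjust for the real server and table.
        const string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
        const string sql = "SELECT * FROM dbo.MyBigTable";

        DataTable table = new DataTable("MyBigTable");
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, conn))
        {
            // Fill opens the connection, runs the query, and pulls every
            // returned row into memory at once.
            adapter.Fill(table);
        }
        Console.WriteLine("Loaded {0} rows.", table.Rows.Count);
    }
}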

Thanks,
Jerry
 
So you want to...decrease performance..., right? You will probably not
get anything more efficient than a database that is designed for data
access and processing. Why would you think that CSV and XML are more
performant than a database? If you design the database and define the
tables and relationships correctly, data access can be nearly
instantaneous.
 
Peter,

I was having trouble getting the database to select the records quickly
(1.7 million), so I thought that a CSV file (essentially the query
results) that was just read straight from disk (no query overhead)
might be more efficient.

Thanks for your comments
Jerry
 
I can help, but you'll have to give me a lot more details,
specifically, what the end goal of this process is. If you do want to go
the DataTable route, I can tell you that the DataTable in 1.0 and 1.1
most likely will not handle the data memory-wise. The 2.0 DataTable can
handle the data, but will still probably take a couple hundred MB of
memory from what I've seen. I'm not sure how long it will take to load
the DataTable, but I could see it taking upwards of a minute or more.
If you want to read from a CSV file, you can try out the parser I sell,
http://www.csvreader.com . I'm just not sure how many of these records
you're trying to actually deal with at any point.

If your issue is just that you have 1.7 million rows of data in the
database, and you're running queries against it for portions of the
data, say a couple hundred rows, and that's not happening quickly, then
your issue is most likely in the database structure. I'd be looking at
the indexing on the table versus your query in that case.
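
As a rough illustration of only dealing with what you need at any one
point, a forward-only SqlDataReader keeps memory flat no matter how many
rows the query returns. A sketch, with a placeholder connection string,
query, and parameter:

using System;
using System.Data;
using System.Data.SqlClient;

class StreamRows
{
    static void Main()
    {
        // Placeholder connection string and query -- adjust for the real schema.
        const string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
        const string sql = "SELECT Id, Amount FROM dbo.MyBigTable WHERE Amount > @min";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@min", SqlDbType.Money).Value = 100m;
            conn.Open();

            // The reader holds one row at a time, so memory use stays flat
            // regardless of how many rows the query returns.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                int count = 0;
                while (reader.Read())
                {
                    // Process the row here instead of buffering it.
                    count++;
                }
                Console.WriteLine("Processed {0} rows.", count);
            }
        }
    }
}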

Bruce Dunwiddie
 
Hi Jerry,

Think again. There is no way on earth you're going to beat SQL Server
performance using CSV, and certainly not XML (in-memory or not, it
doesn't matter). Stick to the database and read up on how to properly
design and query SQL. :)
 
What do you plan to do with 1.7 million records in a DataTable?

In fact, I'm not sure of this, but a DataTable is easily serialized into
XML. You might store the data as XML and just deserialize it back into a
DataTable. There are other factors in play as well. SQL Server uses quite
a bit of caching, so if the data is not being fetched for the first time,
it may well remain in a memory cache, which would definitely be faster.
But again, why would you need all 1.7 million records at one time?
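
For what it's worth, the XML round trip is built into the 2.0 DataTable
(in 1.x you would go through a DataSet). A small sketch, with made-up
columns and a placeholder file name:

using System;
using System.Data;

class XmlRoundTrip
{
    static void Main()
    {
        DataTable table = new DataTable("Records");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(new object[] { 1, "example" });

        // Write the data plus its schema so it can be read back without
        // redefining the columns.
        table.WriteXml("records.xml", XmlWriteMode.WriteSchema);

        DataTable reloaded = new DataTable();
        reloaded.ReadXml("records.xml");
        Console.WriteLine("Reloaded {0} rows.", reloaded.Rows.Count);
    }
}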

--
HTH,

Kevin Spencer
Microsoft MVP
.Net Developer
If you push something hard enough,
it will fall over.
- Fudd's First Law of Opposition
 