Dataset with large data...PLEASE HELP

palgre

Hi all,

I am working on a Windows-based application that uses SQL Server 2000 as its
database. A few tables in the application (call them parent tables) are
populated by a separate application.

My application fetches data from the parent tables and puts it into separate
child tables. I am using a DataSet to fetch data from the parent tables and
insert/update rows in the child tables.
The problem is that the record count is very high (3-4 million), so the data
is too large and the process takes a very long time to complete. The
application server's CPU utilization also shoots up to the maximum.

What would be the best way to achieve this?
1. Should I use a DataRepeater instead of a DataSet? Or
2. Should I do the processing in chunks? How can I process the data in
chunks?

Or is there some other way I can process the data?

Thanks
PAL
 
Mr. Arnold


Use a SqlCommand object with a data reader, dynamic SQL statements, or calls
to stored procedures.
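The point of the data reader suggestion is that a DataSet pulls every row into memory before you touch it, while a forward-only reader streams one row at a time. Here is a minimal sketch of that streaming pattern; it uses Python and SQLite purely so it is runnable here (in the .NET app this would be SqlCommand.ExecuteReader/SqlDataReader, and the table and column names are made up):

```python
import sqlite3

# Throwaway in-memory database standing in for the parent table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO parent VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(10_000)])

# Stream rows with a forward-only cursor instead of loading the whole
# result set into memory the way a DataSet does.
processed = 0
for row_id, payload in conn.execute("SELECT id, payload FROM parent"):
    processed += 1  # per-row processing would go here

print(processed)  # 10000
```

Because only one row is held at a time, memory use stays flat no matter how many million rows the parent table has.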
 
AlexS

I am not sure I understand what you are doing with the data.

In any case, a DataSet loads everything into memory, which is why you are
running into trouble.

If you are just pumping data from one table into another with some processing
in between, use bulk copy (SqlBulkCopy or bcp) for the inserts. Then you can
simply read from a data reader, which should solve your issue.

Updates have to be issued individually, but even so you can batch them up to
the command-string limit and issue them in batches. For example, save all the
updates to a file; when the reading pass is complete, read the updates back
and run them.
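Put together, the pattern described above is: stream from the source in chunks, bulk-insert new child rows per chunk, and queue updates to run in batches afterwards. A sketch of that flow, again using Python and SQLite only to make it self-contained (in the real app the insert step would be SqlBulkCopy or bcp against SQL Server, and the chunk size of 1000 is an arbitrary example):

```python
import sqlite3

CHUNK = 1000  # arbitrary example chunk size

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY, val INTEGER)")
conn.execute("CREATE TABLE child  (id INTEGER PRIMARY KEY, val INTEGER)")
conn.executemany("INSERT INTO parent VALUES (?, ?)",
                 [(i, i * 2) for i in range(5000)])
# Pretend half the child rows already exist and will need UPDATEs.
conn.executemany("INSERT INTO child VALUES (?, ?)",
                 [(i, -1) for i in range(0, 5000, 2)])

existing = {r[0] for r in conn.execute("SELECT id FROM child")}

reader = conn.execute("SELECT id, val FROM parent")
pending_updates = []                 # queued instead of issued row by row
while True:
    chunk = reader.fetchmany(CHUNK)  # read in chunks, never all at once
    if not chunk:
        break
    inserts = [(i, v) for i, v in chunk if i not in existing]
    pending_updates += [(v, i) for i, v in chunk if i in existing]
    # Stand-in for SqlBulkCopy / bcp: one bulk insert per chunk.
    conn.executemany("INSERT INTO child VALUES (?, ?)", inserts)

# Run the saved updates in batches after the reading pass is complete.
for start in range(0, len(pending_updates), CHUNK):
    conn.executemany("UPDATE child SET val = ? WHERE id = ?",
                     pending_updates[start:start + CHUNK])
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM child").fetchone()[0]
print(count)  # 5000
```

Queuing the updates in memory here stands in for the save-to-a-file step; the important part is that nothing is issued one statement at a time against the server.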
 
