Using Excel as an Interface for Big SQL Server Tables?!

Webtechie

Hello,

My boss asked me to create some macros in VBA to process a list of records
in a spreadsheet.

That part is easy. I go out to websites, retrieve data based on the number
id in the spreadsheet, and then update that id's row in the spreadsheet with
the data I brought back.
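
For reference, a minimal sketch of that per-id web lookup, with the ids
assumed to be in column A and the fetched data written to column B; the URL,
sheet name, and layout are invented for illustration:

Sub FetchDataForIds()
    ' Hypothetical per-id web lookup: loop down column A, request a page
    ' for each number id, and write the raw response into column B.
    Dim http As Object
    Dim ws As Worksheet
    Dim lastRow As Long, r As Long

    Set http = CreateObject("MSXML2.XMLHTTP")
    Set ws = ThisWorkbook.Worksheets("Sheet1")          ' assumed sheet name
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow                                ' row 1 assumed to be headers
        http.Open "GET", "https://example.com/lookup?id=" & ws.Cells(r, "A").Value, False
        http.Send
        If http.Status = 200 Then
            ws.Cells(r, "B").Value = http.responseText  ' raw response; real code would parse it
        End If
    Next r
End Sub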

Ok, I got that to work well.

Now he is giving me huge files to process with the same routine: some CSVs,
some DBFs, some Word docs that will need to be parsed, and a couple of
spreadsheets.

With all the different data sources, I thought I should load them into one
SQL Server database, with each table representing one of the data sources I
have been given.

Done that. But the tables are big: 300,000 records in one, 200,000 in
another, and so on, with 8 tables in all.

Now back to the Excel macro he wanted me to create in the first place. I can
convert the macro to get the number ids from SQL Server, process the
information, and even store the results back to SQL Server.
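
In sketch form, that round trip might look like the following, where the
server, database, table, and column names are placeholders and LookupWebData
stands in for my existing web routine:

Sub ProcessIdsFromSqlServer()
    ' Read number ids from SQL Server via ADO, run the web lookup for each,
    ' and write the result back. All names below are placeholders.
    Dim conn As Object, rs As Object
    Dim result As String

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
              "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    Set rs = conn.Execute("SELECT NumberId FROM dbo.SourceTable WHERE Processed = 0")
    Do Until rs.EOF
        result = LookupWebData(rs.Fields("NumberId").Value)
        conn.Execute "UPDATE dbo.SourceTable SET WebData = '" & _
                     Replace(result, "'", "''") & "', Processed = 1 " & _
                     "WHERE NumberId = " & rs.Fields("NumberId").Value
        rs.MoveNext
    Loop
    rs.Close
    conn.Close
End Sub

Function LookupWebData(id As Variant) As String
    ' Stub standing in for the existing web-scraping routine
    LookupWebData = "data for " & id
End Function

(A parameterized ADODB.Command would be safer than string concatenation, but
this shows the shape of the loop.)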

However, I am looking for best practices. Is using Excel VBA the right way
to process hundreds of thousands of records in SQL Server?

If so, what is the best way to do it? Write VBA code to bring all the
records from one table over to Excel and process them there, now that Excel
can hold over a million rows?

Or write VBA code to process blocks of number ids? And should I still be
using Excel as an interface at all, now that I am pulling number ids from a
SQL Server database rather than a spreadsheet?
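
One way to do the block approach, again sketched with placeholder names:
pull the next batch of ids with TOP and a keyset (WHERE NumberId > the last
id seen), process it, and repeat until a batch comes back short, so Excel
never holds the whole table at once.

Sub ProcessInBatches()
    ' Block-wise processing: fetch BATCH_SIZE ids at a time, keyed on the
    ' last NumberId seen, so memory use stays flat regardless of table size.
    Const BATCH_SIZE As Long = 1000
    Dim conn As Object, rs As Object
    Dim lastId As Long, rowsInBatch As Long

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
              "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    lastId = 0
    Do
        Set rs = conn.Execute("SELECT TOP " & BATCH_SIZE & " NumberId " & _
                              "FROM dbo.SourceTable WHERE NumberId > " & lastId & _
                              " ORDER BY NumberId")
        rowsInBatch = 0
        Do Until rs.EOF
            ' ... web lookup and UPDATE for this id, as in the earlier sketch ...
            lastId = rs.Fields("NumberId").Value
            rowsInBatch = rowsInBatch + 1
            rs.MoveNext
        Loop
        rs.Close
    Loop While rowsInBatch = BATCH_SIZE   ' a short batch means the table is exhausted
    conn.Close
End Sub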

Thanks for helping me clarify which way to go now with this project.
 
UglyChicken

Is there a reason you can't do what you want with a stored procedure?
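
For anything that doesn't depend on the web fetch, the set-based work can
live entirely on the server, and Excel only has to fire it. A minimal
sketch, assuming a procedure named dbo.usp_ProcessSourceTable already exists
on the server (the name and connection string are placeholders):

Sub RunStoredProc()
    ' Kick off server-side processing; Excel is just the trigger here.
    Dim cmd As Object
    Set cmd = CreateObject("ADODB.Command")
    With cmd
        .ActiveConnection = "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
                            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"
        .CommandText = "dbo.usp_ProcessSourceTable"
        .CommandType = 4                  ' adCmdStoredProc (numeric, since this is late-bound)
        .Execute
    End With
End Sub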
 
