Most efficient way to process this set of records?


Hexman

Hello All,

I'd like to know the most efficient way to process a set of records. I've built a composite record that is to contain columns from 6 other "Master"
tables. I need to update the composite record with the values of those columns and perform some calculations along the way. I coded a VB.NET (2005)
application using Access (I know it should be SQL Server, but that isn't allowed by the client) to accomplish the task, but it takes an extremely long
time. The code was procedural, as in:

Get composite record
Set up Master-Table-1-Select-Parameters
Get Master-Table-1-Columns and place in Composite

Set up Master-Table-2-Select-Parameters
Get Master-Table-2-Columns and place in Composite

.....and so on with the other 4 Master tables.....

Update the datatable
Accept Changes
Go get another composite record and process until end of composites
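A minimal sketch of that loop, using Python/sqlite3 as a stand-in for the VB.NET/Access code (table and column names are hypothetical, and only two master tables are shown, just to make the shape of the approach concrete):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical miniature schema: one composite table, two master tables.
cur.executescript("""
CREATE TABLE composite (id INTEGER PRIMARY KEY, cust_id INTEGER, m1_val TEXT, m2_val TEXT);
CREATE TABLE master1 (cust_id INTEGER PRIMARY KEY, val TEXT);
CREATE TABLE master2 (cust_id INTEGER PRIMARY KEY, val TEXT);
INSERT INTO composite (id, cust_id) VALUES (1, 10), (2, 20);
INSERT INTO master1 VALUES (10, 'a1'), (20, 'a2');
INSERT INTO master2 VALUES (10, 'b1'), (20, 'b2');
""")

# The procedural approach: one lookup per master table, per composite row,
# then an update of the composite row - exactly the steps listed above.
for comp_id, cust_id in cur.execute("SELECT id, cust_id FROM composite").fetchall():
    (m1_val,) = cur.execute("SELECT val FROM master1 WHERE cust_id = ?", (cust_id,)).fetchone()
    (m2_val,) = cur.execute("SELECT val FROM master2 WHERE cust_id = ?", (cust_id,)).fetchone()
    cur.execute("UPDATE composite SET m1_val = ?, m2_val = ? WHERE id = ?",
                (m1_val, m2_val, comp_id))
con.commit()

print(cur.execute("SELECT id, m1_val, m2_val FROM composite ORDER BY id").fetchall())
# -> [(1, 'a1', 'b1'), (2, 'a2', 'b2')]
```

The result is correct; the cost is that every composite row pays the per-command overhead once per master table.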

It is further complicated by volume:

Composite table - 162,000 records
Master-Table-1 - 18,000 (1)
Master-Table-2 - 77,500 (1)
Master-Table-3 - 5,500 (2)
Master-Table-4 - 15,000 (1)
Master-Table-5 - 130,000 (1)
Master-Table-6 - 1,742,000 (2) (not a typo)
(The number in parens is the number of records I have to retrieve for EACH composite record.)
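Taken together, those per-record counts mean eight master-table retrievals for every composite record:

```python
# Lookups per composite record, from the counts above (master tables 1-6).
lookups_per_row = 1 + 1 + 2 + 1 + 1 + 2
composite_rows = 162_000
total_lookups = composite_rows * lookups_per_row
print(lookups_per_row, total_lookups)  # 8 lookups per row, 1,296,000 in total
```

Nearly 1.3 million individual master retrievals, each paying per-command overhead, is a big part of why the row-at-a-time approach crawls.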

I tried to create a datarelation between a Master-Table and the Detail-Composite (in code) but ran into problems, about which I posted a message to
"microsoft.public.dotnet.languages.vb". (See "DataRelation Woes!!! Need Help (column argument cannot be null)")

So my questions are:

Would properly setting up datarelations substantially increase performance over the way I originally coded it? (My guess is that I approached
this in the worst way possible.)

.....OR better yet,

Can anyone suggest a process or logic to improve the performance and get this task accomplished?

Thanks,

Hexman
 

Cor Ligthert [MVP]

Hexman,

As far as I know, "composite records" is not a term used by Microsoft; my best search
on it gave me links to Sebastian Bach and Beethoven.

Can you give us, in other words, a description of what you want to accomplish
and why it is so different from the standard ADO.NET methods?

That you use Access or whatever is not the question here.

Cor
 

Guest

What he needs to do is retrieve values from one or more columns of
each of the six Master tables and then insert them into a seventh table.
Each row in the seventh table contains a composite, or aggregation, of the
data retrieved from the other six tables, with a column corresponding to each
of the columns returned by his queries against the six tables. Think of it
as being similar to the Composite pattern, which is used in Microsoft's
Composite UI Application Block.
 

Hexman

Yes, Cor,

What I mean by "Composite" record is a group of columns from several tables to create a new table. In this particular task, there are 6 tables to
get columns from to create approximately 60 data columns plus about 10 fields used to create keys to select from those 6 tables. This newly created
table is to be used for analysis. Each of the columns will contain data that will allow the user to get a feel for his operation. For example, one
of the columns will be "days since last order". From that column (along with others) he will be able to tell if his customer base is stretching out
their ordering time frame - maybe that will suggest to offer incentives to the customers.

In short, what I am calling "Composite" is nothing more than columns in a detail record that come from several "Master" records (as in a Master-Detail
relationship). I didn't want to say "Master-Detail" because that may give someone the idea that these would have to be linked as an end result.

I ask the question to find out how best to create this new table.
1. Do it procedurally, as I have already done (probably not).
2. Use datareaders only to select and update the new table.
3. Use a datarelation between a master and the new detail (as opposed to retrieving each master).
4. Use SQL commands only.
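To make option 4 concrete: a sketch, with hypothetical table names and sqlite3 standing in for Access, of building the analysis rows in a single set-based statement, with a "days since last order"-style calculation done on the way in:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical schema: two source tables and the analysis (composite) table.
cur.executescript("""
CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY, cust_id INTEGER, order_date TEXT);
CREATE TABLE analysis (cust_id INTEGER, name TEXT, days_since_last_order INTEGER);
INSERT INTO customers VALUES (10, 'Acme'), (20, 'Globex');
INSERT INTO orders VALUES (1, 10, '2006-01-01'), (2, 10, '2006-03-01'), (3, 20, '2006-02-15');
""")

# One set-based statement: the join pulls the master columns and the
# expression computes "days since last order" (as of a fixed date here).
cur.execute("""
INSERT INTO analysis (cust_id, name, days_since_last_order)
SELECT c.cust_id, c.name,
       CAST(julianday('2006-04-01') - julianday(MAX(o.order_date)) AS INTEGER)
FROM customers AS c
JOIN orders AS o ON o.cust_id = c.cust_id
GROUP BY c.cust_id, c.name
""")
con.commit()

print(cur.execute("SELECT * FROM analysis ORDER BY cust_id").fetchall())
# -> [(10, 'Acme', 31), (20, 'Globex', 45)]
```

The database engine does the matching once, instead of the client issuing one lookup per composite record; the same idea carries over to Access, though the date functions differ (Jet uses DateDiff).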

I've noticed that when updating, say, 30,000 records in Access (and SQL Server Express), setting one column to zero takes very little time. But when I add
the procedure to set up command parameters and retrieve a master record, the time increases significantly (much, much more than I would have thought).
Adding 7 more retrieves, especially against a large database (I consider 1.7m records large), becomes unbearable.

Are you aware of any books that talk about and give direct examples on performance enhancement?

I don't think what I'm trying to do is unusual - just getting data from 6 different tables and updating a single table. (In my mainframe days we
would set up a new table with the columns we wanted and make passes with each of the master tables, filling in the new table. Maybe not the most
efficient, but it worked.)
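The mainframe-style "passes" translate directly to SQL: one set-based UPDATE per master table, instead of one lookup per composite record. A sketch with hypothetical names, again using sqlite3 as a stand-in:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE composite (cust_id INTEGER PRIMARY KEY, m1_val TEXT, m2_val TEXT);
CREATE TABLE master1 (cust_id INTEGER PRIMARY KEY, val TEXT);
CREATE TABLE master2 (cust_id INTEGER PRIMARY KEY, val TEXT);
INSERT INTO composite (cust_id) VALUES (10), (20);
INSERT INTO master1 VALUES (10, 'a1'), (20, 'a2');
INSERT INTO master2 VALUES (10, 'b1'), (20, 'b2');
""")

# One pass per master table: each UPDATE fills one group of columns
# for every composite row at once, via a correlated subquery.
for col, master in (("m1_val", "master1"), ("m2_val", "master2")):
    cur.execute(f"""
        UPDATE composite
        SET {col} = (SELECT val FROM {master} m WHERE m.cust_id = composite.cust_id)
    """)
con.commit()

print(cur.execute("SELECT * FROM composite ORDER BY cust_id").fetchall())
# -> [(10, 'a1', 'b1'), (20, 'a2', 'b2')]
```

With 6 master tables this is 6 statements total rather than 162,000 × 8 round trips, and the engine can use its indexes for the matching.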

Any help would be appreciated.

Hexman
 

Cor Ligthert [MVP]

Hexman,

I am more a .NET guy than a SQL Server guy. This seems to me a typical
question for Bill Vaughn: whether this can be done directly on the SQL
server. If not, I can maybe find a solution with tables, but in this case
that will for sure be much slower than direct processing on the server.

It is not impossible that he covers this in one of his books.

Cor
 

Hexman

Thanks Cor,

I'll check out his books. I thought Bill's books were strictly about SQL Server. In my trials I toyed with SQL Server and found it to be a much more
robust solution. Unfortunately, I am stuck with Access for this particular project.

The one thing I found that would make my task easier is that in SQL Server you can use the "Insert Into.....From" statement, which appears to be
absent in Access. With properly coded joins, it would probably provide a more efficient solution.
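For what it's worth, Access (Jet) SQL does appear to accept the INSERT INTO ... SELECT form, though it wants chained INNER JOINs wrapped in parentheses. A sketch with hypothetical names, run against sqlite3 as a stand-in (the Jet-flavored equivalent is shown in the comment):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE composite_keys (cust_id INTEGER PRIMARY KEY);
CREATE TABLE master1 (cust_id INTEGER PRIMARY KEY, val1 TEXT);
CREATE TABLE master2 (cust_id INTEGER PRIMARY KEY, val2 TEXT);
CREATE TABLE analysis (cust_id INTEGER, val1 TEXT, val2 TEXT);
INSERT INTO composite_keys VALUES (10), (20);
INSERT INTO master1 VALUES (10, 'a1'), (20, 'a2');
INSERT INTO master2 VALUES (10, 'b1'), (20, 'b2');
""")

# The Jet (Access) equivalent would nest the joins in parentheses:
#   INSERT INTO analysis (cust_id, val1, val2)
#   SELECT k.cust_id, m1.val1, m2.val2
#   FROM (composite_keys AS k
#         INNER JOIN master1 AS m1 ON m1.cust_id = k.cust_id)
#         INNER JOIN master2 AS m2 ON m2.cust_id = k.cust_id;
cur.execute("""
INSERT INTO analysis (cust_id, val1, val2)
SELECT k.cust_id, m1.val1, m2.val2
FROM composite_keys AS k
JOIN master1 AS m1 ON m1.cust_id = k.cust_id
JOIN master2 AS m2 ON m2.cust_id = k.cust_id
""")
con.commit()

print(cur.execute("SELECT * FROM analysis ORDER BY cust_id").fetchall())
# -> [(10, 'a1', 'b1'), (20, 'a2', 'b2')]
```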

Thanks,

Hexman
 
