max6166
Every day, I receive a dump of about 20,000 records keyed by an account
number. There are about 20 fields in the table.
Each day, I need to determine which account records have changed, and
append any changed records to an historical log. The log has the exact
same table structure as the incoming table, with the exception that it
adds a date stamp. Typically, only about 10 records change each day.
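In case it helps clarify, the comparison I'm after looks roughly like this sketch (Python just for illustration; the function and field names are made up, and it assumes both tables can be loaded into memory as dicts keyed by account number):

```python
from datetime import date

def changed_records(incoming, snapshot, stamp=None):
    """Return incoming records that are new or differ from the stored
    snapshot, with a date stamp added for the historical log.

    incoming / snapshot: dicts mapping account number -> record dict.
    A single in-memory hash lookup per record replaces a per-record query.
    """
    stamp = stamp or date.today().isoformat()
    changed = []
    for acct, rec in incoming.items():
        if snapshot.get(acct) != rec:  # account is new, or a field changed
            changed.append({**rec, "date_stamp": stamp})
    return changed

# Hypothetical example: account 2's balance changed, account 1 did not.
snapshot = {1: {"acct": 1, "balance": 100}, 2: {"acct": 2, "balance": 50}}
incoming = {1: {"acct": 1, "balance": 100}, 2: {"acct": 2, "balance": 75}}
log_rows = changed_records(incoming, snapshot, stamp="2024-01-02")
# log_rows holds only the one changed record, stamped for the log
```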
I have managed to find a few methods to do this task, but they are all
*extremely* slow. Most of my methods consist of looping through the
new incoming data one record at a time, and then looking up the last
stored record for the given account by using either a query or another
loop.
I have a feeling there is a much better way to approach the problem,
but am coming up blank. Does anyone have any suggestions or advice for
doing this task in a more efficient manner?