Tolga Erdogus
I have two tables, MASTER and DETAIL. For every master row there are
about 800 detail records.
I have two grids representing the master-detail relationship. In my
dataset I have a relation defined that lets the detail records populate
automatically through client-side filtering whenever the selected
master record changes.
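For reference, the setup looks roughly like this (sketched in C#; the table,
column, and relation names below are placeholders, not my real schema):

    using System.Data;

    class MasterDetailSetup
    {
        // Placeholder schema: MASTER(MasterID), DETAIL(DetailID, MasterID).
        public static DataSet BuildDataSet()
        {
            DataSet ds = new DataSet();

            DataTable master = ds.Tables.Add("MASTER");
            master.Columns.Add("MasterID", typeof(int));
            master.PrimaryKey = new DataColumn[] { master.Columns["MasterID"] };

            DataTable detail = ds.Tables.Add("DETAIL");
            detail.Columns.Add("DetailID", typeof(int));
            detail.Columns.Add("MasterID", typeof(int));
            detail.PrimaryKey = new DataColumn[] { detail.Columns["DetailID"] };

            // This relation is what drives the automatic client-side filtering
            // of the detail grid as the selected master row changes.
            ds.Relations.Add("MasterDetail",
                master.Columns["MasterID"],
                detail.Columns["MasterID"]);

            return ds;
        }
    }

The detail grid is bound through the relation (something like
DataMember = "MASTER.MasterDetail"), which is where the client-side
filtering comes from.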
When I want to refresh the DETAIL recordset manually for a given
master record, I have two choices: 1) refresh the entire DETAIL table
and let client-side filtering bring the newest version of the detail
records into the grid, or 2) run a query that refreshes only the
DETAIL records that pertain to the currently selected master record.
If this operation always brought back the same primary keys (but with
potentially changed non-key fields), everything would be great: I could
do a merge on the dataset and it would work nicely.
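Roughly what I mean by the targeted query plus merge, again with placeholder
names (the SqlClient adapter and the @id parameter are just for illustration,
not my actual data access code):

    using System.Data;
    using System.Data.SqlClient;

    class DetailRefresh
    {
        // Query only the selected master's DETAIL rows and merge
        // them into the dataset that the grids are bound to.
        public static void RefreshDetail(DataSet ds, SqlConnection connection, int masterId)
        {
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT * FROM DETAIL WHERE MasterID = @id", connection);
            adapter.SelectCommand.Parameters.Add("@id", SqlDbType.Int).Value = masterId;

            // AddWithKey so the fetched table carries the primary key,
            // which Merge needs in order to match the rows already loaded.
            adapter.MissingSchemaAction = MissingSchemaAction.AddWithKey;

            DataTable fresh = new DataTable("DETAIL");
            adapter.Fill(fresh);

            // As long as the primary keys are the same, this updates the
            // changed non-key fields in place instead of appending duplicates.
            ds.Merge(fresh);
        }
    }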
However, another user could have added or deleted detail records in the
meantime, so I really should remove all detail records corresponding to
the current master record and then merge in the results of the query.
The problem is: if I loop through all the detail records (about 800)
and do a DeleteRow on each one, it takes 30-40 seconds even if I
disable constraints. Is this normal? If so, isn't there a quicker way
to do this, like a function equivalent to Clear() in speed?
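For clarity, this is roughly the loop I'm timing (same placeholder names as
above; EnforceConstraints = false is what I mean by disabling constraints):

    using System.Data;

    class SlowDelete
    {
        public static void DeleteDetailRows(DataSet ds, int masterId)
        {
            ds.EnforceConstraints = false;   // "disable constraints"

            DataTable detail = ds.Tables["DETAIL"];
            DataRow[] rows = detail.Select("MasterID = " + masterId);   // ~800 rows

            // Deleting these one at a time is what takes 30-40 seconds.
            foreach (DataRow row in rows)
            {
                row.Delete();
            }

            ds.EnforceConstraints = true;
        }
    }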
Thanks