Guest
Hi. My users want to be able to compare data. The data for each object to
be compared is in identical DataSets so we decided to use DataSet.Merge() and
then bind the resultant DataSet to a grid. By appending a number to the
column names in each DataSet and nominating one column to be the "key" column,
we can rely on the Merge functionality to create a single DataTable with a
column for each property of each object (see code below).
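For context, the binding itself is straightforward; assuming a Windows Forms
DataGridView and made-up names ("Id", dataSetA, dataSetB, myGrid are just
stand-ins), it is something like:

// hypothetical usage -- the key field and variable names here are stand-ins
DataSet combined = CombineDataSets( "Id", new DataSet[] { dataSetA, dataSetB } );
myGrid.DataSource = combined.Tables[ 0 ].DefaultView;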
The problem is that the Merge is very slow. I am wondering if this is because we
are using keyed data with completely different schemas (as far as Merge is
concerned). Is there anything I can do to speed things up?
There is quite a lot of data (~500 rows, 50 columns per DataTable BEFORE
merging) but I have profiled the code and the cost of the Merge is a lot
higher than I would have expected.
Thanks
kh
// code example (snipped for clarity)
public static DataSet CombineDataSets( string keyField, DataSet[] dataSets )
{
    for( int dsIndex = 0; dsIndex < dataSets.Length; ++dsIndex )
    {
        // get a reference to the (single) table in this DataSet
        DataTable currentDataTable = dataSets[ dsIndex ].Tables[ 0 ];

        // add a primary key so Merge matches rows on the key column
        currentDataTable.PrimaryKey = new DataColumn[] {
            currentDataTable.Columns[ keyField ] };

        // number the non-key columns so they are unique when the tables are merged
        for( int colIndex = 0; colIndex < currentDataTable.Columns.Count; ++colIndex )
        {
            if( !currentDataTable.Columns[ colIndex ].ColumnName.Equals( keyField ) )
            {
                currentDataTable.Columns[ colIndex ].ColumnName =
                    currentDataTable.Columns[ colIndex ].ColumnName + (dsIndex + 1).ToString();
            }
        }
    }

    // merge each DataSet into the result; rows line up on the primary key
    DataSet ds = new DataSet();
    for( int dsIndex = 0; dsIndex < dataSets.Length; ++dsIndex )
        ds.Merge( dataSets[ dsIndex ] );

    // return the merged DataSet
    return ds;
}
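For reference, this is the kind of manual combine I have been thinking about as
an alternative, in case it helps frame the question. It builds the combined
table directly, looks rows up by key value in a Hashtable, and wraps the loads
in BeginLoadData/EndLoadData, so Merge's schema reconciliation and constraint
checking are avoided entirely. It is an untested sketch: the name
CombineManually is mine, and it assumes (as above) that each DataSet holds a
single table with the original, un-renamed columns and unique key values.

// Untested sketch only -- not what we run today. Requires System.Data and
// System.Collections. CombineManually is a made-up name.
public static DataSet CombineManually( string keyField, DataSet[] dataSets )
{
    DataTable result = new DataTable( "Combined" );
    result.Columns.Add( keyField, dataSets[ 0 ].Tables[ 0 ].Columns[ keyField ].DataType );

    // map key value -> row in the result table, so each lookup is a hash probe
    Hashtable rowsByKey = new Hashtable();

    result.BeginLoadData();   // suspend index/constraint maintenance while loading
    for( int dsIndex = 0; dsIndex < dataSets.Length; ++dsIndex )
    {
        DataTable source = dataSets[ dsIndex ].Tables[ 0 ];

        // add a numbered copy of each non-key column, as in the Merge version
        foreach( DataColumn col in source.Columns )
            if( !col.ColumnName.Equals( keyField ) )
                result.Columns.Add( col.ColumnName + (dsIndex + 1), col.DataType );

        foreach( DataRow sourceRow in source.Rows )
        {
            object key = sourceRow[ keyField ];
            DataRow targetRow = (DataRow) rowsByKey[ key ];
            if( targetRow == null )
            {
                // first time we have seen this key: start a new combined row
                targetRow = result.NewRow();
                targetRow[ keyField ] = key;
                result.Rows.Add( targetRow );
                rowsByKey[ key ] = targetRow;
            }

            // copy the values straight across into the numbered columns
            foreach( DataColumn col in source.Columns )
                if( !col.ColumnName.Equals( keyField ) )
                    targetRow[ col.ColumnName + (dsIndex + 1) ] = sourceRow[ col ];
        }
    }
    result.EndLoadData();

    DataSet combined = new DataSet();
    combined.Tables.Add( result );
    return combined;
}

I have not benchmarked this against Merge yet, so I am still interested in
whether Merge itself can be made faster (for example, whether the primary keys
or EnforceConstraints are what is hurting).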