VS2003 transfer data between two data sources

Guest

I am loading a DataSet from an access file (exported from another app that a
vendor provided) and want to append the data into a SQL table that has the
same structure.

I can easily append rows to the SQL table, but that can't be a best
practice! Does anyone have any guidance on taking the filled DataSet from
Excel and appending it to the SQL table without having to step through all
the rows (obviously the two have the same structure)?

Currently:

private void TransferData()
{
    // Connection string for the Excel file selected through the UI into TextBox fileName
    string _eConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" +
        this.fileName.Text + ";Extended Properties=\"Excel 8.0;HDR=Yes;IMEX=1\"";

    // Open the workbook through the Jet OLE DB provider
    OleDbConnection oConn = new OleDbConnection(_eConnectionString);
    OleDbCommand cmd = new OleDbCommand("SELECT * FROM [CIPEXPORT$]", oConn);

    // Pull the worksheet into a DataSet
    DataSet ds = new DataSet();
    OleDbDataAdapter da = new OleDbDataAdapter();
    da.SelectCommand = cmd;
    da.Fill(ds, "testData");

    // Open up the SQL Server connection
    SqlConnection sConn = new SqlConnection(ConnectionString);
    SqlCommand sCmd = new SqlCommand("_StoredProcedure", sConn);
    sCmd.CommandType = CommandType.StoredProcedure;

    sCmd.Parameters.Add("@Field1", SqlDbType.NVarChar, 10);
    // list of fields by variable

    // At this point what's the best/fastest method to transfer the DataSet to
    // the SQL Server table? I could export to XML. There must be an easy method,
    // but I'm not fluent enough in ADO.NET yet.
}

Thanks!
 
Otis Mukinfus

[snip]

Is this a one-time exercise, or do you need to do it frequently? If you need to
do it frequently and have access to DTS on the MS SQL Server database, then
create a DTS package to do the transfer of data. Even if it's a one-timer you
can do it very easily with DTS.

If it must be done in an application, then I believe the best way is a row at a
time (but not with a DataSet). Even though it may seem tedious, you might be
surprised how fast it is.

Question: In your question you say the file is an "access file", but in your
code you seem to be loading an Excel file. Just curious, not throwing rocks.


Good luck with your project,

Otis Mukinfus
http://www.arltex.com
http://www.tomchilders.com
 
Guest

Hi Otis,

Actually this is a monthly event. It's really not that big, probably about
500 records, but I thought since I had a DataSet/DataTable, theoretically
there might be a way to attach it to another data adapter. Oops on the
data source: it is an Excel file (old habits are tough to break ;) ). A DTS
package might be a good solution except that the column names SQL Server
sees will be inconsistent, thanks. When I originally wrote this applet it was
in MS Access, where it's really simple to link data. I want to write a simple
Windows app in .NET so I can work with the roles I designed within SQL Server.

The problem with reading an Excel table without a header row is that some of
the field names that are arbitrarily assigned are almost random (this was an
issue when I did an attach to the server or tried a DTS package). Some
columns come up as F1, F2..., others pick up the content of the cells in the
first row as column names. At least with a data reader I can get column
values by ordinal.
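
(For reference, the header behaviour described here is controlled by the HDR
setting in the Jet connection string; a quick illustration, with a placeholder
file path:)

// HDR=Yes: Jet treats the first worksheet row as column names (blank or odd cells give odd names).
// HDR=No:  every row is data and Jet names the columns F1, F2, F3, ...
string withHeader = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\export.xls;" +
    "Extended Properties=\"Excel 8.0;HDR=Yes;IMEX=1\"";
string noHeader = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\export.xls;" +
    "Extended Properties=\"Excel 8.0;HDR=No;IMEX=1\"";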

Maybe I'm making too much out of this, but I thought there must be a more
elegant solution, especially if you think of potentially scalable problems.
Another not-so-elegant solution might be to get the UI to read the Excel
file, write an XML file, have a predefined XSD with field names, attach the
XML and do a set append?

Thank you for the feedback.

- Abe

Otis Mukinfus said:
[snip]
 
Otis Mukinfus

[snip]

These comments are opinions, so take them with a grain of salt if you wish.

I wouldn't do it with XML unless you are getting the data from outside your
organization. My experience is that XML is over-used. It's bloated and
unnecessary unless you're moving data across the web. It seems to me that a
fad is in place where people are so enamored with it that they use it at
every opportunity.

I wouldn't do it with a DataSet. Reading the Excel document is going to be
time-consuming enough. Putting the data in a DataSet while reading it is just
another added step, and you then have to parse the DataSet a row at a time
anyway.

I think opening a DataReader in the Excel document is the way to go. One read
per row, one insert per row.
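
A minimal sketch of that row-at-a-time approach (names are illustrative: the
stored procedure and sheet name are taken from the original post, and
sqlConnectionString is assumed to hold the SQL Server connection string):

// Firehose read of the worksheet, one stored-procedure call per row.
string xlConn = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\export.xls;" +
    "Extended Properties=\"Excel 8.0;HDR=Yes;IMEX=1\"";

using (OleDbConnection oConn = new OleDbConnection(xlConn))
using (SqlConnection sConn = new SqlConnection(sqlConnectionString))
{
    oConn.Open();
    sConn.Open();

    SqlCommand sCmd = new SqlCommand("_StoredProcedure", sConn);
    sCmd.CommandType = CommandType.StoredProcedure;
    SqlParameter pField1 = sCmd.Parameters.Add("@Field1", SqlDbType.NVarChar, 10);
    // ...add the remaining parameters the same way

    OleDbCommand oCmd = new OleDbCommand("SELECT * FROM [CIPEXPORT$]", oConn);
    using (OleDbDataReader rdr = oCmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            // Columns are addressed by ordinal, so the names Jet assigns don't matter.
            pField1.Value = rdr.IsDBNull(0) ? (object)DBNull.Value : rdr.GetValue(0);
            sCmd.ExecuteNonQuery();
        }
    }
}

Wrapping the loop in a SqlTransaction (assigned to sCmd.Transaction) can also
trim the per-row overhead somewhat.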

I've been doing backend data manipulation for more than twenty years and have
seen a lot of changes in this biz for the better, but when applying those new
and better technologies, a good rule to follow is still, and will continue to
be: keep it simple, you may have to maintain it some day.

Good luck with your project,

Otis Mukinfus
http://www.arltex.com
http://www.tomchilders.com
 
Guest

Thanks. That actually turned out to be my implementation: open up the
firehose reading the Excel file with a DataReader and fire off a stored
procedure on the server for each Read(). It was a little tedious, but you're
right, 500 records inserted takes well under a second to append. The only
issue with it is that it sort of thrashes the DB server for that split second
with an insert per record. I just thought that in the "modern" OO world there
would be a model for basically transferring the "dataset" between data
providers and doing it in one insert operation. I've been yelled at by folks
like Joe Celko who have berated my "procedural" solutions (according to them
"so '70s"), so I was trying to get with the program ;-)

Anyway, thanks so much for the feedback. Take care.

Abe Rosner

Otis Mukinfus said:
[snip]
 
William (Bill) Vaughn

While you can deliver coal with a bicycle, I would not recommend it as a
"best practice".
Check out the SqlBulkCopy class. It's designed to do specifically what
you're asking to do. It permits you to open a DataReader against the source
data and blast the rows to a SQL Server table. Once the "temporary" table is
imported, you use a SP to filter/validate/refine the rows that are added to
the production table.
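
A minimal sketch of that approach, assuming .NET 2.0 and a staging table
(dbo.CipStaging is an illustrative name) whose columns line up with the
worksheet; the two connection strings are assumed to be defined elsewhere:

// Stream the worksheet straight into a SQL Server staging table.
using (OleDbConnection src = new OleDbConnection(excelConnectionString))
using (SqlConnection dest = new SqlConnection(sqlConnectionString))
{
    src.Open();
    dest.Open();

    OleDbCommand cmd = new OleDbCommand("SELECT * FROM [CIPEXPORT$]", src);
    using (OleDbDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy bulk = new SqlBulkCopy(dest))
    {
        bulk.DestinationTableName = "dbo.CipStaging"; // illustrative staging table
        // Map by ordinal so the odd Excel header names don't matter.
        bulk.ColumnMappings.Add(0, 0);
        bulk.ColumnMappings.Add(1, 1);
        // ...one mapping per column
        bulk.WriteToServer(reader);
    }
}

A stored procedure can then filter/validate the staged rows into the
production table, as described above.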

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

AbeR said:
[snip]
 
Guest

Thanks Bill,

I KNEW there had to be a better way ;-) This was the "elegant solution" I
was looking for. I just could not figure out where in the framework they hid
it.

Best,
- Abe

William (Bill) Vaughn said:
[snip]
 
Guest

Hi Bill,

An important note here: this seems to be a great method, but it is only
available in Framework 2.0, in the System.Data.SqlClient namespace. Since I'm
writing a VS 2003 project I still have to plod along on my bicycle ;-)

This will be great for future projects though!

Best,
- Abe

William (Bill) Vaughn said:
[snip]
 
Otis Mukinfus

Check out the SqlBulkCopy class. It's designed to do specifically what
you're asking to do.
[snip]

My thanks also, Bill. I was unaware of that little feller.

Good luck with your project,

Otis Mukinfus
http://www.arltex.com
http://www.tomchilders.com
 
William (Bill) Vaughn

Nope, while the 2.0 framework exposes the SqlBulkCopy class, the BCP (or
DTS) utility can be invoked in code from all versions. Use SQLDMO or SQLSMO
to invoke it. It's also available as a TSQL function or a stand-alone
utility.
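
(A related set-based option that also works from .NET 1.1, distinct from the
SQLDMO/BCP route above: if the workbook path is readable by the SQL Server
service account and ad hoc distributed queries are permitted, the whole sheet
can be pushed in a single INSERT ... SELECT over the Jet provider. A sketch,
with illustrative table and path names:)

// SQL Server itself opens the workbook; the client just issues one statement.
string sql =
    "INSERT INTO dbo.CipStaging " +
    "SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', " +
    "'Excel 8.0;Database=C:\\data\\export.xls;HDR=YES', " +
    "'SELECT * FROM [CIPEXPORT$]')";

using (SqlConnection conn = new SqlConnection(sqlConnectionString))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(sql, conn);
    cmd.ExecuteNonQuery();
}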

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

AbeR said:
[snip]
 
