Is there a limit to the number of table columns a SqlDataAdapter can support?

Rich

Hello,

I have some large tables in a SQL Server database, around 180
columns/fields per table, each holding a million-plus records.
I was hoping to use ADO.NET (VB.NET) and a SqlDataAdapter to
Fill my tables with data. When I was configuring the
dataAdapter, the dialog box reported errors if I included more
than 100 of the table columns; if I included only 100 of the
columns, the adapter did not display any error messages. So is
there a limit to how many columns I can add to the
SqlDataAdapter? Or is there some project property I need to
change/increase to add the rest of the columns? If so, may I
ask which property, and where it is located?

Note: I am currently using VB6 and COM ADO to write data to my
tables, line by line (across the wire). This has been working
fine for quite a while. The thing is, I keep the source
(non-Microsoft, non-RDBMS) and destination connections open for
a long time. The goal/hope was to read the data into server
memory (I have plenty of memory: 32 GB), close the source
connection, and fill the tables locally on the server from the
data in memory. If COM ADO can handle this, surely ADO.NET can
handle it, can't it?

Or is there another method available in ADO.NET for filling a
SQL Server table besides an adapter?
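
For what it's worth, the 100-column complaint appears to come from the designer wizard rather than from ADO.NET itself; a SqlDataAdapter built entirely in code can Fill a DataTable from a wider SELECT. A minimal sketch (the connection string and table name below are placeholders, not from the original post):

```
' Sketch only: connection string and table name are placeholders.
Imports System.Data
Imports System.Data.SqlClient

Module FillWideTable
    Sub Main()
        Dim connStr As String = _
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"
        Using conn As New SqlConnection(connStr)
            ' Build the adapter in code instead of through the wizard;
            ' the SELECT list can name all 180 columns (or use *).
            Dim da As New SqlDataAdapter("SELECT * FROM dbo.WideTable", conn)
            Dim dt As New DataTable("WideTable")
            da.Fill(dt)
            Console.WriteLine("{0} columns, {1} rows", _
                dt.Columns.Count, dt.Rows.Count)
        End Using
    End Sub
End Module
```
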

Thanks,
Rich
 
William (Bill) Vaughn

The first question I would ask is why you have that many columns. Have
you normalized the table? Have you considered using "sparse" columns and
sql_variant? IIRC, ADO.NET has an upper limit on the number of Fields it
supports.

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
Rich

The company where I am employed has a legacy application
that consists of a document-based database system with
structures (their equivalent of tables) containing around
180 columns apiece. I have been tasked with retrieving
all of the data on a daily basis and placing it into an
RDBMS. I could take it or leave it (I probably sound like
an aspiring journalist starting out in the tabloids here -
World News :). Right now they make it worth my time to
take on the task. So I read the raw data into SQL tables
that are all nvarchar (some ntext), apart from the PK
field (@@IDENTITY) of each table (all the SQL and VB stuff
is my own creation - total free rein on that). My VB
program first checks each field of raw data to ensure it
is ASCII text (no line-feed/form-feed characters, for
example - DTS is out of the question, there are too many
faults in the data, and the available ODBC driver can't
support the volume of data/columns anyway). Once the data
is placed into the SQL text-field tables, I have stored
procedures that parse the data into int/datetime/nvarchar
fields, etc. Everything is working fine for now using COM
ADO, but I would like to step it up to .NET.
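
The ASCII check described above could be sketched as a small helper like the one below; the routine and its name are hypothetical (the real program's code is not shown in the thread) and simply drop control characters such as line feeds and form feeds before a field is written to the staging table:

```
' Hypothetical helper: strip control characters (LF, FF, CR, etc.)
' from a raw field before it goes into the nvarchar staging table.
Function ScrubField(ByVal raw As String) As String
    Dim sb As New System.Text.StringBuilder(raw.Length)
    For Each ch As Char In raw
        Dim code As Integer = AscW(ch)
        ' Keep printable ASCII (32-126) plus tab; drop everything else.
        If (code >= 32 AndAlso code <= 126) OrElse code = 9 Then
            sb.Append(ch)
        End If
    Next
    Return sb.ToString()
End Function
```
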

I am aware that COM ADO is fairly basic compared to the
functionality of ADO.NET. But since a SQL Server table
supports up to 1,024 columns, and I can manipulate that many
columns with COM ADO, my question is whether there is a way to
use ADO.NET in a similar way. If there isn't, then I am
guessing .NET has a different way to deal with this kind of
scenario (I hope). Or am I limited to COM for this kind of
operation? I would be kind of amazed if the .NET architecture
won't support something like this (or would this be Whidbey
country - or whatever the next version of VS is called).
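
As it happens, the Whidbey-era ADO.NET (2.0) does add a purpose-built path for this read-into-memory-then-load scenario: System.Data.SqlClient.SqlBulkCopy, which streams a DataTable or data reader into a server table without row-by-row INSERTs. A sketch, assuming a staging DataTable already filled in memory (connection string and table name are placeholders):

```
' Sketch (ADO.NET 2.0+): bulk-load an in-memory DataTable into SQL Server.
Imports System.Data
Imports System.Data.SqlClient

Module BulkLoad
    Sub LoadStagingTable(ByVal dt As DataTable)
        Dim connStr As String = _
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"
        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using bcp As New SqlBulkCopy(conn)
                bcp.DestinationTableName = "dbo.StagingTable"
                bcp.BatchSize = 5000          ' commit in batches
                bcp.WriteToServer(dt)         ' streams all columns/rows
            End Using
        End Using
    End Sub
End Module
```
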

Thanks,
Rich
 
