best way to open tables?

  • Thread starter: Richard Fagen
  • Start date

Richard Fagen

Hi Everyone,

I have been writing xBase applications for many years but I am new to
VB/SQL for larger applications.

In xBase, I would start the main program (form) by opening up all the
datafiles (dbf) and indexes (cdx) that I would need to reference
throughout the application. I would also keep these files open
throughout the application.

I know how to declare and use SqlConnection, SqlDataAdapter, DataSet to
get a simple grid on a form. My problem occurs when I want to have many
different tables open (many of them filtered) throughout the application.

My questions:
- What is the equivalent way to do this in VB? (open several tables at
the start)

- Where and how do I declare the connections (SqlConnection, DataAdapters,
DataSet)? At the top of the main form? In a code module? As public
variables?

- Is it best to filter and sort tables via DataViews?

Thanks

Richard
 
Richard,

The best way is to begin.

You can choose between the designer method and doing everything in code.
If you take the designer route, you will probably find that after a while
you move to the coded way.

In both cases, first have a look at what a dataset is; I will try to
describe it below as simply as I can.

A dataset is nothing more than an object that holds together one or more
datatables.
Those datatables have rows (objects), and those rows have items (objects).
The items are described by the datatable's columns. A dataset can also hold
datarelations.
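
A minimal sketch of that idea in code; the table, column and relation names
here are only invented for illustration:

' Needs: Imports System.Data
' One dataset holding two hand-built datatables and one datarelation.
Dim ds As New DataSet("AppData")

Dim customers As New DataTable("Customers")
customers.Columns.Add("CustomerID", GetType(Integer))
customers.Columns.Add("Name", GetType(String))

Dim orders As New DataTable("Orders")
orders.Columns.Add("OrderID", GetType(Integer))
orders.Columns.Add("CustomerID", GetType(Integer))
orders.Columns.Add("OrderDate", GetType(DateTime))

ds.Tables.Add(customers)
ds.Tables.Add(orders)

' The datarelation ties the rows of the two tables together.
ds.Relations.Add("CustomerOrders", _
    customers.Columns("CustomerID"), _
    orders.Columns("CustomerID"))

' Rows are objects; their items are described by the columns.
Dim row As DataRow = customers.NewRow()
row("CustomerID") = 1
row("Name") = "Test customer"
customers.Rows.Add(row)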

Therefore, whichever method you take, take care that you don't create all
kinds of datasets. One is normally enough for your application (it can be
that you want to use one for, say, a web service or whatever; then you will
see that you create more).

You can, however, create as many datatables as you wish in a dataset. Every
datatable, or combined datatable, needs its own SQL string. You can use one
data adapter and change its SQL string every time, but it is probably easier
to create a separate data adapter for every datatable that you use.
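
For example, something along these lines (untested; the connection string,
table names and SQL are only placeholders):

' Needs: Imports System.Data and Imports System.Data.SqlClient
Dim connStr As String = "Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI"
Dim conn As New SqlConnection(connStr)

' One dataset, several datatables, each filled by its own data adapter.
Dim daCustomers As New SqlDataAdapter("SELECT CustomerID, Name FROM Customers", conn)
Dim daOrders As New SqlDataAdapter("SELECT OrderID, CustomerID, OrderDate FROM Orders", conn)

Dim ds As New DataSet()
' Fill opens and closes the connection itself when it is not already open.
daCustomers.Fill(ds, "Customers")
daOrders.Fill(ds, "Orders")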

And coming back to your main question: because ADO.NET is disconnected, keep
as little data in your tables as you possibly can; it is better to do updates
more often. Therefore, don't start by loading all your tables at the client.
When you work with .NET, try to work as if it were for a PDA or as if you had
to send the data over the Internet.
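
A sketch of sending the changes of one small table back to the server; it
assumes a simple single-table SELECT with a primary key so that the command
builder can generate the insert/update/delete commands (names are again only
placeholders):

' Needs: Imports System.Data and Imports System.Data.SqlClient
Dim connStr As String = "Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI"
Dim conn As New SqlConnection(connStr)

Dim daOrders As New SqlDataAdapter( _
    "SELECT OrderID, CustomerID, OrderDate FROM Orders WHERE CustomerID = 1", conn)
Dim cb As New SqlCommandBuilder(daOrders)

Dim ds As New DataSet()
daOrders.Fill(ds, "Orders")

' ... the user edits ds.Tables("Orders"), for instance through a bound grid ...

' Only the changed rows are sent back to the server.
daOrders.Update(ds, "Orders")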

I hope this gives you some ideas.

Cor
 
Are you going to be using DBF tables?
If so, we have common ground. In short, this is a waste of time in
straight VB using OleDb or similar; ODBC is very slow on large DBF
tables (e.g. > 100 MB).

The best article I've read on this was at http://www.west-wind.com/

After reading it, I constructed my own DBF wrapper and am still happily
accessing data using very traditional seeks, opens and closes.
 
Hi Cor,

I had a feeling that the designer method is just for beginners to learn
how to do things. Once you get beyond 2-3 tables, it gets more
complicated and I agree it is best to code things so you get a better
understanding as well as more control.
> You can choose between the designer method and doing everything in code.
> If you take the designer route, you will probably find that after a while
> you move to the coded way.

I like your description of a dataset. Things are starting to make more
sense now :)
> In both cases, first have a look at what a dataset is; I will try to
> describe it below as simply as I can.

Thanks for the tip. I was thinking about having a data adapter and
dataset for each table of the database. From your message, it appears
that I can put multiple tables in a single dataset and greatly simplify
things. I will do more experimenting with my test application using a
single dataset.
> Therefore, whichever method you take, take care that you don't create all
> kinds of datasets. One is normally enough for your application (it can be
> that you want to use one for, say, a web service or whatever; then you will
> see that you create more).

Thanks again for the tip. I did a small test application where I loaded
500,000 rows of data into a dataset and then bound it to a datagrid. I
was able to filter/sort that grid in seconds, but then I realized I had
just read 500,000 rows :)
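
For reference, the kind of in-memory filter/sort I mean is a DataView over
the table that is already in the dataset, roughly like this (the dataset and
the column names are only illustrative):

' ds is the dataset that was filled earlier; table/column names are made up.
Dim dv As New DataView(ds.Tables("Orders"))
dv.RowFilter = "CustomerID = 1 AND OrderDate >= #01/01/2002# AND OrderDate < #01/01/2005#"
dv.Sort = "OrderDate DESC"

' Bind the filtered, sorted view to the grid instead of the raw table.
DataGrid1.DataSource = dv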

From your message, it seems like a better idea to filter the request
before filling the dataset via the data adapter (based on user input, e.g.
get only the rows for customer X from 2002-2004).

You are right, it seems to make more sense to work with a smaller number
of rows and then make another read request for another set of rows.
> And coming back to your main question: because ADO.NET is disconnected,
> keep as little data in your tables as you possibly can; it is better to do
> updates more often. Therefore, don't start by loading all your tables at
> the client. When you work with .NET, try to work as if it were for a PDA or
> as if you had to send the data over the Internet.
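
Concretely, the kind of filtered fill I have in mind looks roughly like this
(untested; the connection string, table and parameter names are just
placeholders):

' Needs: Imports System.Data and Imports System.Data.SqlClient
Dim connStr As String = "Data Source=myServer;Initial Catalog=myDb;Integrated Security=SSPI"
Dim conn As New SqlConnection(connStr)

' Fetch only the rows for one customer and date range, not the whole table.
Dim sql As String = _
    "SELECT OrderID, CustomerID, OrderDate FROM Orders " & _
    "WHERE CustomerID = @CustomerID AND OrderDate BETWEEN @From AND @To"

Dim da As New SqlDataAdapter(sql, conn)
da.SelectCommand.Parameters.Add("@CustomerID", SqlDbType.Int).Value = 42
da.SelectCommand.Parameters.Add("@From", SqlDbType.DateTime).Value = New DateTime(2002, 1, 1)
da.SelectCommand.Parameters.Add("@To", SqlDbType.DateTime).Value = New DateTime(2004, 12, 31)

Dim ds As New DataSet()
da.Fill(ds, "Orders")   ' the dataset now holds only the filtered rows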

Yes, you have given me several ideas to test out. Thank you!
> I hope this gives you some ideas.

Richard
 
Hi Dave,

Yes, I was 'thinking' about accessing some old DBF files that are still
being used.

That said, I've been reading a few great SQL books and I've
experimented with importing DBF files, and the speed of SQL just blows me
away. While I still have to experiment a bit more, at this point I don't
see why anyone wouldn't import their DBF data into the more robust SQL format.

Of course, we all have to live with legacy applications, and it would be
great to have a new VB/SQL application reference the older DBF files in
parallel :)

I'll check out the link.
Thanks for the idea.

Richard
 