Best practice on establishing DB connection

  • Thread starter: Guest
I've been reworking a particular DB to gain efficiency in the network
environment, in preparation for moving it out of a terminal services
environment and onto the users' remote LAN so they can run it across their
own LAN. I am currently testing the FE on the users' PCs, linked to the BE
via a drive mapped to the folder where the test BE resides on a Win2K server
on the LAN.

After getting rid of most of my domain aggregate functions, etc., almost
everything is much, much faster. In testing, the users tell me it is as good
or faster, even when they are all running reports at the same time. I am
stuck with two points that are evidently much slower in this configuration:

1. Opening the first form that requires BE access: the first user has no
problem, but it is painfully slow to open for all subsequent users. Is there
some way to make the first connection to the BE faster? I am just guessing
that this is the bottleneck, because the first form that opens is an unbound
login screen, and a significant pause occurs after the user enters the login
information, and the code checks the user name & password against the user
table (not using integrated Access security - just a users table).

2. Opening a main form that has about 50 bound controls as well as a number
of unbound controls.

I've heard the whole thing about limiting the number of controls, but I've
been over & over it with the users, and every control on the form is, indeed,
necessary to provide a valid record in the table; besides that, it took all
of 1-2 seconds to open in the terminal server environment, even on their
aging Win2K server.

Any ideas?
 
After getting rid of most of my domain aggregate functions, etc., almost
everything is much, much faster. In testing, the users tell me it is as good
or faster, even when they are all running reports at the same time. I am
stuck with two points that are evidently much slower in this configuration:

Good stuff. A lot of people point out that getting rid of them can help. In
fact, it really depends on "where" and "when" you use them. They are actually
not that bad, but learning when to use them, and when they don't work, takes
a long time, so the general suggestion is to just get rid of them - it takes
too long to explain the ins and outs!

1. Opening the first form that requires BE access: the first user has no
problem, but it is painfully slow to open for all subsequent users. Is there
some way to make the first connection to the BE faster? I am just guessing
that this is the bottleneck, because the first form that opens is an unbound
login screen, and a significant pause occurs after the user enters the login
information, and the code checks the user name & password against the user
table (not using integrated Access security - just a users table).

In your startup code, the FIRST THING you want to do is build a persistent
connection to the back end. Doing this will not only eliminate the above big
delay, but as a general rule will speed up MANY other operations. So, simply
open a table from the back end (it can be any table - but open it, and keep
it open). Your delays should go away.
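
A minimal sketch of that trick (the table name tblAnyLinkedTable is just a
placeholder for any linked back-end table; the recordset lives in a
module-level variable so it stays open for the whole session):

Option Explicit

' Module-level recordset that holds the back-end connection open
Private rsPersist As DAO.Recordset

Public Sub OpenPersistentConnection()
    ' Open any linked back-end table and simply keep the recordset around
    Set rsPersist = CurrentDb.OpenRecordset("tblAnyLinkedTable", dbOpenSnapshot)
End Sub

Public Sub ClosePersistentConnection()
    ' Call this on the way out of the application
    If Not rsPersist Is Nothing Then
        rsPersist.Close
        Set rsPersist = Nothing
    End If
End Sub

Call OpenPersistentConnection from your startup form or autoexec, and
ClosePersistentConnection when you shut down.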

2. Opening a main form that has about 50 bound controls as well as a number
of unbound controls.

I've heard the whole thing about limiting the number of controls, but I've
been over & over it with the users, and every control on the form is, indeed,
necessary to provide a valid record in the table; besides that, it took all
of 1-2 seconds to open in the terminal server environment, even on their
aging Win2K server.

I agree. If I've got a form with 20 controls (simple text box controls) and a
form with 40 controls (that is 100% more), you likely will NOT notice one
bit of difference in load time. However, if you've got 5 more combo boxes,
then you can take a big hit indeed. And the same goes for sub-forms (note
that sub-forms are just controls - you can actually "set" what form a
sub-form control points to, so a sub-form is really just a control).

So, just like the rule about getting rid of domain aggregate functions, there
is more to this than the simple rule suggests. You can be quite generous and
loosey-goosey with the number of controls. Chopping out 50% of the controls
on a form will do absolutely nothing if all the controls are simple text
boxes.

But if those controls are sub-forms or combo boxes, then you have some work,
and some caution, at hand.

Also, for that amazing "persistent connection" performance trick above that
restores your performance and gets rid of the delay, you can read the
following FAQ, which covers that trick and more. You do want to check this out:

http://www.granite.ab.ca/access/performancefaq.htm
 
Thank you, Albert. I had heard about the persistent connection thing but
wasn't sure how to set it up. Is there any issue in the multi-user
environment with record-locking on the single record in the
persistent-connection-holding table?

My biggest bottleneck was a host of DSum's & DLookup's inside my queries.
Those are all gone now, replaced by union & stacked queries.
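
For example (the table and field names here are made up), instead of calling
DLookup or DSum for every row inside the main query, I now save an aggregate
query and join to it - roughly this kind of change:

' A rough sketch of the rework: the slow version called a domain aggregate
' per row; the fast version joins to a saved ("stacked") aggregate query.
Public Sub CreateStackedQueries()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Aggregate query that totals payments per order
    db.CreateQueryDef "qryPaidPerOrder", _
        "SELECT OrderID, Sum(Amount) AS Paid FROM tblPayments GROUP BY OrderID;"

    ' The main query joins to the aggregate instead of calling DSum per row
    db.CreateQueryDef "qryOrdersWithPaid", _
        "SELECT o.OrderID, o.OrderDate, p.Paid " & _
        "FROM tblOrders AS o LEFT JOIN qryPaidPerOrder AS p ON o.OrderID = p.OrderID;"
End Sub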

I'm pretty well stuck with the combo boxes; however, they were not a problem
on the terminal server, so I suspect the biggest issue is that, in between
the first record access (user authentication at logon) and the loading of
that form, the only thing open was a menu form that (you guessed it) was not
bound to a table. All I have to do now is bind my main menu (which remains
open in the background at all times) to a table (thus the above question
about record-locking). Does that sound about right?
 
Albert,
So then the poster should/could do the following...

-Check any comboBoxes/listBoxes to ensure that any underlying queries are as
streamlined as possible.
-Possibly(?), use code that executes when a comboBox/listBox receives the
focus to set the control's rowSource (something along the lines of the
sketch below). I'm thinking that doing so would cut down on the number of
queries being executed at load time, assuming that there are multiple
comboBoxes & listBoxes.
-Along with that thought, leave any subforms unbound until they first
receive the focus, again cutting down on the number of queries being
executed.
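
For the comboBox idea, I'm picturing something like this (the control,
table, and field names are just placeholders) - leave the RowSource blank in
design view and fill it the first time the control gets the focus:

Private Sub cboCustomer_GotFocus()
    ' Load the row source only once, the first time the user lands on the combo
    If Len(Me.cboCustomer.RowSource) = 0 Then
        Me.cboCustomer.RowSource = _
            "SELECT CustomerID, CustomerName FROM tblCustomers ORDER BY CustomerName;"
    End If
End Sub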

Correct? Ideas?

David H
 
Ah ha! Why do you have to have the Main Menu open at all times? If
you're using it to run code when the DB closes, why not use an alternate
form with 0 controls?
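
Something along these lines, say (the form name is just an example) - a
bare, hidden form opened at startup, whose Close event fires when the
application shuts down:

' Startup code: open the empty utility form hidden so the user never sees it
DoCmd.OpenForm "frmKeepAlive", , , , , acHidden

' In frmKeepAlive's module: run any application-shutdown code here
Private Sub Form_Close()
    ' cleanup / "on exit" handling goes here
End Sub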
 
I guess I'm just lazy, or maybe it's just a bad habit. I make all my forms &
reports modal so I don't have to second-guess what a user might go and change
while in the middle of doing something that might affect the data on the form
he is using (or forget that he has a form open/minimized & try to reopen it -
I don't have to check if it's already open, because I KNOW he doesn't have a
form open if he's clicking on the only button that opens it...)

The main menu also has the Quit button that closes the application, which is
where I do the "Do you really want to close?" thing (based on user
preferences). The form just sits benignly behind everything (it's not pretty,
but it works), and the modal form layout forces the users to retreat to the
main menu to close the program (as long as they behave and stay away from the
close button in the Access control box - I've never gotten around to
implementing the "hide the Access close button" gadget).
 
Hello, again.

Already did #1. That was a large part of the "getting rid of domain
aggregates"...

If I were in another environment (i.e. other than Access) that didn't take a
lot of the work out of all this system-level manipulation, I would probably
be a lot more disciplined about managing form & control source loading,
instead of spending all my time on user interface & logic issues. I've
thought about setting RowSource on the Enter event of various controls, but
it has not been an issue until now, because it has not caused any particular
problem in the remote/RDP environment.

On #3: In this app, both the header & detail records are so complex that I
elected to have the user complete the header record, then click a button to
open the detail form (not a subform). With the modal approach to forms, I
just leave the header form open, and one of its controls acts as the source
for the filter on the detail form.
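
(Roughly like this, with made-up form, control, and field names - the button
on the header form opens the detail form filtered to the current header
record:)

Private Sub cmdOpenDetail_Click()
    ' Open the detail form filtered to the header record currently on screen
    DoCmd.OpenForm "frmDetail", , , "HeaderID = " & Me.txtHeaderID
End Sub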

Thanks.
 
Brian said:
Thank you, Albert. I had heard about the persistent connection thing but
wasn't sure how to set it up. Is there any issue in the multi-user
environment with record-locking on the single record in the
persistent-connection-holding table?

If you just open the table, no locks occur. So, no, I don't see a problem
with that record.

All I have to do now is bind my main menu (which remains open in the
background at all times) to a table (thus the above question about
record-locking). Does that sound about right?

Yes, often in place of code opening a "global" recordset variable, you can
use a form bound to a table.

As long as the form has "No Locks" set, you should be ok (you might want to
check the form's "Data" tab to be sure you don't have any locks set).
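
A small sketch of that variant (the form name is just an example) - the
bound menu form doubles as the persistent connection, as long as Record
Locks is "No Locks":

' Startup code: the form is bound to a linked back-end table
DoCmd.OpenForm "frmMainMenu"
' Sanity check: 0 = "No Locks" on the form's Data tab
Debug.Assert Forms("frmMainMenu").RecordLocks = 0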
 
Albert,
So then the poster should/could do the following...

-Check any comboBoxes/listBoxes to ensure that any underlying queries are as
streamlined as possible.

Yes, the big one in the above is to ensure that there is an index on the
"id" field used for the combo box.
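
(If that field isn't already indexed - it will be if it's the primary key -
a one-off DDL statement along these lines, with made-up names, takes care of
it:)

CurrentDb.Execute "CREATE INDEX idxCustomerID ON tblCustomers (CustomerID);"
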
-Possibly(?), use code that executes when a comboBox/listBox receives the
focus to set the control's rowSource. (I'm thinking that doing so would cut
down on the number of queries being executed at load time assuming that
there are multiple comboBoxes & listBoxes)
-Along with that thought, leaving any subforms as unbound until they first
receive the focus, again cutting down on the number of queries being
executed.

Yes, both of the above ideas can yield improvements. The sub-form idea is a
real winner when you are using a tab control. There is no use loading up a
sub-form, and its data, until the user clicks on the corresponding tab. This
again is one of those "how many" questions. One or two combo boxes, or one
or two sub-forms, is usually just fine. If you have 6 sub-forms, then you
have to start looking at reducing this (sub-forms tend to be near the top in
terms of cost).
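
A rough sketch of that tab-control trick (the tab control, page index,
sub-form control, and form names are all made up) - leave the sub-form
control's Source Object blank in design view and set it only when its page
is selected:

Private Sub TabCtl_Change()
    ' Load the sub-form the first time the user visits page 1 of the tab control
    If Me.TabCtl.Value = 1 Then            ' Value = PageIndex of the active page
        If Len(Me.subOrders.SourceObject) = 0 Then
            Me.subOrders.SourceObject = "frmOrdersSub"
            Me.subOrders.LinkMasterFields = "CustomerID"
            Me.subOrders.LinkChildFields = "CustomerID"
        End If
    End If
End Sub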
 