First, ms-access can execute about 100 million vba instructions per second
on an average computer today.
THAT IS 100,000,000 commands per second
So, let's assume that option explicit speeds things up by 20%
Your program code likely runs about 100, or perhaps 200, vba
instructions.
So,
100,000,000 instructions = 1 second
200 instructions = 0.000002 seconds
now, let's take 20% of 0.000002
we get 0.0000004
So, you save 4 tenths of a millionth of a second.
Good job, you can pat yourself on the back!!! You sped up the code by
20%.
Really, this is not a ms-access issue, but one of computing, and how
computers work. I can't recall a database application that has been
processing limited for a CONSIDERABLE NUMBER of years. Can you?
While processing power has gone through the roof, the speed of networks,
and the bandwidth available, has not changed much for a number of years.
When you see from the above numbers that a 20% increase in the
processing speed of your code is completely insignificant, you can see
why I am being so hard here. So the speed of your code is REALLY REALLY
insignificant. I just want to stress this concept, and it is a basic
concept that applies to computers in general, and applies to any
development tools you use, including ms-access.
Ok, we cleared this up a bit.
Of course, good coding practices mean that you will ALWAYS use option
explicit. And good coding practices mean that you want to use correctly
typed variables, as that can catch errors in code at compile time, and
not runtime. (so, good, but it is not going to fix our sql server
performance here).
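Just to show what I mean (the names here are made-up examples, but the
idea is the same in any access app):

    Option Explicit   ' every variable MUST now be declared

    Public Sub UpdateTotal()
        Dim lngInvoiceID As Long     ' typed, not a Variant
        Dim curTotal As Currency

        lngInvoiceID = 1234
        ' a typo such as lngInvoceID = 0 now fails when you
        ' Debug->Compile, instead of silently creating a new
        ' Variant at runtime
        curTotal = Nz(DLookup("InvoiceTotal", "tblInvoices", _
                              "InvoiceID = " & lngInvoiceID), 0)
    End Sub

So, it catches typos and type mix-ups before the code ever runs - but
again, it does nothing for the sql server speed problem we are talking
about.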
Sql server does not look at the client, and go hmm...that is ms-access,
I think I will run slower.
Sql server does not care if your client is written in hand coded
assembler, or vba, or is an apache web site asking for data. They ALL
PERFORM at the same speed. Hence, it is up to the developer to limit the
amount of data transferred. So, that is the secret to good performance!
Ok, now that we got our minds set on the right road, here are some great
performance enhancements that can be had (or, more specifically, here is
where most access developers fail when using ms-access as a sql client).
The first thing is quite obvious: you need to limit the records loaded
into a form.
For example, sometimes a developer will be lazy, and open up a form
attached to a large table. At that point the user is sitting there
looking at a form, and it has 50,000 records attached to it!! (gee, 3
users launch their application, and they have forms with a possible
150,000 records -- worse, we have NOT DONE ONE BIT OF USEFUL WORK
yet!!!).
I don't think it takes much thought to realize how bad a design it is to
open up forms on large data tables without regard to what records are to
be loaded. If I asked my 92 year old grandmother if it was stupid to
download all peoples' accounts into an instant teller bank machine, AND
THEN ASK for the account number, I bet she would understand this.
Why then do so many access developers open a form on a large table, and
THEN ASK the user what they want? You can see this makes no sense at all.
hint:
When you use an instant teller machine, you are asked for the account
number first...that information is sent...you do not send everything,
and THEN ask!!!
Common sense for an old lady, and something that more developers need to
adopt.
So, ask the user for an account number, or name, or something BEFORE YOU
LOAD the form. That is my first tip.
Here is a set of screen shots that works well for JET based
applications, or sql server based applications, when searching:
http://www.members.shaw.ca/AlbertKallal/Search/index.html
You can quite well use bound forms in ms-access, and the "where" clause
will restrict the form to ONE record. How can transferring one record to
a form be slow?
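For example, instead of just opening the form wide open, you could do
something along these lines (form and field names here are only
examples, and this assumes the invoice number is numeric):

    Dim strInvoice As String
    strInvoice = InputBox("Enter the invoice number to view")

    If strInvoice <> "" Then
        ' the where clause means ONE record travels down the wire
        DoCmd.OpenForm "frmInvoice", acNormal, , _
                       "InvoiceNumber = " & strInvoice
    End If

The form stays bound, the code stays simple, and only one record gets
pulled from the server.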
The next tip: if you are using odbc linked tables, and the sql has joins
(more than one table), simply re-create that query as a view on the sql
server side, and link to the view (you do this because linked tables
OFTEN will not perform joins well). Of course, if this is for a report,
then you use pass-through queries, and everything is done server side
anyway (well, ok, there are two tips in this paragraph).
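A pass-through query is nothing special to set up. As a rough sketch
(the server, database, view and query names below are just examples -
adjust the connect string and sql to your setup):

    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qryPassInvoiceReport")

    qdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;" & _
                  "DATABASE=MyDb;Trusted_Connection=Yes;"
    ' the whole join runs on sql server - only the result
    ' rows come back over the wire
    qdf.SQL = "SELECT * FROM dbo.vwInvoiceTotals " & _
              "WHERE InvoiceDate >= '2020-01-01'"
    qdf.ReturnsRecords = True

You then just point the report at qryPassInvoiceReport as its record
source.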
And, of course, any processing routines that work with LOTS of records
should be moved to a stored procedure (server side).
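Again, just a sketch of how you might kick one off from Access, using a
saved pass-through query that already holds the odbc connect string (the
procedure and query names are made up):

    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qryPassProcessOrders")

    qdf.SQL = "EXEC dbo.ProcessMonthEndOrders"
    qdf.ReturnsRecords = False    ' an action - no rows come back
    qdf.Execute dbFailOnError

All the record crunching happens on the server, and nothing but the
command travels over the network.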
In place of loading up a large recordset and doing finds on it, load up
the recordset with the ONE record you need. Now, of course, if you need
10 records, then don't load up 10 separate recordsets; load up one
recordset with the 10 records, and do a find on it.
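In code, that looks something like this (table and field names are
examples; dbSeeChanges is needed when opening a dynaset on odbc linked
sql server tables):

    Dim rst As DAO.Recordset

    ' pull ONLY the rows for this one customer - not the whole table
    Set rst = CurrentDb.OpenRecordset( _
        "SELECT * FROM tblOrders WHERE CustomerID = 1234", _
        dbOpenDynaset, dbSeeChanges)

    ' any further lookups now walk those few rows only
    rst.FindFirst "OrderStatus = 'Open'"
    If Not rst.NoMatch Then
        Debug.Print rst!OrderID
    End If
    rst.Close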
Eliminate combo boxes with large data sources. (You can't expect a form
to load fast if it has a bunch of combo boxes that each load up a large
data set.)
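One common cure is to leave the combo un-loaded, and only feed it a
short, filtered list after the user has typed a few characters into a
search box (control and table names here are just examples):

    Private Sub txtSearch_AfterUpdate()
        ' load the combo with a SHORT list, not 200,000 rows
        Me.cboCustomer.RowSource = _
            "SELECT CustomerID, CompanyName FROM tblCustomers " & _
            "WHERE CompanyName LIKE '" & Me.txtSearch & "*' " & _
            "ORDER BY CompanyName"
        Me.cboCustomer.Requery
    End Sub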
It is not clear how large the tables you are working with are. Even
without sql server, tables in the 150,000 to 500,000 record range are
actually quite small, and applications that deal with such small tables
generally perform without noticeable delays to the user. The trick is to
limit the data transferred. This advice applies to web based systems,
c++, or vb.net, or ms-access.