Using Option Explicit instead of Variant


Roy Goldhammer

Hello there

I've started a new job working on an Access application linked to SQL Server.

The Access application really runs slowly. One of the "simple things" I've seen is that
almost none of the modules and form modules use Option Explicit,
and many of the variables have no declaration.

In my experience this can cause the application to run slower, and it is a
"simple thing" to change throughout the entire application to improve performance.

If I'm correct, can I get some documentation about it?
 
hi Roy,

Roy said:
The Access application really runs slowly. One of the "simple things" I've seen is that
almost none of the modules and form modules use Option Explicit,
and many of the variables have no declaration.
Option Explicit forces the declaration of every variable in use. If it is
omitted, then every undeclared variable gets its data type at runtime.
Variables declared without a data type are treated as Variant, which
always involves implicit casts.

Using strong typing - declaring every variable in use with the correct type -
gives you some speedup.
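For illustration, here is a minimal sketch of the two points above (the procedure and variable names are invented for this example):

```vba
' At the top of every module: forces all variables to be declared.
Option Explicit

Sub TypedExample()
    ' Without Option Explicit, a typo such as "totl" would silently
    ' create a new Variant at runtime instead of raising a compile error.
    Dim total As Long   ' strongly typed: no implicit Variant casts
    Dim i As Long
    For i = 1 To 100
        total = total + i
    Next i
    Debug.Print total   ' 5050
End Sub
```

In the VBA editor, Tools > Options > "Require Variable Declaration" adds `Option Explicit` to new modules automatically; existing modules have to be edited by hand.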

When you say they "aren't working", do you mean you can't compile? That
indicates you have a lot of undeclared variables, which you have to
declare (Dim) first before compiling.
In my experience this can cause the application to run slower, and it is a
"simple thing" to change throughout the entire application to improve performance.
Have you turned the Name AutoCorrect feature off?
Have you activated Compact on Close (AutoCompact)?

Are you using a single-file application or a split one (front end/back end)?


mfG
--> stefan <--
 
I would strongly recommend using explicit declaration and strongly typed
variables whenever possible, but this is primarily for the benefit that it
brings to maintenance and trouble-shooting rather than for performance
reasons. There are performance benefits, yes, but they may not be very
dramatic. Minimizing the data retrieved across the network by using
appropriate criteria and careful indexing may bring much greater performance
benefits.

A possible exception would be where a variant is used within a loop that is
executed many times. That variant would be a prime candidate for replacement
with a strongly-typed variable - provided, of course, the value to be
assigned can never be null.
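As a hypothetical sketch of that exception (the names are invented), the two versions of such a loop look like this:

```vba
Option Explicit

Sub LoopTypingSketch()
    Dim v As Variant   ' Variant: type is checked/coerced on every operation
    Dim n As Long      ' Long: type is fixed at compile time
    Dim i As Long

    ' Variant version: each addition goes through Variant dispatch.
    v = 0
    For i = 1 To 1000000
        v = v + 1
    Next i

    ' Typed version: same work, no per-iteration type handling.
    n = 0
    For i = 1 To 1000000
        n = n + 1
    Next i
End Sub
```

The caveat from the paragraph above applies: a Long cannot hold Null, so this swap is only safe when the value can never be Null.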
 
The Name AutoCorrect is off.
As I explained before, the database is on SQL Server and I use linked tables in
Access, using an ODBC DSN to connect to SQL Server.

There are also many things I found in the application, like using rs.Find to
locate data instead of opening a recordset on exactly the data needed.

For now I'm looking for quick solutions to improve performance.

Do you have another idea?
 
hi Roy,

Roy said:
The Name AutoCorrect is off.
As I explained before, the database is on SQL Server and I use linked tables in
Access, using an ODBC DSN to connect to SQL Server.
Do the SQL Server tables have a TIMESTAMP (rowversion) field? Its mere existence
will reduce the network traffic between Access/Jet and the SQL Server.

Do all tables have primary keys?
Do all tables have proper additional indexes?
There are also many things I found in the application, like using rs.Find to
locate data instead of opening a recordset on exactly the data needed.
When using subforms, use delayed loading if possible.
Instead of using client-side queries, use views on the server. You can
link them like tables.

Avoid the use of * in your record sources.
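One way to link a server-side view like a table is through DAO; this is a sketch, and the DSN, database, and view names are placeholders for your own setup:

```vba
Option Explicit

' Links the SQL Server view dbo.vOrderSummary so it appears in Access
' as an ordinary linked table.
Sub LinkViewAsTable()
    Dim db As DAO.Database
    Dim td As DAO.TableDef

    Set db = CurrentDb
    Set td = db.CreateTableDef("vOrderSummary")
    td.Connect = "ODBC;DSN=MyDSN;Database=MyDb;Trusted_Connection=Yes;"
    td.SourceTableName = "dbo.vOrderSummary"
    db.TableDefs.Append td
End Sub
```

If the linked view needs to be updatable, Access will ask you to pick (or you can create) a unique index on it when linking.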

mfG
--> stefan <--
 
First, MS Access can execute about 100 million VBA instructions per second
on an average computer today.

THAT IS 100,000,000 commands per second

So, let's assume that Option Explicit speeds things up by 20%.

Your program code is likely running about 100, or perhaps 200, VBA
instructions.

So:

100,000,000 instructions = 1 second

200 instructions = 0.000002 seconds

Now, let's take 20% of 0.000002:

we get 0.0000004

So you save four tenths of a millionth of a second.

Good job, you can pat yourself on the back! You sped up the code by
20%.
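The arithmetic above can be checked directly; all the figures here are the assumptions from this post, not measurements:

```vba
Option Explicit

' Back-of-envelope check of the savings estimate above.
Sub SavingsEstimate()
    Dim perSecond As Double
    Dim instructions As Double
    Dim runtime As Double
    Dim saved As Double

    perSecond = 100000000#    ' 100 million VBA instructions per second
    instructions = 200        ' instructions in a typical event routine
    runtime = instructions / perSecond   ' 0.000002 seconds
    saved = runtime * 0.2                ' 20% of that: 0.0000004 seconds

    Debug.Print runtime, saved
End Sub
```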

Really, this is not an MS Access issue, but one of computing, and how
computers work. I don't think I can recall a database application that has
been processing-limited for a CONSIDERABLE NUMBER of years. Can you?

While processing power has gone through the roof, the speed of networks and
bandwidth has not changed much for a number of years.

When you see from the above numbers that a 20% increase in the processing
speed of your code is completely insignificant, you can see why I am being so
hard here. The speed of your code is REALLY, REALLY insignificant. I just want
to stress this concept; it is a basic concept that applies to computers
in general, and to any development tool you use, including
MS Access.

Ok, we've cleared this up a bit.

Of course, good coding practice means that you will ALWAYS use Option
Explicit. And good coding practice means that you want to use correctly
typed variables, as that can catch errors in code at compile time rather than
runtime. (So: good, but it is not going to fix our SQL Server performance
here.)

SQL Server does not look at the client and go "hmm, that is MS Access, I
think I'll run slower."

SQL Server does not care if your client is written in hand-coded assembler, or
VBA, or is an Apache web site asking for data. They ALL PERFORM at the same
speed. Hence, it is up to the developer to limit the amount of data
transferred. That is the secret to good performance!

Ok, now that we have our minds set on the right road, where are some great
performance enhancements to be had (or, more specifically, where do
most Access developers fail when using MS Access as a SQL client)?

The first thing is quite obvious: you need to limit the records loaded into a
form.

For example, sometimes a developer will be lazy and open up a form attached
to a large table. At that point the user is sitting there looking at a form
with 50,000 records attached to it! (Gee, three users launch the
application, and they have forms holding a possible 150,000 records; worse,
we have NOT DONE ONE BIT OF USEFUL WORK yet!)

I don't think it takes much thought to realize how bad a design it is to
open up forms on large tables without regard to which records are to be
loaded. If I asked my 92-year-old grandmother whether it was stupid to download
all the customers' accounts into an instant teller machine AND THEN ASK for
the account number, I bet she would understand this.

Why then do so many Access developers open a form on a large table, and THEN
ASK the user what they want? You can see this makes no sense at all.

Hint:
when you use an instant teller machine, you are asked for the account
number first; that information is sent. It does not send everything and THEN ask!
Common sense for an old lady, and something that more developers need to
adopt.

So, ask the user for an account number, or a name, or something BEFORE YOU
LOAD the form. That is my first tip.

Here is a set of screenshots of a search approach that works well for JET-based
applications, or SQL Server based ones:

http://www.members.shaw.ca/AlbertKallal/Search/index.html

You can use bound forms in MS Access perfectly well, and the "where" clause will
restrict the form to ONE record. How can transferring one record to a form
be slow?
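For example, a bound form can be opened restricted to a single record; the form, field, and account-number names here are invented:

```vba
Option Explicit

' Ask for the key first, then open the form filtered to that one record.
Sub OpenOneCustomer()
    Dim acctNo As String
    acctNo = InputBox("Account number?")
    If Len(acctNo) > 0 Then
        ' The 4th argument of DoCmd.OpenForm is the WhereCondition.
        DoCmd.OpenForm "frmCustomer", , , "AccountNumber = '" & acctNo & "'"
    End If
End Sub
```

The form stays bound to the full table, but only the one matching record crosses the network.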

The next tip: if you are using ODBC linked tables and the SQL has joins (more
than one table), simply recreate that query as a view on the SQL Server side
and link to it (you do this because linked tables OFTEN will not perform joins
well). And if this is for a report, use pass-through queries, and everything is
done server side anyway (OK, there are two tips in this paragraph).
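A pass-through query can also be created in code through DAO; this is a sketch, and the query name, connect string, and SQL are placeholders:

```vba
Option Explicit

' Creates a pass-through query: the SQL runs entirely on SQL Server,
' and only the result rows come back over the wire.
Sub CreatePassThrough()
    Dim db As DAO.Database
    Dim qd As DAO.QueryDef

    Set db = CurrentDb
    Set qd = db.CreateQueryDef("qryOrdersReport")
    ' Setting Connect first marks the query as pass-through.
    qd.Connect = "ODBC;DSN=MyDSN;Database=MyDb;Trusted_Connection=Yes;"
    qd.SQL = "SELECT o.OrderID, c.Name " & _
             "FROM Orders o JOIN Customers c ON c.CustID = o.CustID"
    qd.ReturnsRecords = True   ' False for action queries (UPDATE/DELETE)
    qd.Close
End Sub
```

The saved query can then be used as the record source of a report like any other query.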

And, of course, any processing routine that works with LOTS of records
should be moved to a stored procedure (server side).

In place of loading up a large recordset and doing finds on it, load up the
recordset with the ONE record you need. Of course, if you need 10
records, don't load up 10 separate recordsets; load up one recordset with
the 10 records, and do your finds on it.
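In DAO terms, that advice looks roughly like this; the table, field names, and values are made up for the example:

```vba
Option Explicit

Sub FetchJustWhatYouNeed()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset

    Set db = CurrentDb

    ' Bad: open the whole table, then hunt for rows client side.
    '   Set rs = db.OpenRecordset("SELECT ... FROM Orders", dbOpenDynaset)
    '   rs.FindFirst "OrderID = 12345"

    ' Better: let the server return only the rows you need.
    Set rs = db.OpenRecordset( _
        "SELECT OrderID, OrderDate, Amount FROM Orders WHERE CustID = 42", _
        dbOpenDynaset)

    ' Moving among those few rows with FindFirst is now cheap,
    ' because the recordset only holds that customer's orders.
    rs.FindFirst "OrderID = 12345"
    rs.Close
End Sub
```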

Eliminate combo boxes with large data sources. (you can't expect a form to
load fast if it has a bunch of combo boxes that load up large data sets).

It is not clear how large the tables you are working with are. Even without SQL
Server, tables in the 150,000 to 500,000 record range are actually quite
small, and applications that deal with such small tables generally perform
without noticeable delays to the user. The trick is to limit the data
transferred. This advice applies to web-based systems, C++, VB.NET, or
MS Access.
 
