Tips on domain aggregate replacements


David C. Holley

Apart from what's been suggested, have you thought about an approach
whereby each person pulls data onto their own PC to run the report? The
idea of replication also comes to mind; however, while I'm quite familiar
with the concept, I haven't actually used it myself.
I will try moving the DSums to the report. I already have several others
there, and the quirk is that it seems to make navigating from page to page
rather slow (of course, there could be some other reason for that).

This already is a split db, but the users are 400 miles away from the server
hosting the data:) They are all running the same copy of the FE locally on a
single terminal server via TS/RDP sessions. So, for all intents & purposes,
there already is a copy of the executable on each user's station. The BE sits in
the same folder on that server where the FE sits. Would there be some
advantage to having them each run their own copies locally on the terminal
server?

It doesn't seem to have much impact on performance; the report takes just
about as long to run if they are all logged in as it does when I am logged on
testing it at night.

Having said all of this, I always split my databases. The only reason that I
don't have the BE sitting on a server in their office with a FE on each
workstation is that I have to get all my inefficiencies worked out first.
Several of my other FE/BE db's seem to suffer from network latency issues,
and I have not had time yet to go through the process of keeping the
connection to the db open, trimming out domain aggregate functions, etc. to
maximize efficiency.

I spend virtually all my time on user interface and data structure/integrity
issues - don't have much time to rethink methodologies...

:

Okay, Brian, this is going to send you screaming to the looney bin. I went
back to an app I did about 6 years ago, because I remember a specific report
that took forever to run, and I made some mods that took it from about an
hour to run to about 2 minutes. Here is the funny part: I took all the
aggregate functions out of the query. Then, in the report, I would do DSums
from the controls on the report against the data returned by the query.

The other thing is you really need to split your database. Put the back end
on a server and a copy of the front end on each user's workstation.

Thanks.

In the specific example that is plaguing me, it is a multi-user DB, but it
is being run by all users locally on a terminal server, and the performance
is the same whether one or all are using the app. The performance is
suffering largely due to the complexity of certain reporting requirements (in
addition to the speed of the server - the particular report takes about 1/4
as long to run locally on my development station, but that is still about
1-3/4 minutes).

The biggest offender is a certain report that (in addition to a lot of other
info) needs to Sum a particular complex total for each of the next twelve
months (the DSum in VBA was a sideline to this issue) and present these as
items on each line of output.

The report layout, simplified, looks like this:

Item  Descr  Month1Total  Month2Total  Month3Total  ...  Month12Total
1     ABC    $1200        $1500        $1700
2     DEF    $ 350        $   0        $  25
Totals at bottom

I calculate these like this:

First month's total -> DSum(blah, blah, blah where the date falls within the next month)
Second month's total -> DSum(blah, blah, blah where the date falls within the month after next), using DateAdd to group into the correct months

I'm trying to find an alternative, because I suspect the DSums are the
bottleneck here.

:


Minimizing the use of domain aggregates is a laudable goal; however, one can
never get rid of all of them. Whether you use a query or a DSum/DLookup
depends on a number of factors. If you just want one value returned, I
would suggest the DSum. If you have additional data you want at the same
time, the query would be the way to go.

One key point, as I stated earlier, is to use the Execute method rather than
RunSQL. It is much faster. The Execute method only works for action
queries, though. For select queries, I would suggest using a recordset. In fact, in
my personal experience, it seems to be faster if I first create a recordset
with no domain aggregates in it when I need a group of data. Then, if I need to
total a field, I do a DSum on the recordset I created.
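
To illustrate what I mean (a rough sketch only - the table and field names are
made up, and it assumes a DAO reference):

Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb

' Action query: Execute is faster than DoCmd.RunSQL and gives trappable errors.
db.Execute "UPDATE tblOrders SET Shipped = True WHERE OrderID = 1;", dbFailOnError

' Select query: open a recordset rather than calling a domain aggregate.
Set rs = db.OpenRecordset( _
    "SELECT Sum(Quantity) AS TotalQty FROM tblOrders WHERE CustomerID = 1234;", _
    dbOpenSnapshot)
If Not rs.EOF Then Debug.Print Nz(rs!TotalQty, 0)
rs.Close
Set rs = Nothing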

If you are having performance problems, there may be other factors to
consider. Are you using a split database? Are there multiple users? Does
each user have a copy of the front end on their own workstation? Users
sharing a front end database or an unsplit database over a network is a real
performance issue.


:


That's how I am already doing it (DSum, DLookup, etc.). However, everywhere I
turn for help on speeding up my DB's, I hear, "Get rid of your domain
aggregates", and that is why I asked the question in the first place.

Am I missing something here? Is there really some advantage to using Sum in
a parameterized query over using DSum?

:


One note: instead of RunSQL, use the Execute method. It is much faster.

Now, to do the sum of Quantity where CustomerID is 1234 and put the result
in ABC:

First, make ABC a Variant data type, because in the DSUM function, it is
possible to get a Null return if there are no matches. Since I don't know
where you are calling it from, I will, for example purposes, assume you have a
text box on a form named txtCustID, and the current record is Customer 1234.

ABC = DSUM("[Quantity]", "MyCustomerTableName", "[CustomerID] = " & txtCustID
The above assumes the data type of [CustomerID] is a numeric field. If it
is a text field, then use this version:
ABC = DSUM("[Quantity]", "MyCustomerTableName", "[CustomerID] = '" &
txtCustID & "'"
If there are no records matching the criteria, then ABC will contain Null;
otherwise, it will contain the sum of Quantity for the selected customer.
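
If you would rather treat "no matching records" as zero, you can wrap the
call in Nz, e.g.:

ABC = Nz(DSum("[Quantity]", "MyCustomerTableName", "[CustomerID] = " & txtCustID), 0)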

:


Thanks, Allen. Just a follow-up here, though, because I have always used a
lot of queries & subqueries and am not very familiar with building recordsets
in VBA.

As an example, let's say I want to assign the Sum of the Quantity field from
the Invoices table where CustomerID=1234 to variable ABC. What I have been
doing is writing a Sum query against Invoices, and then (in VBA) doing a
DFirst (since the query will return only a single entry) or DLookup to grab
the SumOfQuantity.

There are two parts I don't understand yet:

1. Can I just do this in VBA, without the query?
2. If not, how do I pass the Where clause to the query on the fly without
using it in the DFirst or DLookup? Obviously, I cannot hard-code the criteria
in the query.

:


Brian, if you just want a single value in VBA, just go for it.

If you need multiple values, or values in a loop, you are probably better off
building a recordset to get those values. It would be based on a totals query,
or possibly a query with a subquery.
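
For example (just a rough sketch using the Invoices names from your post - it
assumes a DAO reference, and 1234 is hard-coded only for illustration):

Dim db As DAO.Database, rs As DAO.Recordset
Dim ABC As Variant
Dim lngCustID As Long

lngCustID = 1234
Set db = CurrentDb

' 1. Single value: no saved query needed.
ABC = DSum("Quantity", "Invoices", "CustomerID = " & lngCustID)

' 2. The same total via a recordset, building the WHERE clause on the fly.
Set rs = db.OpenRecordset( _
    "SELECT Sum(Quantity) AS SumOfQuantity FROM Invoices " & _
    "WHERE CustomerID = " & lngCustID & ";", dbOpenSnapshot)
If Not rs.EOF Then ABC = rs!SumOfQuantity    ' Null if no invoices match
rs.Close
Set rs = Nothing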

If subqueries are new, see:
How to Create and Use Subqueries
at:
http://support.microsoft.com/?id=209066

--
Allen Browne - Microsoft MVP. Perth, Western Australia.

Reply to group, rather than allenbrowne at mvps dot org.


Everything I am reading tells me I should minimize my use of domain aggregate
functions, so I am in the process of going through several apps to replace
these. In many cases, I can use a join in a query, but there are times when I
need to run the statement from VBA. I need a tip on how to accomplish this.

I know how to run action queries using RunSQL, but (how) can I run select
queries to return Max, Sum, etc. results to a variable or array in VBA?
 

Guest

Thanks, David. That makes sense. The primary case where I am having
trouble finding a more efficient way to do things (and I'm reposting a small
portion of this thread to bring attention to it) is this:

The biggest offender is a certain report that (in addition to a lot of other
info) needs to Sum a particular complex total for each of the next twelve
months and present these as fields on each line of the report. The report
layout, simplified, looks like this:

Item  Descr  Month1Total  Month2Total  Month3Total  ...  Month12Total
1     ABC    $1200        $1500        $1700
2     DEF    $ 350        $   0        $  25
Totals at bottom         $1550        etc.

I calculate these like this:

Month1Total -> DSum(blah, blah, blah where the date falls within the next month from today)
Month2Total -> DSum(blah, blah, blah where the date falls within the month after next), using DateAdd to group into the correct months
etc. through Month12

I'm trying to find an alternative, because I suspect the DSums are the
bottleneck here. The only other thing I can think of is 12 subqueries - one
for each of the 12 month-related fields - would that be more efficient?
 

David C. Holley

My initial thinking would be to use a query that SELECTs the records to
be totaled, loop through them getting the SUMs, and then store the data
in a temporary table that is then used for the final report. Not at all
undoable. Are the months you're using always the same, in the sense of
always being the last twelve months, the next twelve months, or calendar
year-to-date, or can they vary based on user input? If it's always the
same reporting period AND the values *won't* change after month-end, you
could use a table to store the month-end totals. The query to sum the
totals would run the day after month-end and then add the values to the
table - again, spreading the processing across months. You would then use
that table for the reporting.
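
As a very rough sketch of that idea (the table and field names below are
invented, I've used your next-twelve-months case as the example, and I've let a
single append query do the select-and-sum in one pass instead of an explicit loop):

Dim db As DAO.Database
Set db = CurrentDb

' Clear out the previous run's rows.
db.Execute "DELETE FROM tmpMonthTotals;", dbFailOnError

' Re-total the next twelve months into the temp table the report is based on.
db.Execute _
    "INSERT INTO tmpMonthTotals (ItemID, MonthOffset, MonthTotal) " & _
    "SELECT ItemID, DateDiff('m', Date(), ContractDate) + 1, Sum(Amount) " & _
    "FROM tblContracts " & _
    "WHERE ContractDate >= DateSerial(Year(Date()), Month(Date()), 1) " & _
    "AND ContractDate < DateSerial(Year(Date()), Month(Date()) + 12, 1) " & _
    "GROUP BY ItemID, DateDiff('m', Date(), ContractDate) + 1;", dbFailOnError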
 

Guest

Thanks for the response, David. This is a commodity trading contract
management program, so it is always the next twelve months from the day the
report is run (as of today, this would be November '05 through October '06). The
data is extremely dynamic, and may be different as it is run up to several
times per day by various traders (typically, it is at least once per day). A
change to a single contract by any trader will materially, if not greatly,
change the output.

I considered using temporary tables, but haven't gone that route yet because
then I have to be more proactive in my compact/repair process to avoid DB
bloat, which could be difficult in this multi-user environment.
 

Dirk Goldgar


You might use a union query that brings together 12 monthly queries
(each a totals query for the given month), or a single crosstab query.
Either approach would be a *lot* more efficient than running DSum 12
times for each record.
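
Here's a rough sketch of the crosstab idea, written as a VBA string you could
use as the report's RecordSource (or save as a crosstab query) - the table and
field names are invented, so adjust to suit:

Dim strSQL As String
strSQL = _
    "TRANSFORM Sum(Amount) " & _
    "SELECT ItemID, Descr " & _
    "FROM tblContracts " & _
    "WHERE ContractDate >= DateSerial(Year(Date()), Month(Date()), 1) " & _
    "AND ContractDate < DateSerial(Year(Date()), Month(Date()) + 12, 1) " & _
    "GROUP BY ItemID, Descr " & _
    "PIVOT 'Month' & Format(DateDiff('m', Date(), ContractDate) + 1, '00');"
' Columns come out as Month01 ... Month12 - one row per item, one pass over the data.
' e.g. assign it in the report's Open event:  Me.RecordSource = strSQL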
 

Michel Walsh

Hi,

Have you tried GetRows on a recordset (ADO or DAO)? That returns a transposed
array of Variants holding your records. Being transposed, the records are
arranged in columns (the second index), the fields in rows (the first index), and
the indices start at 0. See the help file.
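
As a small sketch (DAO flavour; the SQL and names are only examples):

Dim db As DAO.Database, rs As DAO.Recordset
Dim arr As Variant
Dim i As Long

Set db = CurrentDb
Set rs = db.OpenRecordset("SELECT ItemID, Descr FROM tblItems;", dbOpenSnapshot)
If Not rs.EOF Then
    rs.MoveLast: rs.MoveFirst              ' populate RecordCount on a snapshot
    arr = rs.GetRows(rs.RecordCount)       ' arr(field, record), both zero-based
    For i = 0 To UBound(arr, 2)
        Debug.Print arr(0, i), arr(1, i)   ' first index = field, second = record
    Next i
End If
rs.Close
Set rs = Nothing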

One of "the" problem with data in "memory" is that you cannot easily share
it with other users (in addition to eventually consume a lot of memory). I
won't suggest that solution except for very simple tasks.

Hoping it may help,
Vanderghast, Access MVP
 

Guest

Thanks, Dirk. I must confess my relative ignorance of crosstab queries,
because I have had no great need for them until now. This is incredibly fast
(of course) compared to my old clunky DSums.

While the crosstab query runs very fast, I am posting a followup question in
Access Queries, because I am getting a "Data type mismatch in criteria
expression" whenever I set the ColumnHeadings property to standardize the
output to send to the report.
 

David C. Holley

You may want to consider this approach.
-Create a 2nd back-end DB to house the temp tables. This 2nd BE would
contain only temp tables, nothing else, and would sit on the user's PC. Since it
only contains temp tables AND sits on the user's PC, only one person would have
it open at any given time, so the Compact/Repair would be much easier
to handle. I'm thinking you might be able to handle that via code; however,
I personally don't have experience with it. If you do have problems with
it or the code, since the file only contains temp tables, it would just be a
matter of deleting it and copying a fresh one into place.

-In the FE add links to the 2nd backend. This would create a situation
where the FE sees both backend dbs.

-In the FE add the code that builds/populates the temp tables. Since the
FE sees both backends, it can manage pulling the data from the backend
with the hard data and loading it into the backend with the temp tables.
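
Just as a sketch of the "delete the file and copy a fresh one" part (untested -
the paths and the ztmp naming convention are invented):

' Refresh the local temp back-end from a pristine template, then relink.
Const TEMPLATE_PATH As String = "\\Server\Share\TempTables_Template.mdb"
Const LOCAL_PATH As String = "C:\MyApp\TempTables.mdb"

Public Sub ResetTempBackEnd()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef

    ' Replace the local temp file with a fresh (already compacted) copy.
    If Dir(LOCAL_PATH) <> "" Then Kill LOCAL_PATH
    FileCopy TEMPLATE_PATH, LOCAL_PATH

    ' Repoint the FE's temp-table links at the new local copy.
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If tdf.Name Like "ztmp*" Then      ' naming convention for the temp-table links
            tdf.Connect = ";DATABASE=" & LOCAL_PATH
            tdf.RefreshLink
        End If
    Next tdf
End Sub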
 

Guest

No, not yet. I'm not very familiar yet with doing a lot of this manually and
still rely on the query builder for a lot rather than VBA. As it turned out,
after converting all the DSums to joined queries, I could hardly tell the
difference in speed, and upon further scrutiny, I discovered a simple DLookup
of a customer name from a customer code in my query that was responsible for
probably 95% of the bottleneck. I guess I should have started there, but I
did learn a lot about crosstabs, etc., with all the answers I received.
Now I just need to start working with Recordsets manually (when I have time).

I guess it was just a bad case of tunnel vision on my part.
 

Guest

Hi Brian,

Not sure if you are still following this thread, but this caught my attention:
Several of my other FE/BE db's seem to suffer from network latency issues,
and I have not had time yet to go through the process of keeping the
connection to the db open, ....

Please see the following article:

Implementing a Successful Multiuser Access/JET Application
http://www.access.qbuilt.com/html/multiuser_applications.html

Pay particular attention to:
1.) Disabling NameAutocorrect
2.) Setting all subdatasheets to [None] and
3.) Establishing a permanent connection for each user (very easy to do).

In addition, this article includes lots of useful tips such as using indexes
to help prevent table scans, and using an unbound search form to help locate
records that the user may want to edit.

If you can remove a page count in reports (Page X of Y), your reports should
open much faster too.


Tom Wickerath, Microsoft Access MVP

http://www.access.qbuilt.com/html/expert_contributors.html
http://www.access.qbuilt.com/html/search.html
__________________________________________
 

Guest

Thanks, Tom. I have implemented much of this in my current/ongoing
development, but you know how it is with old apps sometimes. The
methodologies I used three years ago are very different from the way I
develop today, and it is painful to watch some of the old apps in action
without the time to go through them & update some of the efficiency issues
(too many new projects to do...)

 

Guest

Hi Brian,

Glad to see that you got my message. The most important issues I pointed out
are all implemented very quickly (under 10 minutes):

1.) Disabling NameAutocorrect
2.) Setting all subdatasheets to [None] and
3.) Establishing a permanent connection for each user (very easy to do).

For item # 2 above, there is code available that you can import into your
database to make this a quick procedure. See
http://support.microsoft.com/?id=275085

For item # 3 above, just create a one-record table in the BE database. Then
create a form in the FE database that is bound to this table. Open the form
in hidden mode either using an Autoexec macro or VBA code.
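
For example, the code behind item # 3 can be as small as this (the form name is
just an example; call the function from an AutoExec macro with a RunCode action,
or from your startup code):

' Keeps a connection to the back-end open for the life of the session.
Public Function OpenPersistentConnection()
    ' frmPersistentConnection is the hidden form bound to the one-record BE table.
    DoCmd.OpenForm "frmPersistentConnection", acNormal, , , , acHidden
End Function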


Tom Wickerath, Microsoft Access MVP

http://www.access.qbuilt.com/html/expert_contributors.html
http://www.access.qbuilt.com/html/search.html
__________________________________________


 
