Allow routine to finish when called multiple times

I have two forms. They can both be open at the same time. Form1 is the
driver form and has a "next" button. When pressed, it calls Routine1 which
reads the next set of data, populates the fields on Form1, then calls
Routine2 to populate the fields on Form2.

This works great until users start pressing the Next button really fast.
Routine2 doesn't finish processing each call, and eventually an error is
generated, and the wrong data can end up on Form2.

Is there any way to ensure that when calling Routine2, it is completed
before it is called again? Doesn't seem like this is a DoEvents issue, but
any help would be greatly appreciated.
 
Two possible approaches:
Private Sub cmdNext_Click()
    cmdNext.Enabled = False
    Call Routine1
    Call Routine2
    cmdNext.Enabled = True
End Sub

(You might need to SetFocus to another control temporarily before disabling
cmdNext.)

Private Sub cmdNext_Click()
    On Error GoTo ExitHere
    Static bolWorking As Boolean
    If bolWorking = True Then Exit Sub
    bolWorking = True
    Call Routine1
    Call Routine2
ExitHere:
    bolWorking = False
End Sub

HTH,
 
I once had a user that pressed a bulk email button continuously and wound up
spamming our customers to death. I set focus to a different control, then
disabled the button. You must do something similar (aircode):

Sub cmdWhatever_Click()
    Me.SomeOtherControl.SetFocus
    Me.cmdWhatever.Enabled = False
    'Run your other code
    'You may need a DoEvents here
    Me.cmdWhatever.Enabled = True
End Sub

That should prevent any clicking on the button.
 
Both solutions are good; however, I wasn't clear enough when describing the
routines. Routine1 calls Routine2, so cmdNext_Click doesn't really know
about it.

Private Sub cmdNext_Click()
    Call Next_Rec
End Sub

Public Sub Next_Rec()
    If Not IsEmpty(aRecords) Then
        '...bunch of processing...
        If some condition Then
            Call RefreshLimitedInfo
        End If
    End If
End Sub

Public Sub RefreshLimitedInfo()
    '...bunch of processing...
    Call RefreshOpenForms(strRepID, "Limited")
End Sub

Public Sub RefreshOpenForms(strRepID As String, strMode As String)
    '...bunch of processing...
    If FormIsOpen("frm_Load_Image") Then
        Call PopulateImageList
    End If
End Sub

Public Sub PopulateImageList()
    '...read images for this property
    '...there are 6 image controls on the form
    '...set the Picture, HyperlinkAddress properties...
End Sub

This routine is giving the error: -2147417848 "Automation error: the object
invoked has disconnected from its clients".
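For what it's worth, the FormIsOpen helper called above isn't shown in the
thread; a minimal sketch, assuming it simply checks the form's load state via
SysCmd, might look like:

Public Function FormIsOpen(strFormName As String) As Boolean
    'Hedged sketch: possible implementation of the FormIsOpen helper
    'referenced above (the original isn't shown in the thread).
    'SysCmd returns 0 when the named form is not open.
    FormIsOpen = (SysCmd(acSysCmdGetObjectState, acForm, strFormName) <> 0)
End Function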

When I look up the error, Microsoft has a kb article,
(http://support.microsoft.com/default.aspx?scid=kb;en-us;319832) but it
doesn't seem to apply to my situation.

Thanks for looking...

Sandy G
 
I may have to do this, but I was hoping for a different solution.

Thanks for your input!

Sandy G.
 
Routine1 calls Routine2, so, cmdNext_Click doesn't really know about it.
Sure it does, indirectly. It knows when Routine1 has ended, and Routine1
won't end until Routine2 has ended.

Step through your code. cmdNext passes control to Routine1, which passes
control to Routine2. *Only when it's done* will Routine2 pass control back to
Routine1. *Only when Routine1 is done* will control return to
cmdNext_Click.

Set a breakpoint in Routine2. Then, in the VBE, View>Call Stack. You'll see
the current "stack" of executing procedures. The top member has execution
control, the rest are waiting for execution control to return to them, in
the order shown. (If you select any but the top entry of the Call stack you
will see the "pending" line marked with a green triangle. When execution
control is returned, that line will be "done" and the next line will be
executed.)
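To see this for yourself, here is a minimal sketch (the procedure names are
made up for illustration) that prints the order in which control returns:

Sub Outer()
    'Minimal sketch (hypothetical procedure names) showing that a caller
    'resumes only after everything it calls has finished.
    Debug.Print "Outer: start"
    Inner
    Debug.Print "Outer: resumed"   'runs only after Inner and Innermost return
End Sub

Sub Inner()
    Debug.Print "Inner: start"
    Innermost
    Debug.Print "Inner: resumed"
End Sub

Sub Innermost()
    Debug.Print "Innermost"
End Sub

In the Immediate window, "Outer: resumed" always appears last, no matter how
deep the nesting goes.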

So, simply remove the 'call routine2' line. It wouldn't hurt to replace it
with a DoEvents. Whatever the next line is (enabling the button or setting
bolWorking to False), it won't execute until Routine1 *and anything routine1
calls* have completed.

HTH,
 
Okay, understood. So, I'm going to try this.

Public Sub cmdNext_Click()
    On Error GoTo ExitHere
    Static bolWorking As Boolean
    If bolWorking = True Then Exit Sub
    bolWorking = True
    Call Next_Rec
    DoEvents
ExitHere:
    bolWorking = False
End Sub

I'll let you know my results...

Sandy G.
 
The advantage to do it this way is that it provides positive
feedback to the user. He/she soon learns not to even try to click
the button continuously.

This is a key UI principle: don't allow the user to do what the user
should not be allowed to do. I hate clicking a button and having a
message pop up saying "you can't do that now", because I shouldn't
have been able to click that button in the first place!

If, for instance, you have a dialog form that's collecting criteria
for some purpose, the OK button should not be enabled until all the
controls on the form have been populated by the user. I usually do
this by writing a function that checks all the fields for valid
values, and then assigning that function to the AfterUpdate event of
all the controls (you can do it in one step by selecting them all at
once). All that function does is something like this:

Me!cmbOK.Enabled = Not IsNull(Me!txt1 + Me!txt2 + Me!txt3)

Any other type of control ought to have a default value (checkboxes,
option groups), and for any control that has an obvious value (like
a date range) you should prepopulate to make it as easy as possible
for the end user. If you prepopulate everything, then you should
have the OK button enabled by default, but still use the function
above, in case the user deletes a value from one of the fields.
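Put together, a sketch of that approach might look like the following (the
control names txt1, txt2, txt3, and cmbOK are assumptions for illustration);
you would assign "=ValidateForm()" to the AfterUpdate property of each of the
criteria controls:

Private Function ValidateForm()
    'Sketch of the validation approach described above; control names
    'are hypothetical. Adding Nulls propagates Null, so the result is
    'Null unless every field has a value.
    Me!cmbOK.Enabled = Not IsNull(Me!txt1 + Me!txt2 + Me!txt3)
End Function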
 
Agreed. Prepopulating with defaults, client-side validation, and an obvious
direction of program flow are what make desktop applications, at any degree
of complexity, superior to web applications. Lacking that much client-side
code, even most multi-tiered web applications make data entry dog slow.
Anyone who can remember how fast DOS data entry applications were, years ago,
will agree. Business applications are successful only if they show a positive
return on investment.
 

AJAX is changing that, but at the cost of increasing the weight of
the pages (i.e., all that JavaScript means loading the page takes
longer). This is no problem for Intranet apps, but lots of people
are still on dialup.
 

Not to be pedantic, but AJAX is a technique using JavaScript and XML, which
still need to be passed from the server to the client. An intelligent
programmer uses a bunch of scripts to talk back and forth between the server
and client to make it appear that the web page is more responsive. It's
mostly smoke and mirrors, though. A scripting language, being uncompiled, is
never as fast as a compiled language, and a 33 Kbit dialup, an 8 Mbit cable
connection, or even a T3 is still not as fast as a 100 Mbit LAN, not to
mention a gigabit LAN. The AJAX approach is somewhat similar to a Terminal
Server in that once the page image is loaded, only little bits of data are
passed back and forth. Lastly, being a web technology without a fantastic
IDE and built-in database functionality, it will never be as fast to develop
in as Access.

To be fair, there are some things in AJAX's favor. For one, it is
cross-platform. For another, it leverages mostly existing technologies.

For less than 50 users on a LAN (and sometimes more) and reasonable amounts
of data (say 1/2 GB) I don't think that there's much that can approach
Access in speed of development, rollout, and functionality, or at anywhere
near the cost. As long as there is a reasonable number of users (20 to 30) a
Terminal Server is just fine, and far less expensive than a comparable web
application.
 
To be fair, there are some things in AJAX's favor. For one, it is
cross-platform. For another, it leverages mostly existing technologies.

Not to disagree with anything you said (which is why I'm not quoting
it), one of the disadvantages is that JS as implemented in all the
browsers is single-threaded. That is mostly a problem for pages that
use Document.Write, which is not specific to AJAX. Here's an
interesting article about the problem:

http://www.readwriteweb.com/archives/how_javascript_is_slowing_down_the_web.php
 