Creating an array

mareharbor

My code generates a number based on some inputs and then uses that
number as the Tble(0) value, and then tries to build an array from
that initial value. I do not know how many rows the array will
have, but I don't want it to be bigger than 2550. The next row in the
array is based on the previous row. My problems are the lagged
specification and the incorrect use of the Step keyword. Can anyone
point me in the right direction on this? I will eventually put the
created array in some cells to display it. Thanks a bunch.

Sub Test()
Dim Input1, Input2, Input3, Box As Double
Dim Tble() As String
Dim rw, x As Integer

Input1 = Cells(1, 1).Value
Input2 = Cells(2, 1).Value

Input3 = Input1 * Input2

Cells(3, 1).Value = Input3
rw = 0
ReDim Tble(rw)
Tble(0) = Input3
While Tble(rw) <> 2550
For rw = 1 To Step
Tble(rw) = Tble(rw - 1) + 1
rw = rw + 1
Next rw
Wend

End Sub
 
Guest

Sub Test()
Dim Input1 As Double, Input2 As Double
Dim Input3 As Double, Box As Double
Dim Tble() As Long
Dim rw As Long, x As Long

Input1 = Cells(1, 1).Value
Input2 = Cells(2, 1).Value

Input3 = Input1 * Input2

Cells(3, 1).Value = Input3
ReDim Tble(0 To 2550)

Tble(0) = Input3

For rw = 1 To 2550 Step 1
Tble(rw) = Tble(rw - 1) + 1
If rw = 2550 Then Exit For
Next rw
End Sub

You haven't said how to determine the upper bound, so I used 2550.
 
Rick Rothstein (MVP - VB)

What you are trying to do in your example is not entirely clear. However, if
you know the top bound you are willing to accept, then ReDim your array to
that value and, inside your loop, increment a counter each time you add an
element to the array. AFTER the loop has completed, ReDim the array to that
counter value (possibly one less than the counter value, depending on where
in the loop you increment it) BUT make sure you use the Preserve
keyword so as not to lose any data you assigned to the array. A sample
snippet of this would be...

ReDim YourArray(TopBound)
For X = 1 to TopBound
....
.... <<Your code>>
....
If ExitCondition = True Then Exit For
Counter = Counter + 1
Next
ReDim Preserve YourArray(Counter)

For future consideration, if you have a situation where you don't know the
top bound that will result from your loop, use ReDim Preserve in "chunks",
fixing the count at the end. Here is a sample of how to do that.

ChunkSize = 1000
Do While ExitCondition = False
....
.... <<Your code>>
....
If Counter = UBound(YourArray) Then
ReDim Preserve YourArray(Counter + ChunkSize)
End If
Counter = Counter + 1
Loop
ReDim Preserve YourArray(Counter - 1)
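The cost of this chunked-growth pattern is easy to quantify. As an editor's illustration (in Python, since the counting logic is language-neutral; the function name and numbers are my own, not part of the thread), here is a sketch of how many extra ReDim Preserve calls a given chunk size incurs:

```python
def grow_chunked(final_size, chunk_size):
    """Simulate fixed-chunk growth: start at one chunk and extend the
    capacity by chunk_size whenever the element counter catches up,
    mirroring the If Counter = UBound(...) test in the VBA above.
    Returns (final capacity, number of extra reallocations)."""
    capacity = chunk_size  # plays the role of the initial ReDim
    resizes = 0
    for counter in range(final_size):
        if counter == capacity:
            capacity += chunk_size  # ReDim Preserve ... (Counter + ChunkSize)
            resizes += 1
    return capacity, resizes

# Filling 3500 elements with 1000-element chunks costs 3 extra reallocations,
# ending at a capacity of 4000 before the final trimming ReDim Preserve.
```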

And one final point... the following line of yours
Dim Input1, Input2, Input3, Box As Double

does NOT do what you think it does. Of all the variables listed, only Box
will be a Double... the rest will be Variants. Visual Basic requires each
variable to be declared as to Type individually. So, you could do this...

Dim Input1 As Double, Input2 As Double, Input3 As Double, Box As Double

or list them individually on their own lines (which I think is far more
readable than ganging them up all on one line).

Rick
 
Harlan Grove

...
....
ReDim YourArray(TopBound)
For X = 1 to TopBound
....
.... <<Your code>>
....
If ExitCondition = True Then Exit For
Counter = Counter + 1
Next
ReDim Preserve YourArray(Counter)

Unlike Pascal but like C, VBA preserves the values of For loop
variables after the loop ends. So no need for Counter. Finish with

ReDim Preserve YourArray(X - 1)

Does VB proper work differently?
For future consideration, if you have a situation where you don't
know the top bound that will result from your loop, use ReDim
Preserve in "chunks", . . .

ChunkSize = 1000
Do While ExitCondition = False
....
.... <<Your code>>
....
If Counter = UBound(YourArray) Then
ReDim Preserve YourArray(Counter + ChunkSize)
End If
Counter = Counter + 1
Loop
ReDim Preserve YourArray(Counter - 1)
....

The main alternative being doubling the array size every time you top
out. The implicit assumption being that, if you don't know the actual
size of your data, best to assume you're half way every time you need to
reallocate more storage. The practical justification being that the
2nd time you need to allocate more storage, heuristically it's time to
increase the reallocation chunk size.

Dim a() As Variant, i As Long
ReDim a(1 To 100)
i = 0
Do While whatever
i = i + 1
...
If i >= UBound(a) Then ReDim Preserve a(1 To 2 * UBound(a))
Loop
ReDim Preserve a(1 To i)
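To see why doubling matters, a small Python sketch (an editor's illustration; the names and sizes are my own, not from the thread) can count the reallocations needed under additive versus doubling growth:

```python
def count_resizes(final_size, initial, grow):
    """Count how many reallocations are needed to reach final_size,
    starting from `initial` capacity and enlarging with grow()."""
    capacity, resizes = initial, 0
    while capacity < final_size:
        capacity = grow(capacity)
        resizes += 1
    return resizes

chunk = 100
additive = count_resizes(100_000, chunk, lambda c: c + chunk)  # 999 resizes
doubling = count_resizes(100_000, chunk, lambda c: 2 * c)      # 10 resizes
```

With a badly undersized initial guess, additive growth performs hundreds of ReDim Preserve copies where doubling needs only a handful.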
 
Rick Rothstein (MVP - VB)

ReDim YourArray(TopBound)
Unlike Pascal but like C, VBA preserves the values of For loop
variables after the loop ends. So no need for Counter. Finish with

ReDim Preserve YourArray(X - 1)

Does VB proper work differently?

No, it works the same in the compiled VB world. My preference for the side
Counter variable is two-fold. First off, "old habits die hard".<g> Way back
when, in the early days of VB, there was a concern that Microsoft might
remove the "loop index survives the loop" feature, so it was deemed better
to not get used to it. Second, it makes switching loop structures mid-stream
easier. By that I mean, if you start off with a For-Next loop and decide,
for whatever reason, to convert it to a Do-While loop, the basic
counter-dependent part of the structure, and the code relying on it after
the loop, will pretty much survive the conversion intact.

The main alternative being doubling the array size every time you top
out. The implicit assumption being that, if you don't know the actual
size of your data, best to assume your half way every time you need to
reallocate more storage. The practical justification being that the
2nd time you need to allocate more storage, heuristically it's time to
increase the reallocation chunk size.

Dim a() As Variant, i As Long
ReDim a(1 To 100)
i = 0
Do While whatever
i = i + 1
...
If i >= UBound(a) Then ReDim Preserve a(1 To 2 * UBound(a))
Loop
ReDim Preserve a(1 To i)

I have seen that argument before and do not take issue with it. However, if
you have some familiarity with the data you are processing, you usually can
make a good guess as to an upper limit to use. That should be the size you use
in your initial ReDim statement (which I now see I left out of my example).
If you set your chunk size sufficiently large to handle any overflow from a
too-small initial guess, usually you won't have to perform the ReDim more
than once or twice more. Of course, there are open-ended situations where
you don't know how much data is coming at you... in those situations, your
method is definitely the way to go.


Rick
 
Harlan Grove

...
....
. . . First off, "old habits die hard".<g> Way back when, in the early
days of VB, there was a concern that Microsoft might remove the "loop
index survives the loop" feature, so it was deemed better to not get used
to it. . . .
....

There, in a nut shell, is why programming language semantics should be
subject to multiple vendor standardization committees which must solicit
comments from language programmers before changing those semantics.
. . . Second, it makes switching loop structures mid-stream easier. By
that I mean, if you start off with a For-Next loop and decide, for
whatever reason, to convert it to a Do-While loop, the basic
counter-dependent part of the structure, and the code relying on it after
the loop, will pretty much survive the conversion intact.
....

Never had that happen myself. I've mostly written numeric routines, and the
loop mechanisms were determined by the most likely stopping criteria. When
expecting to iterate through entire arrays, use FOR loops. When iterating
through parts, e.g., binary search, use DO loops.
I have seen that argument before and do not take issue with it. However,
if you have some familiarity with the data you are processing, you usually
can make a good guess as to an upper limit to use. That should be the size
you use in your initial ReDim statement (which I now see I left out of my
example). If you set your chunk size sufficiently large to handle any
overflow from a too-small initial guess, usually you won't have to
perform the ReDim more than once or twice more. . . .

If *ONCE*, then ReDim a(UBound(a) + chunk) and ReDim a(2 * UBound(a)) are
identical. It's when you do this the *SECOND* (and subsequent) times that it
should become obvious (thus, 'heuristic') that the original chunk size was
NOT a reasonable initial guess at the array size, so better to increase the
chunk size. So doubling is no different than additive on the first resize,
but is likely more efficient on subsequent resizes.

There's still a practical argument for additive: less memory allocation. If
the initial guess happened to span 49% of the array size, then doubling
would allocate 196% of necessary memory before resizing to only what's
needed, whereas additive would only allocate to 147%. And there'd be no run
 
Harlan Grove

@#$% flaky laptop mouse/keyboard!

Picking up on my previous, uncompleted last paragraph,

There's still a practical argument for additive: less memory allocation. If
the initial guess happened to span 49% of the array size, then doubling
would allocate 196% of necessary memory before resizing to only what's
needed, whereas additive would only allocate to 147%. And there'd be no run
time difference for 3 resizes. That leads to another alternative: use an
integer state variable, and wait for every 3rd resize to increase the memory
allocation increment, giving growth intermediate between additive and exponential.

Dim a() As Variant
ReDim a(1 To chunk)
i = 0
k = 0
Do While whatever
i = i + 1
...
If i >= UBound(a) Then
k = k + 1
If k Mod 3 = 0 Then
chunk = 2 * chunk
k = k + 1
End If
ReDim Preserve a(1 To UBound(a) + chunk)
End If
Loop
ReDim Preserve a(1 To i)

This makes the array sizes these multiples of chunk: 1, 2, 3, 5, 7, 11, 15,
23, 31, etc. as opposed to (additive) 1, 2, 3, 4, 5, 6, 7, 8, 9, etc. or
(exponential) 1, 2, 4, 8, 16, 32, 64, 128, 256, etc. This fussy additional
coding avoids bogging down in resizing operations when the initial chunk is
well below 25% of the actual array size without unduly overallocating memory
when the final array size is just larger than a power of 2 times the initial
chunk size.
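Harlan's state-variable scheme can be checked with a short Python model (an editor's sketch of the VBA above, working in units of the initial chunk) that reproduces the stated multiples:

```python
def hybrid_multiples(n):
    """Model the hybrid growth scheme in units of the initial chunk:
    add the current chunk on every resize, and on every third resize
    (skipping one counter value) double the chunk first."""
    chunk, size, k = 1, 1, 0
    sizes = [size]
    while len(sizes) < n:
        k += 1
        if k % 3 == 0:
            chunk *= 2
            k += 1  # the extra bump keeps later doublings on schedule
        size += chunk
        sizes.append(size)
    return sizes

# hybrid_multiples(9) reproduces the sequence 1, 2, 3, 5, 7, 11, 15, 23, 31
```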
 
Rick Rothstein (MVP - VB)

Given your obvious interest in the underlying mathematics of things, you
might find this Google Group thread interesting.

http://groups.google.com/group/micr...2bd70f9e95/7cf35cd25e9c9249?#7cf35cd25e9c9249

I thought I remembered more analysis in it than there appears to be, but you
should still enjoy it I would think. Concentrate on messages 7 through 22.

To give you a summary of what it is about, it deals with maximizing the
interval (gaps) to be used when using the VBA Replace function in a loop in
order to squeeze out all multiple blank spaces (which, of course, is what
TRIM does at the spreadsheet level). The discussion was in a compiled VB
newsgroup, but (on a quick skim) the code looks like it would work as is in
Excel's macro world.

Rick
 
Harlan Grove

Given your obvious interest in the underlying mathematics of things, you
might find this Google Group thread interesting.

http://groups.google.com/group/micr...2bd70f9e95/7cf35cd25e9c9249?#7cf35cd25e9c9249
....

Starting with the first post, lots of whining & fussing about something that
could easily have been done with VBScript regular expressions using " {2,}".
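The same single-pattern replacement is available outside VBScript as well; as an editor's aside, here is the equivalent using Python's re module (a sketch of the pattern, not part of the thread's code):

```python
import re

def collapse_spaces(s):
    """Collapse every run of two or more spaces into one space,
    the job the " {2,}" VBScript.Regexp pattern does above."""
    return re.sub(r" {2,}", " ", s)

# collapse_spaces("1   2     3") → "1 2 3"
```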

I thought I'd test out the proposed optimal approach (#21) in Excel/VBA.


Sub foo()
Const UB As Long = 1000
Const ML As Long = 10000
Const MI As Long = 200

Dim i As Long, j As Long, k As Long, n As Long
Dim dt As Date, v As Variant, t As String
Dim a(1 To UB) As String, b(1 To 7) As String

On Error Resume Next
For i = 1 To UB
t = ""
k = Int(1 + 4 * Rnd)
For j = 1 To k
n = IIf(j > 1, 1 + Int(Rnd ^ -2), Int(4 * Rnd))
t = t & String(IIf(n > ML, ML, n), " ") & Format(j)
Next j
a(i) = t & String(Int(4 * Rnd), " ")
Next i
On Error GoTo 0

VBA:
dt = Now
i = 0
For Each v In Array(2, 3, 3, 5, 13, 121, 9841)
i = i + 1
b(i) = String(v, " ")
Next v
For k = 1 To MI
For i = 1 To UB
t = a(i)
For j = 1 To 7
t = Replace(t, b(j), " ")
Next j
Next i
Next k
Debug.Print "VBA Replace: " & Format((Now - dt) * 86400#, "#.0000")

REGEXP:
dt = Now
Set v = CreateObject("VBScript.Regexp")
v.Pattern = " {2,}"
v.Global = True
For k = 1 To MI
For i = 1 To UB
t = v.Replace(a(i), " ")
Next i
Next k
Debug.Print "RegExp Replace: " & Format((Now - dt) * 86400#, "#.0000")
Set v = Nothing

Debug.Print String(32, "-")

End Sub


The 'optimal' VBA Replace approach takes roughly 20 times LONGER than the
RegExp.Replace approach. This tends to confirm my prejudices against VB
programmers, mainly that many (most?) of them are unaware of the proper
tools or algorithms to use for standard programming tasks. Maybe VB.Net has
fixed that.

Anyway, about the problems: removing a particular substring from a string
and allocating storage for an array of unknown size are fundamentally
different. In the first case, the string length places an
upper bound on the number of substrings that may need to be removed. In the
latter case, the upper bound is unknown. The math is different, but
exponentially growing reallocation increments are still optimal unless
memory is tight. Even then constant increments are suboptimal.
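The closing claim about exponential increments can be made concrete by totaling the elements copied across all reallocations (each ReDim Preserve copies the existing contents). An editor's Python sketch, with the simplifying assumption that every resize copies the full current capacity:

```python
def copy_cost(final_size, initial, grow):
    """Total elements copied over all reallocations while growing
    capacity to at least final_size."""
    capacity, copied = initial, 0
    while capacity < final_size:
        copied += capacity  # Preserve copies what is already stored
        capacity = grow(capacity)
    return copied

chunk = 100
additive_cost = copy_cost(100_000, chunk, lambda c: c + chunk)  # 49,950,000
doubling_cost = copy_cost(100_000, chunk, lambda c: 2 * c)      # 102,300
# Constant increments cost O(n^2) copying in total; doubling stays O(n).
```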
 
