Exceeding 65K defined names in a workbook causes the workbook to go into repair mode when it is opened

Ronald Dodge

Okay, I'm at it again, but only with further testing on this defined names
issue. I have seen posts stating that an excessive number of range names
causes files to process significantly slower, which is not a symptom I have
seen in my own files. However, I have noticed a certain pattern, and now I
must raise another question about it.

What is the maximum number of items any one collection can have without
causing issues?

Why do I ask this?

It seems to me that the collection is using 4 bytes as addresses to get an
index number. If that is the case, then once there are more than 65536
items, you no longer have a unique ID number, which is what seems to be
causing the issues I have run into. This 65536 number also instantly made
me think of rows, which makes me wonder if this is also the reason MS made
65536 the maximum number of rows in a single worksheet.

What made me think in this direction?

Remember the early Windows 95 edition, when you could only have up to 2GB
of hard drive space per drive letter, so a larger drive had to be split
into multiple partitions to use all of its space? That issue came down to
the fact that the original FAT file system could only address up to 2GB of
space per drive/partition.

In this case, if the Names collection has more index values than those 4
bytes can hold, the workbook goes into repair mode when it is opened, which
makes the workbook almost useless, as the repair strips a lot of content
from it. While the application specifications say the limit is the amount
of RAM on the system, I'm nowhere near that limit. Not only that, but Excel
2002 is supposedly able to use up to about 160MB of RAM, and according to
Task Manager, when I ran into this issue Excel was only using 67MB of RAM
on a system with 512MB. Task Manager also showed 130MB of RAM still free,
plus a 768MB swap file, for a total working memory of 1.28GB, of which only
about 853MB was in use.

Now that I seem to have found the real limit of defined names, it's time to
start thinking in other directions to address the issues I've been facing.
I would venture to claim that the real limit on defined names is indeed
65536, just like the maximum number of rows is 65536. The "Cells" object
has a 2-byte by 1-byte index method, similar to a 2-dimensional array. To
track the issue down, I used the watch window to look at the Count value of
the Names collection object, and after several tests of watching that value
along with other things, that is what led me to this conclusion.
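
The kind of check described above can be sketched in a few lines of VBA.
This is a minimal sketch, not anything from the original files; the 60000
warning threshold is an arbitrary number chosen here for illustration:

```vba
' Report how many defined names the active workbook holds, and flag
' when the count approaches the ~65,536 boundary discussed in this
' thread. Output goes to the VBE Immediate window.
Sub CheckNameCount()
    Dim n As Long
    n = ActiveWorkbook.Names.Count
    Debug.Print "Defined names: " & n
    If n > 60000 Then
        Debug.Print "Warning: approaching ~65,536 names; the saved " & _
                    "file may need repair when reopened."
    End If
End Sub
```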
--

Sincerely,

Ronald R. Dodge, Jr.
Master MOUS 2000
 
Jon Peltier

Type 'Excel limits' in the help lookup box. Under "Worksheet and workbook
specifications" you'll see:

Names in a workbook: Limited by available memory

I would say that 65k names is a bit excessive, but I've never had more than
a couple thousand.

- Jon
 
Ronald Dodge

Uhmmm, I know this, and I stated it in my message, but it's not working out
that way. Regardless of what the specifications say, the tests are showing
otherwise. Why do you think I said so much about RAM and swap file
availability and usage in the last paragraph of my message?

Sorry if this seems harsh, but it's rather frustrating when you are doing
things within the written specifications and they aren't working out. I can
do pretty much everything else within the files, but this names issue has
surfaced, and it only happens when I exceed that 65536 limit. The file is a
bit slow to open with 62k names, but I can still do plenty of other things
even with the 62k names in it. It's not even close to breaching the memory
limitation with 62k names.

Sincerely,

Ronald R. Dodge, Jr.
Master MOUS 2000
 
Jon Peltier

Well, I obviously didn't read past the first couple of paragraphs in your
message. If I had, I might not have answered, because I would have noticed
you'd already learned what I then posted.

I understand your frustration with the stated limits. Maybe when the
limitations were first written down, available memory only allowed well
under 65k names, so whoever wrote the specs didn't consider the 4 byte
addresses.

- Jon
 
Peter T

I have no idea as to the specific cause of the apparent Names limit you've
hit, never having tested with anything like that many. However, I'm pretty
sure there's no inherent 64k limit on collections in general. In theory the
count should extend to a Long, though you would undoubtedly hit memory
constraints well before getting anywhere close to that. Specific
collections may be limited by other factors, though.

The 64k rows limit affects quite a few things in Excel; one is the array
limit, which is actually one less (I don't just mean array-entered
formulas, but certain array-type functions). Perhaps something like that is
referencing your names, and might only become apparent when a full recalc
occurs on open. Probably not related, but there are also 64k limits on
certain APIs in W9x and ME (I got bitten by one recently).
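
One concrete example of the kind of 64k array boundary mentioned above is
Application.Transpose, which in Excel versions of this era is reported to
fail on arrays longer than 65,536 elements. This is a sketch for
illustration, not verified against every build:

```vba
' Build an in-memory array larger than 65,536 elements and try to
' transpose it. In Excel 2002/2003 this call is reported to raise a
' type-mismatch error - one of several 64k-related boundaries.
Sub TransposeLimitDemo()
    Dim v(1 To 70000, 1 To 1) As Variant
    Dim t As Variant
    Dim i As Long
    For i = 1 To 70000
        v(i, 1) = i
    Next i
    On Error Resume Next
    t = Application.Transpose(v)
    If Err.Number <> 0 Then
        Debug.Print "Transpose failed: " & Err.Description
    End If
End Sub
```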
Ronald Dodge said:
Now that I seemed to have found the real limit of defined names, it's now
time to start thinking in other directions to address the issues that I been
facing.

Sounds well overdue!

Regards,
Peter T
 
Ronald Dodge

The funny thing about the documentation you mention: the system where I ran
into this issue is the one I got in June 1999, with no hardware upgrades
since, so that documentation must predate Excel 97, if that is the case.
The same holds for the system the company bought back in 1997. The only
hardware upgrade it had was going from 128MB of RAM to 384MB. That system
still has just a 6GB HD, while the other system has only a 20GB HD. By
today's standards those are very small drives, but back then they were
pretty high-end.

As you may have noticed, I generally keep working, testing, and documenting
until I find something that seems statistically much more likely to be
other than how it's stated, and then I point it out. I have even been
pointed to posts stating that Excel tends to run slow with 5k+ range names,
and I haven't found that to be the case even with 62k+ range names.

I'm probably one of the relatively few people out there who really push
systems to their limits without even getting into graphics, audio, video,
or design work. I more or less heavily process data, with other things
added to the files for readability. For me it's the data and the formats
(only because Excel burns off so much memory on formats; otherwise it would
just be the data that's memory-intensive) that make memory usage and CPU
load heavy for the large amount of data being processed, though that's
still probably minor compared to a graphics department. Even so, it still
takes about an hour per day just to run the reports and process the data.
One reason I am a heavy data processor is that I'm a real stickler for
accuracy and precision, because when I haven't been in the past, it came
back to haunt me in some form. Along the same lines, I don't like doing
things manually, especially after my days at the IRS doing nothing but data
entry, which gave me the early stages of CTS, so I have since sharply
improved my computer skills and automated a lot of the tasks.

With so many changes taking place last fall, I had to go through a
full-scale audit process after I got things temporarily back into
alignment. I have since put in fixes so that when those types of changes
take place again, the numbers aren't nearly as likely to go out of
alignment. That's what brought on my use of range names, but I see I will
have to get creative once again and find another route to address these
issues. I will still end up using range names, but probably much less
intensively, given this 4 byte addressing limit for collection objects.

Ronald R. Dodge, Jr.
Master MOUS 2000
 
Ronald Dodge

Even though I have used range names before, I'd never used them to this
extent until this past March, when I first started using them extensively,
only to run into this limit so quickly.

As for the collection Count property I asked about, it just made me wonder
if the Count property has been assigned an unsigned integer data type,
which is what I seem to be facing with the Names collection in particular.
Whether it's signed or not, it's still 4 bytes, and you can still only get
65536 different ID values. Nothing else I've done has run into this 4 byte
addressing limitation, but it wouldn't surprise me. I have done some
extensive programming, including creating my own objects and collections of
objects, but I never really tested the maximum number of items those
collections could hold. Of course, given that a single VBProject can only
hold up to 64MB of RAM usage, that limitation may not be hit as easily.

As for the recalc you mention, the workbook is set to manual calculation,
and the instance of Excel it opens into is also set to manual calculation,
so the repair isn't triggered by a calculation factor. For that set of
files I mainly use VBA to control the calculation of the various workbooks
and worksheets. Controlling calculation through VBA has saved a lot of
processing time, since it cuts out an extraordinary amount of redundant
calculation compared to letting Excel calculate on its own under AutoCalc
or a full forced recalc.
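
The calculation-control approach described above can be sketched roughly as
follows; the sheet names here are hypothetical stand-ins, not from the
actual files:

```vba
' Sketch of VBA-controlled recalculation: switch to manual
' calculation, recalculate only the sheets that changed, in
' dependency order, then restore the previous setting.
Sub ControlledRecalc()
    Dim prevCalc As XlCalculation
    prevCalc = Application.Calculation
    Application.Calculation = xlCalculationManual
    ' Recalculate only what is needed, upstream sheets first.
    Worksheets("Inputs").Calculate
    Worksheets("Reports").Calculate
    Application.Calculation = prevCalc
End Sub
```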

Ronald R. Dodge, Jr.
Master MOUS 2000
 
Peter T

I don't quite follow where the "4 bytes" comes in.

Ronald Dodge said:
As for the collection count property that I asked, it just made me wonder if
the count property has been just assigned an unsigned integer variable data
type, which is what I have faced in particular to the Names collection
object. Rather if it's signed or not, it's still 4 bytes and you still only
can get 65536 different ID values.

4 bytes (32 bits) gives up to 2,147,483,647 positive values.
2 bytes gives -32,768 to +32,767, i.e. 65,536 unique values including zero.

FWIW, the Count property of a collection is a Long -

Dim c As New Collection, v As Variant
Dim i As Long
v = c.Count
MsgBox VarType(v) ' 3 = vbLong
For i = 1 To 70000
    c.Add i, CStr(i)
Next i
MsgBox c.Count ' 70000


I always thought workbooks recalculate on open, irrespective of the
workbook's or application's Calculation property, as they do on save, but
I'm not sure (one would need to write to a cell in a closed wb that is
linked to a formula and see if the formula value updates on open with calc
set to manual). Even if it doesn't, that still wouldn't prove either way
whether other non-cell-formula stuff recalcs on open with manual calc.

I doubt any of the above directly relates to the Names limit. I must admit
I would be interested to know if the limit really is as you've described,
but only out of curiosity, as I can't imagine ever needing to use so many
names.

Regards,
Peter T
 
Ronald Dodge

Sorry about that, you're right, it's 2 bytes. You still get what I'm
referring to otherwise.

Ronald R. Dodge, Jr.
Master MOUS 2000
 
Peter T

Not entirely, but indeed Excel 97 was originally designed around 2 bytes/16
bits/64k.
Maybe!

Regards,
Peter T
 
NickHK

Ronald,
There's no such limit on a collection; I can add more than a million items
quickly. With the Names: I ran code to find a limit, but after 25 minutes
with ~75,000 names added, I stopped the code. So it would seem to be a
memory limit rather than any hard limit.

Whatever the limit, it seems ill-advised to use so many names in a single
WB. Maybe create your own Type:

Private Type MyRange
    RAddress As String
    RValue As Variant
End Type

and add these to a collection, with the <Name> as the key. Not quite the
same, but you then avoid all the inherent problems of a massive Names
collection.
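
One caveat worth noting with the sketch above: VBA won't let you add an
instance of a plain user-defined Type to a Collection directly, so in
practice the same idea is usually done with a small class module instead.
A sketch, assuming a hypothetical class module named CRange with public
RAddress and RValue fields:

```vba
' In a class module named CRange:
'   Public RAddress As String
'   Public RValue As Variant

' In a standard module - build a keyed lookup of pseudo-names:
Sub BuildLookup()
    Dim ranges As New Collection
    Dim r As CRange
    Set r = New CRange
    r.RAddress = "Sheet1!$A$1"
    r.RValue = 42
    ranges.Add r, "MyName"                 ' keyed by the name
    Debug.Print ranges("MyName").RAddress  ' retrieve by key
End Sub
```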

NickHK
 
Ronald Dodge

Which version of Excel are you using? This is happening with Excel 2002
SP3. The file I attempted to put range names into is around 9MB, and I have
had files pushing 20MB with no range names and no problem other than being
on the slow side, due to their size and the amount of calculation on the
large amount of data.

The other thing: did you save this file, close it, then reopen it? It's
when you reopen such a file that it goes into repair mode. When I first ran
into this issue, I was working in the file just fine until after I had
saved it, closed it, and attempted to reopen it later. If you never close
such a file, you never notice the issue.

Just for grins, I even did it with a completely new workbook with nothing
else open: I set up a macro to record a total of 65538 names into it, saved
it, closed it, then opened it back up, and it did the same thing. That
file size was 3.35MB.
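
A sketch of the kind of repro described above (the file path is
hypothetical, and this deliberately creates a workbook that may need repair
on reopen, so don't run it on anything you care about):

```vba
' Add 65,538 trivial defined names to a fresh workbook and save it.
' Reopening the saved file is where the repair-mode behaviour
' reportedly appears in Excel 2002.
Sub ReproNameLimit()
    Dim wb As Workbook
    Dim i As Long
    Set wb = Workbooks.Add
    For i = 1 To 65538
        wb.Names.Add Name:="n" & i, RefersTo:="=Sheet1!$A$1"
    Next i
    wb.SaveAs "C:\temp\NameLimitTest.xls"
    wb.Close SaveChanges:=False
End Sub
```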

--

Sincerely,

Ronald R. Dodge, Jr.
Master MOUS 2000
 
NickHK

Ronald,
Comments inline:

Certainly seems that trying to use that many names is not the way to go.
Find another way.

NickHK

Ronald Dodge said:
Which version of Excel are you using? This is happening with Excel 2002,
SP3. The file that I attempted to put range names into is around 9MBs and I
have had files pushing 20MBs using no range names and still not have a
problem other than just being on the slow side due to it's size and amount
of calculations on the large amount of data.
- Same, XL2002, SP3
The other thing, did you save this file, close it out, then reopen it? It's
when you go to reopen a such file, that is when it goes into repair mode.
When I first ran into this issue, I was working in the file just fine until
after I had saved it, closed it out and attempted to reopen it at a later
point of time. If you never close out a such file, you never notice the
issue.
- First time, no saving/opening
- OK, just ran it again and saved/closed. Upon reopening, yes, I get the
error message about damage etc.
 
Ronald Dodge

Oh, believe me, I have already gone a different direction. When you come to
a road block, you don't just stand still and do nothing, or keep trying to
push through. Instead, you find another way around.

--

Sincerely,

Ronald R. Dodge, Jr.
Master MOUS 2000
 
