Using a constant long as array size initializer gives RTE (was Re: Option Strict On)


Larry Lard

Vijay said:
The confusion for me here is that C# does not seem to complain. VB being
simple enough, why is there this problem? Maybe they have not changed the
underlying type from VB?

It would appear this is one of the very few times when VB.NET is stricter
than C#. To wit, and to recap for C#ers just joining us:

C#
int[] f = new int[3000000000];
// compiles, throws System.OverflowException at run time

VB.NET
Dim f(3000000000) As Integer
' Will not compile: error is 'Constant expression not representable
' in type Integer'

So here, the VB.NET compiler can work out at compile time that 3
billion is bigger than an Int32 can hold, and thus can't be used as an
array size (a CLS limitation, I would suspect); but the C# compiler
can't, or doesn't care.

Your particular example can be examined also:

C#
long l = 3000000000;
int[] f = new int[l];
// compiles, throws System.OverflowException at run time

VB.NET
Dim l As Long = 3000000000
Dim f(l) As Integer
' Will not compile under Option Strict On:
' error is 'Option Strict On disallows implicit conversions from
' '<type1>' to '<type2>''
' WILL compile with Option Strict Off: throws System.OverflowException
' at run time

So for me the question is, why does C# allow this unstated cast from
long to int to compile? To that end I have cross-posted to the C# group.
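
For what it's worth, a minimal sketch of the explicit version in C#,
assuming the intent really is an Int32-sized array (the checked cast is
my suggestion, not something from the original code):

C#
long l = 3000000000;
int[] f = new int[checked((int)l)];
// the checked cast throws System.OverflowException right here, since
// 3000000000 > Int32.MaxValue, instead of inside the array creation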
 

VJ

Yeah, seems like at some point VB.NET is smarter or stricter here... I took
all the trouble to move from VB to C# over the last 2 years... now you have
given me some reason to come back... :) Actually speaking, I should be a
loyalist here for VB, because my initials are VB... :)

Vijay

Larry Lard said:
<snip>

VJ

Sorry, wrong post... I was in another group.

VJ

VJ said:
<snip>
 

Mike Schilling

Larry Lard said:
<snip>
So for me the question is, why does C# allow this unstated cast from
long to int to compile? To that end I have cross-posted to the C# group.


There's no cast involved; the C# spec says, regarding the sizes used in
creating a new array (section 7.5.10.2, Array creation expressions):

Each expression in the expression list must be of type int, uint, long,
or ulong, or of a type that can be implicitly converted to one or more
of these types

It also says it's a compile-time error if a constant expression evaluates to
a negative, but is silent about what happens if it evaluates to a number
that's too large. In fact, nothing I can see in the spec refers to the
largest possible size of an array being limited to what fits in an int.
Apparently the language has no such restriction.
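
A short sketch of that rule in practice (my own example, not from the
spec): each of the four listed types works as a size expression with no
cast, while a negative constant is rejected at compile time.

C#
int i = 10; uint u = 10; long l = 10; ulong ul = 10;
int[] a = new int[i];   // int
int[] b = new int[u];   // uint
int[] c = new int[l];   // long
int[] d = new int[ul];  // ulong
// int[] e = new int[-1]; // compile-time error CS0248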
 

Jon Skeet [C# MVP]

Mike Schilling said:
<snip>
It also says it's a compile-time error if a constant expression evaluates to
a negative, but is silent about what happens if it evaluates to a number
that's too large. In fact, nothing I can see in the spec refers to the
largest possible size of an array being limited to what fits in an int.
Apparently the language has no such restriction.

Having looked at the generated code, it looks like it will overflow on
a 32-bit system, but not on a 64-bit system. A conv.ovf.i instruction is
used, which converts to a native int and throws an exception if the
value is too large to be stored in a native int.
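
A rough C# analogue of what conv.ovf.i does, using IntPtr as a stand-in
for native int (my sketch, not the actual generated code):

C#
long size = 3000000000;
IntPtr native = (IntPtr)size;
// IntPtr's explicit conversion from long throws System.OverflowException
// when the value doesn't fit the native word: it fails in a 32-bit
// process and succeeds in a 64-bit one, much like conv.ovf.i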

So, I think you could look at this in two ways. VB.NET stops you making
silly mistakes on 32-bit systems, but prevents you from creating arrays
of more than 2GB on 64-bit systems. C# does the reverse. Personally, on
this matter I think I prefer the VB.NET approach.

I'll raise this as a point of ambiguity in the spec though.
 

Willy Denoyette [MVP]

| <snip>
|
| So, I think you could look at this in two ways. VB.NET stops you making
| silly mistakes on 32-bit systems, but prevents you from creating arrays
| of more than 2GB on 64-bit systems. C# does the reverse. Personally, on
| this matter I think I prefer the VB.NET approach.
|

Managed arrays (just like any other object) are limited to 2GB anyway, also
on 64-bit Windows.
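
A quick way to see that ceiling (a sketch; the exact failure point also
depends on how much memory is actually free):

C#
try
{
    byte[] big = new byte[Int32.MaxValue]; // ~2GB of element data
    Console.WriteLine("allocated");
}
catch (OutOfMemoryException)
{
    // thrown even on 64-bit Windows with plenty of free RAM,
    // because of the per-object 2GB limit
    Console.WriteLine("refused");
}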

Willy.
 

Jon Skeet [C# MVP]

Willy Denoyette said:
| So, I think you could look at this in two ways. VB.NET stops you making
| silly mistakes on 32-bit systems, but prevents you from creating arrays
| of more than 2GB on 64-bit systems. C# does the reverse. Personally, on
| this matter I think I prefer the VB.NET approach.

Managed arrays (just like any other object) are limited to 2GB anyway, also
on 64-bit Windows.

Hmm... that may be a limitation of .NET rather than the CLI spec
though. The CLI spec states that the newarr instruction takes a number
of elements which is of type native int - and doesn't specify any
limit.

It could be that while current implementations have more restrictions,
a future implementation may be able to create larger arrays.

Of course, it could equally be two different people on the spec writing
team who didn't talk to each other quite enough...
 

Willy Denoyette [MVP]

| > <snip>
|
| Hmm... that may be a limitation of .NET rather than the CLI spec
| though. The CLI spec states that the newarr instruction takes a number
| of elements which is of type native int - and doesn't specify any
| limit.
|

Actually it says "native int or int32", which is rather confusing IMO, and I
noticed that it's an int32 even on the 64-bit CLR.


| It could be that while current implementations have more restrictions,
| a future implementation may be able to create larger arrays.
|

Sure, it's a .NET limitation; it's possible that the next version of the
CLR will support a larger value, but as far as I know nothing like this
has been announced publicly.

| Of course, it could equally be two different people on the spec writing
| team who didn't talk to each other quite enough...
|

I don't think so. IMO the limitation is a conscious design decision. Imagine
what happens on a system when a single application allocates a single 8GB
array (contiguous memory) and starts to access it in a sparse/random order;
you'll end up in a world of pain unless you have a ton of physical memory
available.

Willy.
 

Jon Skeet [C# MVP]

Willy Denoyette said:
| Hmm... that may be a limitation of .NET rather than the CLI spec
| though. The CLI spec states that the newarr instruction take a number
| of elements which is of type native int - and doesn't specify any
| limit.

Actually it says "native int or int32", which is rather confusing IMO, and I
noticed that it's an int32 even on the 64-bit CLR.

Perhaps we're looking at different specs, or different places? I was
looking at partition 3 of the ECMA spec, in the definition of newarr.
It's rather odd.

| It could be that while current implementations have more restrictions,
| a future implementation may be able to create larger arrays.

Sure, it's a .NET limitation; it's possible that the next version of the
CLR will support a larger value, but as far as I know nothing like this
has been announced publicly.

Right. My guess is that in 10 years' time the limitation might seem
somewhat severe - although I would have thought that the spec could
have been expanded at that time.

| Of course, it could equally be two different people on the spec writing
| team who didn't talk to each other quite enough...

I don't think so. IMO the limitation is a conscious design decision. Imagine
what happens on a system when a single application allocates a single 8GB
array (contiguous memory) and starts to access it in a sparse/random order;
you'll end up in a world of pain unless you have a ton of physical memory
available.

But that situation may well be reasonably common in 10 years.

Put it this way - future expansion is the only reason I can see for
array lengths being allowed to be longs in C#.
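
A small supporting sketch: the BCL already exposes 64-bit lengths, even
though no current array can need them.

C#
int[] arr = new int[1000];
long len = arr.LongLength;       // 64-bit counterpart of Length
long dim = arr.GetLongLength(0); // 64-bit counterpart of GetLength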
 

Willy Denoyette [MVP]

| > | Hmm... that may be a limitation of .NET rather than the CLI spec
| > | though. The CLI spec states that the newarr instruction take a number
| > | of elements which is of type native int - and doesn't specify any
| > | limit.
| >
| > Actually it says "native int or int32", which is rather confusing IMO,
| > and I noticed that it's an int32 even on the 64-bit CLR.
|
| Perhaps we're looking at different specs, or different places? I was
| looking at partition 3 of the ECMA spec, in the definition of newarr.
| It's rather odd.
|

From ECMA-335 3rd Ed. / June 2005

Partition III

4.20 newarr - .....
....
The newarr instruction pushes a reference to a new zero-based,
one-dimensional array whose elements are of type etype, a metadata token (a
typeref, typedef or typespec; see Partition II). numElems (of type native
int or int32) specifies the number of elements in the array. Valid array
indexes are 0 ≤ index < numElems ...


| <snip>
|
| But that situation may well be reasonably common in 10 years.
|
Well I don't believe so, but I could be wrong :-(.
I remember back in 1995 DEC said that at the end of the century 30% of all
the servers/desktops would be equipped with one or more 64-bit processors
with at least 16GB of RAM, running a 64-bit OS. Six years later the world
looks more conservative, with less than 10% market share (estimates) for
64-bit HW and an average of 8GB of RAM.


| Put it this way - future expansion is the only reason I can see for
| array lengths being allowed to be longs in C#.
|

Agreed, but I would be happy if they first relaxed the 2GB restriction; that
way we would be able to create 2^31 * sizeof(long) arrays, or 16GB, without
needing to change the CLR data structures.
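
The arithmetic behind that 16GB figure, as a quick sketch (my numbers):

C#
long maxElems = 1L << 31;                 // 2^31 = 2,147,483,648 elements
long maxBytes = maxElems * sizeof(long);  // x 8 bytes = 17,179,869,184
Console.WriteLine(maxBytes / (1L << 30)); // prints 16 (GB)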


Willy.
 

Jon Skeet [C# MVP]

Willy Denoyette said:
| Perhaps we're looking at different specs, or different places? I was
| looking at partition 3 of the ECMA spec, in the definition of newarr.
| It's rather odd.

From ECMA-335 3rd Ed. / June 2005, Partition III, 4.20 newarr:
<snip> numElems (of type native int or int32) specifies the number of
elements in the array. <snip>

Ah, interesting - same bit, different version. I'm looking at the 2002
version. That "or" is really confusing - I have no idea what it means.

| But that situation may well be reasonably common in 10 years.
|
Well I don't believe so, but I could be wrong :-(.
I remember back in 1995 DEC said that at the end of the century 30% of all
the servers/desktops would be equipped with one or more 64-bit processors
with at least 16GB of RAM, running a 64-bit OS. Six years later the world
looks more conservative, with less than 10% market share (estimates) for
64-bit HW and an average of 8GB of RAM.

Of course DEC was trying to sell Alphas at the time :)

While it's true that we're not in a situation where more than 8GB is
*common*, it's starting to happen every so often, and not only in
massive organisations. Machines with 1 or 2GB are more common for
consumers than they were - certainly for developers, and things do tend
to gradually push upwards.

Of course, there's the ever-tantalising prospect of fast, massive,
cheap static memory - the "1TB on a credit card sized form factor for
$50" promise. I'll believe it when I see it - but if it ever *does*
happen, computing will change drastically...

| Put it this way - future expansion is the only reason I can see for
| array lengths being allowed to be longs in C#.

Agreed, but I would be happy if they first relaxed the 2GB restriction; that
way we would be able to create 2^31 * sizeof(long) arrays, or 16GB, without
needing to change the CLR data structures.

Right.
 

Willy Denoyette [MVP]

| > | Perhaps we're looking at different specs, or different places? I was
| > | looking at partition 3 of the ECMA spec, in the definition of newarr.
| > | It's rather odd.
| >
| > From ECMA-335 3rd Ed. / June 2005, Partition III, 4.20 newarr <snip>
|
| Ah, interesting - same bit, different version. I'm looking at the 2002
| version. That "or" is really confusing - I have no idea what it means.
|
Nor do I.

| > <snip>
|
| Of course DEC was trying to sell Alphas at the time :)
|
Yep, not that we expected to take that 30% with Alpha (our estimate was 6%),
but their forecasts were backed by Gartner's.

| While it's true that we're not in a situation where more than 8GB is
| *common*, it's starting to happen every so often, and not only in
| massive organisations. Machines with 1 or 2GB are more common for
| consumers than they were - certainly for developers, and things do tend
| to gradually push upwards.
|
| Of course, there's the ever-tantalising prospect of fast, massive,
| cheap static memory - the "1TB on a credit card sized form factor for
| $50" promise. I'll believe it when I see it - but if it ever *does*
| happen, computing will change drastically...
|

True, however we must not forget that we are talking about single arrays of
2GB, so you can have several of these monsters in a single AD (AppDomain)
and multiple ADs per process, and that can become a real issue even on
64-bit if you don't set a limit. One of the major problems we encounter now
(on 64-bit) is the overuse of XML and of self-expanding ArrayLists and
generic Lists in server applications, growing beyond available HW memory,
just because "they are so easy to use, sir". OOM exceptions aren't thrown
any longer, but oh, the performance drops dramatically and developers don't
understand why. So IMO it's good to have some limits, it makes people think,
but I guess it's me getting old ;-).

Willy.
 
