Dennis Myrén
Hi.
Just some questions regarding unsigned (well, signed as well) primitive
types.
uint u = 123;
From MSDN:
"When an integer literal has no suffix, its type is the first of these types
in which its value can be represented: int, uint, long, ulong."
So does writing uint u = 123; mean that I first create an int
and then implicitly cast it to uint?
Would the statement above then incur twice the overhead
of just using an int?
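To make the question concrete, here is a small sketch of the declarations I am asking about (the comments reflect my reading of the MSDN rule, not a claim about what the compiler actually emits):

```csharp
// Contrasting unsuffixed and suffixed literals in uint declarations.
class LiteralDemo
{
    static void Main()
    {
        int i = 123;   // literal 123 is typed as int; no conversion involved
        uint u = 123;  // literal is an int, then (implicitly?) converted to uint
        uint v = 123U; // the U suffix makes the literal a uint directly

        System.Console.WriteLine(i + " " + u + " " + v);
    }
}
```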
A suffix may also be applied to the uint declaration above:
uint u = 123U;
If I use the 'U' suffix, am I then avoiding the implicit conversion
that would otherwise take place?
Finally, as we know, no suffix produces a ushort literal:
ushort u = 123U; // Compilation error.
So, when assigning a value to a ushort, is an implicit conversion
from int *always* performed?
If so, is declaring and initializing a ushort (which should be more
efficient than an int) actually more overhead than an int,
allocating sizeof(ushort) + sizeof(int) = 6 bytes rather than
sizeof(ushort) = 2?
Or does the compiler help us optimize this away?
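To illustrate the ushort case, here is a minimal sketch of what does and does not compile as far as I can tell (the failing line is commented out):

```csharp
// Probing assignment of literals to a ushort.
class UShortDemo
{
    static void Main()
    {
        ushort a = 123;          // ok: the int constant 123 fits in ushort
        // ushort b = 123U;      // error: cannot implicitly convert uint to ushort
        ushort c = (ushort)123U; // ok with an explicit cast

        System.Console.WriteLine(sizeof(ushort)); // 2
        System.Console.WriteLine(a + " " + c);
    }
}
```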