Why is 0xff000000 a long?

CedricCicada

Greetings!

I am trying to write a color class that is a bit more intelligent than
C#'s System.Drawing.Color structure. This means that I'm using a bunch
of bitmasks. My class has the following method:

public static HTColor FromArgb(int argb)
{
    int colorValue = 0;
    int transparency = (argb & 0xff000000) >> 24;
    int red = (argb & 0xff0000) >> 16;
    int green = (argb & 0x00ff00) >> 8;
    int blue = argb & 0x0000ff;
    HTColor result;
    result.m_netColor = Color.FromArgb(red, green, blue);
    return result;
}

The transparency line is throwing an error claiming that I cannot
implicitly convert an object of type 'long' to type 'int'.

But the C# standard for literals includes the following:

"1 The type of an integer literal is determined as follows:
2 If the literal has no suffix, it has the first of these types in
which its value can be represented: int, uint, long, ulong.
3 If the literal is suffixed by U or u, it has the first of these types
in which its value can be represented: uint, ulong.
4 If the literal is suffixed by L or l, it has the first of these types
in which its value can be represented: long, ulong.
5 If the literal is suffixed by UL, Ul, uL, ul, LU, Lu, lU, or lu, it
is of type ulong."
(quoted from
http://www.jaggersoft.com/csharp_standard/9.4.4.2.htm#integer-type-suffix)

Since ints and uints are 32 bits wide and 0xff000000 fits in 32 bits,
it seems to me that this literal should be evaluated as an int.

I get the same error if my literal is 0x8f000000, and I don't get the
error if my literal is 0x7f000000.

Thank you very much.

Rob Richardson
RAD-CON, Inc.

P.S. Yes, I know "transparency" is never used. I'm not sure if I even
want it, but I do want to understand the literal type issue it
illustrates.
 
John J. Hughes II

The problem is assigning the result to transparency, not converting
0xff000000 to long. Try:

long transparency = (argb & 0xff000000) >> 24;

and I would have a tendency to do this so everything is the same type:

long transparency = ((long)argb & 0xff000000) >> 24;
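If you do want transparency back as an int, a minimal sketch of that approach (the sample ARGB value is illustrative, not from the original post) is to do the arithmetic in long and then narrow, which is safe because the masked, shifted value is at most 0xff:

```csharp
int argb = unchecked((int)0xFF112233); // sample value; unchecked lets the literal fit in int

// Mask and shift in long, then narrow: the result is 0..255,
// so the cast back to int cannot lose information.
int transparency = (int)(((long)argb & 0xff000000) >> 24);

System.Console.WriteLine(transparency); // 255 for the sample value above
```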

Regards,
John
 
Barry Kelly

int transparency = (argb & 0xff000000) >> 24;
The transparency line is throwing an error claiming that I cannot
implicitly convert an object of type 'long' to type 'int'.

Mixing 'int' and 'uint' results in 'long' to avoid treating a signed
type as an unsigned type, or vice versa (as this would result in
changing the semantic value stored in the variable).
But the C# standard for literals includes the following:

"1 The type of an integer literal is determined as follows:
2 If the literal has no suffix, it has the first of these types in
which its value can be represented: int, uint, long, ulong.
3 If the literal is suffixed by U or u, it has the first of these types
in which its value can be represented: uint, ulong.
4 If the literal is suffixed by L or l, it has the first of these types
in which its value can be represented: long, ulong.
5 If the literal is suffixed by UL, Ul, uL, ul, LU, Lu, lU, or lu, it
is of type ulong."
(quoted from
http://www.jaggersoft.com/csharp_standard/9.4.4.2.htm#integer-type-suffix)

Since ints and uints are at least 32 bits long and 0xff000000 is 32
bits long, it seems to me that this literal should be evaluated as an
int.

0xff000000 is not negative (it has no leading -), therefore it cannot be
an int (the highest int is 0x7fffffff). However, it can be a uint - and
that is in fact its type. Try this:

uint x = 0xff000000;

You'll get no error.
I get the same error if my literal is 0x8f000000, and I don't get the
error if my literal is 0x7f000000.

That's because 0x8f000000 is larger than the largest positive int, while
0x7f000000 is smaller than the largest positive int.

Beware mixing signed and unsigned.
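Barry's point can be demonstrated in a few lines (the variable names and sample value here are illustrative, not from the original code):

```csharp
uint mask = 0xff000000;            // literal exceeds int.MaxValue, so its type is uint
// int bad = 0xff000000;           // compile error: no implicit conversion from uint to int

int argb = unchecked((int)0x80FF4080); // sample ARGB value with the high bit set
long mixed = argb & mask;          // int & uint: both operands are promoted to long
int alpha = (int)(mixed >> 24);    // safe narrowing: the masked value is 0..255

System.Console.WriteLine(alpha);   // 128 (0x80) for the sample value
```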

-- Barry
 
Ben Voigt

Greetings!

I am trying to write a color class that is a bit more intelligent than
C#'s System.Drawing.Color structure. This means that I'm using a bunch
of bitmasks. My class has the following method:

public static HTColor FromArgb(int argb)
{
    int colorValue = 0;
    int transparency = (argb & 0xff000000) >> 24;
    int red = (argb & 0xff0000) >> 16;
    int green = (argb & 0x00ff00) >> 8;
    int blue = argb & 0x0000ff;
    HTColor result;
    result.m_netColor = Color.FromArgb(red, green, blue);
    return result;
}

Shift first, then mask, to avoid evil sign-extension.
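That is, reorder the operations so every operand stays an int and the mask is applied after the shift (a minimal sketch of this approach with an illustrative sample value; the sign bits that >> smears in are discarded by the & 0xff):

```csharp
int argb = unchecked((int)0x80FF4080); // sample ARGB value with the high bit set

// argb >> 24 sign-extends for negative values, but & 0xff throws those
// bits away, so each channel comes out in 0..255 and no long is involved.
int transparency = (argb >> 24) & 0xff;
int red   = (argb >> 16) & 0xff;
int green = (argb >> 8)  & 0xff;
int blue  = argb & 0xff;

System.Console.WriteLine($"{transparency} {red} {green} {blue}"); // 128 255 64 128
```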
 
