Using Color.FromArgb values with Convert.ToInt32


sp@k

I have this strange problem. The application I am working on imports
color values from Flash that look like this:

0x-f9fff6

On the script side, it uses a conversion routine to turn them into
Color.FromArgb values:

Color.FromArgb(&H78 + Convert.ToInt32(aPixels(k)))

k is the loop index. I added &H78 because I read somewhere that it has
to go in front of the hex value for Color.FromArgb to read it
properly, though Convert.ToInt32() might already be doing the same
thing. FYI, I also tried running it without the &H78 and it didn't
work. Whatever is happening in that Convert statement is breaking the
program.

This should work under normal circumstances, but it doesn't, which
brings me here. Please help.

Being "in front" means being in the most significant bits of the
32-bit number; the alpha byte occupies bits 24 through 31 of the ARGB
value, so it has to be shifted up, not added. Try this:

Color.FromArgb((&H78 << 24) Or Convert.ToInt32(aPixels(k)))
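
To see why the addition corrupts the color, take &HF9FFF6 as a
stand-in for one imported pixel value (the real values come from your
Flash data):

&H78 + &HF9FFF6            ' = &HFA006E: alpha stays 0 and the carry changes the R, G and B bytes
(&H78 << 24) Or &HF9FFF6   ' = &H78F9FFF6: alpha = &H78, RGB bits untouched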
 

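A minimal end-to-end sketch, assuming aPixels is an Integer array of
24-bit RGB values (the array name and loop index come from the
original post; the sample data and everything else here are
illustrative):

Imports System.Drawing

Module ColorImport
    Sub Main()
        ' Stand-in data: 24-bit RGB values as they might arrive from Flash.
        Dim aPixels() As Integer = New Integer() {&HF9FFF6, &H123456, &HFFFFFF}

        For k As Integer = 0 To aPixels.Length - 1
            ' Shift the alpha byte into bits 24-31, then Or in the RGB bits.
            Dim c As Color = Color.FromArgb((&H78 << 24) Or Convert.ToInt32(aPixels(k)))
            Console.WriteLine("{0}: A={1} R={2} G={3} B={4}", k, c.A, c.R, c.G, c.B)
        Next
    End Sub
End Module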