silliest \u question (decimal to unicode)

  • Thread starter: Andres A.

Andres A.

I have a bunch of Unicode characters stored as decimal numbers.
Is there an easy way of displaying Unicode from the decimal values, or do I
have to convert the decimal to hex and then display the hex?
I ran into a small problem converting decimal to hex and displaying the
result: after the conversion, I don't know how to prepend \u to the hex
value so that the compiler recognizes it as a Unicode escape rather than a
string that just contains \u.

I have tried string blah = @"\u" + ConvertToHex(1634); but of course this
does not work, because the compiler treats it as a plain string containing
\u, not a Unicode escape.

Any help would be appreciated.
 
Andres A. said:
I have a bunch of Unicode characters stored as decimal numbers.

Do you mean stored as the decimal type, or as decimal strings in a file
somewhere?

The integer types themselves are always in binary, effectively -
they're just numbers.

Converting an integer into a unicode character is very simple though:

char c = (char)myInteger;
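
To illustrate the cast above, here is a minimal sketch (the variable names are just for illustration). One caveat worth noting: a C# char is a single UTF-16 code unit, so the cast only works for code points up to U+FFFF; for values above that, char.ConvertFromUtf32 returns the surrogate pair as a string.

```csharp
using System;

class CastDemo
{
    static void Main()
    {
        int myInteger = 1634;        // decimal code point (0x662)
        char c = (char)myInteger;    // direct cast, fine up to U+FFFF

        Console.WriteLine(c);        // U+0662, ARABIC-INDIC DIGIT TWO

        // For code points above U+FFFF, a single char cannot hold the
        // value; char.ConvertFromUtf32 builds the surrogate pair for you:
        string s = char.ConvertFromUtf32(0x1F600);
        Console.WriteLine(s.Length); // 2 (a surrogate pair of chars)
    }
}
```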
 
Hi Andres,

Andres A. said:
I have a bunch of Unicode characters stored as decimal numbers.
Is there an easy way of displaying Unicode from the decimal values, or do I
have to convert the decimal to hex and then display the hex?
I ran into a small problem converting decimal to hex and displaying the
result: after the conversion, I don't know how to prepend \u to the hex
value so that the compiler recognizes it as a Unicode escape rather than a
string that just contains \u.

I have tried string blah = @"\u" + ConvertToHex(1634); but of course this
does not work, because the compiler treats it as a plain string containing
\u, not a Unicode escape.

Any help would be appreciated.

The unicode escape "\u" is only relevant for literal strings (i.e.
quoted string values that are embedded in the code). You can cast directly
from a numeric value (int, long, decimal, etc.) to a char value:

decimal d = 65;
char c = (char)d;

string blah = c.ToString();

Regards,
Daniel
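
Putting the answers above together, a small sketch of converting a whole list of stored decimal values into a displayable string (the input array here is made up for illustration; no \u escape is needed, since escapes only exist in source-code literals):

```csharp
using System;
using System.Linq;

class CodePoints
{
    static void Main()
    {
        // Hypothetical input: Unicode code points stored as decimal numbers.
        int[] decimals = { 72, 101, 1634 };

        // Cast each value to char and build a string from the results.
        string text = new string(decimals.Select(d => (char)d).ToArray());

        Console.WriteLine(text); // "He" followed by U+0662
    }
}
```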
 
Thank you both, Jon and Daniel.
I was aware that a char is really an integer, but I thought that didn't
apply to Unicode characters.
Thanks, it works great now.
 
