Tony Johansson
Hi!
Here I encode the Spanish character "ñ" to UTF-8, which produces two bytes with the values 195 and 177; that much is understandable.
As we know, a char in C# is a Unicode (UTF-16) code unit, an unsigned 16-bit integer.
Now to my question: when I run this program, use the debugger, and hover over the ch variable, which is of type char, it shows 241.
Since a char is Unicode (UTF-16), and this character takes two bytes when UTF-8 is used, how can the debugger show 241 when I hover over ch?
using System;
using System.Text;

class Program
{
    static void Main(string[] args)
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        string chars = "ñ";
        char ch = 'ñ';  // the variable I hover over in the debugger

        // GetBytes allocates the result array itself, so pre-sizing one
        // with GetByteCount and then reassigning it was redundant.
        byte[] byteArray = utf8.GetBytes(chars);  // { 195, 177 }

        Console.WriteLine(utf8.GetString(byteArray));  // prints: ñ
    }
}
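
For illustration, here is a minimal sketch (the class name is mine) that prints the numeric value of the char next to its UTF-8 bytes, so the 241 and the 195/177 appear side by side:

using System;
using System.Text;

class CharVsUtf8
{
    static void Main()
    {
        char ch = 'ñ';
        Console.WriteLine((int)ch);  // 241, the UTF-16 code unit for U+00F1

        foreach (byte b in Encoding.UTF8.GetBytes("ñ"))
            Console.WriteLine(b);    // 195, then 177: the UTF-8 encoding
    }
}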
//Tony