Kim Jae-hwang (김재황)
When I use the ASCIIEncoding class, its GetChars(..) method decodes bytes[index] as '?' whenever the value is higher than 0x7F.
But its GetString(..) method decodes the same byte as 0x7F & bytes[index].
Why?
Why do ASCIIEncoding.GetChars(..) and ASCIIEncoding.GetString(..) generate different results instead of the same one?
Source>>
Encoding e = Encoding.ASCII;
byte[] b = Encoding.Unicode.GetBytes("가");
foreach (char c in e.GetChars(b))
    Console.Write(c);
Console.WriteLine();
Console.WriteLine(e.GetString(b));
result>>
?
,
Press any key to continue
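For what it's worth, the second line of output can be explained by the masking I described: "가" is U+AC00, and Encoding.Unicode (UTF-16 little-endian) turns it into the bytes { 0x00, 0xAC }. 0xAC is above 0x7F, and 0xAC & 0x7F is 0x2C, which is the comma that GetString(..) prints. A minimal sketch of that arithmetic (this only reproduces the masking, not ASCIIEncoding's internals):

```csharp
using System;
using System.Text;

class MaskDemo
{
    static void Main()
    {
        // "가" is U+AC00; Encoding.Unicode is UTF-16 little-endian,
        // so the encoded byte array is { 0x00, 0xAC }.
        byte[] b = Encoding.Unicode.GetBytes("\uAC00");
        Console.WriteLine("{0:X2} {1:X2}", b[0], b[1]);  // prints "00 AC"

        // 0xAC is outside the 7-bit ASCII range, so masking with 0x7F
        // keeps only the low seven bits: 0xAC & 0x7F == 0x2C == ','.
        char masked = (char)(0xAC & 0x7F);
        Console.WriteLine(masked);                       // prints ","
    }
}
```

(Newer .NET versions appear to make both methods fall back to '?' for non-ASCII bytes, so the discrepancy may be specific to the framework version I am using.)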