Why does it generate different results using ASCIIEncoding.GetChars(..) and ASCIIEncoding.GetString(..)?

김재황

When you use the ASCIIEncoding class, its GetChars(..) method decodes any byte higher than 0x7F as '?'.
But its GetString(..) method decodes such a byte as (char)(0x7F & bytes[index]).
Why?

Why does it generate different results using ASCIIEncoding.GetChars(..) and
ASCIIEncoding.GetString(..)?
Why doesn't it generate the same result?


Source>>

Encoding e = Encoding.ASCII;

// "가" is U+AC00, so UTF-16LE yields the two bytes 0x00, 0xAC.
byte[] b = Encoding.Unicode.GetBytes("가");

// GetChars: 0xAC is above 0x7F, so it comes back as '?'.
foreach (char c in e.GetChars(b))
    Console.Write(c);

Console.WriteLine();

// GetString: 0xAC comes back as (char)(0xAC & 0x7F), i.e. ','.
Console.WriteLine(e.GetString(b));

result>>
?
,
Press any key to continue
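
For what it's worth, here is a small standalone sketch (written against the current Encoding API, where both methods now use the same '?' replacement fallback) showing where the two output characters come from: the old GetChars path replaced the high byte 0xAC with '?', while the old GetString path masked it, and 0xAC & 0x7F is 0x2C, which is ','.

using System;
using System.Text;

class MaskingDemo
{
    static void Main()
    {
        // "가" is U+AC00; UTF-16LE gives the bytes 0x00, 0xAC.
        byte[] b = Encoding.Unicode.GetBytes("가");
        Console.WriteLine(BitConverter.ToString(b)); // 00-AC

        byte high = b[1];                          // 0xAC

        // What the old GetString reportedly did: mask to 7 bits.
        char masked = (char)(high & 0x7F);         // 0x2C == ','
        Console.WriteLine(masked);                 // ,

        // What the old GetChars reportedly did: substitute '?'.
        char replaced = high > 0x7F ? '?' : (char)high;
        Console.WriteLine(replaced);               // ?
    }
}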
 

Jon Skeet [C# MVP]

김재황 said:
When you use the ASCIIEncoding class, its GetChars(..) method decodes any byte higher than 0x7F as '?'.
But its GetString(..) method decodes such a byte as (char)(0x7F & bytes[index]).
Why?

It looks like GetChars is basically a bit more rigorous. I'd noticed
GetString being pretty lax before, but not that GetChars does the right
thing. Interesting...
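
If the difference matters in practice, later versions of .NET (2.0 onwards, so after this thread) let you pin the behaviour down explicitly with encoder/decoder fallbacks. A minimal sketch, assuming the System.Text fallback API:

using System;
using System.Text;

class FallbackDemo
{
    static void Main()
    {
        // An ASCII encoding that always substitutes '?' for bytes
        // above 0x7F, on both the encode and the decode side.
        Encoding strictAscii = Encoding.GetEncoding(
            "us-ascii",
            new EncoderReplacementFallback("?"),
            new DecoderReplacementFallback("?"));

        byte[] b = Encoding.Unicode.GetBytes("가"); // 0x00, 0xAC

        // With an explicit fallback, GetChars and GetString agree:
        // both produce "\0?" (the NUL is invisible on the console).
        Console.WriteLine(new string(strictAscii.GetChars(b)));
        Console.WriteLine(strictAscii.GetString(b));
    }
}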
 
