skeamy
Hi everybody,
I'm currently converting an application from VB6 to C# which contains
logic to convert EBCDIC to ASCII and vice versa. I have a test app
which reads some previously stored EBCDIC from a text file and
converts it. I have finally got it all working (Encoding.Default was
the turning point when reading the file).
I am now trying to remove all references to the Microsoft.VisualBasic
and Compatibility namespaces and use the C# System namespace
equivalents (purely for my own experience; I am not starting a debate
about this). I have everything changed except a packed date routine
which uses Strings.Asc and uses the last char of the packed field to
indicate whether it's a positive or negative number. I have isolated
the problem to a few lines of code and wonder if someone can explain
what is happening:
char cTest = (char)174;
MessageBox.Show(((int)cTest).ToString());        // Returns 174 - works OK
MessageBox.Show(Strings.Asc(cTest).ToString());  // Returns 174 - works OK

char cTest1 = (char)338;
MessageBox.Show(((int)cTest1).ToString());       // Returns 338 - doesn't work
MessageBox.Show(Strings.Asc(cTest1).ToString()); // Returns 140 - works OK

char cTest2 = (char)337;
MessageBox.Show(((int)cTest2).ToString());       // Returns 337 - doesn't work
MessageBox.Show(Strings.Asc(cTest2).ToString()); // Returns 111 - works OK
I was led to believe that the VB6 function Asc() returned the
equivalent of (int)char in C#, but I'm missing something. Does anyone
know how Strings.Asc is achieving its (working) results?
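For context, Strings.Asc does not return the Unicode code point of the char: for values above 127 it converts the char to the system's ANSI code page and returns the resulting byte value, whereas the C# cast (int)char gives the raw Unicode code point. A rough Python sketch of that mapping, assuming the ANSI code page is Windows-1252 (the usual Western default), which reproduces the 174 and 140 results above. The 111 result for (char)337 comes from a Windows "best-fit" substitution to 'o', which Python's strict codec does not perform, so that case is only noted in a comment:

```python
# Sketch of what Strings.Asc appears to do for chars above 127:
# map the Unicode char to the system ANSI code page (assumed to be
# Windows-1252 here) and return the byte value, not the code point.

def asc_like(ch: str) -> int:
    """Approximate Strings.Asc under an assumed Windows-1252 code page."""
    return ch.encode("cp1252")[0]

print(asc_like(chr(174)))  # 174 -- U+00AE is the same byte value in cp1252
print(asc_like(chr(338)))  # 140 -- U+0152 (OE ligature) is byte 0x8C in cp1252
# chr(337) (U+0151) has no exact cp1252 byte; Windows falls back to a
# "best-fit" mapping to 'o' (111), which the strict codec above would
# reject with a UnicodeEncodeError rather than substitute.
```

This is why (int)cTest1 gives 338 (the Unicode code point) while Strings.Asc gives 140 (the code-page byte): both are "correct", they just answer different questions.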
Thanks
Please reply in this thread not to email