Best way to convert byte array (from TCP socket) to C# Object?

Gator

Hi All,
Basically my situation is this: I have a server implemented in C++,
unmanaged code, using a proprietary protocol over TCP/IP to communicate
with the client (also C++).
Now I am implementing another client in C#. The server side cannot be
changed :(
So I am using TCP/IP sockets. In the end I get a byte[] on the client
side, which originally was some C++ class object converted to a byte
array.

What is the most efficient way to convert that byte array to a C#
object?
I've checked a few sources; writing a custom serializer doesn't seem
to be a good idea, as the serializer will receive a stream object,
which only supports reading raw bytes.

So far the best I've come up with is to create a BinaryReader on top
of the NetworkStream, which in turn is created from the socket. It
more or less works, but string reading is a hassle, because the server
won't send strings in the format BinaryReader expects, so I have to
create two additional objects.

Is there anything that can operate on a byte[] to read simple types
from it based on offset and length? I.e. interpret bytes as a type?
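
A minimal sketch of exactly that kind of offset-based reading, using
System.BitConverter and System.Text.Encoding; the field layout, the
little-endian byte order and the ASCII encoding are assumptions for
illustration:

    using System;
    using System.Text;

    class MessageParser
    {
        // The layout (4-byte id, 2-byte flags, 4-byte length, then
        // ASCII text) is invented for the example. BitConverter uses
        // the machine's byte order, which is little-endian on x86.
        public static void Parse(byte[] buffer)
        {
            int id = BitConverter.ToInt32(buffer, 0);
            short flags = BitConverter.ToInt16(buffer, 4);
            int nameLength = BitConverter.ToInt32(buffer, 6);
            string name = Encoding.ASCII.GetString(buffer, 10, nameLength);
            Console.WriteLine("{0} {1} {2}", id, flags, name);
        }
    }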
 

Fergus Cooney

Hi Gator,

Without putting much thought into it (I'm finishing for the day),
have you considered doing it in "unsafe" C# using pointers to your
heart's content? Being C++ers, you'll know what you're doing.
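
For illustration, a minimal sketch of that unsafe-pointer approach;
it requires compiling with /unsafe, and the message layout is invented
for the example:

    using System;

    class UnsafeParser
    {
        // Reinterpret fixed-size fields directly through pointers.
        // Needs the /unsafe compiler switch; layout is an assumption.
        public static unsafe void Parse(byte[] buffer)
        {
            fixed (byte* p = buffer)
            {
                int id = *(int*)p;                // 4 bytes at offset 0
                double value = *(double*)(p + 4); // 8 bytes at offset 4
                Console.WriteLine("{0} {1}", id, value);
            }
        }
    }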

Out of curiosity - am I right in thinking that you want to import
a C++ object into C# and have it become a managed object?

Regards,
Fergus
 

Gator

Hi Fergus,
> you considered doing it in "unsafe" C# using pointers to your heart's
> content? Being C++ers, you'll know what you're doing.

Nopers, I haven't considered that, actually. If I could, I'd rather do
it safe ;)

> Out of curiosity - am I right in thinking that you want to import
> a C++ object into C# and have it become a managed object?

Hmm, I guess you could say so. In short, both ends of the socket are
dealing with message class objects. On send, the class is converted to
a byte buffer; on receive, it has to be restored from the byte buffer.
Now, if it was C# on both ends it'd be all dandy: serialize and be
done with it.
Because of the C++/C# combo I have to go through these contortions.
 

Gator

Chad Myers said:
> From what I understand, you definitely DO NOT want to try
> to get it back into the same C++ object on the client side
> (using some unmanaged DLL), right? You want to create some
> type of managed .NET object (C# is just a language, remember :)

Yes.

> that closely resembles the C++ object, right?

Yes.

> I don't see why you wouldn't want to use that (or use a byte[]
> like you said, since BinaryReader does it the way you're asking,
> just from a stream of bytes rather than a fixed byte[]).

Well, with ints and other numbers it's fine. However, reading a string
from that stream is a problem: I have to copy bytes from the stream
into a byte array and then convert them to a string, because the
string wasn't serialized by the C++ side in the form C# expects.

> I assume it's a null-terminated string. You'll have to create a
> MemoryStream and copy the bytes (byte-by-byte) from the stream

Actually, BinaryReader has a ReadBytes function which can read
multiple bytes in one shot, directly into a byte array, which can then
be converted to a string.
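
Putting that together, a minimal sketch: this assumes the server
writes a 4-byte length prefix followed by raw ASCII bytes (both are
assumptions about the wire format), and "reader" is the BinaryReader
over the NetworkStream:

    using System.Text;

    int length = reader.ReadInt32();        // assumed 4-byte prefix
    byte[] raw = reader.ReadBytes(length);  // one shot, no byte-by-byte copy
    string text = Encoding.ASCII.GetString(raw);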

Thanks, looks like there's no better way.
 

Fergus Cooney

Hi Gator,

|| > you considered doing it in "unsafe" C# using pointers to your
|| > heart's content? Being C++ers, you'll know what you're doing.
||
|| Nopers, I haven't considered that, actually. If I could, I'd rather
|| do it safe ;)

I hear you on that, buddy, though I think "unsafe" is Microsoft's
word for "keep away kiddies". You don't strike me as a kiddy, lol.
Perhaps you could use it just for the strings.

Good luck, whichever way,
Fergus
 

Chad Myers

Gator said:
> Pardon, forgot to mention that the string has a length prefix.

Hrm, according to the docs, BinaryReader.ReadString()...
"Reads a string from the current stream. The string is
prefixed with the length, encoded as an integer seven bits
at a time."

I'm not sure what they mean by the 7-bits part...I thought
ASCII was encoded with 8 bits, but the 8th bit was never
used.

Do they expect the bits to be packed or to be 1-bit
padded to fit on the byte boundary?

-c
 

Jon Skeet

Chad Myers said:
> Hrm, according to the docs, BinaryReader.ReadString()...
> "Reads a string from the current stream. The string is
> prefixed with the length, encoded as an integer seven bits
> at a time."
>
> I'm not sure what they mean by the 7-bits part... I thought
> ASCII was encoded with 8 bits, but the 8th bit was never
> used.
>
> Do they expect the bits to be packed or to be 1-bit
> padded to fit on the byte boundary?

Nope - it's the *length* which is encoded as an integer 7 bits at a
time, not the string. I believe it's basically a system where you take
the bits of the length, divide them into blocks of 7 (instead of the
typical 8) and use the top bit of each byte to say whether there's
another byte to come. So, for lengths < 128, you end up with a single
byte which is the length itself. Lengths from 128 to 16383 would be
encoded in two bytes (but without knowing the endianness I don't know
exactly how it would work out), etc.
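
A minimal sketch of a decoder for the scheme Jon describes; in the
framework's format the low-order 7-bit group comes first, and the high
bit of each byte is the continuation flag:

    using System.IO;

    // Reads an integer stored 7 bits per byte, low-order group first;
    // the top bit of each byte says whether another byte follows.
    static int Read7BitEncodedLength(BinaryReader reader)
    {
        int value = 0;
        int shift = 0;
        byte b;
        do
        {
            b = reader.ReadByte();
            value |= (b & 0x7F) << shift;
            shift += 7;
        } while ((b & 0x80) != 0);
        return value;
    }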
 
