Marshalling C# Strings to VC++ 6.0


Scott Ballard

Greetings,

I'm having difficulty marshalling strings in a C# COM server
(VS2008, .NET Framework 2.0) back to a VC++ 6.0 client application.
The C++ application wants to consume a single byte per character
string, but the C# application is sending it back as a two byte per
character string. Fundamentally I know what the problem is (C# strings
are Unicode); I just don't know where or how to inject the code to
fix it. Unfortunately I can't touch the C++ application; it must
remain unchanged. It was written against a COM interface that defines
the method like this in the IDL:

HRESULT Foo([in] BSTR incoming, [out] BSTR* outgoing);

In C# the interface is defined like this:

void Foo([In, MarshalAs(UnmanagedType.BStr)] string incoming,
    [MarshalAs(UnmanagedType.BStr)] out string outgoing);

I've tried different MarshalAs types and an ICustomMarshaler for the
"outgoing" string to no avail (I can provide additional details if
needed). The odd thing is the C# COM server has no trouble reading
the "incoming" string from C++. I'm really hoping someone can give me
a pointer or two on how to do this. Thank you very much for your
help.

Regards,

Scott B.
 

Konrad Neitzel

Hi Scott!

Hmm ... UnmanagedType.BStr is a Unicode character string. I would use
LPStr instead of BStr.
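
Something like this is what I mean - untested, and only the attribute on the
outgoing parameter changes, since reading the incoming string already works
for you:

void Foo([In, MarshalAs(UnmanagedType.BStr)] string incoming,
    [MarshalAs(UnmanagedType.LPStr)] out string outgoing);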

Maybe that change would be enough already?

With kind regards,

Konrad
 

Konrad Neitzel

Hi Scott!

Just some more information - my C++ days are quite far in the past
already, but I hope that I remember correctly.
HRESULT Foo([in] BSTR incoming, [out] BSTR* outgoing);

But if this function does not want Unicode - why does it use BSTR? I am
just wondering. "BSTRs are wide, double-byte (Unicode) strings on 32-bit
Windows platforms and narrow, single-byte strings on the Apple PowerMac."
(From MSDN!)

So maybe the implementation of the program itself is not that clean (or
running on an Apple PowerMac).
In C# the interface is defined like this:
void Foo([In, MarshalAs(UnmanagedType.BStr)] string incoming,
    [MarshalAs(UnmanagedType.BStr)] out string outgoing);
Hmm ... UnmanagedType.BStr is a Unicode character string. I would use
LPStr instead of BStr.
Maybe that change would be enough already?


From my point of view, changing the type to LPStr should make sure that
you no longer get a Unicode string, so it could fix the problem.

With kind regards,

Konrad
 

Scott Ballard

Hi Konrad,

Thank you for the quick response. I did try MarshalAs.LPStr, but the
C++ client application complained "the stub received bad data."
Remember the C++ application was programmed against the IDL that
specified a BSTR. Therefore, it's expecting a BSTR and receives an
LPStr so I'm not too surprised it didn't work.

I read the same documentation as you about the BSTR being two byte
wide characters on Windows. The C++ application is definitely running
on Windows. Did that same definition of BSTR apply ten years ago in
VC++ 6.0 when the C++ application was written?

The whole point of this exercise is to replace the old VC++6.0 COM
server with a modern C# COM Server, without touching the old VC++6.0
client application. If I run the C++ client application in a debugger
I can definitely see the old C++ COM server returns the string as one
byte wide characters (despite the IDL defining BSTR type). However,
when I replace the C++ COM server with C# COM server and run the C++
client application in the debugger I can see the new C# COM Server is
returning two byte wide characters. There has to be a way to get a
one byte wide character string into a BSTR, unless I'm thinking about
this in the wrong way.

Regards,

Scott B.
 

Doug Forster

I see the UnmanagedType enum has an AnsiBStr value. Perhaps that will do
what you want?
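
Untested sketch of the change I mean, leaving the rest of the signature as
it is:

void Foo([In, MarshalAs(UnmanagedType.BStr)] string incoming,
    [MarshalAs(UnmanagedType.AnsiBStr)] out string outgoing);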

Cheers
Doug Forster
 

Scott Ballard

This is solved. It turns out the old C++ code was using
SysAllocStringByteLen to create an ANSI BSTR (one byte per
character). I modified the new C# code to P/Invoke
SysAllocStringByteLen and everything works fine now. I never thought
of jamming the ANSI output of SysAllocStringByteLen into a C# string.
The C# string looks strange in the debugger (since it reads two bytes
per character and the ANSI string is only one byte per character), but
it works!
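
In case it helps anyone else, here is roughly what the working method looks
like now. This is a simplified sketch from memory - the class name, the reply
string and the omitted COM attributes are illustrative; SysAllocStringByteLen,
SysFreeString and the Marshal/Encoding calls are the actual APIs involved.

using System;
using System.Runtime.InteropServices;
using System.Text;

// P/Invoke declarations for the two oleaut32 calls used below.
internal static class NativeMethods
{
    // SysAllocStringByteLen copies 'len' raw bytes into a new BSTR, so the
    // payload can be a one-byte-per-character ANSI string even though the
    // type is BSTR.
    [DllImport("oleaut32.dll", CharSet = CharSet.Ansi)]
    internal static extern IntPtr SysAllocStringByteLen(string psz, uint len);

    [DllImport("oleaut32.dll")]
    internal static extern void SysFreeString(IntPtr bstr);
}

public class FooServer   // the COM-visible class implementing the interface
{
    public void Foo(string incoming, out string outgoing)
    {
        string reply = "echo: " + incoming;   // whatever the server computes

        // Build a BSTR whose payload is the ANSI bytes of 'reply'.
        IntPtr ansiBstr = NativeMethods.SysAllocStringByteLen(
            reply, (uint)Encoding.Default.GetByteCount(reply));
        try
        {
            // Re-read that BSTR as a managed string. Two ANSI bytes end up
            // packed into each char, so it looks strange in the debugger, but
            // when the interop layer marshals it back out as a BSTR the C++
            // client sees the one-byte-per-character string it expects.
            outgoing = Marshal.PtrToStringBSTR(ansiBstr);
        }
        finally
        {
            NativeMethods.SysFreeString(ansiBstr);
        }
    }
}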

Regards,

Scott B.
 
