How to pass a VC++ BSTR value to C#/.NET

shamsh

I'm using a COM encryption method that encrypts (using RSA) a string into a
byte array, but this byte array has to be passed out as a BSTR and must be
available to C# code, where the encrypted text has to be decrypted back to
the original string.
Can anyone please let me know how to convert a byte array to a BSTR in VC++,
and then in C# convert the BSTR back to a byte array?
 
Arnshea


One way to do this is to use Base64Encode() in C++ (requires atlenc.h)
to convert the byte array into a plain old null-terminated char*.
Then use _bstr_t(char*) to construct the BSTR and pass _bstr_t::Detach()
on to the managed code.

Inside C# you can convert the base-64 encoded string back into a byte
array via Convert.FromBase64String().
 
shamsh

I'm doing something like:

STDMETHODIMP CSecMgr::Encrypt(BSTR bstrEncryptText, BSTR* pbstrResultText)
{
    // Encrypt data
    if (!CryptEncrypt(hKey, NULL, TRUE, 0, pbData, &dwDataLen, dwEncryptedLen))
    {
        // Error
        _tprintf(_T("CryptEncrypt error 0x%x\n"), GetLastError());
        return 1;
    }
}

I'm able to write (LPCVOID)pbData to a .txt file (binary) and at the C# end
I'm able to decrypt it back.
Instead of writing it to a .txt file I'm supposed to make it the return
value. I'm doing something like:

char *pStr = new char[dwDataLen];
memcpy(pStr, pbData, dwDataLen);
//pStr[dwDataLen] = '\0';
CString str(pStr);
//delete [] pStr;

//*pbstrResultText = CComBSTR(str); //BSTR(str.AllocSysString()); tried this option
*pbstrResultText = SysAllocStringByteLen(pStr, dwDataLen); // this one also

At the C#/.NET end:

public static string DecryptData(string data2Decrypt)
{
    byte[] encryptedData = StrToByteArray(data2Decrypt);

    // some other steps

    byte[] decryptedText = rsa.Decrypt(encryptedData, false);
    // This either gives "Bad Data" or a size-related error
}

public static byte[] StrToByteArray(string str)
{
    System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
    return encoding.GetBytes(str);
}
 
Ben Voigt [C++ MVP]


Well, you don't want the ASCII encoding. You don't want any encoding at all
(SysAllocStringByteLen works on raw bytes). It so happens that
System.Text.Encoding.Unicode is the "no translation" option, because it
copies the BSTR's 16-bit code units straight to bytes.

But you really should be passing a byte array, because that's what you have.
COM supports byte arrays; in C++ that would be a SAFEARRAY. Just do
SafeArrayCreateVector(VT_UI1, 0, length); then SafeArrayAccessData; then
memcpy (or perform your calculations directly into the array data area);
then SafeArrayUnaccessData.
 
shamsh

I cannot pass the byte array as-is from the C++ function, as I'm not supposed
to change the function signature (it is being called throughout the
application). So I need to convert the byte array to a BSTR.
I'm very new to C++ code. Can you please provide sample code?
Moreover, at the C# end I need to do some kind of encoding to convert the
encrypted string coming from C++ to a byte array.

--
regards,
Shamsheer


 
Tim Roberts

shamsh said:
I cannot pass the byte array as-is from the C++ function, as I'm not supposed
to change the function signature (it is being called throughout the
application). So I need to convert the byte array to a BSTR.

Well, in the context of your application, what does that mean, exactly? A
BSTR is an array of 16-bit words, ostensibly UTF-16 code units. You have
a byte array. Are you supposed to assume the byte array is a series of
ASCII characters, then convert it to Unicode, then encode it? Is the
encrypted result supposed to consist entirely of valid Unicode characters
(as a BSTR does)? Or is the encrypted result really an arbitrary sequence of
bytes?
I'm very new to C++ code. Can you please provide sample code?
Moreover, at the C# end I need to do some kind of encoding to convert the
encrypted string coming from C++ to a byte array.

You need to specify the problem a little better, I think.
 
shamsh

Let me try to put it a better way.
This is the code for encryption:

if (!CryptEncrypt(hKey, NULL, TRUE, 0, pbData, &dwDataLen, dwEncryptedLen))
{
    // Error
    _tprintf(_T("CryptEncrypt error 0x%x\n"), GetLastError());
    return 1;
}

pbData is of type BYTE* and holds the encrypted text; this has to be
converted to a BSTR, as the output parameter of the function is of type
BSTR*.

At the C# end this BSTR has to be converted back to binary (it should
be the same as pbData, otherwise decryption will fail) for decryption.
 
Ben Voigt [C++ MVP]

shamsh said:
I cannot pass the byte array as-is from the C++ function, as I'm not supposed
to change the function signature (it is being called throughout the
application). So I need to convert the byte array to a BSTR.

You can't change the signature because it would mess up other callers, but
you can add/change the conversion to BSTR? This doesn't make any sense to
me. Are you implementing an interface whose other providers really return
text strings, while you need to return binary data instead? In that case
using the same interface is probably not the best idea, because you're
violating the Liskov Substitution Principle (LSP) in subtle ways.
 
shamsh

Let me put the whole scenario for you.
Currently the whole application is using this COM DLL, which has functions
for encryption/decryption. We are in the process of converting a lot of
business objects to .NET, and some will remain as-is in C++. We also have
equivalent .NET code for encryption/decryption.
So we might have a situation where encryption takes place using COM
and decryption using .NET.
That's the reason for not changing the signature of the function.
The C++ caller need not bother about the binary data being converted to a
BSTR; it should be happy as long as it gets an encrypted string from the
COM object.

Actually, I'm able to return the binary data in the form of a BSTR by using
CComBSTR, but at the .NET side decrypting
(RSACryptoServiceProvider.Decrypt()) either gives "Bad Data" or complains
that the number of bytes is more than the 128 that the RSA modulus supports.

This whole thing works perfectly if I write the encrypted text (binary data)
at the COM end to a .txt file and at the C# end read that binary file for
decryption.

Let me know if I was able to address your doubts, or if you want the
whole code.



--
regards,
Shamsheer


 
Ben Voigt [C++ MVP]


You haven't really addressed the point on which I'm confused. (Side note: who
is teaching foreign students of English to use the word "doubt" when they
mean "question"? I hear this mistake everywhere.)

Encryption of a string doesn't give you another string. It gives you binary
data. So it's not appropriate for the C++ clients to be using BSTR either;
everything should be SAFEARRAY(VT_UI1). Unless the interface was originally
designed for strings and encryption was added later, in which case using the
BSTR's capability of holding embedded nulls seems a reasonable way to avoid a
breaking change to the interface. .NET is still going to have problems,
though, because .NET supports a subset of the BSTR concept which doesn't
include the BSTR as a counted array of bytes.

So maybe a good option would be for the interface to go ahead and return a
BSTR (in counted-array-of-bytes form), and to provide a conversion function,
not used by the C++ callers, which turns that BSTR into a SAFEARRAY(VT_UI1)
or even directly into a .NET managed byte array. This could even be
implemented in C# if desired: just use IntPtr to hold the BSTR, call
SysStringByteLen, and use pointers or Marshal.Copy to transfer into a real
byte array.

http://msdn.microsoft.com/en-us/library/ms146631.aspx
 
shamsh

I tried using similar code with Marshal.Copy(), but I'm still getting the
"Bad Data" error.
I'm not sure if the encrypted text from COM is being passed correctly to the
IntPtr on the .NET side.

I'm very new to C++, so is it possible for you to provide sample code
in C++ which will convert a BSTR into a SAFEARRAY(VT_UI1)?

--
regards,
Shamsheer


 
Ben Voigt [C++ MVP]


You should need:

size_t count = SysStringByteLen(bstr);
SAFEARRAY* sa = SafeArrayCreateVector(VT_UI1, 0, count);
BYTE* pContent;
SafeArrayAccessData(sa, (void HUGEP**)&pContent);
CopyMemory(pContent, bstr, count);
SafeArrayUnaccessData(sa);

and perhaps

SysFreeString(bstr);

I haven't tested that exact code; let me know if you run into trouble with
it.

 
