OracleType.NVarChar parameters handled differently in framework 1.1?


SmokeMe

There appears to have been a change in the handling of
OracleType.NVarChar parameters in 1.1. In my case this breaks code using
DbType.String or the IDataParameter interface with strings over 2000
characters.

A 2001-character string parameter, consisting entirely of 7-bit characters,
sent to a 9.2.0.1 stored procedure works under the optional 1.0 Oracle
driver. The exact same code and database now fall over with
"ORA-01460: unimplemented or unreasonable conversion requested" using the
supplied 1.1 OracleClient.dll.

This change in behaviour leads me to conclude that the longest string
you can now pass to an Oracle database through the IDataParameter
interface is only 2000 characters, presumably because OracleClient 1.1
maps the native String type to OracleType.NVarChar, which has a
4000-byte limit.

This 2000-character limit would be needed to cover the possibility that the
database *may* be configured with a 16-bit NLS_NCHAR_CHARACTERSET such as
AL16UTF16, as Oracle sometimes recommends, even though you could actually
be inserting a plain old ASCII string into a plain old VARCHAR2(4000)
column on a plain old US7ASCII Oracle instance.

Assuming this is the case, can anyone think of a workaround, other than:

1. Abandoning the database-independent IDataParameter interface and
changing all the client code to discover and use whichever native
<database>Type is appropriate for the connection?
2. Sticking to strings of fewer than 2000 characters?
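As a side note, the arithmetic behind the suspected limit can be sketched. This is purely an illustrative calculation of the byte counts involved, not the provider's actual behaviour: NVARCHAR2 in Oracle 9i is capped at 4000 bytes, and a 16-bit national character set costs 2 bytes per character, so 2001 characters overflows it while the same string fits a single-byte VARCHAR2.

```csharp
// Illustrative byte-count arithmetic for the 2000-character ceiling.
// The 4000-byte figure is Oracle 9i's NVARCHAR2/VARCHAR2 size limit.
using System;
using System.Text;

class NVarCharLimit
{
    static void Main()
    {
        string payload = new string('A', 2001);   // 7-bit characters only

        int utf16Bytes = Encoding.Unicode.GetByteCount(payload); // 2 bytes/char
        int asciiBytes = Encoding.ASCII.GetByteCount(payload);   // 1 byte/char

        const int oracleByteLimit = 4000;
        Console.WriteLine($"Bound as NVARCHAR2 (UTF-16): {utf16Bytes} bytes, " +
            (utf16Bytes > oracleByteLimit ? "over" : "within") + " the 4000-byte limit");
        Console.WriteLine($"Bound as VARCHAR2 (single-byte): {asciiBytes} bytes, " +
            (asciiBytes > oracleByteLimit ? "over" : "within") + " the 4000-byte limit");
    }
}
```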

Rich.
 

David Browne

SmokeMe said:
There appears to have been a change in the handling of
OracleType.NVarChar parameters in 1.1. In my case this breaks code using
DbType.String or the IDataParameter interface with strings over 2000
characters.

A 2001-character string parameter, consisting entirely of 7-bit characters,
sent to a 9.2.0.1 stored procedure works under the optional 1.0 Oracle
driver.

Actually the string parameter consists of 16-bit Unicode characters. The
default mapping for that type is NChar, since it's bitwise compatible. But
if your procedures and SQL are using VARCHAR2 types, you should force your
parameters to be VARCHAR2. You don't have to use the provider-specific
types to do this, you just have to specify the DbType of the parameter, and
not let the provider use its default mapping.

System.Data.IDbDataParameter p;
// ...
p.DbType = System.Data.DbType.AnsiString;
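A fuller sketch of that fix, still going through the generic ADO.NET interfaces so no provider-specific types leak into the client code, might look like the following. The connection string, SQL text, and parameter name here are hypothetical, for illustration only:

```csharp
// Sketch: forcing a VARCHAR2 bind via DbType.AnsiString so the 1.1
// provider does not apply its default String -> NVarChar mapping.
// Connection string, table and parameter names are illustrative.
using System.Data;
using System.Data.OracleClient;

class AnsiStringBind
{
    static void Main()
    {
        using (IDbConnection conn =
            new OracleConnection("Data Source=ORCL;User Id=scott;Password=tiger"))
        using (IDbCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = "INSERT INTO notes (body) VALUES (:body)";

            IDbDataParameter p = cmd.CreateParameter();
            p.ParameterName = "body";
            p.DbType = DbType.AnsiString;     // binds as VARCHAR2, not NVARCHAR2
            p.Value = new string('A', 2001);  // over 2000 chars, fits VARCHAR2(4000)

            cmd.Parameters.Add(p);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```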

This 2000-character limit would be needed to cover the possibility that the
database *may* be configured with a 16-bit NLS_NCHAR_CHARACTERSET such as
AL16UTF16, as Oracle sometimes recommends?

No, the limit is there because NChar in 9i is always a Unicode type, and in
Oracle SQL it always has a 2000-character limit. Period.

David
 

Kipper

Thanks for your prompt response. Following on from your comments, my issue
may be related to a change in the default behaviour of OracleClient.dll,
along with some confusion in the help docs.

1.0 help file that came with the 1.0 download states:
(OracleParameter.DbType Property): "For String data types the default is
DbType.AnsiString."

1.1 help file that came with VS2003 states:
(OracleParameter.DbType Property): "For String data types the default is
DbType.AnsiString."

i.e. the same...

...but if you look at the grid in the 1.1 VS2003 help section "Using
Parameters with a DataAdapter", it shows the default mapping as "NVarChar",
as you correctly point out. Such a change in default behaviour may explain
the errors I now see with strings over 2000 characters.

But this raises a further question: have I been a member of the massed
ranks of coders "getting away with it"? That is, have any occasions where I
have stored, say, a Euro symbol in a UTF8 instance worked by accident rather
than by design, because the readers and writers all use the same NLS_LANG
settings?

What I'm getting at is: what conversions take place when a String
data type packaged as a System.Data.DbType.AnsiString parameter goes to a
UTF8 instance? Could the local NLS_LANG setting cause characters to be
corrupted or lost on the way into the database?

Rich.
 

SqlJunkies User

Set your IDbDataParameter.DbType to DbType.AnsiString and everything should work as before.

When using an OracleParameter through the IDbDataParameter interface, setting parameter.DbType to DbType.String or DbType.StringFixedLength appears to cause the underlying OracleParameter.OracleType to change to OracleType.NVarChar or OracleType.NChar respectively. If you set the DbType to DbType.AnsiString or DbType.AnsiStringFixedLength, however, the OracleParameter.OracleType seems to map to OracleType.VarChar and OracleType.Char respectively. I discovered this with some trial and error.

A simple search-and-replace of all your DbType.String with DbType.AnsiString should do the trick for you, and it should not break any of your existing SqlClient and OleDb use of IDbDataParameter. Let me know if this works for you or if you find another solution.
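The mapping described above could be checked empirically with something like the following sketch, assuming the 1.1 System.Data.OracleClient provider. The expected pairings in the comment are the poster's trial-and-error observations, so verify against your own provider version before relying on them:

```csharp
// Sketch: print the OracleType the provider infers for each generic DbType.
// Observed (per the post): String -> NVarChar, StringFixedLength -> NChar,
// AnsiString -> VarChar, AnsiStringFixedLength -> Char.
using System;
using System.Data;
using System.Data.OracleClient;

class MappingCheck
{
    static void Main()
    {
        foreach (DbType t in new[] { DbType.String, DbType.StringFixedLength,
                                     DbType.AnsiString, DbType.AnsiStringFixedLength })
        {
            var p = new OracleParameter();
            p.DbType = t;                          // triggers the default mapping
            Console.WriteLine($"{t} -> {p.OracleType}");
        }
    }
}
```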
 
