DataReader Limits data returned!

TY

I have a stored procedure that returns XML data (using FOR XML); it returns about 9,000 records. I use a DataReader to write the data to an XML file. For some reason, the DataReader doesn't write all the data; it seems as if the data returned has some size limit.

Here is my code. It is based on an MSDN example for using GetBytes, where data is written to the stream in chunks sized to the buffer:

'---- some code here to fill in the stored procedure name and parameters
Dim mydatareader As SqlDataReader
connectionObj.Open()
mydatareader = CommandObj.ExecuteReader()

Dim fs As FileStream
Dim bw As BinaryWriter
Dim bufferSize As Integer = 100
Dim outChar(bufferSize - 1) As Char
Dim retval As Long
Dim startindex As Long = 0

If mydatareader.Read() Then

    fs = New FileStream("c:\test.xml", FileMode.OpenOrCreate, FileAccess.Write)
    bw = New BinaryWriter(fs)
    retval = mydatareader.GetChars(0, startindex, outChar, 0, bufferSize)

    ' Continue reading and writing while there are chars beyond the
    ' size of the buffer.
    Do While retval = bufferSize
        bw.Write(outChar)
        bw.Flush()

        ' Reposition the start index to the end of the last buffer
        ' and fill the buffer again.
        startindex += bufferSize
        retval = mydatareader.GetChars(0, startindex, outChar, 0, bufferSize)
    Loop

    ' Write the remaining partial buffer (retval chars).
    bw.Write(outChar, 0, CInt(retval))
    bw.Flush()

    ' Close the output file.
    bw.Close()
    fs.Close()
End If
connectionObj.Close()
connectionObj.Dispose()
CommandObj.Dispose()
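
A side note on the MSDN sample the code above follows: it passes CommandBehavior.SequentialAccess to ExecuteReader so that large columns are streamed in chunks rather than buffered whole in memory. A minimal sketch of that variant, using the same CommandObj as above:

' SequentialAccess streams large values; columns must then be read
' strictly in order with GetChars/GetBytes.
mydatareader = CommandObj.ExecuteReader(CommandBehavior.SequentialAccess)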
 
W.G. Ryan - MVP

TY said:
I have a stored procedure that returns XML data (using FOR XML); it returns about 9,000 records. I use a DataReader to write the data to an XML file. For some reason, the DataReader doesn't write all the data; it seems as if the data returned has some size limit.
Ty, is the problem that the records aren't all getting written, or that individual records aren't getting written completely?
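
If it is the latter, one common cause with FOR XML queries run through ExecuteReader is that SQL Server returns the XML split across multiple rows of roughly 2,033 characters each, so code that calls Read() only once writes just the first fragment. A minimal sketch of the row-loop fix, reusing mydatareader from the code above and an IO.StreamWriter in place of the BinaryWriter:

' Read every row of the FOR XML result, not just the first; each row
' carries the next fragment of the XML document.
Using sw As New IO.StreamWriter("c:\test.xml")
    Do While mydatareader.Read()
        sw.Write(mydatareader.GetString(0)) ' append this row's fragment
    Loop
End Using
mydatareader.Close()

Alternatively, SqlCommand.ExecuteXmlReader returns a single XmlReader over the whole FOR XML stream, so the row splitting never becomes visible to the caller.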
 
I know this is an old thread, but I'm having the same problem. I am using OracleDataReader to read a table containing upwards of 20,000 records. For me the limit happens to be 9002; any attempt to obtain a field value from record 9003 or beyond results in a severe crash of the application, with not even a stack trace to help debugging, and Try/Catch blocks do not catch it. Application memory is not the problem, as I can populate the arrays with dummy data for elements beyond 9002.

I'll keep looking for an answer. In the meantime, my strategy will be to use DataSet objects rather than DataReader. If I find a useful answer to the DataReader issue, I'll post it here.
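
For anyone trying that route, a minimal sketch of the DataSet approach, assuming the ODP.NET provider (Oracle.DataAccess.Client) and a hypothetical connection string and table name:

Imports System.Data
Imports Oracle.DataAccess.Client

' Fill a DataSet in one call instead of walking a DataReader row by row,
' then let the DataSet serialize itself to XML.
Using conn As New OracleConnection(connectionString) ' connectionString is hypothetical
    Dim adapter As New OracleDataAdapter("SELECT * FROM my_table", conn) ' table name is hypothetical
    Dim ds As New DataSet()
    adapter.Fill(ds, "my_table") ' Fill opens and closes the connection as needed
    ds.WriteXml("c:\test.xml")
End Using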
 
