DataReader Limits data returned!

Discussion in 'Microsoft ADO .NET' started by TY, Jan 26, 2006.

  1. TY

    TY Guest

    I have a stored procedure that returns XML data (using FOR XML); it returns
    about 9,000 records. I use a DataReader to write the data to an XML file.
    For some reason, the DataReader doesn't write all the data; it seems like
    the data returned has some size limit.

    Here is my code. It is based on an MSDN example for using GetBytes, where
    it writes data to the stream in chunks based on the buffer size:

    '---- some code here to fill in the stored procedure name and parameters.
    Dim mydatareader As SqlDataReader
    connectionObj.Open()
    mydatareader = CommandObj.ExecuteReader

    Dim fs As FileStream
    Dim bw As BinaryWriter
    Dim bufferSize As Integer = 100
    Dim outChar(bufferSize - 1) As Char
    Dim retval As Long
    Dim startindex As Long = 0

    If mydatareader.Read Then

        fs = New FileStream("c:\test.xml", FileMode.OpenOrCreate, FileAccess.Write)
        bw = New BinaryWriter(fs)
        retval = mydatareader.GetChars(0, startindex, outChar, 0, 100)

        ' Continue reading and writing while there are bytes beyond the
        ' size of the buffer.
        Do While retval = bufferSize
            bw.Write(outChar)
            bw.Flush()

            ' Reposition the start index to the end of the last buffer
            ' and fill the buffer.
            startindex += bufferSize
            retval = mydatareader.GetChars(0, startindex, outChar, 0, bufferSize)
        Loop

        ' Write the remaining buffer.
        bw.Write(outChar, 0, CType(retval - 1, Integer))
        bw.Flush()

        ' Close the output file.
        bw.Close()
        fs.Close()
    End If
    connectionObj.Close()
    connectionObj.Dispose()
    CommandObj.Dispose()
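
    A likely explanation, though it is never confirmed in this thread: SQL Server splits a FOR XML result set across multiple rows (roughly 2,033 characters per row in older versions), so a single Read() followed by GetChars on column 0 only drains the first row. The final bw.Write(outChar, 0, retval - 1) also drops the last character of the closing chunk. A minimal sketch of a loop that consumes every row, reusing the connectionObj/CommandObj names from the post above:

    ```vbnet
    ' Sketch only: assumes connectionObj and CommandObj are already configured
    ' as in the original post. Each Read() fetches the next fragment of the
    ' FOR XML stream; a StreamWriter writes the text out directly.
    Dim mydatareader As SqlDataReader
    connectionObj.Open()
    mydatareader = CommandObj.ExecuteReader()

    Dim sw As New StreamWriter("c:\test.xml", False)
    Do While mydatareader.Read()
        ' Column 0 holds one fragment of the XML document.
        sw.Write(mydatareader.GetString(0))
    Loop
    sw.Close()
    mydatareader.Close()
    connectionObj.Close()
    ```

    Alternatively, SqlCommand.ExecuteXmlReader is designed for FOR XML queries: it returns the whole document through a single XmlReader and sidesteps the row-splitting entirely.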
     
    TY, Jan 26, 2006
    #1

  2. "TY" <> wrote in message
    news:...
    >I have a stored procedure that returns xml data (using For XML), it returns
    > about 9000 records. I use a datareader to write the data to a XML file.
    > For some reason, the datareader doesn't write all data, it seems like the
    > data returned has some size limit.
    >
    > here is my code, my code is based on a msdn example for using getBytes
    > were
    > it writes data to the stream by chunks based on the buffer size:
    >
    > '----some code here to fill in the stored procedure name and
    > parameters.
    > Dim mydatareader As SqlDataReader
    > connectionObj.Open()
    > mydatareader = CommandObj.ExecuteReader
    >
    > Dim fs As FileStream
    > Dim bw As BinaryWriter
    > Dim bufferSize As Integer = 100
    > Dim outChar(bufferSize - 1) As Char
    > Dim retval As Long
    > Dim startindex As Long = 0
    >
    > If mydatareader.Read Then
    >
    > fs = New FileStream("c:\test.xml", FileMode.OpenOrCreate,
    > FileAccess.Write)
    > bw = New BinaryWriter(fs)
    > retval = mydatareader.GetChars(0, startindex, outChar, 0, 100)
    >
    > ' Continue reading and writing while there are bytes beyond the
    > size of the buffer.
    > Do While retval = bufferSize
    > 'bw.Write(outByte)
    > bw.Write(outChar)
    > bw.Flush()
    >
    > ' Reposition the start index to the end of the last buffer
    > and fill the buffer.
    > startindex += bufferSize
    > retval = mydatareader.GetChars(0, startindex, outChar, 0,
    > bufferSize)
    > Loop
    >
    > 'Write the remaining buffer.
    > bw.Write(outChar, 0, CType(retval - 1, Integer))
    > bw.Flush()
    >
    > ' Close the output file.
    > bw.Close()
    > fs.Close()
    > End If
    > connectionObj.Close()
    > connectionObj.Dispose()
    > CommandObj.Dispose()
     
    W.G. Ryan - MVP, Jan 26, 2006
    #2

  3. "TY" <> wrote in message
    news:...
    >I have a stored procedure that returns xml data (using For XML), it returns
    > about 9000 records. I use a datareader to write the data to a XML file.
    > For some reason, the datareader doesn't write all data, it seems like the
    > data returned has some size limit.
    >
    > here is my code, my code is based on a msdn example for using getBytes
    > were
    > it writes data to the stream by chunks based on the buffer size:
    >
    > '----some code here to fill in the stored procedure name and
    > parameters.
    > Dim mydatareader As SqlDataReader
    > connectionObj.Open()
    > mydatareader = CommandObj.ExecuteReader
    >
    > Dim fs As FileStream
    > Dim bw As BinaryWriter
    > Dim bufferSize As Integer = 100
    > Dim outChar(bufferSize - 1) As Char
    > Dim retval As Long
    > Dim startindex As Long = 0
    >
    > If mydatareader.Read Then
    >
    > fs = New FileStream("c:\test.xml", FileMode.OpenOrCreate,
    > FileAccess.Write)
    > bw = New BinaryWriter(fs)
    > retval = mydatareader.GetChars(0, startindex, outChar, 0, 100)
    >
    > ' Continue reading and writing while there are bytes beyond the
    > size of the buffer.
    > Do While retval = bufferSize
    > 'bw.Write(outByte)
    > bw.Write(outChar)
    > bw.Flush()
    >
    > ' Reposition the start index to the end of the last buffer
    > and fill the buffer.
    > startindex += bufferSize
    > retval = mydatareader.GetChars(0, startindex, outChar, 0,
    > bufferSize)
    > Loop
    >
    > 'Write the remaining buffer.
    > bw.Write(outChar, 0, CType(retval - 1, Integer))
    > bw.Flush()
    >
    > ' Close the output file.
    > bw.Close()
    > fs.Close()
    > End If
    > connectionObj.Close()
    > connectionObj.Dispose()
    > CommandObj.Dispose()

    --Ty, is the problem that the records aren't all getting written, or that
    no single record is getting written completely?
     
    W.G. Ryan - MVP, Jan 26, 2006
    #3
  4. djstew
    I know this is an old thread, but I'm having the same problem. I am using OracleDataReader to read a table containing upwards of 20,000 records. For me the limit happens to be 9002: any attempt to obtain a field value from record 9003 or beyond crashes the application outright -- there is not even a stack trace to help with debugging, and Try/Catch blocks do not catch it. Application memory is not the problem, as I can populate the arrays with dummy data for elements beyond 9002.

    I'll keep looking for an answer. In the meantime my strategy will be to use DataSet objects, rather than DataReader. If I find a useful answer to the DataReader issue I'll post it here.
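
    The DataSet workaround djstew describes can be sketched roughly as follows (the table name, column access, and connection string are placeholders invented for illustration; this assumes the OracleDataAdapter class from an Oracle ADO.NET provider):

    ```vbnet
    ' Sketch only: "my_table" and connectionString are hypothetical.
    ' A DataAdapter pulls the full result set into an in-memory DataSet
    ' in one call, avoiding the row-by-row DataReader path entirely.
    Dim adapter As New OracleDataAdapter("SELECT * FROM my_table", connectionString)
    Dim ds As New DataSet()
    adapter.Fill(ds)

    For Each row As DataRow In ds.Tables(0).Rows
        Console.WriteLine(row(0))
    Next
    ```

    The trade-off is memory: Fill materializes all 20,000+ rows at once, whereas a DataReader streams them.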
     
    djstew, May 16, 2012
    #4
