SQL Server memory ballooning during mass import

Mark Rae

Hi,

I have a Windows service which iterates through a MySQL DataReader and pumps
the records into SQL Server 2000. There can be upwards of 300,000 rows in
the MySQL DataReader, and each record may represent either a new record in
the SQL Server database or an update of an existing one.

The service runs quite happily consuming just under 30MB of RAM, though it
does use a fair bit of CPU, which is not surprising.

The problem (if it even is a problem...) is that the sqlserver.exe process
balloons out of all proportion during the part of the process where it does
the 300,000 or so database writes, almost as if it's caching them, or
holding them in a transaction before committing them...

However, doing a SELECT COUNT(*) on the table that the records are being
pumped into clearly shows that they're going in one by one.

Can anyone see anything glaringly obvious that I've missed from the
following code which might cause this...?

While objMySQLDR.Read ' MySQL DataReader
    objSQLDS = New DataSet() ' SQL Server DataSet
    objSQLDA = New SqlDataAdapter("SELECT * FROM tbl_av_content_download " & _
        "WHERE ttmms_acct_id = " & pintTTMMSAcctID.ToString & _
        " AND tbl_cms_download_download_id = " & objMySQLDR(1).ToString, _
        objSQLConnection)
    objSQLCommandBuilder = New SqlCommandBuilder(objSQLDA)
    objSQLDA.Fill(objSQLDS, "tbl_av_content_download")
    If objSQLDS.Tables(0).Rows.Count = 0 Then ' adding new row to SQL Server
        objImportRow = objSQLDS.Tables(0).NewRow ' create a blank row
        For intRow As Integer = 0 To objMySQLDR.FieldCount - 1
            objImportRow.Item(intRow) = objMySQLDR(intRow) ' sync the fields
        Next
        objSQLDS.Tables(0).Rows.Add(objImportRow) ' add the new row to the DataSet
    Else ' updating existing row in SQL Server
        objImportRow = objSQLDS.Tables(0).Rows(0) ' use the existing row
        objImportRow.BeginEdit() ' set it into edit mode
        For intRow As Integer = 0 To objMySQLDR.FieldCount - 1
            objImportRow.Item(intRow) = objMySQLDR(intRow) ' sync the fields
        Next
        objImportRow.EndEdit() ' set it out of edit mode
    End If
    objSQLDA.Update(objSQLDS.Tables("tbl_av_content_download")) ' write the row back to SQL Server (Update also calls AcceptChanges)
    objSQLCommandBuilder.Dispose()
    objSQLDA.Dispose()
    objSQLDS.Clear()
    objSQLDS.Dispose()
    objImportRow = Nothing
End While
objMySQLDR.Close()
 
William (Bill) Vaughn

I expect what you're seeing is the data cache growing. However, this
approach is far from optimal. ADO.NET is not designed to be a mechanism for
performing bulk operations. If you have to move more than a dozen or so
rows, it's far (far) more efficient to use DTS/BCP--especially when working
with SQL Server. You'll find that DTS packages are easy to set up and can
import data from virtually any source. If you need to perform inbound
processing (changing each row), consider doing that server-side with a
stored procedure.
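
As a rough illustration of the server-side route, one common pattern is to
bulk-load the MySQL extract into a staging table (with bcp or DTS) and then
let a stored procedure fold it into the live table in two set-based
statements. This is only a sketch: the staging table, procedure name, and
the `some_column` placeholder are made up for illustration; only the table
and key column names come from the code above. SQL Server 2000 has no MERGE,
so the upsert is an UPDATE followed by an INSERT.

```sql
-- Hypothetical staging table with the same columns as tbl_av_content_download.
-- Load it in one shot from a flat file exported from MySQL, e.g.:
--   bcp MyDB..tbl_av_content_download_staging in extract.txt -c -T

CREATE PROCEDURE usp_MergeDownloadStaging AS
BEGIN
    -- Update the rows that already exist.
    UPDATE t
    SET    t.some_column = s.some_column   -- repeat for each data column
    FROM   tbl_av_content_download t
           JOIN tbl_av_content_download_staging s
             ON  t.ttmms_acct_id = s.ttmms_acct_id
             AND t.tbl_cms_download_download_id = s.tbl_cms_download_download_id

    -- Insert the rows that don't exist yet.
    INSERT INTO tbl_av_content_download
    SELECT s.*
    FROM   tbl_av_content_download_staging s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   tbl_av_content_download t
                       WHERE  t.ttmms_acct_id = s.ttmms_acct_id
                         AND  t.tbl_cms_download_download_id =
                              s.tbl_cms_download_download_id)

    -- Empty the staging table for the next import run.
    TRUNCATE TABLE tbl_av_content_download_staging
END
```

The point is that two set-based statements replace 300,000 individual
SELECT-then-INSERT/UPDATE round trips from the client.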

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
