SqlTransaction Complete Problem - datatype change from image to varbinary(max)


Phil Johnson

Hi,

This is being called in a C# loop within an ADO.NET transaction from C# 1.1
code. It is used to write large file data to a SQL Server database in
chunks, rather than in one go.

I had the stored procedure below, which worked until the image datatype was
changed to varbinary(max).

Now I get the following error:

This SqlTransaction has completed; it is no longer usable

The stored procedure is here (NOTE: this is the version that works with the
datatype set to image. If I change the datatype to varbinary(max), and
obviously change the field in the table as well, the code fails with the
error given above).

ALTER PROCEDURE [dbo].[FileUploadData_Upd]
    @FileID uniqueidentifier,
    --@data varbinary(max),
    @data image,
    @append bit
AS

SET NOCOUNT ON

IF @append = 0
    UPDATE dbo.FileUpload
    SET Data = @data
    WHERE FileId = @FileID;

IF @append = 1
    UPDATE dbo.FileUpload
    SET Data.write(@data, NULL, 0)


Any help would be appreciated.

--
Regards,

Phil Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Phil Johnson

I should add, I have modified the datatype of the parameter in C# to be
VarBinary:

SqlParameter paramData = cm.Parameters.Add("@data", System.Data.SqlDbType.VarBinary);

And the isolation level of the transaction is ReadCommitted.
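
(Worth noting: ADO.NET 2.0 treats a parameter Size of -1 as "max", so the
parameter can be pinned to varbinary(max) explicitly - a minimal sketch
against the same cm command, assuming .NET 2.0 or later is available, since
1.1 predates varbinary(max) support:)

// Size of -1 marks the parameter as varbinary(max) rather than a
// length-limited varbinary inferred from the first value assigned.
SqlParameter paramData = cm.Parameters.Add(
    "@data", System.Data.SqlDbType.VarBinary, -1);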

--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

OK; I got it working just fine in SQL 2005 using the following (where
my table is "mgt"); I'm guessing (from the similarity) that your code
is derived from an old post of mine, so it should be easy to fit these
in...

SP: [note btw that you missed a WHERE clause from yours...]

SET NOCOUNT ON

DECLARE @offset int
SET @offset = CASE @append
    WHEN 1 THEN NULL ELSE 0 END

UPDATE mgt
SET Data.write(@data, @offset, NULL)
WHERE id = @id
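
(The trick in that CASE: .WRITE with a NULL offset appends @data to the end
of the existing value, while an offset of 0 with a NULL length replaces the
whole value - which is what lets one UPDATE statement serve both modes. One
caveat: .WRITE cannot be applied to a column that is currently NULL, so the
row needs non-NULL data in it before the first append.)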

C#: I tweaked the declaration, and explicitly set the Size properties
(marked ****; might not be needed):

SqlParameter paramData = cmd.Parameters.Add("@data", System.Data.SqlDbType.VarBinary); // the binary

if (bytesRead == SQL_IMAGE_BUFFER_SIZE)
{
    // pass the filled buffer
    paramData.Value = buffer;
    paramData.Size = buffer.Length; // ****
}
else
{
    // didn't fill an entire buffer
    byte[] smallBuffer = new byte[bytesRead];
    Buffer.BlockCopy(buffer, 0, smallBuffer, 0, bytesRead);
    paramData.Value = smallBuffer;
    paramData.Size = smallBuffer.Length; // ****
}

I then wrapped in a SqlTransaction (although TransactionScope probably
would have been easier) between the conn.Open() and conn.Close():

conn.Open();
using (SqlTransaction tran = conn.BeginTransaction())
{
    cmd.Transaction = tran;
    try
    {
        while ((bytesRead = input.Read(buffer, 0, SQL_IMAGE_BUFFER_SIZE)) > 0)
        {
            // ...SNIP...
        }
        tran.Commit();
    }
    catch
    {
        tran.Rollback();
    }
}
conn.Close();

Works fine for me ;-p

Marc
 

Phil Johnson

Thanks for the response Marc, I can post the loop itself... it's a big
n-tier app and the transaction is passed in to the method the loop is in,
but the isolation level is set to ReadCommitted.

Also the buffer size for SQL_IMAGE_BUFFER_SIZE is 8040000 (just over 8 MB).

I have modified this C# code since my initial posting so the parameter is of
varbinary datatype rather than image, but I get the same issue.

The first insert executes, but when it tries to loop and execute the append
I get the "transaction has completed; it is no longer usable" exception.

// Set up the byte count
long totalBytes = 0;

// Get a file stream to the input file
FileStream input = System.IO.File.OpenRead(@_fileLocation);

// Set the command used above to the sproc to load the file data.
cm.CommandText = "FileUploadData_Upd";

// Set any parameters that need to be set (get references to the data and
// append parameters as we will need to modify them later)
cm.Parameters.Add("@FileID", PrimaryKey);
SqlParameter paramData = cm.Parameters.Add("@data", System.Data.SqlDbType.VarBinary);
SqlParameter paramAppend = cm.Parameters.Add("@append", System.Data.SqlDbType.Bit);

// The first pass should always replace existing
paramAppend.Value = false;

// Set the buffer size - 8040 is the optimal buffer size in SQL Server 2000,
// so use multiples of 8040
const int SQL_IMAGE_BUFFER_SIZE = 8040000;
byte[] buffer = new byte[SQL_IMAGE_BUFFER_SIZE];

// count for the bytes read in
int bytesRead;

// Now, load the file data into the db a chunk at a time.
while ((bytesRead = input.Read(buffer, 0, SQL_IMAGE_BUFFER_SIZE)) > 0)
{
    if (bytesRead == SQL_IMAGE_BUFFER_SIZE)
    {
        // pass the filled buffer into the sproc parameter
        paramData.Value = buffer;
    }
    else
    {
        // didn't fill an entire buffer so we need to change the buffer size
        byte[] smallBuffer = new byte[bytesRead];
        Buffer.BlockCopy(buffer, 0, smallBuffer, 0, bytesRead);
        paramData.Value = smallBuffer;
    }

    cm.ExecuteNonQuery();

    // subsequent calls should append data so set the sproc parameter
    paramAppend.Value = true;

    // Update the total bytes count.
    totalBytes += bytesRead;

    // Force a tidy up to conserve system memory
    System.GC.Collect();
}
--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

Oops; forgot the throw! Very important... ;-p

catch {
tran.Rollback();
throw;
}

Marc
 

Marc Gravell

And just to show how TransactionScope makes life easier:

using (SqlCommand cmd = conn.CreateCommand())
using (TransactionScope tran = new TransactionScope())
{
    // ...SNIP... (everything)
    tran.Complete();
}
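
(One caveat worth spelling out, since the transaction here is ambient: a
SqlConnection only enlists in the TransactionScope if it is opened *inside*
the scope - a sketch, where connectionString stands in for whatever the app
actually uses:)

// TransactionScope lives in System.Transactions (reference
// System.Transactions.dll, .NET 2.0+).
using (TransactionScope tran = new TransactionScope())
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open(); // opened inside the scope, so it auto-enlists
    using (SqlCommand cmd = conn.CreateCommand())
    {
        // ...SNIP... (the chunked upload loop)
    }
    tran.Complete(); // if never reached, Dispose() rolls everything back
}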
 

Marc Gravell

The one thing I'm not seeing in the code is anything to do with a
transaction...? Where is this transaction? Is it an ambient
(TransactionScope) transaction?

I altered my original SQL2000 code from here:
http://groups.google.co.uk/group/microsoft.public.dotnet.languages.csharp/msg/314e7e3782e59a93

(with the changes as per the previous 2 posts), and it worked fine for
both SqlTransaction and TransactionScope?

So the transaction that appears to be closing itself: where is this
declared? Is it perhaps in a freaky orphan state, and the GC.Collect() is
making it run its finalizer, hence tearing down the transaction?

(in which case, don't let it get orphaned! If this is the problem, the
GC.Collect(), although unusual, has probably (by chance) just saved
you from a hard-to-track bug).

Marc
 

Phil Johnson

Hi Marc, thanks for all the info... I think the code I originally used was
written by you. Thanks for that ;-)

It's been working fine until this change in the datatype... like I say, if I
switch back to an image datatype it's fine.

I tried taking out the GC.Collect(), but I still get the same error.

It's part of a big business object framework so the transaction gets passed
around a bit with different commands.

The transaction passed in is originally declared as below, and my loop
occurs in the Update( tr ) call:

cn.Open();
SqlTransaction tr = cn.BeginTransaction(IsolationLevel.ReadCommitted);
try
{
    if (IsDirty)
    {
        Update( tr );
    }

    tr.Commit();
    tr.Dispose();
}
catch
{
    tr.Rollback();
    throw;
}
--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

Well, I can't see anything obvious (unless it is simply a timeout
issue); I'm assuming that you are re-using the connection and
associating the transaction with the command? [this issue of having to
pass a database connection around is one of the reasons that
TransactionScope is so appealing].

Unfortunately my attempts to reproduce here all simply work. I suspect
you're going to have to take pieces away until it breaks... you could
also try reducing the buffer size a bit; I know you can use multiples
of 8040, but a thousand seems a bit heavy - try a few less perhaps?
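
(For illustration, the constant written as a multiple of the 8040-byte
chunk; the counts below are guesses to experiment with, not benchmarked
figures:)

const int CHUNK = 8040;           // one "page-friendly" chunk
const int CHUNKS_PER_BUFFER = 50; // try 50-500 rather than 1000
const int SQL_IMAGE_BUFFER_SIZE = CHUNK * CHUNKS_PER_BUFFER; // 402,000 bytes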

Marc
 

Marc Gravell

<q>
I suspect you're going to have to take pieces away until it breaks...
</q>

Of course, I meant the other way around: start with something simple
that works, and keep adding pieces *in* until it breaks. It works fine
here...

Marc
 

Phil Johnson

Thanks for the help anyway Marc,

I will probably have to do as you say and start with the simplest form and
keep building up on it.

It's a sticky one... If I find out what is causing the problem I will add
another reply here.

Thanks again for your time

--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

<q>
If I find out what is causing the problem I will add another reply here.
</q>

Please do; I'm genuinely interested in what is going wrong - because
(obviously) this is something I need to do occasionally.

Marc
 

Phil Johnson

I think it is the size of the buffer... I cut it back to 8040 rather than
8040000 and it started going through the loop again and again rather than
quitting the second time through.

I was trying a 100MB file and it killed my machine though... I will try
other sizes when I get back in the office tomorrow to try to find an optimum
size, but early indications are that that is going to get it.

Thanks again for your help, it's greatly appreciated.

--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

Actually, I've got an idea that might solve both problems - I'm going
to try it on the train tomorrow ;-p

Marc
 

Marc Gravell

Right; as I see it there are two issues at present:

a: the need for lots of round-trips to invoke the SP (which is leading
you to use ever-larger buffers)
b: the nuisance of having a long-running transaction against your main
doc table

I think I've found a way to fix both (working fine on SQL Express
2005), but I haven't benchmarked it. Basically, rather than uploading
into the data-table, we'll plan to upload the chunks (as multiple rows
per file) into a staging table [sample A].

Once our upload is complete (into the stage table), we can then use an
SP [sample B] to recombine the record into the live (docs) table - but
our transaction only needs to span this portion (not the upload
itself).

(note: for the C# I've tried to gear it towards your current
transaction implementation - but note that TransactionScope is easier)

To do the upload as efficiently as possible, perhaps use SqlBulkCopy
(equivalent to "bcp") [sample C]; this allows us to firehose data into
the stage table with minimum overhead. Likewise we will need to invoke
the commit [sample D].

Bringing C & D together to orchestrate the transaction gives us
[sample E].

Finally - how do we feed SqlBulkCopy? There are 2 options: a DataTable
(but that would force us to read the entire file first), or an
IDataReader; opt for the latter by creating a shim that looks and
smells like a data-reader, but is actually pumping from a file [sample
F]; note I've done a minimal implementation, but it looks long just
because of the number of "throw new NotImplementedException(...)"
methods.

Marc

---- [sample A: database schema]
create table docs (
    id int NOT NULL IDENTITY(1,1),
    content varbinary(max) NULL)

create table doc_stage (
    [key] uniqueidentifier NOT NULL,
    seq int NOT NULL,
    content varbinary(max) NOT NULL)

create unique clustered index idx_doc_stage_guid
    on doc_stage ([key], seq)

---- [sample B: commit procedure]
ALTER PROC CommitDoc
    @key uniqueidentifier,
    @id int OUT
AS
SET NOCOUNT ON

DECLARE @seq int, @content varbinary(max)

IF @id IS NULL
BEGIN
    INSERT docs ([content]) VALUES (NULL)
    SET @id = SCOPE_IDENTITY()
END

SET @seq = 1
WHILE 1 = 1
BEGIN
    SELECT @content = [content]
    FROM doc_stage
    WHERE [key] = @key
    AND seq = @seq

    IF @@ROWCOUNT = 0 BREAK

    IF @seq = 1
    BEGIN
        UPDATE docs -- in case null...
        SET [content] = @content
        WHERE id = @id
    END
    ELSE
        UPDATE docs
        SET [content].write(@content, NULL, NULL)
        WHERE id = @id

    SELECT @seq = @seq + 1
END

DELETE FROM doc_stage
WHERE [key] = @key
GO

---- [sample C: uploading to stage via SqlBulkCopy]
static Guid Upload(SqlConnection conn, string path)
{
    using (UploadReader reader = new UploadReader(path))
    using (SqlBulkCopy bcp = new SqlBulkCopy(conn))
    {
        bcp.BatchSize = 100;
        bcp.DestinationTableName = "doc_stage";
        bcp.WriteToServer(reader);
        return reader.Key;
    }
}
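
(One aside: with no mappings specified, SqlBulkCopy matches reader columns
to table columns by ordinal, so the shim's three fields must line up with
doc_stage's column order. Explicit mappings - a sketch against the bcp
object above - make that robust if the table ever changes:)

bcp.ColumnMappings.Add(0, "key");     // reader column 0 -> [key]
bcp.ColumnMappings.Add(1, "seq");     // reader column 1 -> seq
bcp.ColumnMappings.Add(2, "content"); // reader column 2 -> content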

---- [sample D: calling the commit proc]
static int Commit(SqlTransaction tran, Guid key, int? id)
{
    using (SqlCommand cmd = tran.Connection.CreateCommand())
    {
        cmd.Transaction = tran;
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandText = "CommitDoc";
        cmd.Parameters.Add("@key", SqlDbType.UniqueIdentifier).Value = key;

        SqlParameter idParam = cmd.Parameters.Add("@id", SqlDbType.Int);
        idParam.Direction = ParameterDirection.InputOutput;
        if (id.HasValue)
        {
            idParam.Value = id.Value;
        }
        else
        {
            idParam.Value = DBNull.Value;
        }
        cmd.ExecuteNonQuery();
        return (int)idParam.Value;
    }
}

---- [sample E: orchestrating upload and commit]
static void Main()
{
    try
    {
        string CS = global::ConsoleApplication2.Properties.Settings.Default.pubsConnectionString;
        int? id = null; // for insert
        string path = @"C:\AjaxControlToolkit.zip"; // just a file...

        using (SqlConnection conn = new SqlConnection(CS))
        {
            conn.Open();

            Guid key = Upload(conn, path);

            using (SqlTransaction tran = conn.BeginTransaction())
            {
                try
                {
                    id = Commit(tran, key, id);
                    tran.Commit();
                }
                catch
                {
                    tran.Rollback();
                    throw;
                }
            }
            conn.Close();
        }
        Console.WriteLine("Document {0} uploaded", id);
    }
    catch (Exception ex)
    {
        Console.Error.WriteLine(ex.Message);
        Console.Error.WriteLine(ex.StackTrace);
    }
    Console.Error.WriteLine("Press any key");
    Console.ReadKey();
}
---- [sample F: shim IDataReader]
class UploadReader : IDataReader
{
    readonly Guid key;
    const int BUFFER_SIZE = 8040; // 1 data page
    byte[] buffer = new byte[BUFFER_SIZE];
    int currentBytes, sequence;
    Stream source;

    public Guid Key { get { return key; } }

    public UploadReader(string path)
        : this(File.OpenRead(path)) { }

    public UploadReader(Stream source)
    {
        if (source == null) throw new ArgumentNullException("source");
        if (!source.CanRead) throw new ArgumentException("Cannot read from source", "source");
        key = Guid.NewGuid();
        this.source = source;
    }

    public void Dispose() { Close(); }

    public void Close()
    {
        if (source != null)
        {
            try { source.Close(); } catch { }
            try { source.Dispose(); } catch { }
            source = null;
        }
        buffer = null;
    }

    public int Depth
    {
        get { throw new NotImplementedException("Depth"); }
    }
    public DataTable GetSchemaTable()
    {
        throw new NotImplementedException("GetSchemaTable");
    }
    public bool IsClosed
    {
        get { return source == null; }
    }

    // Columns, in order, match doc_stage: [key] (uniqueidentifier),
    // seq (int), content (varbinary)
    public object GetValue(int i)
    {
        switch (i)
        {
            case 0:
                return Key;
            case 1:
                return sequence;
            case 2:
                if (currentBytes == BUFFER_SIZE)
                {
                    return buffer;
                }
                else
                {
                    byte[] clone = new byte[currentBytes];
                    Buffer.BlockCopy(buffer, 0, clone, 0, currentBytes);
                    return clone;
                }
            default:
                throw new NotImplementedException("GetValue");
        }
    }

    public bool NextResult()
    {
        return false;
    }

    // each Read() pulls the next chunk from the file; one row per chunk
    public bool Read()
    {
        currentBytes = source.Read(buffer, 0, BUFFER_SIZE);
        sequence++;
        return currentBytes > 0;
    }

    public int RecordsAffected
    {
        get { throw new NotImplementedException("RecordsAffected"); }
    }
    public int FieldCount
    {
        get { return 3; }
    }
    public bool GetBoolean(int i) { throw new NotImplementedException("GetBoolean"); }
    public byte GetByte(int i) { throw new NotImplementedException("GetByte"); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length) { throw new NotImplementedException("GetBytes"); }
    public char GetChar(int i) { throw new NotImplementedException("GetChar"); }
    public long GetChars(int i, long fieldoffset, char[] buffer, int bufferoffset, int length) { throw new NotImplementedException("GetChars"); }
    public IDataReader GetData(int i) { throw new NotImplementedException("GetData"); }
    public string GetDataTypeName(int i) { throw new NotImplementedException("GetDataTypeName"); }
    public DateTime GetDateTime(int i) { throw new NotImplementedException("GetDateTime"); }
    public decimal GetDecimal(int i) { throw new NotImplementedException("GetDecimal"); }
    public double GetDouble(int i) { throw new NotImplementedException("GetDouble"); }
    public Type GetFieldType(int i) { throw new NotImplementedException("GetFieldType"); }
    public float GetFloat(int i) { throw new NotImplementedException("GetFloat"); }
    public Guid GetGuid(int i) { throw new NotImplementedException("GetGuid"); }
    public short GetInt16(int i) { throw new NotImplementedException("GetInt16"); }
    public int GetInt32(int i) { throw new NotImplementedException("GetInt32"); }
    public long GetInt64(int i) { throw new NotImplementedException("GetInt64"); }
    public string GetName(int i) { throw new NotImplementedException("GetName"); }
    public int GetOrdinal(string name) { throw new NotImplementedException("GetOrdinal"); }
    public string GetString(int i) { throw new NotImplementedException("GetString"); }
    public int GetValues(object[] values) { throw new NotImplementedException("GetValues"); }
    public bool IsDBNull(int i) { throw new NotImplementedException("IsDBNull"); }
    public object this[string name]
    {
        get { return GetValue(GetOrdinal(name)); }
    }
    public object this[int i]
    {
        get { return GetValue(i); }
    }
}
 

Marc Gravell

One other thought: for big files like this, you might consider storing
them outside of the db and simply referencing them via some kind of
relative path? Also: SQL Server 2008 has built-in FILESTREAM support;
but I haven't played with it yet so I don't know exactly how it can be
used...

Marc
 

Phil Johnson

Hi Marc,

Thanks for the code, I will give it a try.

Also, thanks for the suggestion re the large files being stored as files and
the path being stored in the database. I don't think we would go for it
because we use the full-text searching in SQL Server, but it could be an
option if all else fails.

Enjoy the weekend :)

--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Phil Johnson

I found out what the issue was on this one... when the datatype had been
changed, the stored procedure to write the file in chunks was modified to
use a newer call than WRITETEXT, and the WHERE clause was omitted.

This meant that for the appends, it was appending the buffer chunk to EVERY
file in the database, which was hundreds.
Thanks for all your help on this one though Marc.
--
Regards,

Phillip Johnson (MCSD For .NET)
PJ Software Development
www.pjsoftwaredevelopment.com
 

Marc Gravell

<q>
and the Where statement was omitted.
</q>

You're going to kick yourself, but I already pointed this out on the
24th:

<q>
SP: [note btw that you missed a WHERE clause from yours...]
</q>

Glad it started working, though...

Marc
 
