Compact .NET & database?


Enrico Bizzarri

Hi everybody,
I've used CSV and SQLCE.

I pass data from the PC to the Pocket PC via CSV, then I load all my records
into a SQLCE database. I'm using 19,000 records and I can't use SQL Server 2000.

I think it's a good choice; it's quite fast and easy to use.

Note that I don't have to load all the records into the db every time I use my
app.
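For illustration, the PC-side export can be as simple as writing one semicolon-separated
line per record plus an end marker (a sketch only; the file name and columns are invented,
and the "[endTable]" marker and semicolon separator are taken from the loader loop posted
later in this thread):

// PC-side export sketch (System.Data, System.IO); GetTableToTransfer is a
// hypothetical helper returning the rows to send to the device.
DataTable table = GetTableToTransfer();

using (StreamWriter sw = new StreamWriter(@"C:\export\table.csv"))
{
    foreach (DataRow row in table.Rows)
    {
        // one record per line, fields separated by semicolons
        sw.WriteLine("{0};{1};{2}", row[0], row[1], row[2]);
    }
    sw.WriteLine("[endTable]");   // end-of-table marker the device loader looks for
}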

HTH

Bye

Enrico
 

Thore Berntsen

Doesn't loading 19,000 records into a SQLCE database take very long? If not,
how do you do it?

Thore Berntsen
 

Enrico Bizzarri

I've reached the speed of above 600 records per minute. It's the fastest
solution I've found. Really not "fast" but fastest...

I do:

line = sr.ReadLine()

Do Until line = "[endTable]"

    mLine = line.Split(";"c)

    sql = "INSERT INTO table "
    sql = sql & "("
    sql = sql & "field1, field2, field3"
    sql = sql & ") VALUES ("
    sql = sql & "'" & mLine(0) & "', "
    sql = sql & "'" & mLine(1) & "', "
    sql = sql & mLine(2)
    sql = sql & ");"

    cmd.CommandText = sql
    cmd.ExecuteNonQuery()
    line = sr.ReadLine()
Loop

HTH

Bye

Enrico
 

Enrico Bizzarri

I've reached the speed of above 600 records per minute. It's the fastest
solution I've found. Really not "fast" but fastest...

Sorry, I didn't sleep much last night - it's not "above", it's "near 600",
sometimes 610.

Bye

Enrico
 

Jon Skeet [C# MVP]

Enrico Bizzarri said:
I've reached the speed of above 600 records per minute. It's the fastest
solution I've found. Really not "fast" but fastest...

I do:

line = sr.ReadLine()

Do Until line = "[endTable]"

mLine = line.Split(";"c)

sql = "INSERT INTO table "
sql = sql & "("
sql = sql & "field1, field2, field3"
sql = sql & ")VALUES("
sql = sql & "'" & mLine(0) & "', "
sql = sql & "'" & mLine(1) & "', "
sql = sql & "" & mLine(2) & ""
sql = sql & ");"

cmd.CommandText = sql
cmd.ExecuteNonQuery()
line = sr.ReadLine()
Loop

600 records per minute is significantly slower than we're seeing
(although I don't know what device you're using). One of the reasons is
probably that the above isn't able to use the fact that it's the same
basic SQL every time. Change it to (C# code for my own sanity):

cmd.CommandText = "INSERT INTO table (field1,field2,field3) "+
"VALUES (?, ?, ?)";
// Fix this bit up, obviously.
cmd.Parameters.Add ("field1", SqlDbType.VarChar, 20);
cmd.Parameters.Add ("field2", SqlDbType.VarChar, 10);
cmd.Parameters.Add ("field3", SqlDbType.VarChar, 30);

string line;
char[] splitCharacters = ";".ToCharArray();
while ( (line=sr.ReadLine()) != "[endTable]")
{
string[] values = line.Split (splitCharacters);
for (int i=0; i < 3; i++)
{
cmd.Parameters.Value = values;
}
cmd.ExecuteNonQuery();
}

We use something similar, and on my XDA2 it can insert just under 3000
records in about 25-30 seconds.
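For context, the cmd and sr above have to be created somewhere; a minimal sketch of that
setup (the .sdf and CSV paths are placeholders, not taken from Jon's post) looks something
like:

// Setup assumed by the loop above (System.Data.SqlServerCe, System.IO).
SqlCeConnection conn = new SqlCeConnection(@"Data Source=\My Documents\data.sdf");
conn.Open();

SqlCeCommand cmd = conn.CreateCommand();
StreamReader sr = new StreamReader(@"\My Documents\table.csv");

try
{
    // ... parameterised command and read loop as shown above ...
}
finally
{
    sr.Close();
    cmd.Dispose();
    conn.Close();
}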
 

Thore Berntsen

Thank you for that useful tip. I tried Enrico's method earlier and it was too
slow for me; now I will try your method.
 

Enrico Bizzarri

Wow....

Some time ago I spent hours in this newsgroup talking about this, and the
best I managed was a "600-record turtle"(TM).

Now I'll try to translate it into VB (for my own sanity :)) and I'll post the
results.

I never stop learning from reading the NG.

Bye

Enrico
 

Paul [Paradise Solutions]

Jon

I assume this is going into a database without table relationships / pk
/ indexes?

Paul
 

Jon Skeet [C# MVP]

Paul said:
I assume this is going into a database without table relationships / pk
/ indexes?

Nope - it's a table with a PK of 4 fields. Removing the PK didn't help
significantly.

One interesting thing to note - I found that a numeric(7,2) field is
making a *big* difference in speed - it's taking about a sixth of the
total time. (An nvarchar(60) field, however, was hardly making any
difference.)
 

Enrico Bizzarri

Done.

I've translated the code to VB.

I now get above 750 records per minute (note that I have 12 fields to insert
into).
What about numeric? I have 3 decimal fields and 9 nvarchar fields. What can I
do to improve further?

Thanks

Bye
BiZ
 

Paul [Paradise Solutions]

Jon said:
Nope - it's a table with a PK of 4 fields. Removing the PK didn't help
significantly.

One interesting thing to note - I found that a numeric(7,2) field is
making a *big* difference in speed - it's taking about a sixth of the
total time. (An nvarchar(60) field, however, was hardly making any
difference.)

Hmm.. I have my DB set up with table relationships and a fair number of
indexes, and I use parameterised commands, yet my insert speed is
*appalling*. Device is a 3760 from Compaq.
When you say the numeric is making the difference, do you mean that it's
your bottleneck? If so, perhaps that's some of my problem - some tables
have 5 or 6 money / numeric(10,2) fields.
 

Jon Skeet [C# MVP]

Enrico Bizzarri said:
I've translated the code to VB.

I now get above 750 records per minute (note that I have 12 fields to insert
into).
What about numeric? I have 3 decimal fields and 9 nvarchar fields. What can I
do to improve further?

You might want to "play around" finding out which of the fields is
expensive - I found that numeric fields were relatively expensive, for
example. Not sure what you can do about that, really.
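A crude way to do that measuring is to time whole runs and compare (a sketch;
Environment.TickCount is coarse but fine for runs of thousands of rows, and RunImport is a
hypothetical wrapper around the parameterised insert loop):

int start = Environment.TickCount;

int rows = RunImport();   // e.g. once with the decimal column, once with it swapped to nvarchar

int elapsedMs = Environment.TickCount - start;
MessageBox.Show(rows + " rows in " + elapsedMs + " ms");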

I didn't mention one of the other advantages of using parameters, btw -
you don't need to worry about quoting etc.
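For example (a sketch, not from the thread): with the string-building version, a value
containing an apostrophe produces invalid SQL unless you escape it yourself, whereas a
parameter takes the value as-is:

string value = "O'Brien";

// Concatenation builds ...VALUES('O'Brien', ... - a syntax error unless you double the quote:
string escaped = value.Replace("'", "''");

// With a parameter the value is passed as data, so no quoting or escaping is needed:
cmd.Parameters["field1"].Value = value;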
 

Jon Skeet [C# MVP]

Paul said:
Hmm.. I have my DB set up with table relationships and a fair number of
indexes, and I use parameterised commands, yet my insert speed is
*appalling*. Device is a 3760 from Compaq.

Unfortunately I've no idea how fast that is - I'm a relative newcomer
to this field.
When you say the numeric is making the difference, do you mean that it's
your bottleneck? If so, perhaps that's some of my problem - some tables
have 5 or 6 money / numeric(10,2) fields.

That could well make a big difference. You might want to consider just
*trying* making them nvarchar(12) instead, just for the sake of insert
speed testing.
 

Graham McKechnie

Jon & Enrico,

Maybe I can help here, as I do have some experience moving large amounts (for
a PDA) of drug data. I'm moving, in CSV form, approx 130,000 records from 19
different tables. The drugs database comes in at 13.2MB on the device and
takes 30 minutes to build on my 5450 - approx. 4,300 rows/minute. The tables
vary considerably in data types and individual table sizes. I haven't
bothered doing any exact time testing on the individual tables, but on, say,
one of the large tables that contains only 4 int columns, the record counter
on the window really speeds up, as you would expect, compared to the products
table, which has some reasonably large nvarchar fields and quite a number of
foreign keys, where you can watch the record counter slow down quite
considerably. All the tables have PKs, most have various FKs and constraints,
and most of the large tables have multiple indexes. So the figure of 4,300
rows/minute could obviously go up on less complicated data and could even be
lower on more complicated data - I think I could then agree with Jon's claim
of 6,000 rows/minute.

I'm coding it in a similar style to Jon's example. I build the SQL statement
with the "?" placeholders as Jon did, add the SqlCeParameters to the command
object and then call Prepare. Jon may have accidentally left this out of his
example code snippet. All of the above is outside the main StreamReader read
loop; then I just loop as Jon did. I usually have to massage the data on the
way in before I assign each Parameter.Value, so I most likely lose a bit of
speed at that point too.

I've read somewhere that calling Prepare may be unnecessary because
ExecuteNonQuery does it internally; I'm not sure about that and have never
tested without the command.Prepare(). Calling Prepare at least does some
more error checking on your command object if you happen to screw up a
parameter. I reckon it could also be made to do more - see my comments on
the CE newsgroup.
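In outline, what Graham describes looks like this (a sketch, not his actual code - the
table and column names are invented, the cmd and values names are reused from Jon's
snippet, and note Jon's caution about Prepare further down the thread):

// Before the read loop: command text with "?" placeholders, typed parameters, then Prepare.
cmd.CommandText = "INSERT INTO products (name, strength, packSize) VALUES (?, ?, ?)";
cmd.Parameters.Add("name", SqlDbType.NVarChar, 60);
cmd.Parameters.Add("strength", SqlDbType.Int);
cmd.Parameters.Add("packSize", SqlDbType.Int);
cmd.Prepare();

// Inside the loop: massage the CSV values, set each Parameter.Value, then execute.
cmd.Parameters["name"].Value = values[0].Trim();
cmd.Parameters["strength"].Value = int.Parse(values[1]);
cmd.Parameters["packSize"].Value = int.Parse(values[2]);
cmd.ExecuteNonQuery();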

I've saved the worst for last. The big kicker here is the machines - they
really do vary in speed in a big way. I've also borrowed a 2210, and the same
test came in at 25 minutes (5,200/min). However, my colleagues at head office
have a Dell Axim - I think it's a 5, not sure - and the same test took over
2 hours (1,100/min).

I'd love to know why, because the specs of my 5450 and the Dell appear to be
the same.

So I guess I know what test I'm going to run on a machine before I buy a new
one<g>

Regards
Graham



 

Ginny Caughey [MVP]

Graham,

There is also quite a bit of speed difference depending on the particular
flash card you're writing to, for data that is stored on flash.
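Putting the .sdf on the card rather than in main memory is just a matter of where the
connection string points (a sketch; the folder name varies by device and language):

// Database file on the storage card rather than in main memory.
SqlCeConnection conn = new SqlCeConnection(@"Data Source=\Storage Card\mydb.sdf");
conn.Open();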

--
Ginny Caughey
.NET Compact Framework MVP

Have an opinion on the effectiveness of Microsoft Embedded newsgroups?
Let Microsoft know!
https://www.windowsembeddedeval.com/community/newsgroups



 

Graham McKechnie

Ginny,

Thanks for the link - it was well worth watching. I thought I'd found an
answer to the speed differences I mentioned in my last post, when he compared
the three processors (250, 255 and 263), but I'm back to square one again - I
just checked my iPaq and it's a 250, so that doesn't explain why the Dell is
so slow. That site for card tests is very useful as well.

Does anyone know what machines and models are running the 263?

Graham
 

Jon Skeet [C# MVP]

Graham McKechnie said:
I've read somewhere that calling Prepare may be unnecessary because
ExecuteNonQuery does it internally; I'm not sure about that and have never
tested without the command.Prepare(). Calling Prepare at least does some
more error checking on your command object if you happen to screw up a
parameter. I reckon it could also be made to do more - see my comments on
the CE newsgroup.

Prepare doesn't speed things up, and it can cause corruption - I think that
happens if you call it before you set any parameter values. Basically there's
a bug somewhere which causes this. Apparently Prepare() is called internally
when it needs to be. Whether this will be the case in future versions or not,
I don't know.
 
