Possible to optimize this?


Robin Tucker

I have a slight bottleneck in my code, the simplicity of which leads me to
believe it is either insurmountable or easily sped up. I have an
ArrayList of numbers (integers, which are in fact database primary keys).
In order to pass all of these keys into a stored procedure, I generate one
long space-delimited string by concatenating the IDs and pass it into SQL
Server as an NTEXT parameter (the stored procedure then breaks the NTEXT value
into a table and joins it with another, but that's beside the point). When I
have, say, 16,000 primary keys in my list, the section below can take quite
a long time (seconds!). Is this reasonable? If not, can I speed it up at all?
I'm guessing the compiler is quite smart at optimization and that something
as simple as this is already more or less maxed out.

My ideas are: perhaps iterating the collection with "For Each" might be
slower than directly indexing the array. Also, perhaps I should generate a
string of the correct length and then modify it rather than relying on the
runtime to resize or re-create the string on each iteration (but strings are
immutable, right?). Perhaps there is a different way I can convert an
integer to a string that is more efficient than "ToString".

Any ideas?



(theNodeIDs is an ArrayList, theNodeIDsString is a string)....

For Each theNodeID As Integer In theNodeIDs
    theNodeIDsString += theNodeID.ToString() + " "
Next




Thanks,


Robin
 

Marina

One way to optimize would be to use a StringBuilder for the concatenation.
You should see a significant improvement.
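
For what it's worth, another way to avoid the repeated concatenation is to put
the IDs into a string array and join it once with String.Join. A minimal
sketch, reusing the variable names from the original post:

' Sketch: fill a pre-sized string array, then join with a single call.
Dim ids(theNodeIDs.Count - 1) As String
For i As Integer = 0 To theNodeIDs.Count - 1
    ids(i) = theNodeIDs(i).ToString()
Next
theNodeIDsString = String.Join(" ", ids)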
 

Herfried K. Wagner [MVP]

Robin,

Robin Tucker said:
I have a slight bottleneck in my code, the simplicity of which leads me to
believe it is either insurmountable or easily sped up. I have an
ArrayList of numbers (integers, which are in fact database primary keys).
In order to pass all of these keys into a stored procedure, I generate one
long space-delimited string by concatenating the IDs and pass it into SQL
Server as an NTEXT parameter (the stored procedure then breaks the NTEXT
value into a table and joins it with another, but that's beside the point).
When I have, say, 16,000 primary keys in my list, the section below can take
quite a long time (seconds!). Is this reasonable? If not, can I speed it
up at all? I'm guessing the compiler is quite smart at optimization and that
something as simple as this is already more or less maxed out.

My ideas are: perhaps iterating the collection with "For Each" might be
slower than directly indexing the array.

The difference is marginal.
Also, perhaps I should generate a string of the correct length and then
modify it rather than relying on the runtime to resize or re-create the
string on each iteration (but strings are immutable, right?).

That's the bottleneck. Strings are immutable, so your code creates a new
string object on every iteration. Instead of a string, use a
'StringBuilder':
For Each theNodeID As Integer In theNodeIDs
    theNodeIDsString += theNodeID.ToString() + " "
Next

\\\
' Requires an "Imports System.Text" at the top of the file.
Dim sb As New StringBuilder()
For Each NodeID As Integer In NodeIDs
    sb.Append(NodeID.ToString())
    sb.Append(" ")
Next NodeID
.... = sb.ToString()
///
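
If you want to go a step further, the 'StringBuilder' can also be given an
initial capacity so its internal buffer never has to grow, which ties in with
your idea of generating a string of the right length up front. A sketch,
assuming roughly 12 characters per ID (that estimate is an assumption, not
something from your post):

\\\
' Sketch: pre-size the buffer; the 12-characters-per-ID figure is
' only an assumed estimate.
Dim sb As New StringBuilder(NodeIDs.Count * 12)
For Each NodeID As Integer In NodeIDs
    sb.Append(NodeID)   ' Append(Integer) skips the intermediate string
    sb.Append(" "c)
Next NodeID
.... = sb.ToString()
///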
 

Robin Tucker

Thanks both for the above. The operation was taking ~15 seconds but is now
to all intents and purposes instantaneous :))

Robin
 
