char[] to DateTime

Mark

Is there a way to convert a char[] to a DateTime without first converting to
a string and using DateTime.Parse or ParseExact? I'd like to keep reusing the
same char[] buffer rather than convert it to a string each time, since strings
are immutable and each one has to be garbage collected. In short, I'm looking
for a conversion straight from char[] to DateTime, with no intermediate step.

Thanks in advance

Mark
 
What kind of application are you writing that the construction of one
string and one GC is going to affect you that much?

By the way, char[] needs to be GC'd as well.
 
Bruce Wood said:
What kind of application are you writing that the construction of one
string and one GC is going to affect you that much?

Agreed.

By the way, char[] needs to be GC'd as well.

Yes, but it can be changed and reused without creating a new object. I
once wrote a MIDP version of Tetris which, after starting up, didn't
create a single new object until you quit. Of course, that was in an
environment where it really, really mattered!
 
Mark said:
Is there a way to convert a char[] to a DateTime without first converting to
a string and using DateTime.Parse or ParseExact? I'd like to keep reusing the
same char[] buffer rather than convert it to a string each time, since strings
are immutable and each one has to be garbage collected. In short, I'm looking
for a conversion straight from char[] to DateTime, with no intermediate step.

As Bruce said, is it really likely to be a problem to have the string
garbage collected? So long as it stays in generation 0, it's very
unlikely to make much difference.

Of course, you *could* write your own parser for DateTime, and that may
be reasonable if you've got a fixed format - especially if it's one
which is very straightforward, like yyyyMMddHHmmss, but I would try to
get some performance data which shows the "normal" way to be a
bottleneck before resorting to things like that.
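
For what it's worth, a hand-rolled parser for that sort of fixed
yyyyMMddHHmmss layout might look something like the sketch below. It's
only an illustration (FixedDateParser and ReadDigits are made-up names,
and it assumes the fourteen digits start at a known offset in the
buffer); there's no validation beyond what the DateTime constructor
itself enforces.

using System;

static class FixedDateParser
{
    // Sketch only: parse a fixed yyyyMMddHHmmss value straight out of a
    // char[] buffer, with no intermediate string and no culture handling.
    public static DateTime Parse(char[] buffer, int offset)
    {
        int year   = ReadDigits(buffer, offset,      4);
        int month  = ReadDigits(buffer, offset + 4,  2);
        int day    = ReadDigits(buffer, offset + 6,  2);
        int hour   = ReadDigits(buffer, offset + 8,  2);
        int minute = ReadDigits(buffer, offset + 10, 2);
        int second = ReadDigits(buffer, offset + 12, 2);
        return new DateTime(year, month, day, hour, minute, second);
    }

    // Reads 'length' decimal digits starting at 'start' as an int.
    static int ReadDigits(char[] buffer, int start, int length)
    {
        int value = 0;
        for (int i = 0; i < length; i++)
        {
            value = value * 10 + (buffer[start + i] - '0');
        }
        return value;
    }
}

The constructor will still throw on out-of-range values, but nothing
checks that the characters really are digits, so garbage in the buffer
could silently produce a wrong date.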
 
It's a data pump from a legacy application into the database. The client
writes tab-delimited records over a socket, and the program processes the
bytes, splitting them into fields and then rows. It writes about 2 million
records for just one table, and there are 50 tables in all, usually much
smaller, ranging from 10 to 10,000 rows. The profiler shows that
DateTime.Parse is a hot spot.
 
I'm impressed that you've profiled it, but surprised at the results.
All of that disk I/O and the program is hung up in the CPU? Oh well,
profilers don't lie.

If DateTime.Parse() is a hot spot, my first question would be whether
your dates and times are in a consistent format. If they are, I would
write my own parse routine to read them directly out of the char[]
array. DateTime.Parse() is probably slow because it has to cope with
multiple input formats, not to mention internationalization concerns.
If you know, for example, that all of your dates are MM/DD/YY hh:mm:ss,
or all YYYY-MM-DD hh:mm:ss, or can even whittle it down to two or three
possible formats, writing your own (specific) routine would probably
speed things up considerably.
 
Bruce Wood said:
I'm impressed that you've profiled it, but surprised at the results.
All of that disk I/O and the program is hung up in the CPU? Oh well,
profilers don't lie.

If DateTime.Parse() is a hot spot, my first question would be whether
your dates and times are in a consistent format. If they are, I would
write my own parse routine to read them directly out of the char[]
array. DateTime.Parse() is probably slow because it has to cope with
multiple input formats, not to mention internationalization concerns.
If you know, for example, that all of your dates are MM/DD/YY hh:mm:ss,
or all YYYY-MM-DD hh:mm:ss, or can even whittle it down to two or three
possible formats, writing your own (specific) routine would probably
speed things up considerably.

I ran across this exact situation a few years ago.
I was processing transaction data in a loop.
I determined that DateTime.Parse was the bottleneck.
Since the DateTime format was fixed in the incoming data I parsed it myself
and used:

public DateTime(int year, int month, int day, int hour, int minute, int second)

I don't remember the exact speed increase that I got, but it was
substantial.
Say from 1-2k/sec up to 5-10k/sec.
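
The actual format isn't given above, so purely as an illustration of that
approach (parse the fields yourself and hand them to the six-int DateTime
constructor), here is a rough sketch that assumes a fixed
"yyyy-MM-dd HH:mm:ss" layout, plus a simple Stopwatch loop so you can
measure the difference on your own data instead of taking anyone's
numbers on faith.

using System;
using System.Diagnostics;
using System.Globalization;

class ParseComparison
{
    // Sketch only: hand-rolled parser for a fixed "yyyy-MM-dd HH:mm:ss"
    // field read straight from a char[] at a given offset. No culture
    // handling and no validation beyond the constructor's range checks.
    static DateTime ParseFixed(char[] buf, int offset)
    {
        return new DateTime(
            Digits(buf, offset, 4),        // year
            Digits(buf, offset + 5, 2),    // month
            Digits(buf, offset + 8, 2),    // day
            Digits(buf, offset + 11, 2),   // hour
            Digits(buf, offset + 14, 2),   // minute
            Digits(buf, offset + 17, 2));  // second
    }

    // Reads 'length' decimal digits starting at 'start' as an int.
    static int Digits(char[] buf, int start, int length)
    {
        int value = 0;
        for (int i = 0; i < length; i++)
        {
            value = value * 10 + (buf[start + i] - '0');
        }
        return value;
    }

    static void Main()
    {
        char[] field = "2005-06-30 13:45:59".ToCharArray();
        const int iterations = 1000000;
        long ticks = 0;

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            ticks += DateTime.ParseExact(new string(field),
                "yyyy-MM-dd HH:mm:ss",
                CultureInfo.InvariantCulture).Ticks;
        }
        sw.Stop();
        Console.WriteLine("ParseExact:  {0} ms", sw.ElapsedMilliseconds);

        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            ticks += ParseFixed(field, 0).Ticks;
        }
        sw.Stop();
        Console.WriteLine("Hand-rolled: {0} ms", sw.ElapsedMilliseconds);

        // 'ticks' is accumulated only so the parsed values aren't optimised away.
        Console.WriteLine(ticks);
    }
}

The absolute numbers will vary from machine to machine; the point is only
to confirm that the custom routine really pays off before committing to it.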

Now, the actual application only averaged ~100 trans/sec.
You could argue that the speed increase was irrelevant, but that is not
true.

The data came in in batches, so each batch was processed in about a third of
the time. And my application spent more time sleeping instead of burning CPU
cycles, so other apps could run on the same box.

In my code I put a big comment explaining WHY I was doing this, to reduce the
chances of somebody FIXING my code later.

Bill Butler
 