We are developing a C# application for a web-service client. This will run on Windows XP PCs.
One of the fields returned by the web service is a DateTime field.
I know this is an older question, but I ran into a similar situation, and I wanted to share what I had found for future searchers, possibly including myself :).
DateTime.Parse() can be tricky -- see here for example.
If the DateTime is coming from a web service or some other source with a known format, you might want to consider something like

DateTime.ParseExact(dateString,
    "MM/dd/yyyy HH:mm:ss",
    CultureInfo.InvariantCulture,
    DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal)

or, even better, DateTime.TryParseExact(...).
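For reference, here's a minimal sketch of the TryParseExact variant; the date string and format are hypothetical stand-ins for whatever your web service actually returns:

using System;
using System.Globalization;

class ParseDemo
{
    static void Main()
    {
        // Hypothetical sample value in the known wire format.
        string dateString = "04/12/2024 16:30:00";

        // TryParseExact avoids the exception cost on malformed input.
        DateTime result;
        bool ok = DateTime.TryParseExact(
            dateString,
            "MM/dd/yyyy HH:mm:ss",
            CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal,
            out result);

        Console.WriteLine(ok ? result.ToString("o") : "unparseable");
    }
}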
The AssumeUniversal flag tells the parser that the date/time is already UTC; the combination of AssumeUniversal and AdjustToUniversal tells it not to convert the result to "local" time, which it will try to do by default. (I personally try to deal exclusively with UTC in the business / application / service layer(s) anyway. But bypassing the conversion to local time also speeds things up -- by 50% or more in my tests; see below.)
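To make the flag behavior concrete, here's a small sketch (the sample values are hypothetical) showing the difference in the Kind of the parsed result:

using System;
using System.Globalization;

class KindDemo
{
    static void Main()
    {
        string dateString = "04/12/2024 16:30:00"; // hypothetical UTC timestamp
        string format = "MM/dd/yyyy HH:mm:ss";

        // AssumeUniversal alone: the input is treated as UTC, then converted
        // to the machine's local time; the result has Kind == Local.
        DateTime local = DateTime.ParseExact(dateString, format,
            CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal);

        // Both flags: the input is treated as UTC and left as UTC; no
        // conversion happens, and the result has Kind == Utc.
        DateTime utc = DateTime.ParseExact(dateString, format,
            CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);

        Console.WriteLine(local.Kind); // Local
        Console.WriteLine(utc.Kind);   // Utc
    }
}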
Here's what we were doing before:
DateTime.Parse(dateString, new CultureInfo("en-US"))
We had profiled the app and found that DateTime.Parse represented a significant percentage of CPU usage. (Incidentally, the CultureInfo constructor was not a significant contributor to CPU usage.)
So I set up a console app to parse a date/time string 10,000 times in a variety of ways. Bottom line:

Parse()                                  10 sec
ParseExact() (converting to local)       20-45 ms
ParseExact() (not converting to local)   10-15 ms

... and yes, the results for Parse() are in seconds, whereas the others are in milliseconds.
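For anyone who wants to reproduce the comparison, a minimal sketch of that kind of console harness might look like the following; the sample string, format, and Stopwatch-based timing are my assumptions, not the original code:

using System;
using System.Diagnostics;
using System.Globalization;

class ParseBenchmark
{
    static void Main()
    {
        const int iterations = 10000;
        const string sample = "04/12/2024 16:30:00"; // hypothetical input
        const string format = "MM/dd/yyyy HH:mm:ss";

        // Culture-based Parse, as in the original code.
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            DateTime.Parse(sample, new CultureInfo("en-US"));
        Console.WriteLine("Parse():                  {0} ms", sw.ElapsedMilliseconds);

        // ParseExact, converting the result to local time.
        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            DateTime.ParseExact(sample, format, CultureInfo.InvariantCulture,
                DateTimeStyles.AssumeUniversal);
        Console.WriteLine("ParseExact() to local:    {0} ms", sw.ElapsedMilliseconds);

        // ParseExact, keeping the result as UTC (no conversion).
        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            DateTime.ParseExact(sample, format, CultureInfo.InvariantCulture,
                DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);
        Console.WriteLine("ParseExact() kept as UTC: {0} ms", sw.ElapsedMilliseconds);
    }
}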