Question
I have two programs written in .NET that communicate with acquisition hardware via USB. One device is asynchronous: it tells me, down to the microsecond, when each data point was recorded. The other is synchronous: points are recorded at a fixed, known rate, so the sample number tells me when each point was taken.
Until now I have used DateTime.Now to record when acquisition started; I then use the millisecond/microsecond times reported by the asynchronous device, and the sample numbers from the synchronous device, to compute a computer time for each data point (by adding the corresponding number of ticks to the start time).
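For concreteness, here is a minimal sketch of that tick arithmetic; the variable values and the sample rate of the synchronous device are hypothetical and only for illustration:

```csharp
using System;

class TimestampExample
{
    static void Main()
    {
        // Wall-clock start time captured when acquisition begins (the current approach).
        DateTime acquisitionStart = DateTime.Now;

        // Asynchronous device: it reports elapsed microseconds for each data point.
        long reportedMicroseconds = 1234567;                   // hypothetical value
        // 1 DateTime tick = 100 ns, so 1 microsecond = 10 ticks.
        DateTime asyncPointTime = acquisitionStart.AddTicks(reportedMicroseconds * 10);

        // Synchronous device: time is derived from the sample number and a known sample rate.
        int sampleNumber = 5000;                               // hypothetical value
        double sampleRateHz = 1000.0;                          // hypothetical value
        DateTime syncPointTime = acquisitionStart.AddTicks(
            (long)(sampleNumber / sampleRateHz * TimeSpan.TicksPerSecond));

        Console.WriteLine($"Async point: {asyncPointTime:O}");
        Console.WriteLine($"Sync point:  {syncPointTime:O}");
    }
}
```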
Now I must synchronize the times for the two acquisitions. The two programs run on the same computer, but from what I've been reading, DateTime.Now has a resolution of roughly 10 to 15 milliseconds according to some documentation (although several people seem to achieve substantially better than that, arguably only because they call it right after a Thread.Sleep).
So as far as I know, my current method of:
- User starts acquisition on device 1. DateTime.Now is called
- User starts acquisition on device 2. DateTime.Now is called
can give me acquisition start times that are off by as much as 15 milliseconds.
I can think of taking a DateTime.Now at program start-up and using Stopwatch to record the exact elapsed time between program start and acquisition start. But that would only work if I merged the two programs together.
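A minimal sketch of that single-process idea, assuming both acquisitions run in the same program (all names are illustrative):

```csharp
using System;
using System.Diagnostics;

class SingleProcessSync
{
    static readonly DateTime ProgramStart = DateTime.Now;    // wall-clock anchor, read once
    static readonly Stopwatch Clock = Stopwatch.StartNew();  // high-resolution elapsed time

    // Returns an acquisition start time whose offset from ProgramStart is
    // measured by the Stopwatch instead of a second DateTime.Now call.
    static DateTime AcquisitionStartTime() => ProgramStart + Clock.Elapsed;

    static void Main()
    {
        DateTime device1Start = AcquisitionStartTime();
        // ... user starts the second acquisition some time later ...
        DateTime device2Start = AcquisitionStartTime();

        // The relative offset between the two starts is now high-resolution.
        Console.WriteLine(device2Start - device1Start);
    }
}
```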
Any other ideas?
Answer 1:
Stopwatch has a static GetTimestamp() method that you can use for the sync. Under the hood it calls the Windows API (QueryPerformanceCounter), so the counter is shared by all processes on the machine. You get back a tick count of type long, and Stopwatch.Frequency gives the number of ticks per second on your machine if you want to convert to seconds.
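A minimal sketch of how each process could record and compare these timestamps; how the two processes exchange their recorded tick values (e.g. over IPC or a shared file) is assumed and not shown:

```csharp
using System;
using System.Diagnostics;

class CrossProcessTimestamp
{
    static void Main()
    {
        // Each process records this at its own acquisition start. The
        // underlying counter is machine-wide, so raw tick values from
        // both processes lie on the same time axis and can be compared.
        long acquisitionStartTicks = Stopwatch.GetTimestamp();

        // ... acquisition runs ...

        long laterTicks = Stopwatch.GetTimestamp();

        // Convert a tick difference to seconds using Stopwatch.Frequency
        // (ticks per second on this machine).
        double elapsedSeconds = (laterTicks - acquisitionStartTicks)
                                / (double)Stopwatch.Frequency;

        Console.WriteLine($"Start ticks: {acquisitionStartTicks}");
        Console.WriteLine($"Elapsed: {elapsedSeconds:F6} s");
    }
}
```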
Source: https://stackoverflow.com/questions/5171349/time-sync-between-two-processes-net