clock

Clock drift on Windows

社会主义新天地 submitted on 2019-11-29 01:07:23
Question: I've developed a Windows service which tracks business events. It uses the Windows clock to timestamp events. However, the underlying clock can drift quite dramatically (e.g. losing a few seconds per minute), particularly when the CPUs are working hard. Our servers use the Windows Time Service to stay in sync with domain controllers, which uses NTP under the hood, but the sync frequency is controlled by domain policy, and in any case even syncing every minute would still allow significant …
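The excerpt cuts off before the answers, but a common mitigation, offered here only as a sketch and not as the original poster's solution, is to record a monotonic tick count alongside each wall-clock timestamp, so intervals between events are immune to drift and to time-service step corrections. A minimal C++ illustration, where BusinessEvent and the event names are made-up placeholders:

    #include <chrono>
    #include <iostream>
    #include <string>
    #include <utility>

    // Hypothetical event record (not from the question): it stores both the wall
    // clock (subject to drift and time-service corrections) and a monotonic clock
    // (never adjusted), so intervals between events stay trustworthy.
    struct BusinessEvent {
        std::string name;
        std::chrono::system_clock::time_point wall;       // what the OS thinks the time is
        std::chrono::steady_clock::time_point monotonic;  // drift-free interval reference
    };

    BusinessEvent make_event(std::string name) {
        return BusinessEvent{std::move(name),
                             std::chrono::system_clock::now(),
                             std::chrono::steady_clock::now()};
    }

    int main() {
        BusinessEvent a = make_event("order received");
        BusinessEvent b = make_event("order processed");

        // Elapsed time measured on the monotonic clock is unaffected by the wall
        // clock drifting or being stepped back into line by the time service.
        auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
            b.monotonic - a.monotonic);
        std::cout << "elapsed between events: " << elapsed.count() << " ms\n";
    }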

Clock synchronization quality on Windows Azure?

独自空忆成欢 submitted on 2019-11-29 00:04:24
Question: I am looking for quantitative estimates of the clock offsets between VMs on Windows Azure, assuming that all VMs are hosted in the same datacenter. I am guesstimating that the average clock offset between one VM and another is below 10 seconds, but I am not even sure that is a guaranteed property of the Azure cloud. Does anybody have quantitative measurements on that matter?

Answer 1: I have finally settled on doing some experiments of my own. A few facts concerning the experiment protocol: Instead of looking for …
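As background for reading such measurements (not part of the original answer), the standard way to estimate the offset between two machines is an NTP-style four-timestamp exchange: the local side records t0 on send and t3 on receive, the remote side records t1 on receipt and t2 on reply, and the offset estimate is ((t1 - t0) + (t2 - t3)) / 2, assuming roughly symmetric network delay. A small C++ sketch of just that arithmetic, with all values invented for illustration:

    #include <chrono>
    #include <cstdio>

    using namespace std::chrono;

    // NTP-style offset estimate between a local and a remote clock.
    // t0: local send, t1: remote receive, t2: remote reply, t3: local receive.
    // Assumes the network delay is roughly symmetric in both directions.
    milliseconds estimate_offset(system_clock::time_point t0, system_clock::time_point t1,
                                 system_clock::time_point t2, system_clock::time_point t3) {
        auto offset = ((t1 - t0) + (t2 - t3)) / 2;
        return duration_cast<milliseconds>(offset);
    }

    int main() {
        // Illustrative values only: remote clock ~150 ms ahead, ~20 ms round trip.
        auto t0 = system_clock::now();
        auto t1 = t0 + milliseconds(160);   // arrival stamped by the remote VM
        auto t2 = t1 + milliseconds(1);     // reply stamped by the remote VM
        auto t3 = t0 + milliseconds(21);    // arrival back on the local VM
        std::printf("estimated offset: %lld ms\n",
                    static_cast<long long>(estimate_offset(t0, t1, t2, t3).count()));
    }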

Makefile : Clock skew detected

泪湿孤枕 submitted on 2019-11-28 23:45:25
My problem is that whenever I try to compile using my Makefile I get the following:

    make: Warning: File `Board.c' has modification time 1.3e+03 s in the future
    gcc -Wall -c -Wvla -lm Board.c -o Board.o
    gcc -Wall -c -Wvla -lm PlayBoard.c -o PlayBoard.o
    gcc -lm ErrorHandle.o Board.o PlayBoard.o -g -o PlayBoard
    make: warning: Clock skew detected. Your build may be incomplete.

My Makefile is:

    CC = gcc
    FLAGS = -Wall -c -Wvla

    PlayBoard: ErrorHandle.o Board.o PlayBoard.o
        $(CC) -lm ErrorHandle.o Board.o PlayBoard.o -g -o $@

    PlayBoard.o: PlayBoard.c Board.o
        $(CC) $(FLAGS) -lm PlayBoard.c -o $@

    Board.o : …
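The warning means that some files carry modification times later than the build machine's current clock, typically because the sources were copied from a machine (or VM/NFS share) whose clock runs ahead. The usual remedies are to fix the clock (e.g. via NTP) or to touch the offending files. As a diagnostic aid, and purely as a sketch that is not part of the original post, a few lines of C++17 can list files whose timestamps lie in the future:

    #include <filesystem>
    #include <iostream>

    namespace fs = std::filesystem;

    int main(int argc, char** argv) {
        // Compare each file's mtime against "now" on the file-time clock;
        // anything in the future is what triggers make's clock-skew warning.
        const fs::path dir = argc > 1 ? argv[1] : ".";
        const auto now = fs::file_time_type::clock::now();
        for (const auto& entry : fs::recursive_directory_iterator(dir)) {
            if (entry.is_regular_file() && entry.last_write_time() > now) {
                std::cout << entry.path() << " has a modification time in the future\n";
            }
        }
    }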

What are the uses of std::chrono::high_resolution_clock?

狂风中的少年 submitted on 2019-11-28 19:39:44
At first I thought it could be used for performance measurements. But it is said that std::chrono::high_resolution_clock may not be steady (is_steady may be false). It is also said that std::chrono::high_resolution_clock may even be an alias of std::chrono::system_clock, which is generally not steady. So I can't measure time intervals with this type of clock, because at any moment the clock may be adjusted and my measurements would be wrong. At the same time I can't convert time points of std::chrono::high_resolution_clock to calendar time, because it doesn't have a to_time_t method. So I can't get …
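For what the excerpt is driving at, the usual advice is to measure intervals with std::chrono::steady_clock and to inspect, at compile time, what high_resolution_clock actually aliases on a given implementation. A short C++17 sketch along those lines (not from the original question):

    #include <chrono>
    #include <iostream>
    #include <type_traits>

    int main() {
        using hrc = std::chrono::high_resolution_clock;

        // On most implementations high_resolution_clock is an alias of either
        // steady_clock or system_clock; these checks report which one you got.
        std::cout << std::boolalpha
                  << "hrc is steady:       " << hrc::is_steady << '\n'
                  << "hrc == steady_clock: "
                  << std::is_same_v<hrc, std::chrono::steady_clock> << '\n'
                  << "hrc == system_clock: "
                  << std::is_same_v<hrc, std::chrono::system_clock> << '\n';

        // For timing intervals, steady_clock is the safe choice: it never jumps.
        auto start = std::chrono::steady_clock::now();
        volatile long long sink = 0;
        for (int i = 0; i < 1'000'000; ++i) sink += i;  // stand-in for real work
        auto stop = std::chrono::steady_clock::now();
        std::cout << "elapsed: "
                  << std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count()
                  << " us\n";
    }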

Create an incrementing timer in seconds in 00:00 format?

你说的曾经没有我的故事 submitted on 2019-11-28 18:59:31
I want to create an incrementing seconds timer like a stopwatch, so I want to be able to display the seconds and minutes incrementing in the format 00:01... Google only brings up 24-hour clock examples; could anyone get me started with an example or tutorial of what I want to do?

Edit: Here is what I have using the Chronometer in Android so far, in onCreate():

    secondsT = 0;
    elapsedTimeBeforePause = 0;
    stopWatch.start();
    startTime = SystemClock.elapsedRealtime();
    stopWatch.setBase(elapsedTimeBeforePause);
    stopWatch.setOnChronometerTickListener(new OnChronometerTickListener() { …
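The question is about Android's Chronometer, but the underlying formatting is simple: minutes = total seconds / 60 and the remaining seconds = total seconds % 60, each zero-padded to two digits. A small console-only C++ sketch of that counting-and-formatting loop, not Android code and not from the original post:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Format an elapsed number of seconds as MM:SS, e.g. 61 -> "01:01".
    void print_mm_ss(long total_seconds) {
        std::printf("%02ld:%02ld\r", total_seconds / 60, total_seconds % 60);
        std::fflush(stdout);
    }

    int main() {
        // Count up once per second from a monotonic start point, like a stopwatch.
        auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < 5; ++i) {   // run only a few ticks as a demo
            auto elapsed = std::chrono::duration_cast<std::chrono::seconds>(
                std::chrono::steady_clock::now() - start);
            print_mm_ss(elapsed.count());
            std::this_thread::sleep_for(std::chrono::seconds(1));
        }
        std::printf("\n");
    }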

How to sync time on host wake-up within VirtualBox?

天大地大妈咪最大 submitted on 2019-11-28 16:14:13
Question: I am running an Ubuntu 12.04-based box inside of Vagrant using VirtualBox. So far, everything is fine, except for one thing: Let's assume that the VM is running. Then the host goes to standby mode. After waking it up again, the VM is still running, but its internal clock continues from where it stopped when the host went down. So this basically means: put the host to sleep for 15 minutes, wake it up again, and the VM's internal clock is 15 minutes late. How can I fix this (setting the time …

Timing algorithm: clock() vs time() in C++

女生的网名这么多〃 submitted on 2019-11-28 15:43:53
Question: For timing an algorithm (approximately in ms), which of these two approaches is better:

    clock_t start = clock();
    algorithm();
    clock_t end = clock();
    double time = (double) (end - start) / CLOCKS_PER_SEC * 1000.0;

or

    time_t start = time(0);
    algorithm();
    time_t end = time(0);
    double time = difftime(end, start) * 1000.0;

Also, from some discussion in the C++ channel on Freenode, I know clock has a very bad resolution, so the timing will be zero for a (relatively) fast algorithm. But which has …
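A third option worth noting next to the two snippets above, offered as a sketch rather than as the accepted answer: std::chrono::steady_clock gives monotonic elapsed wall time with far finer granularity than time(), and, unlike clock() on POSIX systems, it reports elapsed time rather than consumed CPU time. Here algorithm() is only a stand-in for the code under test:

    #include <chrono>
    #include <iostream>

    // Placeholder for the code being timed.
    void algorithm() {
        volatile double x = 0;
        for (int i = 0; i < 5'000'000; ++i) x += i * 0.5;
    }

    int main() {
        auto start = std::chrono::steady_clock::now();
        algorithm();
        auto end = std::chrono::steady_clock::now();

        // duration<double, std::milli> yields fractional milliseconds directly.
        std::chrono::duration<double, std::milli> elapsed = end - start;
        std::cout << "elapsed: " << elapsed.count() << " ms\n";
    }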

How to get the time since midnight in seconds

落花浮王杯 submitted on 2019-11-28 13:44:42
First off, this is a simple question that I'm stuck on in my Java 1 class. It's a static time that I have already set as 8:49:12 "today", and I'm to figure out how many seconds past midnight and "to" midnight this represents: 8 hours, 49 minutes, and 12 seconds. Here is my code now:

    hour = 8;
    minute = 59;
    second = 32;
    System.out.println("The static time used for this program was: "
            + hour + ":" + minute + ":" + second);

My issue is that I have no clue how to get the number of seconds since midnight and to midnight. So basically the output needs to be:

    Number of seconds since midnight:
    Number of seconds to midnight:
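The arithmetic being asked for is plain unit conversion: seconds since midnight = hour * 3600 + minute * 60 + second, and seconds to midnight = 86400 minus that. The class is Java, but the same calculation is shown below as a small C++ sketch using the values from the code above:

    #include <iostream>

    int main() {
        // Same static time as in the question's code.
        int hour = 8, minute = 59, second = 32;

        const int SECONDS_PER_DAY = 24 * 60 * 60;  // 86400
        int sinceMidnight = hour * 3600 + minute * 60 + second;
        int toMidnight = SECONDS_PER_DAY - sinceMidnight;

        std::cout << "Number of seconds since midnight: " << sinceMidnight << '\n'
                  << "Number of seconds to midnight: " << toMidnight << '\n';
    }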

How Do You Programmatically Set the Hardware Clock on Linux?

半腔热情 submitted on 2019-11-28 07:01:40
Linux provides the stime(2) call to set the system time. However, while this will update the system's time, it does not set the BIOS hardware clock to match the new system time. Linux systems typically sync the hardware clock with the system time at shutdown and at periodic intervals. However, if the machine gets power-cycled before one of these automatic syncs, the time will be incorrect when the machine restarts. How do you ensure that the hardware clock gets updated when you set the system time?

Check out the rtc man-page for details, but if you are logged in as root, something like this: …
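The excerpt is cut off before the actual commands, so the following is a sketch of the general approach rather than the answer's own code: from a root shell the conventional command is hwclock --systohc, and programmatically the hardware clock can be written through the RTC_SET_TIME ioctl on the /dev/rtc device:

    #include <cstdio>
    #include <cstring>
    #include <fcntl.h>
    #include <linux/rtc.h>
    #include <sys/ioctl.h>
    #include <time.h>
    #include <unistd.h>

    int main() {
        // The hardware clock is conventionally kept in UTC, so convert the
        // current system time with gmtime_r before writing it to the RTC.
        time_t now = time(nullptr);
        struct tm utc;
        gmtime_r(&now, &utc);

        struct rtc_time rt;
        std::memset(&rt, 0, sizeof rt);
        rt.tm_sec  = utc.tm_sec;
        rt.tm_min  = utc.tm_min;
        rt.tm_hour = utc.tm_hour;
        rt.tm_mday = utc.tm_mday;
        rt.tm_mon  = utc.tm_mon;   // months since January, same as struct tm
        rt.tm_year = utc.tm_year;  // years since 1900, same as struct tm

        int fd = open("/dev/rtc0", O_WRONLY);  // needs root / CAP_SYS_TIME
        if (fd < 0) { std::perror("open /dev/rtc0"); return 1; }
        if (ioctl(fd, RTC_SET_TIME, &rt) < 0) {
            std::perror("RTC_SET_TIME");
            close(fd);
            return 1;
        }
        close(fd);
        std::puts("hardware clock set to current system time (UTC)");
    }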

Synchronization (of clocks) between two remote computers

陌路散爱 submitted on 2019-11-28 05:59:56
I'm looking into writing a simple synchronization feature into my app, and one of the concerns that has popped up is synchronization of time between two remote computers, each with their own clock (in particular concerning the modification dates of files/objects). I'm sure a lot of research has been done on this topic, and I don't want to get too theoretical, but I'm wondering if there are any accepted best practices for minimizing temporal discrepancies between remote clocks. For example, a start is to always use universal time (UTC), as that avoids timezone problems, but there is no guarantee …
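On the UTC point, a common convention (not from the original post) is to store and exchange every timestamp in UTC and convert to local time only for display. A minimal C++ sketch that prints the current time as an ISO 8601 UTC string:

    #include <chrono>
    #include <stdio.h>
    #include <time.h>

    int main() {
        // Take the current time from the system clock and format it as an
        // ISO 8601 UTC string, so machines in different timezones agree on it.
        auto now = std::chrono::system_clock::now();
        time_t t = std::chrono::system_clock::to_time_t(now);

        struct tm utc;
        gmtime_r(&t, &utc);   // UTC breakdown, independent of the local timezone

        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", &utc);
        printf("%s\n", buf);
    }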