unix-timestamp

Problems with my unix-epoch time converter

浪子不回头ぞ submitted on 2019-12-13 04:15:38
Question: I wrote a simple function to fill three variables with the current year, month, and day. However, for some reason it is not working correctly, and I can't seem to find the problem.

    void getDate(int *year, int *month, int *date) {
        int epochTime, monthLength, functionYear, functionMonth, functionDate;
        functionYear = 1970;
        functionMonth = 1;
        functionDate = 1;
        epochTime = time(NULL);
        while (epochTime > 1 * 365 * 24 * 60 * 60) {
            epochTime -= 1 * 365 * 24 * 60 * 60;
            functionYear++;
        }
        monthLength =
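The loop above treats every year as exactly 365 days, so leap days are never accounted for and the result drifts by one day per leap year since 1970; month lengths would have the same problem. A minimal sketch of the same decomposition that leans on the standard library instead (Python here purely for illustration, not the original C):

    import time

    def get_date():
        # Let the library handle leap years and month lengths.
        t = time.gmtime(time.time())   # UTC; use time.localtime() for local time
        return t.tm_year, t.tm_mon, t.tm_mday

    year, month, day = get_date()
    print(year, month, day)

The C equivalent of this approach is passing the time_t value to gmtime() or localtime() and reading the tm_year, tm_mon, and tm_mday fields.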

Date conversion in nodejs and postgres

倾然丶 夕夏残阳落幕 submitted on 2019-12-13 03:27:05
Question: I have a column birthday in postgres with type date, and I receive a unix timestamp from the front-end, like 716500800. When I save it to postgresql, it seems to be converted based on my local time zone. I don't understand what I should do; here is the code:

    const date = moment.utc(data.birthday * 1000).format();
    console.log(date); // 1992-09-15T00:00:00Z (this is the right date)
    db.query(
      'UPDATE app_users SET birthday=$1 where id=$2 RETURNING birthday',
      [ date, id ],
      (err, bd) => {
        console.log(bd
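A likely source of the shift is that the same unix timestamp maps to different calendar dates depending on the zone used to render it, and a DATE column keeps only the calendar date. A small Python illustration of that effect (standing in for the Node/moment code; the value is the one from the question):

    from datetime import datetime, timezone, timedelta

    ts = 716500800  # the value received from the front-end

    # The same instant, rendered as a calendar date in two different zones:
    print(datetime.fromtimestamp(ts, tz=timezone.utc).date())
    print(datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=5))).date())

    # Keeping the conversion in UTC end to end (as moment.utc() does) and binding
    # that date string to the DATE column avoids any further zone-dependent shift.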

Why is unix_timestamp parsing this incorrectly by 12 hours off?

痞子三分冷 submitted on 2019-12-13 02:25:31
Question: The following appears to be incorrect (spark.sql):

    select unix_timestamp("2017-07-03T12:03:56", "yyyy-MM-dd'T'hh:mm:ss")
    -- 1499040236

Compared to:

    select unix_timestamp("2017-07-03T00:18:31", "yyyy-MM-dd'T'hh:mm:ss")
    -- 1499041111

Clearly the first comes after the second. And the second appears to be correct:

    # ** R Code **
    # establish constants
    one_day = 60 * 60 * 24
    one_year = 365 * one_day
    one_year_leap = 366 * one_day
    one_quad = 3 * one_year + one_year_leap
    # to 2014-01-01
    11 * one
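The 12-hour offset comes from the pattern, not from unix_timestamp itself: in the Java-style pattern syntax Spark uses, hh is the 12-hour clock field (which needs an am/pm marker to be unambiguous), while HH is the 24-hour field. A minimal PySpark sketch with the corrected pattern (the resulting epoch value still depends on the session time zone):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # With HH (24-hour), 12:03:56 is parsed as noon; with hh and no am/pm
    # marker it was being treated as midnight, hence the 12-hour discrepancy.
    spark.sql(
        """select unix_timestamp("2017-07-03T12:03:56", "yyyy-MM-dd'T'HH:mm:ss") as ts"""
    ).show()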

how to calculate time spent on my app by timestamp

假装没事ソ submitted on 2019-12-13 00:52:42
Question: I have one column with the timestamp when a user enters the app and another column with the timestamp when the user leaves the app. I want to calculate the time spent in the app: sum(timestamp_exit) - sum(timestamp_enter). Right now I have tried the following query:

    select (SUM(unix_timestamp(`created_time_enter`))) as enter,
           (SUM(unix_timestamp(`created_time_exit`))) as exit
    FROM `my_table`

but I get large numbers and I don't know if it's the correct way. Any suggestion?

Answer 1: You could calculate this using the
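The truncated answer aside, the underlying arithmetic is the sum of per-session differences (exit minus enter); summing raw epoch seconds is what produces the huge intermediate numbers. A plain Python sketch of that arithmetic with made-up rows (not the original answer, just an illustration):

    from datetime import datetime

    # Hypothetical enter/exit pairs standing in for the two MySQL columns.
    sessions = [
        ("2019-12-12 10:00:00", "2019-12-12 10:25:00"),
        ("2019-12-12 14:05:00", "2019-12-12 14:45:30"),
    ]

    fmt = "%Y-%m-%d %H:%M:%S"
    total_seconds = sum(
        (datetime.strptime(leave, fmt) - datetime.strptime(enter, fmt)).total_seconds()
        for enter, leave in sessions
    )
    print(total_seconds)  # 3930.0 seconds, i.e. 65.5 minutes spent in the app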

Not able to manipulate date on Unix shell script (date: illegal option -- d)

坚强是说给别人听的谎言 submitted on 2019-12-12 21:27:32
Question: I have a requirement to add 10 days to the current date and assign it to a variable, but I am getting the error: date: illegal option -- d

This is what I tried:

    $> NEW_expration_DATE=$(date -d "+10 days")

Result:

    date: illegal option -- d
    Usage: date [-u] [+Field Descriptors]

Answer 1: Try this:

    NEW_expration_DATE=$(gdate -d "+10 days")

Answer 2: It looks like you are using a POSIX shell, and there is no way to do simple date arithmetic there. I found a guy who explains it and who coded something to
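Where neither GNU date nor gdate is available, the date arithmetic can be pushed to another tool. A minimal Python sketch of "today plus 10 days" (the variable name mirrors the one in the question):

    from datetime import date, timedelta

    # Portable stand-in for `date -d "+10 days"` when date(1) lacks the GNU -d option.
    new_expration_date = (date.today() + timedelta(days=10)).isoformat()
    print(new_expration_date)  # e.g. 2019-12-22

In the script itself the value would be captured with ordinary command substitution, assuming a python3 interpreter is present on the box.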

How should I store data for events in different timezones?

醉酒当歌 submitted on 2019-12-12 19:08:31
Question: This is a conceptual question, so no code snippets here. Say I create a database of events. Some of them are in New York, some in Chicago, some in Phoenix, etc. My server's timezone is set to New York. In my mind, I have two options when creating UNIX timestamps for all these events:

1. Take the timezone into account (i.e., an event at midnight on January 1 in Chicago and Phoenix would have different timestamps). Then I'd have to take the timezone into account again whenever I want to display
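As a small illustration of the first option (Python with zoneinfo; the date is made up): local midnight is a different instant in each city, so each event gets a different timestamp, and the event's zone is needed again at display time.

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    # Midnight on January 1, local to each city, is three different instants.
    for city in ("America/New_York", "America/Chicago", "America/Phoenix"):
        local_midnight = datetime(2020, 1, 1, tzinfo=ZoneInfo(city))
        print(city, int(local_midnight.timestamp()))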

time() returns different timestamps based on server

心已入冬 submitted on 2019-12-12 15:26:36
Question: I have searched for many hours now; it seems like nobody has had this problem before. I run a script, which writes the current timestamp into a database, on two servers. Both have the same OS, software, ... and the same timezone. Now I found out that the diff between some timestamps and the current time() is a negative number (yes, the calculation is correct: time() - $older_timestamp). I dumped time() on both servers; the result: it differs by exactly one hour. Check it out: time() on server #1: -1
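One useful check: a unix timestamp does not depend on the configured timezone at all, so an exact one-hour gap between two boxes points at the system clocks themselves (for example one host without NTP sync, or a hardware clock set to local time instead of UTC) rather than at PHP or timezone settings. A quick sketch of that property (Python, POSIX-only because of tzset):

    import os
    import time

    # Changing the timezone does not change the epoch value returned by time().
    for tz in ("UTC", "Europe/Berlin"):
        os.environ["TZ"] = tz
        time.tzset()
        print(tz, int(time.time()))  # same number, give or take the loop's runtime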

Generating signed XPI via jpm failed

和自甴很熟 submitted on 2019-12-12 12:49:01
Question: There was a problem signing an add-on via jpm. The command

    jpm -v sign --api-key 'user:xxxxxxxx:xxx' --api-secret xxxxxxxxxxxxxxxxxxxxxxxxx

failed with the error message

    Error: Received bad response from the server while requesting https://addons.mozilla.org/api/v3/addons/%40addonname/versions/0.x.y/

Signing via the web interface worked. How can this be fixed? The full verbose output is

    JPM [info] binary set to /usr/bin/firefox
    JPM [info] verbose set
    JPM [info] Checking compatability

matplotlib: formatting of timestamp on x-axis

戏子无情 submitted on 2019-12-12 12:15:21
Question: I'm trying to format the x-axis in my weather data plot. I'm happy with the y-axis, but all my attempts to get the x-axis into a decent, human-readable format haven't worked so far. So after several hours of trial and error I hope for your help.

What I'm trying to achieve: in the end I would like to have tick marks every 30 minutes, a vertical dotted grid line every hour with the time written as HH:MM beneath it, and additionally the date written every night at 00:00 hours. Something like this
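A minimal matplotlib sketch of the 30-minute ticks and the dotted hourly grid with HH:MM labels (the weather data here is invented; the extra date label at midnight would need a custom formatter on top of this):

    import numpy as np
    import matplotlib.pyplot as plt
    import matplotlib.dates as mdates
    from datetime import datetime, timedelta

    # Fake temperature readings every 10 minutes over one day.
    times = [datetime(2019, 12, 12) + timedelta(minutes=10 * i) for i in range(6 * 24)]
    temps = 10 + 5 * np.sin(np.linspace(0, 4 * np.pi, len(times)))

    fig, ax = plt.subplots()
    ax.plot(times, temps)

    ax.xaxis.set_major_locator(mdates.HourLocator(interval=1))          # label every hour
    ax.xaxis.set_major_formatter(mdates.DateFormatter("%H:%M"))         # HH:MM beneath it
    ax.xaxis.set_minor_locator(mdates.MinuteLocator(byminute=(0, 30)))  # tick every 30 min
    ax.grid(which="major", axis="x", linestyle=":")                     # dotted hourly grid

    fig.autofmt_xdate()
    plt.show()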

spark unix_timestamp data type mismatch

折月煮酒 submitted on 2019-12-12 09:05:09
Question: Could someone help guide me on what data type or format I need to submit to from_unixtime for the spark from_unixtime() function to work? When I try the following, it runs, but does not respond with the current timestamp:

    from_unixtime(current_timestamp())

The response is below:

    fromunixtime(currenttimestamp(),yyyy-MM-dd HH:mm:ss)

When I try to input

    from_unixtime(1392394861, "yyyy-MM-dd HH:mm:ss.SSSS")

the above simply fails with a type mismatch:

    error: type mismatch;
    found   : Int(1392394861)
    required:
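The truncated type mismatch is the Scala compiler rejecting a bare Int where the function expects a Column. The same idea sketched in PySpark (an illustration only, not the original code): wrap the literal in lit(), and feed from_unixtime epoch seconds rather than a timestamp, which is also why from_unixtime(current_timestamp()) does not give back the current time.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import current_timestamp, from_unixtime, lit, unix_timestamp

    spark = SparkSession.builder.getOrCreate()

    spark.range(1).select(
        # A bare number must be wrapped in lit() to become a Column of epoch seconds.
        from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss").alias("from_literal"),
        # current_timestamp() is already a timestamp; convert it to seconds first.
        from_unixtime(unix_timestamp(current_timestamp())).alias("roundtrip_now"),
    ).show(truncate=False)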