> os:system_time(seconds).
1470361698
This is the correct UTC timestamp, but I'm not sure whether it is a system-dependent value. Is it?
How can I then ensure that the same program, run on a different machine, will still operate in the UTC timezone?
os:system_time returns the "OS System Time", which is defined to be approximately the "POSIX Time": the number of seconds since the Epoch, which is defined as 00:00:00 UTC, 1970-01-01. So it should always return a UTC timestamp. Erlang has excellent documentation about Time Correction; have a look at the OS System Time and POSIX Time sections.
OS System Time
The operating systems view of POSIX time. To retrieve it, call os:system_time(). This may or may not be an accurate view of POSIX time. This time may typically be adjusted both backwards and forwards without limitation. That is, time warps may be observed.
POSIX Time
Time since Epoch. Epoch is defined to be 00:00:00 UTC, 1970-01-01. A day in POSIX time is defined to be exactly 86400 seconds long. Strangely enough Epoch is defined to be a time in UTC, and UTC has another definition of how long a day is. Quoting the Open Group "POSIX time is therefore not necessarily UTC, despite its appearance". The effect of this is that when an UTC leap second is inserted, POSIX time either stops for a second, or repeats the last second. If an UTC leap second would be deleted (which has not happened yet), POSIX time would make a one second leap forward.
Related
We have tried using IST minus the difference between IST and CST to get the CST time (the SUT's time), but that won't work when daylight saving time starts. Could someone help with getting the SUT's time?
You can't directly take the time zone from a SUT. There are a few methods to do what you're trying to accomplish:
1. Screen capture the time on the SUT.
Using OCR, isolate the searchRectangle to the system's clock, then call ReadText() and save the result to a variable, myTime.
2. Remote commands (mobile SUTs only).
You can send a remote command to a SUT using the Eggplant function ExecuteRemoteCommand(). From Eggplant's documentation:
"On Android, ExecuteRemoteCommand runs as a shell command on the
actual phone.
On iOS, ExecuteRemoteCommand runs the command as
JavaScript, making calls to the Apple UIAutomation API."
You can save the output of these commands to a variable, so on an Android SUT it would be:
set SUT_time to ExecuteRemoteCommand("date")
SUT_time will now represent the time in the format Mon Jul 31 21:09:28 CDT 2017.
3. Math!
Given your current system's date and time, you should always be able to calculate the time of any given timezone. Currently, that would work out as:
set SUT_time to the date minus 10 hours 30 minutes
To make this compatible with daylight saving time, you'll have to use an if statement. That might look something like this:
set CST to the date minus 10 hours 30 minutes
if CST is between "Mar 11" and "Nov 4" then
add 1 hour to CST
end if
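As an aside, if a scripting environment with a real time zone database is available, the DST boundaries don't have to be hard-coded at all. Here is a minimal sketch using moment-timezone in JavaScript (Node.js assumed to be available alongside your tooling; 'America/Chicago' is my assumption for the SUT's CST/CDT zone):
// Sketch only: convert the current moment into the SUT's zone using the tz database,
// so the DST switch dates never need to be maintained by hand.
const moment = require('moment-timezone');
const sutTime = moment().tz('America/Chicago').format('ddd MMM DD HH:mm:ss z YYYY');
// e.g. "Mon Jul 31 21:09:28 CDT 2017", the same shape as the remote `date` output above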
Background
I am investigating different use cases of moment.js for a project but am stumped on the issue of daylight savings ending in the fall. Before asking my question, since I want to be clear and provide background for others who have similar questions, let me explain what I am doing and what I found with spring daylight savings.
First, I am working with UTC timestamps and America/New_York timestamps. In the US, daylight savings in 2017 begins on March 12 at 2AM (skipping from 2:00:00AM to 3:00:00AM) and ends on November 5 at 2AM (reverting from 2:00:00AM to 1:00:00AM). Since I also always know the target time zone (America/New_York) that I need to convert to, I will not rely upon moment.js to detect my local time zone and instead explicitly specify the time zone I want.
During the spring when daylight savings goes into effect, the time zones that observe daylight savings, like America/New_York, jump forward an hour. moment.js handles this just fine.
For example, if I pass moment.js a UTC timestamp for the time one second before daylight savings goes into effect for America/New_York, it looks like this:
moment('2017-03-12T06:59:59Z').tz('America/New_York').format('YYYY/MM/DD hh:mm:ss a z')
The input above is taken as UTC because of the Z on my timestamp and I am explicitly setting the target timezone with .tz('America/New_York') so that it doesn't use local system time.
Alternatively using moment-timezone I can explicitly set the input time zone as UTC and set the output as America/New_York.
moment.tz('2017-03-12T06:59:59', 'UTC').tz('America/New_York').format('YYYY/MM/DD hh:mm:ss a z')
Either way, the result is 2017/03/12 01:59:59 am EST.
Then, I run the same commands for a moment just one second later. I will just use the format given in the first example above where I specify the time as UTC and then convert it to America/New_York time:
moment('2017-03-12T07:00:00Z').tz('America/New_York').format('YYYY/MM/DD hh:mm:ss a z')
And my result is correct as expected: 2017/03/12 03:00:00 am EDT - due to daylight savings the time skipped ahead by one hour.
I can then use moment-timezone to go back the other way by passing in an America/New_York timestamp and converting it to UTC.
moment.tz('2017-03-12T01:59:59', 'America/New_York').utc().format('YYYY/MM/DD hh:mm:ss a z')
This gives me 2017/03/12 06:59:59 am UTC
And the next moment in the America/New_York time zone, because of daylight savings coming into play, is 03:00:00 so I convert that to UTC...
moment.tz('2017-03-12T03:00:00', 'America/New_York').utc().format('YYYY/MM/DD hh:mm:ss a z')
... and get 2017/03/12 07:00:00 am UTC which looks correct.
An hour was skipped ("lost") in America/New_York time, but moment can detect that and convert it to UTC.
In summary, for the spring daylight savings change in the US, I can pass a UTC timestamp into either moment.js or moment-timezone and get back a timestamp in another time zone with the correct daylight savings offset also applied. I can also then pass an America/New_York timestamp into moment and get back a correctly converted UTC timestamp.
My Question
Great, so I want to do the same thing when daylight savings ends in the fall, and of course it isn't that simple. My hypothesis is that, due to daylight savings effectively causing an hour to "repeat", there is no way for moment to know the correct UTC time. In other words, where there was a gap of one hour when daylight savings started, now we have an overlap of one hour.
Question (part 1): Is there a way to pass a relative timestamp and time zone into moment and get back the correct UTC time? When I tried this in the examples below, moment skips an hour on the UTC side.
moment.tz('2017-11-05T01:00:00', 'America/Denver').tz('UTC').format('YYYY/MM/DD hh:mm:ss a z') => "2017/11/05 07:00:00 am UTC"
moment.tz('2017-11-05T01:59:59', 'America/Denver').tz('UTC').format('YYYY/MM/DD hh:mm:ss a z') => "2017/11/05 07:59:59 am UTC"
moment.tz('2017-11-05T02:00:00', 'America/Denver').tz('UTC').format('YYYY/MM/DD hh:mm:ss a z') => "2017/11/05 09:00:00 am UTC"
moment.tz('2017-11-05T02:59:59', 'America/Denver').tz('UTC').format('YYYY/MM/DD hh:mm:ss a z') => "2017/11/05 09:59:59 am UTC"
I assume it's because time is happening chronologically, like in the example below. No UTC offset context is given, so I assume that moment can't distinguish between the America/Denver timestamps preceded by asterisks:
*2017/11/05 01:00:00 am America/Denver => 2017/11/05 07:00:00 am UTC
*2017/11/05 01:59:59 am America/Denver => 2017/11/05 07:59:59 am UTC
*2017/11/05 01:00:00 am America/Denver => ???
*2017/11/05 01:59:59 am America/Denver => ???
2017/11/05 02:00:00 am America/Denver => 2017/11/05 09:00:00 am UTC
2017/11/05 02:59:59 am America/Denver => 2017/11/05 09:59:59 am UTC
Again, what I want to know is whether there is a way around this. Currently, the timestamps in my data do not contain UTC offsets.
Question (part 2): If I am presenting data in the America/New_York time zone then is it correct to think that I will essentially have two hours of data points all stuffed into a (seemingly) single one hour period from 01:00:00 to 01:59:59 on November 5, 2017?
Related Topics
There are a few other topics on SO that are related to this but none that I have found pose or answer this same question. I will link a few here for reference:
Moment.js Convert Local time to UTC time does work
Initialize a Moment with the timezone offset that I created it with
It looks like you have thought the problem through and done some basic research. Thanks!
What you are describing is covered in the moment-timezone docs here. If the data isn't available as a UTC offset in your input, there's no way to tell the difference between the first or second occurrence of an ambiguous local time. Moment picks the first occurrence, because time moves in a forward direction, so this is usually the most sensible choice for most scenarios.
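You can see that choice directly; a quick sketch (the -04:00 result matches the example further down in this answer):
// 1:30 am occurs twice in America/New_York on 2017-11-05.
// With no offset in the input, moment-timezone resolves the ambiguity
// to the first (earlier, EDT) occurrence.
var m = moment.tz('2017-11-05T01:30:00', 'America/New_York');
m.format();       // "2017-11-05T01:30:00-04:00" (the first occurrence)
m.utc().format(); // "2017-11-05T05:30:00+00:00", not 06:30 (note: utc() mutates m)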
The problem is one of ambiguity. Even as just a human being, if I say "1:00 am on November 5th 2017 in New York" you don't know which of two points in time I'm describing.
That said, sometimes you have external knowledge that can help. For example, if you have an ordered set of timestamps containing time that skips backwards, then you know you encountered a fall-back transition. Say I'm recording data at 15 minute intervals in local time:
00:45
01:00
01:15
01:30
01:45
01:00 <--- this one comes next sequentially, but appears backwards, so infer transition
01:15
01:30
01:45
02:00
You'll have to write your own detection logic to compare one value to the next for that scenario. Also note that if you don't have any time that appears to be out of sequence, then you cannot be assured of which occurrence is being described. A "heartbeat" signal can assist with this in some scenarios.
Now how do you choose the second occurrence in Moment without knowing the offsets in advance? Like this:
First, grab the hasAmbiguousWallTime function from here.
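In case that link is unavailable, here is a plausible stand-in (my own sketch, not necessarily the linked implementation): a wall time is ambiguous when the same local reading recurs one real hour earlier or later, which only happens around a fall-back transition.
function hasAmbiguousWallTime(m) {
    // Assumes the transition shifts the offset by exactly one hour.
    var fmt = 'YYYY-MM-DDTHH:mm:ss';
    var wall = m.format(fmt);
    return m.clone().add(1, 'hour').format(fmt) === wall ||
           m.clone().subtract(1, 'hour').format(fmt) === wall;
}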
Then define another function:
function adjustToLaterWhenAmbiguous(m) {
    if (hasAmbiguousWallTime(m)) {
        // Keep the same wall time, but adopt the offset in effect one hour
        // later (the post-transition offset), i.e. the second occurrence.
        m.utcOffset(moment(m).add(1, 'hour').utcOffset(), true);
    }
}
Now you can do this:
// start with the first occurrence
var m = moment.tz("2017-11-05T01:00:00", "America/New_York");
m.format(); // "2017-11-05T01:00:00-04:00"
// now shift it to the second occurrence
if (... your logic, such as wall time going backwards in sequence, etc. ...) {
    adjustToLaterWhenAmbiguous(m);
    m.format(); // "2017-11-05T01:00:00-05:00"
}
These two functions should probably be hardened and added to moment-timezone, but they should be sufficient for the scenario you describe.
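To tie it together, the "your logic" placeholder above could be filled by a sweep over your ordered local timestamps, switching to the later occurrence once a backwards jump is seen. A rough sketch of that (my own, assuming the list spans at most one fall-back transition):
function resolveSequence(localTimestamps, zone) {
    var previous = null;
    var sawFallBack = false;
    return localTimestamps.map(function (ts) {
        var m = moment.tz(ts, zone);          // parsed as the first occurrence
        if (previous && m.isBefore(previous)) {
            sawFallBack = true;               // wall clock went backwards: fall-back detected
        }
        if (sawFallBack) {
            adjustToLaterWhenAmbiguous(m);    // the repeated hour belongs to the later offset
        }
        previous = m;
        return m;
    });
}
// resolveSequence(['2017-11-05T01:45:00', '2017-11-05T01:00:00'], 'America/New_York')
// => [01:45 -04:00, 01:00 -05:00]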
A couple of other minor points:
Instead of moment.tz(s, 'UTC'), consider using moment.utc(s)
Instead of moment.tz(s, 'America/New_York').tz('UTC'), consider using moment.tz(s, 'America/New_York').utc()
You may want to review the DST tag wiki for visualization of the problem space.
On part two of your question, yes - you'll end up stuffing two hours of data into what could possibly be visualized as a one-hour space. People have this problem with graphs and charts all the time. They graph something with a constant value over local time, then see a zeroing effect in the spring, and a doubling effect in the fall. Even if you tell moment to use the later occurrence, you won't avoid this unless you actually display the graph in UTC instead of local time.
As the subject asks; do UNIX timestamps change in each timezone?
For example, if I sent a request to a server on the other side of the world saying, "Send out an email when the time is 1397484936", would the other server's timestamp be 12 hours behind my own?
The definition of UNIX timestamp is time zone independent. The UNIX timestamp is the number of seconds (or milliseconds) elapsed since an absolute point in time, midnight of Jan 1 1970 in UTC time. (UTC is Greenwich Mean Time without Daylight Savings time adjustments.)
Regardless of your time zone, the UNIX timestamp represents a moment that is the same everywhere. Of course you can convert back and forth to a local time zone representation (time 1397484936 is such-and-such local time in New York, or some other local time in Djakarta) if you want.
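A small illustration of that conversion (a sketch using moment-timezone in Node.js; the timestamp is the one from the question):
// One timestamp, one instant, three renderings.
const moment = require('moment-timezone');
const ts = 1397484936; // seconds since the epoch

moment.unix(ts).utc().format();                  // 2014-04-14T14:15:36+00:00
moment.unix(ts).tz('America/New_York').format(); // 2014-04-14T10:15:36-04:00
moment.unix(ts).tz('Asia/Jakarta').format();     // 2014-04-14T21:15:36+07:00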
The article at http://en.wikipedia.org/wiki/Unix_time is pretty impressive if you'd like a longer read.
Unix time is defined as the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday, 1 January 1970. So the answer is no:
Unix timestamps do not change across timezones; they were created precisely so that there is a standard time across the globe.
Note:
Timestamps are calculated from the computer's current clock, so do not rely on them unless you are very sure about the time settings on the participating machines.
Someone stated that "UTC is Greenwich Mean Time without Daylight Savings time adjustments." This is simply untrue. GMT does not have Daylight Savings Time. GMT is measured in Greenwich, England (at the Royal Observatory) [0 longitude, but not 0 latitude]. UTC is measured at the equator [0 longitude and 0 latitude - which happens to lie in the ocean off the coast of Africa].
What difference does it make? It doesn't make a difference in terms of "what time of day is it?" It does, however, make a difference in terms of calculating a year. Now you'd think a year would be measured based upon the location of the center (the core) of the earth, right? When the earth's core is back in the same location it was ~365 days ago, it has been a year. It isn't measured that way. It is measured by a specific location on the earth getting back to the same location (relative to the sun) that it was ~365 days ago. But the period of a day and a year don't divide evenly. Once the earth is back to about where it was a year ago, the earth isn't facing the same direction it was last year, so that spot on the earth isn't facing the same direction it was a year ago. Being further north, Greenwich isn't going to get back to the same spot (relative to the sun) that it was last year at the same time that 0 Lat / 0 Long is. So if you base the definition on Greenwich vs. 0/0, you get a (albeit slightly) different answer to the question "how many days are in a year?" To put it another way, when a given spot on the earth gets back to where it was a year ago (relative to the Sun), the core of the earth isn't in the same spot it was a year ago, so which spot you pick matters, because the core of the earth will be in a different spot (relative to the sun) than it was one year ago if you pick a different spot on the earth.
Neither UTC nor GMT have daylight savings time. Europe/London time, the timezone that Greenwich resides in, does. But GMT does not. GMT is, what Americans would call a "Standard Time" - i.e. without DST.
Getting back to the question, Epoch time doesn't technically have a timezone. It is based on a particular point in time, which just so happens to line up with an "even" UTC time (the exact beginning of a year and a decade, etc.). If that concept doesn't fit well in your brain, and if it helps to think of Epoch time as being in UTC, go right ahead. You're in good company, and in the grand scheme of things it really doesn't matter. You ever see those lawsuits where someone is awarded $1? It's kind of a "you're right, but it doesn't really matter" type of verdict. If someone sued you for saying Epoch time is in the UTC timezone, they would win $1. That wouldn't buy them a cup of coffee at any Starbucks in any timezone on the planet.
IF both computers are set up correctly with their clocks set for the correct timezone and UTC values, they should return the same value.
Of course that's a big IF. There's almost certain to be a difference of at least a second, more often minutes, between the times reported by two computers. And many computers are set up with incorrect timezone settings, and will report their local time when asked for a timestamp rather than UTC.
And in that lies the difference between theory and practice. In theory it's all the same, in practice you should not rely on it.
No, an epoch timestamp should not change, because it has a fixed timezone, which is UTC.
If you want a time object in another time zone, look it up in the libraries of the language you use, but do NOT try to add/subtract a couple of hours from an epoch timestamp and assume it's now in another time zone; that will make things very confusing for other people, especially when you expose it in your API.
If you use C++, I recommend this library. I heard it will soon be added to the standard library.
I understand that time objects are sometimes hard to deal with, and it can look easier to add/subtract from the epoch timestamp. Please don't do it, and don't persuade others to do it. A time object is much easier once you get used to it, and it can take care of time zone conversions without getting tripped up by historical time zone changes due to politics, law, etc.
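A tiny sketch of that point (moment-timezone assumed, though any timezone-aware library works the same way): the epoch value stays untouched; only the presentation changes.
const moment = require('moment-timezone');
const epochSeconds = 1397484936;

// Don't: shifting the number "to another time zone" just produces a different instant.
const shifted = epochSeconds - 5 * 3600;

// Do: keep the epoch value as-is and convert only when formatting for display.
const newYork = moment.unix(epochSeconds).tz('America/New_York').format();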
I live in a country where they change the time twice a year. That is, there is a period of the year when the offset from UTC is -3 hours (-180 min) and another period when the offset is -4 hours (-240 min).
Graphically:
|------- (offset = -3) -------|------- (offset = -4) -------|
start of year                mid                 end of year
My questions are:
Is the "timezone" just a number representing the offset? That is, does my country have two timezones, or does a timezone include this information?
This is important because I save every date in the UTC timezone (offset = 0) in my database.
Should I instead be saving the dates in the local timezone, along with their offset (at the moment of saving)?
Here is an example of a problem I see with saving the dates in UTC:
Let's say I have a system where people send messages.
I want to have a statistics section where I plot "messages sent vs. hour" (i.e., "messages sent by hour on a typical day").
Let's say there are just two messages in the whole database:
Message 1, sent on March 1 at UTC time 5 pm (local time 2 pm)
Message 2, sent on August 1 at UTC time 5 pm (local time 1 pm)
Then, if I create the plot on August 2, converting those UTC dates to local time would give me "2 messages were sent at 1 pm", which is incorrect!
From the timezone tag wiki here on StackOverflow:
TimeZone != Offset
A time zone can not be represented solely by an offset from UTC. Many time zones have more than one offset due to "daylight savings time" or "summer time" rules. The dates that offsets change are also part of the rules for the time zone, as are any historical offset changes.
Many software programs, libraries, and web services disregard this important detail, and erroneously call the standard or current offset the "zone". This can lead to confusion, and misuse of the data. Please use the correct terminology whenever possible.
There are two commonly used databases, the Microsoft Windows time zone db and the IANA/Olson time zone db. See the wiki for more detail.
Your specific questions:
the "timezone" is just the number representing the offset? that is: my country has two timezones? or the timezone includes this information?
You have one "time zone". It includes two "offsets".
Should I instead be saving the dates in the local timezone, along with their offset (at the moment of saving)?
If you are recording the precise moment an event occurred or will occur, then you should store the offset of that particular time with it. In .Net and SQL Server, this is represented using a DateTimeOffset. There are similar datatypes in other platforms. It only contains the offset information - not the time zone that the offset originated from. Commonly, it is serialized in ISO8601 format, such as:
2013-05-09T13:29:00-04:00
If you might need to edit that time, then you cannot just store the offset. Somewhere in your system, you also need to have the time zone identifier. Otherwise, you have no way to determine what the new offset should be after the edit is made. If you desire, you can store this with the value itself. Some platforms have objects for exactly this purpose - such as ZonedDateTime in NodaTime. Example:
2013-05-09T13:29:00-04:00 America/New_York
Even when storing the zone id, you still need to record the offset. This is to resolve ambiguity during a "fall-back" transition from a daylight offset to a standard offset.
Alternatively, you could store the time at UTC with the time zone name:
2013-05-09T17:29:00Z America/New_York
This would work just as well, but you'd have to apply the time zone before displaying the value to anyone. TIMESTAMP WITH TIME ZONE in Oracle and PostgreSQL work this way.
You can read more about this in this post, while .Net focused - the idea is applicable to other platforms as well. The example problem you gave is what I call "maintaining the perspective of the observer" - which is discussed in the same article.
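Applied to your messages-per-hour example, a rough sketch (moment-timezone assumed; 'America/Santiago' stands in for your country's zone, which I'm only guessing from the -3/-4 offsets, and the dates are illustrative):
// Store UTC plus the observer's zone, then bucket by the hour the sender
// actually experienced rather than by today's offset.
const moment = require('moment-timezone');

const messages = [
    { sentUtc: '2017-03-01T17:00:00Z', zone: 'America/Santiago' }, // local 2 pm (UTC-3)
    { sentUtc: '2017-08-01T17:00:00Z', zone: 'America/Santiago' }, // local 1 pm (UTC-4)
];

const countByHour = {};
for (const msg of messages) {
    const hour = moment.utc(msg.sentUtc).tz(msg.zone).hour();
    countByHour[hour] = (countByHour[hour] || 0) + 1;
}
// countByHour => { 13: 1, 14: 1 }, instead of both messages collapsing into 1 pm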
that is: does my country have two timezones? or does the timezone include this information?
The term "timezone" usually includes that information. For example, in Java, "TimeZone represents a time zone offset, and also figures out daylight savings" (link), and on Unix-like systems, the tz database contains DST information.
However, for a single timestamp, I think it's more common to give just a UTC offset than a complete time-zone identifier.
[…] in my database.
Naturally, you should consult your database's documentation, or at least indicate what database you're using, and what tools (e.g., what drivers, what languages) you're using to access it.
Here's an example of a very popular format for describing timezones (though not what Windows uses).
You can see that it's more than a simple offset. More along the lines of offsets and the set of rules (changing over time) for when to use which offset.
I finally found out the difference between UTC and GMT by making the effort to look it up on Wikipedia today. Technically speaking, it appears that GMT != UTC because you do not know whether UTC or UT1 is being referred to. In practice, however, people use the terms interchangeably to refer to the same timezone.
A while ago, I suggested that we change the user interface of one of my company's apps to display UTC instead of GMT.
Just to be sure that our database was not calculating the potential seconds difference between GMT and UTC, I ran the below query and verified that they both are just acting as aliases for the same timezone.
select now() AT TIME ZONE 'GMT', now() AT TIME ZONE 'UTC';
timezone | timezone
----------------------------+----------------------------
2009-02-11 08:46:11.643032 | 2009-02-11 08:46:11.643032
(1 row)
What do you think? Do enough users out there understand UTC? Is it better to use the older but more common term? Or should I just display "UTC/GMT"?
Normal humans don't need to worry about the few seconds difference between GMT and UTC. The difference only matters to astronomers and time nerds.
I have seen very little software that bothers to make the distinction. Most software ends up using the labels "GMT" and "UTC" interchangeably. Typically it just means "clock time after removing the local time zone offset in exact hours (or half/quarter hours)."
In most cases, nobody will be concerned about the sub-second technical difference between GMT and UTC.
However, writing that the time is expressed in UTC instead of GMT avoids one source of confusion:
Greenwich (and the UK in general) is currently on GMT+01:00 because of daylight saving time (DST).
GMT+01:00 does not mean "one hour ahead of the time in the UK", as one could mistakenly think. Because of DST, GMT+01:00 is currently the exact time in England.
Stating it as UTC+01:00 helps to avoid this confusion.
Personally, I think of the term UTC before I think of GMT.
I think of GMT before UTC, but I am also living at GMT (+/-0)