I am using Rails 5 and my app's timezone is set to Brasilia.
Right now it's winter (no daylight saving) in Brazil so
Time.current.dst? returns false
But in Brazil daylight saving time starts on 21/10/2018, so when I do
(Time.current + 10.days).dst? I expect it to return true, but it returns false.
Is there a table where I can check which dates Rails considers to be the start and end of daylight saving time in each timezone?
Or as another example. I understand that UTC does not have daylight savings. So
Time.current.strftime("%Z") returns -3, which is the correct difference between UTC and current Brazil time.
But when I do
(Time.current + 10.days).strftime("%Z") still returns -3, but that date is after Brazil has changed time, so it should be -2. Something seems to be wrong.
Per the Rails documentation:
The TimeZone class serves as a wrapper around TZInfo::Timezone instances.
TZInfo::Timezone instances are provided via the Ruby TZInfo gem. The data for TZInfo is provided in a separate TZInfo-Data package. Staying current with that will keep your time zone data accurate. This data is sourced from the IANA time zone database (TZDB), as it is with most other programming languages.
Note that it is generally preferable to use the IANA time zone identifiers with the tzinfo gem directly, as the Rails time zones are limited to "a meaningful subset of 134 zones" (per the same docs), which contains many duplications and omissions and provides no criteria for what "meaningful" means. More on this in the timezone tag wiki (near the end).
Also, Brazil starts DST on November 4th in 2018, not on October 21st. See here and here. This change was reflected in IANA TZDB 2018a, which in turn was reflected in TZInfo-Data v1.2018.1. If you have that version or newer, then you have the newer, correct Brazil DST date, which explains your results.
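If you want to confirm which transition date your installed time zone data contains, any TZDB-backed library will show it. As a minimal sketch (using Python's standard zoneinfo module purely for illustration, since it reads the same IANA data that TZInfo-Data packages for Ruby):

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+, backed by the IANA tz database

    sp = ZoneInfo("America/Sao_Paulo")

    # Noon on the day before and the day after the 2018 transition
    before = datetime(2018, 11, 3, 12, 0, tzinfo=sp)
    after = datetime(2018, 11, 5, 12, 0, tzinfo=sp)

    print(before.isoformat())  # 2018-11-03T12:00:00-03:00  (standard time)
    print(after.isoformat())   # 2018-11-05T12:00:00-02:00  (DST in effect)

With up-to-date data, the offset changes between November 3rd and November 5th, not in October.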
There was recently a plan to push Brazil's DST start date out even further, to November 18th, but the government retracted that plan before it ever became official, so it was never represented in the time zone data.
As the subject asks; do UNIX timestamps change in each timezone?
For example, if I sent a request to another server on the other side of the world saying, "Send out an email when the time is 1397484936", would the other server's timestamp be 12 hours behind my own?
The definition of UNIX timestamp is time zone independent. The UNIX timestamp is the number of seconds (or milliseconds) elapsed since an absolute point in time, midnight of Jan 1 1970 in UTC time. (UTC is Greenwich Mean Time without Daylight Savings time adjustments.)
Regardless of your time zone, the UNIX timestamp represents a moment that is the same everywhere. Of course you can convert back and forth to a local time zone representation (time 1397484936 is such-and-such local time in New York, or some other local time in Djakarta) if you want.
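As a quick illustration (a minimal sketch using Python's standard library, with the timestamp from the question), the same value renders as different wall-clock times, but all of them name the same instant:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    ts = 1397484936  # the timestamp from the question

    utc = datetime.fromtimestamp(ts, tz=timezone.utc)
    new_york = utc.astimezone(ZoneInfo("America/New_York"))
    jakarta = utc.astimezone(ZoneInfo("Asia/Jakarta"))

    print(utc.isoformat())       # 2014-04-14T14:15:36+00:00
    print(new_york.isoformat())  # 2014-04-14T10:15:36-04:00  (same instant, local wall time)
    print(jakarta.isoformat())   # 2014-04-14T21:15:36+07:00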
The article at http://en.wikipedia.org/wiki/Unix_time is pretty impressive if you'd like a longer read.
Unix time is defined as the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday, 1 January 1970. So the answer is no:
Unix timestamps do not change across timezones; they were created for the purpose of having a standard time across the globe.
NOTE:
Timestamps are calculated from the computer's current clock, so do not rely on them unless you are sure about the time settings on the participating machines.
Someone stated that "UTC is Greenwich Mean Time without Daylight Savings time adjustments." This is simply untrue. GMT does not have Daylight Saving Time. GMT is measured in Greenwich, England (at the Royal Observatory) [0 longitude, but not 0 latitude]. UTC is measured at the equator [0 longitude and 0 latitude, which happens to lie in the ocean off the coast of Africa].
What difference does it make? It doesn't make a difference in terms of "what time of day is it?" It does, however, make a difference when calculating the length of a year. You might think a year would be measured by the earth's core getting back to the same location it was in ~365 days ago, but it isn't measured that way. It is measured by a specific location on the earth getting back to the same position (relative to the sun) that it was in ~365 days ago. But the period of a day and the period of a year don't divide evenly: once the earth is back to roughly where it was a year ago, it isn't facing the same direction it was, so a given spot on the surface isn't facing the same direction either. Being further north, Greenwich doesn't get back to the same spot (relative to the sun) at the same moment that 0 latitude / 0 longitude does. So if you base the definition on Greenwich versus 0/0, you get a slightly different answer to "how many days are in a year". Which spot you pick matters, because when that spot is back where it was a year ago (relative to the sun), the earth's core is in a different place than it would be had you picked a different spot.
Neither UTC nor GMT has daylight saving time. Europe/London time, the timezone in which Greenwich resides, does; but GMT does not. GMT is what Americans would call a "Standard Time", i.e. without DST.
Getting back to the question: Epoch time doesn't technically have a timezone. It is based on a particular point in time, which just so happens to line up with an "even" UTC time (the exact beginning of a year and a decade, etc.). If that concept doesn't fit well in your brain, and it helps to think of Epoch time as being in UTC, go right ahead. You're in good company, and in the grand scheme of things it really doesn't matter. You ever see those lawsuits where someone is awarded $1? It's kind of a "you're right, but it doesn't really matter" type of verdict. If someone sued you for saying Epoch time is in the UTC timezone, they would win $1. That wouldn't buy them a cup of coffee at any Starbucks in any timezone on the planet.
IF both computers are set up correctly with their clocks set for the correct timezone and UTC values, they should return the same value.
Of course that's a big IF. There's almost certain to be a difference of at least a second, and more often minutes, between the times reported by two computers. And many computers are set up with incorrect timezone settings and will report their local time when asked for a timestamp, rather than UTC.
And in that lies the difference between theory and practice. In theory it's all the same, in practice you should not rely on it.
No, an epoch timestamp does not change, because it has a fixed reference, which is UTC.
If you want to use a time object in another time zone, just look it up in the libraries of the language you use, but do NOT try to add/subtract a couple of hours from an epoch timestamp and assume it's now in another time zone; that will make things very confusing to other people, especially when you expose it in your API.
If you use C++, I recommend this library. I heard it will soon be added to the standard library.
I understand that time objects can sometimes be hard to deal with, and it looks easier to add/subtract on the epoch timestamp. Please don't do it, and do not persuade others to do it. A time object is much easier once you get used to it, and it can take care of time zone conversions without getting tangled up in historical time zone changes due to politics, law, etc.
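To make the point concrete (a hedged sketch in Python; the timestamp and zone are arbitrary examples, not anything from your system):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    ts = 1397484936  # epoch seconds; by definition relative to UTC

    # Wrong: "shifting" the epoch value into a zone. The result no longer
    # identifies the real instant and silently breaks when DST rules change.
    # fake_local_ts = ts - 5 * 3600

    # Right: keep the timestamp untouched and let a tz-aware library do the
    # conversion (DST and historical rule changes included) only for display.
    local = datetime.fromtimestamp(ts, tz=ZoneInfo("America/New_York"))
    print(local.isoformat())  # 2014-04-14T10:15:36-04:00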
We're currently storing local datetime of Pacific/Hawaii in the database. Assuming that I cannot change these dates to UTC, what information do I need to add to support timezone?
My thoughts are:
First, add a timezone field to indicate which timezone the user is viewing from. (The user will select this from a dropdown)
Second, add a timezone field to indicate the timezone (Pacific/Hawaii) of the current datetime values in the database.
Third, add an offset to cover DST hours.
So, say a user from America/Los_Angeles views the site: it would pull the datetime from the database, append the offset, and apply the Hawaii timezone before converting it to Los Angeles time. For any calculation or comparison I would convert the Hawaii time to UTC first, then convert the UTC result to Los Angeles. Am I missing anything?
Your question is very broad, and without knowing more about your application, the platform, how you collect and use dates and times, what they represent, etc., I can only speak in generalities.
Storing in UTC is recommended, but that is just by convention. The main necessities are that the time zone you are storing data in does not have DST (which Hawaii hasn't had since 1947), and that you do not rely on your computer's operating system or environment settings to determine what time zone to use. You can use the Hawaiian time zone if you must. Be sure you document it somewhere, though! It will surely be a surprise to anyone else who comes along in the lifecycle of the application.
While it would work, there is absolutely no advantage to doing this. You could just as easily convert your data to UTC when you roll out these changes and use UTC going forward. (That would be the preferred approach.)
The IANA time zone ID for Hawaii is "Pacific/Honolulu". If you're on Windows/.Net, the TimeZoneInfo ID is "Hawaiian Standard Time". Either way, they must be spelled, cased, and punctuated in exactly that manner.
Make sure you understand that a Time Zone Offset and a Time Zone are two different concepts. While Hawaii may use a fixed offset of -10:00, that's not guaranteed for most time zones. Please read the timezone tag wiki for further details.
You should probably not attempt to implement your own time zone logic. There are libraries for this in almost every language. Look to see what is appropriate for your platform. (If you provide details, I can offer suggestions.)
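For example, the conversions you describe (Hawaii local time to UTC for calculations, then to the viewer's zone for display) are one-liners with a library. A minimal Python sketch, with the sample value and the zoneinfo module standing in for whatever platform you're actually on:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # A hypothetical naive value as stored today, known to be Hawaii local time
    stored = datetime(2018, 3, 15, 9, 30)
    hawaii = stored.replace(tzinfo=ZoneInfo("Pacific/Honolulu"))

    # For calculations and comparisons, normalize to UTC first
    as_utc = hawaii.astimezone(ZoneInfo("UTC"))  # 2018-03-15 19:30 UTC

    # For display, convert to the viewer's zone chosen from the dropdown
    viewer = hawaii.astimezone(ZoneInfo("America/Los_Angeles"))  # 2018-03-15 12:30 (PDT)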
It would be a more robust solution to store the times as UTC, since the time zone is local to the specific PC that is displaying the data. In your case, if you store time plus offset, how can you decide which offset to store? It's not a workable solution if multiple time zones are involved.
I live in a country where they change the time twice a year. That is, there is a period of the year when the offset from UTC is -3 hours (-180 min) and another period when the offset is -4 hours (-240 min).
Graphically:

|------- (offset = -3) -------|------- (offset = -4) -------|
start of year                mid                  end of year
My question is:
the "timezone" is just the number representing the offset? that is: my country has two timezones? or the timezone includes this information?
This is important because I save every date in UTC timezone (offset = 0) in my database.
Should I, instead, be saving the dates with local timezone and saving their offset (at the moment of saving) too?
Here is an example of a problem I see with saving the dates in the UTC timezone:
Let's say I have a system where people send messages.
I want to have a statistics section where I plot "messages sent vs. hour" (i.e. "messages sent by hour on a regular day").
Let's say there are just two messages in the whole database:
Message 1, sent on March 1 at 5 PM UTC (local time 2 PM)
Message 2, sent on August 1 at 5 PM UTC (local time 1 PM)
Then, if I create the plot on August 2, converting those UTC dates to local time would give me "2 messages were sent at 1 PM", which is incorrect information!
From the timezone tag wiki here on StackOverflow:
TimeZone != Offset

A time zone can not be represented solely by an offset from UTC. Many time zones have more than one offset due to "daylight savings time" or "summer time" rules. The dates that offsets change are also part of the rules for the time zone, as are any historical offset changes.

Many software programs, libraries, and web services disregard this important detail, and erroneously call the standard or current offset the "zone". This can lead to confusion, and misuse of the data. Please use the correct terminology whenever possible.
There are two commonly used databases, the Microsoft Windows time zone db and the IANA/Olson time zone db. See the wiki for more detail.
Your specific questions:
the "timezone" is just the number representing the offset? that is: my country has two timezones? or the timezone includes this information?
You have one "time zone". It includes two "offsets".
Should I, instead, be saving the dates with the local timezone, and saving their offset (at the moment of saving) too?
If you are recording the precise moment an event occurred or will occur, then you should store the offset of that particular time with it. In .Net and SQL Server, this is represented using a DateTimeOffset. There are similar datatypes in other platforms. It only contains the offset information - not the time zone that the offset originated from. Commonly, it is serialized in ISO8601 format, such as:
2013-05-09T13:29:00-04:00
If you might need to edit that time, then you cannot just store the offset. Somewhere in your system, you also need to have the time zone identifier. Otherwise, you have no way to determine what the new offset should be after the edit is made. If you desire, you can store this with the value itself. Some platforms have objects for exactly this purpose - such as ZonedDateTime in NodaTime. Example:
2013-05-09T13:29:00-04:00 America/New_York
Even when storing the zone id, you still need to record the offset. This is to resolve ambiguity during a "fall-back" transition from a daylight offset to a standard offset.
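For example, during the fall-back transition a local wall-clock time occurs twice, and only the offset (or an explicit disambiguation flag) tells you which instant was meant. A minimal sketch in Python, using a US date as the example:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    ny = ZoneInfo("America/New_York")

    # 2013-11-03 01:30 local time happens twice in New York (clocks fall back at 02:00)
    first = datetime(2013, 11, 3, 1, 30, tzinfo=ny, fold=0)   # earlier occurrence (EDT)
    second = datetime(2013, 11, 3, 1, 30, tzinfo=ny, fold=1)  # later occurrence (EST)

    print(first.isoformat())   # 2013-11-03T01:30:00-04:00
    print(second.isoformat())  # 2013-11-03T01:30:00-05:00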
Alternatively, you could store the time at UTC with the time zone name:
2013-05-09T17:29:00Z America/New_York
This would work just as well, but you'd have to apply the time zone before displaying the value to anyone. TIMESTAMP WITH TIME ZONE in Oracle and PostgreSQL work this way.
You can read more about this in this post; while .Net-focused, the idea is applicable to other platforms as well. The example problem you gave is what I call "maintaining the perspective of the observer", which is discussed in the same article.
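Applied to your messages example (a hedged sketch; America/Santiago is simply one zone with the -3/-4 pattern you describe, and the rows are made up), storing the UTC instant plus the sender's zone lets you bucket by the local hour at the moment each message was sent, so the plot is not skewed by whichever offset happens to be in effect on the day you generate it:

    from collections import Counter
    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # Hypothetical rows: (instant stored in UTC, IANA zone of the sender)
    messages = [
        (datetime(2013, 3, 1, 17, 0, tzinfo=timezone.utc), "America/Santiago"),  # local 14:00
        (datetime(2013, 8, 1, 17, 0, tzinfo=timezone.utc), "America/Santiago"),  # local 13:00
    ]

    # Bucket by the local hour *at the time of sending*, not at plotting time
    by_hour = Counter(sent.astimezone(ZoneInfo(zone)).hour for sent, zone in messages)
    print(by_hour)  # one message at 14:00 local, one at 13:00 local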
That is, does my country have two timezones, or does the timezone include this information?
The term "timezone" usually includes that information. For example, in Java, "TimeZone represents a time zone offset, and also figures out daylight savings" (link), and on Unix-like systems, the tz database contains DST information.
However, for a single timestamp, I think it's more common to give just a UTC offset than a complete time-zone identifier.
[…] in my database.
Naturally, you should consult your database's documentation, or at least indicate what database you're using, and what tools (e.g., what drivers, what languages) you're using to access it.
Here's an example of a very popular format for describing timezones (though not what Windows uses).
You can see that it's more than a simple offset; it's more along the lines of offsets plus the set of rules (changing over time) for when to use which offset.
I'm building an application which will be able to send emails at any specific local time to any place in the world.
For example, my daily schedule (localtime):
8:00 AM - Send email to John in Toronto, Canada
9:15 AM Western Standard Time (Australia) - Send email to Bob in Perth, Australia
10:12 PM - Send email to Anas in Rabat, Morocco
I want to be able to execute this code on an Amazon EC2 server in a single location (e.g. São Paulo, Brazil).
I also know that Toronto is in Eastern Standard Time, (UTC - 5h) , but from March 11, 2012 to November 4, 2012, it is in Eastern Daylight Time (UTC - 4h).
I also know that Perth is in Western Standard Time (UTC + 8h), with no daylight savings.
I also know that Rabat is in Western European Time (UTC), but from April 29, 2012 to July 20, 2012, and from August 19, 2012 to September 30, 2012, it is in Western European Summer Time (UTC + 1h).
To keep track of these combinations of time zones, daylight savings, et cetera, I will, of course, insist that all internal server times be in UTC. However, I need some way to keep track of when and how each time-zone jurisdiction switches time zones because of Daylight Savings or (in the case of Rabat) Ramadan, and then adjust my crontabs to accommodate these changes.
Is there an authoritative web service or set of tables somewhere which would help me keep these timezone changes in sync with my desire to deliver emails at the same local time every day to users in different time zones with different switchover dates for daylight savings?
Most programming languages give you access to timezone conversion functions. The most rudimentary ones only work between UTC and the "local" timezone of the server, so you will need a full-featured one, such as pytz for Python, that will let you specify a local time with a timezone name (e.g. "America/Toronto") and convert it to UTC for you. Given that, you don't need to worry about the UTC offsets of different timezones (including historical offsets, if they've changed) nor about DST start and end times: the library will take care of it for you. Just make sure you have the latest database, which comes in the tzdata package.
As for your crontab, you're probably best off if the local timezone on the server that runs cron is UTC; that way you can put UTC times directly in the crontab. On the other hand, depending on the volume of events that you have, I would advise just having cron run your code at regular intervals (such as every 5 minutes) and then letting your code figure out which events need to be triggered based on the current UTC time and the contents of your database. Then it doesn't matter what the timezone of the server is.
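As a minimal sketch of that "poll on a fixed interval and decide in code" approach (Python with the standard zoneinfo module; pytz works similarly; the schedule entries and the zone IDs America/Toronto, Australia/Perth, and Africa/Casablanca are assumptions based on your example):

    from datetime import datetime, time, timedelta
    from zoneinfo import ZoneInfo

    # Hypothetical schedule: (recipient, IANA zone, local wall-clock send time)
    SCHEDULE = [
        ("John", "America/Toronto", time(8, 0)),
        ("Bob", "Australia/Perth", time(9, 15)),
        ("Anas", "Africa/Casablanca", time(22, 12)),
    ]

    POLL_INTERVAL = timedelta(minutes=5)  # how often cron runs this script

    def recipients_due(now_utc=None):
        now_utc = now_utc or datetime.now(ZoneInfo("UTC"))
        for name, zone, at in SCHEDULE:
            local_now = now_utc.astimezone(ZoneInfo(zone))
            target = local_now.replace(hour=at.hour, minute=at.minute,
                                       second=0, microsecond=0)
            # Fire if the local wall clock passed the target within the last poll
            # interval; the tz database handles each zone's DST rules for us.
            if timedelta(0) <= local_now - target < POLL_INTERVAL:
                yield name

    for recipient in recipients_due():
        print("send email to", recipient)

Because the comparison is done against each recipient's local wall clock, the UTC send time shifts automatically when a zone enters or leaves DST, as long as the tz data on the server is kept current.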
Is there an authoritative web service or set of tables somewhere ...
No, there is nothing "authoritative", but there is something close. It's called the TZ database, and it is currently under the oversight of IANA. Its home page is here.
It is also known as tzdata, zoneinfo, timezonedb, tzdb, the Olson Database, or the IANA Time Zone Database.
There are implementations for just about every language and platform you can imagine. You can read more in the tz-link file from the tzdb, and in the timezone tag wiki, here on StackOverflow.