I have a data converter that generates the following output
[{
"deviceName": "meter 34936959",
"groupName": "All",
"ts": 1579788289,
"values": {
"counter": 2686
}
}]
The timestamp is 1579788289, which is Thursday, 23 January 2020 14:04:49, but on the device page the latest telemetry shows "1970-01-19 08:49:48".
Can you please help me understand what is wrong with the structure?
Thank you
It is expecting a timestamp in milliseconds, but you are supplying a timestamp in seconds.
1579788289 seconds since 1970 is "2020-01-23 14:04:49"
1579788289 milliseconds since 1970 is "1970-01-19 08:49:48"
There is an example of this in the ThingsBoard documentation:
In the example above, we assume that “1451649600512” is a unix
timestamp with milliseconds precision.
If you can't get the millisecond timestamp, try multiplying your timestamp by 1000 before sending it. With your example, that would be:
[{
"deviceName": "meter 34936959",
"groupName": "All",
"ts": 1579788289000,
"values": {
"counter": 2686
}
}]
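If the fix needs to happen inside your converter, a minimal sketch of the idea could look like the following (JavaScript, as ThingsBoard converters typically are; the object literal just mirrors your payload, and the variable names are placeholders for whatever your converter already builds, not ThingsBoard API):
// Hypothetical sketch: scale a seconds-precision timestamp to milliseconds
// before building the payload ThingsBoard ingests.
var record = {
    deviceName: "meter 34936959",
    groupName: "All",
    ts: 1579788289,               // seconds since the Unix epoch
    values: { counter: 2686 }
};

record.ts = record.ts * 1000;     // 1579788289 -> 1579788289000 (milliseconds)

var result = [record];            // the array that gets sent to ThingsBoard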
Related
I am seeing issues with the Graph API method findMeetingTimes.
As you can see from the attached file, the API response differs depending on the start time. When using flat times like 12:00, the response includes only flat times, while when using non-flat times like 12:15 it only includes half-hour times.
So to get all possible meeting times I would have to make at least two API calls, which doesn't seem very practical.
Is there something I am missing ?
Thanks in advance,
Jacky
No, you are not missing anything. You'll need to call the API more than once to get suggestions with overlapping times.
The API returns the nearest available time to the start time specified in the request. The suggestions will always be on the hour or at half past.
Thereafter, it will give suggestions in increments of 30 minutes (or the value specified in the meetingDuration property) from the first suggestion, without overlaps.
If you had set your start time to, say, 12:15, the first available time is 13:00, and the meeting duration is 1h, all suggestions will be on the hour. The same applies if you'd set the start time to 12:00 and the first available time is at 12:30: all suggestions will be at half past.
You can add the returnSuggestionReasons property to your request, which gives an explanation of why a particular time was suggested:
{
"timeConstraint": {
"activityDomain": "unrestricted",
"timeSlots": [
{
"start": {
"dateTime": "2021-05-24T12:00:00",
"timeZone": "UTC"
},
"end": {
"dateTime": "2021-05-24T18:00:00",
"timeZone": "UTC"
}
}
]
},
"meetingDuration": "PT30MIN",
"returnSuggestionReasons": "true"
}
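If you do end up making two calls to cover both the on-the-hour and the half-past grid, a rough sketch could look like this (JavaScript with fetch; the token handling, dates, and merging logic are my assumptions for illustration, not part of the Graph documentation):
// Hypothetical sketch: call findMeetingTimes twice with start times on
// different 30-minute grids, then concatenate the suggestion lists.
const endpoint = "https://graph.microsoft.com/v1.0/me/findMeetingTimes";

function buildBody(startDateTime) {
  return {
    timeConstraint: {
      activityDomain: "unrestricted",
      timeSlots: [{
        start: { dateTime: startDateTime, timeZone: "UTC" },
        end: { dateTime: "2021-05-24T18:00:00", timeZone: "UTC" }
      }]
    },
    meetingDuration: "PT30M",
    returnSuggestionReasons: true
  };
}

async function findTimes(accessToken, startDateTime) {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      "Authorization": "Bearer " + accessToken,
      "Content-Type": "application/json"
    },
    body: JSON.stringify(buildBody(startDateTime))
  });
  const json = await response.json();
  return json.meetingTimeSuggestions || [];
}

async function allSuggestions(accessToken) {
  // One call starting on the hour, one at quarter past, so both grids show up.
  const [onHour, offHour] = await Promise.all([
    findTimes(accessToken, "2021-05-24T12:00:00"),
    findTimes(accessToken, "2021-05-24T12:15:00")
  ]);
  return onHour.concat(offHour);
}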
I have been facing an issue while retrieving the year from a Snowflake table.
My table has values as below:
year: 20
day: 10
month: 02
I need the dob value as 2020-10-02. When I use concat_ws I get the expected result, however the year gets padded with zeros and the dob is printed like 0020-10-02.
Also, when we have 99 in the year column, then while retrieving it should display 1999.
I have created query as below:
select to_date('concat_ws('-',json:year::varchar,json:month::varchar,json:date::varchar)', 'MM/DD/YYYY') from XXX;
Please also suggest any functions that could help.
Take a look at this
YY
Two-digit year, controlled by the TWO_DIGIT_CENTURY_START session parameter, e.g. when set to 1980, values of 79 and 80 parsed as 2079 and 1980 respectively.
select TO_DATE('99-10-02','YY-MM-DD');
There's no way to automagically display the "right" year beforehand: if some of your users are going to be very young and others very old, how could you know for sure whether the person was born in the 20th or the 21st century?
I didn't quite get how your data is stored, so I'll assume a json structure since it appears in your query.
set json = '{
"elements": [{
"year": "01",
"month": "02",
"day": "10"
}, {
"year": "99",
"month": "02",
"day": "10"
}, {
"year": "20",
"month": "02",
"day": "10"
}]
}'::varchar;
Now I'll parse the json and extract the values; I'm putting that here so you can make sure we have the same data structure.
CREATE OR REPLACE TEMP TABLE test AS
select t2.VALUE: day::varchar dob_day,
t2.VALUE: month::varchar dob_month,
t2.VALUE: year::varchar dob_year
from (select parse_json($json) as json) t,
lateral flatten(input => parse_json(t.json), path => 'elements') t2
Now comes the part that will interest you. It is a dirty trick: it assumes that if the two-digit year is higher than the current two-digit year, then it cannot be in the 2000s and must instead be in the 1900s.
SELECT to_date(concat_ws('-',
IFF(t2.dob_year > RIGHT(extract(year, current_timestamp), 2), '19'||t2.dob_year , '20'||t2.dob_year ),
t2.dob_month,
t2.dob_day)) proper_date
FROM test t2;
Changing the date format in your to_date to 'YY-MM-DD' should give you the DOB you want, and I suggest using try_to_date if you suspect data issues, as it will return NULL if the value is not a valid date.
NOTE: if you are using US formatting, then use 'YY-DD-MM' (with the month at the end).
select to_date(concat_ws('-', json:year::varchar, json:month::varchar, json:date::varchar), 'YY-MM-DD') from XXX;
Also, if you want to safely check that the DOB is not in the future, then add IFF as follows:
IFF(CURRENT_DATE > to_date(concat_ws('-', json:year::varchar, json:month::varchar, json:date::varchar), 'YY-MM-DD'), to_date(concat_ws('-', json:year::varchar, json:month::varchar, json:date::varchar), 'YY-MM-DD'), NULL);
I'm looking for a way to override Highcharts.timeUnits, which seems to be the array defining when to jump from one unit to another. (It's not as simple as that in the source code, but ...)
Why? Because my need is the following:
I have time-series data, in 15-minute steps, for 3 years.
I want a zoomable column chart (I use Highstock in practice), where I have 1 column per month at a given zoom level, 1 column per week when I zoom in more, then 1 per day, 1 per hour, and then 1 per 15 min.
The idea is to use plotOptions.series.dataGrouping.units to define these parameters:
units: [
    ['minute', [15]],
    ['day', [1]],
    ['week', [1]],
    ['month', [1]],
    ['year', [1]]
],
But Highcharts uses its own rules to jump from one unit to another, and I find these rules not appropriate for my use case (the switch happens too "early", I think).
I can change the behaviour a bit by modifying groupPixelWidth, but this is a last resort, because it behaves strangely anyway.
Here is an example. If you click on the various time scales, you'll see that the time unit chosen is often not appropriate.
For example, choosing "week" or a span from Nov 5, 2018 to Nov 26, 2018: you get data displayed by hour, but data by day would be more readable, etc.
https://jsfiddle.net/f4wma38h/
So you are looking for 15-minute and hour zooming in your chart, am I right?
xAxis: {
type: 'datetime',
minRange: 15 * 60 * 1000
},
rangeSelector: {
buttons: [{
type: 'minute',
count: 15,
text: '15m'
}, {
type: 'minute',
count: 60,
text: '1h'
},.....
Make the xAxis minRange 15 minutes in milliseconds, and in the rangeSelector add buttons for 15m and 1h.
jsfiddle: https://jsfiddle.net/karnan796/f4wma38h/11/
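To combine that with the grouping steps from the question, a rough sketch of the relevant Highstock options might look like this (the groupPixelWidth value, the hour step, and the seriesData name are my assumptions to tune, not taken from the fiddle):
// Hypothetical sketch: a 15-minute zoom floor plus explicit grouping units.
Highcharts.stockChart('container', {
  xAxis: {
    type: 'datetime',
    minRange: 15 * 60 * 1000            // never zoom below one 15-minute bucket
  },
  rangeSelector: {
    buttons: [
      { type: 'minute', count: 15, text: '15m' },
      { type: 'minute', count: 60, text: '1h' },
      { type: 'day', count: 1, text: '1d' },
      { type: 'week', count: 1, text: '1w' },
      { type: 'month', count: 1, text: '1m' }
    ]
  },
  plotOptions: {
    series: {
      dataGrouping: {
        // A larger groupPixelWidth makes Highcharts switch to coarser units sooner.
        groupPixelWidth: 20,
        units: [
          ['minute', [15]],
          ['hour', [1]],
          ['day', [1]],
          ['week', [1]],
          ['month', [1]]
        ]
      }
    }
  },
  series: [{ type: 'column', data: seriesData }]   // seriesData: your 15-minute values
});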
select * from weather.forecast where woeid in (SELECT woeid FROM geo.placefinder WHERE text="30.7063633,76.7047791" and gflags="R")
I am using the above YQL to fetch the weather conditions for some lat, lng to show in my iOS app. The response has "pubDate":
"pubDate": "Fri, 29 May 2015 8:30 am IST",
"condition": {
"code": "28",
"date": "Fri, 29 May 2015 8:30 am IST",
"temp": "89",
"text": "Mostly Cloudy"
My concern is: will this "pubDate" ever change? I mean, at 8:30 am the weather is mostly cloudy; maybe at 12 noon it won't be. If I access this YQL at 12 noon, will the response be the same?
Also, I have no idea about the "and gflags="R"" part of the query.
As per the Yahoo developer docs here:
pubDate: The date and time this forecast was posted, in the date format defined by RFC822 Section 5, for example Mon, 25 Sep 17:25:18 -0700.
lastBuildDate: The last time the feed was updated. The format is in the date format defined by RFC822 Section 5, for example Mon, 25 Sep 17:25:18 -0700.
So, unless and until the backend gets a "temperature change" update for a particular location, the API will not reflect any change. That's why "lastBuildDate" also comes in the json: it specifies when the temperature feed was last updated. So you can't do anything manually to get the temperature of a particular location at the current time.
If you call this API at different moments during the same day, you will see that lastBuildDate is the same as the date and time of your call.
The problem here is that the date in the condition doesn't change, and after some time the condition itself becomes obsolete, as you can easily verify using the Yahoo weather app.
I'm using the latest Kibana 4 / ES 1.4 version and I'm trying to display the number of tweets over time. My idea is to slice the 'created_at' field from the tweets documents.
The mapping defined for this field is the following:
"dynamic_templates": [
{
"created_at": {
"mapping": {
"locale": "US",
"format": "EEE MMM dd HH:mm:ss Z yyyy",
"type": "date"
},
"match": "created_at"
}
},
...
I can create basic charts in Kibana (with term aggregation fields) and overall it seems to be working, but I cannot display any trends with line charts or a date histogram with the created_at field.
Below is the error:
ElasticsearchParseException[failed to parse date field [2014-10-13T23:35:31.450Z],
tried both date format [EEE MMM dd HH:mm:ss Z yyyy], and timestamp number]; nested:
IllegalArgumentException[Invalid format: \"2014-10-13T23:35:31.450Z\"]; }
Thanks for your help,
Arnaud
I'm unsure if the format of your date is correct, but if it is: I had a similar problem in Elasticsearch 1.4 where the parser wouldn't recognise the timestamp unless it was mapped within properties:
curl -XPUT 'http://localhost:9200/index/container/_mapping' -d'
{
"container" : {
"properties" : {
"#timestamp" : {"type":"date", "format": "dateOptionalTime"}
}
}
}'
Also, it may not be relevant to you, but here is a useful article on naming date conventions: http://joelabrahamsson.com/dynamic-mappings-and-dates-in-elasticsearch/
And the Elasticsearch date format list: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/mapping-date-format.html
Hope this helps!