Is it possible to write an InfluxDB query that will give me the number of milliseconds since the last entry in a time series? I'd like to add a single-stat panel in Grafana displaying how old the data is.
I don't think it is possible, since you are not able to query the time alone: an InfluxDB query needs at least one non-time field. You could work around that by storing the time a second time in an extra field, which you are able to query on its own.
But you would still want to compute now() - "the extra time field", and as far as I found out you also can't use now() inside Grafana.
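If you go down that road, a minimal sketch of the workaround (the measurement name my_measurement and the field epoch_ms are just placeholders for illustration):

Write the epoch timestamp in milliseconds as a normal field alongside your data, e.g. in line protocol (the trailing number is the point's own timestamp in nanoseconds):

my_measurement,host=server01 value=42,epoch_ms=1481962345000i 1481962345000000000

Then you can query the duplicated timestamp like any other field:

SELECT last("epoch_ms") FROM "my_measurement"

The "now minus last(epoch_ms)" part would still have to happen outside the query, which is exactly what Grafana couldn't do at the time.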
Update: there is now a feature request for this on Grafana's GitHub. Make sure to vote it up so it gets implemented one day: https://github.com/grafana/grafana/issues/6710
Update 2: The feature finally got implemented -> see my answer here: How to show "33 minutes ago" on Grafana dashboard with InfluxDB?
Every example of a time-series graph I've seen for Google Data Studio has a metric plotted per day. Is there any way to configure the granularity of the time axis (hour, month, etc)?
I want to show the count of events per hour throughout one day.
My columns are in BigQuery with types datetime:TIMESTAMP and count:INTEGER.
This is old, but it's high in search results, and I found the correct answer here: https://www.en.advertisercommunity.com/t5/Data-Studio/Is-it-possible-to-aggregate-by-hour/td-p/1104815
Click the pencil next to the data source, then on your timestamp click the Type column and change "Date (YYYYMMDD)" to another format, such as "Date Hour (YYYYMMDDHH)", to get hourly aggregation on the graphs.
Now, there seems to be a straightforward way:
Assuming you have a Time Series chart with Timestamp as your dimension and want to show hourly aggregated values:
Select the dimension's preferences:
Change Type to Date Hour
Edit: Based on new updates to Data Studio, @Brian's answer above is the correct one.
You can create a calculated field with the TODATE function. An example formula is TODATE(source_field, 'SECONDS', '%Y%m%d%H'). This field should then be marked as Date (YYYYMMDDHH) in the field editing screen.
There is a way to do this. If the data is broken up hour by hour, make the hour column(s) the Time Dimension. If all of the hour data isn't in a single column, you may want to reformat your data (manually or using a data prep tool, your choice). Alternatively, you can go into the field's time settings and change it to hours.
Is it possible to use the time field in a single stat panel in grafana?
I understand you cannot query only the time field in InfluxDB, but I can get the time of the stat I'm interested in like so:
select time, last(context_id) from "data_context"
I just need a way to show the time field from the result of that query.
This is asked quite often on Stack Overflow, but it is not possible at the moment. There are, however, open feature requests for it on GitHub:
[Feature request] Show timestamp on SingleStat #6710
Showing time from InfluxDB query in Singlestat panel #2764
So my main goal is to build a graph in Grafana that charts how much data I've gone through in a month (I'm on a Comcast line).
However, a month is not a time interval that InfluxDB's GROUP BY time() function supports. From the documentation I looked at, the longest supported interval is a week, likely because a week's length doesn't vary the way a month's does.
I noticed that all my timestamps use the same format (it would be weird if they didn't, I guess). I know that InfluxDB supports regular expressions in FROM and WHERE clauses, but does it support them in GROUP BY? If it did, I could use something like /-([^-]+)-/ against timestamps like 2016-12-18T08:25:50Z and group by that. Or does InfluxDB support nested queries?
Edit: it looks like I was reading the 0.9 documentation. I switched to 1.1, but nothing about my question changes.
GROUP BY time() queries group query results by a user-specified time interval
I'm not really sure how that would work for a month, considering that the length of the interval changes.
No version of InfluxDB supports GROUP BY <regular expression>, but as of 1.2 subqueries are supported.
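To give an idea of what that could look like, here is a minimal sketch, assuming a measurement named bandwidth with a field bytes (both names are made up for illustration). A plain WHERE clause already lets you total one calendar month today:

SELECT sum("bytes") FROM "bandwidth" WHERE time >= '2016-12-01T00:00:00Z' AND time < '2017-01-01T00:00:00Z'

And with a 1.2 subquery you could roll daily totals into one number (this returns the same total, it mainly illustrates the syntax):

SELECT sum("daily_bytes") FROM (SELECT sum("bytes") AS "daily_bytes" FROM "bandwidth" WHERE time >= '2016-12-01T00:00:00Z' AND time < '2017-01-01T00:00:00Z' GROUP BY time(1d))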
I am very new to Zabbix. I have tried my hand at triggers; what I was able to make out is that they can fire on some constant threshold. What I need is to compare the current data with the data from exactly one week ago at the same time, and to trigger an alert if the change is above some particular % threshold.
I tried keeping the current data and the week-old data in an external database and querying that data with the Zabbix ODBC drivers, but I got stuck when I was unable to compare the two items.
If my description of the issue is confusing, let me know and I will be clearer about my problem.
You can use the last() function for this.
For example, if we sample our data every 5 minutes and we want to compare the last value with the value from 10 minutes ago, we can use
(item.last(#1)/item.last(#3)) > 1.2
This will trigger an alert if the latest value is more than 20% greater than the value from 10 minutes ago.
From the documentation it is not very clear to me whether you can use seconds or whether they will be ignored (for example item.last(60) to get the value from 1 minute ago), but you can read more about the last() function here:
https://www.zabbix.com/documentation/2.4/manual/appendix/triggers/functions
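For the original question (comparing against the value from exactly one week ago), last() also accepts a time shift as its second parameter, so a sketch along these lines might work; the host name myhost and item key my.item are placeholders, not anything from your setup:

{myhost:my.item.last(#1)} / {myhost:my.item.last(#1,1w)} > 1.2

The second call evaluates last(#1) as if it were run one week earlier, so the expression fires when the current value is more than 20% above the week-old value. If drops should alert as well, you would need a second expression with the comparison reversed.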
I need to store a quite large amount of boolean values in a database used by a Rails application - it needs to store 60 boolean values in a single record per day. What is the best way to do this in Rails?
Queries that I will need to program or execute:
* CRUD
* summing up how many true values are for each day
* possibly (but not necessarily) other reports, like how often true is recorded in each field
UPDATE: This is to store events that may or may not occur in 5-minute intervals between 9am and 1pm. If an event occurs, I need to set the value to true, otherwise false. Measurements are done manually and users will report this information using checkboxes on the website. There might be small updates, but most of the time it's just a one-time entry and then the queries listed above.
UPDATE 2: The 60 values per day are per user, and there will be between 1000-2000 users. If there isn't some library that helps with this, I will go for the simplest approach and deal with performance issues later if they come up. Every day a user reports events by checking the desired checkboxes on the website, so there is normally a single data-entry moment per day (or a few, if it isn't done daily).
This depends on a lot of different things. Do you need callbacks to run? Do you need AR objects instantiated? What is the frequency of these updates? Are they done frequently but not many at a time, or rarely but in bunches? Could you represent these booleans as a mask instead? We definitely need more context.
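To make the mask idea concrete, here is a rough sketch (the model name DailyReport and the column slots_mask are made up for illustration, not from the question): pack the 60 booleans into a single 64-bit integer column, one bit per 5-minute slot, and count the set bits when you need the daily total:

# assumes a daily_reports table with an integer (limit: 8, i.e. bigint) column slots_mask
class DailyReport < ActiveRecord::Base
  SLOTS = 60

  # mark slot i (0..59) as true
  def set_slot(i)
    self.slots_mask = (slots_mask || 0) | (1 << i)
  end

  # was slot i true?
  def slot?(i)
    (slots_mask || 0)[i] == 1
  end

  # how many of the 60 values are true for this day
  def true_count
    (slots_mask || 0).to_s(2).count("1")
  end
end

This keeps each day (per user) in one row and makes "how many true values today" cheap, at the cost of making per-field reporting in SQL more awkward than with 60 real boolean columns or a separate events table.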
Why do these need to be in a single record? Can't you use a 'days' table to tie them all together, then use a day_id column in your 'events' table?
Specify in the Day model that it 'has_many :events' and specify in the Event model file that it 'belongs_to :day'. Then you can find all the events for a day with just the id for the day.
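A minimal sketch of those two models (the boolean occurred column on events is an assumption added here so the count example further down has something to filter on):

class Day < ActiveRecord::Base
  has_many :events
end

class Event < ActiveRecord::Base
  belongs_to :day   # events table has day_id and, in this sketch, a boolean occurred column
end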
For the third day record, you'd do this:
this_day = Day.find 3
Then you can use 'this_day.events' to get all the events for that day.
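And for the "how many true values per day" query from the question, something like this would work, under the same assumption about an occurred column:

this_day.events.where(occurred: true).count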
You'll need to decide what you wish to use to identify each day, so you can query for a day's events using something that you understand. The id column I used above to find it probably won't work.
You could use the timestamp of the first moment of each day to do that, for example. Or you could rely on the 'created_at' column of the table being between the start and end of a day.
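For the created_at approach, a quick sketch of what that lookup might look like (the date here is just an example value):

date = Date.new(2016, 12, 18)
events_for_day = Event.where(created_at: date.beginning_of_day..date.end_of_day)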
And you'll want to be sure to think about what time zone you are using and how it will be stored in the database.
And if your data will be stored close to midnight, daylight saving time could also be an issue. I find it best to use GMT to avoid that.
Good luck.