How to get top 10 requesters from InfluxDB

I am using InfluxDB version 1.7.9.
I have the data below in a measurement named "tpam" in a database named "test", with around 15000 rows in the measurement.
I am looking for a query that returns the top 10 "Requester Name" values and their request counts for a particular "Request Month".
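A possible InfluxQL approach (a sketch, not a tested answer): GROUP BY only works on tags, so this assumes "Requester Name" and "Request Month" are tags, and the counted field "request_id" is a hypothetical placeholder for one of your actual field keys. InfluxQL cannot ORDER BY an aggregate, so the counts come from a subquery and TOP() then keeps the 10 largest:
SELECT TOP("requests", "Requester Name", 10)
FROM (
  SELECT COUNT("request_id") AS "requests"
  FROM "tpam"
  WHERE "Request Month" = 'January'
  GROUP BY "Requester Name"
)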

Related

How to get latest timestamp data

I am new to InfluxDB; please help me with a query.
I have data like the below in InfluxDB, where points with the same Name (A1, A2) can appear at multiple timestamps.
I need only the latest-timestamp data (rows 3, 4, 5) when the same Name appears at multiple timestamps, plus the new data (A3). Is such a query available in InfluxDB?
This query only gives one record:
SELECT time, Name, value FROM "data" ORDER BY time DESC LIMIT 1
You can use InfluxDB's LAST() function to achieve this:
SELECT LAST("value") FROM AssetAssetType GROUP BY "Name"
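Note that GROUP BY "Name" requires Name to be a tag. If you need every field of the latest point rather than just value, LAST(*) is a possible variant (sketched here against the question's "data" measurement rather than AssetAssetType):
SELECT LAST(*) FROM "data" GROUP BY "Name"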

InfluxDB: Starting cumulative_sum() from zero / aggregate grouping required for cumulative_sum and non_negative_difference

Using InfluxDB, I'm trying to produce an output that shows cumulative rainfall for a time period, starting from zero.
The rainfall sensor outputs a cumulative rainfall amount, but resets to zero on power-failure, restart etc.
My first query component uses non_negative_difference() to show the increments.
SELECT
  non_negative_difference(rain) AS nnd
FROM
  weather
WHERE
  $time_query
...which yields an increment per raw data point, for example:
2018-06-01T14:21:00.926Z 0
2018-06-01T14:22:02.959Z 0.30000000000000426
2018-06-01T14:23:04.992Z 0.3999999999999986
2018-06-01T14:24:07.024Z 0.10000000000000142
2018-06-01T14:25:09.059Z 0.19999999999999574
2018-06-01T14:26:11.094Z 0
2018-06-01T14:27:13.127Z 0.10000000000000142
2018-06-01T14:28:15.158Z 0.20000000000000284
2018-06-01T14:29:20.027Z 0.09999999999999432
2018-06-01T14:30:22.476Z 0.10000000000000142
2018-06-01T14:30:53.918Z 0.6000000000000014
2018-06-01T14:31:55.968Z 0.5
2018-06-01T14:32:58.007Z 0.5
2018-06-01T14:34:00.046Z 0.20000000000000284
2018-06-01T14:35:02.075Z 0.3999999999999986
2018-06-01T14:36:04.102Z 0.3999999999999986
2018-06-01T14:37:06.136Z 0.20000000000000284
2018-06-01T14:38:08.201Z 0
So far so good.
I'm now trying to stitch these readings back into a cumulative total, starting from zero for the intended period.
I can use cumulative_sum() for this, for example:
SELECT
  cumulative_sum(nnd)
FROM (
  SELECT
    non_negative_difference(rain) AS nnd
  FROM
    weather
  WHERE
    $time_query
)
which yields:
2018-06-01T14:21:00.926Z 0
2018-06-01T14:22:02.959Z 0.30000000000000426
2018-06-01T14:23:04.992Z 0.7000000000000028
2018-06-01T14:24:07.024Z 0.8000000000000043
2018-06-01T14:25:09.059Z 1
2018-06-01T14:26:11.094Z 1
2018-06-01T14:27:13.127Z 1.1000000000000014
2018-06-01T14:28:15.158Z 1.3000000000000043
2018-06-01T14:29:20.027Z 1.3999999999999986
2018-06-01T14:30:22.476Z 1.5
2018-06-01T14:30:53.918Z 2.1000000000000014
2018-06-01T14:31:55.968Z 2.6000000000000014
2018-06-01T14:32:58.007Z 3.1000000000000014
2018-06-01T14:34:00.046Z 3.3000000000000043
2018-06-01T14:35:02.075Z 3.700000000000003
2018-06-01T14:36:04.102Z 4.100000000000001
2018-06-01T14:37:06.136Z 4.300000000000004
2018-06-01T14:38:08.201Z 4.300000000000004
Looking good!
Now I'd like to group it up into more distinct time buckets, for nice graphing.
Let's try....
SELECT
  cumulative_sum(max(nnd))
FROM (
  SELECT
    non_negative_difference(rain) AS nnd
  FROM
    weather
  WHERE
    $time_query
)
GROUP BY
  time(5m)
...and I get an error:
ERR: aggregate function required inside the call to non_negative_difference
But I cannot find a reasonable way of adding aggregates and groupings to non_negative_difference() that does not affect the accuracy of the differencing itself.
The only thing I've been able to do is a dummy SUM() aggregate over time groups smaller than the sensor period, but that isn't robust enough for my liking (and I'm still not sure it is 100% correct).
Is it correct that I must have both queries as aggregate queries?
I was trying to do this very thing for my weather station. Instead of having the weather station calculate the cumulative value I wanted Grafana to do it. The solution that worked for me is the advanced syntax Yuri Lachin mentions in his comments.
With InfluxDB you can use CUMULATIVE_SUM(), but the basic syntax doesn't allow you to group by time (only by tag). The "advanced syntax", however, allows you to have a time series by nesting an aggregate function like MEAN() or SUM() inside it.
Here's the function I am using in Grafana to get a cumulative rainfall total for a selected time period:
SELECT CUMULATIVE_SUM(MEAN("rainfall")) FROM "weather" WHERE $timeFilter GROUP BY time(1h) fill(0)
The GROUP BY is, of course, flexible. I was interested in hourly rainfall so I grouped by 1h. You can group by the time period you find most interesting.
Using this query the rainfall will start from zero for the period you select in Grafana. In the Seattle area we had measurable rain (I know, shocker) on 8/6/2020 and 8/8/2020. If I set my date range to include both dates, the graph shows just under 0.2 mm total rainfall; if I switch my graph to 8/8 and 8/9, the total is just under 1 mm.
Note: I was also interested in seeing the individual bucket tips so included those as bars on the second Y-axis.
For more detail see: https://docs.influxdata.com/influxdb/v1.8/query_language/functions/#advanced-syntax-7
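Applied back to the original question's cumulative gauge, the same advanced syntax should also clear the "aggregate function required" error: nest an aggregate such as MAX() inside NON_NEGATIVE_DIFFERENCE() in the inner query, then take the cumulative sum of the result. An untested sketch using the question's own field and template variable:
SELECT CUMULATIVE_SUM("nnd")
FROM (
  SELECT NON_NEGATIVE_DIFFERENCE(MAX("rain")) AS "nnd"
  FROM "weather"
  WHERE $time_query
  GROUP BY time(5m)
)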

Elasticsearch group by ranges of time in Rails

I have a large dataset in which I need to group values based on the created_at time.
The requirement is that they're grouped into 5-minute intervals.
I think that this should do it, but it doesn't seem to work:
self.search(aggs: {created_at: {date_histogram: {field: 'created_at', interval: '5m'}}})
This is the search query:
curl http://localhost:9200/prices_development/_search?pretty -d '{
  "query": {"match_all": {}},
  "size": 1000,
  "from": 0,
  "aggs": {"created_at": {"date_histogram": {"field": "created_at", "interval": "5m"}}},
  "timeout": "11s",
  "_source": false
}'
It just gives me back the entire set of data, though, i.e. every record.
How can I get back the data set, grouped into 5-minute intervals?
Aggregation results are present in the aggregations field of the response, not in hits.
Set size inside aggs to limit the number of aggregation buckets.
If you only want aggregation results, set the outer size to 0.
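Putting that together with the question's query, a trimmed request might look like this (same index and field names as above, with the outer size set to 0 so only the aggregation buckets come back):
curl http://localhost:9200/prices_development/_search?pretty -d '{
  "query": {"match_all": {}},
  "size": 0,
  "aggs": {"created_at": {"date_histogram": {"field": "created_at", "interval": "5m"}}}
}'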

How to query how many metrics in a period with Influxdb?

I want to know how many events we send to InfluxDB for a given period. If I use the query SELECT COUNT(value) FROM /./ WHERE time > now() - 1h GROUP BY time(10m), I get the count grouped per metric, but I want the total for all metrics.
If I use SELECT COUNT(*) FROM /./ WHERE time > now() - 1h GROUP BY time(10m), I get an error:
Server returned error: expected field argument in count()
The COUNT() function takes one and only one field key as an argument. If you have field keys that are not named value, you will have to run a separate query to count them.
Alternatively, you can run them together like:
SELECT COUNT(value), COUNT(otherfield), COUNT(anotherfield) FROM /./ WHERE time > now() - 1h GROUP BY time(10m)
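Depending on the 1.x release, COUNT() also accepts a regular expression as its argument, which counts every matching field key without naming each one. If your version supports it, something like this avoids listing the fields by hand:
SELECT COUNT(/.*/) FROM /./ WHERE time > now() - 1h GROUP BY time(10m)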

Tricky getting average for price when summing extended and qty

The data we display is summarized by order type, where ABCDE is the sum of one or more rows with the same item number. The data looks like:
Item    Order Type    QTY    Price    Ext. Price
ABCDE   INT           10              $100
I am not displaying Price because this row is a summary of several rows in the selected time frame, and the price of these items changes over time or per customer.
So what I can do is show an average price.
I have two formulas, but the result is not correct: every row shows the same number, 91,979.00.
Formula {#new avg}:
Sum({DATA_WHSV3.ITEM_PRC$}) / Count({DATA_WHSV3.ITEM_PRC$})
and then:
If {#new avg} > 0 Then Sum({DATA_WHSV3.ITEM_PRC$}) / {#new avg}
You can use a Running Total Field to do the average for you. Say your report is grouped by item number; you would create a new Running Total Field:
Running Total Name: RTotal0 (can be anything, this is just the default)
Field to Summarize: {DATA_WHSV3.ITEM_PRC$}
Type of Summary: average
Evaluate: For each record
Reset: On change of group: Group #1: DATA_WHSV3.ITEM_NUMBER
Then you can drop the {#RTotal0} into the group footer along with the other details for that item number and it should be the correct average.
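If a quantity-weighted average is closer to what you want than a straight average of the price column, a formula field in the same group footer is another option. This is only a sketch: the extended-price and quantity field names ({DATA_WHSV3.EXT_PRC$} and {DATA_WHSV3.QTY}) are guesses, so substitute your actual columns:
If Sum({DATA_WHSV3.QTY}, {DATA_WHSV3.ITEM_NUMBER}) > 0 Then
  Sum({DATA_WHSV3.EXT_PRC$}, {DATA_WHSV3.ITEM_NUMBER}) / Sum({DATA_WHSV3.QTY}, {DATA_WHSV3.ITEM_NUMBER})
Else
  0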
