I have some data coming in over the network every minute from collectd and being written into my InfluxDB database. Since it arrives via the network it never lands exactly on the one-minute mark and has some jitter...
I am trying to reconcile this irregular data to one-minute intervals, but I am seeing some gaps. Please see the example below.
> precision rfc3339
>
> select value from snmp_ecio where host = 'my_host_a' and time > now() - 10m
name: snmp_ecio
time value
---- -----
2017-01-22T00:25:59.987735Z -0.1
2017-01-22T00:27:00.003208Z -0.1
2017-01-22T00:28:00.047265Z -0.2
2017-01-22T00:29:00.142676Z -0.1
2017-01-22T00:30:00.048707Z -0.3
2017-01-22T00:31:00.211728Z -0.1
2017-01-22T00:31:59.980621Z -0.1
2017-01-22T00:32:59.795329Z -0.1
2017-01-22T00:34:03.206552Z -0.1
2017-01-22T00:35:00.01463Z -0.1
> select mean(value) from snmp_ecio where host = 'my_host_a' and time > now() - 10m group by time(1m)
name: snmp_ecio
time mean
---- ----
2017-01-22T00:25:00Z -0.1
2017-01-22T00:26:00Z
2017-01-22T00:27:00Z -0.1
2017-01-22T00:28:00Z -0.2
2017-01-22T00:29:00Z -0.1
2017-01-22T00:30:00Z -0.3
2017-01-22T00:31:00Z -0.1
2017-01-22T00:32:00Z -0.1
2017-01-22T00:33:00Z
2017-01-22T00:34:00Z -0.1
2017-01-22T00:35:00Z -0.1
Has anyone experienced this issue or can point me in the right direction?
Thanks!
To fill in missing time intervals, use fill() on your GROUP BY time() clause. Depending on which version of InfluxDB you're using, you'll have various options for the fill. If you're on 1.1+, I'd use fill(linear); otherwise, I'd choose fill(previous).
SELECT mean(value)
FROM snmp_ecio
WHERE host = 'my_host_a' AND time > now() - 10m
GROUP BY time(1m) fill(linear)
Related
Prometheus does support binary comparison operators between an instant vector and a scalar. E.g. memory_usage_bytes > 1024. But is it possible to query a gauge metric that is greater than X and smaller than Y at the same time? How can I achieve something like memory_usage_bytes > 1024 && <= 2048?
Ok, I think I figured it out. A query like this would return metrics within a value range:
(go_gc_duration_seconds > 0.0002) < 0.0006
Use the following:
memory_usage_bytes > 1024 and memory_usage_bytes <= 2048
Hello generous people,
I am writing a model of farmers' decision making based on the previous period's crop production. Initially, the size of a land parcel (small or large) determines whether a farmer uses groundwater or surface water. In later ticks the farmer decides which type of water to use (groundwater or surface water) based on crop production: a high level of crop production gives the farmer a memory value above some threshold X, and if memory is higher than X the farmer follows the strategy that produced the higher crop. What I cannot work out is how a farmer's memory should be built up so it can be used as an input in the same code block I have written for the initial yield. Any help would be appreciated.
globals [ surface-water groundwater maximum-yield water-demand ]
turtles-own [ yield memory land groundwater-use surfacewater-use ]

to setup
  clear-all
  ;; global water and yield parameters
  set surface-water 10
  set groundwater 20
  set maximum-yield 60
  set water-demand 17
  create-turtles 5 [
    set yield 0
    set memory 0
    set land 3 + random 5   ;; parcel size between 3 and 7
  ]
  reset-ticks
end

to go
  ask turtles with [ land >= 4 ] [
    ;; random 2 returns 0 or 1 (note: random 1 always returns 0)
    ifelse random 2 = 0 [
      ;; meet the full water demand from groundwater
      set groundwater-use water-demand
      set yield 0.8 * maximum-yield
      set memory 100 * yield / maximum-yield   ;; memory stored as % of maximum yield
    ] [
      ;; split the demand between groundwater and surface water
      set groundwater-use 0.5 * water-demand
      set surfacewater-use 0.5 * water-demand
      set yield 0.85 * maximum-yield
      set memory 100 * yield / maximum-yield
    ]
  ]
  ask turtles with [ land < 4 ] [
    set groundwater-use 0.5 * water-demand
    set surfacewater-use 0.5 * water-demand
    set yield 0.85 * maximum-yield
    set memory 100 * yield / maximum-yield
  ]
  tick
end
I want to build an Exploration AFL. Below is the scenario.
Momentum Score:
Monthly momentum values are calculated as cumulative returns over the past 12 months.
The monthly momentum is calculated in 3 steps
1) We calculate gross monthly returns by adding one to the monthly return expressed as a decimal. For example, from a monthly return of 5% (0.05) we get a gross monthly return value of 1.05 (0.05 + 1), while from a monthly return of -5% (-0.05) we get a gross monthly return of 0.95 (-0.05 + 1.0).
2) We multiply all the gross monthly returns of past 12 months.
3) We subtract one from the resultant value from step 2 to get the net 12-month momentum score.
To illustrate this calculation, let's say AUROPHARMA (Aurobindo Pharma) stock has moved by 2%, -5%, 4.3%, 5%, 10.1%, -2.2%, -6%, 3.6%, 0.1%, 0.4%, 1.4%, -2.6% over the past 12 months. Then we add 1 to each monthly return, multiply all of them together, and subtract one from the result to get the momentum score.
Momentum Score = (1.02)(0.95)(1.043)(1.05)(1.101)(0.978)(0.94)(1.036)(1.001)(1.004)(1.014)(0.974) - 1
This gives a momentum score of 10.45% (0.1045) for the Aurobindo Pharma stock.
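For reference, here are the same three steps as a small Python sketch (plain math rather than AFL; the returns are the hypothetical monthly figures from the example above):

# Hypothetical monthly returns from the example, as decimals
monthly_returns = [0.02, -0.05, 0.043, 0.05, 0.101, -0.022, -0.06,
                   0.036, 0.001, 0.004, 0.014, -0.026]

# Step 1: gross monthly returns (1 + r)
gross = [1 + r for r in monthly_returns]

# Step 2: multiply the 12 gross returns together
product = 1.0
for g in gross:
    product *= g

# Step 3: subtract one to get the net 12-month momentum score
momentum_score = product - 1
print(round(momentum_score, 4))   # ~0.1045, i.e. about 10.45%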
Can someone please help?
TimeFrameSet(inMonthly);  // work on monthly bars
// The product of 12 gross monthly returns telescopes to Close / Ref(Close, -12),
// so the 12-month percent change below is the momentum score expressed in percent.
TtD_Change = 100 * (Close - Ref(Close, -12)) / Ref(Close, -12);
_SECTION_BEGIN("Explorer");
Filter = 1;  // include every symbol in the exploration
AddColumn(TtD_Change, "Momentum", 1.2, IIf(TtD_Change > 0, colorGreen, colorRed));
_SECTION_END();
I want to map Euclidean distance to the range [0, 1], somewhat like the cosine similarity of vectors.
For instance
input   output
0       1.0
1       ~0.9
2       0.8 to 0.9
inf     0.0
I tried the formula 1/(1+d), but that falls away from 1.0 too quickly.
It seems that you want the fraction's denominator to grow more slowly (the denominator is the bottom part, which you have as (d+1) so far). There are various ways to handle this. For instance, try a lower power for d, such as
1 / (1 + d**(0.25))
... or an exponential decay in the denominator, such as
1 / (1.1 ** d)
... or using a trig function to temper your mapping, such as
1 - tanh(d)
Would something in one of these families work for you?
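If it helps to compare them, here is a minimal Python sketch of the three candidate mappings (the constants 0.25 and 1.1 are just the example values above; adjust them to control how quickly the score decays):

import math

def root_decay(d):
    # 1 / (1 + d**0.25): the denominator grows slowly with d
    return 1.0 / (1.0 + d ** 0.25)

def exp_decay(d):
    # 1 / (1.1**d): geometric decay; the base sets how fast it falls
    return 1.0 / (1.1 ** d)

def tanh_decay(d):
    # 1 - tanh(d): equals 1 at d = 0 and approaches 0 as d grows
    return 1.0 - math.tanh(d)

for d in [0, 1, 2, 5, 10]:
    print(d, round(root_decay(d), 3), round(exp_decay(d), 3), round(tanh_decay(d), 3))

Of these, 1 / (1.1 ** d) lands closest to the sample targets in the question: about 0.91 at d = 1 and about 0.83 at d = 2.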
Suppose I'm trying to create a neural network to recognize characters on a simple 5x5 grid of pixels. I have only 6 possible characters (symbols): X, +, /, \, |, -
At the moment I have a feedforward neural network with 25 input nodes, 6 hidden nodes, and a single output node (a sigmoid, between 0 and 1).
The output corresponds to a symbol, such as 'X' = 0.125, '+' = 0.275, '/' = 0.425, etc.
Whatever the network outputs at test time is mapped to whichever character is closest numerically, e.g. 0.13 = 'X'.
On Input, 0.1 means the pixel is not shaded at all, 0.9 means fully shaded.
After training the network on the 6 symbols I test it by adding some noise.
Unfortunately, if I add a tiny bit of noise to '/', the network thinks it's '\'.
I thought maybe the ordering of the 6 symbols (i.e. which numeric value each one corresponds to) might make a difference.
Maybe the number of hidden nodes is causing this problem.
Maybe my general concept of mapping characters to numbers is causing the problem.
Any help would be hugely appreciated to make the network more accurate.
The output encoding is the biggest problem. Use a one-hot encoding for the output instead, so that you have six output nodes.
For example,
- 1 0 0 0 0 0
X 0 1 0 0 0 0
+ 0 0 1 0 0 0
/ 0 0 0 1 0 0
\ 0 0 0 0 1 0
| 0 0 0 0 0 1
This is much easier for the neural network to learn. At prediction time, pick the node with the highest value as your prediction. For example, if you get the output values below at each output node:
- 0.01
X 0.5
+ 0.2
/ 0.1
\ 0.2
| 0.1
Predict the character as "X".
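A minimal sketch of the encoding and the argmax step in Python with NumPy, assuming the same symbol order and example output values as above:

import numpy as np

symbols = ['-', 'X', '+', '/', '\\', '|']

# One-hot target for each symbol: a length-6 vector with a single 1
def one_hot(symbol):
    target = np.zeros(len(symbols))
    target[symbols.index(symbol)] = 1.0
    return target

print(one_hot('X'))            # [0. 1. 0. 0. 0. 0.]

# At prediction time, pick the output node with the highest value
network_output = np.array([0.01, 0.5, 0.2, 0.1, 0.2, 0.1])
predicted = symbols[int(np.argmax(network_output))]
print(predicted)               # X

With six output nodes the network no longer has to squeeze six classes onto a single axis, so visually similar symbols such as '/' and '\' stop competing for adjacent output values.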