I have an RRD created by this command:
===
data_sources = ['DS:Pprod:GAUGE:10:0:6000',
                'DS:Pcons:GAUGE:10:0:6000',
                'DS:Pdiff:GAUGE:10:-6000:6000']
rrdtool.create(rrd_filename,
               '--step', '5',           # one sample every 5 sec
               '--start', 'now - 7d',
               data_sources,
               'RRA:MAX:0.5:1:6307200')  # we keep 1 year
===
I am filling it properly:
===
<!-- 2022-05-19 12:25:45 CEST / 1652955945 --> <row><v>4.291018731e+03</v><v>1.715717220e+03</v><v>2.574301511e+03</v></row>
<!-- 2022-05-19 12:25:50 CEST / 1652955950 --> <row><v>4.286880929e+03</v><v>1.721000000e+03</v><v>2.564880929e+03</v></row>
<!-- 2022-05-19 12:25:55 CEST / 1652955955 --> <row><v>4.286880929e+03</v><v>1.721000000e+03</v><v>2.564880929e+03</v></row>
===
Now I want to "xport" some data from it:
rrdtool xport -s $epochStart -e $epochEnd --step 5 --json -t DEF:pot=$ifn:Pprod:MAX XPORT:pot
===
And in the output JSON I find the "step" has changed:
{ "about": "RRDtool graph JSON output",
"meta": {
"start": 1652738755,
"end": 1652824970,
"step": 215,
"legend": [
""
]
},
"data": [
[ "1652738755",0.000000000e+00 ],
[ "1652738970",0.000000000e+00 ],
===
Any logical reason for that?
P.S. "rrdtool info" says:
pi#R4:~/python/pkw/dades $ rrdtool info ./pkw.rrd
filename = "./pkw.rrd"
rrd_version = "0003"
step = 5
This is most likely caused by trying to export data outside the range currently covered by the defined RRA.
Your RRD has only a single RRA defined, covering from (now - 7d) at creation time until (6307200 x 5 sec) into the future.
The data you show being added to the RRD are for times around epoch=1652955945, but the export you are trying is for times around epoch=1652738755, before the RRA starts.
When I tried to duplicate this, I created an RRD:
rrdtool create $RRD --step 5 --start 1652955935 \
'DS:Pprod:GAUGE:10:0:6000' 'DS:Pcons:GAUGE:10:0:6000' \
'DS:Pdiff:GAUGE:10:-6000:6000' 'RRA:MAX:0.5:1:6307200'
Then added data to it
rrdtool update $RRD \
1652955940:0:10:20 \
1652955945:100:200:300 \
1652955950:400:500:600 \
1652955955:700:800:900 \
1652955960:1000:1100:1200
If I try to extract the data from within the range of the defined RRA, then it works:
rrdtool xport -s 1652955945 -e 1652956000 --step 5 --json \
-t DEF:pot=${RRD}:Pprod:MAX XPORT:pot
{ "about": "RRDtool graph JSON output",
"meta": {
"start": 1652955950,
"end": 1652956000,
"step": 5,
...
However, if I try to extract from a range that is not within the available data space, RRDTool fails to pick a meaningful RRA and so appears to default to a 215s step.
rrdtool xport -s 1652738755 -e 1652824970 --step 5 --json \
-t DEF:pot=${RRD}:Pprod:MAX XPORT:pot
{ "about": "RRDtool graph JSON output",
"meta": {
"start": 1652738970,
"end": 1652824970,
"step": 215,
...
I don't know why RRDTool is defaulting to this step size; however, it is documented behaviour that, when doing a graph or xport, RRDTool will attempt to find the best-fit RRA to cover the entire requested data range at the requested consolidation, even if this means changing the step size. So this looks as if RRDTool has given up on finding a workable RRA and has given you a default 215s step with empty data.
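Before exporting, you can ask the RRD itself which time range its RRA actually covers. rrdtool's first and last subcommands report the earliest timestamp held by a given RRA and the time of the most recent update (file name taken from your "rrdtool info" output):
rrdtool first ./pkw.rrd --rraindex 0   # earliest timestamp covered by RRA 0
rrdtool last ./pkw.rrd                 # time of the most recent update
If the -s/-e window you pass to xport falls outside that span, you will see this kind of odd step selection.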
I made a bash script for Nagios to test with Nagiosgraph. However, RRD files are not being created for this script. The default plugins that come with Nagios work well with Nagiosgraph, and RRD files for those plugins are present.
Here is the script:
#!/bin/bash
checkgpu=$( nvidia-smi --format=csv --query-gpu=utilization.gpu | awk '/[[:digit:]]+[[:space:]]%/ { tot+=$1;cnt++ } END { print tot/cnt }' | cut -d$
output="Load Average: $checkgpu"
if [ $checkgpu -ge 0 ]
then
echo "OK- $output"
exit 0
elif [ $checkgpu -eq 101 ]
then
echo "WARNING- $output"
exit 1
elif [ $checkgpu -eq 102 ]
then
echo "CRITICAL- $output"
exit 2
else
echo "UNKNOWN- $output"
exit 3
fi
What should I do to make this script work with Nagiosgraph/performance data?
Have a look at the development guidelines: https://nagios-plugins.org/doc/guidelines.html#AEN200
The expected format for perfdata is 'label'=value[UOM];[warn];[crit];[min];[max] which can look something like this:
PING ok - Packet loss = 0%, RTA = 0.80 ms | percent_packet_loss=0, rta=0.80
The pipe (|) character tells Nagios that the plugin output has ended and performance data starts.
Note that the above example does not specify UOM (unit of measurement, like percent), nor does it specify any warn/crit thresholds for the data, or min/max values for the graphs. These are all optional.
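For your GPU script, a minimal sketch of the final output lines could look like this (the gpu_util label and the 90;95 warn/crit thresholds are illustrative assumptions, not values from your setup):
# Append perfdata after the pipe: 'label'=value[UOM];[warn];[crit];[min];[max]
output="Load Average: $checkgpu"
echo "OK- $output | gpu_util=${checkgpu}%;90;95;0;100"
exit 0
Once the plugin emits perfdata in this format, Nagiosgraph should start creating RRD files for it.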
I am fairly new to jq, and I am using this tutorial to add a new Org to a Hyperledger Fabric network.
There is extensive use of jq throughout the tutorial, especially modifying json files.
The tutorial uses an example Org name but I am trying to make the org name dynamic. Everything works out well except when I try to pass variables to jq.
Here are the jq commands and their outputs.
jq version: 1.5.1
$ export MSPID=Org4MSP
$ echo $MSPID
Org4MSP
Trying to pass the variable using env.
Keyword: env.MSPID
$ jq -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {"env.MSPID":.[1]}}}}}' config.json org4.json
Output snippet: Instead of printing Org4MSP, it prints the literal string env.MSPID
"env.MSPID": {
"groups": {},
"mod_policy": "Admins",
"policies": {
"Admins": {
"mod_policy": "Admins",
"policy": {
"type": 1,
"value": {
"identities": [
{
Trying to pass the variable using the --arg option.
Keyword: "$MSP"
jq --arg MSP "$MSPID" -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {"$MSP":.[1]}}}}}' config.json org4.json
Output snippet: Instead of printing Org4MSP, it prints the literal string $MSP
"$MSP": {
"groups": {},
"mod_policy": "Admins",
"policies": {
"Admins": {
"mod_policy": "Admins",
"policy": {
"type": 1,
"value": {
"identities": [
{
Trying to pass a variable using the --arg option and without double quotes:
keyword: $MSP
$ jq --arg MSP "$MSPID" -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {$MSP:.[1]}}}}}' config.json org4.json
jq: error: syntax error, unexpected ':', expecting '}' (Unix shell quoting issues?) at , line 1:
.[0] * {"channel_group":{"groups":{"Application":{"groups": {$MSP:.[1]}}}}}
jq: 1 compile error
Trying to pass the variable using env. and without double quotes:
keyword: env.MSPID
$ jq -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {env.MSPID:.[1]}}}}}' config.json org4.json
jq: error: syntax error, unexpected FIELD, expecting '}' (Unix shell quoting issues?) at , line 1:
.[0] * {"channel_group":{"groups":{"Application":{"groups": {env.MSPID:.[1]}}}}}
jq: 1 compile error
I apologize if this seems to be a trivial question, but I have searched online and in the docs and do not understand why the JSON key will not take the shell variable's value.
Thank you
Environment variables
In your sub-expression:
{"env.MSPID":.[1]}
you have quoted env.MSPID, thereby making it a literal string. Since you want to invoke the env function, you should instead write:
{ (env.MSPID):.[1]}
The parentheses are needed to ensure that jq will evaluate the parenthesized expression properly.
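Applied to your full command (MSPID must be exported, as in your session), that would be:
export MSPID=Org4MSP
jq -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {(env.MSPID):.[1]}}}}}' config.json org4.json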
Named arguments
The same problem affects your sub-expression:
{$MSP:.[1]}
As noted above, when an expression must be evaluated to determine the string-value of a key, the expression must be parenthesized, e.g.
{($MSP):.[1]}
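So the working version of your --arg attempt is:
jq --arg MSP "$MSPID" -s '.[0] * {"channel_group":{"groups":{"Application":{"groups": {($MSP):.[1]}}}}}' config.json org4.json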
I am collecting some metrics using Graphite, but sometimes no data comes into it (probably because the server has gone down, or there is no network connectivity). I want Nagios to send me an alert during such an event. How do I do that?
You could use the check_file_age script from nagios-plugins to check a single known datapoint of interest per system that you are collecting data from.
check_file_age -w 600 -c 1800 /opt/graphite/storage/whisper/servers/$(uname -n)/cpu/idle.wsp
That would warn you if that metric had not been updated in 10 minutes, and go critical if it had not been updated in 30 minutes.
Alternatively
You could run a find command over all the points, and report any that have not been updated in n hours.
#!/bin/bash
OLD_GRAPHS=$(find /opt/graphite/storage/whisper -mmin +120 -type f | wc -l)
if [[ "$OLD_GRAPHS" -gt 0 ]]; then
echo "Found ${OLD_GRAPHS} graph(s) without an update in 120 minutes"
exit 1
fi
echo "All graphs are up to date"
exit 0
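A quick way to see which files are actually stale before wiring this into Nagios is to run the same find expression and print the paths instead of counting them:
find /opt/graphite/storage/whisper -mmin +120 -type f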
Evening all,
I just have a quick question about bulk importing via the REST API. I've tried various methods to automate looping through a file and adding the results to the Parse backend, without success. One example:
curl -X POST \
  -H "X-Parse-Application-Id: Removed" \
  -H "X-Parse-REST-API-Key: Removed" \
  -H "Content-Type: application/json" \
  --data '{
    "requests": [
      {
        "method": "POST",
        "path": "/1/classes/testnew",
        "body": {
          #Posts.json
        }
      }
    ]
  }' \
  https://api.parse.com/1/batch
I've tried many other curl commands. I've also checked the network tab in Parse when uploading a .json file, and it looks like the upload uses a multipart form request to send the data in the .json file. Does anyone know of a way to automate uploading data from a .json file into Parse, without having to manually execute the batch/individual calls via curl as described in the REST API documentation?
Any help would be seriously appreciated :-).
Thanks,
Gerard
It seems Parse doesn't provide the desired function. You can still write a small Python script as a quick fix. Given a hello.json like this:
{
"all_players": [
{
"score": 1337,
"playerName": "Sean Plott"
},
{
"score": 1338,
"playerName": "ZeroCool"
}]
}
import urllib2
import json

def frame_request():
    # Build the body of a Parse batch request from the objects in hello.json
    f = open('hello.json', 'r')
    data = json.load(f)
    flag = 1
    req = '{ "requests": ['
    for d in data["all_players"]:
        if flag:
            flag = 0
        else:
            req = req + ','
        req = req + '{ "method": "POST", "path": "/1/classes/GameScore", "body": ' + json.dumps(d) + ' }'
    req = req + ']}'
    return req

# Send the batch to Parse (substitute your own keys)
headers = {'X-Parse-Application-Id': 'Removed',
           'X-Parse-REST-API-Key': 'Removed',
           'Content-Type': 'application/json'}
request = urllib2.Request('https://api.parse.com/1/batch', frame_request(), headers)
print urllib2.urlopen(request).read()
Add the -k or --insecure option to curl to handle your https link.
The Unix shell supports single quotes around the --data value, but Windows doesn't. If you are on a Windows-based OS, --data should be something like this:
--data "{sample:\"json-data\"}"
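Putting those tips together with curl's ability to read the request body from a file, a sketch of the batch call could look like this (batch.json is a hypothetical file containing the full {"requests": [...]} body):
curl -k -X POST \
  -H "X-Parse-Application-Id: Removed" \
  -H "X-Parse-REST-API-Key: Removed" \
  -H "Content-Type: application/json" \
  --data @batch.json \
  https://api.parse.com/1/batch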
I googled and couldn't find any code that would compare a webpage to a previous version.
In this case there is a specific page I'm trying to watch. There are services that can watch a page, but I'd like to set this up on my own server.
I've set this up as a wiki so anyone can add to the code. Here's my idea:
Check if previous version of file exists. If false then download page
If page exists, diff to find differences and email the new content along with dates of new and old versions.
This script would be called nightly via cron or on-demand via the browser (the latter is not a priority)
Sounds simple, maybe I'm just not looking in the right place.
Perhaps a simple sh-script like this, featuring wget, diff & test?
#!/bin/sh
WWWURI="http://foo.bar/testfile.html"
LOCALCOPY="testfile.html"
TMPFILE="tmpfile"
WEBFILE="changed.html"
MAILADDRESS="$(whoami)"
SUBJECT_NEWFILE="$LOCALCOPY is new"
BODY_NEWFILE="first version of $LOCALCOPY loaded"
SUBJECT_CHANGEDFILE="$LOCALCOPY updated"
SUBJECT_NOTCHANGED="$LOCALCOPY not updated"
BODY_CHANGEDFILE="new version of $LOCALCOPY"
# test for old file
if [ -e "$LOCALCOPY" ]
then
mv "$LOCALCOPY" "$LOCALCOPY.bak"
wget "$WWWURI" -O"$LOCALCOPY" -o/dev/null
diff "$LOCALCOPY" "$LOCALCOPY.bak" > "$TMPFILE"
# test for update
if [ -s "$TMPFILE" ]
then
echo "$SUBJECT_CHANGEDFILE"
( echo "$BODY_CHANGEDFILE" ; cat "$TMPFILE" ) | tee "$WEBFILE" | mail -s "$SUBJECT_CHANGEDFILE" "$MAILADDRESS"
else
echo "$SUBJECT_NOTCHANGED"
fi
else
wget "$WWWURI" -O"$LOCALCOPY" -o/dev/null
echo "$BODY_NEWFILE"
echo "$BODY_NEWFILE" | tee "$WEBFILE" | mail -s "$SUBJECT_NEWFILE" "$MAILADDRESS"
fi
[ -e "$TMPFILE" ] && rm "$TMPFILE"
Update: pipe through tee, minor spelling fixes, and removal of $TMPFILE.
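Since the page should be checked nightly via cron, a hypothetical crontab entry (the script path and 3 a.m. run time are assumptions) would be:
# m h dom mon dow  command
0 3 * * * /home/user/bin/check_page.sh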
You can check this SO posting to get a few ideas, and also information about the challenge of detecting "true" changes to a web page (with fluctuating advertisement blocks and other "noise").