Create timeseries chart with custom timestamp for custom events sent to New Relic

I have some custom load test data, something like -
{"payload_size": 15, "response_time": 0.2, "timestamp": 1670584754}
So I wanted to plot payload_size against timestamp as a timeseries chart, but the issue is that I am sending all of this data in bulk, which may include some 600 events like this (if run for 10 minutes).
So when I do TIMESERIES, it shows me only one entry, as all the data is pushed at a single point in time.
Is there a workaround for this? I am using the Event API to send all the data.

Yes - you can do this with New Relic's custom visualizations
I wrote one specifically for this same use case here: https://github.com/khpeet/custom-timeseries
This allows you to plot your values on a timeseries chart with your own custom timestamp field. Note that timestamp is a reserved word in New Relic, though, so you will probably have to rename your timestamp key to something else when posting the events via the API.
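For example, here is a minimal Python sketch of posting such events to the Event API with the timestamp key renamed; the event type, the renamed attribute, and the credentials are placeholders, not anything prescribed by New Relic:

```python
# Sketch: posting load-test events to the New Relic Event API with the
# reserved "timestamp" key renamed to "test_time" (a hypothetical name).
# ACCOUNT_ID and API_KEY are placeholders you must supply.
import json
import requests

ACCOUNT_ID = "1234567"   # your New Relic account ID
API_KEY = "NRAK-..."     # a key with permission to write events

events = [
    {
        "eventType": "LoadTestResult",  # hypothetical custom event type
        "payload_size": 15,
        "response_time": 0.2,
        "test_time": 1670584754,        # renamed from "timestamp"
    },
    # ...one dict per sample from the 10-minute run
]

resp = requests.post(
    f"https://insights-collector.newrelic.com/v1/accounts/{ACCOUNT_ID}/events",
    headers={"Api-Key": API_KEY, "Content-Type": "application/json"},
    data=json.dumps(events),
)
resp.raise_for_status()
```

The custom visualization can then plot the values against the test_time attribute instead of the ingest-time timestamp.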

Related

Why does my InfluxDB insert a new row of data only every 10 seconds?

I wrote a timed task in C# to insert one row of data per second, but I found that only one row is inserted every 10 seconds.
I also noticed that new insert requests within 10 seconds will only update the same row of data and not insert a new one.
What is the setting that causes this and how do I change it?
The version of InfluxDB is 2.2; I downloaded it from the website and started it directly without changing any configuration.
You are probably using the query builder, which aggregates data (it prepares the query with an aggregateWindow() step; in the InfluxDB v2 web GUI this is the window-period setting next to the query builder).
Setting the window period to 1s, or writing your own query without any aggregation, should solve your problem.
What is more: writing data to InfluxDB with the same measurement, the same tag set, the same field name, and the same timestamp will overwrite the existing value, so the described behaviour is normal.
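For instance, here is a minimal sketch of fetching the raw, un-aggregated rows with the InfluxDB v2 Python client; the bucket name, measurement name, and credentials are illustrative:

```python
# Sketch: query raw points with no aggregateWindow(), so every written
# row is returned rather than one aggregated value per 10 s window.
from influxdb_client import InfluxDBClient

query = '''
from(bucket: "mydata")
  |> range(start: -1h)
  |> filter(fn: (r) => r._measurement == "rows")
'''

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    for table in client.query_api().query(query):
        for record in table.records:
            print(record.get_time(), record.get_value())
```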

How can I render time-series data on a geographic display in Grafana?

My goal is to render time-series data from set locations on a map. Essentially, I have about 30 predefined (static) locations in Switzerland from which I will be receiving real-time data. The data itself is relatively simple, just the signal/noise ratio of the signal we're receiving, which should be updated every few seconds or every minute. I am using InfluxDB as my database. Are there any specific setups I should be using for this kind of visualization?
My first question is: is it best to use the Worldmap panel or the Geomap panel at this time? I seem to be finding more information/documentation on the Worldmap panel, even though I have also read that Geomap is (or at least will be) its replacement.
Second, I assume that since I'm using time-series data, I should be using the Time-series format and not the Table format. However, I have not been able to render any data points using the Time-series format, even by following the simplest of examples in your documentation. The best I can do is use the Table format and internally remove previous points from my database at every iteration (so that multiple points aren't rendered at the same time for each location). Below are two screenshots: one where I'm able to render data on the geomap using the Table format, and one where, after switching to the Time-series format, the points are no longer there (note that I have the same problem with the Worldmap panel as well).
[Screenshot: points rendered on the geomap using the Table format]
[Screenshot: no points after switching to the Time-series format]
Thanks for any help!
For rendering timeseries data on the geomap, you must convert your lat/long fields into a single geohash field. You'll have to do that prior to inserting the lat/longs into InfluxDB.
See this answer
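For illustration, a minimal Python sketch of encoding the geohash before the write, assuming the pygeohash package and the influxdb-client library; the bucket name, measurement name, and station values are made up:

```python
# Sketch: convert lat/long to a geohash field before writing to InfluxDB,
# so the geomap's Time-series mode can place the point.
import pygeohash
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

lat, lon, snr = 46.948, 7.447, 12.5  # e.g. one station in Bern

point = (
    Point("signal")
    .tag("geohash", pygeohash.encode(lat, lon, precision=7))
    .field("snr", snr)
)

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    write_api = client.write_api(write_options=SYNCHRONOUS)
    write_api.write(bucket="stations", record=point)
```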

Is there a way to manually insert records into InfluxDB with custom timestamps via telegraf?

https://github.com/influxdata/telegraf/pull/1557
Apparently some people have been asking for this, and this GitHub PR is the closest thing I can find to a solution, but it was ultimately denied (I think?).
Basically, I have a JSON object I'm getting from Stackdriver, which includes a timestamp in ISO 8601, which I convert to Unix time. I can insert the entire JSON response into Influx fine, but the timestamp from Stackdriver appears as a tag on the series rather than as the index of the time series itself. As a result, it is infeasible to query by Stackdriver's provided timestamp. I could simply drop it and use the Influx-provided timestamp, but then I would essentially be querying incorrect/imprecise data.
Does anyone have a clever way to approach this?
tl;dr How can I use Telegraf to override InfluxDB's timestamps with my own timestamps?
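One note that may help: more recent Telegraf releases let the JSON parser take the point's time from the payload itself via its json_time_key and json_time_format options, instead of treating it as a tag or field. A minimal config sketch, where the file input and path are purely illustrative:

```toml
# Sketch: a Telegraf input whose JSON parser reads the timestamp from the
# payload, so it becomes the point's time rather than a tag.
[[inputs.file]]
  files = ["/tmp/stackdriver.json"]   # illustrative source of the JSON
  data_format = "json"
  json_time_key = "timestamp"         # key in the JSON holding the time
  json_time_format = "unix"           # already converted to Unix seconds
```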

Adding a version to time-series data

I wish to store time series data with versioning. By versioning, I mean that I might have a metric energy_mwh with the tag meter_id=123 and a fieldset something like this: time=2016-01-01 10:00, mwh=20.50, read-time=2016-01-01 20:15. If I re-read the meter at a later time, I want to keep both the new and the old version of the meter reading. Later, when I query the data, I will mostly just be interested in the mwh value with the highest read-time for any given time. If I query over a range of times, the read-time is going to vary.
I am thinking of using InfluxDB or some other time series database with a similar data model.
Is there a right way of doing this? I believe that I must keep read-time as a tag - not a field - or I will lose the older version of the data. I guess that is the answer, but it doesn't feel right to me to have what I see as a piece of data (read-time) sitting in an identifier - specifically a tag. Am I on the right track?
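If it helps, here is a minimal sketch of that tag-based approach with the InfluxDB v2 Python client: two versions of the same 10:00 reading coexist because their tag sets differ. The bucket name and values are illustrative:

```python
# Sketch: keep read_time as a tag so a re-read does not overwrite the
# original reading (same measurement + tag set + timestamp would overwrite).
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    write_api = client.write_api(write_options=SYNCHRONOUS)
    for read_time, mwh in [("2016-01-01T20:15:00Z", 20.50),
                           ("2016-01-02T07:30:00Z", 20.55)]:  # later re-read
        write_api.write(
            bucket="metering",
            record=Point("energy_mwh")
            .tag("meter_id", "123")
            .tag("read_time", read_time)   # the version lives in a tag
            .field("mwh", mwh)
            .time("2016-01-01T10:00:00Z"),
        )
```

At query time you would then group by the metric time and keep the row with the highest read_time.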

Is it possible to keep only value changes in InfluxDB?

Is it possible to downsample older data using InfluxDB in a way that keeps only changes of values?
My example is the following:
I have a binary sensor sending data every 10 min, so naturally the consecutive values look something like this: 0,0,0,0,0,1,1,0,0,0,0...
My goal is to keep this kind of raw data over a certain period of time using retention policies, and to downsample the data for longer storage. I want to delete all successive datapoints with the same value so that I keep only the datapoints, with their timestamps, where the value actually changed. The downsampled data should look like this: 0,1,0,1,0,1,0..., but with the correct timestamps of when the events actually occurred.
Currently this isn't possible with InfluxDB, though the plan is to eventually support this kind of use case.
I would encourage you to open a feature request on the InfluxDB repo asking for this.
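In the meantime, a client-side workaround is possible: read the raw points, keep only the rows where the value changes, and write those to a longer-retention bucket. A minimal Python sketch, with bucket and measurement names purely illustrative:

```python
# Sketch: client-side change-point downsampling for a binary sensor.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

query = '''
from(bucket: "raw")
  |> range(start: -30d)
  |> filter(fn: (r) => r._measurement == "binary_sensor")
  |> sort(columns: ["_time"])
'''

with InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org") as client:
    write_api = client.write_api(write_options=SYNCHRONOUS)
    last = None
    for table in client.query_api().query(query):
        for rec in table.records:
            if rec.get_value() != last:   # keep only the change points
                last = rec.get_value()
                write_api.write(
                    bucket="downsampled",
                    record=Point("binary_sensor")
                    .field("value", rec.get_value())
                    .time(rec.get_time()),   # preserve the original timestamp
                )
```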
