Storing Grafana's metrics data in InfluxDB

I use Grafana (3.0.3) to fetch CloudWatch data, and I want to store the fetched data in InfluxDB (0.13). Any idea how I can do so? Thanks in advance.

Telegraf has a CloudWatch plugin that you can use to store CloudWatch data in InfluxDB.
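For illustration, a minimal telegraf.conf along these lines would poll CloudWatch and write the metrics to InfluxDB (the region, namespace, and database name are placeholders, and exact option names can vary between Telegraf versions):
# telegraf.conf (sketch)
[[inputs.cloudwatch]]
  region = "us-east-1"        # placeholder AWS region
  namespace = "AWS/EC2"       # placeholder CloudWatch namespace
  period = "5m"
  delay = "5m"
  interval = "5m"

[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "cloudwatch"     # placeholder database name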
Alternatively, if you're fetching the data yourself, you could write it to InfluxDB directly. To do so, you'd issue a POST request to the /write endpoint of your InfluxDB instance with your data in line protocol.
Some examples of data in line protocol:
cpu,host=server01,region=uswest value=1 1434055562000000000
cpu,host=server02,region=uswest value=3 1434055562000010000
temperature,machine=unit42,type=assembly internal=32,external=100 1434055562000000035
temperature,machine=unit143,type=assembly internal=22,external=130 1434055562005000035
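As a rough sketch, you could send those lines with any HTTP client; for example in Python with the requests library (the database name mydb is a placeholder):
import requests  # assumes the requests package is available

lines = "\n".join([
    "cpu,host=server01,region=uswest value=1 1434055562000000000",
    "cpu,host=server02,region=uswest value=3 1434055562000010000",
])

# POST the line protocol to InfluxDB's /write endpoint; the timestamps above are in nanoseconds
resp = requests.post(
    "http://localhost:8086/write",
    params={"db": "mydb", "precision": "ns"},
    data=lines,
)
resp.raise_for_status()  # InfluxDB returns 204 No Content on a successful write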

Related

Can we call External Rest API from Rule Chain to collect telemetry data from another API?

I am trying to collect energy generation statistics like watts and watt-hours from an external API. I have an external REST API endpoint available for it.
Is there a way in ThingsBoard, using a rule chain, to call the external endpoint and store its response as telemetry data? Later I want to show this data in dashboards.
I know it has been a long time, but ThingsBoard's documentation is lacking and this might be useful for someone else.
You'd have to use the REST API CALL external node (https://thingsboard.io/docs/user-guide/rule-engine-2-0/external-nodes/#rest-api-call-node)
If the node call was successful, it will output its outbound message containing the HTTP response, with the metadata containing:
- metadata.status
- metadata.statusCode
- metadata.statusReason
and with the payload of the message containing the response body from your external REST service (i.e. the telemetry you want to store).
You then have to use a script transformation node to format the metadata, payload, and msgType into the POST_TELEMETRY_REQUEST message format; see: https://thingsboard.io/docs/user-guide/rule-engine-2-0/overview/#predefined-message-types
Your external REST API should provide the correct "deviceName" and "deviceType", as well as the "ts" in UNIX milliseconds.
Notice that you also need to change the messageType (msgType return variable) to "POST_TELEMETRY_REQUEST".
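As an illustration, the script transformation node body could look roughly like this (JavaScript; the watts/wattHours field names coming back from the external API are assumptions, so adapt them to your actual response):
// Sketch of a script transformation node body.
// msg is the parsed response body returned by the REST API CALL node.
var newMsg = {
    ts: msg.ts,                    // UNIX timestamp in milliseconds, provided by your API
    values: {
        watts: msg.watts,          // assumed field names from the external API
        wattHours: msg.wattHours
    }
};
var newMetadata = {
    deviceName: msg.deviceName,    // should match an existing device
    deviceType: msg.deviceType
};
return { msg: newMsg, metadata: newMetadata, msgType: "POST_TELEMETRY_REQUEST" };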
Finally, just transmit the result to the Save timeseries action node and it will be stored as telemetry for the specified device. Hope this helps.

New stream from a streaming topic (JSON format) without any data

I have a streaming topic in JSON with 50 fields. I tried to create another stream with one field from the topic using KSQL, as below:
create stream data (timeGMT string) with (kafka_topic='json_data', value_format='json');
The stream was created successfully; however, no data is returned by the KSQL query below:
select * from data;
This is running on KSQL 5.0.0
There are a few things to check, including:
Is there any data in the topic?
PRINT 'json_data' FROM BEGINNING;
Have you told KSQL to read from the beginning of the topic?
SET 'auto.offset.reset' = 'earliest';
Are there messages in your topic that aren't JSON or can't be parsed? Check the KSQL Server log for errors.
You can see more information on these, and other troubleshooting tips, in this blog.
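Putting those checks together, a typical KSQL CLI session would be (topic and stream names taken from the question):
-- confirm the topic actually contains JSON messages
PRINT 'json_data' FROM BEGINNING;
-- make queries read the topic from the start
SET 'auto.offset.reset' = 'earliest';
-- should now return rows if the messages parse as JSON
SELECT * FROM data;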

Unable to read kafka-stream data from debezium-postgres's kafka-stream

I started the Kafka connector using the following command:
./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-postgres/connect-postgres.properties
The serialization properties in connect-avro-standalone.properties are:
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
I've created a Java backend which listens to this Kafka topic, and it's able to get the data from Postgres on each add/update/delete.
But the data comes in some unknown encoding format, and that's why I can't read it correctly.
Here is the relevant code snippet:
properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.ByteArray().getClass().getName());

StreamsBuilder streamsBuilder = new StreamsBuilder();
final Serde<String> stringSerde = Serdes.String();
final Serde<byte[]> byteSerde = Serdes.ByteArray();

streamsBuilder.stream(Pattern.compile(getTopic()), Consumed.with(stringSerde, byteSerde))
        .mapValues(data -> {
            System.out.println("->" + new String(data));
            return data;
        });
I'm confused about where and what I need to change: in the Avro connector properties or in the Java-side code?
Your Kafka Connect config here means that the messages on the Kafka topic will be Avro serialised:
value.converter=io.confluent.connect.avro.AvroConverter
This means that you need to deserialise the data using Avro in your Streams app. See here for more details: https://docs.confluent.io/current/streams/developer-guide/datatypes.html#avro
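For example, a minimal sketch using Confluent's GenericAvroSerde (the topic name is a placeholder, and the kafka-streams-avro-serde dependency is assumed to be on the classpath):
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import java.util.Collections;
import java.util.Map;

public class AvroStreamSketch {
    public static void main(String[] args) {
        Map<String, String> serdeConfig =
                Collections.singletonMap("schema.registry.url", "http://localhost:8081");

        // key.converter and value.converter are both Avro in your Connect config,
        // so deserialise both the key and the value with Avro serdes.
        GenericAvroSerde keySerde = new GenericAvroSerde();
        keySerde.configure(serdeConfig, true);    // true  = configure as key serde
        GenericAvroSerde valueSerde = new GenericAvroSerde();
        valueSerde.configure(serdeConfig, false); // false = configure as value serde

        StreamsBuilder streamsBuilder = new StreamsBuilder();
        streamsBuilder.<GenericRecord, GenericRecord>stream(
                        "my-postgres-topic",                // placeholder topic name
                        Consumed.with(keySerde, valueSerde))
                .foreach((key, value) -> System.out.println(key + " -> " + value));
        // ...build the topology and start KafkaStreams as in your existing code
    }
}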

Save users' data in C#

I need to save temporary user data (500 users). I am currently using:
DataTable myData = new DataTable();
myData.Columns.Add("id", typeof(int));
myData.Columns.Add("name", typeof(string));
...
myData.Rows.Add("1", "name1", ...);
myData.Rows.Add("2", "name2", ...);
...
myData.Rows.Add("500", "name500" ,...);
New rows are added/edited, e.g. 50 times per second from one user, and every minute the data is sent to a MySQL database.
Is this method sufficiently stable and fast to add/edit/delete a large amount of temporary data?
I was thinking about saving to an XML file, but I think my way is faster...
Thank you for any advice.
Edit:
I have a server and the users (clients) are connected to the server; they send data to the server and the server sends data to them.
E.g. when a client sends a message to other clients.

Ruby: constantly consume an external API and share output across processes

I'm writing a trading bot in Ruby, and I need to constantly do some calculations based on the exchange's orderbook depth data across several daemons (daemons gem).
The problem is that right now I'm fetching data via the exchange's API separately in each daemon, so I ran into the API call limit (40 requests/second). That's why I'm trying to use Ruby's DRb to share orderbook data across several processes (daemons) in order to avoid sending unnecessary API calls.
However, I'm not sure how to constantly consume the API on the server side and provide the latest data to the client process. In the sample code below, the client will only get the data as it was at the moment I started the server.
server_daemon.rb
require 'drb'
exchange = Exchange.new api_key: ENV['APIKEY'], secret_key: ENV['SECRET']
shared_orderbook = exchange.orderbook limit: 50
DRb.start_service('druby://127.0.0.1:61676', shared_orderbook)
puts 'Listening for connection… '
DRb.thread.join
client_daemon.rb
require 'drb'
DRb.start_service
puts shared_data = DRbObject.new_with_uri('druby://127.0.0.1:61676')
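One possible direction, as a sketch only (the Exchange client and its orderbook method are assumed to behave as in the code above, and the 1-second refresh interval is arbitrary): instead of sharing a one-off snapshot, share a small wrapper object whose cached data a background thread keeps refreshing, so every client call reads the latest orderbook.
# server_daemon.rb (sketch)
require 'drb'

class OrderbookCache
  def initialize(exchange, interval: 1)
    @exchange = exchange
    @mutex = Mutex.new
    @data = @exchange.orderbook(limit: 50)
    Thread.new do
      loop do
        fresh = @exchange.orderbook(limit: 50)  # single API consumer shared by all daemons
        @mutex.synchronize { @data = fresh }
        sleep interval
      end
    end
  end

  def latest
    @mutex.synchronize { @data }
  end
end

exchange = Exchange.new(api_key: ENV['APIKEY'], secret_key: ENV['SECRET'])
DRb.start_service('druby://127.0.0.1:61676', OrderbookCache.new(exchange))
DRb.thread.join
The client would then call something like DRbObject.new_with_uri('druby://127.0.0.1:61676').latest whenever it needs fresh data, instead of printing the proxy object itself.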
