Why am I receiving random null field values on my influx queries? - influxdb

I have a production application in which I am using a nodejs client to query the last point of a measurement every second. A separate client is writing a point to the measurement on a 1 second interval as well. After a few minutes, I occasionally start receiving null values for random fields. Some fields have data and others are null.
If I re-query for that exact same point a second later, all the values are there. I have also outputted the data to a csv file, and all the data is there as well.
I have recreated the issue in a standalone example. The issue occurs whether I use the npm influx library or make the HTTP requests myself. The more frequently the requests are sent, the more often I get points containing null values. With the example below, if I run several instances of the app simultaneously, many points will contain null values.
Is it possible that I am reading the point before influx has written all field values?
Any help is appreciated.
I'm using InfluxDB v1.8.2 and have recreated the issue on Ubuntu and Windows.
Code to recreate:
const { InfluxDB } = require("influx");

const influx = new InfluxDB({
  host: "localhost",
  database: "testDb",
});

const createDb = async () => {
  await influx.createDatabase("testDb");
};

const read = async () => {
  const res = await influx.query(`
    select * from livedata
    ORDER BY time desc
    limit 1
  `);
  console.log(res[0]);
};

const write = async () => {
  await influx.writeMeasurement("livedata", [
    {
      tags: {
        id: "site",
      },
      fields: {
        site1: Math.random() * 1000,
        site2: Math.random() * 1000,
        site3: Math.random() * 1000,
        site4: Math.random() * 1000,
        site5: Math.random() * 1000,
        site6: Math.random() * 1000,
        site7: Math.random() * 1000,
        site8: Math.random() * 1000,
      },
    },
  ]);
};

createDb();

setInterval(() => {
  write();
  read();
}, 300);
Example output:
[
  [
    '2021-01-07T21:40:11.4031559Z',
    'site',
    830.4042230769617,
    522.1830877694142,
    698.8789904008146,
    678.305459618109,
    118.82269436309988,
    631.6295948279627,
    376.3112870744887,
    830.4872612882643
  ]
]
[
  [
    '2021-01-07T21:40:11.7034901Z',
    'site',
    null,
    null,
    null,
    65.3968316403697,
    680.7946463560837,
    330.7338852317838,
    872.7936919556367,
    145.03057994702618
  ]
]
[
  [
    '2021-01-07T21:40:12.0036893Z',
    'site',
    901.031149970251,
    501.1825877093237,
    99.38758592260699,
    78.79549874505165,
    403.8558500935323,
    545.085784401504,
    969.637642068842,
    51.657735620841194
  ]
]
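One pattern worth ruling out: in the setInterval loop above, write() and read() are fired concurrently, so the query can race the write. A minimal sketch of serializing the two, where write and read are stand-ins for the influx calls from the question:

```javascript
// Stand-ins for the influx writeMeasurement/query calls in the question;
// the log records the order in which they complete.
const log = [];
const write = async () => { log.push("write"); };
const read = async () => { log.push("read"); };

// Await the write before reading, so a single tick can never query
// a point that this process is still in the middle of writing.
const tick = async () => {
  await write();
  await read();
};
```

Calling tick() on each interval replaces the unawaited write(); read(); pair. Note this only removes the race within one process; with a separate writer process, querying one interval behind (or filtering with something like WHERE time < now() - 1s) may be more robust.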

Related

Error message : Google.Apis.Requests.RequestError Range (Sheet1!K1254) exceeds grid limits. Max rows: 804, max columns: 25 [400]

I started getting the error below at random. The same code works fine if I stop my service and run it again, but after a day or two it starts giving this error. I have scheduled a job to run every 15 minutes that, based on some condition, updates the K column value.
Any quick help is highly appreciated.
Error message : Google.Apis.Requests.RequestError Range (Sheet1!K1254) exceeds grid limits. Max rows: 804, max columns: 25 [400] Errors [ Message[Range (Sheet1!K1254) exceeds grid limits. Max rows: 804, max columns: 25] Location[ - ] Reason[badRequest] Domain[global] ]
Edit 1: the code I am using to perform the update:
private static async Task UpdateValue(SpreadsheetsResource.ValuesResource valuesResource, string updatedvalue, string emailaddr)
{
    var valueRange = new ValueRange { Values = new List<IList<object>> { new List<object> { updatedvalue } } };
    var update = valuesResource.Update(valueRange, SpreadsheetId, WriteRange);
    update.ValueInputOption = SpreadsheetsResource.ValuesResource.UpdateRequest.ValueInputOptionEnum.RAW;
    var response = await update.ExecuteAsync();
    Console.WriteLine($"Updated Licenses Status for : {response.UpdatedRows} " + emailaddr);
}

Merging topojson using topomerge messes up winding order

I'm trying to create a custom world map where countries are merged into regions instead of being drawn individually. Unfortunately, somewhere in the process the winding order seems to get messed up.
As base data I'm using the natural earth 10m_admin_0_countries shape files available here. As criteria for merging countries I have a lookup map that looks like this:
const countryGroups = {
  "EUR": ["ALA", "AUT", "BEL"...],
  "AFR": ["AGO", "BDI", "BEN"...],
  ...
}
To merge the shapes I'm using topojson-client. Since I want a higher level of control than the CLI commands offer, I wrote a script. It goes through the lookup map, picks out all the topojson features that belong to a group, merges them into one shape, and places the resulting merged features into a geojson frame:
const fs = require("fs");
const topojsonClient = require("topojson-client");
const topojsonServer = require("topojson-server");

const worldTopo = topojsonServer.topology({
  countries: JSON.parse(fs.readFileSync("./world.geojson", "utf-8")),
});

const geoJson = {
  type: "FeatureCollection",
  features: Object.entries(countryGroups).map(([region, ids]) => {
    const relevantCountries = worldTopo.objects.countries.geometries.filter(
      (country) => ids.indexOf(country.properties.ISO_A3) >= 0
    );
    return {
      type: "Feature",
      properties: { region, countries: ids },
      geometry: topojsonClient.merge(worldTopo, relevantCountries),
    };
  }),
};
So far everything works, allegedly. But when I try to visualise the map using a GitHub gist (or any other visualisation tool, such as Vega-Lite), the shapes are all messed up. I suspect I'm doing something wrong when merging the features, but I can't figure out what it is.
When I do the same using the CLI it seems to work fine. But since I need more control over the merging, using just the CLI is not really an option.
The last feature, called "World", should contain all remaining countries, but instead, it contains all countries, period. You can see this in the following showcase.
var w = 900,
    h = 300;

var projection = d3.geoMercator().translate([w / 2, h / 2]).scale(100);
var path = d3.geoPath().projection(projection);
var color = d3.scaleOrdinal(d3.schemeCategory10);

var svg = d3.select('svg')
  .attr('width', w)
  .attr('height', h);

var url = "https://gist.githubusercontent.com/Flave/832ebba5726aeca3518b1356d9d726cb/raw/5957dca433cbf50fe4dea0c3fa94bb4f91c754b7/world-regions-wrong.topojson";

d3.json(url)
  .then(data => {
    var geojson = topojson.feature(data, data.objects.regions);
    geojson.features.forEach(f => {
      console.log(f.properties.region, f.properties.countries);
    });
    svg.selectAll('path')
      // Reverse because it's the last feature that is the problem
      .data(geojson.features.reverse())
      .enter()
      .append('path')
      .attr('d', path)
      .attr('fill', d => color(d.properties.region))
      .attr('stroke', d => color(d.properties.region))
      .on('mouseenter', function() {
        d3.select(this).style('fill-opacity', 1);
      })
      .on('mouseleave', function() {
        d3.select(this).style('fill-opacity', null);
      });
  });
path {
  fill-opacity: 0.3;
  stroke-width: 2px;
  stroke-opacity: 0.4;
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/d3/5.7.0/d3.js"></script>
<script src="https://d3js.org/topojson.v3.js"></script>
<svg></svg>
To fix this, I'd make sure to always remove assigned countries from the list. From your data I can't see where "World" is defined, whether it contains every country on earth, or whether it's a wildcard assignment.
In any case, you should be able to fix it by removing each match from worldTopo as it is assigned:
const fs = require("fs");
const topojsonClient = require("topojson-client");
const topojsonServer = require("topojson-server");

const worldTopo = topojsonServer.topology({
  countries: JSON.parse(fs.readFileSync("./world.geojson", "utf-8")),
});

const geoJson = {
  type: "FeatureCollection",
  features: Object.entries(countryGroups).map(([region, ids]) => {
    // The splice must operate on the geometries array inside the
    // topology, not on the topology object itself.
    const geometries = worldTopo.objects.countries.geometries;
    const relevantCountries = geometries.filter(
      (country) => ids.indexOf(country.properties.ISO_A3) >= 0
    );
    relevantCountries.forEach((c) => {
      const index = geometries.indexOf(c);
      if (index === -1) throw Error(`Expected to find country ${c.properties.ISO_A3} in worldTopo`);
      geometries.splice(index, 1);
    });
    return {
      type: "Feature",
      properties: { region, countries: ids },
      geometry: topojsonClient.merge(worldTopo, relevantCountries),
    };
  }),
};
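The removal step can also be factored into a small helper (the name is illustrative) so the "each country is assigned at most once" invariant is easy to test on plain geometry arrays:

```javascript
// Take all geometries whose ISO_A3 code is in ids, removing them from
// the source array so later groups cannot claim them again.
function takeCountries(geometries, ids) {
  const taken = geometries.filter(g => ids.indexOf(g.properties.ISO_A3) >= 0);
  taken.forEach(g => geometries.splice(geometries.indexOf(g), 1));
  return taken;
}
```

A wildcard "World" group can then simply merge whatever is left in the array after all named groups have taken their countries.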

How to draw dynamic series in Highcharts based on data

I want to add a number of series based on the data. For example, for the same request the chart may include 2 series or 4 series:
Request 1
[[ser1, ser2, datetime], [ser1, ser2, datetime]]
Request 2
[[ser1, ser2, ser3, ser4, datetime], [ser1, ser2, ser3, ser4, datetime]]
where "datetime" is the x-axis value.
Could you please suggest how to approach this?
You can convert your data to the right format in preprocessing:
var json = [
    [2, 4, 1500284119000],
    [10, 20, 1500284141000]
  ],
  series = [],
  each = Highcharts.each,
  len;

each(json, function(items, i) {
  len = items.length;
  each(items, function(item, j) {
    if (j < len - 1) {
      if (i === 0) { // create series structure
        series.push({
          data: []
        });
      }
      series[j].data.push({
        x: items[len - 1],
        y: item
      });
    }
  });
});
Example:
http://jsfiddle.net/hx97ak00/
I would suggest changing the response structure if your x-axis is a datetime. It is better to include the timestamp along with the data, as below:
seriesData = [[timestamp1, val1], [timestamp2, val2], [timestamp3, val3], ...]
Now your requests look like this:
Request 1
[seriesData1, seriesData2, seriesData3, seriesData4]
Request 2
[seriesData1, seriesData2, seriesData3, seriesData4, seriesData5, seriesData6, seriesData7, seriesData8]
I think you went with a separate array for datetime because you have a different number of series for different requests.
With the approach above, you can feed the series section directly with the incoming response of the request.
Hope this helps.
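If the response format cannot be changed server-side, the same reshaping can be done client-side. A sketch (the helper name is illustrative), assuming each row ends with the timestamp:

```javascript
// Convert rows of [v1, v2, ..., vN, timestamp] into N Highcharts-style
// series, each holding [timestamp, value] pairs.
function toSeriesData(rows) {
  const nSeries = rows.length ? rows[0].length - 1 : 0;
  const series = Array.from({ length: nSeries }, () => ({ data: [] }));
  rows.forEach(row => {
    const ts = row[row.length - 1]; // timestamp is the last entry
    row.slice(0, -1).forEach((val, j) => {
      series[j].data.push([ts, val]);
    });
  });
  return series;
}
```

The returned array can be assigned directly to the chart's series option, since Highcharts accepts [x, y] pairs as series data.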

Impose max limit in Loopback

Is there a way to set a max limit in Loopback? I imagine something like this:
MyModel.paginatedFind = function(filter, page, cb) {
  // Set max limit of 1000
  if (filter.limit > 1000) filter.limit = 1000;
  // Set pagination
  filter.skip = filter.limit * (page - 1);
  // Call the standard find function with the new filter
  MyModel.find(filter, function(err, res) {
    cb(err, res);
  });
};

MyModel.remoteMethod(
  'paginatedFind',
  {
    http: { path: '/', verb: 'get' },
    accepts: [
      { arg: 'filter', type: 'object' },
      { arg: 'page', type: 'number' }
    ],
    returns: { arg: 'result', type: 'object' }
  }
);
This works when the filter is formatted as JSON, but not in this format: http://localhost:3000/api/MyModel?filter=[where][country]=DE. How can I use the standard StrongLoop filter format?
The format, according to the documentation here, should look like this:
api/MyModel?filter[where][country]=DE
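Independently of the URL format, the cap itself can be kept as a small pure helper (hypothetical name), applied to whatever filter object the remote method receives:

```javascript
// Return a copy of the filter with its limit capped at max; a missing
// or non-numeric limit falls back to max.
function clampLimit(filter, max) {
  const f = Object.assign({}, filter);
  const limit = typeof f.limit === "number" ? f.limit : max;
  f.limit = Math.min(limit, max);
  return f;
}
```

Inside paginatedFind this would be called as, e.g., filter = clampLimit(filter, 1000) before the find, which keeps the capping logic testable on its own.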

Highcharts line with null values

I'm trying to create a line chart (datetime x-axis) that contains null values.
var rawData = [{
  (...)
}, {
  "PointOfTime": 1424991600,
  "value": 6831.28806
}, {
  "PointOfTime": 1425078000,
  "value": null
}, {
  "PointOfTime": 1425164400,
  "value": null
}, {
  (...)
}];
The data from the JSON source is adjusted into an array:
rawData.forEach(function (d) {
  var datetime = (d.PointOfTime + 3600) * 1000;
  data.push([datetime, parseFloat(d.value)]);
});
As shown in the following fiddle, http://jsfiddle.net/wiesson/1m5hpLef, there are no lines, only bullets. Any suggestions? I need the PointOfTime values to create the range of the x-axis, even when their values are empty.
Edit: As displayed in the figure, the values in the future are unknown and not 0; therefore I would like to set them to null.
Add a condition that checks whether your value is null. If it is, push null instead of calling parseFloat(null):
rawData.forEach(function (d) {
  var datetime = d.PointOfTime * 1000;
  if (d.value !== null)
    data.push([datetime, parseFloat(d.value)]);
  else
    data.push([datetime, null]);
});
Example: http://jsfiddle.net/wiesson/1m5hpLef/
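The same null-preserving conversion can also be written as a reusable function (name is illustrative), which maps the raw points without mutating a shared data array:

```javascript
// Map raw points to [datetime, value] pairs, keeping null as null so
// the chart renders a gap instead of a bogus NaN point.
function toPairs(rawData) {
  return rawData.map(d => [
    d.PointOfTime * 1000,
    d.value === null ? null : parseFloat(d.value)
  ]);
}
```

If the gaps should be bridged instead, Highcharts' connectNulls series option draws the line across null points.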
