I'm trying to parse logs from a file using the tail input plugin and convert them into JSON to ingest as metrics into SignalFx.
I'm able to parse the lines, but Fluent Bit is adding a string prefix to the original JSON:
tail.0: [1662039026.275151576, {"log"=>"{"gauge": [{"metric": "jey.test.pp.db.xtrack.xdr_ship_outstanding_objects.count","dimensions": {"environment": "qa"},"value": 4}]}"}]
Does anyone have an idea how to remove "tail.0: [1662039026.275151576" and send only the JSON fields?
Expected output:
{"gauge": [{"metric": "jey.test.pp.db.xtrack.xdr_ship_outstanding_objects.count","dimensions": {"environment": "qa"},"value": 4}]}"}
Thanks in advance!
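The "tail.0: [1662039026.275151576" part looks like the tag and timestamp that Fluent Bit prints around each record when it writes to stdout for debugging, rather than part of the payload itself. A minimal, unverified sketch of lifting the JSON nested in the log key into top-level fields with a parser filter (the file names and the json parser name below are assumptions, not taken from the question):

    # parsers.conf (assumed file name): a plain JSON parser
    [PARSER]
        Name   json
        Format json

    # fluent-bit.conf: re-parse the "log" key of records coming from the tail input
    [FILTER]
        Name         parser
        Match        *
        Key_Name     log
        Parser       json
        Reserve_Data Off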
I'm trying to write a CSV file into a bucket using InfluxDB v2.1.
Attempting to insert a simple example file results in the following error:
error in csv.from(): failed to read metadata: failed to read annotations: expected annotation datatype
The CSV file that I was going to write is as follows:
#datatype measurement,tag,double,dateTime:RFC3339
m,host,used_percent,time
mem,host1,64.23,2020-01-01T00:00:00Z
mem,host2,72.01,2020-01-01T00:00:00Z
mem,host1,62.61,2020-01-01T00:00:10Z
mem,host2,72.98,2020-01-01T00:00:10Z
mem,host1,63.40,2020-01-01T00:00:20Z
mem,host2,73.77,2020-01-01T00:00:20Z
This is the example data from the official InfluxData documentation.
If you look at the first line of the example, you can see that the datatype annotation is there, so why does the error occur?
How should I modify it?
This looks like invalid annotated CSV.
In the csv.from function documentation, you can find examples (as string literals) of both annotated and raw CSV that csv.from supports.
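For reference, this is roughly the shape of annotated CSV that csv.from accepts, adapted from memory of the Flux documentation; the columns and values below are illustrative, not the poster's data. If I remember the docs correctly, datatypes such as measurement and tag belong to the extended annotated CSV understood by the influx write command, which is a different format from what csv.from expects.

    import "csv"

    csvData = "#datatype,string,long,dateTime:RFC3339,string,string,double
    #group,false,false,false,true,true,false
    #default,_result,,,,,
    ,result,table,_time,_measurement,host,_value
    ,,0,2020-01-01T00:00:00Z,mem,host1,64.23
    ,,0,2020-01-01T00:00:10Z,mem,host1,62.61
    "

    csv.from(csv: csvData)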
I'm trying to read JSON-LD into Dask from Minio. The pipeline works, but the strings come from Minio as binary strings.
So
with oss.open('gleaner/summoned/repo/file.jsonld', 'rb') as f:
    print(f.read())
results in
b'\n{\n "#context": "http://schema.org/",\n "#type": "Dataset",\n ...
I can simply convert this with
with oss.open('gleaner/summoned/repo/file.jsonld', 'rb') as f:
    print(f.read().decode("utf-8"))
and now everything is as I expect it.
However, I am working with Dask, and when reading into a bag with
dgraphs = db.read_text('s3://bucket/prefa/prefb/*.jsonld',
                       storage_options={
                           "key": key,
                           "secret": secret,
                           "client_kwargs": {"endpoint_url": "https://example.org"}
                       }).map(json.loads)
I cannot get the content coming from Minio to become strings rather than binary strings. I suspect I need them converted before they hit the json.loads map.
I assume I can inject the decode in here somehow as well, but I can't work out how.
Thanks
As the name implies, read_text opens the remote file in text mode, equivalent to open(..., 'rt'). The signature of read_text includes the various decoding arguments, such as UTF-8 as the default encoding. You should not need to do anything else, but please post a specific error if you are having trouble, ideally with example file contents.
If your data isn't delimited by lines, read_text might not be right for you, and you can do something like
import json
import dask

@dask.delayed
def read_a_file(fn):
    # or preferably open in text mode and json.load from the file
    with oss.open(fn, 'rb') as f:
        return json.loads(f.read().decode("utf-8"))

output = [read_a_file(f) for f in filenames]
and then you can create a bag or dataframe from this, as required.
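To make that last step concrete, a small sketch reusing the read_a_file and filenames names from the snippet above:

    # run the delayed reads in parallel; the result is a tuple of parsed JSON-LD documents
    docs = dask.compute(*output)

If an actual bag is wanted instead, note that dask.bag.from_delayed treats each delayed value as one partition, so each delayed function should return a list of records (for example [json.loads(...)]) before its results are passed to from_delayed.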
I am trying to build a simple Suave.IO application to centralize the sending of emails. Currently the application has one endpoint that takes subject, body, recipients, attachments, and sender as form data and turns them into an EWS email message from a logging email account.
Everything works as intended in most cases, but when one of the attachments is an Excel file, the file gets corrupted.
Currently, I am filtering request.multiPartFields down to only the ones that are marked as attachment files, and then doing this:
open System.IO
open System.Text

for (fileField : (string * string)) in fileFields do
    let fname = fst fileField
    let fpath = "uploadedFiles\\" + fname
    // decode the form-field string back to bytes with ASCII and write it to disk
    File.WriteAllBytes(fpath, Encoding.ASCII.GetBytes(snd fileField))
The file path and the attachment names are then fed into the EWS message before sending.
Again, this seems to work with all attachments except ones with binary content. It seems like Suave.IO automatically encodes all multiPartFields as (string * string), which may require special handling when the data is binary.
How should I handle upload of binary files?
Thanks all in advance.
It looks like the issue was one of encoding. I was testing using Python's requests library, and by default the files are encoded as multipart/form-data. By specifying a content type for each file, I was able to help the server identify the incoming data as a file.
Instead of
requests.post(url, data=data, files={filename: open(filepath, 'rb')})
I needed to make it
requests.post(url, data=data, files={filename: (filename, open(filepath, 'rb'), mimetypes.guess_type(filepath)[0])})
With the second Python call, the files do end up in the files section of the request, and I was able to save the Excel file without corruption.
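For the server side, a rough F# sketch of picking the uploads out of the request once they arrive as files rather than form fields; the HttpUpload member names (fileName, tempFilePath) are written from memory of Suave's API and should be double-checked against the version in use:

    open System.IO
    open Suave

    // copy each uploaded part from Suave's temp file to the attachments folder, byte-for-byte
    let saveUploads (request : HttpRequest) =
        for upload in request.files do
            let destination = Path.Combine("uploadedFiles", upload.fileName)
            File.Copy(upload.tempFilePath, destination, true)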
I am trying to localize push notification text inside a Parse Cloud function, and after many tries I was not able to get a working solution. Is there a way to localize text inside a Parse Server cloud function?
So, for anyone looking for a solution, I used the following library: i18n-node.
Then in the cloud code (I am using TypeScript):
import i18n from 'i18n';
//... other imports
i18n.configure({
    locales: ['en', 'it'],
    directory: __dirname + '/locales'
});
And then inside a cloud function it is possible to run:
i18n.__({phrase: "Hey, well done!", locale: locale})
Where locale can come from the request or, in my case, from the user's device language preference.
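As an illustration only, a sketch of using the localized string when sending the push; the cloud function name, channel, and payload shape here are made up for the example, not taken from the answer:

    Parse.Cloud.define("sendLocalizedPush", async (request) => {
        // the locale could come from request.params or from the user's stored preference
        const locale = request.params.locale || "en";
        const alert = i18n.__({ phrase: "Hey, well done!", locale: locale });
        await Parse.Push.send(
            { channels: ["news"], data: { alert: alert } },
            { useMasterKey: true }
        );
    });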
I had the same problem, and solved it with UTF-8 encoding before sending the notification.
npm package: utf8
Usage:
var utf8 = require("utf8");
// encode before sending the text
text = utf8.encode(text);
How do I store parsed JSON string data into a local text file first, and then the next time fetch the data from that local file itself? This is the code I wrote:
$.getJSON("my ruls ",function(data)
In the above line I am parsing the JSON URL. I need to store that data into a local text file, and the second time the user should read the data from the local text file only, not from the JSON URL.
Please give me some ideas, suggestions, or links to solve this problem.
I did this on iPhone.
Thanks & regards
You can write your data using FileWriter in PhoneGap (http://docs.phonegap.com/en/2.5.0/cordova_file_file.md.html#FileWriter), as sketched below.
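A rough sketch of that flow with the classic PhoneGap/Cordova File API (the data.json file name and the callback structure are placeholders):

    // request the persistent filesystem, open/create a file, and write the JSON string to it
    function saveJson(jsonString) {
        window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function (fileSystem) {
            fileSystem.root.getFile("data.json", { create: true, exclusive: false }, function (fileEntry) {
                fileEntry.createWriter(function (writer) {
                    writer.onwriteend = function () { console.log("saved"); };
                    writer.write(jsonString);
                }, onError);
            }, onError);
        }, onError);
    }

    function onError(error) { console.log("file error: " + error.code); }

Reading it back the next time works the same way: get the FileEntry, call fileEntry.file(...), and read the contents with a FileReader's readAsText.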
regards,