In which file are queue messages stored in TIBCO EMS? - tibco-ems

Queue names are stored in the queues.conf file located in the ems/bin folder.
Does anyone know where the messages on TIBCO EMS queues are stored?
Can we access the messages on these queues without any other tool?

Look in your stores.conf file: under each [store_name] section there is a file= parameter.
Its value is the path of the file where messages are stored, and you can change it.
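For illustration, a typical stores.conf entry looks like the sketch below (the store name and file name shown are the EMS defaults; yours may differ):

```ini
[$sys.failsafe]
  type=file
  file=sync-msgs.db
  mode=sync
```

Messages in queues bound to this store are written to sync-msgs.db (a relative path is resolved against the server's store directory). Note that the store files are in EMS's internal binary format, so you cannot usefully read them directly; to inspect messages you still need the EMS admin tool or a JMS client.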


Avoid reading the same file multiple times using Telegraf and file input plugin

I need to read csv files inside a folder. New csv files are generated every time a user submits a form. I'm using the "file" input plugin to read the data and send it to Influxdb. These steps are working fine.
The problem is that the same file is read multiple times every data collection interval. I was thinking of a solution where I could move each file to a different folder after it was read, but I couldn't do that with Telegraf's "exec" output plugin.
ps: I can't change the way csv files are generated.
Any ideas on how to avoid reading the same csv file multiple times?
As you discovered, the file input plugin reads entire files at each collection interval.
My suggestion is for you to instead use the directory monitor input plugin. This will read files in a directory, monitor the directory for new files, and parse the ones that have not already been picked up yet. There are some configuration settings in that plugin that make it easier to time when new files are read as well.
Another option is to use the tail input plugin which will tail a file and only read new updates to that file as things come. However, I think the directory monitor is more likely something you are after for your scenario.
Thanks!
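A minimal sketch of the directory monitor plugin's configuration (the directory paths and CSV options here are hypothetical placeholders for your setup):

```toml
[[inputs.directory_monitor]]
  directory = "/data/csv_incoming"        # folder where the new CSV files land
  finished_directory = "/data/csv_done"   # processed files are moved here
  data_format = "csv"
  csv_header_row_count = 1
```

Note that this plugin moves each processed file into finished_directory, which is exactly the "move the file after reading it" behavior you were trying to build yourself.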

How can I turn spool requests into PDF files on the application server?

I'm currently doing Invoicing and Printing setup on a SAP demo system. I've managed to create Smart Forms based on the standard ones. The problem starts with printing using the FPCOPARA transaction and LP01 as the output device. I was able to generate a spool request (and view it), but nothing was printed (no actual file).
I just want to have a file from that Smart Form stored in AL11 and be able to archive it later on. Do you have an idea of how I can proceed with this?
Thanks
We actually have an in-house-developed program for this exact task. I don't have permission to publish the source code of the program, but it involves:
Reading the list of spool requests from database table TSP01
Using the function module RSTS_GET_ATTRIBUTES to obtain the type of the spool request.
Calling the function modules CONVERT_OTFSPOOLJOB_2_PDF or CONVERT_ABAPSPOOLJOB_2_PDF, depending on the type determined by the previous function module. They return a table containing the content of the spool request in the PDF format.
Writing the table returned by the previous function modules to a file using the ABAP statements OPEN DATASET and TRANSFER.
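As a rough sketch, the conversion and file-writing steps might look like the following ABAP (simplified: the AL11 path is hypothetical, and in production the last line should be truncated to pdf_bytecount; verify the exact function module signatures on your system):

```abap
DATA: lt_pdf     TYPE TABLE OF tline,
      ls_pdf     TYPE tline,
      lv_bytes   TYPE i,
      lv_spoolid TYPE tsp01-rqident,       " spool request number from TSP01
      lv_file    TYPE string VALUE '/usr/sap/trans/spool.pdf'. " hypothetical path

CALL FUNCTION 'CONVERT_ABAPSPOOLJOB_2_PDF' " use CONVERT_OTFSPOOLJOB_2_PDF for OTF spools
  EXPORTING
    src_spoolid   = lv_spoolid
  IMPORTING
    pdf_bytecount = lv_bytes
  TABLES
    pdf           = lt_pdf.

OPEN DATASET lv_file FOR OUTPUT IN BINARY MODE.
LOOP AT lt_pdf INTO ls_pdf.
  TRANSFER ls_pdf TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.
```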

How to add different loggers (output destinations like files and Crashlytics) for logs created by OSLog?

Right now, if I redirect stderr, my logs won't go to the console anymore; they would only go to the log files.
I'm wondering if there is a simple way that I can have them both?
Also is there anyway that I can pass logs generated by OSLog to Crashlytics?
You can only do that if you have a wrapper that manages sending logs to os_log and to your other loggers, because anything that writes to the "system logs", i.e. the Apple logging layer (os_log, NSLog, etc.), stores them in a way that is generally opaque to you.
You could create a class which not only writes to os_log, but also to a local text file in your documents directory. That way, you can control what's logged to disk, and it's in a format and directory that's readable to you which you control. You could allow users to airdrop the file, have it readable via iTunes/USB, or email.
Also, don't forget to run your own logging layer on a background thread/queue so it doesn't block the core performance of your app.
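A minimal sketch of such a wrapper (the subsystem, category, and file name are hypothetical; the Crashlytics call is shown only as a comment since the Firebase SDK is not assumed):

```swift
import os
import Foundation

// Logging facade: forwards each message to os_log AND appends it to a
// readable text file in the app's Documents directory.
final class AppLogger {
    static let shared = AppLogger()
    private let log = Logger(subsystem: "com.example.app", category: "general")
    private let queue = DispatchQueue(label: "com.example.app.logger", qos: .utility)
    private let fileURL: URL = {
        let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        return docs.appendingPathComponent("app.log")
    }()

    func info(_ message: String) {
        log.info("\(message, privacy: .public)")       // system (unified) log
        // Crashlytics.crashlytics().log(message)      // if the Firebase SDK is present
        queue.async {                                  // file write off the main thread
            guard let data = (message + "\n").data(using: .utf8) else { return }
            if let handle = try? FileHandle(forWritingTo: self.fileURL) {
                handle.seekToEndOfFile()
                handle.write(data)
                handle.closeFile()
            } else {
                try? data.write(to: self.fileURL)      // first write creates the file
            }
        }
    }
}
```

Because the file lives in Documents, you control its format and can expose it via AirDrop, iTunes/USB file sharing, or email, as described above.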

Sequential Processing of Files in IIB v9

Environment: IIB v9 broker on Windows; the SFTP server is also on Windows.
We have a requirement to process a batch of files generated by a backend system in sequential order (i.e. FIFO). A batch can have multiple files.
All the files are placed in the IIB source directory, from where the FileInput node polls them using the move command.
I want to know whether the FileInput node is capable of picking up files in the order they were created by the backend system.
Thanks,
Below is an extract from the help contents, which says that the FileInput node reads files in the order they were created by default (the oldest are picked up first).
How the broker reads a file at the start of a flow
The FileInput node processes messages that are read from files. The
FileInput node searches a specified input directory (in the file
system that is attached to the broker) for files that match specified
criteria. The node can also recursively search the input directory's
subdirectories.
Files that meet the criteria are processed in age order, that is, the
oldest files are processed first regardless of where they appear in
the directory structure.
https://www.ibm.com/support/knowledgecenter/en/SSMKHH_9.0.0/com.ibm.etools.mft.doc/ac55280_.htm
Yes, you can loop in your message flow and set the file path of the file you want to load, one at a time. Feel free to ask for more information if needed.
For the ordering, I have to check (you can still call a Java node that lists the files in the order you want).
Regards
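The suggestion above to list files from a Java node in the order you want can be sketched as follows (directory path passed in by the caller; this mirrors the FileInput node's default oldest-first ordering):

```java
import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class OldestFirst {
    // Return the regular files in a directory sorted oldest-first by
    // last-modified time, i.e. the age order described in the help extract.
    public static File[] listOldestFirst(String dir) {
        File[] files = new File(dir).listFiles(File::isFile);
        if (files == null) {
            return new File[0]; // directory missing or unreadable
        }
        Arrays.sort(files, Comparator.comparingLong(File::lastModified));
        return files;
    }
}
```

A Java compute node could call this and then set the file path for each entry in turn, giving you explicit FIFO control instead of relying on the polling order.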

writing source and custom sink using flume-ng

I am new to Flume NG. I want my source to send some unique XML files to the channel one by one. The channel will validate the XML files and send the validity (either true or false) along with the XML file to the custom sink. This sink will write the valid files and invalid files to different directories in HDFS. I am not sure which source to use. Please help.
None of the current ones will fit your use case. The SpoolingDirectorySource is line-oriented so XML files will confuse it and not arrive in one piece.
I suggest you write a custom source for your application.
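The "validity" check described in the question is just an XML well-formedness test, which the standard Java parser can do; a custom source (or interceptor) could run this per file and attach the result as an event header for the sink to route on (the Flume API calls themselves are omitted here):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;

public class XmlValidity {
    // Returns true if the bytes parse as well-formed XML. A custom Flume
    // source could set this as an event header so the HDFS sink can route
    // the file to the "valid" or "invalid" directory.
    public static boolean isWellFormed(byte[] xml) {
        try {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setNamespaceAware(true);
            factory.newDocumentBuilder().parse(new ByteArrayInputStream(xml));
            return true;
        } catch (Exception e) {
            return false; // parse error => not well-formed
        }
    }
}
```

Keeping each file as a single byte array (one Flume event per file) also avoids the line-splitting problem mentioned above.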
