Rosbag record does not record the first messages

I am having a problem with ROS rosbag.
I have a Python script that generates and publishes messages. I try to record them with rosbag record, but the generated bag does not seem to contain the first messages.
I have done the following:
Run the script while, in another terminal, running rostopic echo /the_message; I can see that all messages are being generated correctly, so it is not a problem with the script.
Run rosbag record -O data.bag /the_message first in one terminal (just in case some "initialization time" is needed) and then the script in another terminal. The bag file is created correctly. But when I play the bag with rosbag play --clock data.bag and run rostopic echo /the_message in another terminal, I only get messages starting from the third or fourth one, never the first.
What could be happening, and how can I ensure the recording includes the first messages?

Related

How can I tell Jenkins I want the last n lines of logs to appear in Standard Error

Currently, from my test execution, I see the first 10 lines of logs if the test has passed, and the first n lines if it has failed. Ideally, I would like to see the last 100 lines of logs in Standard Error.
There are no settings set up in my Jenkinsfile, so I assume the current behavior is the default for test execution. I need a way to change it.
I have tried ${BUILD_LOG, maxLines, escapeHtml} which gets me the logs that can be emailed to someone
I have tried messing around with System Logs
I would like my standard error to show the last n lines of logs
This is probably A way to do it (definitely not THE way, and maybe not the most efficient way):
Once the build completes, curl the console log and store the file in the workspace, with something like: curl -isk -H "$CRUMB" "$BUILD_URL/consoleText" --user "$USER:$APITOKEN" -o consolelogs.txt
Then you can simply tail the file for the last 100 lines, e.g.: tail -n 100 <log file> > newLogfile
Combine it with the Log Parser plugin, using a known string like ****start of logs**** and a rule to parse for it.
For point #1, see Jenkins API access for how to get the crumb.
Update:
Just realized that the above approach may be difficult from within the same job. Either a downstream job can be used, or you can redirect the entire test log to a file in the workspace without sending it to stdout, and then use tail and the Log Parser plugin to achieve the result.
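The steps above can be sketched as a small post-build shell step. This is a sketch, not a drop-in script: CRUMB, BUILD_URL, USER and APITOKEN are the Jenkins values mentioned above, the curl line is left commented out because it needs real credentials, and a fake log is generated so the rest runs anywhere. consolelogs.txt, newLogfile and the ****start of logs**** marker are just the names used in this answer.

```shell
# In a real job this fetches the finished build's console log
# (CRUMB, BUILD_URL, USER, APITOKEN come from the Jenkins setup above):
# curl -isk -H "$CRUMB" "$BUILD_URL/consoleText" --user "$USER:$APITOKEN" -o consolelogs.txt

# Stand-in log so the sketch is runnable without Jenkins:
seq 1 250 | sed 's/^/log line /' > consolelogs.txt

# Keep only the last 100 lines, prefixed with a known marker that a
# Log Parser plugin rule can key on:
echo "****start of logs****" > newLogfile
tail -n 100 consolelogs.txt >> newLogfile
```

The marker line plus 100 tailed lines is what the Log Parser rule would then pick up.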

Rails: How to pass data to Bash Script

I have a Rails webapp full of students with test scores. Each student has exactly one test score.
I want the following functionality:
1.) The user enters an arbitrary test score into the website and presses "enter"
2.) "Some weird magic where data is passed from Rails database to bash script"
3.) The following bash script is run:
./tool INPUT file.txt
where:
INPUT = the arbitrary test score
file.txt = a list of all student test scores in the database
4.) "More weird magic where output from the bash script is sent back up to a rails view and made displayable on the webpage"
And that's it.
I have no idea how to do the weird magic parts.
My attempt at a solution:
In the rails dbconsole, I can do this:
SELECT score FROM students;
which gives me a list of all the test scores (which satisfies the "file.txt" argument to the bash script).
But I still don't know how my bash script is supposed to gain access to that data.
Is my controller supposed to pass the data down to the bash script? Or is my model supposed to? And what's the syntax for doing so?
I know I can run a bash script from the controller like this:
system("./tool")
But, unfortunately, I still need to pass the arguments to my script, and I don't see how I can do that...
You can just use the built-in ruby tools for running shell commands:
https://ruby-doc.org/core-2.3.1/Kernel.html#method-i-60
For example, in one of my systems I need to get image orientation:
exif_orientation = `exiftool -Orientation -S "#{image_path}"`.to_s.chomp
Judging from my use of .to_s, running the command may sometimes return nil, and I don't want an error when trying to chomp nil. Normal output includes the line ending, which I feed to chomp.
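For the original question, the same pattern covers both "weird magic" steps: dump the scores to a temp file, run the tool with the user's input and the file path as arguments, and capture stdout for the view. A minimal sketch under stated assumptions: Student.pluck(:score) is how a Rails model would typically supply the scores, ./tool is the questioner's script, and wc -l stands in for it here so the snippet is runnable anywhere.

```ruby
require 'tempfile'
require 'open3'

# In the controller this would be: scores = Student.pluck(:score)
scores = [71, 88, 95]
input  = 90  # the arbitrary score the user typed into the form

# Write one score per line -- this plays the role of file.txt
file = Tempfile.new('scores')
file.puts(scores)
file.flush

# Real call would be:
#   output, status = Open3.capture2('./tool', input.to_s, file.path)
# 'wc -l' is a stand-in so the sketch runs without ./tool:
output, status = Open3.capture2('wc', '-l', file.path)
file.close!

# `output` is what you would hand to the view, e.g. @result = output
result = output.split.first
```

Open3.capture2 avoids shell interpolation entirely, so a score like "90; rm -rf /" can't be injected the way it could with system("./tool #{input} ...").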

CLI - Ignore a line when using the for command

I'm trying to set a variable which will identify the make of a laptop.
I am doing this by using the command
wmic csproduct get vendor
This gives the following output
Vendor
LENOVO
(blank)
So based on that command, I have used the for command in the way below, to try and set a variable with the value 'LENOVO'
for /f "skip=1 tokens=1 delims=" %i in ('wmic csproduct get vendor') do set vendor=%i
However, the problem is that the output of the wmic command produces a blank line under the word LENOVO, so my variable ends up set to a blank value. Is there any way to stop the for command from parsing this third line, so it stops once the variable has been set to 'LENOVO'?
The skip option works fine and bypasses the first line completely. However, it doesn't seem to offer a way to, say, skip lines 1 and 3 but keep line 2. I have experimented with the eol parameter to try to ignore the blank line, but the for command still reads the empty third line each time.
Many Thanks
A little wacky, but try using the following within your line. It filters out the Vendor line and then sorts alphabetically, putting the blank line first. If you really need to do this kind of thing regularly, look into a Windows port of some Unix utilities such as sed and awk.
wmic csproduct get vendor | find /v /i "vendor" | sort
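If you'd rather stay with plain for /f, one commonly used workaround for this is a nested for /f: wmic emits Unicode, and the conversion leaves a stray carriage return at the end of each line, which is what that phantom "blank" third line really is. Re-parsing each line through a second for /f drops the carriage return, and on the CR-only line the inner loop finds no token at all, so the variable keeps its value. A sketch for a .cmd file (use %i instead of %%i at the prompt); untested here:

```
@echo off
for /f "skip=1 delims=" %%i in ('wmic csproduct get vendor') do (
    rem The inner for /f strips the trailing carriage return; on the
    rem "blank" line it yields no token, so vendor is left untouched.
    for /f "delims=" %%j in ("%%i") do set "vendor=%%j"
)
echo %vendor%
```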

Jmeter doesn't save response data or headers

I'm building some simple load testing for my API, and to make sure everything is on the up and up I'd also like to review the response headers and data. But when I run my test from the command line and then re-open the GUI, add a View Results Tree listener, and load the created file, the response headers and response data are empty.
I entered the following values into user.properties (also tried uncommenting those values in jmeter.properties and changing them there, same result)
jmeter.save.saveservice.output_format=csv (tried xml, omitting it, jtl)
jmeter.save.saveservice.data_type=false
jmeter.save.saveservice.label=true
jmeter.save.saveservice.response_code=true
jmeter.save.saveservice.response_data.on_error=true
jmeter.save.saveservice.response_message=true
jmeter.save.saveservice.successful=true
jmeter.save.saveservice.thread_name=true
jmeter.save.saveservice.time=true
jmeter.save.saveservice.subresults=false
jmeter.save.saveservice.assertions=false
jmeter.save.saveservice.latency=true
jmeter.save.saveservice.bytes=true
jmeter.save.saveservice.hostname=true
jmeter.save.saveservice.thread_counts=true
jmeter.save.saveservice.sample_count=true
jmeter.save.saveservice.response_message=true
jmeter.save.saveservice.assertion_results_failure_message=true
jmeter.save.saveservice.timestamp_format=HH:mm:ss
jmeter.save.saveservice.default_delimiter=;
jmeter.save.saveservice.print_field_names=true
But still no luck when opening the result file. I tried giving the file after the -l flag a .csv, .jtl, even .xml extension, but none of them show me the headers and data.
I'm running it locally on Mac OS X 10.10, JMeter version 2.12, using the following command:
java -jar ApacheJMeter.jar -n -t /Users/[username]/Documents/API_test.jmx -l results_15.jtl
I don't know if it's not even saving that data, or if the Listeners can't read it or if I've been cursed but any help is appreciated.
It works fine if I add a Listener and run it using the GUI, but if I try to run my larger tests that way, well, things don't end well for anyone.
So my question is:
How do I save the response header and data to a file when using the command line, and how do I then view said file in jmeter?
Add a Simple Data Writer (under Listeners) and output to a file (NB: a different file from your log). Under the 'Configure' button there are all sorts of options for what to save; one of the checkboxes is Save Response Header.
This file can get huge if you're saving a bunch of things for every request; one strategy is to check everything but only save for errors. But do whatever works for you.
You can also turn on "Functional Test Mode" which will produce a large file but will contain pretty much anything you might need to debug your test.
Beware, this can create a very large JTL file, so don't forget to turn it off for your large test runs! See JMeter Maven mojo throws IllegalArgumentException with large JTL file
Alternatively use a Tree View Listener in the GUI for a small sample of the requests and check the request/response in the GUI (including headers) to debug or check your test.
Add the lines below to the user.properties file:
jmeter.save.saveservice.output_format=xml
jmeter.save.saveservice.response_data=true
jmeter.save.saveservice.samplerData=true
jmeter.save.saveservice.requestHeaders=true
jmeter.save.saveservice.url=true
Then restart the command prompt so JMeter picks up the new properties.
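Note that neither list above actually includes response headers, which the question also asked about. A hedged user.properties fragment combining the relevant flags (XML output is needed, since the CSV format cannot hold response bodies or headers; the responseHeaders property name is the one from jmeter.properties, worth double-checking against your JMeter version):

```
# user.properties -- save enough detail for View Results Tree to show
# request/response bodies and headers when the .jtl is reloaded
jmeter.save.saveservice.output_format=xml
jmeter.save.saveservice.response_data=true
jmeter.save.saveservice.samplerData=true
jmeter.save.saveservice.requestHeaders=true
jmeter.save.saveservice.responseHeaders=true
jmeter.save.saveservice.url=true
```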

Output of syslog, sent to stdout, grepped, is getting truncated mid-line

So I'm logging some debug information, sending it to stdout, whereupon I grep it for a string. At a certain point logging is done and the application is waiting for stuff, but grep's output is truncated mid-line: it matched a line but didn't output all of that line.
Is there a way to force grep to flush?
Thanks.
UPDATE:
It appears that --line-buffered will help.
I think you solved your problem with grep by using the --line-buffered flag. Also make sure that your application is flushing its stdout after each line. If stdout is a terminal, line buffering is the default, but when you pipe it into another program the default is full buffering.
If you are piping data into a program that doesn't have the --line-buffered flag (like uniq for example), take a look at the stdbuf program (http://www.pixelbeat.org/programming/stdio_buffering/stdbuf-man.html) which allows you to modify the buffering options of any program.
See http://www.pixelbeat.org/programming/stdio_buffering/ for a good overview of the issue and some common solutions.
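A quick way to see the two fixes mentioned above side by side, GNU grep's --line-buffered flag and coreutils' stdbuf (here fed from printf rather than a live syslog stream, so the snippet runs anywhere):

```shell
# grep flushes each matching line as soon as it is written, instead of
# waiting for a full block to accumulate:
printf 'match one\nskip me\nmatch two\n' | grep --line-buffered 'match'

# For filters without such a flag (uniq, awk, ...), stdbuf -oL forces
# line-buffered stdout on the downstream program:
printf 'a\na\nb\n' | stdbuf -oL uniq
```

With a live source like tail -f, the difference is visible immediately: without these, matches can sit in grep's buffer long after the producer emitted them.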
