I am trying to build and publish a /nav_msgs/OccupancyGrid message to test another node that depends on actual data from a robot. Before I use real data, I just wanted to build a message from an array or matrix of numbers without any real sensors. How can I do that?
Thanks!
If you have a look at the nav_msgs/OccupancyGrid message definition you will see that the data is just stored as an array of int8 values plus some MapMetaData. So if you just need something filled in to test the other node, without any assumptions about the usefulness or plausibility of the data, you can write a script that fills the data structure with random values.
If the data needs to be somewhat useful and plausible, you should probably have a look at the map_server package. It allows you to generate a nav_msgs/OccupancyGrid from an image. This approach might even be easier overall than generating random data.
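For the random-data route, a minimal sketch in Python (untested against a live system; the /map topic name, node name, grid size, and helper names are arbitrary choices, not anything prescribed by nav_msgs):

```python
import random


def make_random_occupancy(width, height):
    """Return a flat, row-major list of occupancy values:
    0 = free, 100 = occupied, -1 = unknown."""
    return [random.choice([0, 100, -1]) for _ in range(width * height)]


def publish_fake_map(width=10, height=10, resolution=0.05):
    """Publish the fake grid once, latched, on /map.
    Requires a ROS environment (rospy, nav_msgs)."""
    import rospy
    from nav_msgs.msg import OccupancyGrid

    rospy.init_node("fake_map_publisher")
    pub = rospy.Publisher("/map", OccupancyGrid, queue_size=1, latch=True)

    grid = OccupancyGrid()
    grid.header.frame_id = "map"
    grid.header.stamp = rospy.Time.now()
    grid.info.resolution = resolution  # metres per cell
    grid.info.width = width
    grid.info.height = height
    grid.info.origin.orientation.w = 1.0
    grid.data = make_random_occupancy(width, height)

    pub.publish(grid)
    rospy.spin()
```

Latching the publisher means a subscriber that starts later still receives the map, which is handy for a one-shot test message.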
I want to collect a point cloud of a simulated space in Gazebo. I have tried scanning the environment, saving the scans as individual PCD files and then concatenating them, but this did not work. I have also tried to take the scans from Gazebo and visualise them in Open3D, but this ended up being the same as concatenating the PCD files. I know that the issue is not being able to transform the messages correctly, but I have not found a working method with clear steps to execute the transformation. I am doing this on ROS Noetic and would really appreciate help.
You should be using rosbag record for saving topic data inside ROS. This command simply records topic data, saves it to a file, and allows you to analyze or play it back later.
In your situation, you would also need to record the transform topic if you’re having tf issues.
To record data you can simply run a command such as: rosbag record /tf /my_scan_topic
Based on your comments, what you actually want to do is first combine multiple scans from the lidar into a single topic, i.e. create a single point cloud from multiple scans. The easiest way might be to use the laser_assembler package. Since this is all in ROS, the transforms will be handled automatically for you. Then, once you have all your scans assembled, write the result out to a PCD file.
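As a rough sketch of the laser_assembler route (assuming a laser_assembler node is already running with its scan topic remapped to yours, and using the package's standard assemble_scans2 service):

```python
def assemble_cloud(start_time, end_time):
    """Ask a running laser_assembler node for every scan received
    between the two stamps, combined into one sensor_msgs/PointCloud2
    in the assembler's fixed frame. Requires a ROS environment."""
    import rospy
    from laser_assembler.srv import AssembleScans2

    rospy.wait_for_service("assemble_scans2")
    assemble = rospy.ServiceProxy("assemble_scans2", AssembleScans2)
    return assemble(begin=start_time, end=end_time).cloud
```

The returned PointCloud2 can then be converted and saved as a PCD, for example via Open3D, once its fields are mapped to points.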
Is this correct?
What I basically want to ask is: is it correct to consider a linked list as a data source?
What happens in this program is that a text file's contents are loaded into memory in a linked list, which does all the processing work; then, when the program quits, the linked lists are written back to the file. In that case, is this DFD correct?
What you are describing is a data flow diagram (DFD), also known as the dynamic/event view of a design.
This diagram shows how events will proceed through the system.
What I basically want to ask is: is it correct to consider a linked list as a data source?
Yes, of course. Sometimes it is good to proceed with a linked list: it provides a simple and fast way to move through nodes holding the various attributes of a specific entity.
What happens in this program is that a text file's contents are loaded into memory in a linked list.
Basically, it's not a good idea to store the data as text, since that increases parsing overhead. The node values of the linked list are better stored in a binary format, in such a way that when a program loads them it can quickly relink all the interconnected nodes.
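As a small sketch of that idea in Python (the fixed-size (id, value) record layout here is entirely hypothetical, not something from your program), binary records can be written and re-read in one pass with no text parsing:

```python
import struct

# Hypothetical fixed-size record: a 32-bit signed int id and a
# 32-bit float value, little-endian, no padding.
RECORD = struct.Struct("<if")


def save_list(path, nodes):
    """Write each (id, value) node to a binary file; the file order
    stands in for the list links."""
    with open(path, "wb") as f:
        for node in nodes:
            f.write(RECORD.pack(*node))


def load_list(path):
    """Read the fixed-size records back and rebuild the list."""
    nodes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(RECORD.size)
            if not chunk:
                break
            nodes.append(RECORD.unpack(chunk))
    return nodes
```

Loading is then just a sequence of fixed-size reads, which is exactly the overhead saving the answer describes.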
Which does all the processing work, and then when the program quits the linked lists are written to the file.
Difficult to understand what you want to ask, but if you are asking who does the processing work, then obviously it's your program that parses the data saved in the file. Your program will also write changes back to the file if you make any alteration to the previous data.
In that case is this DFD correct?
Hard to tell until the whole requirement is known.
I have GPS track data from a logging device, in GPX format (or any other format, easily done with gpsbabel). I also have non-GPS data from another measurement device over the same time period. However, the measurement intervals of both devices are not synced.
Is there any software available that can combine the measurement data with the GPS data, so I can plot the measured values in a spatial context?
This would require matching of measurement times and interpolation of GPS trackpoints, combining the results in a new track file.
I could start scripting all of this, but if there are existing tools that can do this, I'd be happy to know about them. I was thinking that GPSBabel might be able to do this, but I haven't found how.
A simple Excel macro would do the job.
In desktop GIS software you could import the two data types in their appropriate formats (which you haven't specified), whether they are shapefiles or simply tables. Then a table join can be performed based on attributes. By selecting the measurement times as the join fields, a table will be created in which rows that share a measurement-time value in both datasets are appended to one another.
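If no existing tool fits and you do end up scripting it, the interpolation step the question mentions is only a few lines. A minimal sketch, assuming the GPS track has already been parsed into a time-sorted list of (time, lat, lon) tuples (the tuple layout and function name are my own choices):

```python
from bisect import bisect_left


def interpolate_position(track, t):
    """Linearly interpolate a (lat, lon) position from a GPS track
    at time t. track is a list of (time, lat, lon) tuples sorted by
    time; times outside the track clamp to the nearest endpoint."""
    times = [p[0] for p in track]
    i = bisect_left(times, t)
    if i == 0:
        return track[0][1:]
    if i >= len(track):
        return track[-1][1:]
    t0, lat0, lon0 = track[i - 1]
    t1, lat1, lon1 = track[i]
    w = (t - t0) / (t1 - t0)
    return (lat0 + w * (lat1 - lat0), lon0 + w * (lon1 - lon0))
```

Each measurement timestamp can then be mapped to an interpolated trackpoint and the pairs written out as a new track file.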
As the question states: how is it possible to process a dynamic video stream? By dynamic, I actually mean that I would like to process stuff on my screen, so the image array should be some sort of "continuous screenshot".
I'd like to process the video / images based on certain patterns. How would I go about this?
It would be perfect if there already were existing components (and there probably are). I need to be able to use the location of the matches (or partial matches). A .NET component for the different requirements could also be useful, I guess...
You will probably need to read up on computer vision before you attempt this. There is nothing really special about video that separates it from still images. The process you might want to look at is:
Acquire the data
Split the data into individual frames
Remove noise (Use a Gaussian filter)
Segment the image into the sections you want
Extract the connected components of the image
Find a way to quantize the image for comparison
Store/match the components to a database of previously found components
With this database/datastore you'll have information on the matches available later. Do what you like with it.
As far as software goes:
Most of these algorithms are not too difficult. You can write them yourself. They do take a bit of work though.
OpenCV does a lot of the basic stuff, but it won't do everything for you
Java: JAI, JHLabs [for filters], Various other 3rd party libraries
C#: AForge.net
I have an external device that spits out UDP packets of binary data and software running on an embedded system that needs to read this data stream, parse it and do something useful. The binary data gets logged to a file as well. I would like to write a parser that can easily take the input directly from either the UDP stream or a file, parse the data into a specific format, and then direct the output either to a file (e.g. a MATLAB data file) or to another process that will do some real-time processing. Are there any resources that would help me with this, and what is the best way to go about it? I think it might make sense to use C++ streams, but I'm not familiar with creating custom output streams. Does this seem like a good approach to take, or is there a better way to go about it?
Thanks.
The beauty of binary data is that it is generally of a very fixed format.
A typical method of parsing it is to declare a structure that maps onto the received packets, and then to just use type-casts to read the fields as structure elements.
The beauty is that this requires no parsing.
You have to be careful about structure packing rules and endianness to make the structure map in exactly the same way. The C "offsetof" and "sizeof" macros are useful for emitting some debug info to check that your structure is indeed mapping to what you think it is.
Packing rules can typically be altered either by directives (such as #pragma's) or command-line options. Endianness you are stuck with: if it's different from what your embedded system uses, declare all the fields as bytes, or use something like the "ntohs"/"ntohl" functions to do the byte swapping.
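For quickly prototyping the same fixed-layout idea before committing to the C structure, Python's struct module makes the packing and byte order explicit; the packet layout below is made up purely for illustration:

```python
import struct

# Hypothetical packet: uint16 sequence, uint16 flags, then three
# float32 coordinates. ">" selects big-endian ("network order") and
# disables native alignment, mirroring a packed C struct.
PACKET = struct.Struct(">HHfff")


def parse_packet(buf):
    """Unpack one fixed-format packet from bytes, e.g. a UDP datagram."""
    seq, flags, x, y, z = PACKET.unpack(buf[:PACKET.size])
    return {"seq": seq, "flags": flags, "pos": (x, y, z)}
```

PACKET.size plays the role of sizeof here, so the same sanity checks (does the offset of each field match the spec?) carry over directly to the C version.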
The New Jersey Machine Code Toolkit is a scheme for decoding arbitrary binary patterns. It was originally designed for decoding instruction sets, but it ought to be just fine for decoding message formats. You provide a description of the binary format, and it synthesizes code to access the fields of that format (when valid). Thus you can refer to message fields using generated function calls rather than thinking about where each field is or how it is encoded.