How to use namespaces in ROS?

I cannot understand how namespaces work in ROS: http://wiki.ros.org/Names
Can you give a couple of real examples of how this works?
And the same question about parameters: http://wiki.ros.org/Parameter%20Server
What do these names mean?
Are they the names of the package, the node, the parameter, or something else?

Namespaces are the best option for dealing with name collisions, which are quite common in robotics, especially as the system grows bigger and more complex.
Imagine you have a robot with two distance sensors, one at the front and one at the back. You would end up with two topics carrying the same kind of information:
distance=10 and distance=10
Now what? How can a third node know which distance is which?
Using namespaces you can avoid that issue by simply publishing
back/distance=10 and front/distance=10
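For example, here is a minimal rospy sketch (the node name range_publisher, the topic name distance, the parameter max_range and the 10.0 reading are all made up for illustration). Launched once under the front namespace and once under the back namespace (e.g. with a <group ns="front"> tag in a launch file, or ROS_NAMESPACE=front rosrun ...), the same relative topic name resolves to /front/distance and /back/distance, so a third node can tell the two readings apart. Parameter names on the Parameter Server resolve the same way.

```python
#!/usr/bin/env python
# Hypothetical range_publisher.py: publishes a dummy distance reading.
import rospy
from std_msgs.msg import Float32

rospy.init_node('range_publisher')

# The relative name 'distance' is resolved against the node's namespace:
# inside <group ns="front"> it becomes /front/distance,
# inside <group ns="back">  it becomes /back/distance.
pub = rospy.Publisher('distance', Float32, queue_size=10)

# Parameters resolve the same way: a relative parameter name looked up by a
# node running in the 'front' namespace maps to /front/max_range.
max_range = rospy.get_param('max_range', 100.0)

rate = rospy.Rate(10)  # 10 Hz
while not rospy.is_shutdown():
    pub.publish(Float32(10.0))  # dummy reading
    rate.sleep()
```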

Related

Best Grasshopper plugin to analyse floor plans

I'm trying to figure out the best way to analyse a grasshopper/rhino floor plan. I am trying to create a room map to determine how many doors it takes to reach an exit in a residential building. The inputs are the room curves, names and doors.
I have tried to use Space Syntax or SYNTACTIC, but some of the components are missing. A lot of the plugins I have been looking at are good at creating floor plans but not at analysing them.
Your help would be greatly appreciated :)
You could create some sort of spine that goes through the rooms and passes only through doors, then do some path-finding across that topology, counting how many "hops" you need to reach the exit.
One way to get the topology is to create a data structure (a tuple, a key-value pair) that holds the curve (room) and a point (the door). Then loop over every pair of rooms and check whether a door point of one room is closer than some threshold to the other; if it is, store the relationship as a graph. (In the abstract sense you don't really need to make lines out of it, but if you plan to use other plugins for path-finding, this can be useful.) Finally, run some path-finding algorithm (Dijkstra's, A*, etc.) to find the shortest distance, as in the sketch below.
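If it helps, here is a rough Python sketch of that idea, assuming the room/door adjacency has already been extracted from the Rhino geometry; the room names and connections below are invented just to show the hop counting.

```python
# Count door "hops" from a room to the exit over a room-adjacency graph.
from collections import deque

# Each door shared by two rooms becomes an undirected edge in the graph.
room_graph = {
    'Bedroom':    ['Hallway'],
    'Kitchen':    ['Hallway', 'LivingRoom'],
    'LivingRoom': ['Kitchen', 'Hallway'],
    'Hallway':    ['Bedroom', 'Kitchen', 'LivingRoom', 'Exit'],
    'Exit':       ['Hallway'],
}

def doors_to_exit(graph, start, exit_room='Exit'):
    """Breadth-first search: fewest doors to pass from start to the exit."""
    hops = {start: 0}
    queue = deque([start])
    while queue:
        room = queue.popleft()
        if room == exit_room:
            return hops[room]
        for neighbour in graph[room]:
            if neighbour not in hops:
                hops[neighbour] = hops[room] + 1
                queue.append(neighbour)
    return None  # no path through doors

print(doors_to_exit(room_graph, 'Bedroom'))  # -> 2
```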
As for SYNTACTIC: if copying the GHA (after unblocking it) from the installation path to the special components folder (or pointing to that folder from _GrasshopperDeveloperSettings) doesn't work, tick the "Memory load *.GHA assemblies using COFF byte arrays" option in _GrasshopperDeveloperSettings.
*Note that SYNTACTIC won't give you any automatic topology.
If you need some pseudo-code just write a comment and I'd be happy to help.

Explanation of Mapping structure in Veins

I am trying to understand and implement modifications to the Veins framework. Right now, I am having some difficulty figuring out how the "Mapping" structure works. It is used to set the transmission power in "Mac1609_4.cc":
ConstMapping* txPowerMapping = createSingleFrequencyMapping(start, end, frequency, 5.0e6, power);
and to calculate received power, SNR and SINR in "Decider80211p.cc". Could you give some insight and some examples related to the structure manipulation?
The Mapping structure comes from MiXiM, as Veins was initially forked from that project. MiXiM, however, is deprecated now and should not be used anymore [2]. Unfortunately, there is no real documentation available (anymore).
As a replacement there is either INET, which is also supported by Veins, or, as will be introduced in the next release, a much simpler representation of signals that removes the Mapping structure [4].
If you still need to understand the structure, you can have a look at this paper, where the authors explain the physical layer, including the Mapping structure.

Zener Diode - What constitutes "Similar?"

I have very little experience with ECE in general and I am delving into using an Arduino for some small hobby type projects.
I was following an online guide, and the person who wrote it says that I need:
"2 - 1N5227 or similar 3.6V biased zener diodes"
I have read up a bit on Zener diodes and now understand what they do and what their purpose is. However, I am not able to tell what he means by "similar" in this context. I purchased a diode kit that includes 4 types of Zener diodes. They all have different part numbers and voltages.
The 4 I have are:
1N751 5.1V
1N4733 5.1V
1N4735 6.2V
1N4742 12V
Would any of those be usable in this context or should I order the specific model he states?
The guide being referenced is this, if it is helpful: http://www.instructables.com/id/RC-Transmitter-to-USB-Gamepad-Using-Arduino/
I really appreciate the time and assistance with this, this is a fun area to learn in!
In electronics and other engineering areas, "similar" refers to the property that stands out (in this case the voltage); here it means looking for another Zener diode whose voltage is close to the one in the example. As far as I can see, none of those diodes is a suitable replacement.
Zener diodes have two parameters you need to match when selecting a replacement (regardless of the manufacturer):
The Zener voltage (Vz) and the diode power.
For your application you will need a 3.6 V Zener diode, and usually 1/4 W to 1/2 W (depending on the power your application needs) will be enough.
You also need to calculate the limiting resistor for the Zener diode; a rough example calculation is sketched below.
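As an illustration of that calculation (the supply voltage and Zener current below are assumed example values, not taken from the Instructables guide):

```python
# Series (limiting) resistor for a Zener regulator, with assumed example values.
V_IN = 5.0    # supply voltage (V), assumed
V_Z = 3.6     # Zener voltage (V)
I_Z = 0.010   # desired Zener current (A), e.g. 10 mA, assumed

R = (V_IN - V_Z) / I_Z           # limiting resistor
P_R = (V_IN - V_Z) ** 2 / R      # power dissipated in the resistor
P_Z = V_Z * I_Z                  # worst-case power in the Zener (no load)

print(f"R = {R:.0f} ohm, P_R = {P_R*1000:.0f} mW, P_Z = {P_Z*1000:.0f} mW")
# -> R = 140 ohm, P_R = 14 mW, P_Z = 36 mW (well under a 1/4 W rating)
```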
I recommend reading the book by Albert Paul Malvino, or a similar one, to understand this better.
Regards.

Error detection on food packaging using OpenCV

I am trying to determine whether a food package has a printing error or not. For example, does the "McDonald's" logo have misprints, such as a wrong label or wrong color? (I cannot post a picture.)
What should I do? Please help me!
It's not a trivial task by any stretch of the imagination. Two images of the same identical object will always differ depending on lighting conditions, perspective, shooting angle, etc.
Basically you need to:
1. Process the 2 images into "digested" data: dominant color, shapes, etc.
2. Design and run your own similarity algorithm between the 2 objects
You may want to look at feature detectors in OpenCV: SURF, SIFT, etc.
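For example, here is a rough Python sketch of feature matching between a reference image of the logo and a photo of the package. The file names are placeholders, and ORB is used here instead of SURF/SIFT only because it ships free with stock OpenCV; the match threshold is something you would tune by experiment.

```python
# Compare a reference logo against a photo of the package using ORB features.
import cv2

ref = cv2.imread('logo_reference.png', cv2.IMREAD_GRAYSCALE)
test = cv2.imread('package_photo.png', cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(ref, None)
kp2, des2 = orb.detectAndCompute(test, None)

# Brute-force Hamming matcher with cross-check, then keep the closest matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
good = [m for m in matches if m.distance < 40]   # threshold chosen by experiment

# Crude similarity score: fraction of reference keypoints that found a good
# match. A low score suggests the print deviates from the reference.
score = len(good) / max(len(kp1), 1)
print(f"{len(good)} good matches, similarity score {score:.2f}")
```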
I just found your question among some search results, so I think I may be too late.
If not, I think your problem can easily be solved; a tool for this has existed for years and is called Sikuli.
While it is meant for testing purposes, I have been using it in exactly the way you need: comparing a reference image with a production image. Being based on OpenCV, it does this very well.

Upper bound - display

This is an idea that got into my mind:
All display devices (screens with pixels, etc.) have an upper bound on the number of distinct images they can generate.
As an example, a 1024*768 display with 32-bit pixels can only show (2^32)^(1024*768) distinct frames without duplicating any scene (view).
The funny thing is, it's as if we could pre-generate all the films and all the windows we have ever seen in our lives through screens.
The question here is: can anybody use this abstract idea to create something useful? :D
You're talking about a number of roughly
(2^32)^(1024*768) ≈ ((2^4)^8)^(10^6) ≈ (10^8)^(10^6) = 10^8000000.
The number of atoms in the observable universe is about
10^80 (http://en.wikipedia.org/wiki/Observable_universe#Matter_content).
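A quick sanity check of that estimate in Python, working with logarithms instead of the full integer (trying to build the number itself is exactly what makes a naive computation blow up):

```python
# Estimate the exponent of (2^32)^(1024*768) without computing the huge number.
import math

bits_per_pixel = 32
pixels = 1024 * 768

# log10 of (2^32)^(1024*768) = 32 * 1024 * 768 * log10(2)
exponent = bits_per_pixel * pixels * math.log10(2)
print(f"about 10^{exponent:,.0f} distinct frames")   # roughly 10^7,575,668
print("atoms in the observable universe: about 10^80")
```

That agrees, give or take the rounding, with the 10^8000000 figure above.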
I think that there is no way we could pre-generate all the screens in our life.
Let me formulate another question: starting from a number this big, what can we do to reduce it? How can we aggregate similar pictures in order to reduce the complexity?
Another nice question is: what kind of data structure do we need to store all this information? Suppose we reduce the number of similar images to 10^10. What kind of structure can handle so many different kinds of pictures in an efficient way?
So, given some extra information about the scenes you could generate, you might be able to pick out the scenes that no one has ever seen.
If you could take all the pictures out on the internet, along with statistics about what was popular or viewed a lot, and then compute all possible screens, you could pull out the ones that were not viewed much.
With some basic rules about the complexity of the image you might be able to come up with images that have not been seen before. Think: 80% flesh tones coupled with enough variance might render people naked. :-)
Of course, the computation of such an idea is vastly outside our potential. (2^32)^(1024*768) is in the super-exponential range, which is outside the bounds of reality. I tried to compute it in Ruby, and it just died. It would have been fun if it had actually worked. :-)
