I was wondering what the best practice is for handling duplication of code between nodes.
For example:
I grouped two packages together under a metapackage called perception, namely object detection and localization.
The object detection package has a node -> ballDetection
The localization package has a node -> linesDetection
As you can probably guess, there is a lot of code duplication between these nodes. For example, if I want to display images to the screen in both nodes, I have the exact same code in each.
Should I create a library that both nodes can use to display images to the screen? Or should I use nodes for everything? What is the best practice for situations like this?
The best practice would be to create a shared library containing the base code. You then just use the functions/classes from this library in your nodes.
In your CMakeLists.txt you can define a library in the following way:
## LIBRARIES: libraries you create in this project that dependent projects also need
catkin_package(
  INCLUDE_DIRS include
  LIBRARIES <library name>
)
...
## Declare a C++ library
add_library(<library name>
  src/${PROJECT_NAME}/<cpp file of your library>
  < ... additional files ... >
)
Of course, don't forget that you need to link dependencies against the library:
target_link_libraries(${PROJECT_NAME}
  ${catkin_LIBRARIES}
)
I also suggest you read through the CMakeLists.txt documentation, which has all the instructions on declaring a library.
As for the structure, it doesn't really matter how you package it. In your specific case I would suggest having one catkin package with the library and the two nodes (you can have multiple nodes in a package). If you think you need to have the library in a separate package, go ahead. You then only need to tell catkin to include that package as a dependency of the package where you want to use it. You can do this by writing
find_package(catkin REQUIRED COMPONENTS
  ...
  <name of the package with the library>
  ...
)
Then you just include the headers from the package in your node (or whatever) files (C++). If your library is NOT header-only, then don't forget to link the library against your node with
target_link_libraries(<node name>
  ${catkin_LIBRARIES}
  <library name>
  ...
)
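To make this concrete, here is a minimal sketch of what the shared code could look like. The perception_common package name and the displayImage helper are hypothetical, and it assumes you display images with OpenCV's cv::imshow:

// include/perception_common/image_display.h (hypothetical shared header)
#ifndef PERCEPTION_COMMON_IMAGE_DISPLAY_H
#define PERCEPTION_COMMON_IMAGE_DISPLAY_H

#include <string>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

namespace perception_common
{
// Shows an image in a named window; both ballDetection and linesDetection
// call this instead of duplicating the display code.
inline void displayImage(const std::string& window_name, const cv::Mat& image)
{
  cv::imshow(window_name, image);
  cv::waitKey(1);  // give the HighGUI event loop a chance to draw
}
}  // namespace perception_common

#endif  // PERCEPTION_COMMON_IMAGE_DISPLAY_H

Each node then just calls perception_common::displayImage("ball_detection", frame) (or similar) wherever it previously had its own display code.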
I also suggest reading more about this topic on the ROS wiki.
Related
I currently have some C++ libraries that I would like to implement and build in ROS using catkin workspaces. My method so far has been to let each library be its own package, but I've recently run into a problem of circular dependencies between the packages. My approach to fixing this has been to implement the libraries within a single package, but I would still like to keep the libraries separated, so I wonder whether it is possible to structure the include folder of a ROS C++ package with subfolders.
The idea would look something like this:
--catkin_ws
  --src
    --my_package
      --include
        --library_1
          someheaderfile.h
        --library_2
          someotherheaderfile.h
        ..
      --src
        --library_1
          somecppfile.cpp
        --library_2
          someothercppfile.cpp
      CMakeLists.txt
      package.xml
I guess my main concern is breaking the catkin structure needed for proper compilation.
Thanks!
Solved:
As long as you properly structure your CMakeLists.txt with respect to the guide presented here: http://docs.ros.org/melodic/api/catkin/html/howto/format1/building_libraries.html, and your subdirectories lie within include/my_package/, everything works fine.
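For illustration, once the subdirectories are moved under include/my_package/, consumers (and the package's own sources) include the headers with package-qualified paths; the file names are the hypothetical ones from the tree above:

// In somecppfile.cpp, or in any node that depends on my_package:
#include <my_package/library_1/someheaderfile.h>
#include <my_package/library_2/someotherheaderfile.h>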
I have a solution which contains more than one project. The project structure is as below:
/root
  A.sln
  A.nuspec
  /ProjectB
    projectB.csproj
    projectB.nuspec
  /ProjectC
    projectC.csproj
    projectC.nuspec
I have a few questions.
1- What happens if I run nuget pack A.nuspec in the root folder? Is there a way to package all projects in a solution?
2- When I send the code to TFS, with "NuGet Packager" I can use a regex to package all sub-projects. Is there a way to use such a regex in a local environment?
3- Is it possible to create a nupkg containing both sub-projects?
4- Is it possible to create a nupkg containing more than one DLL? Can I put all dependencies of the project into the nupkg?
1- What happens if I run nuget pack A.nuspec in the root folder? Is there a way to package all projects in a solution?
I don't believe there is a way to package all projects in a solution by using a .nuspec. That is because the process of creating a package always begins with creating a .nuspec package manifest file that describes your package contents. This manifest drives the creation of the package as well as its usage when installed into a project, not a solution. And the ID and version must be unique in the .nuspec, so we could not use a single ID and version to specify multiple projects.
2- When I send the code to TFS, with "NuGet Packager" I can use a regex to package all sub-projects. Is there a way to use such a regex in a local environment?
The answer is no. I have tested it on my local machine; the wildcard is treated as illegal characters on the command line.
3- Is it possible to create a nupkg containing both sub-projects?
If I understand you correctly, you mean the sub-projects are referenced projects? If yes, the answer is yes. You can use the option "IncludeReferencedProjects" to include referenced projects either as dependencies or as part of the package. Please refer to this document for details.
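For example, a sketch using the project from the tree above:

nuget pack ProjectB/projectB.csproj -IncludeReferencedProjects

Note that this option applies when packing a .csproj, not when packing a .nuspec directly.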
4- Is it possible to create a nupkg containing more than one DLL? Can I put all dependencies of the project into the nupkg?
Of course you can. You can refer to another thread which I have answered for more detail. If you want to put all dependencies of the project into the nupkg, you can use the <dependencies> element to list them in the .nuspec:
<dependencies>
  <dependency id="Newtonsoft.Json" version="9.0" />
</dependencies>
I'm trying to build and package LCM with Bazel. This works for the "build" part, but the end result is a library not usable by external consumers (i.e. "package" fails, because the package is broken).
LCM uses glib, which I am importing with pkg_config_package (gory details). More particularly, LCM uses glib internally, but does not expose this to users. This means that consumers should not need to link glib; liblcm.so should do that, and consumers should only need to link to LCM itself.
This all works great with upstream (which uses CMake and Does The Right Thing). Bazel, however, seems not to be linking liblcm.so to glib, for some unknown reason. If I build an executable with Bazel within the same overall environment, Bazel seems to know that users of LCM also need to link to glib. However, when I try to package this LCM for external use, it is broken, because liblcm.so isn't linked to glib, which forces consumers to deal with LCM's private glib dependency.
Why is Bazel not linking the LCM library to glib, and how do I fix it?
(p.s. We have similar issues with libbot...)
Apparently, this is a known issue: https://github.com/bazelbuild/bazel/issues/492.
I can't just make the cc_library a cc_binary, either, because while that would fix the under-linking, I then couldn't use the library in other Bazel targets. Nor can I make a cc_binary that wraps a cc_library, because then internal and external consumers wouldn't be using the same library.
Static libraries do not link with other static libraries. When building through Bazel, Bazel keeps track of the dependencies and will link against all dependent libraries when building the executable.
There's more information here about linking static libraries:
Linking static libraries to other static libraries
One interesting suggestion brought up is unarchiving both libraries and then creating a new library with all the .o files. This could possibly be achieved in a genrule.
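A rough sketch of that idea in a BUILD file (the target and archive names are hypothetical, the ar invocation may need adapting to your toolchain, and watch out for object files with the same name in both archives):

# Merge two static archives into one by extracting and re-archiving the .o files.
genrule(
    name = "liblcm_merged",
    srcs = ["liblcm.a", "libglib_deps.a"],  # hypothetical input archives
    outs = ["liblcm_merged.a"],
    cmd = "ar x $(location liblcm.a) && ar x $(location libglib_deps.a) && ar rcs $@ *.o",
)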
I'm writing a post-build tool that needs the locations of a list of targets' jar files.
For these locations I have an aspect that runs on a list of targets (separately for each target using --aspects) and fetches the jar file path for each of them.
I've managed to get each jar file path in a custom output file (e.g. jar.txt) in each target's output folder.
But this means I would need to go over each jar.txt file separately to get the locations.
Is there a way to accumulate the jar file paths in a single file?
Something like:
1. Try to write to the same output folder with an append command in the aspect. I'm not sure if a shared output folder is possible.
2. Create a synthetic target which depends on all the relevant targets, then run an aspect on this target and accumulate the jars, only writing them at the root once the recursion is back.
Are 1. or 2. valid options?
What is the recommended strategy to accumulate data in bazel aspects output files?
Bazel doesn't provide facilities in Skylark for accumulating information between targets that are not related to each other in the target graph (e.g. ones that are mentioned on the command line next to each other).
One possibility would be to write a Skylark rule that depends on all the targets you usually mention on the command line and build that one; that rule will be able to collate the classpaths from each Java target into a single file.
Another possibility is to tell Bazel to write build events (which include all the outputs of all targets the specified build pattern expands to) to a file using the --experimental_build_event_{json,text,binary}_file flags. (The "experimental" prefix will be removed soon.) The files contain instances of this message:
https://github.com/bazelbuild/bazel/blob/master/src/main/java/com/google/devtools/build/lib/buildeventstream/proto/build_event_stream.proto
Natan,
If I understand correctly, you want to transitively propagate the information from each aspect node out into a single result. To do this, build the transitive set in your aspect rule implementation and pass it via the "provider" mechanism [^1]. I wrote up some examples on bazel aspects; perhaps you'll find them useful [^2].
https://github.com/pcj/bazel_aspects/blob/master/aspects.bzl#L94-L104
https://github.com/pcj/bazel_aspects
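As a rough sketch of how option 2 and the provider mechanism fit together (all names here are made up, and it assumes the modern Starlark API where each Java target carries JavaInfo):

# jar_paths.bzl: aspect that accumulates jar paths transitively (a sketch).
JarPathsInfo = provider(fields = {"jars": "depset of jar path strings"})

def _jar_paths_aspect_impl(target, ctx):
    # Collect this target's own jars, if it is a Java target.
    direct = []
    if JavaInfo in target:
        direct = [jar.path for jar in target[JavaInfo].runtime_output_jars]
    # Merge in whatever the aspect already collected on the deps.
    transitive = [
        dep[JarPathsInfo].jars
        for dep in getattr(ctx.rule.attr, "deps", [])
        if JarPathsInfo in dep
    ]
    return [JarPathsInfo(jars = depset(direct, transitive = transitive))]

jar_paths_aspect = aspect(
    implementation = _jar_paths_aspect_impl,
    attr_aspects = ["deps"],
)

def _jar_paths_impl(ctx):
    # The "synthetic target": depends on all relevant targets and writes
    # every accumulated jar path into one file.
    jars = depset(transitive = [dep[JarPathsInfo].jars for dep in ctx.attr.deps])
    out = ctx.actions.declare_file(ctx.label.name + ".txt")
    ctx.actions.write(out, "\n".join(jars.to_list()))
    return [DefaultInfo(files = depset([out]))]

jar_paths = rule(
    implementation = _jar_paths_impl,
    attrs = {"deps": attr.label_list(aspects = [jar_paths_aspect])},
)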
I am trying to integrate Twitter in my application. I import two .jar files with different names, but one package has the same name in both files. When I compile, it shows the following error:
Description Resource Path Location Type
D:\CustomClasses\ksoap2-j2me-core-prev-2.1.2.jar(org/kxml2/io/KXmlParser.class): Error!: Duplicate definition for 'org.kxml2.io.KXmlParser' found in: org.kxml2.io.KXmlParser
Assuming the two JARs are third party (not platform libraries), you should consider a more sophisticated compilation and packaging step. But before going down this path, check whether the JARs you are importing come in other forms, i.e. ones that don't embed their dependencies.
Either way, have a step in your compilation to extract just the parts that you need from each JAR.
If you are not using build scripts but use an IDE for everything, set up a build script just to build your customized dependencies JAR.
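A rough sketch of such a step with the JDK's jar tool (the second JAR's name is hypothetical, and you keep whichever copy of org.kxml2 you prefer):

mkdir merged && cd merged
jar xf ../ksoap2-j2me-core-prev-2.1.2.jar
rm -r org/kxml2
jar xf ../twitter-lib.jar
jar cf ../custom-deps.jar .

Your build then compiles against custom-deps.jar instead of the two overlapping JARs.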