ROS services are not building properly

I am trying to build my custom ROS services. They are inside another parent package; the structure is as follows:
|--catkin_ws
| |--src
| | |--Parent
| | | |--CMakeLists.txt
| | | |--package.xml
| | | |--ChildA
| | | | |--CMakeLists.txt
| | | | |--package.xml
| | | | |--srv
| | | | | |--SomeService.srv
| | | |--ChildB
The packages are building correctly and I am able to use them in other nodes and packages.
However, when I run rossrv list, the custom services do not appear. I think this is causing issues when I try to build my Simulink controller, because it cannot find the service message definition.
Does anyone have any idea what is going on?

I was able to fix the problem. While not obvious, the solution was rather simple: I had to slightly change the structure of the package by making the parent package a metapackage, and then do some handling to make sure that the sub-packages still had access to the CMake files used to locate my external packages.
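For reference, here is a minimal sketch of what that reorganisation can look like. The package names (parent, child_a) and the std_msgs dependency are placeholders, not taken from the original setup; only SomeService.srv comes from the tree above. The parent becomes a catkin metapackage:
cmake_minimum_required(VERSION 3.0.2)
project(parent)
# A metapackage has no build logic of its own; it only groups the child packages.
find_package(catkin REQUIRED)
catkin_metapackage()
Its package.xml also needs the <metapackage/> tag inside an <export> block. Each child package that defines .srv files then declares the usual service generation, for example in ChildA:
cmake_minimum_required(VERSION 3.0.2)
project(child_a)
find_package(catkin REQUIRED COMPONENTS message_generation std_msgs)
# Generate code for the custom service so tools like rossrv can find it.
add_service_files(FILES SomeService.srv)
generate_messages(DEPENDENCIES std_msgs)
catkin_package(CATKIN_DEPENDS message_runtime std_msgs)
The extra handling for locating external packages from the sub-packages is not shown here, since it depends on how those packages are found in the first place.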

Related

Include path has been specified but the header in that path still fails to be included in a Bazel C++ project

I have a project with a directory structure like this:
---root
   |--src
   |  |--project1
   |  |  |--model
   |  |  |  |--include
   |  |  |  |  |--model
   |  |  |  |  |  |--modelA.hpp
   |  |  |  |  |  |--modelB.hpp
   |  |  |  |--modelA.cpp
   |  |  |  |--modelB.cpp
   |  |  |  |--BUILD  #1
   |  |  |--...
   |  |  |--view
   |  |  |--...
   |  |  |--common
   |  |  |  |--include
   |  |  |  |  |--common
   |  |  |  |  |  |--data_type.hpp
   |  |  |  |--BUILD  #2
   |--WORKSPACE
As I have other packages in this project and some of them use the same self-defined data types, I defined those types in a package named common.
Now I include data_type.hpp in modelA.hpp:
...
#include "common/data_type.hpp"
...
Referring to the stage 3 example in the tutorial, BUILD #1 looks like this:
cc_library(
    name = "modelA",
    hdrs = ["include/model/modelA.hpp"],
    deps = ["//src/project/common:data_type"],
    copts = ["-Isrc/project/common/include"],
)
and BUILD #2, which defines the dependency module data_type, looks like this:
cc_library(
    name = "data_type",
    hdrs = ["include/common/data_type.hpp"],
    visibility = ["//visibility:public"],
)
However, when I built the code, I got:
src/project/model/include/model/modelA.hpp: fatal error: common/data_type.hpp: No such file or directory
Why do I still get this error even though I specified copts = ["-Isrc/project/common/include"]?
Please check the Header inclusion checking section of the C/C++ Rules page in the Bazel documentation: all include paths should be specified relative to the workspace directory. Refer to the related issue for more information.
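One way to make this layout work without passing -I flags by hand (a sketch, assuming the headers stay where they are) is to let the common package export its include directory through the includes attribute, which Bazel resolves relative to the package and propagates to every dependent target:
# src/project/common/BUILD
cc_library(
    name = "data_type",
    hdrs = ["include/common/data_type.hpp"],
    # Resolved relative to this package (src/project/common/include) and
    # added to the include path of everything that depends on :data_type.
    includes = ["include"],
    visibility = ["//visibility:public"],
)

# src/project/model/BUILD (srcs added here for completeness)
cc_library(
    name = "modelA",
    srcs = ["modelA.cpp"],
    hdrs = ["include/model/modelA.hpp"],
    includes = ["include"],
    deps = ["//src/project/common:data_type"],
)
With this, #include "common/data_type.hpp" resolves through the dependency and the copts line can be dropped.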

Upgrading from Grails 2.2 to 4.0: Is there a cross-reference between the old org.codehaus classes and the new grails classes?

We are upgrading an application from Grails 2.2.0 to Grails 4. Obviously the biggest step is Grails 2 to Grails 3, which requires a lot of code changes.
Several pages in the Grails documentation (e.g. http://docs.grails.org/3.1.7/guide/upgrading.html) state:
All package declaration in sources should be modified for the new
location of the respective classes. Example
org.codehaus.groovy.grails.commons.GrailsApplication is now
grails.core.GrailsApplication.
We have a large set of codehaus classes. This comment sounds like the (old) codehaus classes have been moved to multiple packages (presumably all under grails.*?).
Is there a cross-reference, listing which grails package each codehaus class has been moved to?
NOTE: I did most of the work without taking notes, but here are the cross-references I did record, in case they help.
| From | To |
| --- | --- |
| org.codehaus.groovy.grails.orm.hibernate.HibernateSession | org.grails.orm.hibernate.HibernateSession |
| org.codehaus.groovy.grails.web.json.JSONObject | org.grails.web.json.JSONObject |
| org.codehaus.groovy.grails.commons.GrailsDomainClass | grails.core.GrailsDomainClass |
| org.codehaus.groovy.grails.plugins.springsecurity.GrailsUser | grails.plugin.springsecurity.userdetails.GrailsUser |
| org.codehaus.groovy.grails.web.json.JSONArray | org.grails.web.json.JSONArray |
| org.codehaus.groovy.grails.plugins.springsecurity.GormUserDetailsService | grails.plugin.springsecurity.userdetails.GormUserDetailsService |
| org.springframework.security.core.authority.GrantedAuthorityImpl | org.springframework.security.core.authority.SimpleGrantedAuthority |
Is there a cross-reference, listing which grails package each codehaus class has been moved to?
There is not, in part because there is no one-to-one correspondence between the old codehaus packages and the new grails packages.
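In practice the migration mostly comes down to updating imports class by class as you find them. A hypothetical before/after snippet, using two mappings from above (GrailsApplication from the quoted documentation, JSONObject from the table):
// Grails 2.2
import org.codehaus.groovy.grails.commons.GrailsApplication
import org.codehaus.groovy.grails.web.json.JSONObject

// Grails 3/4 equivalents
import grails.core.GrailsApplication
import org.grails.web.json.JSONObject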

Best practice for TFS hierarchy

In my company there are several divisions and many projects done in different versions of Visual Studio, so I have thought of creating the TFS structure below under a single project collection:
My_Project_Collection
|
|___ Division_1
| |
| |__ VS2010
| |__ VS2012
| |__ VS2013
| |
| |__ Team_Project_1
| | |__ Main
| | |__ Dev
| | |__ Release
| |
| |__ Team_Project_2
| | |__ Main
| | |__ Dev
| | |__ Release
|
|___ Division_2
|
|___ Division_N
My question is: is it worth classifying team projects by Visual Studio version (VS2010, VS2012, VS2013, and so on), or is it not necessary?
Division_2, ... Division_N have the same structure as Division_1
I don't think you want to subdivide it like that, because subdividing by Visual Studio version doesn't really reflect reality (I would think you'll eventually want to move those projects to newer Visual Studio versions). Unless you think they'll stay that way in perpetuity and the distinction provides a useful advantage, dividing them like that will just be more work. Consider using area paths for some of the concepts in your hierarchy.
E.g.
My_Project_Collection
|
|___ Division_1
| |
| |
| |__ Team_Project_1 - AP VS2013
| | |__ Main
| | |__ Dev
| | |__ Release
| |
| |__ Team_Project_2 - AP VS2010
| | |__ Main
| | |__ Dev
| | |__ Release
|
|___ Division_2
|
|___ Division_N
See this similar SO question: Should I create a single VSTS team project or multiple? (which, from what I can gather, you're already doing by having one project collection).
Also check out the blog post linked in that answer: Why You Should use a Single (Giant) TFS Team Project. The same benefits and disadvantages should apply; really, just think about where you want your boundaries and have your structure reflect that.

Does Release Management 2013 roll back across tags?

We're heavy users of tags, and I'm confused about how tags and rollbacks interact.
I understand that rollbacks cascade (at least within a sequence) from this article:
http://incyclesoftware.com/2014/03/understanding-rollbacks-release-management/
But I'm not clear on how this works when you use tags. We tag servers by what features are installed on them (web, database, service) and vary the mix of features depending on the environment (e.g. DEV might have web & services running on the same machine, but UAT & PROD would have separate machines).
So does the rollback go back across the tag boundaries? If, for example, your sequence looked like this:
+--Database tag---+
| Backup DB       |
|      |          |
| Update DB       |  <- Runs against SQL server
|      |          |
| +--Rollback---+ |
| | Restore DB  | |
| +-------------+ |
+-----------------+
         |
+--Web tag--------+
| Do Stuff        |  <- Runs against WEB server
+-----------------+
         |
+--Service tag-----+
| Backup           |
|      |           |
| Install new ver  |  <- Runs against Service server
|      |           |
| Smoke test       |
|      |           |
| +--Rollback----+ |
| | Replace with | |
| | backup       | |
| +--------------+ |
+------------------+
Would a rollback inside the service tag cause the database tag to execute its rollback? Do rollbacks cascade across sequences?
I haven't had time to set this up and test it yet, so I thought I'd ask the question instead.
By accident I've managed to test this out with a suitable release, and rollback does cross the tags, as @joerage says.
It appears I was wrong... faulty memory and all that. Rollbacks work across tag boundaries.
I generally recommend against using rollback blocks, since their behavior is generally backwards, unpredictable, and not immediately obvious. The current best practice is actually to not use agent-based releases at all, as they will not be portable to the forthcoming Release Management Service.

How to get MultiMarkdown to display tables in Sublime Text 2 on OS X

I am trying to avoid using inline HTML to get tables working in my MD file. I have Markdown Preview and Table Editor installed via the package installer, and multimarkdown installed via homebrew, but I can't get the following text to display as a table:
| Left align adsf | Right align | Center align |
| :--------------- | ----------: | :----------: |
| This | This | This |
| column | column | column |
| will | will | will |
| be | be | be |
| left | right | center |
| aligned | aligned | aligned |
When I "Markdown Preview" it just displays like this:
| | | Left align adsf | Right align | Center align | | --- | --- | ---------------- | ----------- | ------------ | | | | ---------------s | ----------- | ------------ | | --- | --- | :--------------- | ----------: | :----------: | | | | This | This | This | | | | column | column | column | | | | will | will | will | | | | be | be | be | | | | left | right | center | | | | aligned | aligned | aligned |
I have switched the file type to MultiMarkdown (lower right portion of the ST2 screen).
I have searched, and it appears some people use a build system or other approaches that I have been unable to get working. What am I missing? If a build system is needed, how do I set one up? I am mainly interested in viewing this as HTML, but wouldn't be opposed to other ways.
If you switch the parser to github, it'll work just fine.
Go to Preferences > Package Settings > Markdown Preview > Settings - User and paste this code:
{
    "parser": "github"
}
"If a build system is needed, how do I set up one?"
On OS X I would strongly suggest getting the excellent Marked.app and then setting up a new build system in ST containing this trivial code:
{
    "osx": {"cmd": ["open", "-a", "Marked", "$file"]},
    "selector": "text.html.markdown"
}
Then when you 'build' a markdown file (Cmd+B) you will get a preview generated in Marked.
Easy and elegant and well worth the cup-of-coffee price of Marked to avoid all the hassle of the plugin approach.
