gcovr with Cobertura XML output gives absolute paths of production files - Jenkins

I have managed to show code coverage data with Cobertura XML in Jenkins.
It works well, but I want to see the source code in Jenkins, too.
I noticed by chance that Jenkins looks for the source files on my Windows hard drive, but the files are on the Jenkins server (a different PC). This probably happens because in the Cobertura output XML, the "filename" attribute holds an absolute path rather than a path relative to a directory. The XML attribute looks like filename="C:/Users/PathToMySources".
How can I change the filenames in the Cobertura XML to be relative?
gcovr.exe is invoked like this:
gcovr.exe --filter ../MySources/ --branches --html-details Test.html --cobertura Test.xml
My project looks like this:
MyProject
|
|---Library A (has subfolders)
|
|---Library B (has subfolders)
|
|---TestFolder for Library A and B (has subfolders)
The layout is like this because Library A, Library B, and the test folder are SVN externals, for sharing the libraries with everyone in the company.
With gcovr I pointed, for example, at Library B. Library B has subfolders, and some subfolders contain only headers because of C++ template classes.
Now I saw another issue.
With --filter, the header-only classes are included in the coverage output; with --root they are not!

In this instance, you can fix things by changing --filter to --root.
Gcovr has the concept of a "root" directory, which defaults to the current working directory. The root is used for various purposes, such as serving as a default filter, but also for deciding how file names are presented in reports:
If a file path is within the root directory, a path relative to the root directory is shown.
Otherwise, an absolute path is shown.
It is unusual to see an absolute path in a gcovr report, unless you're trying to include coverage for a library outside of your project directory, or you're using the --html-absolute-paths option.
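For the invocation in the question, that means swapping the one flag (everything else is unchanged):

gcovr.exe --root ../MySources/ --branches --html-details Test.html --cobertura Test.xml

The filename attributes in Test.xml should then come out relative to MySources, e.g. filename="LibraryB/SomeSubfolder/File.h" (that exact path is only illustrative).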

Related

Copy explicit directory and files/subdirs

I am using the Copy to Slave plugin for Jenkins, and I want to copy some artifacts back to the master node's workspace. I was following the information in Ant -- copying files and subdirectories from only one subdirectory on a tree. The problem is that it's a Node application, so there are a lot of nested directories, some with the same name, and by using the Ant fileset nomenclature I am getting ALL subdirectories on the slave that match the pattern back on my master node.
So, to give a more detailed picture:
workspace
\--dir_a
\--dir_b
|  \--dir_c
|     \--dir_d
|     \--dir_e
|        \-- file_1
|        \-- file_2
\--dir_f
   \--dir_b
      \-- file_3
You'll notice that dir_b is listed twice: directly in the workspace, but also again at workspace/dir_f/dir_b. What I want is only the one specific directory, all its files and its subdirs, recursively:
workspace/dir_b
So, if I use Copy files back to the job's workspace on the master node, and specify:
**/dir_b/**
I get both:
workspace/dir_b/ and all its files and subdirs
workspace/dir_f/dir_b and all its files and subdirs
So, how to write the fileset to only get the one specific subdir and its contents?
I also tried using
*/dir_b/**
But got nothing back.
And trying this seemed closer:
dir_b/**
but it only got me dir_b and its files, not the subdirectories or their files (though this didn't make sense to me). Turns out this did work: I had expected the subdirs/files to have a newer timestamp on the re-run, but they didn't. So I removed the dir/files, re-ran, and it recreated the subdirs/files.
Help!?
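For reference, Copy to Slave interprets these as Ant-style include patterns, so the behaviour matches what a plain Ant fileset would do. A roughly equivalent sketch (the ${workspace} property and the copy task around it are just illustration):

<copy todir="master_backup">
  <!-- "dir_b/**" is anchored at the fileset root, so it matches only
       workspace/dir_b, while "**/dir_b/**" also matches
       workspace/dir_f/dir_b -->
  <fileset dir="${workspace}" includes="dir_b/**"/>
</copy>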

How can I tell which bazel aspect outputs are still relevant

As part of our efforts to create a bazel-maven transition interop tool (one that creates maven-sized jars from the more granular bazel jars),
we have written an aspect that runs on a bazel build of the entire bazel repo and writes important information to txt file outputs (e.g. jar file paths, compile deps targets, runtime deps targets, etc.).
We ran across an issue where the repo's code was changed such that some of the txt files were not written anymore, but the old txt files from previous runs (before the code change) remained!
Is there a way to know that these txt files are no longer relevant?
You should be able to run with --build_event_json_file=file.json and then locate the generated artifacts in that file. For example, we use it on ci.bazel.io to locate the actual test.xml files that were generated: https://github.com/bazelbuild/continuous-integration/blob/09975cbb487a84a62ca1e43aa43e7c6fe078f058/jenkins/lib/src/build/bazel/ci/BazelUtils.groovy#L218
The definition of the protocol can be found in build_event_stream.proto
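The JSON file contains one build event per line, and generated artifacts are reported in namedSetOfFiles events. A rough sketch of extracting them with jq (the field names follow build_event_stream.proto, but verify them against your bazel version):

jq -r 'select(.namedSetOfFiles) | .namedSetOfFiles.files[]?.uri' file.json

Any txt file on disk that never shows up in this list was left over from an earlier run and can be treated as stale.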

Building a non-uberjar Docker image with leiningen

I have a Clojure project that depends on a Java library that does not work when it gets included in an uberjar. (It needs different XML descriptors using the same filename in different JAR files.)
Everything I find on using Docker with leiningen depends on building and packaging an uberjar. That's also how I have built all my Clojure Docker images so far.
Is there any leiningen plugin out there that knows how to package a Docker image using several jar files, like io.fabric8/docker-maven-plugin does?
Whenever you package (uberjar, war), the big file that is created contains .class files and a directory structure. Where are these XML files supposed to be (class)loaded from? You can experiment with packing manually; after all, it (whether uberjar, war, or jar) is just a zip file.
When you know exactly the layout you need, SBT is flexible enough to ensure you can package it from the many input jar files. Unfortunately, lein plugins will do things like always overwriting duplicates, and you can't control the packaging behaviour. I can't remember the exact inflexibilities, but I couldn't control how the packaging process went or what decisions were made.
For doing it manually I use a Linux tool called Archive Manager, which I found to be much better than what I used on Windows. Doing it manually may be all you need. The downside of SBT, of course, is that you have to learn it, which includes a bit of Scala.
It needs different XML descriptors using the same filename in different JAR files.
Just thinking about this: is it that you need to append the contents of each file that is in a different jar into the one file that is in the uberjar? You can try it out. If it works, and you need to package up often enough that manually creating and renaming a zip file every time becomes a pain, then I believe SBT will be your best bet.
I have to package my container with the original jar file and then reference this jar in the classpath when starting the application
The classloader loads classes rather than jars. It is the container's job to unpack all the things you give it, such as .class files, (uber)jars, and wars. Any program that dynamically loads from the classpath is loading either classes or resources (things like .xml files). I suppose a .jar file could itself be a resource, in which case you would put the jar file in the uberjar. So it is still possible to package it up.
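If avoiding the uberjar entirely is acceptable, another way is to build a plain jar and keep the dependency jars separate inside the image, wiring the classpath yourself. A minimal sketch, assuming the app jar comes from lein jar, the dependency jars were copied into ./lib beforehand (e.g. with a plugin like lein-libdir), and the main namespace is myapp.core (all of these names are hypothetical):

FROM openjdk:8-jre
# keep the jars separate so each one retains its own XML descriptor
COPY target/myapp.jar /app/myapp.jar
COPY lib/ /app/lib/
# the JVM wildcard classpath picks up every jar in /app/lib
CMD ["java", "-cp", "/app/myapp.jar:/app/lib/*", "clojure.main", "-m", "myapp.core"]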

Using the Robot Framework plugin, how to copy files outside the log folder into Jenkins?

I am running Robot Framework tests through Jenkins, and the tests use a custom Python library to take screenshots, save them into a specific folder (that is not the log folder), and embed them into the log. In the Jenkins job, I have a post-build action set to publish the Robot results, and I can get Jenkins to copy image files that are generated within the log folder, but I can't figure out how to copy image files that are generated outside the log folder.
The project's %WORKSPACE% is d:\git\product\registration
The directory of Robot output is \log\patient_search (the log directory is generated inside the registration directory)
The directory where the other images are generated is d:\git\product\registration\verify\images
If I put *.png into the "Other files to copy" box, Jenkins will copy any images within the log\patient_search directory and they embed correctly into the log. What I have in that box now is *.png,%WORKSPACE%\verify\images\**, and I have tried using backslashes instead, using a relative path (..\..\verify\images\**\*.png), and various combinations of asterisks and slashes. Those images always show as broken links in the log.
Are these paths wrong, or can this just not be done for files outside of the log folder?
I believe the Robot Framework plugin's "Other files to copy" option works only inside the "Directory of Robot output". A workaround would be to add a build step where you execute a shell (or batch) command and copy the files into the Robot output directory.
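For example, as an "Execute Windows batch command" build step that runs after the tests but before the results are published (paths from the question; the destination subfolder is an assumption and has to match the relative paths embedded in the log):

rem copy the screenshots into the Robot output directory
xcopy /S /I /Y "%WORKSPACE%\verify\images" "%WORKSPACE%\log\patient_search\verify\images"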

Publishing GulpJs minified/concatenated javascript via MSDeploy and Teamcity

Apologies if this question has been asked before. I have found variants on this theme but nothing that seems to fit our particular configuration.
We have developed a custom GulpJS task which parses a .json file located inside our assets/javascript folder. This json file contains an array of relative paths to javascript files (both our own and libraries) in a specific order for minification. They are then minified, concatenated, and output to the folder assets/javascript/build. The javascript source files are in the project, but the minified and concatenated versions of the scripts, in fact the entire build folder itself, are not included in the Visual Studio project.
Ideally, I would like to have a step in the MSDeploy configuration which would copy all the files in the javascript build folder to the destination. Otherwise, I could potentially add another step in TeamCity to do so.
Has anyone successfully set up a similar build configuration and could share some insight? I tried using the MSBuild Copy task, but that didn't seem to copy the files to the output location. One option I am considering is including the minified scripts in the project file, but this might trip up other developers who don't have Gulp running in their development environments (hilarious as that might be).
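One approach that may work here (a sketch, untested against this exact setup) is to hook the Web Publishing Pipeline so the build folder is added to the MSDeploy package even though it is not part of the project. The target name below is made up; FilesForPackagingFromProject and CopyAllFilesToSingleFolderForPackageDependsOn are the standard WPP extension points; the paths are from the question:

<!-- in the .csproj, or in a ProjectName.wpp.targets file next to it -->
<Target Name="IncludeGulpBuildOutput">
  <ItemGroup>
    <!-- pick up everything Gulp wrote to the build folder -->
    <GulpFiles Include="assets\javascript\build\**\*" />
    <FilesForPackagingFromProject Include="@(GulpFiles)">
      <DestinationRelativePath>assets\javascript\build\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>
<PropertyGroup>
  <CopyAllFilesToSingleFolderForPackageDependsOn>
    IncludeGulpBuildOutput;
    $(CopyAllFilesToSingleFolderForPackageDependsOn);
  </CopyAllFilesToSingleFolderForPackageDependsOn>
</PropertyGroup>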
