I want to exclude all files that are underneath a target folder from translations and scans. I have tried to use the -exclude parameter but it doesn't seem to work.
[warning]: No files were excluded as the file patterns: [**/target/**/*] specified for -exclude option did not match any files.
HP Fortify Static Code Analyzer 6.42.0006 (using JVM 1.8.0_45)
Processing C:/path/ProjectName/target/test/test_fortify_exclusion2.jsp
Processing C:/path/ProjectName/target/test_fortify_exclusion.jsp
I have also tried **target** and **/target/**.
It turned out I needed to specify the root path, preferably the path to the project. So in this example, it would have been: -exclude "C:/path/ProjectName/**/target/**/*"
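For reference, a full translation command using that pattern could look like the following (the build ID MyBuild is a placeholder; adjust both paths to your project):

sourceanalyzer -b MyBuild -exclude "C:/path/ProjectName/**/target/**/*" "C:/path/ProjectName/**/*"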
I have managed to show code coverage data with Cobertura XML in Jenkins.
It works well, but I want to see the source code in Jenkins, too.
I noticed by chance that Jenkins looks for the source files on my Windows hard drive, but the files are on the Jenkins server (a different PC). This probably happens because, in the Cobertura output XML, the "filename" attribute contains an absolute path rather than a path relative to a directory. The XML attribute looks like filename="C:/Users/PathToMySources".
How can I change the filenames in the Cobertura XML to be relative?
gcovr.exe is invoked like this:
gcovr.exe --filter ../MySources/ --branches --html-details Test.html --cobertura Test.xml
My Project look like this:
MyProject
|
|---Library A (has subfolders)
|
|---Library B (has subfolders)
|
|---TestFolder for Library A and B (has subfolders)
This layout exists because Library A, Library B, and the test folders are svn externals, used for sharing the libraries with everyone in the company.
With gcovr I filtered, for example, on Library B. Library B has subfolders, and some subfolders contain only headers because of C++ template classes.
Now I noticed another issue.
With --filter, the header-only classes are included in the coverage output; with --root they are not included!
In this instance, you can fix things by changing --filter to --root.
Gcovr has a concept of a “root” directory which defaults to the current working directory. The root is used for various purposes like serving as a default filter, but also for deciding how file names are presented in reports.
If a file path is within the root directory, a path relative to the root directory is shown.
Otherwise, an absolute path is shown.
It is unusual to ever see an absolute path in a gcovr report, unless you're trying to include coverage for a library outside of your project directory, or if you're using the --html-absolute-paths option.
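Applied to the invocation above, the fix would look like this (assuming the same working directory as the original call):

gcovr.exe --root ../MySources/ --branches --html-details Test.html --cobertura Test.xml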
I have a workspace directory containing several coding projects, which in turn contain node_modules directories. I want to create a tar.gz of my workspace while excluding all node_modules directories, but when I try something like:
tar -cvzf projects.tar.gz projects/ --exclude="node_modules"
It does not exclude any of the node_modules subdirectories. Is there a way to do this?
Thank you for your help.
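One thing worth checking, assuming GNU tar: exclude patterns are commonly reported to take effect only for names that come after them on the command line, so placing --exclude before the directory being archived (and without a trailing slash on the pattern) is the usual fix:

tar --exclude='node_modules' -cvzf projects.tar.gz projects/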
I'm going to have multiple .gcda files in my project, one for each test case. Because the counts are merged into the .gcda files on each test case execution, I move each .gcda file to a different directory.
When calling gcov, I tried to specify .gcda (.\data\gcda\evenodd.gcda) and .gcno (.\data\gcno\evenodd.gcno) files.
I used -o data\ but it seems that gcov doesn't scan the path subdirectories.
Then, I tried to specify each file's path like this:
gcov -o data\gcno\evenodd.gcno data\gcda\evenodd.gcda evenodd.c
because I thought it would accept more than one path. Alas, the result is:
By putting the .gcda path first instead of .gcno, this is what I got:
Since the default location of these files is in the source file directory, I tried to put .gcno there and then specify .gcda path. It didn't work.
Based on those results, gcov won't accept more than one path for -o.
Apparently gcov will always look for .gcno and .gcda in pair when -o is specified. Is there a way to do this other than putting both of them in the same directory?
You should not move .gcno/.gcda files to a different folder after generation, because gcov will not be able to locate the related source files. Moreover, gcov has to be run in the exact same directory where the test executable was compiled.
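If you still need one gcov report per test case under that constraint, one workaround is to copy each saved .gcda snapshot back next to its .gcno file, run gcov, and stash the results before the next test. A minimal sketch, assuming the per-test counters were saved under data/<test-name>/ and the current directory is the build directory:

#!/bin/bash
# process one test case's counters at a time
for test_dir in data/*/; do
    test_name=$(basename "$test_dir")
    cp "$test_dir"*.gcda .            # put the counters next to the .gcno files
    gcov -o . evenodd.c               # gcov now finds .gcno and .gcda in one place
    mkdir -p "reports/$test_name"
    mv ./*.gcov "reports/$test_name"  # keep this test's annotated sources
    rm -f ./*.gcda                    # clean up before the next test case
done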
I have a multi-module project which I'm migrating from Maven to Bazel. During this migration, people will need to be able to work with both build systems.
After an mvn clean install Maven copies some of the BUILD files into the target folder.
When I later try to run bazel build //... it thinks the BUILD files under the various target folders are valid packages and fails due to some mismatch.
I've seen deleted_packages, but AFAICT it requires that I specify the list of folders to "delete", and I can't do that by hand for 200+ modules.
I'm looking for the ability to say bazel build //... --deleted_packages=**/target.
Is this supported? (my experimentation says it's not but I might be wrong). If it's not supported is there an existing hack for it?
Can you use your shell to find the list of packages to ignore?
# join the directories with commas, since --deleted_packages expects a comma-separated list
deleted=$(find . -name target -type d | paste -sd "," -)
bazel build //... --deleted_packages="$deleted"
@Laurent's answer gave me the lead, but Bazel didn't accept relative paths and required that I add both the classes and test-classes folders under target in order to delete the package, so I decided to answer with the complete solution:
#!/bin/bash
# find all the target folders under the current working dir
target_folders=$(find . -name target -type d)
# find the repo root (currently assuming it's git based)
repo_root=$(git rev-parse --show-toplevel)
repo_root_length=${#repo_root}
# the current bazel package prefix is $PWD minus the repo root, plus a leading slash
current_bazel_package="/${PWD:repo_root_length}"
deleted_packages=""
for target in $target_folders
do
    # canonicalize the package path
    full_package_path="$current_bazel_package${target:1}"
    classes_full="${full_package_path}/classes"
    test_classes_full="${full_package_path}/test-classes"
    deleted_packages="$deleted_packages,$classes_full,$test_classes_full"
done
# remove the leading comma and call bazel-real with the other args
bazel-real "$@" --deleted_packages=${deleted_packages:1}
This script was checked in under tools/bazel, which is why it calls bazel-real at the end.
I'm sorry, but I don't think this is supported. Some brainstorming:
Is it an option to point maven outputs somewhere else?
Is it an option not to use //... but explicit target(s)?
Maybe just remove the bad BUILD files before running bazel (see the sketch below)?
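For that last option, a minimal sketch, assuming the stray files are literally named BUILD and only ever appear under target folders:

# delete any BUILD files that Maven copied under a target folder
find . -path '*/target/*' -name BUILD -type f -delete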
With the sourceanalyzer, how can I provide multiple file/path exclusions during translation?
Following the example from: Fortify SCA exclude test folder\files
/src/main/xyz/pqr/Abc.java
/src/main/xyz/test/abc.xsd
/src/test/xyz/Xyz.java
I have tried adding multiple -exclude flags, as well as different delimiters, with no luck.
If you use the Scan Wizard and review the resultant .bat file, you can see how they are invoking sourceanalyzer. For your particular question, you can create an argument file like so:
-exclude "/src/main/xyz/pqr/Abc.java"
-exclude "/src/main/xyz/test/abc.xsd"
-exclude "/src/test/xyz/Xyz.java"
Name it something like Exclude.args and then invoke sourceanalyzer like so:
sourceanalyzer.exe -b MyBuild @Exclude.args
I was working from an Azure DevOps pipeline, using the Fortify translate BatchScript task. In this environment it worked to add multiple -exclude flags:
steps:
- task: BatchScript@1
  displayName: 'Fortify Translate JavaScript'
  inputs:
    filename: '$(FORTIFYSCA)\sourceanalyzer.exe'
    arguments: '-debug -verbose -b $(Build.ApplicationName) $(Build.SourcesDirectory)\**\*.js -exclude node_modules\**\* -exclude coverage\**\*'
I am using the VS 2015 add-in for Fortify scans. I sorted out the files I want to skip, created a new filter, and saved the results in a separate folder apart from Critical, High, etc.
This way the selected files were kept out of the final result.