Fortify: running a scan spanning several code repositories but generating a single report

I am working with a project that consists of several (3 or 4) Git repositories. Each repository is built with Maven.
I need to run Fortify against all the repositories, but I want a single FPR report containing the results for all the repositories, not one per repository.
Is there a recommended way to do this?
Note: there is no overarching pom.xml that builds the entire application, just individual pom.xml files for each repository.
Any tips would be appreciated.
We are using Fortify 16.11 on a Linux server.

Translate all the repositories under the same build ID (-b <build_id>).
Once they are all translated, run the -scan command against that <build_id>.
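A minimal sketch of that sequence, assuming three repositories and the touchless Maven integration (the repository names, the build ID myapp, and the output file name are placeholders; exact flags can vary by SCA version, and the SCA Maven plugin is an alternative):
# translate each repository into the same build ID
cd repo1 && sourceanalyzer -b myapp touchless mvn clean package
cd ../repo2 && sourceanalyzer -b myapp touchless mvn clean package
cd ../repo3 && sourceanalyzer -b myapp touchless mvn clean package
# one scan over everything translated under that build ID, one FPR out
sourceanalyzer -b myapp -scan -f combined.fpr
The key point is that -scan aggregates whatever has been translated under the given build ID, however many repositories contributed to it.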

Related

How to get SonarQube to read from multiple repositories?

I have SonarQube set up on Jenkins, and the project I'd like to scan now has two separate repositories; I essentially need the dependencies from both pom.xml files to successfully scan one API.
How can I add a Jenkins step to scan the pom.xml in repo A and then the pom.xml in repo B, before the build fails?
So in your case, you need repo A to be available since repo B has dependencies generated from repo A?
I guess that if you want to generate two separate scans, you can add a step in your Jenkins job to clone and then scan the repo. But it's not really clean, as you're mixing two CI/CD pipelines.
Maybe an idea would be to install the Maven modules into a shared .m2 repository, or something like a Nexus repository, so that all dependencies are shared among projects (see the sketch below).
Sorry if this is not clear.
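A rough sketch of that idea as a single Jenkins shell build step (the repository names and Maven goals are placeholders): install repo A's modules into the shared local repository first, so repo B can resolve them when it is analyzed.
# make repo A's artifacts resolvable from the shared local .m2
cd repoA && mvn -DskipTests clean install
# then build repo B and run the SonarQube analysis
cd ../repoB && mvn clean verify sonar:sonar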

Repository manager that manages binary dll files (Embedded C/C++ project artifacts) and that integrates with Jenkins

Is there any repository manager that manages binary DLL files and also integrates well with Jenkins?
Can Nexus be used to manage DLL files? These files are produced by embedded C/C++ projects, and I am not sure whether Nexus supports and integrates well with such projects, since it mainly targets Java projects.
Is there a way to automatically manage the upload and download of such project artifacts from Nexus or other artifact managers without the use of a POM file?
Please also suggest other artifact managers that support binary artifacts.
Artifactory can be used to store any type of binaries.
Starting with Artifactory 4.0, you can create generic repositories, which allow uploading packages of any type. You will not need to upload any POM files, and Artifactory will not need to calculate any metadata (for example, Maven metadata).
To deploy files you can use the REST API or the UI, for example:
curl -uUSER:PASS -T file.dll http://localhost:8081/artifactory/dll-local/path/to/file.dll
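Downloading the file later is again a plain HTTP request, for example:
curl -uUSER:PASS -O http://localhost:8081/artifactory/dll-local/path/to/file.dll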
If you have a certain layout you would like to use for this repository you can create a custom layout and associate it with the repository. This can be useful for automatic snapshot/integration versions cleanup and other module management tasks.
Disclaimer: I'm affiliated with Artifactory
The Nexus repository manager is Java-oriented, but it can be used to store any files you want: binaries of all types, or even just text configuration files.
To automate the file upload process, you can use Maven from the command line:
mvn deploy:deploy-file -DgroupId=com.you -DartifactId=file -Dversion=1.0 -Dpackaging=exe -Dfile=c:\out\file.exe -Durl=http://yourserver/nexus/content/repositories/releases -DrepositoryId=releases
Then, to get the file, you should be able to get it directly with the following URL:
wget http://yourserver/nexus/content/repositories/releases/com/you/file/1.0/file-1.0.exe
This is a simple approach to using Nexus as a general artifact repository.
I hope this helps.
The open source version of Nexus (Nexus OSS) supports many repository formats out of the box, including Maven, NuGet, NPM, RubyGems and others. Nexus merely runs on Java (just like Jenkins); it is not Java-only.
Depending on how you plan to get the DLL files out of the repository, different formats might be more or less suited to your usage. You could even use a custom format, but then you rely on custom tools.
The scenarios I have seen at many customers are:
using a Maven repository and pulling the files in during a Maven build together with the Maven NAR Plugin (used for native development with C/C++)
using a Maven repository and pulling via plain HTTP GET calls with your scripting language/build tool of choice
using the NuGet format, storing the DLLs in NuGet packages in the repository, and using nuget to retrieve them for the projects (see the sketch below)
All of these work well.
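As an illustration of the NuGet scenario (the package name and version are made up, and the feed URL follows the Nexus 2 NuGet path convention):
nuget install MyNativeLib -Version 1.0.0 -Source http://yourserver/nexus/service/local/nuget/releases/ -OutputDirectory packages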

Jenkins + Tycho: propagating update sites

I'm wondering if there is an easy way to "publish" p2 update sites in Jenkins (built with Tycho) so that they can easily be accessed in downstream jobs. Currently I'm doing it semi-manually: using Jenkins' support for copying artifacts between jobs, and then specifying a repository-mirror element in a job-specific settings.xml which refers to the artifacts copied into the job. But this is all a little tricky and requires configuring jobs and build settings in a number of different places.
Is there any nicer way short of using an external solution such as Artifactory?
The only solution involving a repository manager that I am aware of is to use a Nexus and the Unzip Plug-in. (Disclaimer: The Unzip Plug-in is provided by the Tycho project, of which I am a committer.)
With such a setup, you could have one job deploy an update site to Nexus, and the next job use the update site via the unzip URL of the deployed site. Example: If the site was deployed under the GAV project.abc:site:1.0.0-SNAPSHOT, you could then access it via http://<nexus>/content/repositories/<unzip-repo-name>/project/abc/site/1.0.0-SNAPSHOT/site-1.0.0-SNAPSHOT-unzip/.
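For example, the upstream job could deploy the zipped update site under that GAV with a command along these lines (the URL and repository ID are placeholders; in a real Tycho build you would more likely wire the deployment into the pom and run mvn deploy):
mvn deploy:deploy-file -DgroupId=project.abc -DartifactId=site -Dversion=1.0.0-SNAPSHOT -Dpackaging=zip -Dfile=target/site-1.0.0-SNAPSHOT.zip -Durl=http://<nexus>/content/repositories/snapshots -DrepositoryId=snapshots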
Note that you are slightly less flexible with such a setup than with what you have now: you need a version number for what your upstream project is building, so this may become tricky if you have multiple feature branches developing towards the same release version.
If you don't need this, you have the benefit of getting a portable build of your downstream project, i.e. developers build the project in the same way as your Jenkins does.

Jenkins - Can master node access source code held on slave?

I have conducted successful tests generating code-coverage data on a C++ project using gcov, gcovr, and the Cobertura Jenkins plugin. In this simple project the build was done on the master node, and in Jenkins I could drill down into the coverage report to see the coverage at line level.
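For reference, the Cobertura-compatible XML in a setup like this typically comes from a gcovr call along these lines (paths and the output name are placeholders):
gcovr -r . --xml -o coverage.xml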
Now we are trying to expand the project into a real use case. In this distributed setup, we have a master node running jobs on a multitude of slaves. The coverage report works as before, except the source code display is not available.
Clearly this is because the report is shown by the master node, but the source is only checked out on the slave.
Is there a way to overcome this? Do I need to copy the source from the slave or can I get the master to do its own SVN checkout to have a parallel source tree?
The way I've accomplished this in the past is to use the Copy To Slave plugin, which can copy files from the slave back to the master. However, I've only used it to copy unit-test results back, and those are fairly small XML files; if your source tree is really large, it might take a while.
https://wiki.jenkins-ci.org/display/JENKINS/Copy+To+Slave+Plugin

Jenkins - Is it necessary to have a repository for running a multi configuration project

I am new to Jenkins. I am trying to configure Jenkins to run my Selenium tests on multiple browsers, so I thought a multi-configuration project would be the best choice. I am using Sauce Labs for cross-browser testing.
My Selenium source code is in a directory on my local system; I have not uploaded it to any repository. I have configured a multi-configuration project with a custom workspace pointing to my local source code, and selected "none" in the Source Code Management section.
Now, when I build the job, it creates a workspace for each browser combination, e.g. <project workspace>\SELENIUM_DRIVER\Windows 2003firefox16 and <project workspace>\SELENIUM_DRIVER\Windows 2003internet explorer8. But the files are not copied into each of these workspaces automatically; I need to copy my files into these directories manually for it to work.
Is it necessary to have a repository like SVN, CVS, or Git for this to work? Or is there a way I can run the build from my local system?
For this to work, a repository is not required, but you do need a good way to access your artifacts and Selenium code. I suggest copying the artifacts to a shared drive as a preliminary step, and also keeping your Selenium source code on a shared drive as a matter of practice; this will allow you to run multiple tests from multiple machines. A sketch of such a copy step follows.
Cheers
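As an illustration, a pre-build Windows batch step along these lines could copy the shared code into each combination's workspace (the share path is hypothetical):
rem copy the Selenium sources from the shared drive into this combination's workspace
xcopy /E /I /Y \\fileserver\selenium-tests "%WORKSPACE%"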
