How to add Fortify in CI/CD processes - fortify

Can anyone help me? I am using Fortify Software Security Center 19.2.0.3191 and need to integrate it into my CI/CD processes.
I can't find an API for starting the scanning process.
I can't find how to create a task in my task manager after a scan. The task manager has an API, but I don't know how to call it from Fortify once a scan completes.
I can't find an API for retrieving the scan report.

Related

How to get google dataflow running jobs in java using client library

I am trying to get all jobs from a project using the Google client library for Dataflow. I am able to fetch metrics using a job ID, but I am unable to get all jobs inside a project; any code snippet would be very helpful. We can use the Apache Beam runner as well. There is a method to list all jobs in the Apache runner, but I am unable to use it.
You will want to use this API: https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs/list
This should have an example showing how to use the Java client: https://cloud.google.com/dataflow/docs/samples/dataflow-v1beta3-generated-JobsV1Beta3-ListJobs-sync.
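For reference, here is a minimal sketch of listing jobs with the generated Java client, assuming the google-cloud-dataflow client library is on the classpath; the project ID and region below are placeholders:

import com.google.dataflow.v1beta3.Job;
import com.google.dataflow.v1beta3.JobsV1Beta3Client;
import com.google.dataflow.v1beta3.ListJobsRequest;

public class ListDataflowJobs {
  public static void main(String[] args) throws Exception {
    String projectId = "my-project-id";   // placeholder
    String region = "us-central1";        // placeholder
    // The client wraps the Dataflow v1beta3 API; close it when done.
    try (JobsV1Beta3Client client = JobsV1Beta3Client.create()) {
      ListJobsRequest request = ListJobsRequest.newBuilder()
          .setProjectId(projectId)
          .setLocation(region)
          .build();
      // iterateAll() walks every page of results for you.
      for (Job job : client.listJobs(request).iterateAll()) {
        System.out.println(job.getId() + " : " + job.getName());
      }
    }
  }
}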

Spring Cloud Dataflow Server Lock the Jar Locally

I am designing a batch workflow with SCDF on Windows. When I test the code on my local machine, I deploy and run the Spring Batch job jar locally by registering the jar with a file URL. The problem is that whenever I want to rebuild my batch job jar, I cannot delete the jar that is already registered on the SCDF server, because the OS warns me that the jar is being used by a Java program (even when the batch job is not running at that time).
It is quite inconvenient for developers to shut down the SCDF server every time they want to rebuild the jar and replace the existing one. Is there any workaround, or am I missing any configuration?
Thanks in advance for the advice.
I see this is an inconvenience, but unfortunately it is expected when using file:// based resources. One alternative is to install your app as a Maven artifact in your local repository and refer to it as a maven:// based resource.
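A rough sketch of that workaround, assuming you register apps through the SCDF shell; the Maven coordinates are placeholders. Because mvn install copies the jar into your local Maven repository, the file you rebuild is never the one the server holds open:

# Install the rebuilt batch job jar into the local Maven repository
mvn clean install

# In the SCDF shell, register the app via maven:// instead of file://
app register --name my-batch-job --type task --uri maven://com.example:my-batch-job:jar:1.0.0-SNAPSHOT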

Automating Fortify Audit Workbench

Does Fortify Audit Workbench have any command-line options that would allow me to put it in a cron job and run it daily?
The scan takes over two hours, I would like it to run overnight and see the results in the morning.
Jason
Audit Workbench is the GUI front end for the underlying SCA engine (sourceanalyzer).
If you know how to scan your code through the command line, you can create a Windows batch file or bash script to execute it.
The hardest part will be coming up with the translation command; that is going to be language and project specific.
Your script should have a minimum of 3 steps:
Clean
Translate
Scan
There is a fourth, optional step: uploading the scan results to your SSC instance, which uses the fortifyclient command. A minimal script sketch follows below.
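As a sketch only, assuming a Visual Studio solution and placeholder values for the build ID, solution name, SSC URL, and token (the exact fortifyclient options vary between Fortify versions, so treat the last step as an approximation):

REM 1. Clean any previous translation for this build ID
sourceanalyzer -b MyProject -clean

REM 2. Translate - this step is language/build specific; a Visual Studio solution is assumed here
sourceanalyzer -b MyProject devenv "MySolution.sln" /REBUILD Release

REM 3. Scan and write the results to an FPR file
sourceanalyzer -b MyProject -scan -f MyProject.fpr

REM 4. (Optional) Upload the FPR to SSC with fortifyclient - placeholder URL, token, project and version
fortifyclient uploadFPR -file MyProject.fpr -project "MyProject" -version "1.0" -url https://ssc.example.com/ssc -authtoken <your-token>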
References:
sourceanalyzer -h
HPE Security Fortify Static Code Analyzer User Guide: provides an overview of the scan process and examples depending on language and/or build tool.
HPE Security Fortify Software Security Center Installation and Configuration Guide: chapter 10 covers using the fortifyclient tool to communicate with SSC.
Without any further information, we cannot help you with the actual commands.
sourceanalyzer is the command-line tool.
I run this (as a Windows batch file) as:
sourceanalyzer -b 1234 devenv "VsSolution.sln" /REBUILD release
The other answers are correct, but there's an easier way. There's a Scan Wizard that creates a batch script for you. You point it at your project, answer some questions, and it creates a script. Check a box and it'll also upload to SSC.
Scan Wizard is located in the bin directory of your Fortify installation. It may also be in your Start menu, next to Audit Workbench.
Note: Sometimes I have to modify the script. But if you're able to scan using the Fortify button in Visual Studio, then the default script usually works.

Issue with visual studio build agent behind a proxy

I am currently setting up continuous integration using Visual Studio Team Services with onsite build agents, but I am having issues with my company's proxy.
I have tried adding the .proxy file, but my company's proxy is still blocking it (it is a very old proxy).
Speaking to my infrastructure guys, they can bypass the proxy, but they need all the URLs that the build agent calls.
Unfortunately I cannot find a list online of all the URLs it requires. I know it needs the following:
https://xxxxxxxxx.visualstudio.com
https://xxxxxxxxx.vssps.visualstudio.com
Does anyone know all the other urls that an onsite build agent calls?
It's hard to tell; the agent itself uses a number of URIs to connect. The ones I know of are at least these:
account.visualstudio.com
account.vsrm.visualstudio.com
account.vssps.visualstudio.com
app.vssps.visualstudio.com
But then there are a number of tasks that need download access as well, e.g.
npm needs access to www.npmjs.com
Sonar Qube needs to download the sonar runner
NuGet needs access to www.nuget.org to restore packages
...
Then depending on which extensions you use, you may need additional ones
My Snyk task needs access to snyk.io for example
The easiest way to find them all is to set up a build agent outside of your company network and monitor the traffic with Fiddler. To get an answer from the source, I recommend posting an issue on the vsts-agent GitHub repo.
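As an aside, a sketch of the agent-side proxy settings (placeholder proxy URL and credentials; the exact mechanism depends on the agent version): older VSTS agents read a plain .proxy file in the agent folder, as mentioned in the question, while newer agents take the proxy during configuration.

REM Older agents: a .proxy file in the agent root containing just the proxy URL, e.g.
REM   http://proxy.example.com:8080

REM Newer agents: pass the proxy while configuring the agent (placeholder values)
config.cmd --proxyurl http://proxy.example.com:8080 --proxyusername "user" --proxypassword "pass"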

Using Beam SDK in Cloud Dataflow

We are currently using Google's Cloud Dataflow SDK (1.6.0) to run dataflow jobs in GCP, however, we are considering moving to the Apache Beam SDK (0.1.0). We will still be running our jobs in GCP using the dataflow service. Has anyone gone through this transition and have advice? Are there any compatibility issues here and is this move encouraged by GCP?
Formally, Beam is not yet supported on Dataflow (although that is certainly what we are working towards). We recommend staying with the Dataflow SDK, especially if SLA or support are important to you. That said, our tests show that Beam runs on Dataflow, and although that may break at any time, you are certainly welcome to try it at your own risk.
Update:
The Dataflow SDKs are now based on Beam as of the release of Dataflow SDK 2.0 (https://cloud.google.com/dataflow/release-notes/release-notes-java-2). Both Beam and the Dataflow SDKs are currently supported on Cloud Dataflow.
You can run Beam SDK pipelines on Dataflow now. See:
https://beam.apache.org/documentation/runners/dataflow/
You'll need to add a dependency to pom.xml, and probably a few command-line options, as explained on that page.
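A rough sketch of what that looks like, assuming the beam-runners-google-cloud-dataflow-java dependency has been added to pom.xml; the project ID and bucket below are placeholders:

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class BeamOnDataflow {
  public static void main(String[] args) {
    // These can also be supplied on the command line, e.g.
    //   --runner=DataflowRunner --project=my-gcp-project --tempLocation=gs://my-bucket/tmp
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);
    options.setProject("my-gcp-project");           // placeholder
    options.setTempLocation("gs://my-bucket/tmp");  // placeholder

    Pipeline pipeline = Pipeline.create(options);
    // ... apply your transforms here ...
    pipeline.run();
  }
}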
