I have a project that has the following format:
Application A:
--Module 1
--Module 2
--Module 3
I build and manage each module individually, but all of them are aggregated into a single system. I'd like to use Sonar to view code analysis at the 'Application A' level, but the scans will run individually with the CI build of each module. Is there a way to configure the sonar runner to achieve this? I tried configuring the modules appropriately, but if I first run a scan of module 1 and then a scan of module 2, the project only shows module 2 in the web UI. I'm not sure if there's a way around this, and building all modules at the same time is not currently feasible. Any thoughts are appreciated. Thanks!
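For context, the multi-module configuration I'm referring to is the standard sonar-project.properties layout for the scanner, roughly like the sketch below (the keys and paths are placeholders, not my exact settings):
sonar.projectKey=application-a
sonar.projectName=Application A
sonar.projectVersion=1.0
sonar.modules=module1,module2,module3
# each module points at its own base directory; sources are resolved per module
module1.sonar.projectBaseDir=module1
module2.sonar.projectBaseDir=module2
module3.sonar.projectBaseDir=module3
sonar.sources=src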
Trying to get Github actions working for a specific workflow involving Zephyr and West. Currently, I have West initializing via this marketplace plug-in with the Docker version to save time. My Zephyr project runs on an nrf9160 chip and builds locally by running west build -b nrf9160dk_sgnns in the project folder.
For now, everything works up until the building part, which (I believe) has to call West in the project folder and not in the parent folder. However, how do I chain the events in this case? Currently, I call West build like this:
- name: West Build
  uses: 'docker://zmkfirmware/zephyr-west-action-arm:latest'
  id: west-build
  with:
    args: |
      'cd "project"'
      'build "-b nrf9160dk_sgnns"'
However, this results in an error: if I do it the way the action suggests, the system can't find my board. How can I perform this build action in my project folder correctly?
I also tried setting working-directory, but that results in the error: working-directory cannot be used with uses, with.
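One variant I haven't verified: west build accepts the application source directory as a positional argument, so if the action forwards each args line to west as a subcommand plus its arguments (an assumption on my part about how this action parses args), pointing the build at the project folder might look like this:
- name: West Build
  uses: 'docker://zmkfirmware/zephyr-west-action-arm:latest'
  id: west-build
  with:
    args: |
      'build "-b nrf9160dk_sgnns project"'
Here project stands for the application folder relative to the repository root.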
I have a single agent pool with 4 agent machines. I am building my code, and it builds successfully using a single agent out of the 4.
Goal: I want to achieve compilation and testing using a unified agent pool; the same pool I use for the build should also be used for testing.
I created a release definition and added an agent phase, selecting the option Execute on multiple agents with the same pool I used for the build agent (the idea is to achieve the exact functionality of a unified agent).
I created the Visual Studio Test v2 task, set the Search folder to $(BuildOutput), set Test assemblies to *test*.dll and !\obj**, and selected Run tests in parallel on multi-core machines.
Output:
The build runs successfully, but when it automatically triggers the release definition, it shows these errors:
First error: No artifacts are available in the build 47777.
2018-07-16T13:19:38.0507114Z ##[error]Error: Preparing the test sources file failed. Error : Error: No test sources found matching the given filter '*test*.dll,!\obj**'
Question: Am I going in the right direction for implementing a unified agent using VSTest v2?
What should I do to resolve these errors and stay on the right track?
Thanks!
This is the key problem:
No artifacts are available in the build 47777.
Your build is not publishing any artifacts. It has to use the Publish Artifacts task to publish the build outputs in order to make them available in a release definition.
When artifacts are successfully published to a build, there is an "Artifacts" tab that appears on the build summary that will allow you to browse and validate the build outputs.
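If your build is defined in YAML rather than in the classic designer, the publish step looks roughly like this (the path and artifact name below are common defaults, not values specific to your pipeline):
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'  # folder containing the build output
    ArtifactName: 'drop'                                # name the release definition will reference
    publishLocation: 'Container'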
I have a Grails project that includes a separate Gradle project containing utility Java/Groovy classes that are shared among many projects. I am using a multi-project build so that I can develop both projects simultaneously.
The utility classes in the included project are not being hot-swapped / hot-reloaded / auto-reloaded / spring-loaded (I guess there are a lot of synonyms for this concept). This means that every time I make a change in one of the utility classes from the included project, I have to restart the Grails application.
Does anyone know how to make the subproject use the hot-swapping feature that Grails uses? It looks from this comment that Grails does this using Spring Boot's spring-loaded feature: https://github.com/spring-projects/spring-boot/issues/43#issuecomment-24723710
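As background, spring-loaded is attached as a JVM agent; running a class with it by hand looks roughly like the line below (the jar path, classpath, and main class are placeholders), which is effectively what Grails/Spring Boot do for the main project's classes:
java -javaagent:/path/to/springloaded.jar -noverify -cp build/classes/main com.example.Main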
I have even tried setting up the subproject as suggested in this tutorial on using the spring-loaded feature: http://mrhaki.blogspot.com/2015/09/spring-sweets-reload-classes-spring.html. But when I run gradle -t classes, I end up with this output:
Continuous build is an incubating feature.
:compileJava UP-TO-DATE
:compileGroovy FAILED
FAILURE: Build failed with an exception.
* What went wrong:
java.lang.UnsupportedOperationException (no error message)
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 1.832 secs
Waiting for changes to input files of tasks... (ctrl-d then enter to exit)
The build fails, but the process is still running. So I tried starting the Grails app and checking if the utility classes were being hot-swapped, but they weren't. I hope there is a solution that doesn't require starting a separate process with gradle -t classes before starting the Grails app, but either way, if anyone has any ideas, I'd love to hear them.
I have created integration tests as a Maven multi-module project. Each module represents an integration test. When I do a build on Jenkins it runs all the tests; I couldn't find an option to run a single module (in my case, a single test).
It seems very easy. In the Jenkins configuration, under Build, check the option Build modules in parallel. This will let you run an individual module.
My test project is a Maven project and it has a structure like:
BusinessGroupModuleParentTests
SomeBusinessLogicIntegrationTest
SomeOtherBusinessLogicIntegrationTest
I can invoke each test individually now.
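For completeness, a single module can also be selected directly from the Maven command line, independent of any Jenkins option; this uses standard Maven reactor flags, with the module name taken from the structure above:
# run only this module (plus whatever it depends on in the reactor) from the parent POM
mvn -pl SomeBusinessLogicIntegrationTest -am clean verify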
I have a sbt project with 4 modules: module-a, module-b, module-c, module-d.
Each module can be packaged as a WAR. I want to set up a deployment on Jenkins that would build only one of the 4 modules and deploy it to a container.
In detail, I want to have 4 Jenkins jobs - job-a, job-b, job-c, job-d, each building only the defined module (a to d).
For now, I am using clean update test package as the command for the Jenkins sbt build, but this results in packaging all 4 modules, which is not necessary.
I already tried project -module-a clean update test package but with no luck.
You may also like to execute project-scoped clean and test tasks as follows:
sbt module-a/clean module-a/test
This solution is slightly shorter and makes it clearer which project the commands apply to.
You don't need to execute the update task, since it's implicitly executed by test (as inspect tree test shows).
There's a way to make it cleaner with an alias. Use the following in the build.sbt:
addCommandAlias("jenkinsJob4ModuleA", "; module-a/clean; module-a/test")
With the alias, execute jenkinsJob4ModuleA to have the same effect as the above solution.
Quote the argument to project, i.e. project module-a, and don't use a dash before the name of the submodule.
The entire command line for the Jenkins job would then be as follows:
./sbt "project module-a" clean update test