How to save build logs from Bitbucket Pipelines - bitbucket

Currently, the way I have my pipelines set up is to have each step run on 'self.hosted' runners, so I don't have to spend Bitbucket build minutes. Every time the pipeline runs, I can see the containers being created and terminated in my Docker application. The only debugging method I have found is clicking on the container whose name contains "build". Inside that container I can see all the logs: every project build step and test.
I'm trying to find a way to save these build logs into a text file and download it as an artifact. Bitbucket has a guide for generating support logs, but none of those are build logs.
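
One way this could work (a sketch, not an official Bitbucket recipe): pipe the step's own output through tee into a file, and declare that file as an artifact in bitbucket-pipelines.yml. Here ./build.sh is a placeholder for the real build command:

    # bitbucket-pipelines.yml -- capture a step's output and keep it as an artifact
    pipelines:
      default:
        - step:
            name: Build and capture logs
            runs-on:
              - self.hosted
              - linux
            script:
              # pipefail keeps the step red if the build itself fails,
              # even though its output is piped through tee
              - set -o pipefail
              - ./build.sh 2>&1 | tee build.log
            artifacts:
              - build.log

This only captures the output of commands you run yourself; the runner's own setup and teardown lines stay in the Bitbucket UI.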

Related

What should be mentioned in the build stage and the deploy stage in a Jenkins script? What is the difference between deploy and build?

I tried to search it on Google for a better understanding, but no luck.
Deploy should mean: take all of my artifacts and either copy them to a server or execute them on a server. It should truly be a simple process.
Build means: process all of my code/artifacts and prepare them for deployment. That is: compile, generate code, package, etc.
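
To make the distinction concrete, here is a minimal declarative Jenkinsfile sketch with the two stages kept separate (the mvn and scp/ssh commands are placeholders for whatever your project actually uses):

    // Minimal sketch: build vs. deploy as separate stages
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // compile, test, and package the artifact
                    sh 'mvn clean package'
                }
            }
            stage('Deploy') {
                steps {
                    // copy the artifact to a server and restart the service
                    sh 'scp target/app.jar deploy@example.com:/opt/app/'
                    sh 'ssh deploy@example.com "sudo systemctl restart app"'
                }
            }
        }
    }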

How to get the core dump file if a Jenkins build crashes?

We have a Jenkins pipeline that builds an executable and runs unit tests. Sometimes the build crashes in some unit tests. As it's hard to reproduce locally, I'd like to get the core dump file if it happens again on Jenkins.
To do that, one idea is to run the Docker container in privileged mode and override core_pattern, so I can get the core in my build workspace and store it as an artifact. But doing that, we can only allow one build at a time, as we don't want other containers running on that same Jenkins build node to pick up the temporary core_pattern change. Will that work? Or is there a better alternative?
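
A sketch of that idea as a Jenkinsfile stage (assumptions: the Docker CLI is available on the node; my-build-image and the test binary path are placeholders). Note that /proc/sys/kernel/core_pattern is kernel-global, which is exactly why concurrent builds on the same node are a concern:

    // Sketch: a privileged container redirects cores into the mounted workspace
    stage('Unit tests with core dumps') {
        steps {
            sh '''
                docker run --rm --privileged -v "$WORKSPACE:/ws" my-build-image sh -c '
                    ulimit -c unlimited                        # allow core files
                    echo "/ws/core.%e.%p" > /proc/sys/kernel/core_pattern
                    /ws/run_unit_tests                         # placeholder test binary
                '
            '''
            // keep whatever cores were written into the workspace
            archiveArtifacts artifacts: 'core.*', allowEmptyArchive: true
        }
    }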

Implementing a continuous integration pipeline in Jenkins using Zephyr, Bitbucket and Docker (Windows)

First post here, so please excuse the newbie details; the format will get better :)
My question is really two questions: first, is this doable? And second, if it is, any tips or recommendations on how to do it?
I have a piece of software written in C on Zephyr RTOS (for an nRF52840 board) and version-controlled in Bitbucket. I'm trying to implement a Jenkins CI pipeline that fetches the newly pushed changes from Bitbucket, builds the code to check for errors, and then reports back.
Now, to build that code for Zephyr I need a build environment, and my solution is to run a Docker container with a Zephyr image that is able to build the code and report back whether everything looks good or not.
So basically my pipeline in Jenkins will look like:
Fetch code from Bitbucket.
Run a Docker container with the Zephyr image that builds the code.
Report the result back to Jenkins.
What I have done so far:
I got Bitbucket and Jenkins to connect, and I have a container running with a Zephyr image from Docker Hub (the image is zephyrprojectrtos/ci). Inside the container I'm able to git clone my repos. I'm still trying to figure out how to build the code, and also whether it's possible to run something like a git clone inside a Docker container but from a Jenkinsfile. Any tips here? Is it possible to pass a git clone command to a Docker container from a Jenkinsfile? Or do I have to include everything (if possible) in the docker run command, so that running the container automatically checks out the software, builds it, and reports the results back?
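
For illustration, a minimal Jenkinsfile sketch of that flow, assuming the Docker Pipeline plugin is installed and the repository carries a west manifest (the board name and app directory are placeholders):

    // Sketch: the checkout and the Zephyr build both run inside the CI image
    pipeline {
        agent {
            docker { image 'zephyrprojectrtos/ci:latest' }
        }
        stages {
            stage('Checkout') {
                steps {
                    checkout scm   // Jenkins clones the Bitbucket repo inside the container
                }
            }
            stage('Build') {
                steps {
                    sh 'west init -l . && west update'          // fetch Zephyr and modules
                    sh 'west build -b nrf52840dk_nrf52840 app'  // placeholder board/app dir
                }
            }
        }
        post {
            always {
                echo "Result: ${currentBuild.currentResult}"   // the result reported back
            }
        }
    }

With agent { docker { ... } }, Jenkins starts the container and mounts the workspace for you, so the git clone does not have to be baked into the docker run command.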
I'm new to all of this (Zephyr, Docker, Jenkins) and have no idea whether this will work or whether there is a much simpler way around it.
Thanks for your attention.

How to totally avoid creating Jenkins Artifacts after each build?

I am running my WebdriverIO (a Selenium wrapper in JavaScript) tests on Jenkins.
After each build, Jenkins creates and attaches artifacts, which takes a very long time (the test cases complete in 2 minutes, but the artifact steps take about an hour).
I also noticed that the artifact is allure-report.zip.
Is there any significance to this artifact if I already have the console logs and the generated Allure reports?
How can I avoid generating and attaching the artifact after each build?
Jenkins has no control over the artifacts that get created once a build is started via the execute-shell command; the build itself is what creates artifacts. Parts of the build process that can also create artifacts are post-build actions, such as running tests or plugins.
I suggest you familiarize yourself with your Jenkins job to locate what creates the allure-report.zip file.
With Jenkins you can control which artifacts you want to preserve and make easily available in the UI via 'Archive the artifacts' under Post-build Actions. This does not create the artifacts; it simply tags and archives them as something special, to be available outside of the workspace. If this is the step you think is slow (attaching the generated allure-report.zip file), you can remove it from the list of files to archive.
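
In a declarative pipeline the same archiving is an explicit step that can be narrowed or deleted (a sketch; the file name is taken from the question):

    // Sketch: archiving is a step you control, not something Jenkins forces
    post {
        always {
            // delete this line to stop attaching the zip,
            // or narrow the pattern to only the files you need
            archiveArtifacts artifacts: 'allure-report.zip', allowEmptyArchive: true
        }
    }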

Jenkins - Docker integration - Use Jenkins to build Docker images and push to the registry

I am currently working on integrating Docker with Jenkins, and I am trying to figure out the following pipeline:
Whenever a Dockerfile is updated in Git, trigger a Jenkins job to do the following:
Build the Docker image
Test, Verify the Docker image
Version the image (prod, testing, etc.)
Push the image to the registry
If the image fails to build, have a proper mechanism to get the logs
From my research, I found that we have two different Jenkins plugins for Docker integration: the Docker Build Step plugin and the Docker Build and Publish plugin. As far as I could see, there are no plugins or workflows to test the image before pushing it to the repository. Since we are doing this from scratch, I would like to know the best tried-and-tested workflow.
Any help appreciated.
We applied the same mindset as "git flow" to the creation of Docker images. In our solution there was no need to test the image itself. We solved it by splitting the build into a "Source-Build", which produces the artifacts, and a downstream job, e.g. a "Runtime-Build", which only packages the artifacts into the runtime image and pushes it to the registry. At that point the whole stack is delivered to a "Release-Stage" for automated testing.
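A sketch of that hand-off from the upstream job's side (job, parameter, and artifact names are placeholders):

    // Fragment of the upstream 'Source-Build' pipeline
    stage('Hand off to Runtime-Build') {
        archiveArtifacts artifacts: 'dist/**'          // artifacts the downstream job packages
        build job: 'Runtime-Build', parameters: [
            string(name: 'SOURCE_BUILD_NUMBER', value: env.BUILD_NUMBER)
        ]
    }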
To test the image there's a tool called Anchore.
Then, if you want to integrate other types of tests before building the Docker image, you can for example integrate SonarQube with Jenkins and do a static analysis of the source code. Full example at: https://pillsfromtheweb.blogspot.com/2020/05/integrate-sonarqube-static-analysis-in.html
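
Pulling the thread together, a scripted-pipeline sketch of build, smoke test, then push, using the Docker Pipeline plugin (registry URL, credentials ID, and the smoke-test script are placeholders):

    // Sketch: build the image, test it, and only then push it
    node {
        checkout scm
        def image = docker.build("myorg/myapp:${env.BUILD_NUMBER}")

        stage('Test image') {
            image.inside {
                sh './smoke-test.sh'   // placeholder: any check that fails the build on error
            }
        }

        stage('Push') {
            docker.withRegistry('https://registry.example.com', 'registry-creds-id') {
                image.push()           // push the build-number tag
                image.push('latest')   // plus a floating 'latest' tag
            }
        }
    }

If docker.build fails, its output is already in the console log, which covers the requirement of getting the logs when the image does not build.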
