Azure storage plugin in Jenkins creates folder structure on Azure blob - jenkins

I have a Jenkins freestyle job that builds and deploys an Angular project to Azure Blob Storage.
Everything works fine, but when the job succeeds it recreates the full folder structure in the blob container, because my Angular build output sits in a subfolder and I provide the full path to the files I need in my Azure blob.
(screenshot: Jenkins post-build action)
As a result I get the whole directory structure in the Azure blob.
(screenshot: Azure blob storage)
I need my Angular build files (assets, js, etc.) directly in the $web container.
(screenshot: actual requirement)

This looks like a bug in the Windows Azure Storage Jenkins plugin, which is maintained by the Visual Studio China Jenkins Team as mentioned here, so you may want to report a bug or file a feature request as instructed here.
As a workaround, you can try the Azure CLI Jenkins plugin and accomplish the same thing with the az storage blob upload-batch Azure CLI command in your Jenkins job, or you can use the AzCopy utility instead. If you use a Jenkins pipeline, you can do something like the snippet below. Overall it is not a great solution for a freestyle job, but it is a better approach via a pipeline job.
dir('AAA\\BBB\\CCC\\DDD\\EEE\\') {
    azureUpload … filesPath: 'files' …
}
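For the Azure CLI route, a minimal pipeline sketch is shown below; the storage account name, the build output folder, and the stage name are hypothetical placeholders, and authentication (for example az login with a service principal) is omitted.

stage('Deploy to $web') {
    steps {
        // Hypothetical Angular build output folder; adjust to your project.
        dir('dist/my-app') {
            // Upload the folder contents so files land at the container root,
            // not under the original directory structure.
            sh 'az storage blob upload-batch --destination \'$web\' --source . --account-name mystorageaccount'
        }
    }
}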
Hope this helps!!

Related

How to integrate pull requests from an Azure DevOps repository with Jenkins

I have a Git repository on an Azure DevOps server and use Jenkins for continuous integration builds.
I want to know how, for a specific branch like master, Jenkins can automatically run the build and then notify the user via the console log whether the build was successful or not.
Microsoft seems to have this pretty well documented: Create a service hook for Azure DevOps Services and TFS with Jenkins.
Set up the Jenkins job, set up the TFS / Azure DevOps service hook, and you're off to the races.
We have it working fine with Jenkins 2.x and Azure DevOps on-prem. It's best to use service accounts with only the necessary permissions on both sides.
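For the notification part of the question, a hedged illustration (not from the answer above) is a declarative Jenkinsfile that reports the outcome in its own log via a post block; the echo messages and build step are just placeholders:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Build steps for the master branch go here.
                echo "Building ${env.BRANCH_NAME ?: 'master'}"
            }
        }
    }
    post {
        success {
            echo 'Build succeeded'   // visible in the console (shell) log
        }
        failure {
            echo 'Build failed'
        }
    }
}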

Implement Jmeter/taurus with Openshift

I am implementing JMeter/Taurus for performance testing of microservices. We are using the OpenShift PaaS solution to run all microservices. I am able to deploy JMeter/Taurus inside OpenShift using a Jenkins pipeline and generate the Taurus report from the JMX test plan in the container. My requirement is to publish the Taurus report to Jenkins rather than storing it in cloud storage or Nexus. Can someone advise what the best approach would be to publish the performance report for developers on Jenkins, or any other optimal way to publish it?
I found an approach by googling where the Jenkins agent was deployed inside OpenShift and the test suite's Git repo was checked out into the agent's workspace; I just want to make sure this is the best approach for my scenario. Our Jenkins master is running on Google Cloud Platform VMs with some dynamic slaves.
Thanks in Advance!
According to the Dump Summary for Jenkins Plugins chapter of the Taurus User Manual, you just need to add a reporting module definition to your YAML configuration file, like:
reporting:
- module: final-stats
  dump-xml: stats.xml
And "feed" this stats.xml file to Jenkins Performance Plugin
That's it, you should get Performance Report added to your build dashboard. Check out How to Run Taurus with the Jenkins Performance Plugin article for more information if needed.
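In pipeline form this could look roughly like the sketch below; the bzt invocation, the YAML file name, and the perfReport step parameters are assumptions to adapt to your setup:

stage('Performance test') {
    steps {
        // Run Taurus with the YAML config that contains the final-stats module.
        sh 'bzt load-test.yml'
        // Hand the dumped stats over to the Jenkins Performance Plugin.
        perfReport sourceDataFiles: 'stats.xml'
    }
}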

Migrate Jenkins jobs from Cloud Bees to another Jenkins server

I have a Jenkins server at CloudBees with a lot of jobs.
I have created a new Jenkins server on an AWS EC2 instance.
Now I need to migrate all Jenkins jobs from CloudBees to the new Jenkins server (the AWS EC2 instance).
How can I do this? Is there any way to migrate all jobs via the CLI?
Use the Backup Plugin or thinBackup.
You first need to ensure that you do not use proprietary CloudBees features (RBAC, Folders Plus plugins). That is the only thing that is really specific to migrating from a CloudBees Jenkins.
After that, the standard steps for migrating Jenkins apply:
- ensure that you have the same plugins installed on the new Jenkins
- align credentials and credential IDs
- API tokens need special handling
After that, you can just copy all $JENKINS_HOME/jobs/*/config.xml files (if using folders, copy recursively).
You can also copy job configs via the CLI or REST API, but usually the fastest way is to copy directly at the filesystem level.
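If you cannot reach the old server's filesystem directly, one hedged option (not mentioned in the answer above) is to dump the same config.xml files from the Script Console; the export directory here is a hypothetical placeholder:

import jenkins.model.Jenkins
import hudson.model.AbstractItem

// Write each item's raw config.xml under /tmp/job-export, keeping folder paths.
def exportDir = new File('/tmp/job-export')
Jenkins.instance.getAllItems(AbstractItem.class).each { item ->
    def target = new File(exportDir, "${item.fullName}/config.xml")
    target.parentFile.mkdirs()
    target.text = item.configFile.asString()
}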

Isolating Secrets for Pipelines in Jenkins

We are implementing GitOps-like CI/CD in Jenkins, deploying to OpenShift/Kubernetes. For the sake of simplicity, let's say we have only two repositories:
The first holds the application source code; there is also a Jenkinsfile in the source that defines the build (which also pushes images to a repository).
We have a second repository where the deployment pipeline is defined (Jenkinsfile). This pipeline deploys the image to production (think "kubectl apply").
The problem is that pipeline (2) needs access to credentials that are used to authenticate (against the Kubernetes API) to production. We thought about storing these credentials in Jenkins, but we don't want the first (1) pipeline on the same Jenkins to have access to these production credentials.
How could we solve this with Jenkins? (How should we store these credentials?)
Thank you.
Just to capture from the comments, there's effectively an answer from RRT in another thread (https://stackoverflow.com/a/42721809/9705485):
Using the Folders and Credentials Binding plugins, you can define credentials at the folder level that are only available to the job(s) inside that folder. The folder-level store becomes available once you have created the folder.
Source: https://support.cloudbees.com/hc/en-us/articles/203802500-Injecting-Secrets-into-Jenkins-Build-Jobs
Another example of adding scoped credentials (this one for dockerhub credentials) is https://liatrio.com/building-docker-jenkins-pipelines/
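As a hedged sketch of how the deployment pipeline could then consume such a folder-scoped credential (the credential ID, its type, and the manifests path are hypothetical):

// Inside the Jenkinsfile of the deployment pipeline, which lives in the folder
// that holds the production credential; jobs outside the folder cannot see it.
withCredentials([file(credentialsId: 'prod-kubeconfig', variable: 'KUBECONFIG')]) {
    sh 'kubectl apply -f k8s/'   // hypothetical manifests directory
}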

jenkins-as-code: purpose of jobs

I want to use Jenkins and store the configuration and the pipeline in my SCM (e.g. Git). To do so, I created a directory, let's say "jobs", in the root of my project where I will store jobs.groovy files written as Job DSL plugin files.
Should I do everything in a single job file, like fetching the source code, testing it, maybe building Docker images if necessary, then deploying to the AWS cloud? Or should I create a different job for each operation? If so, how can I create a pipeline using these job files?
Look at the Jenkins Configuration as Code plugin. The following link should be helpful:
https://github.com/tomasbjerre/jenkins-configuration-as-code-sandbox
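As a hedged Job DSL sketch of the common alternative to one job per operation (the repository URL, branch, and job name are hypothetical), a single job definition can point the project at a Jenkinsfile that contains all the stages:

// jobs/myApp.groovy, processed by a Job DSL seed job.
pipelineJob('my-app-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://example.com/my-org/my-app.git') }
                    branch('master')
                }
            }
            // The Jenkinsfile in the application repo holds the checkout,
            // test, Docker build, and deploy stages.
            scriptPath('Jenkinsfile')
        }
    }
}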
