Is there a way to add global tool configurations for Artifactory and AWS in Jenkins through the CLI?
I'm trying to write a Chef cookbook that automates the creation of Jenkins jobs, but I don't know how to add credentials for these tools.
Credentials don't depend on the tools which will use them.
You can register credentials in general through the Jenkins Credentials Plugin API: see for instance
How to create jenkins credentials via the REST API? (similar to jenkins_api_client issue 162)
update Jenkins credentials by script
You can then use those credentials in association with a Jenkins Job.
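For instance, here is a minimal Groovy sketch that registers a username/password credential in the global domain; it can be run from the Jenkins script console or passed to the CLI's groovy command. The credential id, username, and password below are placeholders, not values from your setup:

import com.cloudbees.plugins.credentials.CredentialsScope
import com.cloudbees.plugins.credentials.SystemCredentialsProvider
import com.cloudbees.plugins.credentials.domains.Domain
import com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl

// Obtain the system credentials store and add a global-scope credential.
def store = SystemCredentialsProvider.getInstance().getStore()
def creds = new UsernamePasswordCredentialsImpl(
    CredentialsScope.GLOBAL,
    'artifactory-deployer',        // credential id (placeholder)
    'Artifactory deploy user',     // description
    'deployer',                    // username (placeholder)
    's3cr3t'                       // password (placeholder)
)
store.addCredentials(Domain.global(), creds)

A job can then reference the credential by its id ('artifactory-deployer' here).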
Your question is twofold.
Credentials
The Jenkins chef cookbook offers a jenkins_credentials resource, which allows you to pipe credentials (using the Jenkins API internally) into your Jenkins instance.
Global Tool Configuration
You can use the jenkins_script resource of the same cookbook to execute arbitrary Groovy scripts, which lets you configure your Jenkins instance. You then just have to figure out the exact Groovy code to, e.g., select the previously defined credentials; it will look similar to the example given in the cookbook's README.
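To illustrate the kind of Groovy you can feed to jenkins_script, here is a sketch that registers a Maven installation under Global Tool Configuration; the installation name ('maven-3') and home path (/opt/maven) are assumptions for the example:

import jenkins.model.Jenkins
import hudson.tasks.Maven

// Register a Maven installation named 'maven-3' (placeholder) in
// Global Tool Configuration, pointing at /opt/maven (placeholder).
def desc = Jenkins.instance.getDescriptorByType(Maven.DescriptorImpl)
def installations = (desc.installations as List) ?: []
installations << new Maven.MavenInstallation('maven-3', '/opt/maven', [])
desc.setInstallations(installations as Maven.MavenInstallation[])
desc.save()

Configuring Artifactory- or AWS-specific tools works the same way, just against the descriptor classes of the respective plugins.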
We're setting up multiple more or less static servers in AWS. These are primarily configured via Ansible, and that's also the ultimate source of truth when it comes to their existence, grouping, host names, and IPs. But then there's Jenkins, which deploys configuration files to these servers whenever new commits are added to a git repository.
I'm having an issue with listing the target servers directly in a Jenkinsfile. How should I proceed? What are the most common ways of dealing with this?
I understand this is a mostly opinion-based topic, but maybe there's a particular Jenkins feature I don't know about...?
Thank you.
This is very subjective; here are a few common approaches.
Store the details somewhere accessible after the Ansible step, e.g. commit them to a GitHub repo and retrieve them within the Jenkins job.
Use the AWS APIs/CLI to retrieve server details. You can either set up the AWS CLI on the Jenkins agent or use something like the AWS Steps plugin (see the sketch after this list).
Make an API call to Jenkins after the Ansible script has run and update the server details in the job itself.
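For the second option, a minimal declarative sketch, assuming the AWS CLI is configured on the agent and the target instances carry a Role=config-target tag (both assumptions for the example):

pipeline {
    agent any
    stages {
        stage('Resolve targets') {
            steps {
                script {
                    // Ask EC2 for the private IPs of all running, tagged instances.
                    def ips = sh(
                        script: '''aws ec2 describe-instances \
                            --filters Name=tag:Role,Values=config-target \
                                      Name=instance-state-name,Values=running \
                            --query 'Reservations[].Instances[].PrivateIpAddress' \
                            --output text''',
                        returnStdout: true
                    ).trim().split(/\s+/)
                    echo "Deploying to: ${ips.join(', ')}"
                }
            }
        }
    }
}

This keeps AWS itself (rather than a file in a repo) as the runtime source of truth.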
I am using Jenkins on my Linux box. I have two Bitbucket repositories, and I am trying to create a Jenkins pipeline.
Repo_A - the Jenkinsfile resides here.
Repo_B - Project's source code resides here.
I want Jenkins to take its configuration (the Jenkinsfile) from Repo_A. I also want Jenkins to clone my source code to /some/random/directory. The credentials are stored in Jenkins and can be used for both repos.
How can I use multiple repos in a single pipeline? Can somebody please tell me how to do this?
You need to use SSH authentication; it is supported by Jenkins.
Here is a similar question: how to setup ssh keys for jenkins to publish via ssh
To add the second repository, check it out in its own step, as in the sketch below.
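A minimal sketch: when the job is configured as "Pipeline script from SCM" pointing at Repo_A, Jenkins fetches the Jenkinsfile itself; inside it you then check out Repo_B. The repository URL, branch, and the 'bitbucket-creds' credentials id are placeholders:

pipeline {
    agent any
    stages {
        stage('Checkout source') {
            steps {
                // Clone Repo_B into the directory requested in the question.
                dir('/some/random/directory') {
                    git url: 'git@bitbucket.org:team/repo_b.git',
                        credentialsId: 'bitbucket-creds',
                        branch: 'master'
                }
            }
        }
    }
}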
I'm new to the Jenkins world. I have a use case where I have set up a Jenkins pipeline using a Jenkinsfile. As part of the deployment stage, we invoke a few Ansible scripts in the backend to deploy the image into a Kubernetes cluster running in a cloud environment. The scripts expect a few secrets as environment variables, so I'd like to understand the best way to handle secrets in Jenkins. Should I enter them into Jenkins credentials and read them in an environment block, like below? Or is it safe to get the value from the user via the input step when executing the pipeline? If I get it from the user, though, I cannot fully automate the pipeline; it will wait until the user enters the secret. Could you suggest a safe way to handle credentials?
pipeline {
    agent any
    environment {
        // Bound from the Jenkins credentials store at runtime
        SECRET_VALUE = credentials('SECRET_VALUE_FROM_JENKINS_CREDENTIALS')
    }
    stages {
        stage('Deploy') {
            steps {
                sh './deploy.sh'   // placeholder; reads SECRET_VALUE from the environment
            }
        }
    }
}
It depends on your use case; indeed, both approaches you mention will work.
There shouldn't be any problem in keeping your secrets as Jenkins credentials. In my case, all my secrets are in HashiCorp Vault and my Jenkins credentials point to the Vault location, for example:
- usernamePassword:
    scope: GLOBAL
    id: serviceUser
    username: svc_admin
    password: "${secret/xyz/service_user/password}"
    description: My secret service user
The Jenkins deployment is via JCasC.
As a Jenkins admin, I can say it is safe to store credentials in Jenkins.
Just create the credentials in Jenkins and use them in a pipeline. It's also nice to have the Mask Passwords plugin installed, which masks credentials in the jobs' output.
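If you prefer not to expose the secret for the whole build, a sketch of the alternative: scope it to just the steps that need it with withCredentials (Credentials Binding plugin), assuming the credential is of the "secret text" kind. The credential id is the one from the question; deploy.sh is a placeholder:

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // SECRET_VALUE exists only inside this block and is masked in the log.
                withCredentials([string(credentialsId: 'SECRET_VALUE_FROM_JENKINS_CREDENTIALS',
                                        variable: 'SECRET_VALUE')]) {
                    sh './deploy.sh'
                }
            }
        }
    }
}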
We are setting up a job to generate an executable file by gathering different components (all of them tagged). We need a way to fetch these components based on the name of the build. I know Copy Artifact will do this, but I would like to put it in a script. Is there a way (an API or something else) to download archived artifacts? Once all the components are present, it is easy to create an installer.
I have tried this; there are multiple curl and wget commands which accept a username and password, but I need something without them: since the script runs in the Jenkins workspace, we shouldn't need to pass the password.
If you want to interact with Jenkins via scripts there are two ways:
1. Jenkins Pipeline
With Jenkins Pipeline you define builds as Groovy scripts and can use copyArtifacts via the Groovy DSL. It's not actually a script; it's a build definition written in a Groovy DSL. This is the way to go when a Jenkins job gathers artifacts from other jobs.
https://jenkins.io/doc/book/pipeline/
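A minimal sketch using the Copy Artifact plugin's copyArtifacts step, to be placed inside a steps block; the upstream job name 'component-a' and the target directory are placeholders:

// Inside a steps { } block of a pipeline:
copyArtifacts projectName: 'component-a',
              selector: lastSuccessful(),
              target: 'components/'

Repeat per component, then build the installer from the gathered files.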
2. Jenkins CLI
With the Jenkins CLI you can interact with Jenkins from a shell script. This is the way to go when you want to gather artifacts from outside of Jenkins.
https://wiki.jenkins.io/display/JENKINS/Jenkins+CLI
If Jenkins is secured, you will have to provide credentials when using the Jenkins CLI. With pipelines you don't need credentials, because they are executed inside Jenkins, but you do need to grant jobs (or pipelines) permission to access other jobs' artifacts (CopyArtifactPermission).
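If you do end up fetching artifacts over HTTP from inside a job, here is a sketch that avoids hard-coding the password by binding an API-token credential; the job name, artifact path, and the 'jenkins-api-token' credential id are placeholders:

// Inside a pipeline: JENKINS_URL is provided by Jenkins itself.
withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                  usernameVariable: 'JUSER',
                                  passwordVariable: 'JTOKEN')]) {
    sh '''curl -fsSL -u "$JUSER:$JTOKEN" \
        "$JENKINS_URL/job/component-a/lastSuccessfulBuild/artifact/dist/component-a.zip" \
        -o component-a.zip'''
}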
I have a Jenkins server at CloudBees and it has a lot of jobs.
I have created a new Jenkins server on an AWS EC2 instance.
Now I need to migrate all Jenkins jobs from CloudBees to the new Jenkins server (the AWS EC2 instance).
How can I do this task? Is there any way to migrate all jobs by CLI?
Use the Backup plugin or thinBackup.
You first need to ensure that you do not use proprietary CloudBees features (the RBAC and Folders+ plugins). This is the only thing that's really specific to migrating from a CloudBees Jenkins.
After that, standard steps for migrating Jenkins apply:
ensure that the same plugins are installed on the new Jenkins
align credentials and credential IDs
API tokens need special handling
After that, you can just copy all $JENKINS_HOME/jobs/*/config.xml files (if using folders, copy recursively).
You can also copy job configs via the CLI or REST API, but usually the fastest way is to copy directly at the filesystem level.
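If you do go the API route for individual jobs, here is a Groovy sketch for the importing side, run in the new controller's script console; the job name and file path are placeholders, and the config.xml is assumed to have been exported from the old server first:

import jenkins.model.Jenkins

// Create a job on this controller from a config.xml exported elsewhere.
def xml = new File('/tmp/my-job-config.xml').text
def stream = new ByteArrayInputStream(xml.getBytes('UTF-8'))
Jenkins.instance.createProjectFromXML('my-job', stream)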