Problem using the Ansible plugin in Jenkins

I installed the Ansible plugin in Jenkins and configured it.
I created the ansible.cfg, hosts and playbook files.
I pushed those files, together with their directory, to Bitbucket.
The issue is that when I start a build in Jenkins it says "skipping: no hosts matched", even though I have already tested the directory manually and it works.
This is the Jenkins configuration:
This is the error message:
When I execute the ansible-playbook command directly from the folder on my Desktop, it works.
This is what happens when I execute the command in the Jenkins directory:

I think you need to provide a credentials owner. When you run it manually you are identified (with an attached SSH key), but Jenkins needs some permissions of its own.
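For reference, here is a minimal pipeline sketch of that idea. It assumes the hosts inventory and a playbook.yml sit at the repository root, and that an SSH private key for the target machines has been stored in Jenkins under a credential ID such as ansible-ssh-key (the repository URL, file names and credential IDs are placeholders):

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Placeholder URL/credential - clone the repo containing ansible.cfg, hosts and the playbook
                git url: 'https://bitbucket.org/your-team/your-repo.git', credentialsId: 'bitbucket-creds'
            }
        }
        stage('Run playbook') {
            steps {
                // Pass the inventory explicitly so Ansible does not fall back to an empty
                // default inventory ("skipping: no hosts matched"), and hand the plugin the
                // SSH key so the Jenkins user can reach the managed hosts.
                ansiblePlaybook(
                    playbook: 'playbook.yml',
                    inventory: 'hosts',
                    credentialsId: 'ansible-ssh-key'
                )
            }
        }
    }
}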

Related

Using Remote Repos in Jenkins

I've successfully connected to a Bitbucket repository within my Jenkins job. My issue is that I haven't been able to find any information about how to access/use the files that the repository contains. I added an "Execute Shell" step after connecting the repo, but I don't know where the files from the repo end up. I tried cd'ing into /NameOfRepo/NameOfSubfolder, but the console output says the file/directory does not exist when I run the job. Where does Jenkins store the files it has gained access to that live in a remote repository? Do I need to use shell commands to clone my repo to a specific location?
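For what it's worth, Jenkins checks the repository out into the job's workspace, and an "Execute Shell" step runs with that workspace as its working directory, so paths are relative to the repository root rather than to /NameOfRepo. A minimal sketch of that (the shell lines are what would go into the "Execute Shell" step; the URL and NameOfSubfolder are placeholders from the question):

pipeline {
    agent any
    stages {
        stage('Inspect checkout') {
            steps {
                git url: 'https://bitbucket.org/your-team/NameOfRepo.git'   // placeholder URL
                sh '''
                    echo "Workspace: $WORKSPACE"   # this is where the repo contents live
                    ls -la                         # the repo root is the workspace root
                    cd NameOfSubfolder             # note: no leading /NameOfRepo
                    ls -la
                '''
            }
        }
    }
}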

Kubernetes fails inside jenkins pipeline

I'm trying to run kubectl commands inside a Jenkins pipeline, but they are failing. In a PowerShell window outside Jenkins they work fine, but in the pipeline they fail when running:
kubectl cluster-info --v=99
I've tried adding --token $TOKEN (a generated JWT) following another thread's recommendation, but it didn't work. Does anyone know why this is happening and how to get around it? All of these commands work fine when run outside the Jenkins pipeline.
The problem was that Jenkins actually uses a different home directory, so even if kubectl works for you on the command line, it won't run when Jenkins invokes it from the pipeline, because Jenkins doesn't have access to the credentials in your user directory.
So find your .kube config folder, usually in C:/users/, and copy that folder into the $JENKINS_HOME directory. The Jenkins home directory can vary depending on how you installed it (with the Windows installer it ends up in an obscure location inside System32). Once that is done, Jenkins will have access to the same certificates you use natively to run kubectl commands and will have full access.
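An alternative to copying files around is to point kubectl at an explicit kubeconfig from the pipeline itself via the KUBECONFIG environment variable. A minimal sketch, assuming a Windows agent and that the config has been placed under the Jenkins home directory (the path is an assumption; adjust it to your installation):

pipeline {
    agent any
    environment {
        // Assumed location - kubectl will read this file instead of looking in the
        // home directory of whatever account the Jenkins service runs as.
        KUBECONFIG = 'C:\\ProgramData\\Jenkins\\.kube\\config'
    }
    stages {
        stage('Cluster info') {
            steps {
                bat 'kubectl cluster-info --v=99'
            }
        }
    }
}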

Jenkins: 'aws' is not recognized as an internal or external command - Windows 10

Jenkins project - I configured the build step as a batch file with the statement below:
aws s3 cp ./dist/first-ci-project s3://first-ci-cd-project --recursive
but the result is:
'aws' is not recognized as an internal or external command,
But when I run the above command in Command Prompt, it works fine. Please help.
Are you running the aws command on a Jenkins slave? If so, please check whether the AWS CLI is installed on the slave and whether the location of the AWS CLI has been added to the environment variables.
If you are running on the Jenkins master, the same applies: please check that the AWS CLI folder location has been added to the environment variables on the Jenkins master.
I had a similar issue; I had to make sure that the user under which the AWS CLI was installed is the same one that Jenkins uses at runtime.
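A minimal sketch of that check, assuming the default AWS CLI v2 install location on a Windows agent (the path is an assumption; the bat lines are what would go into the freestyle batch step):

pipeline {
    agent any
    stages {
        stage('Upload to S3') {
            steps {
                bat '''
                    rem Prepend the (assumed) AWS CLI folder in case it is not on the PATH
                    rem of the account the Jenkins service runs as.
                    set "PATH=C:\\Program Files\\Amazon\\AWSCLIV2;%PATH%"
                    where aws
                    aws s3 cp ./dist/first-ci-project s3://first-ci-cd-project --recursive
                '''
            }
        }
    }
}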

How to automatically copy artefacts from a Jenkins job to a given network drive?

I'd like the files generated by a Jenkins build to be automatically copied into a given directory on the local network.
Is there a plugin or script for doing that?
In my case it worked by using scp in an "Execute shell" build step. The remote server needs to be accessible via SSH.
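A minimal sketch of that approach, assuming the SSH Agent plugin is installed and that the output directory, server name, target path and credential ID (build/output, fileserver.local, /srv/artifacts, deploy-key) are placeholders for your environment:

pipeline {
    agent any
    stages {
        stage('Copy artifacts') {
            steps {
                // The sh line is what would otherwise go into an "Execute shell" build step;
                // sshagent just supplies the private key stored in Jenkins for the copy.
                sshagent(credentials: ['deploy-key']) {
                    sh 'scp -r build/output/. deploy@fileserver.local:/srv/artifacts/'
                }
            }
        }
    }
}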
If your artifacts are the result of a Maven build, maybe a Nexus Repository Manager is what you are looking for. A simple mvn deploy would do the job.
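If you go that route, the upload itself is a one-liner in the build; a minimal sketch, assuming the project's pom.xml already has a distributionManagement section pointing at your Nexus instance:

pipeline {
    agent any
    stages {
        stage('Deploy to Nexus') {
            steps {
                // Builds the project and uploads the artifacts to the repository
                // configured in distributionManagement.
                sh 'mvn -B clean deploy'
            }
        }
    }
}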

Copy files from Bitbucket via Jenkins to a production server

I have got my code files in Bitbucket and have configured a Jenkins build job to run whenever there is a change in the Bitbucket repository. At the end of the build it has to copy the files from the repo to a directory on the production server from which the application is running.
Is there a way to copy the files from the repo to a server using a script given to Jenkins?
I assume you have the files in the workspace of the job. How about copying the files via the command line? If you want to do so, insert a batch block for Windows nodes or a shell block for Linux nodes and use
cp original_file new_file
You have two possibilities:
Run a slave on the production server
In this case you run a slave on the production server, which connects to your Jenkins master. The slave has to run under a user that is able to write to the directory you want to copy the files to.
Two variations of this possibility:
You can execute the clone (checkout) of the Bitbucket repository on the master and then use stash to make the files accessible on the slave running on the production server (https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#stash-stash-some-files-to-be-used-later-in-the-build); see the sketch after this list.
You run the whole pipeline on the slave running on the production server, which means the production server needs access to Bitbucket.
There are several possibilities to connect a slave to a master: https://wiki.jenkins.io/display/JENKINS/Distributed+builds#Distributedbuilds-Differentwaysofstartingagents
Use remote copy possibilities
You copy the files with e.g. scp on Linux.
This has some security implications:
You have to add the password of the production server to the Jenkins credential store and pass it to the copy command,
or, if using keys (recommended), you have to add the private key to the Jenkins credential store and pass it to the command.
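Here is a minimal sketch of the first possibility (checkout on the master, stash, unstash on a slave running on the production server); the agent labels, repository URL, credential ID and target path are all placeholders:

pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent { label 'master' }   // 'built-in' on newer Jenkins versions
            steps {
                git url: 'https://bitbucket.org/your-team/your-repo.git', credentialsId: 'bitbucket-creds'
                // Make the checked-out files available to other nodes in this build
                stash name: 'site-files', includes: '**/*'
            }
        }
        stage('Deploy') {
            // Assumes a slave labelled 'production' connected from the production server,
            // running under a user that can write to the target directory.
            agent { label 'production' }
            steps {
                unstash 'site-files'
                sh 'cp -r . /var/www/myapp/'   // target path is a placeholder
            }
        }
    }
}

For the second possibility you would instead replace the Deploy stage with an scp call that gets its key or password from the Jenkins credential store (e.g. via sshagent or withCredentials).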
