Running grunt from Jenkins as `root` user - jenkins

I need Jenkins to execute some shell commands every time someone makes a push. One of them is `grunt prod`, but it only works if I execute it as the root user. If I try to run it as the jenkins user I get the following:
Running "ngAnnotate:production" (ngAnnotate) task
Warning: Unable to write "public/dist/application.js" file (Error code: EACCES). Use --force to continue.
Does anyone have any idea how to solve this?
Thank you!

You can add your Jenkins user (say, jenkins) to /etc/sudoers and allow it to run sudo without asking for a password:
jenkins ALL = NOPASSWD: /bin/sh, /path/to/your/script
Then you can use sudo to run the script:
sudo /path/to/your/script
If you must execute the script as root, I think you should first allow root to log in via SSH, then use the Jenkins SSH plugin to log in as root and execute your script.
Hope this helps.
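A minimal sketch of that first approach, assuming grunt lives at /usr/local/bin/grunt (check with `which grunt`; the path and the exact build-step wording are assumptions, not from the question):

# /etc/sudoers.d/jenkins -- limit the jenkins user to the single command it needs
jenkins ALL = NOPASSWD: /usr/local/bin/grunt

# Jenkins "Execute shell" build step
cd "$WORKSPACE"                    # Jenkins sets WORKSPACE for every build
sudo /usr/local/bin/grunt prod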

Related

How can I change permissions for a folder in Jenkins?

I have Jenkins working on a local server on my Mac and, for a job, I need to run a shell script which needs to read a JSON file. I tried to put it in the workflow-libs folder. But when I run the job for testing, the script returns an error which says:
/Users/****/.jenkins/workflow-libs/testCollections: Permission denied
Build step 'Run a shell script' marked build as failure
Finished: FAILURE
So I need the script to be able to read the JSON file, but I don't know how to give it this permission.
Thank you for helping.
Run the following command to give the permissions:
sudo chown -R <jenkins user>:<jenkins group> /jenkins_root_path
Here, <jenkins user> and <jenkins group> are the user and group you are running Jenkins under.
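If you are not sure which user and group that is, one way to check (assuming Jenkins shows up in the process list under that name):

ps aux | grep -i '[j]enkins'       # first column is the user Jenkins runs as
id -gn <jenkins user>              # prints that user's primary group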
If the jenkins user is a member of the sudoers list:
sudo rm -rf /Users/****/.jenkins/workflow-libs/testCollections
But this can be dangerous.
I would highly suggest you run the Jenkins process as the jenkins user/group and not muck around in there as your own user account.
This will ensure your file permissions are proper and that the jenkins process only has access to the areas it needs as well as letting you spin up new slaves without worrying about permissions and custom settings.
You can always use sudo to become the 'jenkins' user to work with the files.
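For example, to get a shell as that user (a sketch assuming the account is literally named jenkins):

sudo -u jenkins -H bash            # -H sets HOME to the jenkins user's home directory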

Jenkins Pipeline gcloud problems in docker

I'm trying to set up a Jenkins pipeline according to
this article, but using Google Container Registry to push the Docker images to instead.
The problem: the part which fails is this Jenkinsfile stage block:
stage ('Push Docker Image To Container Registry') {
    docker.image('google/cloud-sdk:alpine').inside {
        sh "echo ${env.GOOGLE_AUTH} > service-account-creds.json"
        sh 'gcloud auth activate-service-account --key-file ./service-account-creds.json'
    }
}
The Error:
ERROR: (gcloud.components.update) Could not create directory [/.config/gcloud]: Permission denied.
Please verify that you have permissions to write to the parent directory.
I can't run any command to do with gcloud, as the error above is what I get all the time.
I tried creating the "/.config" directory manually, logged into the AWS instance, and opened up the permissions on the folder to everyone, but that didn't help either.
I also can't find anywhere how to properly set up Google Cloud for a Jenkins pipeline using Docker.
Any suggestions are greatly appreciated :)
It looks like it's trying to write data directly into your root file system directory.
The .config directory for gcloud would normally be in the following locations for username and/or root user:
/home/yourusername/.config/gcloud
/root/.config/gcloud
It looks like, for some reason, Jenkins thinks the parent directory should be /.
I would check where your Cloud SDK config directories are on the machine you are running this on (and for the user the script runs as):
$ sudo find / -iname "gcloud"
And look for location similars to those printed above.
Could it be that the Cloud SDK is installed in a non-standard location on the machine?
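One workaround to try (not from the answer above; CLOUDSDK_CONFIG is the environment variable gcloud reads to locate its configuration directory, and the Jenkins workspace is writable inside the container) is to point gcloud at the workspace in the sh steps of the stage shown in the question:

# run inside the docker.image(...).inside { } block from the question
export CLOUDSDK_CONFIG="$WORKSPACE/.gcloud"   # writable location instead of /.config/gcloud
mkdir -p "$CLOUDSDK_CONFIG"
gcloud auth activate-service-account --key-file ./service-account-creds.json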

Jenkins pipeline job gets triggered as anonymous but not as a user/Admin

A Jenkins Pipeline job doesn't trigger another pipeline job using the Jenkins CLI. When I run Jenkins as anonymous this works, but when I create a user/admin it fails.
I have a job A which has parameters and passes them to the pipeline job. This is a master-slave setup. This is how I run it:
sudo java -jar /home/user/jenkins-cli.jar -s $JENKINS_URL build pipeline_job -p parameter_Name="$parameter_Name" -p parameter_Name2="$parameter2_Name"
1.) I tried using the options "-auth" and "-username -password", but they don't work.
errors:
No such command: -auth
No such command: -ssh
2.) Another option is to paste the public key in the SSH section at http://jenkin_url/me/configure, but it still fails.
error:
java.io.IOException: Invalid PEM structure, '-----BEGIN...' missing
Is there anything I am missing?
I found the solution.
1.) used SSH CLI.
In my case I was using a master-slave environment, with the connection made using SSH keys in both directions. In order to trigger the build using the Jenkins CLI, add your SSH public key at http://jenkinsURL/user/username/configure.
Here, username is the one used to connect the nodes.
Trigger the job as below:
java -jar /home/username/jenkins-cli.jar -s $JENKINS_URL -i /home/username/.ssh/id_rsa build JOBNAME
Note: this is one way, but CloudBees doesn't encourage this approach.
2.) There is a newer approach, i.e., using API token authentication.
Go to http://jenkinsURL/user/username/configure
Copy the API token
Trigger the build as below:
java -jar /home/username/jenkins-cli.jar -s $JENKINS_URL -auth username:apitoken build JOBNAME
Note: for the API token option, download the latest jenkins-cli.jar.
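A related note (this uses the CLI's -auth @file form, which is an addition here, not part of the answer above): the token can be kept out of the command line and shell history by reading it from a file:

echo 'username:apitoken' > ~/.jenkins-cli-auth   # file content is username:apitoken
chmod 600 ~/.jenkins-cli-auth
java -jar /home/username/jenkins-cli.jar -s $JENKINS_URL -auth @$HOME/.jenkins-cli-auth build JOBNAME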

Clone Bitbucket repository without starting ssh-agent

I've set up Jenkins on my VPS and created a job which is set to execute a shell script containing the following command: ssh -T git@bitbucket.org
It turns out that I get a "Permission denied (publickey)." response because ssh-agent is not started. This can be solved by adding these two lines to the shell script:
eval `ssh-agent -s`
ssh-add ~/.ssh/bitbucket_key
However, I don't like having to add these two lines to every Jenkins item when I need to clone a repository. I would expect that if I log in via SSH to my VPS and change to the Jenkins user to execute the two lines myself, this would no longer be necessary. Unfortunately, this is not the case. I can successfully run ssh -T git@bitbucket.org myself, but the Jenkins job still fails without the two extra lines.
Is there a way to avoid this behavior, i.e. a way in which I only have to start the ssh-agent and add my key to it once instead of every time I want to clone a repository? I cannot imagine that it would be a good practice to start (a new) ssh-agent every time I want to clone and build my code.
Use an ssh_config entry in your ~/.ssh/config. It has a simple syntax, as you can read in the manual page for ssh_config. For your case this should be enough:
Host bitbucket.org
  IdentityFile ~/.ssh/bitbucket_key
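The config file has to live in the home directory of the user the Jenkins job runs as. Assuming that user is called jenkins, you can verify the key is picked up without starting an agent:

sudo -u jenkins -H ssh -T git@bitbucket.org     # -H makes ~/.ssh/config resolve to jenkins' home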

Jenkins shell script to git pull to production directory

I have a really simple Jenkins setup where it pulls down updates from BitBucket and runs some tests, which all works nicely.
Jenkins and the testing website are both on the same server, so I want it to head off to my live website directory and pull down the repository that it's just tested.
When I try to access /var/www/vhosts/mysite/httpdocs/whatever/ I get a script error stating that this isn't a directory.
What would be the best way to do this?
Error is as follows:
[Pheme CI] $ /bin/sh -xe /tmp/hudson5490778292870793122.sh
+ cd /var/www/vhosts/mysite.co.uk/httpdocs/
/tmp/hudson5490778292870793122.sh: line 2: cd: /var/www/vhosts/mysite.co.uk/httpdocs/: Not a directory
Build step 'Execute shell' marked build as failure
Finished: FAILURE
Edit: this appears to be a permissions issue; I will update when it's sorted!
Right, simply put, this comes down to the Jenkins user not having access to the directory I needed it to. I simply ran:
chown jenkins <dir>
and it all works fine! There is probably a better way to do this.
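Roughly, the fix plus the deploy step can look like this (the recursive chown, the group name and the branch are assumptions; the path is the one from the error output above):

# one-off: give the jenkins user the live directory
sudo chown -R jenkins:jenkins /var/www/vhosts/mysite.co.uk/httpdocs/

# Jenkins "Execute shell" build step: pull the freshly tested code into production
cd /var/www/vhosts/mysite.co.uk/httpdocs/
git pull origin master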
