I want to add a stage with one step to my Jenkinsfile pipeline that uploads an APK to Google Drive and then gets the shareable link of the uploaded file.
Here are some general steps to add a stage that uploads a file to Google Cloud Storage:
1. Install the Jenkins Google OAuth Credentials plugin and the Google Cloud Storage plugin:
https://plugins.jenkins.io/google-oauth-plugin/
https://plugins.jenkins.io/google-storage-plugin/
2. Create a credentials parameter (in Jenkins > Configure) of type "Google Service Account from private key". If you do not have a service account for the project, create one using a credentials JSON obtained as described here: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
3. Create a GCP bucket: https://cloud.google.com/storage/docs/creating-buckets
4. In Jenkins, go to "Pipeline Syntax" and generate the build step; instructions can be found at: https://plugins.jenkins.io/google-storage-plugin/
Example Jenkinsfile upload code (generated by step 4):
googleStorageUpload bucket: 'gs://my-jenkins-bucket-from-part-3', credentialsId: 'my-jenkins-credentials-from-part-2', pattern: 'my_file_to_upload.zip'
Note that you might run into bucket permission issues; configure the permissions correctly, or open the bucket for public read access (not recommended). In GCP, activate Cloud Shell and run:
gsutil defacl ch -u allUsers:R gs://my-jenkins-bucket-from-part-3
gsutil acl ch -u allUsers:R gs://my-jenkins-bucket-from-part-3
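For context, here is a minimal sketch of a complete stage built around the step from part 4 (the bucket and credentials IDs are the placeholders from parts 2 and 3, the APK file name is hypothetical, and the echoed link format assumes the bucket allows public reads as configured above):

pipeline {
    agent any
    stages {
        stage('Upload APK') {
            steps {
                googleStorageUpload bucket: 'gs://my-jenkins-bucket-from-part-3',
                        credentialsId: 'my-jenkins-credentials-from-part-2',
                        pattern: 'app-release.apk'
                // Publicly readable GCS objects are served at this well-known URL scheme
                echo 'Shareable link: https://storage.googleapis.com/my-jenkins-bucket-from-part-3/app-release.apk'
            }
        }
    }
}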
I have a Jenkinsfile to deploy my application into an EKS cluster. On the Jenkins side, I installed the AWS Credentials plugin and added my secret key and access key values as a Jenkins credential.
When I run the Jenkins build, the deployment stage fails with the error below:
Unable to connect to the server: getting credentials: exec: executable aws not found
It looks like you are trying to use a client-go credential plugin that is not installed.
I faced a similar issue and found it was a PATH settings issue: basically, aws is not found in the PATH. What you can do is add "env" to the code and check which PATH values appear in the console output. To set the PATH correctly, go to:
Manage Jenkins -> Configure System -> Global properties -> Environment variables: name=PATH, value=<your path> (e.g. /usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin)
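For a pipeline job, a rough equivalent sketch looks like this (the /usr/local/bin path below is only an assumption; run "which aws" on the agent to find the real location first):

pipeline {
    agent any
    environment {
        // Assumption: the aws binary lives in /usr/local/bin on the agent
        PATH = "/usr/local/bin:${env.PATH}"
    }
    stages {
        stage('Debug PATH') {
            steps {
                sh 'env | sort'    // inspect the PATH the agent actually sees
                sh 'aws --version' // fails here if aws is still not on the PATH
            }
        }
    }
}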
I'm a little bit of a newb with this CI/CD container stuff, so please correct me anywhere I'm wrong.
I can't seem to find out how to send my npm build files, created on my Jenkins instance (workspace), to a remote server. I have a pipeline that successfully pulls in my GitHub repo and does all my fun npm stuff (npm install, test, build). I see my build dir in my Jenkins instance /workspace.
My environment is as follows: we have a server where Docker (with Portainer) is installed. Jenkins is running in a container with a volume mounted (my React build dir goes there). There are no issues with the pipeline or building; I just can't figure out how to push my artifacts from my Jenkins workspace directory to my 'remote' dev server.
I can successfully open a console in my Jenkins container (via Portainer, as the jenkins user) and scp files from the workspace directory using my remote server creds (but a password is necessary).
I installed and used "Publish Over SSH" Jenkins plugin and get a successful "Test Configuration" from my setup.
I created my RSA keys on the REMOTE machine (that I'm trying to push my build files to).
I then pasted the private key (created without a password) into the plugin at the 'Use password authentication, or use a different key' section. Again, I get a successful test connection.
In my pipeline, the last step is deploying, and I use this command:
sh 'scp -r build myusername@xx.xx.xx.xx:/var/files/react-tester'
I get a 'Permission denied (publickey,password).' error. I have no password associated with the RSA key. I tried both ways: creating the RSA key on the remote machine as my remote user, and on the Jenkins machine as the jenkins user. I've read examples of people creating the keys both ways, but I'm not sure on which user/machine combo to create the keys, or which section of the 'Publish Over SSH' plugin to paste them into.
I'm out of ideas.
First, go to "Manage Jenkins" > "Credentials" and add a new credential of type "SSH Username with private key", filling in the "Username" and private key fields (generate a key pair first if you haven't done so yet; you can also upload one). Don't forget that you have to copy the generated public key into the ~/.ssh/authorized_keys file of that user on the remote server.
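If you have shell access to the machine holding the key pair, copying the public key over can be as simple as this (assuming the default key location, and the user and host from your scp command):

# run on the machine where the key pair lives; appends the public key
# to ~/.ssh/authorized_keys on the remote server
ssh-copy-id -i ~/.ssh/id_rsa.pub myusername@xx.xx.xx.xx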
I'm assuming you're using a declarative or scripted pipeline here. In your code, after you've built your application, you can push it to your server by adding this stage to your pipeline:
pipeline {
    stages {
        stage("Pushing changes to remote server") {
            steps {
                script {
                    def remote_server = "1.2.3.4"
                    withCredentials([sshUserPrivateKey(credentialsId: 'my-key', keyFileVariable: 'SSH_KEY', passphraseVariable: '', usernameVariable: 'SSH_USERNAME')]) {
                        // -r because build/ is a directory; SSH_KEY and SSH_USERNAME come from the credentials binding
                        sh "scp -r -i \${SSH_KEY} build/ \${SSH_USERNAME}@${remote_server}:/var/files/react-tester/"
                    }
                }
            }
        }
    }
}
Best regards.
I have a simple Jenkins job that just runs aws ssm send-command, and it fails with:
"An error occurred (AccessDeniedException) when calling the SendCommand operation: User: arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc is not authorized to perform: ssm:SendCommand on resource: arn:aws:ssm:us-east-1:1234567890:document/my-document-name"
However, the IAM permissions are correct. To prove it, I directly SSH onto that instance and run the exact same ssm command, and it works. I verify it's using the instance role by running aws sts get-caller-identity and it returns arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc which is the same user mentioned in the error message.
So indeed, this assumed role can run the command.
I even modified the Jenkins job to run aws sts get-caller-identity first, and it outputs the same user JSON.
Does Jenkins do some caching that I am unaware of? Why would I get that AccessDeniedException if the jenkins-live user can otherwise run the command?
First, install the AWS Credentials and AWS Steps plugins and register your AWS access key and secret access key in the Jenkins credential store. The next steps then depend on whether you're using a freestyle or a declarative/scripted pipeline.
If you're using a freestyle pipeline: under "Build Environment", click on "Use secret text(s) or file(s)" and follow the next steps. After that, you'll have your credentials available as variables in your pipeline.
If you're using a declarative/scripted pipeline: enclose your aws calls in a withAWS block, something like this:
withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
    // any aws cli call in this block runs with the bound credentials
}
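For instance, wrapping the failing call from the question would look roughly like this (the region, document name, and instance ID are copied from the error message; the credentials ID is whatever you registered in the credential store):

node {
    withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
        // runs with the bound credentials instead of whatever the agent's instance role provides
        sh 'aws ssm send-command --document-name my-document-name --targets Key=instanceids,Values=i-1234567890abc'
    }
}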
Best regards.
I have a set of automated tests that are run in a Jenkins pipeline; the test code is located in GitLab.
For pulling the code from GitLab, I use GitLab credentials that were already present (since other projects use the same GitLab credentials).
I use a simple Jenkinsfile, located in the test codebase, to run the script. This is roughly how it looks:
agent {
    kubernetes {
        defaultContainer 'jnlp'
        yaml """
apiVersion: v1
kind: Pod
metadata:
  labels:
    application: auto_ish
spec:
  containers:
  - name: node
    image: node:12.14.1
    command:
    - cat
    tty: true
"""
    }
}
stages {
    stage('Build') {
        steps {
            container('node') {
                sh '''
                npm install
                '''
            }
        }
    }
    stage('Test') {
        steps {
            container('node') {
                sh 'node_modules/.bin/wdio ./test/config/wdio.conf.acc.js --suite SmokeTest --mochaOpts.grep=@smoke'
            }
        }
    }
}
My problem:
The codebase of my automated tests has recently been moved to GitHub, and I'm having trouble getting it to work in Jenkins. For GitHub I have a personal access token that I need to use, so no private key like I had for GitLab. I have tried adding this token to the credentials manager, but after adding it, it doesn't show up in the dropdown.
I followed some walkthroughs that told me to install GitHub plugins for Jenkins and then set my personal access token in the Jenkins configuration (the GitHub server settings).
I tested the connection, and it worked.
From here on, I have no idea how to proceed. I just want to pull the code from the codebase to run the project. There is no need to trigger builds when code is pushed to GitHub, since my tests are triggered when other jobs finish.
But since my credentials are not in the credentials manager, I cannot just add the new repo here. This also means I cannot refer to my Jenkinsfile here.
I read somewhere that I had to refer to my GitHub project in the job configuration. I did this, but I think it will not be enough. I figure I need to pull the code from the new repo somewhere, but I have no idea where.
Which brings me to my question: where and how do I pull the code from my GitHub repo, using the personal access token/GitHub server I specified?
You can configure your Jenkins instance with GitHub with the help of an SSH key.
You just have to create an SSH public/private key pair and paste the public key into GitHub > Settings > SSH and GPG keys > Add public key. Make sure you don't add any extra spaces or newlines. Save and exit GitHub.
Now go to Jenkins:
Start configuring your project, or go to Credentials > System > Global credentials > Add credentials; a page will open.
In the Kind drop-down, select "SSH Username with private key".
Check the private key radio button, then press the Add key button; a textarea will open. Paste your private key into that textarea, making sure you select the whole key, including the Begin and End lines, and without adding any spaces.
Now save. While configuring the project's Source Code Management tab, you will find a credentials drop-down; select the newly configured key (e.g. jenkinsSSH) from it.
Make sure you clone your GitHub repo using SSH, not HTTPS, then build the application; this should work. A checkout sketch follows below.
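As a concrete example, a checkout stage using that credential might look like this (generate the pair with ssh-keygen first if you don't have one; the repository URL and branch here are placeholders):

stage('Checkout') {
    steps {
        // 'jenkinsSSH' is the ID of the SSH credential configured above;
        // note the SSH-style URL rather than HTTPS
        git url: 'git@github.com:org/repo.git', credentialsId: 'jenkinsSSH', branch: 'main'
    }
}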
For more reference, watch this video tutorial:
https://www.youtube.com/watch?v=mGXGIOpKAos&list=PLhW3qG5bs-L_ZCOA4zNPSoGbnVQ-rp_dG&index=9
[Update]
To clone a git repository using a personal access token, you can use the following format:
https://user:token@github.exampleco.com/org/repo.git
For example:
git clone https://user:token@github.exampleco.com/org/repo.git
There is one more question like this one, with a solution that might help you: Git Clone in Jenkins with Personal Access Token idles forever. Please have a look.
After some intense googling I found the answer, which proved to be a lot easier than I thought:
Apparently a personal access token can be used like a password, as far as Jenkins is concerned at least. I added new credentials to the credential manager, chose the type 'username and password', put in a non-existent username ('user'), and put the personal access token in the password field.
This way I could choose the credentials from the dropdown like I did before, and the project was cloned without issues.
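For reference, a checkout using such a credential might look like this in a Jenkinsfile (the credential ID and repository URL are placeholders):

// 'github-pat' is the username/password credential whose password field holds the token
git url: 'https://github.com/org/repo.git', credentialsId: 'github-pat', branch: 'main'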
I'm trying to upload artifacts to an S3 bucket after a successful build, but I can't find any working example to implement into a stage/node block.
Any ideas? (S3 plugin installed, Jenkins v2.32.)
node {
    sh 'echo "" > 1.jar'
    archiveArtifacts artifacts: '1.jar', fingerprint: true
    // upload to s3 bucket ???
}
Detailed steps:
Install the Pipeline AWS plugin: go to Manage Jenkins -> Manage Plugins -> Available tab, filter by 'Pipeline AWS', and install the plugin.
Add credentials as appropriate for your environment. Example:
Jenkins -> Credentials -> System -> Global credentials (unrestricted) -> Add, with Kind = AWS Credentials; add your AWS credentials and note the ID.
Then in your pipeline project (similar to the code I use):
node {
    stage('Upload') {
        dir('path/to/your/project/workspace') {
            echo "Working in ${pwd()}" // log the current directory
            withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                def identity = awsIdentity() // log the AWS identity in use
                // upload files from the 'dist' directory in your project workspace
                s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
            }
        }
    }
}
Looking at the Pipeline Steps documentation on the Jenkins website, it shows that the Pipeline AWS Plugin provides an s3Upload step.
Try this:
s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
I think it is easier to point at the plugin documentation directly: you can find it at https://github.com/jenkinsci/pipeline-aws-plugin. As you are looking for a way to upload files to S3, here are some examples:
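A couple of variants I know work (the bucket, file, and path names are placeholders):

// upload a single file to a specific key in the bucket
s3Upload(file: 'file.txt', bucket: 'my-bucket', path: 'path/to/target/file.txt')
// upload everything under dist/ preserving the directory layout
s3Upload(bucket: 'my-bucket', workingDir: 'dist', includePathPattern: '**/*')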