I am migrating from Jenkins 1.x to Jenkins 2 and want to build and deploy an application using a Jenkinsfile. I am able to build a Gradle application, but I am confused about deploying the application via AWS CodeDeploy from the Jenkinsfile.
Here is my Jenkinsfile:
node {
    stage('Checkout') {
        // Get some code from a GitHub repository
        git branch: 'master',
            credentialsId: 'xxxxxxxx-xxxxx-xxxxx-xxxxx-xxxxxxxx',
            url: 'https://github.com/somerepo/someapplication.git'
    }
    stage('Build') {
        // Run the Gradle build
        sh '/usr/share/gradle/bin/gradle build -x test -q buildZip -Pmule_env=aws-dev -Pmule_config=server'
    }
    stage('Deploy via CodeDeploy') {
        // Run using the CodeDeploy agent
    }
}
I have searched many tutorials, but they all use the AWS CodeDeploy plugin instead.
Could you help me deploy the application via AWS CodeDeploy using a Jenkinsfile?
Thank you.
Alternatively, you can use AWS CLI commands to do the deployment. This involves two steps.
Step 1 - Push the deployment bundle to the S3 bucket. See the following command:
aws --profile {profile_name} deploy push --application-name {code_deploy_application_name} --s3-location s3://{s3_file_path}.zip
Where:
profile_name = name of AWS profile (if using multiple accounts)
code_deploy_application_name = name of AWS code deployment application.
s3_file_path = S3 file path for deployment bundle zip file.
Step 2 - Initiate code deployment
The second command is used to trigger the deployment. See the following command:
aws --profile {profile} deploy create-deployment --application-name {code_deploy_application_name} --deployment-group-name {code_deploy_group_name} --s3-location bucket={s3_bucket_name},bundleType=zip,key={s3_bucket_zip_file_path}
Where:
profile = name of your AWS profile (if using multiple accounts)
code_deploy_application_name = same as step 1.
code_deploy_group_name = name of code deployment group. This is associated with your code deploy application.
s3_bucket_name = name of the S3 bucket that will store your deployment artefacts. (Make sure the role that performs the deployment has permissions on the S3 bucket.)
s3_bucket_zip_file_path = same as step 1.
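In a Jenkinsfile, the two commands above can be wrapped in the deploy stage. Here is a sketch only: the application name, deployment group, bucket, and source directory below are placeholders, and it assumes the AWS CLI is installed and configured on the Jenkins agent.

```groovy
stage('Deploy via CodeDeploy') {
    // Step 1: push the bundle to S3 and register a new application revision
    sh '''
        aws deploy push \
            --application-name my-codedeploy-app \
            --s3-location s3://my-deploy-bucket/someapplication.zip \
            --source build/distributions
    '''
    // Step 2: trigger a deployment from that revision
    sh '''
        aws deploy create-deployment \
            --application-name my-codedeploy-app \
            --deployment-group-name my-deployment-group \
            --s3-location bucket=my-deploy-bucket,bundleType=zip,key=someapplication.zip
    '''
}
```

If you use a named profile, add `--profile {profile_name}` to each command as in the standalone examples above.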
I'm a little bit of a newb with this CI/CD container stuff, so please correct me anywhere I'm wrong.
I can't seem to find out how to send my npm build files, created on my Jenkins instance (workspace), to a remote server. I have a pipeline that successfully pulls in my GitHub repo and does all my fun npm stuff (npm install, test, build). I see my build dir in my Jenkins instance's workspace.
My environment is as follows. We have a server where docker (with Portainer) is installed. Jenkins is running in a container with a volume mounted (my react build dir goes here). No issues with the pipeline or building etc. I just can't figure out how to push my artifacts from my jenkins workspace directory to my 'remote' dev server.
I can successfully open a console in my Jenkins container (via Portainer, as the jenkins user) and scp files from the workspace directory using my remote server creds (but a password is necessary).
I installed and used "Publish Over SSH" Jenkins plugin and get a successful "Test Configuration" from my setup.
I created my RSA keys on the REMOTE machine (that I'm trying to push my build files to).
I then pasted the private key (created without a password) into the plugin at the 'Use password authentication, or use a different key' section. Again, I get a successful test connection.
In my pipeline the last step is deploying, and I use this command:
sh 'scp -r build myusername@xx.xx.xx.xx:/var/files/react-tester'
I get a 'Permission denied (publickey,password).' error. I have no password associated with the RSA key. I tried both ways: creating the RSA key on the remote machine as my remote user, and on the Jenkins machine as the jenkins user. I've read examples of people creating the keys both ways, but I'm not sure which user/machine combo to create the keys on, or which section of the 'Publish Over SSH' plugin to paste them into.
I'm out of ideas.
First, go to "Manage Jenkins" > "Credentials" and add a new credential of type "SSH Username with private key". Fill in the "Username" field and the private key (generate one if you haven't done it yet; you can also upload one). Don't forget that you have to copy the generated public key into the ~/.ssh/authorized_keys file of the ${SSH_USERNAME} user on the remote server.
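The key-generation and copy steps above can be sketched as follows (the key filename and the remote user/host are placeholders):

```shell
# Generate a passphrase-less keypair for Jenkins to use
ssh-keygen -t ed25519 -N "" -f ./jenkins_deploy_key

# Append the public key to the remote user's authorized_keys,
# e.g. with ssh-copy-id (run from a machine that can already log in):
#   ssh-copy-id -i ./jenkins_deploy_key.pub deployuser@remote-server
cat ./jenkins_deploy_key.pub
```

The private key (`jenkins_deploy_key`) is what you paste or upload into the Jenkins credential.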
I'm assuming you're using a declarative pipeline here. In your code, after you've built your application, you can push it to your server by adding this stage to your pipeline:
pipeline {
    agent any
    stages {
        stage("Pushing changes to remote server") {
            steps {
                script {
                    def remote_server = "1.2.3.4"
                    withCredentials([sshUserPrivateKey(credentialsId: 'my-key', keyFileVariable: 'SSH_KEY', passphraseVariable: '', usernameVariable: 'SSH_USERNAME')]) {
                        // -r copies the directory recursively; note @ (not #) between user and host
                        sh "scp -r -i \${SSH_KEY} build/ \${SSH_USERNAME}@${remote_server}:/var/files/react-tester/"
                    }
                }
            }
        }
    }
}
Best regards.
I'm trying to implement an iOS pipeline on Azure DevOps using Fastlane. I already have Fastlane in my project and successfully deploy beta and pilot versions. My problem is that when I run the script below in the Azure pipeline, it can't get past the match clone part, and therefore can't fetch certificates, provisioning profiles, etc.
P.S.: The iOS_Certificates repo is different from the project repo.
I'm getting a timeout error after 1 hour. I think it is an authentication issue with the certificates repo.
pool:
  vmImage: 'macos-latest'
steps:
- script: |
    fastlane match development --clone_branch_directly --verbose
    fastlane beta
  displayName: 'Build iOS'
Related code in MatchFile:
git_url("git#ssh.dev.azure.com:v3/myteam/myproject/certificates_repo")
storage_mode("git")
type("development")
EDIT: I'm trying to fetch a repo inside the same project in Azure DevOps (not GitHub or somewhere else). I'm getting a timeout error, so there is no specific error even when I run the match command with --verbose.
From your information, you are using an SSH key as the authentication method.
Since you are using macos-latest (a Microsoft-hosted agent) as the build agent, the private key of the SSH key pair does not exist on the build machine.
So it can't authenticate and gets stuck. As you said, it will run for 60 minutes and then be cancelled. I could also reproduce this issue.
You could try to create a self-hosted agent and run the build on it.
In this case, you need to ensure that the private key exists on the machine; then you can authenticate through the SSH key.
On the other hand, you can authenticate with a username and password.
For example (Matchfile):
git_url "https://organizationname#dev.azure.com/organizationname/projectname/_git/reponame"
type "development"
app_identifier 'xxx'
username "member#companyname.com" #This will be the git username
ENV["FASTLANE_PASSWORD"] = "abcdefgh" #Password to access git repo.
ENV["MATCH_PASSWORD"] = "password" #Password for the .p12 files saved in git repo.
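Rather than hard-coding the passwords in the Matchfile, they can be injected from the Azure pipeline. A minimal sketch, assuming you store them as secret pipeline variables named fastlanePassword and matchPassword (both names are placeholders):

```yaml
pool:
  vmImage: 'macos-latest'
steps:
- script: |
    fastlane match development --verbose
    fastlane beta
  displayName: 'Build iOS'
  env:
    FASTLANE_PASSWORD: $(fastlanePassword)   # password to access the git repo
    MATCH_PASSWORD: $(matchPassword)         # password for the .p12 files in the repo
```

Secret variables are not exposed to scripts automatically, which is why they are mapped explicitly in the env block.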
I am new to Jenkins. I have a project that runs Django and React on the same port on an AWS EC2 machine. My database runs on RDS. Now I am trying to implement a pipeline and automatic deployment using Jenkins. All the tutorials I have been reading or watching suggest using the AWS CodeDeploy service.
This is my sample Jenkinsfile, which currently tests my React code. As the main intention for now is automatic code deployment, I am not thinking about testing Python here. I just want my code to be deployed to EC2 automatically.
pipeline {
    agent any
    stages {
        stage('Install dependency') {
            steps {
                sh "yarn install"
            }
        }
        stage('Test project') {
            steps {
                sh "yarn test"
            }
        }
        stage('Build project') {
            steps {
                sh "yarn build"
            }
        }
    }
}
I know about the AWS CodeDeploy plugin for Jenkins, but it uses S3 and AWS CodeDeploy. What I am really trying to achieve is something like this:
I will build my project in Jenkins, and it will send my project automatically to the EC2 machine and make it live.
OR
Can I connect to my EC2 instance and do the same things we do manually? Like: log in to the instance, fetch the code from git, merge it, build it, restart the service.
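Something like this sketch of the second approach, assuming the SSH Agent plugin, an SSH credential ID, and host/paths/service names of my own (all placeholders):

```groovy
stage('Deploy to EC2') {
    steps {
        // 'ec2-ssh-key' is a hypothetical Jenkins SSH credential ID
        sshagent(credentials: ['ec2-ssh-key']) {
            sh '''
                ssh -o StrictHostKeyChecking=no ubuntu@my-ec2-host '
                    cd /home/ubuntu/myproject &&
                    git pull origin master &&
                    yarn install &&
                    yarn build &&
                    sudo systemctl restart myapp
                '
            '''
        }
    }
}
```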
I'm trying to create a Jenkins pipeline or group of items to help me build a custom CI/CD for my projects, and right now I'm stuck at the deploy part: I want to deploy on the same server that my Jenkins is running on (Windows Server/IIS). I would also like to know how to deploy to another server (Windows Server/IIS); this second one would be my production environment.
I have managed to clone, build and archive using two approaches with Jenkins:
Pipelines
I have managed to create a pipeline that will clone my project, execute my build and then archive the artifacts from my build. The problem is: how do I deploy the artifact now?
This is my pipeline script:
node {
    stage('git clone') {
        // Get some code from a GitHub repository
        git 'my-git-url'
    }
    stage('npm install') {
        bat label: 'npm install',
            script: '''cd app
npm install'''
    }
    stage('gulp install') {
        bat label: 'gulp install',
            script: '''cd app
npm i gulp'''
    }
    stage('gulp production') {
        bat label: 'gulp production',
            script: '''cd app
gulp production'''
    }
    stage('create artifact') {
        archiveArtifacts artifacts: 'app/dist/**',
            onlyIfSuccessful: true
    }
}
Freestyle projects
I have managed to create a project that will build and then archive the artifact using an "Execute shell" build step and the "Archive the artifacts" post-build action. How can I deploy the artifact using this approach? In this case I'm trying to trigger a second freestyle project to execute the deploy.
According to your question, "I want to deploy on the same server that my Jenkins is running (Windows Server/IIS)", and the comments, I will suggest some approaches.
Windows
Using Windows as the operating system for production environments is not recommended; Linux is the better choice.
IIS
I don't recommend IIS for serving static assets. You need something lighter and more scalable. You could use:
nodejs with pm2 (https://expressjs.com/en/starter/static-files.html)
nginx (https://medium.com/@jgefroh/a-guide-to-using-nginx-for-static-websites-d96a9d034940)
apache (http://book.seaside.st/book/advanced/deployment/deployment-apache/serving-files)
docker
Deploy on IIS
Deploying static assets on IIS is just copying the files into some folder and pointing the IIS configuration to that folder:
https://www.atlantic.net/hipaa-compliant-hosting/how-to-build-static-website-iis/
Basic deploy on IIS using Jenkins
After your build commands, you just need to copy the build results (css, js, html, etc.) into some folder like C:\webapps\my-app (pre-configured in IIS).
You can do this using a simple shell execution in a freestyle project or a pipeline script like https://stackoverflow.com/a/53159387/3957754
You could use this approach to deploy your static assets on the same server that your jenkins is running.
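A minimal sketch of that copy step as a pipeline stage, assuming the build output lands in app\dist and C:\webapps\my-app is the folder IIS points to (both paths are placeholders):

```groovy
stage('deploy to IIS') {
    // Copy the build results into the folder IIS serves
    bat 'xcopy /E /I /Y app\\dist C:\\webapps\\my-app'
}
```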
Advanced deploy on IIS using Jenkins
Microsoft has a tool called MSDeploy. Basically it is a command-line tool to deploy apps to a remote IIS:
msdeploy.exe -verb:sync -source:contentPath="" -dest:contentPath=""
More details here:
https://stackoverflow.com/a/12032030/3957754
Note: you can't run MSDeploy commands that talk to the MSDeploy service on the same machine.
Jenkins Agent
A Jenkins agent is an application that runs on a remote server, not where the Jenkins master node runs.
https://wiki.jenkins.io/display/JENKINS/Step+by+step+guide+to+set+up+master+and+agent+machines+on+Windows
Your Jenkins master could use an agent on the remote (or local) IIS host and execute jobs with the copy-and-paste approach.
I'm trying to upload artifacts to an S3 bucket after a successful build, but I can't find any working example to implement in a stage/node block.
Any ideas? (S3 plugin installed, Jenkins v2.32.)
node {
    sh 'echo "" > 1.jar'
    archiveArtifacts artifacts: '1.jar', fingerprint: true
    // upload to s3 bucket ???
}
Detailed steps:
Install Pipeline AWS Plugin.
Go to Manage Jenkins -> Manage Plugins -> Available tab -> Filter by 'Pipeline AWS'.
Install the plugin.
Add Credentials as per your environment. Example here:
Jenkins > Credentials > System > Global credentials (unrestricted) -> Add
Kind = AWS Credentials
and add your AWS credentials
Note the ID
Then, in your pipeline project (similar to the code I use):
node {
    stage('Upload') {
        dir('path/to/your/project/workspace') {
            pwd() // Log current directory
            withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                def identity = awsIdentity() // Log AWS credentials
                // Upload files from working directory 'dist' in your project workspace
                s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
            }
        }
    }
}
Looking at the Pipeline Steps documentation on the Jenkins website, it shows that the Pipeline AWS Plugin provides an s3Upload step.
Try this:
s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
I think it is easier to show the direct plugin documentation URL.
You can find the plugin documentation here.
As you are looking for a way to upload files to S3, here are some examples.
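For instance, the single-file form from above and the directory form can be combined in one node block; a sketch where the region, credential ID, bucket, and paths are placeholders:

```groovy
node {
    withAWS(region: 'us-east-1', credentials: 'my-aws-credentials-id') {
        // Upload a single file to a specific key
        s3Upload(file: 'build/libs/app.jar', bucket: 'my-bucket', path: 'releases/app.jar')
        // Upload everything under 'dist', preserving its structure
        s3Upload(bucket: 'my-bucket', workingDir: 'dist', includePathPattern: '**/*')
    }
}
```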