I'm using Jenkins to create a build for my Node.js project. I use Grunt to build my project and I use a plugin called config-leaf so that I can encrypt my Gruntfile prior to putting it in our repo since there are sensitive things in that file.
When Jenkins checks out my code, I need to decrypt the file stored in Git by running npm run decrypt, which invokes the decryption script and prompts me for the password. How can I have Jenkins enter the password when it reaches this point?
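For context, one possible shape of such a build step, purely as a sketch: it assumes the decrypt script accepts the passphrase on stdin and that the password has been injected into the build as an environment variable (for example via the Credentials Binding plugin); neither assumption is confirmed by config-leaf's documentation.

```sh
# Hypothetical Jenkins "Execute shell" build step.
# GRUNTFILE_PASSWORD is assumed to be provided by a credentials binding,
# and the decrypt script is assumed to read the passphrase from stdin.
echo "$GRUNTFILE_PASSWORD" | npm run decrypt
```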
I have a Jenkins job that gets the code from version control and builds it (like a normal pipeline does). After building the project, I download the build and use FTP to transfer it to the client's server, where I unzip it and copy the whole build over. Because I copy the whole build, my application's downtime is very high. (I have to use FTP because, as a service provider, we have some limitations and can't change this policy.)
What I want is for Jenkins to know what has changed when it builds, so it can create a package containing only the changed files, each with the correct path. I could then download that package, copy it to the server, and apply it so that only what changed gets updated.
Is that possible? Is there any plugin that I can use?
This really depends on the build tool/language you are using to build your application. I don't think there is a generic Jenkins plugin for this.
Another idea would be to upload your package to a local Nexus server. After the next build, download the previous package and compare the files from the old and new builds. With this information you can create a patch package for your client's server.
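A minimal sketch of that comparison step, assuming both the previous and the current package have already been unpacked locally; all paths and names below are placeholders:

```sh
#!/bin/sh
# Hypothetical patch-package step: collect only files that are new or changed
# compared to the previous build, preserving their relative paths.
OLD=previous-build   # unpacked previous package (e.g. downloaded from Nexus)
NEW=current-build    # unpacked current package
PATCH=patch-package

rm -rf "$PATCH"
mkdir -p "$PATCH"

# --compare-dest makes rsync skip files that are identical to the old build,
# so only changed or new files are copied into the patch directory.
rsync -a --compare-dest="$(pwd)/$OLD/" "$NEW/" "$PATCH/"

# rsync still creates empty directories for unchanged paths; prune them.
find "$PATCH" -type d -empty -delete

# Zip the patch so it can be transferred to the client's server via FTP.
( cd "$PATCH" && zip -r ../patch.zip . )
```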
I am trying to set up a fairly simple CI/CD toolchain in TravisCI for a PHP project using Composer libraries, resulting in deployment on a bare-metal server via rsync.
Steps are:
Getting the code from the GitHub repo upon git push.
Run composer install to get the dependencies.
(Perform unit tests / integration tests) - not set up yet
Lint and code-quality steps
Deploy the code to a remote apache server via rsync, using ssh keys.
The toolchain works OK so far, but I can't get my head around how the SQL migrations (in Doctrine or Phinx) can be executed automatically on the remote server.
Is executing doctrine:migrations:migrate via SSH as the last step in the deploy section of TravisCI the best choice, or is there a better option? How do you deploy your migrations?
Thanks a lot
I once deployed to Heroku using Travis.
It was for a project using Laravel.
Because Heroku is sophisticated, I was able to tell it (from its configuration) to migrate the database after the deploy.
However, with a classic rsync target you would need to connect to it from Travis using SSH in order to migrate (if you are as lazy as me and want to automate everything).
According to this doc you can add an after_deploy or after_success step. From this step you would run your ssh commands and migrate your database.
Apparently you can even run commands or a script via ssh so it might not be that hard. Look at the following: https://www.shellhacks.com/ssh-execute-remote-command-script-linux/
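As a rough sketch, the after_deploy step could boil down to a single SSH command like the one below; the user, host, and application path are placeholders, and the exact migration command depends on your setup (Symfony console here, vendor/bin/phinx migrate for Phinx).

```sh
# Hypothetical after_deploy command; deploy@example.com and /var/www/app are placeholders.
# Runs the Doctrine migrations non-interactively on the remote server after the rsync deploy.
ssh deploy@example.com "cd /var/www/app && php bin/console doctrine:migrations:migrate --no-interaction"
```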
You have to pay EXTRA attention to what you put in your GitHub repo in order to avoid security trouble with your rsync server.
You can provide credentials to your Travis job this way or that way.
I already searched a lot for any solution but I couldn't find anything...
I am trying to save the console output of my current build [last build] on my slave server, which executes the build.
I saw that the log file is stored on the master server, so I hope this is possible for my slave, too.
I already tried to parse the HTML page on the master server [http://'myIp'/job/'jobname'/lastBuild/consoleFull] with a Python script, but that didn't work during my build process; it only works after the build is complete.
Is there any way to save the console output on my slave server OR on a network drive?
I want to add this step to my build process, so it would be nice to save the output as a post-build action.
OS: both servers (slave and master) are running on Win7 64bit
Thanks for your help!
Michael
Here is a solution to write the console log into your workspace and copy it to a network drive.
To get a copy of the console log, you can use Console log plugin.
You have to build this plugin from the sources and install it manually from the Manage plugins section :(
Some instructions: https://wiki.jenkins-ci.org/display/JENKINS/Plugin+tutorial
I've installed this plugin on my Jenkins server (LTS 1.625.3) and it works well.
Next, you can add the plugin's post-build step to get the console log.
Finally, you can use the Publish Over CIFS plugin to copy the log file to a network share (with a post-build task).
I was able to build with Jenkins on a build server and get the published files locally.
However, I am unable to deploy these to a different application server when I specify the path '\\192.168.1.51\MyPublishedFiles' in the publish profile.
This is probably because the app server uses login credentials.
I have the login credentials but cannot find a plugin which can help me copy the files to another server.
What strategy / plugin can I use to do the same?
I have an Active Directory plugin installed that allows logging in to the Jenkins portal with my directory credentials by typing them in on the login page.
But is there a way to automatically log in users if they are on a domain-joined machine? (Obviously, given the browser is configured to allow providing credentials to the site.)
There's a thread on the Jenkins Dev group in which a guy mentions that he developed an SSO plugin that worked for him on Windows. He posted it on GitHub under the name NegotiateSSO.
First you need to build it to get the hpi file. (Clone the project to your machine, cd into the directory and run mvn (Maven) in it).
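Roughly, assuming the standard Maven layout used by Jenkins plugins (the repository URL below is a placeholder, not the actual one from the thread):

```sh
# Placeholder URL - substitute the actual NegotiateSSO repository linked in the dev-group thread.
git clone https://github.com/<owner>/NegotiateSSO.git
cd NegotiateSSO
# A standard Jenkins plugin build; the installable .hpi file ends up under target/
mvn clean package
```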
The problem is that when I tried to install it, it broke my Jenkins configuration section/page. There's an exception happening inside the plugin that breaks the entire page.
So it didn't work for me, but hopefully it will get fixed at some point and we'll be able to use it.