Is there any way I can integrate Jenkins with Chef Server so that I can create jobs to be executed? I don't want to have to go to the Chef Workstation.
Do I need to install Jenkins on a separate server, or on the Chef workstation?
Can I execute all the recipes directly from the Jenkins console?
Yes. You can create jobs in Jenkins and integrate them with Chef. Knife search would be the best option; you can run chef-client through Jenkins to execute recipes on any machine.
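A rough sketch of what such a Jenkins shell build step could look like, assuming knife is already configured on the Jenkins machine (the role name and SSH user below are just examples):
# List the nodes that match a search query
knife search node 'role:webserver' -i
# Trigger a chef-client run on every matching node
knife ssh 'role:webserver' 'sudo chef-client' -x deploy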
Look into this chef-plugin. It might help. I have not tried it.
I hope this helps.
We managed to get this working, except for automatic deployment (I am still researching whether such a thing exists yet).
What I recommend is having a Jenkins slave connect to your master (e.g. using the Swarm plugin). This slave could be a machine (e.g. Ubuntu Server) with QEMU/KVM virtualization, Ruby, Vagrant (or another tool), Chef Client, Test Kitchen, RSpec (ChefSpec) and Foodcritic. The Chef Development Kit bundles many of these.
What the build does:
Check out your Chef repository
Run `kitchen test all` (or only a subset of your suites)
If the status is 0, execute `rspec --format doc` (from your cookbook directory)
If the status is 0, execute `foodcritic -f`
If the status is 0, upload your cookbook with `knife upload cookbooks/my_cookbook` (sketched below)
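A minimal sketch of those steps as a single shell build step in Jenkins (the cookbook name and paths are just examples, and `set -e` makes the build abort at the first failing command):
#!/bin/bash
set -e                                    # stop at the first non-zero exit status
kitchen test all                          # converge and verify every suite
(cd cookbooks/my_cookbook && rspec --format doc)   # ChefSpec unit tests
foodcritic -f any cookbooks/my_cookbook   # lint, failing on any warning
knife upload cookbooks/my_cookbook        # only reached if everything above passed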
What we haven't figured out yet:
Updating (bumping) the cookbook version when doing a release
The approach of checking exit statuses with shell scripts in the Jenkins build is still quite error-prone
More resources:
Foodcritic CI
Try these Jenkins plugins:
https://github.com/zhelyan/jenkins-chef-api
together with
https://github.com/zhelyan/jenkins-chef-wrapper
My development setup is such that for every SVN check-in the code is built, unit tested, packaged and published to Artifactory. Now I want to automate my deployment process and run integration (Selenium) tests as part of it. I am thinking of using Puppet to manage the deployment.
Is Puppet the correct tool for this?
What process should I use to trigger the Puppet master to initiate a fresh installation on the agents? I couldn't find any Jenkins plugin that would actually trigger Puppet. One option is to call
puppet apply ...
as a Jenkins post-build task.
Any suggestions welcome, thank you.
Have a look at this Selenium Jenkins article from Saucelabs, a service that automates cross-browser testing. Though they are a vendor with a service to sell, the article covers how to do Selenium testing yourself with Jenkins. It also exposes common pain points you are likely to run into with this approach.
A Puppet master doesn't serve the function of orchestrating client convergences. Take a look at MCollective. This is a tool that will allow you to trigger Puppet runs on target systems from a Jenkins agent via script commands.
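For example, with the MCollective puppet agent plugin installed on your nodes, a Jenkins shell step could look roughly like this (the fact filter and hostname are placeholders):
# Check which nodes are reachable
mco ping
# Kick off a Puppet run on every machine matching a fact
mco puppet runonce -W role=appserver
# ...or on a single named node
mco puppet runonce -I web01.example.com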
Some MCollective getting-started material:
http://www.slideshare.net/PuppetLabs/presentation-16281121
http://puppetlabs.com/mcollective
I'm running Jenkins on one server and want to use Chef to automatically install a snapshot (including runtime artifacts, etc.) on a separate server.
Currently Jenkins uses SSH to invoke Chef on the separate machine. Is there a better way?
Maven is also involved in this.
I've found that the majority of "Deploy"-type plugins are lacking in customization. We use "Execute" (bash or batch) build steps to trigger deployment scripts on remote machines (written in-house, be they Puppet, Chef, or plain bash/batch).
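As a sketch, such a build step often boils down to a single remote call like this (the user, host, and script path are invented for illustration):
# Hand the Jenkins build number to a deployment script on the target machine
ssh deploy@app-server-01 "sudo /opt/deploy/deploy.sh $BUILD_NUMBER"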
The correlation between builds and deployments is achieved through "Promotions" and explained in detail here:
How to promote a specific build number from another job in Jenkins?
I am brand new to OpenStack and the Chef tools.
I am trying to set up a continuous delivery process where I imagine something like the following:
From Jenkins, create a pipeline with the following jobs:
Job 1: compiles, runs unit tests and static analysis, and deploys the RPM build artifacts to Artifactory.
Job 2: downloads the RPM files from Artifactory and saves them together in a Yum repository.
Job 3: cleans and recreates the lab infrastructure in OpenStack (routers, private networks, nodes with a clean image). After that, cleans and re-registers those nodes in the Chef Server, specifying the run-list of cookbooks each node will have.
Job 4: runs functional and integration tests using the infrastructure created in Job 3, and publishes the results.
My doubt is how to implement Job 3. The way I see to implement it is to use the OpenStack command-line tools (nova and neutron) in the Jenkins configuration, and likewise the knife and chef-client commands for Chef (roughly as sketched below), but for all of that I would need access to the OpenStack controller server and to all the Chef nodes.
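To make that concrete, the command-line approach for Job 3 would be roughly along these lines (every name, image, address, and ID here is a placeholder, and the exact flags depend on your OpenStack and Chef versions):
# Recreate the lab network and a clean node in OpenStack
neutron net-create lab-net
nova boot --image centos-clean --flavor m1.small --nic net-id=$LAB_NET_ID lab-node-01
# Remove the stale node from the Chef Server, then bootstrap the fresh one with its run-list
knife node delete lab-node-01 -y
knife client delete lab-node-01 -y
knife bootstrap 10.0.0.5 -N lab-node-01 -r 'role[application]' -x centos --sudo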
Is there a tidier way to implement this than just using command lines, such as Jenkins plugins, Chef recipes, or something else?
What I don't like about putting this in the Jenkins configuration is that it is not under version control. I would like something like Chef recipes that perform all the OpenStack and Chef infrastructure setup and have those recipes under version control, but I am not sure how to implement all this with recipes, or how they would then be applied from Jenkins.
Is the idea I have correct, or are there other ways to implement this approach?
Thank you for the help.
For provisioning and orchestrating application infrastructure, I would recommend using Heat. A single YAML file describes your desired application environment.
The OpenStack documentation describes how Nova servers can be configured with Chef at boot time using cloud-init.
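With that, Job 3 in the pipeline above could shrink to a couple of CLI calls against a version-controlled Heat template; a sketch, with the template and stack names made up (the --wait flags need a reasonably recent openstackclient):
# Tear down the previous lab environment and rebuild it from the template in your repo
openstack stack delete --yes --wait lab-stack
openstack stack create --wait -t lab_environment.yaml lab-stack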
Hope this helps
Also consider using CloudMunch, which integrates with OpenStack to provide continuous delivery and deployments.
Disclaimer: I work at CloudMunch.
I have Jenkins, Artifactory and 3 environments (development, test and production).
When a developer commits something from the development environment, it's compiled and tested in the test environment, and the builds and artifacts are stored in Artifactory.
Now I want to go the next step and, with the help of Puppet, manage the environments and deploy the artifacts from Artifactory to the production environment.
But I need some hints to start:
Where is the best place to install Puppet? On the same server as Artifactory, since they work together? Puppet configures the environments as well, so I am not sure whether there is something that has to be considered.
Are there any configurations I need to keep in mind before or during the installation of Puppet, especially in the context of Artifactory and Jenkins?
Thanks for any hints or help.
Disclaimer: This is just how I do it :)
I don't see any benefit to putting Artifactory on the same server as the Puppet master, and it seems like a bad idea.
In Artifactory I have a virtual repository that includes only the product artifacts I care about. I have a separate generic web server that hosts various things that Puppet nodes, and sometimes people in our company, need to download. That server is also a forward proxy to the virtual repo in Artifactory.
The local web server syncs to an external AWS server nightly. Internally, nodes that need to download bits get them from our local server. Externally, they download from the cloud server (really an auto-scaling cluster).
This makes it fairly straightforward to write Puppet manifests/custom types that can download, MD5-check, and install artifacts on nodes. Even slicker for Linux would be to build packages, but I don't currently.
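The shell equivalent of what such a manifest or custom type ends up wrapping is roughly the following (the URL, checksum, and target path are invented for illustration):
# Abort if the download or the checksum verification fails
set -e
# Fetch the artifact from the local mirror / Artifactory virtual repo
curl -fsSL -o /tmp/myapp.war http://repo.internal.example.com/myapp/1.2.3/myapp-1.2.3.war
# Verify the checksum before touching the running application
echo "0123456789abcdef0123456789abcdef  /tmp/myapp.war" | md5sum -c -
# Install only after the checksum matched
cp /tmp/myapp.war /opt/tomcat/webapps/myapp.war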
I also use Foreman as a Puppet ENC. Versions of software are configured as Foreman parameters at a global, group, and, if need be, node level. To deploy a new version of an application WAR, someone just logs into Foreman, sets the parameter, and waits for Puppet to do its job (or logs into the node and forces a Puppet run if they need to).
Hope that gives you some ideas.
To start with, your first decision needs to be whether you want to run Puppet in master-agent (client/server) mode or in "masterless" mode. The getting-started documentation over at puppetlabs.com is solid, and it is worth following the tutorials.
Regardless of your decision, you will need to install Puppet on every server that runs Puppet: the same binary provides master, agent, apply, etc.
Puppet 101 masterless example, assuming a Red Hat-ish OS:
# Install puppet
sudo yum install puppet -y
# Create basic manifest
echo "notify {'hello world':}" > hello.pp
# Run masterless scenario:
sudo puppet apply hello.pp
Expected result:
notice: hello world
notice: /Stage[main]//Notify[hello world]/message: defined 'message' as 'hello world'
notice: Finished catalog run in 0.03 seconds
We're trying to set up Jenkins, but we are having a couple of issues.
We have a "Jenkins Server" (Master) and have connected it to Fisheye. Jenkins is able to get the Git repo and run the tests.
Is there some kind of built-in capability in Jenkins that lets it SSH into a server and run commands like "git pull origin master"?
Yes, there are SSH capabilities in Jenkins. You can add a build step for either running SSH commands or sending files over SSH (you have to define the target server in the configuration). There's also a post-build plugin for sending artifacts over SSH, which can also be used to execute remote commands.
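For the "git pull" case from the question, the remote command boils down to something like this (user, host, and path are placeholders; with the SSH build step or the post-build plugin you would enter only the part after the ssh prefix as the remote command):
# Update the checkout on the target server
ssh deploy@web-server-01 "cd /var/www/myapp && git pull origin master"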
I'd recommend a book by John Smart which covers Jenkins setup. It's at http://www.wakaleo.com/books/jenkins-the-definitive-guide
I think you want the SSH Plugin for Jenkins. This will let you define SSH servers in your global configuration, and then define commands to be run before and after the build.