Currently Travis supports Linux (Ubuntu?) and macOS.
I'm currently exploring Nix. I think it's a powerful way to declare the global state of your system. It's available at various levels:
Nix: package
NixOS: machine
NixOps: deployment
As things stand, I could install Nix, the package manager, on a Linux VM and get all the packages I need. Great!
In my ideal world, I could do the same at the machine level or at the deployment level (one or more machines).
So my questions are: when will Travis CI support NixOS? When will Travis CI support multi-machine setups (say, with NixOps)?
It would be nice to be able to share /nix/store between Travis builds, but for nixpkgs we already try to build changes using the nox tool:
https://github.com/NixOS/nixpkgs/blob/master/.travis.yml
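For illustration only, a .travis.yml along these lines could install Nix in the build VM and try to cache /nix across runs; the ownership handling and the example build are assumptions, not something taken from the nixpkgs file linked above:

```yaml
# Rough sketch: install Nix on the Travis VM and cache /nix between builds.
# The chown step assumes a single-user Nix install owned by the build user.
sudo: required
cache:
  directories:
    - /nix                            # cached store; first run starts empty
before_install:
  - sudo mkdir -p -m 0755 /nix
  - sudo chown "$USER" /nix
  - curl -L https://nixos.org/nix/install | sh
  - source "$HOME/.nix-profile/etc/profile.d/nix.sh"
script:
  - nix-build '<nixpkgs>' -A hello    # placeholder derivation; build your own expression here
```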
Related
Absolute beginner in DevOps here. I have a GitLab repo that I would like to build, and I'd like to run its tests in the GitLab CI pipeline.
So far, I'm only testing locally on my machine with a specific runner. There's a lot of information out there and I'm starting to get lost about what to use and how to use it.
How would I go about creating a container with the tools that I need (VS compiler, CMake, Git, etc.)?
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
How would I use that container in the GitLab YAML file so that I can build my solution and run my tests?
Any specific documentation links or suggestions are welcome and appreciated.
How would I go about creating a container with the tools that I need (VS compiler, CMake, Git, etc.)?
You can install those tools before the pipeline script runs. I usually do this in before_script.
If there are large-ish packages that need to be installed on every pipeline run, I'd recommend that you build your own image with all the required build dependencies, push it to GitLab's container registry, and then just use it as your job image.
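As a rough sketch of where before_script fits (using a Linux image and placeholder packages rather than your Windows/VS toolchain), a .gitlab-ci.yml job could look like this:

```yaml
# Illustrative job that installs its tools in before_script.
# Image and package names are placeholders; a self-built image with the
# toolchain preinstalled would replace the before_script entirely.
build:
  image: debian:bullseye
  before_script:
    - apt-get update -qq
    - apt-get install -y -qq git cmake build-essential
  script:
    - cmake -S . -B build
    - cmake --build build
    - cd build && ctest --output-on-failure
```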
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
If you're using GitLab.com - Windows runners are currently in beta, but available for use.
SaaS runners on Windows are in beta and shouldn’t be used for production workloads.
During this beta period, the shared runner quota for CI/CD minutes applies for groups and projects in the same manner as Linux runners. This may change when the beta period ends, as discussed in this related issue.
If you're self-hosting - set up your own runner on Windows.
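For illustration, a job pinned to the shared Windows runners might look like the sketch below; the tag names follow GitLab's documentation for the beta, but double-check them against the current docs:

```yaml
# Illustrative job targeting GitLab.com's shared Windows runners (beta).
windows-build:
  tags:
    - shared-windows
    - windows
    - windows-1809
  script:
    - echo "running on Windows"
    # the real build would invoke MSBuild/CMake here, once the Windows-only
    # SDK is available on the image or installed in a before_script step
```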
How would I use that container in the GitLab YAML file so that I can build my solution and run my tests?
This really depends on:
the previous parts (whether you're on GitLab.com or self-hosted)
how your application is built
what infrastructure you have access to
What I'm trying to say is that I can't give you a good answer without quite a bit more information.
This is not about a code fix, but about a general approach to test automation.
I have test automation written in JavaScript which runs perfectly on my machine as well as on my local Jenkins.
Now I want to use my company's server (CentOS) and Jenkins so that it is accessible to everyone in my organization.
Issue: the Node.js version on the company's server needs updating to run my automation, but the server team won't do it, since they aren't sure whether functionality used by other teams might start to break because of the upgrade.
Have you faced this situation? Do you have different servers for core code and automation scripts? Please suggest.
This is a complex situation that really depends on many variables. I would recommend using an agent that contains the proper version of Node.js. With this solution you can leave the current build server as it is, yet still use the exact version of Node you need. This will require an extra server/VM with the Jenkins slave software, but it removes the need to change the master server.
The solution my company went with is Jenkins 2.x with declarative pipelines and ephemeral Docker containers for builds. This allows you to use any Docker image, such as the official Node image. You can pin a version and build with that. With this there is no need to worry about the version on the server; the Jenkins master doesn't even need to do the builds itself.
I'm trying to convince my Configuration Manager to use Jenkins with RTC (which we already use for source control, manual builds, and many other things), as my boss asked.
I'd like to know why to use Jenkins with RTC. I know RTC can do continuous integration, but what can Jenkins add to it?
RTC doesn't have a build engine in itself. It has a build engine toolkit which abstracts the actual build engine (BuildForge, Hudson, Jenkins, ...).
If you want to do continuous integration from RTC, you need a build engine.
The official one is IBM BuildForge (not free).
Since RTC 4, Jenkins (free) is also officially supported, with an RTC plugin for Jenkins to install on the Jenkins side (as well as the Build System Toolkit in the RTC downloads).
What Jenkins can add is:
a greater familiarity with the tool
a great list of plugins
the possibility to experiment locally (even without RTC)
good connectivity with other source control tools (SVN, Git) in addition to Jazz RTC.
I have Jenkins running on a CloudBees Fedora 17 node. I need my job to be able to install certain packages to build my project correctly for deployment, but my yum install commands fail because the jenkins user does not have the correct permissions.
I cannot SSH into the box or use the Jenkins CLI to grant root permissions for sudo, and CloudBees doesn't appear to enable the Script Console. Nor can I run the yum command with su, because it expects the administrator password, which I cannot enter remotely.
What can I do?
I am not aware that you can install additional software on the CloudBees Jenkins master node. But you can ask CloudBees support to install additional software packages on the automatically created CloudBees build nodes.
As an alternative, you can also create your own build nodes (called OPE in CloudBees). This is helpful in a lot of cases, such as specific software requirements (for example closed-source software which requires a license), or just to be much more flexible about which packages are installed on the node.
Since you didn't mention what kind of packages are missing: there are a lot of frameworks that provide many more Ruby, Java, Python, Go, etc. versions than any Linux distribution does. CloudBees provides documentation for a lot of them at http://dev-at-cloud-docs.cloudbees.com/docs/dev-at-cloud-docs-1.1/Build+Tools.html
This is the problem with hosted solutions like the one CloudBees offers. If you need operating-system-level permissions on the host server, then you have little choice but to host Jenkins yourself, or to obtain a different licensing structure with CloudBees in order to have a VPS or some other isolated but still SaaS-hosted solution.
I am setting up the build system for a team that produces APIs used on several platforms and architectures. A lot of work has already gone into setting up Ant to build all of the Java code, so I would prefer to stick with Ant if possible.
Where I am stumped is how to build the C++ software. Here are the platforms and languages I need to support:
Java - Linux - 32bit & 64bit: Ant
Java - Windows - 32bit & 64bit: Ant
C++ - Linux - 32bit & 64bit: Ant w/CppTasks (question #1)
C++ - Windows - 32bit: (question #2)
Note: C++ on Windows is MS Visual Studio C++ projects.
I think the answer to question #1 is CppTasks because that seems to be the standard way to build C++ from Ant.
For question #2, I could also use CppTasks, but the developers will be compiling in Visual Studio, so it seems beneficial to use their Visual Studio project for building, which means calling MSBuild from Ant.
Has anyone tried this before and has a good solution for building Java & C++ on both Linux and Windows?
Do you use a Continuous Build System like Jenkins?
With Jenkins, your builds can be automatically triggered by check in/commit, time of day, and/or on command. The great thing about Jenkins is that you can have it automatically build all of the various versions of your software.
For example, you'll probably need to run make for C++ on Linux but use MSBuild on Windows systems, and you'll need to trigger a build on a Linux machine and one on a Windows machine. Jenkins can be set up to do this automatically. When a commit happens, all your various builds on all of your systems can be triggered at once. Then, you can store the builds you need on Jenkins and have your users pull the artifact they need from the project they need.
There are so many ways this could be set up, but the easiest is to simply create four separate jobs (one for Java 32-bit, one for Java 64-bit, one for C++ on Linux, and one for C++ on Windows). You don't necessarily need a separate Windows Java build (at least in theory), but there's nothing stopping you.
You can have a single Jenkins server run "slave" jobs on other build systems, so you could have Jenkins live on the 64-bit Linux system, but use a 32-bit Linux system as a slave to do the 32-bit build, and call a Windows slave to do the Visual Studio build. That way, all of your jobs are located in a central place, but you can use the environments you want.
If you've never used a Continuous Build system, download Jenkins and play around with it. It's free and open source, and very, very easy to use. You can run it on any machine that has a JDK or JRE 1.6. If you download the Windows version, it even comes with the JRE already built in.
Your best bet is to use a continuous build system and let it handle the mess. By the way, there are also Bamboo, CruiseControl, and Hudson (the project Jenkins was split from a few months ago).
TeamCity should fit the bill very well. It supports Ant and MSBuild natively and has a pretty good cross-platform story (it's written in Java but has excellent integration with, e.g., Windows).
I don't see any benefit in wrapping your Windows MSBuild-based builds in yet another build system.
The list for this looks a little different, in my opinion:
Java - Maven for all platforms
C++ - maybe Maven as well (check http://duns.github.com/maven-nar-plugin/).