How to install (complex) dependencies in Travis-CI?

I would like to set up a documentation CI build, i.e. a build that requires nothing more than AsciiDoc, TeX, XSLT (Saxon), et cetera.
Now I am aware of [1], which states that regular apt commands can be used to install, hopefully, any of these dependencies.
But how do I do so? It appears cumbersome to change .travis.yml, push a build, and start again if there was a typo or other error in the install command.
Thus I was looking into 'travis console' to (somehow) interactively test the dependency setup process - with no luck.
What is the recommended way of setting up dependencies (packages)?
Edit:
The document generation process is driven by a simple hand-crafted Makefile. The Makefile invokes various programs, in particular asciidoc, Python, TeX, dblatex, libxslt, and Saxon. Basic TeX is not enough, as some fancy TeX packages are required as well. The installation of dblatex is naturally cumbersome.
[1] http://docs.travis-ci.com/user/installing-dependencies

If you want to run Travis locally on your own virtual machine, you may want to look at Travis Build. Travis Build allows you to generate the shell script that performs the Travis build. Setting this up is a bit cumbersome and may not be worth it unless you have a very complicated build.
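For reference, a sketch of how that setup worked at the time (an assumption based on the travis-build README of that era; it relies on the Ruby-based travis command-line client):

```bash
# Install the travis CLI and clone travis-build where the CLI looks for it
gem install travis
travis version   # run once so the ~/.travis directory exists
git clone https://github.com/travis-ci/travis-build ~/.travis/travis-build
(cd ~/.travis/travis-build && bundle install)

# From inside your project, generate the shell script Travis would run:
travis compile > ci.sh
bash ci.sh       # execute it on your own VM
```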
The documentation build that you're describing seems relatively straightforward (although you're not giving us many details). I'd say you should be able to put those dependencies together by trial and error.
There's also a middle ground between Travis Build and pure trial and error. Use Vagrant to set up a virtual machine with Ubuntu Precise (the same version Travis uses). Then figure out which packages you need to install (apt-get install ...) to get your build running on the virtual machine. Then replicate those steps in your .travis.yml and you should be good to go.
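A sketch of that workflow (the package names are guesses for the dependencies listed in the question; hashicorp/precise64 is the commonly used Ubuntu 12.04 Vagrant box):

```bash
# Bring up an Ubuntu Precise VM matching Travis's environment
vagrant init hashicorp/precise64
vagrant up && vagrant ssh

# Inside the VM, iterate until the documentation build works.
# These package names are assumptions for asciidoc, dblatex, TeX and Saxon:
sudo apt-get update -qq
sudo apt-get install -y asciidoc dblatex libxslt1-dev \
  texlive texlive-latex-extra libsaxon-java
make   # run the hand-crafted Makefile from the question

# Once it works, replicate the same commands under the install:
# (or before_install:) key in .travis.yml.
```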

Related

BentoML: how to build a model without importing the services.py file?

Is it possible to run bentoml build without importing the services.py file during the process?
I'm trying to put the bento build and containerize steps in our CI/CD server. Our model depends on some installed OS packages and some Python packages. I thought I could run bentoml build to package the model code and binaries that are present, and leave the dependency specification to the containerize step.
To my surprise, the bentoml build process tried to import the service file during packaging, and the build failed since I didn't have the dependencies installed on my CI/CD machine.
Can I prevent this import while building/packaging the model? Maybe I should skip bentoml containerize, create my bento container by hand, and just execute bentoml serve inside it.
I feel that having to install the dependencies by hand duplicates the effort of specifying them in the bentofile.yaml and hurts the reproducibility of my environment.
This is not currently possible. The community is working on an environment management feature, such that an environment with the necessary dependencies will be automatically created during build.
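In the meantime, a common workaround is to install the service's dependencies on the CI machine before building, so the import succeeds. A sketch (the requirements file and bento tag are assumptions, not part of the original question):

```bash
# Hypothetical CI steps, assuming the Python deps are also listed in a
# requirements.txt mirroring bentofile.yaml:
pip install -r requirements.txt          # lets `bentoml build` import the service file
bentoml build                            # packages the model; imports the service file
bentoml containerize my_service:latest   # assumed tag from the build output
```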

yarn workspaces & monorepo & monobuild and how to build only what changed and dependencies?

There are a few things done in monorepos/monobuilds (you can have a monorepo without a monobuild) that make things very nice, but I don't see how yarn workspaces solves them just yet. The main one is this part of a monobuild process (very typical at scale):
git status to figure out which files changed
map those files to projects that have changed
build those projects and projects that depend on those and projects that depend on those
I am a little confused there. As a monobuild scales up, we really want the build time for a server change to be under 3 minutes, while a change to a library that may affect all projects would take a long time, as it builds the entire repo (unless we split the work across different machines, and the build time goes way down again).
Don't think there is necessarily one answer here but a number of things to consider in the context of your project:
If your project is really humongously large, consider something like Bazel, which is a bit complex but allows for incremental building and testing.
There are some specific tools to help with building large projects quickly. For instance, for JavaScript, there are Turborepo and Nx.
Yarn workspaces or npm workspaces can generally help enable better monorepo build processes by allowing us to run build scripts only for a subset of workspaces. They won't solve the problem of figuring out what to build when, though; they just provide the basic building block of running scripts selectively.
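For example (the workspace name is hypothetical; yarn workspaces foreach requires Yarn 2+ with the workspace-tools plugin):

```bash
# Run a script in a single workspace:
yarn workspace @myorg/server run build

# Run a script across all workspaces in dependency order
# (Yarn 2+ workspace-tools plugin; -t is short for --topological):
yarn workspaces foreach -t run build
```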
Finally, a bit of Bash/Git/Makefile magic will probably be required. The following git command, for instance, can help us determine whether files in particular paths have changed since the last commit: git diff --quiet HEAD~1 HEAD -- [paths]. Note, though, that this can create a few annoying edge cases, especially if builds fail and we risk missing out on building projects that we should build.
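A minimal sketch of that approach (the project paths and workspace names are hypothetical):

```bash
# Rebuild only the projects whose files changed in the last commit.
# Comparing HEAD~1..HEAD is a simplification; comparing against the last
# *successful* build's commit avoids the failed-build edge case noted above.
for project in packages/server packages/client; do
  if ! git diff --quiet HEAD~1 HEAD -- "$project"; then
    echo "changes in $project, rebuilding"
    yarn workspace "$(basename "$project")" run build
  fi
done
```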
There are plugins for some CI/CD platforms that wrap the Git commands in a somewhat easier-to-use way. For instance, I have used the GitHub action has-changed-path, and I think there was a plugin for Buildkite too, but I cannot find the link to that.
Generally, I think it will be challenging to have a monorepo setup that avoids installing dependencies for all modules/workspaces and compiling all code. But I think it is possible to scale up to a few hundred thousand lines of code and hundreds of dependencies while keeping install and compile times under 2-3 minutes using TypeScript in Yarn - by making good use of TypeScript project references and something like Yarn Zero-Installs.

Using build system to run tests or interact with clusters

What is the purpose of projects like these below that use Bazel for things other than building software?
https://github.com/bazelbuild/rules_webtesting
https://github.com/bazelbuild/rules_k8s
Are they just conveniently providing an environment for the run command (as opposed to building portable executables), or am I missing something?
The best I can tell is that Bazel could be used to run only a subset of E2E tests based on knowledge of what changed.
Disclaimer: I have only cursory knowledge about k8s and docker.
Bazel isn't just used for building and testing, it can also deploy, as you've discovered with the rules in those projects.
The best I can tell is that Bazel could be used to run only a subset of E2E tests based on knowledge of what changed.
Correct, but this also extends beyond tests to deployments. If you've only changed a single string in your Go binary that's injected into an image, Bazel is able to use rules_k8s, rules_docker, and rules_go to:
incrementally and reproducibly rebuild the minimum set of files needed to:
build the new Go executable
create a new image layer containing the Go executable (without using Docker)
push the image to the registry
redeploy the changed pod(s) into the cluster
The notable thing is that if you didn't change the source file, Bazel will always create an image with the same digest due to its reproducibility. That means that you can now trust the deployment workflow to not redeploy/restart a pod even if you do a bazel run twice or more.
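On the command line, that workflow boils down to a single command (the target name here is hypothetical; the .apply verb is generated by rules_k8s's k8s_object rule):

```bash
# Rebuilds only what changed, pushes the image, and applies the manifest:
bazel run //backend:staging.apply

# k8s_object generates sibling verbs for the same target:
bazel run //backend:staging.delete     # delete the object from the cluster
bazel run //backend:staging.describe   # kubectl describe the object
```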
For more, check out this BazelCon 2017 talk: Using Bazel for Fast, Correct Docker Deployments w/ Databricks
Fun fact: you can also use bazel run to start a REPL for your build target, from Bazel 0.15.0 onwards. The Haskell and Scala rules use this.

Does Travis CI support Hack?

I've written code completely in Hack, and I would like to use Travis CI to test my builds on various HHVM versions with Hack enabled. Does Travis CI support Hack when I select HHVM as the testing platform or is it just PHP?
It provides both hhvm, which can run Hack, and hh_client, which type-checks it.
However, Travis runs Ubuntu 12.04. This means you can only get HHVM 3.6 and will be unable to use any of the more recent features of HHVM and Hack.
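For example, the script section of .travis.yml could run both (file names are hypothetical; hh_client expects a .hhconfig at the project root):

```bash
# Hypothetical script: entries from .travis.yml, expressed as shell commands
hh_client           # type-check the whole project
hhvm tests/run.php  # run the test entry point under HHVM
```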

Makefiles in iOS build using jenkins

I'm new to makefiles and Jenkins. Is there a guide on how to write a makefile to run the build and the unit tests together using Jenkins?
You can definitely use Makefiles to build and run both your application/library and tests.
Here is a good guide to Makefiles:
http://mrbook.org/tutorials/make/
It should help you with writing a simple makefile. For more information, Google is your friend.
Another good guide is here:
http://www.cs.swarthmore.edu/~newhall/unixhelp/howto_makefiles.html
Remember, Jenkins and makefiles are completely unrelated. You can use Jenkins with makefiles, and makefiles without Jenkins. One is a continuous integration system; the other is just another way of building your software.
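As a minimal sketch of how the two connect, a Jenkins "Execute shell" build step might simply invoke the Makefile (the target names and xcodebuild arguments are assumptions):

```bash
# Hypothetical Jenkins "Execute shell" build step. The Makefile targets
# would typically wrap xcodebuild, e.g.:
#   build: xcodebuild -project MyApp.xcodeproj -scheme MyApp build
#   test:  xcodebuild test -project MyApp.xcodeproj -scheme MyApp \
#            -destination 'platform=iOS Simulator,name=iPhone 5s'
make build   # compile the app
make test    # run the unit tests; emit JUnit XML so Jenkins can report results
```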
You can go ahead and use Xcode Server as suggested in the other post, but Jenkins has advantages that many other systems don't: it is extensible using a whole host of plugins, has a large user and developer community, and is used for many types and styles of projects in various languages. While your project is purely for iOS, there are other things in Jenkins you could take advantage of from the available plugins list.
There is an XCode plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Xcode+Plugin
Maybe this helps too:
http://programming.oreilly.com/2013/04/upward-mobility-automating-ios-builds-with-jenkins.html
But maybe you are better off using Xcode Server if you are trying to do continuous integration:
https://developer.apple.com/library/ios/documentation/IDEs/Conceptual/xcode_guide-continuous_integration/200-Adopting_a_Continuous_Integration_Workflow/adopt_continuous_integration.html
