Why is Travis not uploading the cache directory? - travis-ci

On this build, I have a cache directory configured in the settings, where I install all dependencies. Strangely, Travis does not execute before_cache and does not upload the cache.
My repository uses the caching settings as documented:
https://travis-ci.org/gabyx/ApproxMVBB/jobs/256581132
It is strange that Travis reports the following in the log:
Setting up build cache
$ export CASHER_DIR=$HOME/.casher
Installing caching utilities
attempting to download cache archive
fetching master/cache-linux-precise-49ed8168a954ef68babd02034884f858ddc4fe3ca0368ba1270828dc239856eb--compiler-gpp.tgz
fetching master/cache--compiler-gpp.tgz
could not download cache
adding /home/travis/ApproxMVBBCache to cache
creating directory /home/travis/ApproxMVBBCache
Does anybody have any idea?

Problem solved: I was sourcing the bash scripts, and an exit 0 in build.sh exited the whole build, so the before_cache and cache upload stages never ran.
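For anyone hitting the same trap, here is a minimal sketch (using a hypothetical build.sh) of why sourcing a script that calls exit kills the whole job, and two ways around it:

# Sourcing runs build.sh in the current shell, so its `exit 0` terminates the
# Travis job before before_cache and the cache upload can run.
source ./build.sh

# Option 1: execute it as a child process instead, so `exit` only ends build.sh itself
./build.sh

# Option 2: if it must be sourced, end the sourced script with `return 0` instead of `exit 0`
. ./build.sh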

Related

How to stop Bazel from trying to download packages in offline environment

Bazel is trying to download packages during a Python test. I've written some simple Python code and a test file for it.
I'm running `bazel test //test:python-test` and I get the following error:
/Path/to/build/external/bazel_tools/tools/jdk/BUILD:305:1: no such package '@remotejdk_linux//': java.io.IOException: error downloading [...]: unknown host: mirror.bazel.build and referenced by '@bazel_tools//tools/jdk:remote_jdk'
Now, that's obviously a problem in my workspace, where we work offline. Is there any way to work offline with Bazel?
Using the following flags will force Bazel to use your local Java:
--host_javabase=@bazel_tools//tools/jdk:absolute_javabase --define=ABSOLUTE_JAVABASE=/path/to/my/jdk
You can add them to your local .bazelrc file so you don't have to write them on the command line every time.
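A minimal sketch of what that .bazelrc could look like, using the flags above and a placeholder JDK path (exact option support varies by Bazel version):

# .bazelrc (project root or $HOME/.bazelrc) -- sketch only, adjust the JDK path
build --host_javabase=@bazel_tools//tools/jdk:absolute_javabase
build --define=ABSOLUTE_JAVABASE=/path/to/my/jdk

Options attached to build in .bazelrc are inherited by test, so one pair of lines should be enough.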
You can manually download the requested artifact and put it in the cache before calling the build. Bazel will not download an artifact if it already exists in the local cache.
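A related, hedged alternative to placing artifacts in the cache by hand (not what the answer above describes, but based on standard Bazel options whose availability depends on your Bazel version): pre-fetch all external dependencies once while online and point offline builds at the same repository cache.

# while online: resolve and download every external dependency into a local cache
bazel fetch //... --repository_cache="$HOME/bazel-repo-cache"

# while offline: reuse that cache so nothing needs to be downloaded
bazel test //test:python-test --repository_cache="$HOME/bazel-repo-cache"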

Gradle build in docker jenkins slave

I am trying to create a Jenkins slave for building Gradle lambda projects. The Jenkins slave is throwing the error below while building the project.
Exception in thread "main" java.lang.RuntimeException: Could not create parent directory for lock file /gradle/wrapper/dists/gradle-4.2.1-bin/dajvke9o8kmaxbu0kc5gcgeju/gradle-4.2.1-bin.zip.lck
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:43)
at org.gradle.wrapper.Install.createDist(Install.java:48)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:107)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
/home/jenkins/workspace/ddoa-subprod/lf-security-gateway2/lf-security-gateway2
FAILURE: Build failed with an exception.
* What went wrong:
Failed to load native library 'libnative-platform.so' for Linux amd64.
Please help me understand the issue and let me know how to fix it.
To fix the error "What went wrong: Failed to load native library 'libnative-platform.so' for Linux amd64", do the following:
Check whether your Gradle cache folder (~/.gradle/native) exists at all.
Check whether the file in question, libnative-platform.so, exists in that ~/.gradle/native directory.
Check that ~/.gradle, ~/.gradle/native and ~/.gradle/native/libnative-platform.so have valid permissions (they should not be read-only; running chmod -R 755 ~/.gradle is enough).
If you don't see the native folder at all, or if your native folder seems corrupted, run your Gradle task (for example gradle clean build) with the -g or --gradle-user-home option and pass it a new directory; the commands are consolidated in the sketch after these steps.
For example, if you run mkdir /tmp/newG_H_Folder; gradle clean build -g /tmp/newG_H_Folder, you'll see that Gradle populates all the folders/files it needs (even before running any task or option) in this new Gradle home folder (i.e. the /tmp/newG_H_Folder/.gradle directory).
From that folder you can copy just the native folder into your ~/.gradle folder (take a backup of the existing native folder in ~/.gradle first if you want to) if it already exists, or copy the whole .gradle folder to your home directory.
Then rerun your Gradle task and it won't error out anymore.
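A condensed sketch of the steps above, assuming a Unix shell and the /tmp/newG_H_Folder location used in the example; depending on the Gradle version, the fresh native folder may sit directly under /tmp/newG_H_Folder rather than under a .gradle subdirectory.

# 1. build once with a fresh Gradle user home so Gradle recreates its support files
mkdir /tmp/newG_H_Folder
gradle clean build -g /tmp/newG_H_Folder

# 2. back up the possibly corrupted native folder, then copy the fresh one into place
mv ~/.gradle/native ~/.gradle/native.bak 2>/dev/null
cp -r /tmp/newG_H_Folder/.gradle/native ~/.gradle/

# 3. rerun the original task against the normal Gradle home
gradle clean build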
The Gradle docs say:
https://docs.gradle.org/current/userguide/command_line_interface.html
-g, --gradle-user-home
Specifies the Gradle user home directory. The default is the .gradle directory in the user’s home directory.
Note: running gradle <sometask> -g <a_dynamic_folder_ex_jenkins_workspace> will always work, since Gradle creates a fresh .gradle cache in the folder passed to -g, but doing this you will not reap the true benefit of Gradle's caching.
If you are using version 3.4 of Gradle, then it could possibly be this issue.
To fix it, you can try updating your Gradle distribution to version 3.5 or higher, where this issue was solved.
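If the project uses the Gradle wrapper (as the stack trace above suggests), a minimal sketch for moving to a newer version; 3.5 is just the version suggested above:

# regenerate the wrapper against a newer Gradle version
./gradlew wrapper --gradle-version 3.5

# or edit gradle/wrapper/gradle-wrapper.properties by hand:
# distributionUrl=https\://services.gradle.org/distributions/gradle-3.5-bin.zip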
I ran the command with sudo and it went through fine.

NPM: Only install missing - how to speed up npm install

I have a lot of devDependencies in my package.json. npm install takes a few minutes the first time; that's OK.
But since I'm integrating with a TFS build server, it only needs to run npm install once. After that, npm install just wastes time, because it takes 2-3 minutes just to determine that the packages are already installed. Also, it seems to always reinstall packages installed with the -g global flag, even when they already exist.
How can I make it check if packages exist, and if so, skip npm install?
You can use npm-cache as an alternative if you use on-premises build agents for the build.
It is useful for build processes that run [npm|bower|composer|jspm] install every time as part of their build process. Since dependencies don't change often, this often means slower build times. npm-cache helps alleviate this problem by caching previously installed dependencies on the build machine. npm-cache can be a drop-in replacement for any build script that runs [npm|bower|composer|jspm] install.
How it Works
When you run npm-cache install [npm|bower|jspm|composer], it first looks for package.json, bower.json, or composer.json in the current working directory, depending on which dependency manager is requested. It then calculates the MD5 hash of the configuration file and looks for a correspondingly named .tar.gz file in the cache directory ($HOME/.package_cache by default). If the file does not exist, npm-cache uses the system's installed dependency manager to install the dependencies. Once the dependencies are installed, npm-cache tars the newly downloaded dependencies and stores them in the cache directory. The next time npm-cache runs and sees the same config file, it will find the tarball in the cache directory and untar the dependencies into the current working directory.
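A minimal usage sketch on a build agent, based on npm-cache's documented commands (check the README of the version you install for exact options):

# one-time setup on the build agent
npm install -g npm-cache

# in the build script, instead of a plain `npm install`
npm-cache install npm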
You can also try npm-install-missing.
However, if you are using the VSTS Hosted Build Agent, then you cannot do this, since every time you queue a build with the Hosted Build Agent, a clean build agent is assigned to the build. That means no dependency packages are installed on the agent, and you need to perform a complete npm install.

How to ask sbt to only fetch dependencies, without compiling?

Is there a way to only download the dependencies, without compiling the source?
I am asking because I am trying to build a Docker build environment for my bigger project.
The idea is that during docker build I clone the project, download all dependencies and then delete the code.
Then I use docker run -v to mount the frequently changing code into the Docker container and start compiling the project.
Currently I just compile the code during build and then compile it again on run. The problem is that when a dependency changes I have to build from scratch, and that takes a long time.
Run sbt's update command. Dependencies will be resolved and retrieved.
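A hedged sketch of how that fits the Docker workflow described in the question; the image name and mount paths are made up for illustration:

# during `docker build`: clone the project, resolve dependencies into the image, then drop the sources
sbt update          # resolves and downloads all dependencies, no compilation
rm -rf src

# at `docker run` time: mount the current sources and compile against the cached dependencies
docker run -v "$PWD":/project -w /project my-sbt-image sbt compile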

In Jenkins, is there a way to persist npm packages so I don't have to install them in each build?

I'm using Jenkins (CloudBees) to build my project, and this runs some scripts in each build to download some node packages using npm.
Yesterday the npm registry server was having trouble, and this blocked the project's build cycle.
In order not to depend on external servers, is there a way to persist my node_modules folder in Jenkins so I don't have to download them in every build?
You can hash the package.json file and back up the node_modules directory.
When the next build starts in Jenkins, compare the package.json file against the backup; if package.json has not changed, just reuse the previous node_modules backup.
# hash package.json so the cache file name changes whenever the dependencies change
PKG_SUM=$(md5sum package.json | cut -d' ' -f1)
CACHED_FILE=${PKG_SUM}.tgz
# restore node_modules from the cache archive if one exists for this hash
[[ -f ${CACHED_FILE} ]] && tar zxf ${CACHED_FILE}
npm install
# otherwise create the cache archive after a fresh install
[[ -f ${CACHED_FILE} ]] || tar zcf ${CACHED_FILE} node_modules
The above is a quite simple cache implementation; for anything more robust you should also check that the cache file is not damaged, as sketched below.
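For example, a hedged variant of the snippet above that discards a damaged archive before reusing it (tar's list mode is used as a cheap integrity check):

PKG_SUM=$(md5sum package.json | cut -d' ' -f1)
CACHED_FILE=${PKG_SUM}.tgz
# only restore the cache if the archive can actually be read back
if [[ -f ${CACHED_FILE} ]] && tar tzf "${CACHED_FILE}" > /dev/null 2>&1; then
  tar zxf "${CACHED_FILE}"
else
  rm -f "${CACHED_FILE}"
fi
npm install
[[ -f ${CACHED_FILE} ]] || tar zcf "${CACHED_FILE}" node_modules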
CloudBees uses a pool of slaves to support your builds, and by nature your builds can run on various hosts, so they start with a fresh workspace. Anyway, we try to allocate a slave that you have already used, to avoid download delays; this works for all files stored in the workspace.
I don't think this would have prevented the issue with the npm registry being offline anyway.
