I have a lot of devDependencies in my npm project. npm install takes a few minutes the first time, which is fine.
But since I'm integrating with a TFS build server, it only needs to run npm install once. After that, npm install just wastes time, because it takes 2-3 minutes just to determine that the packages are already installed. It also seems to always reinstall packages installed with the -g global flag, even when they already exist.
How can I make it check if packages exist, and if so, skip npm install?
You can use npm-cache as an alternative if you use on-premises build agents for your builds.
It is useful for build processes that run [npm|bower|composer|jspm]
install every time as part of their build process. Since dependencies
don't change often, this often means slower build times. npm-cache
helps alleviate this problem by caching previously installed
dependencies on the build machine. npm-cache can be a drop-in
replacement for any build script that runs [npm|bower|composer|jspm]
install.
How it Works
When you run npm-cache install [npm|bower|jspm|composer], it first
looks for package.json, bower.json, or composer.json in the current
working directory depending on which dependency manager is requested.
It then calculates the MD5 hash of the configuration file and looks
for a file named <MD5 hash>.tar.gz in the cache directory
($HOME/.package_cache by default). If the file does not exist,
npm-cache uses the system's
installed dependency manager to install the dependencies. Once the
dependencies are installed, npm-cache tars the newly downloaded
dependencies and stores them in the cache directory. The next time
npm-cache runs and sees the same config file, it will find the tarball
in the cache directory and untar the dependencies in the current
working directory.
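The flow described above can be sketched in a few lines of shell. This is an illustration of the strategy, not npm-cache's actual code; the `cached_install` function, its arguments, and the `PACKAGE_CACHE` override are made up for the example.

```shell
# Sketch of the hash-keyed caching strategy npm-cache uses.
cached_install() {
    manifest=$1      # e.g. package.json, bower.json, composer.json
    outdir=$2        # directory the installer produces, e.g. node_modules
    install_cmd=$3   # fallback command on a cache miss, e.g. "npm install"

    # $HOME/.package_cache is npm-cache's default location;
    # PACKAGE_CACHE is a hypothetical override for this sketch.
    cache_dir="${PACKAGE_CACHE:-$HOME/.package_cache}"
    mkdir -p "$cache_dir"

    # The MD5 hash of the manifest is the cache key.
    hash=$(md5sum "$manifest" | cut -d' ' -f1)
    tarball="$cache_dir/$hash.tar.gz"

    if [ -f "$tarball" ]; then
        tar xzf "$tarball"            # cache hit: untar the previous install
    else
        $install_cmd                  # cache miss: run the real installer
        tar czf "$tarball" "$outdir"  # save the result for next time
    fi
}
# Typical use on a build agent:
# cached_install package.json node_modules "npm install"
```

As long as the manifest is unchanged, every later build extracts a tarball instead of hitting the registry.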
You can also try npm-install-missing.
However, if you are using the VSTS Hosted Build Agent, you cannot do this, since every time you queue a build with the Hosted Build Agent, a clean build agent is assigned to the build. That means no dependency packages are installed on the agent, so you need to perform a complete npm install.
Related
I would like to build a Docker image using multi-stage builds.
We are using yarn 2 and the Zero-Installs feature, which stores dependencies in .yarn/cache in zip format.
To minimize the size of my Docker image, I would like to only have the production dependencies.
Previously, we would run
yarn install --non-interactive --production=true
But by doing that with an older version of yarn, we don't benefit from the .yarn/cache folder, and it takes time to download dependencies even though they are already there, just not readable by the older version of yarn.
Is there a way to tell yarn 2 to get only the production dependencies from the .yarn/cache folder and put them into another one? Then I could copy that folder into my image and save time and space.
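For what it's worth, one approach worth trying (an untested sketch, assuming Yarn 2+ with the workspace-tools plugin): `yarn workspaces focus --all --production` installs only production dependencies, and with Zero-Installs it can resolve them from the committed `.yarn/cache` instead of the network. In a multi-stage Dockerfile that might look like the following; the base image, paths, and `src/index.js` entry point are placeholders, not from the question.

```dockerfile
# deps stage: resolve production-only dependencies from the offline cache
FROM node:16-alpine AS deps
WORKDIR /app
# Copy the manifests plus the Zero-Installs cache and Yarn release
COPY package.json yarn.lock .yarnrc.yml ./
COPY .yarn/ .yarn/
# workspace-tools provides `yarn workspaces focus`;
# --production skips devDependencies
RUN yarn plugin import workspace-tools \
 && yarn workspaces focus --all --production

# final stage: copy only what the runtime needs
FROM node:16-alpine
WORKDIR /app
COPY --from=deps /app/ ./
COPY src/ src/
CMD ["yarn", "node", "src/index.js"]
```

The final image then carries only the pruned install state, not the full devDependency set.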
What are the best practices when using development modules such as Mocha or Browserify, which are needed during the build process but not in the built artifact itself? The only options I can see are:
Pre-arranging these modules to be installed globally on the build server (not likely possible in my case)
Run npm install -g explicitly as part of the build process for each module (Feels somewhat wrong to install globally)
Don't use the --production flag on npm install (which forces these modules to be present in the final artifact, increasing its size and defeating the purpose of dev/production dependencies)
Is there a way to only download the dependencies but not compile the source?
I am asking because I am trying to build a Docker build environment for my bigger project.
The idea is that during docker build I clone the project, download all dependencies, and then delete the code.
Then use docker run -v to mount the frequently changing code into the docker container and start compiling the project.
Currently I just compile the code during build and then compile it again on run. The problem is that when a dependency changes, I have to build from scratch, and that takes a long time.
Run sbt's update command. Dependencies will be resolved and retrieved.
I've setup a .Net TFS Build vNext build to run the following:
npm install
gulp
visual studio solution build
The build is configured to clean automatically before getting sources.
I have two problems with this build but this question is about problem #2.
The npm install step fails to install phantomjs because the command node install.js can't find node, even though node is in the system PATH.
Running builds configured to Clean fails because of the path too long error below.
[error]The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
I would like for the Clean setting to just work and clean up these longs paths, but I don't know how to make it work properly.
When I tried to re-create this same build in TeamCity, the Clean option worked just fine. Also, I know that if I had access to the script that runs the clean, it could just use rmdir /S /Q to remove these paths. Hopefully the VSO build can just run this or something that works.
[EDIT] - I have confirmed that this is fixed in the latest Build agent version (1.85.1)
I've been told by MS folks that this is currently a bug in the vNext builds and will be fixed in the next release of the product. I'll update this answer once I see this fix released.
From an admin console, run: npm install npm -g
Run npm version or npm -v to make sure it worked. Also note that if you have node installed, you want to browse to the node directory and update it from there.
Also note that VSO agents sometimes store a local version. Browse to agent/tasks/npm.
I'm using Jenkins (CloudBees) to build my project, and this runs some scripts in each build to download some node packages using npm.
Yesterday the npm registry server was having trouble, and this blocked the project's build cycle.
In order not to depend on external servers, is there a way to persist my node_modules folder in Jenkins so I don't have to download them in every build?
You can check the package.json file and back up the node_modules directory.
When the next build starts in Jenkins, check the package.json file against the node_modules backup; if package.json has not changed, just reuse the previous backup.
PKG_SUM=$(md5sum package.json | cut -d' ' -f1)
CACHED_FILE=${PKG_SUM}.tgz
[[ -f ${CACHED_FILE} ]] && tar zxf ${CACHED_FILE}
npm install
[[ -f ${CACHED_FILE} ]] || tar zcf ${CACHED_FILE} node_modules
The above is a quite simple cache implementation; in practice you should also check that the cache file is not damaged.
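As for checking that the cache file is not damaged, here is one hedged sketch of how that could look; `restore_or_install` and its arguments are made up for illustration, and `gzip -t` is used to reject truncated or corrupt archives.

```shell
# Validate the cached tarball before trusting it; fall back to a fresh
# install and re-create the cache entry when the archive is damaged.
restore_or_install() {
    cached=$1        # path to the cached .tgz
    install_cmd=$2   # command to run on a cache miss, e.g. "npm install"
    if [ -f "$cached" ] && gzip -t "$cached" 2>/dev/null; then
        tar zxf "$cached"              # archive is intact: reuse it
    else
        rm -f "$cached"                # drop the missing or corrupt entry
        $install_cmd
        tar zcf "$cached" node_modules
    fi
}
# In the Jenkins build step above, this would replace the bare tar calls:
# restore_or_install "${PKG_SUM}.tgz" "npm install"
```

A corrupt entry is silently rebuilt instead of producing a half-extracted node_modules.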
CloudBees uses a pool of slaves to support your builds, and by nature your builds can run on various hosts, so they start with a fresh workspace. Anyway, we try to allocate a slave that you have already used to avoid download delays - this works for all files stored in the workspace.
I don't think this would have prevented the issue with the npm registry being offline anyway.