npm link dev packages when using Docker dev containers

Use npm link for authoring multiple packages simultaneously in Docker dev containers.
PkgA is a dependency of PkgB, and I'm making changes to both. The goal is to be able to link PkgA into PkgB without publishing each small update and re-installing. npm link or yarn link solves this, but I'm developing in Docker containers.
https://github.com/npm/npm/issues/14325

1. Create a directory on the host machine to serve as the global repo
(I like to make a docker dir and put all of my volumes in it)
mkdir -p ~/docker/volumes/yalc
2. Mount the volume in both (or more) dev containers
https://code.visualstudio.com/docs/remote/containers-advanced
devcontainer.json
...
"mounts": ["source=/Users/evan/docker/volumes/yalc,target=/yalc,type=bind,consistency=cached"],
...
and rebuild the container
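Put together, a minimal devcontainer.json for one of the containers might look like this (a sketch; the image name is a placeholder, and ${localEnv:HOME} avoids hard-coding the host path from the example above):

```json
{
  "image": "mcr.microsoft.com/devcontainers/typescript-node",
  "mounts": [
    "source=${localEnv:HOME}/docker/volumes/yalc,target=/yalc,type=bind,consistency=cached"
  ]
}
```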
3. Install yalc and publish the package (In dependency repo container)
https://www.npmjs.com/package/yalc
npm i yalc -g
yalc publish --store-folder /yalc
--store-folder tells yalc to publish the package to our mounted volume
4. Link to the package in consuming repo
consider adding yalc to .gitignore first:
.yalc
yalc.lock
Run the link command
npm i yalc -g
yalc link PkgA --store-folder /yalc
Where PkgA is the name of the package as defined in its package.json
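The day-to-day loop across the two containers then looks something like this (a sketch; as I understand yalc, the --push flag republishes and propagates the update to consumers in one step):

```shell
# In PkgA's container, after each change:
yalc publish --push --store-folder /yalc

# In PkgB's container, once up front:
yalc link PkgA --store-folder /yalc
```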

Related

update solidity version in docker container

I installed oyente using the Docker installation as described in the link
https://github.com/enzymefinance/oyente using the following command.
docker pull luongnguyen/oyente && docker run -i -t luongnguyen/oyente
I can analyse older smart contracts, but I get a compilation error when I try it on newer contracts. I need to update the version of solc, but I couldn't.
On the container the current version is
solc, the solidity compiler commandline interface
Version: 0.4.21+commit.dfe3193c.Linux.g++
I read that the best way to update it is to use npm, so I executed the following command, but I am getting errors, which I assume is because the npm version is also outdated.
docker exec -i container_name bash -c "npm install -g solc"
I would appreciate any help, as I have been trying to solve this for hours now. Thanks in advance,
Ferda
Docker's standard model is that an image is immutable: it contains a fixed version of your application and its dependencies, and if you need to update any of this, you need to build a new image and start a new container.
The first part of this, then, looks like any other Node package update. Install Node in the unlikely event you don't have it on your host system. Run npm update --save solc to install the newer version and update your package.json and package-lock.json files. This is the same update you'd do if Docker weren't involved.
Then you can rebuild your Docker image with docker build. This is the same command you ran to initially build the image. Once you've created the new image, you can stop, delete, and recreate your container.
# If you don't already have Node, get it
# brew install nodejs
# Update the dependency
npm update --save solc
npm run test
# Rebuild the image
docker build -t image_name .
# Recreate the container
docker stop container_name
docker rm container_name
docker run -d --name container_name image_name
npm run integration
git add package*.json
git commit -m 'update solc version to 0.8.14'
Some common Docker/Node setups try to store the node_modules library tree in an anonymous volume. This can't be easily updated, and hides the node_modules tree that gets built from the image. If you have this setup (maybe in a Compose volumes: block) I'd recommend deleting any volumes or mounts that hide the image contents.
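The pattern to delete typically looks like this in a Compose file (a sketch; the service and path names are placeholders). The second, anonymous volume shadows the node_modules tree the image build produced:

```yaml
services:
  app:
    build: .
    volumes:
      - .:/app
      - /app/node_modules   # remove this line so the image's node_modules is visible again
```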
Note that this path doesn't use docker exec at all. Think of this like getting a debugger inside your running process: it's very useful when you need it, but anything you do there will be lost as soon as the process or container exits, and it shouldn't be part of your normal operational toolkit.

Is there a way I can include a .deb package in a Dockerfile

I created an xyz.deb package which, after installation, provides an application. I am trying to create a Docker container with FROM ubuntu:20.04.
How do I add my xyz.deb package in the Dockerfile and install it so that the container comes ready with the application xyz?
The COPY command in a Dockerfile lets you copy external files into the container. You can then install the .deb file as you would on your local system with a RUN command.
Simple example:
COPY ./xyz.deb /
RUN dpkg -i /xyz.deb
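One caveat: dpkg -i does not resolve dependencies, so if xyz.deb depends on other packages the install will fail. On ubuntu:20.04 you can let apt handle that instead (a sketch using the question's file name):

```dockerfile
FROM ubuntu:20.04
COPY ./xyz.deb /tmp/xyz.deb
# apt-get treats an argument containing "/" as a local .deb
# and resolves its dependencies from the package index
RUN apt-get update \
 && apt-get install -y /tmp/xyz.deb \
 && rm -rf /var/lib/apt/lists/*
```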

Build docker images without exposing secrets to the registry

What works:
I have a php-fpm Docker container hosting a PHP application that uses composer for managing dependencies. Jenkins builds the container, which also runs composer install, and pushes it to the registry.
What should work:
I want to include a private package from git with composer, which requires authentication. Therefore the container has to be in possession of secrets that should not be leaked to the container registry.
How can I install composer packages from private repositories without exposing the secrets to the registry?
What won't work:
letting Jenkins run composer install outside the build, as the dev environment needs the dependencies installed while building.
copying the ssh key in and out during the build, as that would save it to the layers.
What other options do I have?
As there might be better solutions out there, mine was to use Docker multi-stage builds to run the build process in an early stage that is not included in the final image. That way the container registry never sees the secrets. To verify this, I used dive.
Please see the Dockerfile below
FROM php-fpm AS build
COPY ./id_rsa /root/.ssh/id_rsa
RUN chmod 600 /root/.ssh/id_rsa
# --install-dir/--filename put the composer binary on the PATH instead of leaving composer.phar in the cwd
RUN wget https://raw.githubusercontent.com/composer/getcomposer.org/76a7060ccb93902cd7576b67264ad91c8a2700e2/web/installer -O - -q | php -- --quiet --install-dir=/usr/local/bin --filename=composer
COPY ./src /var/www/html
WORKDIR /var/www/html
RUN composer install
FROM php-fpm
COPY --from=build /var/www/html/vendor /var/www/html/vendor
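If BuildKit is available, an SSH mount is a newer alternative that never writes the key into any layer, not even in a discarded build stage (a sketch; it assumes composer and the sources are already in place as in the Dockerfile above):

```dockerfile
# syntax=docker/dockerfile:1
FROM php-fpm
COPY ./src /var/www/html
WORKDIR /var/www/html
# Build with: docker build --ssh default .
# The agent socket is mounted only for this one RUN step
RUN --mount=type=ssh composer install
```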

Speed Up NPM Build in Jenkins

We have Jenkins running within ECS. We are using pipelines for our build and deploy process. The pipeline uses the docker plugin to pull an image which has some dependencies for testing etc, all our steps then occur within this docker container.
The issue we currently have is that our NPM install takes about 8 minutes. We would like to speed this process up. As containers are being torn down at the end of each build then the node_modules that are generated are disposed of. I've considered NPM caching but due to the nature of docker this seemed irrelevant unless we pre-install the dependencies into the docker image (but this triples the size of the image almost). Are there simple solutions to this that will help our NPM install speeds?
You should be using package caching, but not caching node_modules directly. Instead, mount the cache directories that your package installer uses, and your install will be blazing fast. Docker makes that possible by allowing you to mount host directories into a container that persist across builds.
For yarn mount ~/.cache or ~/.cache/yarn
For npm mount ~/.npm
docker run -it -v ~/.npm:/.npm -v ~/.cache:/.cache -v /my-app:/my-app testing-image:1.0.0 bash -c 'npm ci && npm test'
Note: I'm using npm ci here, which will always delete node_modules and reinstall using exact versions in the package-lock.json, so you get very consistent builds. (In yarn, this is yarn install --frozen-lockfile)
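If the install happens during docker build rather than docker run, a BuildKit cache mount gives the same effect without any host bind mount (a sketch; the base image tag is an assumption):

```dockerfile
# syntax=docker/dockerfile:1
FROM node:18
WORKDIR /my-app
COPY package*.json ./
# /root/.npm persists in BuildKit's cache across builds; node_modules does not
RUN --mount=type=cache,target=/root/.npm npm ci
COPY . .
```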
You could set up an HTTP proxy and cache all dependencies (*)(**).
Then use --build-arg to set HTTP_PROXY variable:
docker build --build-arg HTTP_PROXY=http://<cache ip>:3128 .
*: This will not improve performance for dependencies that need to be compiled (i.e. C/C++ bindings)
**: I use a Squid container to share cache configuration
In my case it was a bunch of corporate software installed on my computer: apparently some antivirus was analyzing all the node_modules files in the container when I mounted the project folder from the host machine. What I did was avoid mounting node_modules locally, and the build immediately sped up from 25 minutes to 5.
I have explained what I did, with a possible implementation, here. I did not use package-lock.json but the npm ls command to check for changes in the node_modules folder, so that I could potentially skip the step of re-uploading the cached modules on the bind mount.
@bkucera's answer points you in the right direction with the bind mount. In general, the easiest option in a containerized environment is to create a volume storing the cached packages. These packages could be archived in a tarball, which is the most common option, or even compressed if necessary (files in a .tar are not compressed).
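As a sketch of the tarball option (all paths here are placeholders), archiving and restoring a cache directory is plain tar; add -z only if the smaller artifact is worth the extra CPU time:

```shell
# Simulate a populated npm cache directory
mkdir -p /tmp/npm-cache-demo/.npm
echo 'demo' > /tmp/npm-cache-demo/.npm/entry

# Archive it (gzip-compressed; a plain .tar would be uncompressed)
tar -czf /tmp/npm-cache.tar.gz -C /tmp/npm-cache-demo .npm

# Restore it on the next build
mkdir -p /tmp/restore
tar -xzf /tmp/npm-cache.tar.gz -C /tmp/restore
cat /tmp/restore/.npm/entry
```

The archive could then be stashed as a Jenkins artifact or pushed to shared storage between builds.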

Build with docker into host directory

I'm fairly new to Docker, but I'm trying to see if I can use it to build the frontend app for a project, then take the built app and hand it off to another tool.
So ideally, I'd like to:
1) Setup environment using Dockerfile.
2) Run npm run build
What I'm not sure about is how I can access the build folder in the container from my host.
My docker file is:
FROM kkarczmarczyk/node-yarn:latest
WORKDIR /app
ADD . /app
RUN yarn --ignore-engines
RUN yarn run build
Then I do:
docker build -t build-app .
From the prompts it looks like it builds properly, but I don't know how to get the built app out of the container. It's building to a /dist folder in the container.
How can I access it from the host?
You need to mount a volume to your host machine, which allows you to share that particular directory bidirectionally with the container.
You could do something like
docker run -v <host-path>:<container-path> <image-id>
Refer to this answer: docker mounting volumes on host
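An alternative that avoids re-running the build at docker run time: since RUN yarn run build already produced the output inside the image, you can copy it out through a temporary container with docker cp (a sketch; the /app/dist path is an assumption following the WORKDIR in the question's Dockerfile):

```shell
docker build -t build-app .
# Create (but don't start) a container from the image
docker create --name build-app-tmp build-app
# Copy the built assets to the host, then clean up
docker cp build-app-tmp:/app/dist ./dist
docker rm build-app-tmp
```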

Resources