yarn workspace deploy into a docker image - yarn-workspaces

I am using yarn workspaces and I have this in my package.json:
"workspaces": ["packages/*"]
I am trying to create a docker image to deploy and I have the following Dockerfile:
# production dockerfile
FROM node:9.2
# add code
COPY ./packages/website/dist /cutting
WORKDIR /cutting
COPY package.json /cutting/
RUN yarn install --pure-lockfile && yarn cache clean --production
CMD npm run serve
But I get the following error:
error An unexpected error occurred:
"https://registry.yarnpkg.com/#cutting%2futil: Not found"
#cutting/util is the name of one of my workspace packages.
So the problem is that there is no source code in the docker image, so yarn is trying to install it from the yarnpkg registry.
What is the best way to handle workspaces when deploying to a docker image?
Say my structure is:
root
|_ node_modules
     |_ package_1
     |_ package_2
     |_ package_3
     |_ deploy_pkg
I want to deploy deploy_pkg, but I have no idea how to create a bundle that I can just copy into the docker image.

Related

Docker Build Image issue while running in Azure Pipelines

I am unable to build my Docker image in Azure Pipelines; the build fails with an authentication error.
When I copy the SSH key locally and run the build, it succeeds, but when I run it through the pipeline it fails with the error below.
Building the image locally works fine; I just need help getting it to build through the Azure pipeline.
Dockerfile:
FROM node:16.13.2 as build
WORKDIR /usr/local/app
COPY ./ /usr/local/app/
RUN npm i -g npm@8.1.2
RUN npm cache clear --force
RUN npm install --legacy-peer-deps
RUN npm run build
FROM nginx:latest
COPY --from=build /usr/local/app/dist/angular-app /usr/share/nginx/html
EXPOSE 80

Docker build of strapi image not completing, stuck at localhost address

I am trying to create a docker image of my strapi project, which uses a cloud-hosted MongoDB Atlas database. Below is my Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
RUN npm run start:develop
CMD ["npm","start"]
I am running the command below to build the image:
docker build .
I am not receiving any error, but the image build never completes; it gets stuck at http://localhost:1337. How can I resolve this?
Your RUN npm run start:develop step never finishes because it starts the server and keeps it running.
You can either move that step into your CMD and remove your existing CMD ["npm","start"], or simply remove that step; it depends on your case.
Try the following Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm","start"]
or
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm"," run", "start:develop]

What is the RUN command in a Dockerfile to install vuetify?

I expected this to work and tried to include it in the Dockerfile directly. Here is my whole Dockerfile:
FROM node
# make the 'app' folder the current working directory
WORKDIR /app
# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json ./
# install project dependencies
RUN npm install
RUN npm i --save @koumoul/vuetify-jsonschema-form
RUN npm install --save axios vue-axios
RUN npm install vuetify@1.5.8
# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . .
But got
Module not found: Error: Can't resolve 'vuetify' in '/app/src/views'
It is not good practice to install packages separately from package.json; you should just declare them in package.json, as sketched below.
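A rough sketch of declaring those packages in package.json instead (the versions here are only placeholders, apart from the vuetify version taken from the question):
"dependencies": {
  "@koumoul/vuetify-jsonschema-form": "latest",
  "axios": "latest",
  "vue-axios": "latest",
  "vuetify": "1.5.8"
}
With that, a single RUN npm install in the Dockerfile installs everything.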
But here is a technique for testing cases like this. You can first run the image yourself with docker run -it node bash and then execute the commands you want to try. You can also add a bind mount so the files you need are available, for example docker run -it -v=$(pwd):/usr/src/app node bash. With this you can try out everything you intend to put in your Dockerfile more directly.
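Put together, such an interactive test session might look like this (the mount path is from the answer above; --rm and -w are added for convenience, and the commands are the ones from the question):
# start a throwaway container with your project mounted into it
docker run -it --rm -v "$(pwd)":/usr/src/app -w /usr/src/app node bash
# inside the container, try the exact commands you plan to put in the Dockerfile
npm install
npm install vuetify@1.5.8
Once the commands work interactively, copy them into the Dockerfile.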

yarn install inside docker image with yarn workspaces

This code won't work outside of the Docker VM, so it will fail inside Docker, too.
The problem is that you built the code beforehand and copied only the bundled output. Yarn workspaces looks for a package.json that does not exist in the dist folder. Workspaces simply creates links in a common node_modules folder pointing at the other workspace packages you use, so their source code needs to be present. (BTW, why don't you build the code inside the Docker VM? That way the source code and dist would both be available.)
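To see what that linking means in practice, you can inspect the root node_modules of the full source tree; with the package name from the question it would look something like this (exact paths are assumed):
# the workspace package is a symlink into the repo, not a package downloaded from the registry
ls -l node_modules/@cutting
# util -> ../../packages/util
That link is exactly what cannot be created when only the dist folder is copied into the image.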
Here is my Dockerfile. I use yarn workspaces and lerna, but without lerna it should be similar. You want to build your shared libraries first and then check that the build works locally by running the code from your dist folder.
###############################################################################
# Step 1 : Builder image
FROM node:11 AS builder
WORKDIR /usr/src/app
ENV NODE_ENV production
RUN npm i -g yarn
RUN npm i -g lerna
COPY ./lerna.json .
COPY ./package* ./
COPY ./yarn* ./
COPY ./.env .
COPY ./packages/shared/ ./packages/shared
COPY ./packages/api/ ./packages/api
# Install dependencies and build whatever you have to build
RUN yarn install --production
RUN lerna bootstrap
RUN cd /usr/src/app/packages/shared && yarn build
RUN cd /usr/src/app/packages/api && yarn build
###############################################################################
# Step 2 : Run image
FROM node:11
LABEL maintainer="Richard T"
LABEL version="1.0"
LABEL description="This is our dist docker image"
RUN npm i -g yarn
RUN npm i -g lerna
ENV NODE_ENV production
ENV NPM_CONFIG_LOGLEVEL error
ARG PORT=3001
ENV PORT $PORT
WORKDIR /usr/src/app
COPY ./package* ./
COPY ./lerna.json ./
COPY ./.env ./
COPY ./yarn* ./
COPY --from=builder /usr/src/app/packages/shared ./packages/shared
COPY ./packages/api/package* ./packages/api/
COPY ./packages/api/.env* ./packages/api/
COPY --from=builder /usr/src/app/packages/api ./packages/api
RUN yarn install
CMD cd ./packages/api && yarn start-production
EXPOSE $PORT
###############################################################################
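For completeness, building and running such an image could look like this (the tag is only an example; 3001 is the default of the PORT build argument above):
docker build --build-arg PORT=3001 -t my-api .
docker run -p 3001:3001 my-api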

Compile webpack on docker production server

I am setting up Docker for my React/Redux app, and I was wondering how to set it up so that, in production, webpack compiles my whole codebase with the production configuration when the container is built, and then removes itself (or something like that). The only things I need in production are the compiled code and a simple node server to serve it.
I'm not sure if I explained it well, since docker and webpack are still new things for me.
EDIT:
Alternatively, I could even serve everything with an Apache server, but I want everything to compile and set itself up when I simply run docker-compose.
If I understand correctly, you want to trash your node dev dependencies from your image after your npm run build during the docker build.
You can do it but there is a little trick you must be aware of.
Each line in your Dockerfile results in a new layer in the image, and every layer is pushed with the image.
So, if you execute this in your Dockerfile:
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
RUN npm prune --production # Trash all devDependencies from your node_modules folder
Your image will then contain layers for:
The first npm install
The npm run build
The result of the npm prune
Your image will be bigger than one built with just:
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
which contains:
The first npm install
The npm run build
To avoid this problem, you must combine the commands into a single RUN instruction in your Dockerfile:
RUN npm install && npm run build && npm prune --production
That way you will get a minimal image, containing only:
The npm run build
The result of the npm prune
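You can check the difference yourself by comparing the layer sizes that docker history prints for both variants (the image name is just an example):
docker history my-app:latest
# every RUN instruction shows up as its own layer with its size; the combined RUN produces a single, smaller layer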
Your final Dockerfile will look something like this:
FROM node:7.4.0
ADD . /src
RUN cd /src && npm install && npm run build && npm prune --production # You can even use npm prune without the --production flag
ENV NODE_ENV production
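Another way to end up without dev dependencies, not covered in this answer but used in the yarn workspaces answer above, is a multi-stage build: install and build in one stage, then copy only what production needs into a fresh image. A rough sketch (the dist path and the server.js entry point are assumptions about your project):
# build stage: full install plus webpack build
FROM node:7.4.0 AS builder
WORKDIR /src
COPY . .
RUN npm install && npm run build
# runtime stage: production dependencies and the built assets only
FROM node:7.4.0
WORKDIR /src
ENV NODE_ENV production
COPY package.json ./
RUN npm install --production
COPY --from=builder /src/dist ./dist
# assumed entry point for the production server
CMD ["node", "server.js"]
Only the layers of the final stage end up in the image you ship.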
