Could not find a required file - docker

I'm trying to run a Docker container with a create-react-app app. The app works fine on its own, and here's what my Dockerfile looks like:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
# install and cache dependencies
COPY package.json ./package.json
COPY ./build/* ./public/
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# start
CMD ["npm", "start"]
When I run the container I get this error:
> my-app@0.1.0 start /
> react-scripts start
Could not find a required file.
Name: index.js
Searched in: /src
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2019-07-14T08_29_30_761Z-debug.log
Does anybody have any idea?

npm start runs webpack's dev server, which serves the raw src files directly, not the minified production build (dist), which is only used in production. Your Dockerfile never copies the src directory into the image, which is why react-scripts can't find index.js.
#Dockerfile.dev:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
COPY package.json ./package.json
# use the minified build for production, not now - npm start is for development
# COPY ./build/* ./public/
# install dependencies
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# copy your project files (for development, prefer a volume instead: https://docs.docker.com/storage/volumes/)
COPY . .
# start
CMD ["npm", "start"]
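To use this Dockerfile.dev together with a volume, as the comment suggests, the run command might look roughly like this (the image name, port, and mount paths are placeholders for your setup; with `WORKDIR ./` the working directory resolves to `/`):

```
docker build -f Dockerfile.dev -t my-app-dev .
docker run -it --rm \
  -v "$PWD/src:/src" \
  -v "$PWD/public:/public" \
  -p 3000:3000 \
  my-app-dev
```

With the sources mounted rather than copied, edits on the host are picked up by the dev server without rebuilding the image.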

(This builds on @EfratLevitan's answer, but is a little more production-oriented. Their answer will be better if you want to use Docker as a core part of your development flow.)
If you have a working Webpack setup already, its output is static files that can be served up by any Web server. Once you've successfully run npm run build, you can use anything to serve the resulting build directory – serve it as static content from something like a Flask application, put it in a cloud service like Amazon S3 that can serve it for you, directly host it yourself. Any of the techniques described on the CRA Deployment page will work just fine in conjunction with a Docker-based backend.
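As a quick way to sanity-check the production build locally without any server setup, the serve package works directly on the build directory (this assumes the CRA defaults; npx ships with npm 5.2+):

```
npm run build
npx serve -s build
```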
If you'd like to serve this yourself via Docker, you don't need Node to serve the build directory, so a plain Web server like nginx will work fine. The two examples from the nginx image's Docker Hub description both work for you here:
# Just use the image and inject the content as data
docker run -v $PWD/build:/usr/share/nginx/html -p 80:80 nginx
# Build an image with the content "baked in"
cat >Dockerfile <<EOF
FROM nginx
COPY ./build /usr/share/nginx/html
EOF
# Run it
docker build -t me/nginx .
docker run -p 80:80 me/nginx
The all-Docker equivalent to this is to use a multi-stage build to run the Webpack build inside Docker, then copy it out to a production Web server image.
FROM node:12.2.0-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
COPY . .
RUN npm run build
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
In this model you'd develop your front-end code locally. (The Webpack/CRA stack has pretty minimal host dependencies, and since the application runs in the user's browser, it can't depend on Docker-specific networking features.) You'd only build this Dockerfile once you wanted to run an end-to-end test with all of the parts running together before you actually pushed out to production.
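Building and running the multi-stage image is then just (the tag my-app is a placeholder):

```
docker build -t my-app .
docker run --rm -p 80:80 my-app
```

nginx listens on port 80 inside the container, so the site is reachable at http://localhost/ on the host.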

Related

How to install @swc/core-linux-musl on windows, to make it work in docker container?

I'm working on Windows, dockerizing a Next.js with TypeScript app.
Here is my dockerfile:
FROM node:alpine
# create directory where our application will be run
RUN mkdir -p /usr/src
WORKDIR /usr/src
# copy our files into directory
COPY . /usr/src
# install dependencies
RUN npm install
EXPOSE 3000
ENTRYPOINT ["npm", "run" ,"dev"]
During development I bind the host directory to the container with --mount type=bind,source=d:/apps/library.next.js/,target=/usr/src. When I start the container I get the error: error - Failed to load SWC binary for linux/x64, see more info here: https://nextjs.org/docs/messages/failed-loading-swc.
That's fine, I understand the error and know what to do. To fix this I need to install @swc/cli @swc/core @swc/core-linux-musl, but I can't, because npm complains:
npm ERR! code EBADPLATFORM
npm ERR! notsup Unsupported platform for @swc/core-linux-musl@1.2.42: wanted {"os":"linux","arch":"x64"} (current: {"os":"win32","arch":"x64"})
How can I install it on Windows, or how can I change the Docker setup to make it work? It has to be installed locally, since node_modules is linked (by the bind mount!) into the container.
My workaround for now is to get into the container with docker exec -it <id> /bin/sh and manually type npm install --save-dev @swc/cli @swc/core @swc/core-linux-musl. But doing that every time I recreate the container is annoying.
The docs state: "The -f or --force will force npm to fetch remote resources even if a local copy exists on disk." That section appears both in the legacy v6 docs (the ones you posted) and in the v8 docs, right after --package-lock-only, with the example npm install sax --force. So you shouldn't have to repeat the install every time your container is recreated.
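A commonly used Docker-side alternative is to layer an anonymous volume over node_modules on top of the bind mount, so the Linux binaries installed inside the container are not shadowed by the host directory. A sketch, reusing the paths from the question (the image name is a placeholder):

```
docker run \
  --mount type=bind,source=d:/apps/library.next.js/,target=/usr/src \
  -v /usr/src/node_modules \
  -p 3000:3000 \
  my-next-image
```

The anonymous volume keeps whatever npm install produced inside the image, so the linux/musl binaries survive even though the rest of /usr/src comes from Windows.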

Docker build of strapi image not completing, stuck at localhost address

I am trying to create the docker image of my strapi project with cloud hosted mongodb atlas database. Below is my dockerfile code
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
RUN npm run start:develop
CMD ["npm","start"]
I am running the below code to build the docker file
docker build .
I am not receiving any error, but the problem is that building the image never completes; it gets stuck at http://localhost:1337. How can I resolve this? I have attached a screenshot. TIA :)
Your RUN npm run start:develop step never finishes, since it starts the server during the build.
You can either move that command into your CMD, replacing the existing CMD ["npm","start"], or simply remove the step. It depends on your case.
Try the following Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm","start"]
or
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm", "run", "start:develop"]

run npm ci hangs on docker build ubuntu

I'm trying to build a docker file for an ionic project, on ubuntu virtualbox. Here's the dockerfile:
# Build
FROM beevelop/ionic AS ionic
# Create the application directory
WORKDIR /usr/src/app
# Install the application dependencies
# We can use wildcard to ensure both package.json AND package-lock.json are considered
# where available (npm@5+)
COPY package*.json ./
RUN npm --verbose ci
# Bundle app source
COPY . .
RUN ionic build
## Run
FROM nginx:alpine
#COPY www /usr/share/nginx/html
COPY --from=ionic /usr/src/app/www /usr/share/nginx/html
My problem is that the build gets stuck on step 4 (RUN npm --verbose ci). It starts downloading some packages, but then it hangs at some point.
I tried different solutions:
npm cache clean
npm config set registry http://registry.npmjs.org/
removing package-lock.json
But nothing works; any help will be greatly appreciated. Thanks in advance.
To whoever experiences this problem: it was due to the internet connection. Keep trying until it downloads all the packages.
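If plain retrying isn't enough, npm's retry and timeout settings can also be raised before running the install; these are standard npm config keys (the values here are arbitrary examples):

```
npm config set fetch-retries 5
npm config set fetch-retry-maxtimeout 120000
npm --verbose ci
```

In a Dockerfile, the npm config commands would go in a RUN line just before RUN npm --verbose ci.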

Docker build failing with error at array-slice

I'm running a docker build on Windows and npm has failed on a step involving array-slice. I installed the dependency manually and re-ran the build. Then it failed on buffer-from.
The error says it cannot access the dependency URL - but the URL works when I test it in the browser - which suggests my docker build process is having trouble accessing some URLs, for some reason...
npm WARN server@1.0.0 No description
npm WARN server@1.0.0 No repository field.
npm ERR! network timeout at: https://registry.npmjs.org/array-unique/-/array-unique-0.3.2.tgz
Is there any necessary additional config to make sure npm can access those dependencies?
This is my Dockerfile content:
FROM node:12-alpine
WORKDIR /usr/app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
EXPOSE 3333
CMD ["npm", "run", "dev"]
It was just an installation issue on Windows... I removed everything, including VirtualBox, and installed it all again... it seems to work fine now...
The only problem is that my app is still not exposed on the port I've requested... but that's another question.
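On the port question: EXPOSE 3333 only documents the port, it does not publish it. The container's port still has to be mapped to the host with -p at run time (the image name here is a placeholder):

```
docker run -p 3333:3333 my-image
```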

Versioning of a nodejs project and dockerizing it with minimum image diff

The npm version is located at package.json.
I have a Dockerfile, simplified as follows:
FROM node:carbon
COPY ./package.json ${DIR}/
RUN npm install
COPY . ${DIR}
RUN npm build
Correct my understanding:
If ./package.json changes, is it true that the writable image layers from steps 2 to 5 change?
Assuming that I do not have any changes in my npm package dependencies,
how could I change the project version without Docker rebuilding the image layer for RUN npm install?
To sum up, the behavior you describe is standard Docker: as soon as package.json changes and has a different hash, the COPY package.json ./ layer is rebuilt, as well as every subsequent Dockerfile command.
Thus, the docker setup outlined in the official doc of Node.js does not alleviate this, and proposes the following Dockerfile:
FROM node:carbon
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
But if you really want to find ways to avoid rerunning npm install from scratch most of the time, you could just maintain two different files package.json and, say, package-deps.json, import (and rename) package-deps.json and run npm install, then use the proper package.json afterwards.
And if you want more checks to be sure that the dependencies of both files are not out of sync, the package-lock.json file can help, if you use the new npm ci feature that comes with npm 5.8 (cf. the corresponding changelog) instead of npm install.
In this case, as the latest version of npm available in Docker Hub is npm 5.6, you'll need to upgrade it beforehand.
All things put together, here is a possible Dockerfile for this use case:
FROM node:carbon
# Create app directory
WORKDIR /usr/src/app
# Upgrade npm to have "npm ci" available
RUN npm install -g npm@5.8.0
# Import conf files for dependencies
COPY package-lock.json package-deps.json ./
# Note that this REQUIRES to run the command "npm install --package-lock-only"
# before running "docker build …" and also REQUIRES a "package-deps.json" file
# that is in sync w.r.t. package.json's dependencies
# Install app dependencies
RUN mv package-deps.json package.json && npm ci
# Note that "npm ci" checks that package.json and package-lock.json are in sync
# COPY package.json ./ # subsumed by the following command
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
Disclaimer: I did not try the solution above on a realistic example as I'm not a regular node user, but you may view this as a useful workaround for the dev phase…
