How to run a gRPC microservice with NestJS in a Docker container

I'm setting up a NestJS microservice with gRPC in a Docker container. I have installed the grpc package from npm, but when I start the container I get an error saying the "grpc package is missing". How do I make the package available in the container?
I have tried to install the grpc package with a RUN command in the Dockerfile, but I keep getting the same error.
FROM node:10.15.3
WORKDIR /usr/src/app/auth
COPY package*.json ./
RUN npm install
RUN npm install --save grpc
COPY . .
EXPOSE 3001
It works normally outside the container, but for some reason it's not working when I run it in a container.

What worked for me was adding
RUN npm rebuild grpc --force
after the npm install step.
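For context, a minimal sketch of the full Dockerfile with that step in place (the CMD line is an assumption, since the question's Dockerfile doesn't show one); the rebuild forces the grpc native addon to be compiled for the container's Linux environment instead of reusing binaries from the host:
FROM node:10.15.3
WORKDIR /usr/src/app/auth
COPY package*.json ./
RUN npm install
# recompile the grpc native addon against the container's platform
RUN npm rebuild grpc --force
COPY . .
EXPOSE 3001
# assumed start command; use whatever script your package.json defines
CMD ["npm", "run", "start:prod"]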

Related

How to install @swc/core-linux-musl on Windows, to make it work in a Docker container?

I'm working on Windows, dockerizing a Next.js with TypeScript app.
Here is my Dockerfile:
FROM node:alpine
# create directory where our application will be run
RUN mkdir -p /usr/src
WORKDIR /usr/src
# copy our files into directory
COPY . /usr/src
# install dependencies
RUN npm install
EXPOSE 3000
ENTRYPOINT ["npm", "run" ,"dev"]
During development I bind the host directory to the container with --mount type=bind,source=d:/apps/library.next.js/,target=/usr/src. When I start the container I get this error: error - Failed to load SWC binary for linux/x64, see more info here: https://nextjs.org/docs/messages/failed-loading-swc.
That's fine, I understand the error and know what to do. To fix it I need to install @swc/cli @swc/core @swc/core-linux-musl, but I can't, because npm complains:
npm ERR! code EBADPLATFORM
npm ERR! notsup Unsupported platform for @swc/core-linux-musl@1.2.42: wanted {"os":"linux","arch":"x64"} (current: {"os":"win32","arch":"x64"})
How do I install it on Windows, or how do I change the Docker setup to make it work? I must install it locally, since it then gets linked (by the bind mount!) into the container.
My workaround for now is to get into the container with docker exec -it <id> /bin/sh and then manually run npm install --save-dev @swc/cli @swc/core @swc/core-linux-musl. But doing that every time I recreate the container is annoying.
The docs state: "The -f or --force will force npm to fetch remote resources even if a local copy exists on disk." This appears both in the legacy v6 docs you posted and in the v8 docs (see the section after --package-lock-only; it comes with the example npm install sax --force). So you shouldn't have issues with that every time your container is recreated.
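One common workaround for this class of problem (a sketch, not from the thread; the image name is illustrative) is to let npm install run inside the image on Linux, and then shadow the bind-mounted node_modules with an anonymous volume, so the musl binaries installed in the container are never hidden by the Windows ones from the host:
# build the image; npm install runs inside the Linux container
docker build -t library-next-app .
# bind-mount the source, but keep the container's own node_modules
docker run --mount type=bind,source=d:/apps/library.next.js/,target=/usr/src -v /usr/src/node_modules -p 3000:3000 library-next-app
This way @swc/core-linux-musl never needs to exist on the Windows host at all.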

Docker build of strapi image not completing, stuck at localhost address

I am trying to create a Docker image of my Strapi project, which uses a cloud-hosted MongoDB Atlas database. Below is my Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
RUN npm run start:develop
CMD ["npm","start"]
I am running the command below to build the image:
docker build .
I am not receiving any error, but the build never completes; it gets stuck at http://localhost:1337. How can I resolve this? TIA :)
Your RUN npm run start:develop step never finishes, since it starts the server.
You can either move that command into your CMD and remove the existing CMD ["npm","start"], or simply remove the step. It depends on your case.
Try the following Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm","start"]
or
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm"," run", "start:develop]

Docker build failing with error at array-slice

I'm running a docker build on Windows, and npm failed on a step involving array-slice. I installed the dependency manually and re-ran the build; then it failed on buffer-from.
The error says npm cannot access the dependency URL, but the URL works when I test it in the browser, which suggests my Docker build process is having trouble accessing some URLs for some reason.
npm WARN server@1.0.0 No description
npm WARN server@1.0.0 No repository field.
npm ERR! network timeout at: https://registry.npmjs.org/array-unique/-/array-unique-0.3.2.tgz
Is there any necessary additional config to make sure npm can access those dependencies?
This is my Dockerfile content:
FROM node:12-alpine
WORKDIR /usr/app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
EXPOSE 3333
CMD ["npm", "run", "dev"]
It was just an installation issue on Windows... I removed everything, including VirtualBox, and installed it all again; it seems to work fine now.
The only problem is that my app is still not exposed on the port I've requested, but that's another question.
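For anyone who hits the same network timeouts without a broken VirtualBox install to blame, one mitigation is to make npm retry more patiently before giving up. A sketch of the Dockerfile above with retry settings added (these config keys exist in npm 6, which ships with node:12-alpine; the exact values are illustrative):
FROM node:12-alpine
WORKDIR /usr/app
COPY package.json package-lock.json ./
# give flaky networks more room before npm times out
RUN npm config set fetch-retries 5 \
 && npm config set fetch-retry-mintimeout 20000 \
 && npm config set fetch-retry-maxtimeout 120000 \
 && npm install
COPY . .
EXPOSE 3333
CMD ["npm", "run", "dev"]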

libnode-dev installation within docker container

I'm trying to run a Node.js application.
I can run it without problems directly on my Raspbian Buster.
Within a Docker container running on the same Raspberry Pi, I have no such luck.
Dockerfile:
FROM balenalib/raspberry-pi2-debian-node:10-stretch-run
RUN sudo apt-get update
RUN sudo apt-get -y install g++ python make git
WORKDIR /usr/src/app
COPY package.json package.json
RUN JOBS=MAX npm install --production
COPY . ./
CMD ["npm", "start"]
But when I run the same Node.js code within a Docker container, I get a libnode.so.64 error.
pi@raspberrypi:~/rpi-lora-sensorified/data $ docker logs rpi-lora-sensorified_data_1
> resin-websocket@1.0.1 start /usr/src/app
> node index.js
/usr/src/app/node_modules/bindings/bindings.js:121
throw e;
^
Error: libnode.so.64: cannot open shared object file: No such file or directory
I've tried installing libnode-dev (which I've concluded provides this library) within the container, but I'm getting:
E: Unable to locate package libnode-dev
And yes, I've rebuilt the container without cache, but apt still cannot locate that package.
Any idea (really, even some pointers would help) where to continue looking?
So the solution, which I can't explain at all, is:
I was trying to run the code on Debian Stretch while testing whether it works on Debian Buster. After updating the Docker image to Buster, everything works as expected.
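In other words, only the base image tag changes. A sketch of the updated Dockerfile, assuming balenalib publishes a matching buster variant of this image (the tag name is an assumption):
FROM balenalib/raspberry-pi2-debian-node:10-buster-run
RUN sudo apt-get update
RUN sudo apt-get -y install g++ python make git
WORKDIR /usr/src/app
COPY package.json package.json
RUN JOBS=MAX npm install --production
COPY . ./
CMD ["npm", "start"]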

Could not find a required file

I'm trying to run a Docker container with a create-react-app app. The app works fine, and here's what my Dockerfile looks like.
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
# install and cache dependencies
COPY package.json ./package.json
COPY ./build/* ./public/
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# start
CMD ["npm", "start"]
When I run Docker I'm getting this error:
> my-app@0.1.0 start /
> react-scripts start
Could not find a required file.
Name: index.js
Searched in: /src
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2019-07-14T08_29_30_761Z-debug.log
Does anybody have any idea?
npm start is for webpack, which serves as the dev server. You are still using the src files directly, not the minified build output (dist), which is only used in production.
#Dockerfile.dev:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
COPY package.json ./package.json
#use the minified build file for production, not now - npm start is for development.
#COPY ./build/* ./public/
#install dependencies:
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
#copy your project files (also bad for development; use a volume instead: https://docs.docker.com/storage/volumes/)
COPY . .
# start
CMD ["npm", "start"]
(This builds on @EfratLevitan's answer, but is a little more production-oriented. Their answer will be better if you want to use Docker as a core part of your development flow.)
If you have a working Webpack setup already, its output is static files that can be served by any web server. Once you've successfully run npm run build, you can use anything to serve the resulting build directory: serve it as static content from something like a Flask application, put it in a cloud service like Amazon S3 that can serve it for you, or host it directly yourself. Any of the techniques described on the CRA Deployment page will work just fine in conjunction with a Docker-based backend.
If you'd like to serve this yourself via Docker, you don't need Node to serve the build directory, so a plain web server like nginx will work fine. Both examples from the image description work here:
# Just use the image and inject the content as data
docker run -v $PWD/build:/usr/share/nginx/html -p 80:80 nginx
# Build an image with the content "baked in"
cat >Dockerfile <<EOF
FROM nginx
COPY ./build /usr/share/nginx/html
EOF
# Run it
docker build -t me/nginx .
docker run -p 80:80 me/nginx
The all-Docker equivalent to this is to use a multi-stage build to run the Webpack build inside Docker, then copy it out to a production Web server image.
FROM node:12.2.0-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
COPY . .
RUN npm run build
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
In this model you'd develop your front-end code locally. (The Webpack/CRA stack has pretty minimal host dependencies, and since the application runs in the user's browser, it can't depend on Docker-specific networking features.) You'd only build this Dockerfile once you wanted to run an end-to-end test with all of the parts running together before you actually pushed out to production.
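Building and running the multi-stage variant looks the same as any other image (names are illustrative):
docker build -t me/react-app .
docker run -p 80:80 me/react-app
Only the static build output is copied into the final nginx image; the Node toolchain and node_modules from the build stage never ship to production.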
