I am working on a Sails application. In it I use the MySQL and Mongo adapters to connect to different databases; both databases are hosted somewhere on the internet. The application works fine in my local environment, but I started facing an issue once I put the project in a Docker container. I am able to build the Docker image and run the container. When I call simple routes where no DB connection is involved, everything works fine, but when I call TestController, which returns data from MongoDB, it gives me ReferenceError: Test is not defined. Here Test is the MongoDB entity (model).
Dockerfile:
FROM node:latest
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "./"]
RUN npm install --verbose --force && mv node_modules ../
COPY . .
EXPOSE 80
CMD npm start
TestController:
/**
 * TestController
 *
 * @description :: Server-side actions for handling incoming requests.
 * @help        :: See https://sailsjs.com/docs/concepts/actions
 */
module.exports = {
  index: async function (req, res) {
    var data = await Test.find(); // Here I am getting the error: Test is not defined.
    res.json(data);
  }
};
config/routes.js:
'GET /test': {controller:'test', action:'index'}
I found the issue: moving node_modules to the parent directory was the problem.
The configuration below works for me.
FROM node:latest
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "./"]
RUN npm install --verbose --force
COPY . .
EXPOSE 80
CMD npm start
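A note for context (my addition, not from the original post): Sails makes models like Test available as globals only when model globalization is enabled, and it resolves adapters and hooks from the app directory's own node_modules, which is presumably why relocating that folder broke model loading. A minimal sketch of the relevant setting, assuming a default Sails 1.x config/globals.js:
// config/globals.js (Sails 1.x defaults, shown for reference)
module.exports.globals = {
  _: require('@sailshq/lodash'),
  async: false,
  models: true, // must stay true for `Test.find()` to work without an explicit require
  sails: true
};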
After building, I start the image and see this in the terminal:
Executing task: docker run --rm -it -p 3000:3000/tcp bamdock:latest
Listening on 0.0.0.0:3000
However when trying to reach http://localhost:3000/ in browser I see:
The connection was reset

The connection to the server was reset while the page was loading.
The site could be temporarily unavailable or too busy. Try again in a few moments.
If you are unable to load any pages, check your computer’s network connection.
If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the web.
This is my Dockerfile:
FROM node:19 as builder
RUN npm install -g pnpm
WORKDIR /usr/src/app
COPY package*.json ./
RUN pnpm install
COPY prisma ./prisma/
COPY .env ./
RUN npx prisma generate
COPY . .
RUN pnpm run build
FROM node:19-alpine3.16
WORKDIR /app
COPY --from=builder /usr/src/app/build .
COPY --from=builder /usr/src/app/package.json .
COPY --from=builder /usr/src/app/node_modules ./node_modules
EXPOSE 3000
CMD ["node", "index.js"]
Any ideas what I'm missing?
My first time trying to get Docker working ...
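One way to narrow this down (a debugging sketch of my own, not from the original post): check whether the app answers from inside the container at all, which separates an app-level failure from a port-publishing or firewall problem.
# Find the running container, then probe the server from inside it
docker ps
docker exec -it <container-id> wget -qO- http://127.0.0.1:3000/
# If this also fails, the app itself is not serving on port 3000;
# if it succeeds, the problem is between the host and the container.
(wget is available in the busybox shell of the node:19-alpine3.16 base image.)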
I have a NextJS application that accesses a database from the API.
When in development, I have a .env file that contains the host, port, username, password, and database. After running npm run dev, the API functionality works as expected. Even if I run npm run build and npm run start on my local machine, it works correctly.
The problem comes after I push to Github and have Github build the app and deploy it as a docker container. For some reason, the docker build does not accept my environment variables loaded through an attached .env file.
To further elaborate: the same .env file is present on the dev machine and mounted into the Docker container on the production server, in the same place (the project root: /my-site/.env), but the production API does not work.
The environment variables are included in /lib/db.tsx:
import mysql from "mysql2/promise";

const executeQuery = async (query: string) => {
  try {
    const db = await mysql.createConnection({
      host: process.env.MYSQL_HOST,
      database: process.env.MYSQL_DATABASE,
      user: process.env.MYSQL_USER,
      password: process.env.MYSQL_PASSWORD,
    });
    const [results] = await db.execute(query, []);
    db.end();
    return results;
  } catch (error) {
    return { error };
  }
};

export default executeQuery;
This file is included in the API endpoints as:
import executeQuery from "../../../../lib/db";
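For completeness, here is my sketch of a typical call site (the endpoint name and import depth are hypothetical, not from the original post):
// pages/api/users.ts -- hypothetical endpoint
import type { NextApiRequest, NextApiResponse } from "next";
import executeQuery from "../../lib/db";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  // Server-side only: process.env values inside executeQuery are read at request time
  const results = await executeQuery("SELECT * FROM users LIMIT 10");
  res.status(200).json(results);
}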
Again, since it works on the development computer, I think the issue is with the building of the Docker container.
Here is my included Dockerfile:
FROM node:lts as dependencies
WORKDIR /my-site
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile
FROM node:lts as builder
WORKDIR /my-site
COPY . .
COPY --from=dependencies /my-site/node_modules ./node_modules
RUN yarn build
FROM node:lts as runner
WORKDIR /my-site
ENV NODE_ENV production
# If you are using a custom next.config.js file, uncomment this line.
# COPY --from=builder /my-site/next.config.js ./
COPY --from=builder /my-site/public ./public
COPY --from=builder /my-site/.next ./.next
COPY --from=builder /my-site/node_modules ./node_modules
COPY --from=builder /my-site/package.json ./package.json
EXPOSE 3000
CMD ["yarn", "start"]
Any and all assistance is greatly appreciated!
Edit: Other things I have attempted:
Added them as environment variables to the Docker container in the docker-compose file (under environment) and verified that they are accessible inside the container using echo $MYSQL_USER.
Mounted the .env file inside the .next folder.
Mounted the .env file inside the .next/server folder.
I ended up solving my own issue after hours of trying to figure it out.
My solution was to create a .env.production file and commit it to git.
I also adjusted my Dockerfile to include: COPY --from=builder /my-site/.env.production ./
I am not a fan of that solution, as it involves pushing secrets to a repo, but it works.
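A possible alternative worth testing (my note, not something verified in the thread, and the edit above suggests plain container environment variables did not work in this setup): Docker can load the same file at run time without baking it into the image:
docker run --env-file ./.env.production -p 3000:3000 my-site:latest
Here my-site:latest is a hypothetical image tag; --env-file injects each KEY=value line into the container's environment.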
I moved my project from pip to Poetry and my Docker container failed to run on Google Cloud Run.
The last line in my Dockerfile:
CMD ["poetry", "run", "uwsgi", "--http-socket", "0.0.0.0:80", "--wsgi-file", "/server/app/main.py", "--callable", "app", "-b 65535"]
It works locally, it works on another laptop, it works in the Cloud Run emulator, but it fails when I try to run it on Cloud Run.
Here is a Cloud Run log:
Creating virtualenv my-project-xTUGyw3C-py3.8 in /home/.cache/pypoetry/virtualenvs
FileNotFoundError
[Errno 2] No such file or directory: b'/bin/uwsgi'
at /usr/local/lib/python3.8/os.py:601 in _execvpe
597│ path_list = map(fsencode, path_list)
598│ for dir in path_list:
599│ fullname = path.join(dir, file)
600│ try:
→ 601│ exec_func(fullname, *argrest)
602│ except (FileNotFoundError, NotADirectoryError) as e:
603│ last_exc = e
604│ except OSError as e:
605│ last_exc = e
Container called exit(1).
It has the correct port set. It doesn't use any environment variables. I don't use volumes; files are passed into the image through COPY.
The log says the application can't find the uwsgi file. That file doesn't exist in my local setup either, yet locally it works without any errors.
How is it even possible for a Docker container to behave differently?
UPD: My Dockerfile:
FROM python:3.8
WORKDIR server
ENV PYTHONPATH $PYTHONPATH:/server
RUN pip install poetry==1.1.11
COPY poetry.lock /server
COPY pyproject.toml /server
RUN poetry install
EXPOSE 80
COPY /data /server/data
COPY /test /server/test
COPY /app /server/app
CMD ["poetry", "run", "uwsgi", "--http-socket", "0.0.0.0:80", "--wsgi-file", "/server/app/main.py", "--callable", "app", "-b 65535"]
Adding HOME=/root to the CMD fixed the problem for me:
CMD HOME=/root poetry run python main.py
Apparently HOME is not set correctly there, and this is a known issue.
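Applied to the uwsgi command from the question, that would look something like this (my adaptation; the answer above used a plain python entrypoint):
# Shell-form CMD, so HOME is set for Poetry's virtualenv/cache lookup
CMD HOME=/root poetry run uwsgi --http-socket 0.0.0.0:80 --wsgi-file /server/app/main.py --callable app -b 65535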
I was trying to dockerize my existing simple Vue app, following this tutorial from the Vue website: https://v2.vuejs.org/v2/cookbook/dockerize-vuejs-app.html. I successfully created the image and the container. My problem is that when I edit my code, for example changing "hello world" in App.vue, the page does not automatically update (is this what they call hot reload?). Or should I migrate to the latest Vue so that it will work?
Here is how I run the container, followed by my Dockerfile:
docker run -it --name=mynicevue -p 8080:8080 mynicevue/app
FROM node:lts-alpine
# install simple http server for serving static content
RUN npm install -g http-server
# make the 'app' folder the current working directory
WORKDIR /app
# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json ./
# install project dependencies
RUN npm install
# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . .
# build app for production with minification
# RUN npm run build
EXPOSE 8080
CMD [ "http-server", "serve" ]
EDIT:
Still no luck. I commented out npm run build. I also set up vue.config.js and added this code:
module.exports = {
  devServer: {
    watchOptions: {
      ignored: /node_modules/,
      aggregateTimeout: 300,
      poll: 1000,
    },
  }
};
Then I run the container like this:
docker run -it --name=mynicevue -v %cd%:/app -p 8080:8080 mynicevue/app
When the app launches, I get this error in the terminal and the browser shows a white screen:
"GET /" Error (404): "Not found"
Can someone please help me figure out what is wrong or missing in my Dockerfile so that I can run my Vue app using Docker?
Thank you in advance.
Okay, I tried your project locally, and here's how to do it.
Dockerfile
FROM node:lts-alpine
# bind your app to the gateway IP
ENV HOST=0.0.0.0
# make the 'app' folder the current working directory
WORKDIR /app
# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json ./
# install project dependencies
RUN npm install
# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . .
EXPOSE 8080
ENTRYPOINT [ "npm", "run", "dev" ]
Use this command to run the docker image after you build it:
docker run -v ${PWD}/src:/app/src -p 8080:8080 -d mynicevue/app
Explanation
It seems that Vue expects your app to be bound to all interfaces (0.0.0.0) when it is served from within a container, hence ENV HOST=0.0.0.0 inside the Dockerfile.
You need to mount your src directory into the running container's /app/src directory so that changes in your local filesystem are directly reflected and visible in the container itself.
The way to watch for file changes in Vue is npm run dev, hence ENTRYPOINT [ "npm", "run", "dev" ] in the Dockerfile.
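Equivalently (my own sketch, not part of the original answer), the host binding can live in vue.config.js instead of the Dockerfile's ENV line, assuming a Vue CLI project:
// vue.config.js -- bind the dev server to all interfaces so the
// published container port can reach it
module.exports = {
  devServer: {
    host: '0.0.0.0',
    port: 8080
  }
};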
If you tried the previous answers and it still doesn't work, try adding watch: { usePolling: true } to your vite.config.js file. File-change events from the host often don't propagate across Docker bind mounts (especially on Windows and macOS), so polling is the reliable fallback:
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [vue()],
  server: {
    host: true,
    port: 4173,
    watch: {
      usePolling: true
    }
  }
})
I have set up a development environment with docker-compose. I use gulp to manage my front-end build process, which still runs on the host machine rather than in a container. How can I go about running this in a Docker container?
This is for a development environment.
This is not a great idea. You should not have your dev dependencies in the container. You should do the build on your build slave (using Jenkins or something) and then copy the generated build files into the Docker web server container that will serve them up.
Look here - https://medium.com/@DenysVuika/your-angular-apps-as-docker-containers-471f570a7f2
I do something similar to what you are trying to achieve. I am building a MERN-stack-type app: the client (which usually runs on the host machine) gets built, and the output is then copied over to my Node/Express server to create an image.
So you could create a Dockerfile to do this and then tie it into your docker-compose setup. I am assuming you only want to build at set stages?
Where I have yarn build is essentially where you would run gulp:
FROM node:10 as build-deps
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY client/package.json ./
RUN yarn
COPY client ./
RUN yarn build
FROM node:10
RUN mkdir -p /usr/src/app/client
COPY --from=build-deps /usr/src/app/build /usr/src/app/client
COPY server/. /usr/src/app
WORKDIR /usr/src/app
RUN npm install
WORKDIR /usr/src/app
EXPOSE 3050
CMD ["node", "index.js"]
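For the gulp case specifically (my adaptation; it assumes gulp is a devDependency of the client and that a task named build exists), the build-stage line would become something like:
# Hypothetical: run the local gulp binary via npx instead of yarn build
RUN npx gulp build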