SvelteKit app running in Docker shows changes only for a second - docker

I have dockerized my SvelteKit app. My issue is that when I run the container and make changes in the frontend UI, I can see them for about one second, and then the frontend looks the same as before the changes.
I think the problem is related to caching in SvelteKit.
My Dockerfile:
FROM node:16
WORKDIR /test-app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
RUN npm run build
ENV PORT 3000
EXPOSE 3000
EXPOSE 24678
CMD ["node", "build"]
My docker-compose.yaml file:
version: '3'
services:
  svelte-test:
    image: sveltekit-test:node
    volumes:
      - ./:/test-app/
    ports:
      - 3000:3000
      - 24678:24678
      - 5173:5173
    tty: true
    stdin_open: true
Port 3000 is for SvelteKit, 5173 is for SvelteKit when running in Docker, and 24678 is for Vite.
My folder structure is:
sveltekit-docker
  test-app
    - Dockerfile
    - docker-compose.yaml
    - package-lock.json
    - package.json
    - svelte.config.js
    - tsconfig.json
    - vite.config.js
    - all SvelteKit folders (src, node_modules, static, tests)

If you want to see the changes, you need HMR (hot module reloading), which is only available in dev mode.
You may also need to rebuild esbuild inside the container. Try replacing the last line of your Dockerfile with:
CMD npm rebuild esbuild && npm run dev -- --host
I created a repo if it helps.
To use it:
docker build --tag sveltekit-test .
docker-compose up
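For context, here is a minimal sketch of what the full dev-oriented Dockerfile could look like after that change. The base image, paths and port numbers come from the question above; dropping the production build step and exposing the Vite ports (5173 and 24678) instead of 3000 are assumptions on my part:
# Dev-mode sketch: run the Vite dev server instead of the built adapter output so HMR works
FROM node:16
WORKDIR /test-app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
# 5173 is the Vite dev server, 24678 is its HMR websocket (ports from the question)
EXPOSE 5173
EXPOSE 24678
# Rebuild esbuild for the container's platform, then start the dev server
# bound to all interfaces so it is reachable from outside the container
CMD npm rebuild esbuild && npm run dev -- --host
Combined with the bind mount from the compose file (./:/test-app/), edits made on the host should then be picked up by HMR instead of the stale production build being served.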

Related

Nuxt 3 Docker doesn't recognize new pages, what am I doing wrong?

I have a problem with my Nuxt 3 project that I run with Docker (dev environment).
Nuxt 3 should automatically create routes when I create .vue files in the pages directory, and that works when I run my project outside of Docker. But when I use Docker, it doesn't recognize new files until I restart the container. The same thing happens when I try to delete files from the pages directory: no changes are recognized until I restart the container. The weird thing is that this happens only in the pages directory; in other directories everything works fine. Just to mention that hot reload works; I set up vite in nuxt.config.ts.
docker-compose.yaml
version: '3.8'
services:
  nuxt:
    build:
      context: .
    image: nuxt_dev
    container_name: nuxt_dev
    command: npm run dev
    volumes:
      - .:/app
      - /app/node_modules
    ports:
      - "3000:3000"
      - "24678:24678"
Dockerfile:
FROM node:16.14.2-alpine
WORKDIR /app
RUN apk update && apk upgrade
RUN apk add git
COPY ./package*.json /app/
RUN npm install && npm cache clean --force && npm run build
COPY . .
ENV PATH ./node_modules/.bin/:$PATH
ENV NUXT_HOST=0.0.0.0
ENV NUXT_PORT=3000
EXPOSE 3000
CMD ["npm", "run", "dev"]
I tried some things with Docker volumes, like adding a separate volume just for pages, like this:
./pages:app/pages
/pages:app/pages
app/pages
but as I thought, none of those things helped.
One more thing that is weird to me: when I created a .vue file in the pages directory, I checked whether it appeared in the container, and it did. I'm not an expert in Docker or Nuxt, I just started learning both, so any help would be much appreciated.
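No answer is recorded for this one, but the symptoms (a bind-mounted source tree whose changes are only picked up after a restart) are typical of file-system events not reaching the watcher inside the container. A commonly suggested workaround, shown only as a sketch and an assumption rather than a verified fix for Nuxt's pages scanner, is to force polling-based watching; chokidar-based watchers honor the CHOKIDAR_USEPOLLING environment variable, which can be set in the compose service:
# Sketch: force polling so the watcher notices changes on the bind mount.
# Whether Nuxt's pages scanner respects CHOKIDAR_USEPOLLING is an assumption.
services:
  nuxt:
    build:
      context: .
    command: npm run dev
    environment:
      - CHOKIDAR_USEPOLLING=true
    volumes:
      - .:/app
      - /app/node_modules
    ports:
      - "3000:3000"
      - "24678:24678"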

2 docker builds into a multi-build

I have one Dockerfile; the first stage of the build is a Node server serving some data, and the second stage is a React app. I use a docker-compose file to run the Dockerfile.
I am able to access the React app via port 3000, but the server from the other stage isn't running, so I can't access the data.
Any idea how to solve it?
FROM node:12.6
WORKDIR /usr/src/app
COPY package.json .
COPY . .
# node server
EXPOSE 5500
CMD ["npm","run", "server"]
FROM node:12.6
WORKDIR /usr/src/app
COPY package.json .
RUN npm i
COPY . .
# react app
EXPOSE 3000
CMD ["npm","run", "dev"]
version: "3.9"
services:
testingapp:
container_name: testingApp
build: .
volumes:
- ./src:/app/src:delegated
ports:
- "3000:3000"
I have read various docs online.
You're trying to run the front- and back-ends in the same container. A container only runs one process, though; if you need two separate processes from the same code base then you can run two separate containers off the same image, overriding the command: on one of them.
So reduce the Dockerfile to copy the code base in, and declare one process or the other as the main container command:
FROM node:12.6
WORKDIR /usr/src/app
COPY package.json package-lock.json ./
RUN npm ci
COPY ./ ./
EXPOSE 3000
CMD ["npm", "run", "server"]
Now in your Compose file, declare two separate containers. For the second, override the command: with the alternate program to run. Both can build: the same image; the second build will come entirely from the Docker layer cache and be all but free. The code is built into the image and you don't need to replace it using volumes:.
version: '3.8'
services:
  express:
    build: .
    ports: ['5500:3000']
  react:
    build: .
    command: npm run dev
    ports: ['3000:3000']
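To try this out (assuming the Compose file above sits next to the Dockerfile as docker-compose.yml):
# Build the shared image once and start both services
docker-compose up --build
# The Node server is then published on host port 5500 and the React dev server
# on host port 3000, per the port mappings above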

NextJS Docker error: Couldn't find a `pages` directory. Please create one under the project root

Goal
Dockerize NextJS application
Problem
Docker compose up yields the following error: Couldn't find a pages directory. Please create one under the project root.
Application
Files & folders
docker-compose.yml
web
  .next
  pages
  public
  .dockerignore
  dockerfile
  [more nextjs files & folders here]
docker-compose
version: '3'
services:
  web:
    build:
      context: web
      dockerfile: dockerfile
    ports:
      - "3000:3000"
    container_name: rughood_web
dockerfile
FROM node:16
WORKDIR /web
COPY package*.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "run", "dev"]
.dockerignore
Dockerfile
.dockerignore
node_modules
npm-debug.log
README.md
.git
Note!
The NextJS application itself works fine when I run npm run dev within the web directory (which invokes the script "dev": "next dev" in package.json). I only get the error when trying to dockerize it. Moreover, in the docker-compose I also start a Redis cache, which works fine too. Therefore I conclude the error must lie in how I try to combine Docker and NextJS. Thank you very much in advance :)
Update 1
How I got there
Using the tips from #HansKilian and Exploring Docker container's file system I did the following:
Cd to the web directory
Built an image from the dockerfile docker build .
Explored the image with the following command docker run --rm -it --entrypoint=/bin/bash name-of-image
Once inside, execute ls or ls -lsa
This gave me the following results:
What's in the derived image
dockerfile
next-env.d.ts
next.config.js
node_modules
package-lock.json
package.json
pages
public
tsconfig.json
[Among other files/folders]
So the pages folder actually seems to be in the root of the container, yet I still get the error (pages is a directory in the container into which I can cd and run ls).
P.S. don't forget to delete your image if you're not going to use it anymore.
Update 2
Building the image and running it from within the web directory actually works, so it might actually have something to do with the docker-compose?
Here is my working Dockerfile with nextjs:
FROM node:16.14.0
RUN npm install -g npm@8.5.5
RUN mkdir -p /app
WORKDIR /app
COPY package*.json /app
RUN npm i
COPY . /app
EXPOSE 3000
RUN npm run build
CMD ["npm", "run", "dev"]
And docker-compose.yml :
version: "3.7"
services:
project-name:
image: project-name
build:
context: .
dockerfile: Dockerfile
container_name: project-name
restart: always
volumes:
- ./:/app
- /app/node_modules
- /app/.next
ports:
- "3000:3000"
While I was trying every single line of code ever uploaded to the internet, I came back to my initial set-up (from the question) and suddenly it now works. Source control confirms I didn't change a thing.
To be sure, I deleted all containers, images and volumes from Docker and ran docker compose up. It still worked. I tried many things to recreate the error, but I couldn't. Thank you all for helping, and hopefully this will be of use to someone else!
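The "it suddenly works" outcome suggests a stale image or volume was being reused. When chasing this kind of issue, a clean rebuild along these lines (a generic sketch, not something taken from the original thread) rules that out:
# Stop the project's containers and remove its volumes
docker compose down -v
# Rebuild without the layer cache and start fresh
docker compose build --no-cache
docker compose up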

Container exited with code 0, and my app is served from the host OS

I want to dockerize a Next.js project.
I am using Ubuntu 20.04
I first created a Next.js app in my /home/user/project/ folder using npx create-next-app
So I have the project source code in my host machine.
But I want to dockerize it, so I created a docker-compose.yaml:
next:
  build:
    context: ./next
    dockerfile: Dockerfile
  container_name: next
  volumes:
    - ./next:/var/www/html
  ports:
    - "3000:3000"
  networks:
    - nginx
And this is the Dockerfile:
#Creates a layer from node:alpine image.
FROM node:alpine
#Creates directories
RUN mkdir -p /usr/src/app
#Sets an environment variable
ENV PORT 3000
#Sets the working directory for any RUN, CMD, ENTRYPOINT, COPY, and ADD commands
WORKDIR /usr/src/app
#Copy new files or directories into the filesystem of the container
COPY package.json /usr/src/app
COPY package-lock.json /usr/src/app
#Execute commands in a new layer on top of the current image and commit the results
RUN npm install
##Copy new files or directories into the filesystem of the container
COPY . /usr/src/app
#Execute commands in a new layer on top of the current image and commit the results
RUN npm run build
#Informs container runtime that the container listens on the specified network ports at runtime
EXPOSE 3000
#Allows you to configure a container that will run as an executable
ENTRYPOINT ["npm", "run"]
Then I build my container using docker-compose build && docker-compose up.
The container is built, but it's not running; it shows EXITED (0)
and the logs contain the following message:
Lifecycle scripts included in next-frontend@0.1.0:
  start
    next start
available via `npm run-script`:
  dev
    next dev
  build
    next build
  lint
    next lint
But of course, if I run npm run dev on the host, it runs the app from the host and not from the container (it runs, but that's not what I want).
I feel like there is some very fundamental mistake in my deployment, but I just started with Docker, so I can't figure out what it is.
Also, I copied the Dockerfile from a tutorial, so it might not fit the way I created the project.
ENTRYPOINT ["npm", "run"]... What?
From npm run documentation,
This runs an arbitrary command from a package's "scripts" object. If no "command" is provided, it will list the available scripts.
In the docker-compose.yml, you need to override the CMD instruction (that is empty in your case) with the npm script you want to run. Something like this:
next:
  build:
    context: ./next
    dockerfile: Dockerfile
  container_name: next
  command: ["start"]
  volumes:
    - ./next:/var/www/html
  ports:
    - "3000:3000"
  networks:
    - nginx
Since you are using the Compose Spec, this is the reference for the command instruction.
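Alternatively (not part of the original answer, just the same idea expressed in the Dockerfile), the default script can be baked into the image, because an exec-form CMD supplies default arguments to an exec-form ENTRYPOINT, and a compose command: can still override it:
# Keep npm run as the entrypoint and make "start" its default argument;
# docker-compose's command: (e.g. ["dev"]) would still override this CMD
ENTRYPOINT ["npm", "run"]
CMD ["start"]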

Next running inside docker is not seeing files changed on the host

I'm trying to run Next.js in dev mode inside of Docker. I'd like to make changes on my host machine and have Next.js rebuild the app running inside of Docker. The problem is that it performs the initial build but never rebuilds.
Dockerfile.dev
FROM node:lts
WORKDIR /usr/src/app
COPY package*.json ./
RUN yarn install
COPY . .
RUN npm run build
docker-compose.dev.yml
version: "3.8"
services:
backend-service:
build:
context: .
dockerfile: ./Dockerfile.dev
command: npm run dev
ports:
- 3001:3001
volumes:
- .:/usr/src/app
- /usr/src/app/node_modules
- /usr/src/app/.next
Note: 3001 is the port it's running on. That's not a mistake.
What am I missing?
Edit
I'm seeing recommendations for npm run watch, but the Next documentation specifies dev, build, and start scripts. Their specific naming is not important, but dev (the next dev command) is how you watch for changes and rebuild.
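No accepted answer appears here, but as with the Nuxt question above, file-system events often don't cross the bind mount, so the watcher inside the container never sees host-side edits. A commonly suggested workaround, offered as a sketch and an assumption rather than a confirmed fix, is to switch webpack's watcher to polling via the WATCHPACK_POLLING environment variable in the compose service:
# Sketch: force polling so `next dev` notices edits on the bind-mounted source.
# WATCHPACK_POLLING is read by webpack 5's watcher; relying on it here is an assumption.
services:
  backend-service:
    build:
      context: .
      dockerfile: ./Dockerfile.dev
    command: npm run dev
    environment:
      - WATCHPACK_POLLING=true
    ports:
      - 3001:3001
    volumes:
      - .:/usr/src/app
      - /usr/src/app/node_modules
      - /usr/src/app/.next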
