I'm trying to create a Docker image with GitHub Actions from a static web application built with npm. However, when the Dockerfile runs, the /dist folder is not copied into the image as expected.
This is the Dockerfile:
FROM nginx:1.21.6-alpine
COPY dist /usr/share/nginx/html
And this is the action:
name: Deploy
on:
  push:
    tags:
      - v*
jobs:
  build-homolog:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Setup
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Build
        env:
          NODE_ENV: homolog
        run: npm install; npm run build; docker build -t my-image:1.0.0 .
The result is a working nginx, but without content; it just shows the default page. When I run the npm build and the docker build locally on my machine, it works as expected. I think there is a problem with the directory structure on the GitHub Actions machine, but I can't seem to understand it.
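(As a first debugging step, not from the original post: listing the workspace right before the Docker build shows whether dist exists at all and where the build output actually went. A hypothetical extra step:)
- name: Inspect workspace before docker build
  run: |
    pwd
    ls -la
    ls -la dist || echo "dist is missing"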
Related
I am having trouble getting mkdocs to work within a container run by GitHub Actions on commit.
Hi all,
I have been trying to get my Python code documentation up on GitHub. I have managed to do this via GitHub Actions running
mkdocs gh-deploy --force
using the GitHub Actions workflow below:
name: ci
on:
  push:
    branches:
      - master
      - main
permissions:
  contents: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: 3.x
      - run: pip install mkdocs
      - run: pip install mkdocs-material
      - run: pip install mkdocstrings[python]
      - run: mkdocs gh-deploy --force --config-file './docs/mkdocs.yml'
The issue with this is that mkdocstrings did not work, and so no source code was shown on the webpage. I have since made a Docker container with access, via volume binding, to the .github folder on my local computer.
Dockerfile:
FROM ubuntu:20.04
# This stops apt-get from asking for a geographical location
ARG DEBIAN_FRONTEND=noninteractive
WORKDIR /
COPY requirements.txt /
# TODO: #1 Maybe should not use update (as this can change environment from update to update)
RUN apt-get update -y
RUN apt-get install -y python3.10 python3-pip git-all expect
RUN pip install -r requirements.txt
Docker compose:
version: "3.9"
services:
  mkdocs:
    build: .
    container_name: mkdocs
    ports:
      - 8000:8000
    env_file:
      - ../.env
    volumes:
      - ../:/project
    working_dir: /project/docs
    command: sh -c "./gh-deploy.sh"
This works when I run the Docker container on my computer, but of course when it is run as a workflow on GitHub Actions it does not have access to a .github folder. The GitHub Action is:
name: dockerMkdocs
on:
  push:
    branches:
      - master
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      GH_user: ${{ secrets.GH_user }}
      GH_token: ${{ secrets.GH_token }}
    steps:
      - uses: actions/checkout@v2
      - name: Build the Docker image and run
        run: docker compose --file ./docs/Docker-compose_GA.yml up
Does anyone know how mkdocs knows it is running in a GitHub Action in the first example above, yet does not have access to the same "environment" when running inside a Docker container? If I could answer this, I could get 'mkdocs gh-deploy --force' to work within GitHub Actions and speed up CI/CD.
My GitHub repo is at: https://github.com/healthENV/healthENVsandbox
Many thanks
I think you have two options:
1. Run the entire job inside of a container
In that case, the checkout action will get your repository, and the script you run can then find the necessary files. This works because all steps in the job are executed inside of the container.
2. Mount the $GITHUB_WORKSPACE folder
Mount the folder with the checked-out repo into the container. You already mount a folder to the project folder, but it seems that is not the correct one. You can run a check to see what the current folder is before you run docker compose (and maybe an extra one inside of the script as well). Sketches of both options follow.
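For illustration only (not part of the original answer), option 1 could look roughly like this, assuming the pip-installable tools above are everything the job needs:
jobs:
  deploy:
    runs-on: ubuntu-latest
    # All steps below, including checkout, execute inside this container
    container:
      image: python:3.10
    steps:
      - uses: actions/checkout@v3
      - run: pip install mkdocs mkdocs-material 'mkdocstrings[python]'
      - run: mkdocs gh-deploy --force --config-file './docs/mkdocs.yml'
And for option 2, a quick check of the current folder before docker compose runs:
- name: Check current folder
  run: pwd && ls -la && echo "$GITHUB_WORKSPACE"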
I have a few queries around this.
I have a Docker build failing due to a path issue. It works when I execute it on my local PC. What should the path be here?
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: checkout
        uses: actions/checkout@v3
      - name: Node
        uses: actions/setup-node@v3
        with:
          node-version: 14.15
      - name: Dependencies
        run: npm install --legacy-peer-deps
      - name: Build
        run: npm run build-prod
      - name: Docker Login
        run: docker login -u $USERNAME -p $PWD
      - name: Build
        run: docker build . --tag $REPO:latest
      - name: Docker Push
        run: docker push $REPO:latest
Error
> [3/3] COPY /dist/app-name /usr/share/nginx/html:
------
error: failed to solve: failed to compute cache key: failed to walk /var/lib/docker/tmp/buildkit-mount67345738657/dist: lstat /var/lib/docker/tmp/buildkit-mount67345738657/dist: no such file or directory
Dockerfile
FROM nginx:1.17.1-alpine
COPY nginx.conf /etc/nginx/nginx.conf
COPY /dist/app-name /usr/share/nginx/html
How do I get the tag version when a new release is created, so the build runs with two tags: latest and the {tag} created for the release? This is to keep build backups of old tags in Docker. What changes are needed above to produce the two tags? (See the sketch below.)
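(A hypothetical sketch, not from the thread: on a tag-triggered run you can strip the refs/tags/ prefix from GITHUB_REF and apply both tags to one build, for example:)
- name: Extract tag name
  run: echo "TAG=${GITHUB_REF#refs/tags/}" >> "$GITHUB_ENV"
- name: Build with both tags
  run: docker build . --tag $REPO:latest --tag $REPO:$TAG
- name: Push both tags
  run: |
    docker push $REPO:latest
    docker push $REPO:$TAG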
We have a Java application which uses Maven, Docker and GitHub Actions.
The below snippet is from our Dockerfile.
FROM maven:3.6.3-jdk-8-openj9 AS builder
RUN mkdir /app
WORKDIR /app
ADD . .
RUN mvn clean install
And then we have a deploy.yml for GitHub Actions. The issue is that on GitHub Actions, Maven always downloads the dependencies, then creates a jar, and finally a Docker image is created.
Using the tutorial below, I have tried to implement caching in GitHub Actions:
https://evilmartians.com/chronicles/build-images-on-github-actions-with-docker-layer-caching
The key for the cache in my case is calculated as follows:
key: ${{ runner.os }}-buildx-${{ hashFiles('pom.xml') }}
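(For context, a hypothetical actions/cache step using that key; the path and restore-keys mirror the tutorial's layout and are assumptions:)
- name: Cache Docker layers
  uses: actions/cache@v2
  with:
    path: /tmp/.buildx-cache
    key: ${{ runner.os }}-buildx-${{ hashFiles('pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-buildx-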
I also made the following changes in the Dockerfile.
FROM maven:3.6.3-jdk-8-openj9 AS builder
RUN mkdir /app
WORKDIR /app
ADD . .
RUN mvn clean dependency:copy-dependencies
ADD . .
RUN mvn install
Still, I do not see any significant reduction in build time.
What I am trying to do is download the Maven dependencies as a separate layer in the Docker image, and cache that layer so it can be re-used in later builds.
I would be grateful if anyone could shed some light on this issue.
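(For illustration, this is not the asker's code: the usual way to get a separate dependency layer is to copy only pom.xml before resolving dependencies, so the dependency layer is invalidated only when pom.xml changes:)
FROM maven:3.6.3-jdk-8-openj9 AS builder
WORKDIR /app
# Copy only the POM first; this layer stays cached until pom.xml changes
COPY pom.xml .
RUN mvn -B dependency:go-offline
# Copy sources afterwards, so code changes no longer re-download dependencies
COPY src ./src
RUN mvn -B install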
Use BuildKit; it is already part of every Docker Engine (version 19 and later, for sure).
A very nice Medium post:
Introducing buildkit
This is a nice example which you can use:
docker cache ci
although it is in Python.
Regarding the CI environment, GitHub Actions has the fantastic build-push-action.
Example
name: ci
on:
  push:
    branches:
      - "master"
jobs:
  docker:
    runs-on: ubuntu-20.04
    steps:
      # Check out code
      - name: Checkout
        uses: actions/checkout@v2
      # This is a separate action that sets up the buildx runner
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      # So now you can use Actions' own caching!
      - name: Cache Docker layers
        uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-
      - name: Login to DockerHub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      # And make the cache available to the builds
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: .
          push: false
          tags: user/app:latest
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache-new
      # Swap the fresh cache in so it does not grow without bound across runs
      - name: Move cache
        run: |
          rm -rf /tmp/.buildx-cache
          mv /tmp/.buildx-cache-new /tmp/.buildx-cache
I have a repo set up with 3 major branches: master, development and demo. When I commit, I run through a global gitops file and pass in a Dockerfile.
I'm using GitHub.
If I'm pushing either development or demo I want to run npm run development; if master, then I want to run npm run production. However, I can't figure out how to pass the branch name into the Dockerfile.
# gitops.yaml
jobs:
  gitops:
    uses: github-actions/.github/workflows/gitops.yaml@v1
    with:
      dockerfile: ./docker/php/Dockerfile
    secrets:
      DOCKER_BUILD_ARGS: |
        ENVIRONMENT=${GITHUB_REF#refs/heads/}
# gitops.yaml@v1
jobs:
  build:
    steps:
      - name: Build and push
        id: docker_build
        uses: docker/build-push-action@v2
        with:
          push: true
          context: .
          file: ${{ inputs.dockerfile }}
          build-args: ${{ secrets.DOCKER_BUILD_ARGS }}
# Dockerfile
FROM node:11 as node
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
ARG ENVIRONMENT
RUN npm run ${ENVIRONMENT} && rm -rf node_modules/
The above doesn't work at all; I'm not too sure how to go about this.
You didn't specify which event triggers your workflow.
Assuming you are using the pull_request event to trigger your workflow, you can use the github context, specifically github.head_ref (or the GITHUB_HEAD_REF environment variable):
The head_ref or source branch of the pull request in a workflow run. This property is only available when the event that triggers a workflow run is either pull_request or pull_request_target.
Since you are using Node, I suggest you leverage the NODE_ENV environment variable within your Dockerfile; a sketch follows.
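(A minimal, hypothetical sketch assuming a pull_request trigger; the step mirrors the build-push-action step from the question:)
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    push: true
    context: .
    file: ${{ inputs.dockerfile }}
    build-args: |
      ENVIRONMENT=${{ github.head_ref }}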
I want to run some npm scripts, create a Docker image and publish it on Docker Hub.
I get this error when trying to generate the image. It seems the second job doesn't see the build directory.
COPY failed: file not found in build context or excluded by .dockerignore: stat build/: file does not exist
Dockerfile
FROM httpd:2.4-alpine
COPY ./build/ /usr/local/apache2/htdocs/myapp/
EXPOSE 80
This is my workflow:
name: CD
on:
  push:
    branches: [ main ]
jobs:
  build:
    name: App build
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
      - name: Npm install
        run: npm install
      - name: Npm build
        run: npm run build
  deploy:
    name: Docker image in DockerHub repository
    runs-on: ubuntu-18.04
    needs: build
    steps:
      - uses: actions/checkout@v2
      - name: LS
        run: ls -R
      - name: Login to dockerhub
        run: docker login -u ${{ secrets.DOCKER_HUB_USER }} -p ${{ secrets.DOCKER_HUB_PASSWORD }}
      - name: Build Docker image
        run: docker build -f ./Dockerfile -t myaccount/myapp .
      - name: Push Docker image to DockerHub
        run: docker push myaccount/myapp:latest
Project structure
| Dockerfile
| package.json
| README.md
| webpack.config.js
+---.github
| \---workflows
| deploy.yml
+---build
+---src
Update: I changed my workflow to ls the whole GITHUB_WORKSPACE.
The build dir is actually missing (the other files are there). Yet the build process (the first job) ends without errors, and if I run ls -R in the first job the build dir is there. It is missing only in the second job.
It seems the state of the workspace at the end of the first job is not available to the second job.
It seems that for this you need actions/upload-artifact and actions/download-artifact.
name: CD
on:
  push:
    branches: [ main ]
jobs:
  build:
    name: App build
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
      - name: Npm install
        run: npm install
      - name: Npm build
        run: npm run build
      - name: LS
        run: ls -R
      - name: Temporarily save webpack artifact
        uses: actions/upload-artifact@v2
        with:
          name: webpack-artifact
          path: build
          retention-days: 1
  deploy:
    name: Docker image in DockerHub repository
    runs-on: ubuntu-18.04
    needs: build
    steps:
      ## Build and deploy Docker images to DockerHub
      - uses: actions/checkout@v2
      - name: Retrieve built package
        uses: actions/download-artifact@v2
        with:
          name: webpack-artifact
          path: build
      - name: LS
        run: ls -R
      - name: Login to dockerhub
        run: docker login -u ${{ secrets.DOCKER_HUB_USER }} -p ${{ secrets.DOCKER_HUB_PASSWORD }}
      - name: Build Docker image
        run: docker build -f ./Dockerfile -t myaccount/myapp ./
      - name: Push Docker image to DockerHub
        run: docker push myaccount/myapp:latest
Two jobs in GitHub Actions run on two separate machines, so the second job cannot see the first one's workspace. The solution is to put them into one job.
name: CD
on:
  push:
    branches: [ main ]
jobs:
  deploy:
    name: Docker image in DockerHub repository
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
      - name: Npm install
        run: npm install
      - name: Npm build
        run: npm run build
      - name: LS
        run: ls -R
      - name: Login to dockerhub
        run: docker login -u ${{ secrets.DOCKER_HUB_USER }} -p ${{ secrets.DOCKER_HUB_PASSWORD }}
      - name: Build Docker image
        run: docker build -f ./Dockerfile -t myaccount/myapp .
      - name: Push Docker image to DockerHub
        run: docker push myaccount/myapp:latest