GitHub Actions: nginx reverse proxy for Vue.js frontend and Node.js server with Docker

I would like to create a CI/CD pipeline with GitHub Actions that builds an nginx reverse proxy server for my Vue.js frontend and my NestJS backend.
What I have done so far:
On every push or pull request to the main branch, the tests run.
If the tests pass, the frontend and backend are built.
What I want to do now:
Create a Docker image of the nginx reverse proxy server for the frontend and backend and push it to the Docker registry.
name: CI/CD Pipeline - Runs All tests and if all pass, builds the frontend and backend and deploys and configures the nginx proxy server to serve the frontend and backend.
# 1) on push or pull request to main branch
# 2) run frontend and backend tests
# 3) if tests pass, build the frontend and backend and save the build artifacts (./WEB/dist and ./BACKEND/dist)
# 4) if build succeeds, create a docker image of the nginx proxy server and push it to the docker registry
# 5) if tests fail or build fails, do nothing
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  backend-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js 16.x
        uses: actions/setup-node@v3
        with:
          node-version: 16.x
          cache: 'npm'
          cache-dependency-path: 'BACKEND/package-lock.json'
      - name: Execute Backend Unit tests
        run: |
          npm ci
          npm run test
        working-directory: ./BACKEND
  frontend-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js 16.x
        uses: actions/setup-node@v3
        with:
          node-version: 16.x
          cache: 'npm'
          cache-dependency-path: 'WEB/package-lock.json'
      - name: Execute Frontend Unit tests
        run: |
          npm ci
          npm run test
        working-directory: ./WEB
  build-frontend:
    runs-on: ubuntu-latest
    needs: [backend-tests, frontend-tests]
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js 16.x
        uses: actions/setup-node@v3
        with:
          node-version: 16.x
          cache: 'npm'
          cache-dependency-path: 'WEB/package-lock.json'
      - name: Build Frontend
        run: |
          npm ci
          npm run build
        working-directory: ./WEB
      - name: Save Frontend Build Artifacts
        uses: actions/upload-artifact@v2
        with:
          name: frontend-build-artifacts
          path: ./WEB/dist
  build-backend:
    runs-on: ubuntu-latest
    needs: [backend-tests, frontend-tests]
    steps:
      - uses: actions/checkout@v3
      - name: Use Node.js 16.x
        uses: actions/setup-node@v3
        with:
          node-version: 16.x
          cache: 'npm'
          cache-dependency-path: 'BACKEND/package-lock.json'
      - name: Build Backend
        run: |
          npm ci
          npm run build
        working-directory: ./BACKEND
      - name: Save Backend Build Artifacts
        uses: actions/upload-artifact@v2
        with:
          name: backend-build-artifacts
          path: ./BACKEND/dist
I tried to do it with this action, but I don't know how to create a Docker image of the nginx reverse proxy server and push it to the Docker registry with it.
Thanks in advance for your help.
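A minimal sketch of what the missing job could look like, assuming a Dockerfile at the repository root that copies the two dist folders and an nginx.conf into an nginx base image, and Docker Hub credentials stored as repository secrets (the image name, secret names, and paths below are illustrative, not from the original post):

  build-and-push-proxy:
    runs-on: ubuntu-latest
    needs: [build-frontend, build-backend]
    steps:
      - uses: actions/checkout@v3
      # Restore the build artifacts produced by the previous jobs
      - uses: actions/download-artifact@v2
        with:
          name: frontend-build-artifacts
          path: ./WEB/dist
      - uses: actions/download-artifact@v2
        with:
          name: backend-build-artifacts
          path: ./BACKEND/dist
      - uses: docker/setup-buildx-action@v2
      - uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      # Build the nginx reverse proxy image from the repo-root Dockerfile and push it
      - uses: docker/build-push-action@v3
        with:
          context: .
          file: ./Dockerfile
          push: true
          tags: your-dockerhub-user/nginx-proxy:latest

The assumed Dockerfile would typically start FROM nginx, COPY the frontend dist into /usr/share/nginx/html, and COPY an nginx.conf that proxies /api requests to the NestJS backend, which usually runs as its own Node.js container behind the proxy rather than inside the nginx image.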

Related

How can I synchronize Docker container registry with a GitHub action in the same repository?

I have a repository with a Dockerfile and a custom action described by an action.yml which references that Dockerfile. At first, I referenced the Dockerfile locally as a path in action.yml using image: 'Dockerfile'. However, because GitHub doesn't support caching for Docker images and rebuilds the image on every run, it takes a long time to prepare, clutters the logs, and also doesn't have the required entrypoint file. Thus I upload it to the GitHub Container Registry on push to master. The problem is that the action now always points to the master tag, which means the tests may be executed on an outdated version as the latest image may not be deployed yet, and I also can't pin tags to the associated tag of the image or go back and get an older state. How can I synchronize the image on the GitHub Container Registry with the associated state in the repository?
action.yml
[...]
runs:
  using: 'docker'
  image: docker://ghcr.io/orgaization/repo:master
[...]
.github/workflows/test
on:
  workflow_dispatch:
  push:
    branches: ['master']
  pull_request_target:
    branches: ['master']
jobs:
  test-correct:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: test
        uses: ./
        with:
          [...]
.github/workflows/publish
on:
  workflow_dispatch:
  push:
    branches: ['master']
  pull_request_target:
    branches: ['master']
env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}
jobs:
  build-and-push-image:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      [...]
      - name: Log in to the Container registry
        [...]
      - name: Build and push Docker image
        uses: docker/build-push-action@v2
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
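The [...] placeholders above hide the 'meta' and registry-login steps. As a hedged sketch (not from the original post), docker/metadata-action can generate per-commit and per-tag image tags, so every pushed image is pinned to the exact repository state instead of only the moving master tag:

      # Hypothetical metadata step that the tags/labels above refer to; type=sha adds an
      # immutable per-commit tag, type=ref adds branch and git-tag based tags
      - name: Extract Docker metadata
        id: meta
        uses: docker/metadata-action@v4
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=ref,event=branch
            type=ref,event=tag
            type=sha
      - name: Log in to the Container registry
        uses: docker/login-action@v2
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

With the sha tags, each commit gets its own immutable image in ghcr.io, so older states stay addressable and a given repository state can be pinned to the image that matches it.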

Docker: Re-use container image by caching

Here I have two jobs in a workflow. The only goal we want to achieve is to reuse the container images by using a cache or some other means, the same way we do for node_modules.
jobs:
  build:
    name: build
    runs-on: [self-hosted, x64, linux, research]
    container:
      image: <sample docker image>
      env:
        NPM_AUTH_TOKEN: <sample token>
    steps:
      - uses: actions/checkout@v2
      - name: Install
        run: |
          npm install
      - name: Build
        run: |
          npm build
  Test:
    name: Test Lint
    runs-on: [self-hosted, x64, linux, research]
    container:
      image: <sample docker image>
      env:
        NPM_AUTH_TOKEN: <sample token>
    steps:
      - uses: actions/checkout@v2
      - name: Install Dependencies
        run: npm ci
      - name: Lint Check
        run: npm run lint
I would suggest using Docker's build-push-action for this purpose. With build-push-action, you can cache your container image layers using the inline cache, the registry cache, or the experimental cache backend API:
Inline cache
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    context: .
    push: true
    tags: user/app:latest
    cache-from: type=registry,ref=user/app:latest
    cache-to: type=inline
Refer to the Buildkit docs.
Registry cache
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    context: .
    push: true
    tags: user/app:latest
    cache-from: type=registry,ref=user/app:buildcache
    cache-to: type=registry,ref=user/app:buildcache,mode=max
Refer to the Buildkit docs.
Cache backend API
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    context: .
    push: true
    tags: user/app:latest
    cache-from: type=gha
    cache-to: type=gha,mode=max
Refer to the Buildkit docs.
I personally prefer the cache backend API, as it's easy to set up and provides a great reduction in overall CI pipeline duration.
Looking at the comments, it seems you want to share the Docker cache between workflows. In that case, you can share Docker images between jobs in a workflow using this example:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: .
          file: ./Dockerfile
          tags: myimage:latest
          outputs: type=docker,dest=/tmp/myimage.tar
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
          name: myimage
          path: /tmp/myimage.tar
  use:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Download artifact
        uses: actions/download-artifact@v2
        with:
          name: myimage
          path: /tmp
      - name: Load Docker image
        run: |
          docker load --input /tmp/myimage.tar
          docker image ls -a
In general, data is not shared between jobs in GitHub Actions (GHA). Jobs run in parallel on distinct ephemeral VMs unless you explicitly create a dependency with needs.
GHA does provide a cache mechanism. For package-manager-style caching, they have simplified it; see here.
For Docker images, you can either use the docker buildx cache and cache to a remote registry (including ghcr), or use the GHA cache action, which is probably easier. The syntax for actions/cache is straightforward and clearly documented. For buildx, documentation has always been a bit of an issue (largely, I think, because the people building it are so smart that they do not realize how much we do not understand what is in their hearts), so you would need to configure the cache action and then configure buildx to use it.
Alternatively, you could do docker save imagename > imagename.tar and use that in the cache. There is a decent example of that here. No idea who wrote it, but it does the job.
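A rough sketch of that docker save plus actions/cache approach (the image name, tarball path, and cache key are illustrative):

    steps:
      - uses: actions/checkout@v3
      # Restore the saved image tarball from the GHA cache if it exists
      - name: Cache Docker image tarball
        id: image-cache
        uses: actions/cache@v3
        with:
          path: /tmp/imagename.tar
          key: docker-image-${{ hashFiles('Dockerfile') }}
      # Cache miss: build the image and save it as a tarball for next time
      - name: Build and save image
        if: steps.image-cache.outputs.cache-hit != 'true'
        run: |
          docker build -t imagename .
          docker save imagename > /tmp/imagename.tar
      # Cache hit: just load the tarball back into the local image store
      - name: Load cached image
        if: steps.image-cache.outputs.cache-hit == 'true'
        run: docker load --input /tmp/imagename.tar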

GitHub Actions does not trigger on push event

I have a repository that has two folders, and both of them have a Dockerfile inside. Only one of them has GitHub Actions configured to build the Dockerfile. It used to work just fine, but now it does not trigger at all.
What could be the reason for it? This is the GitHub Actions workflow I have built. There are no errors.
name: Docker Image CI
on:
  workflow_dispatch:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Login to GitHub Package Registry
        run: echo ${{ secrets.GITHUB_TOKEN }} | docker login docker.pkg.github.com -u ${{ github.repository }} --password-stdin
      - name: Build the Docker image
        run: docker build -t dashboard:latest .
      - name: Tag the Docker image
        run: docker tag dashboard:latest docker.pkg.github.com/test/dashboard/dashboard:latest
      - name: Push the Docker image to the registry
        run: docker push docker.pkg.github.com/test/dashboard/dashboard:latest

How to run a cached Docker image in GitHub Actions?

I don't know how to run a cached Docker image in GitHub Actions.
I've followed a tutorial about publishing Docker images to implement a task that caches, builds, and pushes a Docker image to Docker Hub.
I need to build, cache, and run the image; publishing the image is optional.
My goal is to speed up the CI workflow.
Here is the GitHub Actions workflow:
name: CI
# Controls when the action will run.
on:
  # Triggers the workflow on push or pull request events but only for the master branch
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - name: Check Out Repo
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1
      - name: Cache Docker layers
        uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-
      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKER_HUB_USERNAME }}
          password: ${{ secrets.DOCKER_HUB_ACCESS_TOKEN }}
      - name: Build and push
        id: docker_build
        uses: docker/build-push-action@v2
        with:
          context: ./
          file: ./Dockerfile
          builder: ${{ steps.buildx.outputs.name }}
          push: true
          tags: ivan123123/c_matrix_library:latest
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
      #- name: Run Docker container
      #  run: ???
      # Upload gcovr code coverage report
      - name: Upload GCC Code Coverage Report
        uses: actions/upload-artifact@v2
        with:
          name: coveragereport
          path: ./builddir/meson-logs/coveragereport/
      - name: Upload code coverage reports to codecov.io page
        run: bash <(curl -s https://codecov.io/bash)
Edit:
I've found no solution for running the cached Docker image, but I have managed to build the image from cache every time the CI workflow runs, using the docker/setup-buildx-action@v1 action. Because the image layers are cached, we don't need to download every Docker image dependency, cutting the time from the original 3 minutes to only 40 seconds.
Below is the GitHub Actions workflow:
name: CI
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Check Out Repo
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1
      - name: Cache register
        uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ hashFiles('**/Dockerfile') }}
      - name: Build Docker image
        uses: docker/build-push-action@v2
        with:
          context: ./
          file: ./Dockerfile
          builder: ${{ steps.buildx.outputs.name }}
          load: true
          tags: c_matrix_library:latest
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
      - name: Run Docker container
        run: docker run -v "$(pwd):/app" c_matrix_library:latest
If you want to cache a published Docker image that lives in a Docker registry, you can do:
- name: Restore MySQL Image Cache if it exists
  id: cache-docker-mysql
  uses: actions/cache@v3
  with:
    path: ci/cache/docker/mysql
    key: cache-docker-mysql-5.7
- name: Update MySQL Image Cache if cache miss
  if: steps.cache-docker-mysql.outputs.cache-hit != 'true'
  run: docker pull mysql:5.7 && mkdir -p ci/cache/docker/mysql && docker image save mysql:5.7 --output ./ci/cache/docker/mysql/mysql-5.7.tar
- name: Use MySQL Image Cache if cache hit
  if: steps.cache-docker-mysql.outputs.cache-hit == 'true'
  run: docker image load --input ./ci/cache/docker/mysql/mysql-5.7.tar
- name: Start containers
  run: docker compose up -d
When docker compose up runs, if a service uses the mysql:5.7 image, Docker will skip downloading it.
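For context, a hedged sketch of a compose service that would pick up the preloaded image (not part of the original answer):

# docker-compose.yml (illustrative)
services:
  db:
    # mysql:5.7 is already in the local image store after `docker image load`,
    # so `docker compose up -d` starts it without pulling from Docker Hub
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example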
This might not fully answer your question, since I think there is no actual way of running your cached image.
But you can speed up your build using GitHub's cache; I have posted a complete tutorial about this that you can read here.
Summarizing: you can set up Docker Buildx and then use the GHA cache with build-push-action:
- name: Set up Docker Buildx
  uses: docker/setup-buildx-action@v1
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    context: .
    file: ./Dockerfile
    push: true
    tags: ivan123123/c_matrix_library:latest
    cache-from: type=gha
    cache-to: type=gha
Edit
Just found a reference in build-push action that might be useful to you:
https://github.com/docker/build-push-action/blob/master/docs/advanced/share-image-jobs.md
This question is a bit old now, but I've found the documented way of running an image built with docker/build-push-action in a subsequent step. In short, you have to set up a local registry.
The YAML below has been copied directly from here.
name: ci
on:
  push:
    branches:
      - 'main'
jobs:
  docker:
    runs-on: ubuntu-latest
    services:
      registry:
        image: registry:2
        ports:
          - 5000:5000
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
        with:
          driver-opts: network=host
      - name: Build and push to local registry
        uses: docker/build-push-action@v3
        with:
          context: .
          push: true
          tags: localhost:5000/name/app:latest
      - name: Inspect
        run: |
          docker buildx imagetools inspect localhost:5000/name/app:latest
Edit:
As mentioned by Romain in the comments, the initial solution will pull the image at the beginning of the workflow and as such will not use the image that is built during the workflow. The only solution seems to be running docker run yourself in the step:
- name: Run my docker image
  run: >
    docker run -t ivan123123/c_matrix_library:latest
    ...
On a side note, using this solution might get a bit complicated if you use services in your job, in which case the networking between your container and the service containers will be troublesome.
Original answer:
To run the image, you can use the following:
- name: Run my docker image
  uses: docker://ivan123123/c_matrix_library:latest
  with:
    entrypoint: ...
    args: ...
The entrypoint and args are optional. You can find more info here. One limitation, though, is that you cannot use any variable or context in the uses field; you can only hardcode the name and tag of the image.

Run deployment workflow only if the tests workflow has passed

I have the following two workflows:
Workflow to run the test suite
Workflow to deploy the code, using https://github.com/miloserdow/capistrano-deploy
Now when I push my code, both workflows start. I want the deployment workflow to start only once the test suite has passed.
How can I do this?
Workflow that runs tests:
name: CI
on:
  push:
    branches: [setup_github]
jobs:
  test:
    runs-on: ubuntu-18.04
    services:
      postgres:
        image: postgres:10
    steps:
      - name: Checkout
        uses: actions/checkout@v1
      - name: Set up Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: 2.5.3
      - uses: borales/actions-yarn@v2.0.2
        with:
          cmd: install
      - name: Install Dependencies
        run: |
          sudo apt-get -yqq install libpq-dev
      - name: Install Gems
        run: |
          gem install bundler
      - name: prepare Database
      - name: RSpec
        run: |
          bundle exec rspec specs
Workflow that deploys:
name: Deploy on server
on:
  push:
    branches:
      - setup_github
jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v1
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: 2.5.3
          bundler-cache: true
      - uses: miloserdow/capistrano-deploy@master
        with:
          target: staging
          deploy_key: ${{ secrets.DEPLOY_ENC_KEY }}
Your goal should be achievable by ensuring the following are true:
The CI workflow has run
The CI workflow was successful
name: Deploy on server
on:
  workflow_run:
    workflows: [CI]
    branches: [setup_github]
    types:
      - completed
jobs:
  deploy:
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v1
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: 2.5.3
          bundler-cache: true
      - uses: miloserdow/capistrano-deploy@master
        with:
          target: staging
          deploy_key: ${{ secrets.DEPLOY_ENC_KEY }}
This is described in the GitHub Actions docs on workflow_run.
