Copy files from calling repo to action - docker

I want to create a GitHub Docker container action that uses files from the repository that calls the action.
Do I need to check out the repository myself, or can I specify a location the files are copied to if the repository was checked out in an earlier step?

GitHub Actions does not have a built-in concept of a repository or branch. Each workflow runs in a container and only has that container to interact with.
Actions can have inputs. You can create an input that points to the location of the user's repository.
In your action.yml:
inputs:
  repo_path:
    description: 'The path to the repository in the current environment'
    required: true
In your code, you can read the repo_path input to get the path to the repository on the file system.
You can also check whether GITHUB_WORKSPACE points to a git repository, which would mean the user ran the actions/checkout action in a prior step.
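For example, a calling workflow might look roughly like this; the reference your-org/your-action@v1 is a hypothetical placeholder for your container action:

steps:
  - name: Check out the calling repository
    uses: actions/checkout@v2
  - name: Run the container action
    uses: your-org/your-action@v1   # hypothetical placeholder for your action
    with:
      repo_path: ${{ github.workspace }}

Inside the action, the repo_path input is then available as the INPUT_REPO_PATH environment variable, and GITHUB_WORKSPACE contains the checked-out files.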

Related

Extended Choice Parameter - does not see property file path from gitlab repository

I have created a choice parameter using Extended Choice Parameter property files. When used locally, Jenkins sees the file's path correctly and the choices are visible, but when I use the deployed version and try to get the property file from the git repository, it does not recognize the path.
I use a pipeline script from SCM, pass credentials, and connect with no problems. The Jenkinsfile is read from the git repository correctly.
Can the Extended Choice Parameter use git files? What is its working directory? I don't know where the extension is looking for these files.

GitHub: Building a Docker image in a repository with multiple Dockerfiles

I have a repository with multiple directories, each containing a Python script, a requirements.txt, and a Dockerfile. I want to use GitHub Actions to build an image on merge to the main branch and push it to GitHub Packages. Each Dockerfile defines a running environment for the accompanying Python script. However, I only want to build the Dockerfile in the directory where changes were made, and tag each image with the directory name and a version number that is independent of the other directories' versions. How can I accomplish this? An example workflow.yml file would be awesome.
Yes, it is possible.
In order to trigger a separate build for every Dockerfile, you will need a separate workflow: one workflow per Dockerfile. Each workflow should be triggered on push to main and use a paths filter scoped to that Dockerfile's directory.
For building and pushing the images themselves, you can use the Build and push Docker images GitHub Action (docker/build-push-action).
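A minimal sketch of one such workflow, assuming a directory named app1, a version tag of 1.0.0, and the GitHub Container Registry as the target; treat these names as placeholders to adjust for your setup:

name: Build app1 image

on:
  push:
    branches: [main]
    paths:
      - 'app1/**'

permissions:
  contents: read
  packages: write

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Log in to the GitHub container registry
        uses: docker/login-action@v1
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push the app1 image
        uses: docker/build-push-action@v2
        with:
          context: ./app1          # build only this directory's Dockerfile
          push: true
          tags: ghcr.io/${{ github.repository }}/app1:1.0.0   # placeholder version

Repeating this file per directory (with the paths filter, context, and tag changed) gives each directory its own independently triggered build and its own versioning.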
I also saw on another thread a way of building only the specific Dockerfile that has changed by using another GitHub Action:
Build Specific Dockerfile from set of dockerfiles in github action

Copy file from Docker container to Github repository

I have a Dockerfile and a .tex file in my repository. I use GitHub Actions to build an image (Ubuntu 18.10 with packages for PDFLaTeX) and run a container, which takes main.tex and produces main.pdf with PDFLaTeX. So far everything seems to work OK, but the problem is that I can't copy the PDF from the container to the repository. I tried using docker cp:
docker cp pdf-creator:/main.tex .
But it doesn't seem to work, as the PDF doesn't appear in my repository. Can you think of any other way to solve this?
The docker cp command copies a file into the local filesystem. In the context of a GitHub action, this is just whatever virtual environment is being used to run your code: it has nothing to do with your repository.
The only way to add something to your repository is to git add the file, git commit the change, and git push the change to your repository (which of course requires providing your Action with the necessary credentials to push changes to your repository, probably using a GitHub Secret).
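A rough sketch of such a commit step, assuming the generated PDF has already been copied into the checked-out workspace and that actions/checkout was used earlier (it leaves credentials for the default GITHUB_TOKEN in place); the author details and commit message are placeholders:

- name: Commit generated PDF back to the repository
  run: |
    git config user.name "github-actions[bot]"                               # placeholder author
    git config user.email "github-actions[bot]@users.noreply.github.com"
    git add main.pdf
    git commit -m "Add generated PDF"
    git push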
But rather than adding the file to your repository, maybe you want to look at the support for Artifacts? This lets you save files generated as part of your workflow and make them available for download.
The workflow step would look something like:
- name: Archive generated PDF file
  uses: actions/upload-artifact@v2
  with:
    name: main.pdf
    path: /main.pdf
See the linked docs for more information.

How to pass job-specific files to Jenkins as part of job configuration?

I have a Jenkins job that pulls source code from a GitHub public repo. I need to pass some files, such as instance-specific configuration files containing secrets, to the job and merge them with the source code prior to running the build, because these files are obviously inappropriate to put in public SCM. The Jenkins instance is a multi-tenanted shared service.
The config files don't change often, so I don't want to implement this with a file parameter, which forces the user to manually upload the file on every run. Another reason a file parameter doesn't work is that some builds are triggered automatically by SCM.
I don't want to use the Config File Provider Plugin either, because the plugin requires Jenkins admin access, but I want users with job-level privileges to manage the files themselves.
Ideally the uploaded files would be saved alongside the job's config.xml instead of in the workspace, because I would like to delete the workspace after each build. I can write scripts to copy the files from the job config folder to the workspace.
Are there any solutions available? Thanks.
If the "special" files are being placed in a folder with say some access privileges to it, couldn't you either run a Pre-SCM-Buildstep to move the files with shell commands, or introduce a regular build step (i.e. after the SCM stuff and before the other build steps) that would also use shell commands to move files?

Custom capistrano task for working with scm repository

Is there any way to create a custom Capistrano task for performing other actions on an SCM repository?
For instance, I would like to create a task that checks out a particular folder from my repository and then symlinks it into the shared/ directory of the main project on my server.
I know this can be done by creating a task and explicitly defining the "svn co ..." command along with the SCM username, password, and repository location, but that would display the password in plain text. Are there any built-in Capistrano variables/methods that would help with this?
I found a solution to this problem by setting the svn:externals property on a folder in my repository to point to a folder from another repository. When I run cap deploy, this folder is populated with the HEAD of the other repository.
