Is there any way to create a custom Capistrano task for performing other actions on an SCM repository?
For instance, I would like to create a task that will check out a particular folder from my repository and then symlink it into the shared/ directory of the main project on my server.
I know this can be done by creating a task and explicitly defining the "svn co ..." command along with the SCM username, password, and repository location, but that would display the password in plain text. Are there any built-in Capistrano variables/methods that would help in this process?
I have found a solution to this problem by setting the svn:externals property on a folder in my repository so that it contains a folder from another repository. When I run cap deploy, this folder is populated with the HEAD of the other repository.
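For reference, setting the property looks roughly like this (the repository URL and folder names are illustrative):
# Pull "extras" from another repository into the vendor/ folder
# of this working copy, then commit the property change.
svn propset svn:externals "extras https://svn.example.com/other-repo/trunk/extras" vendor
svn commit -m "Pull extras in via svn:externals" vendor
On the next deploy the checkout populates vendor/extras from the other repository, with no credentials echoed anywhere.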
I have created a choice parameter using the Extended Choice Parameter plugin's property files. When used locally, Jenkins sees the file's path correctly and the choices are visible, but when I use the deployed version and try to get the property file from a Git repository, it does not recognize the path.
I use "Pipeline script from SCM", pass credentials, and connect with no problems. The Jenkinsfile is read from the Git repository correctly.
Can the Extended Choice Parameter use Git files? What is its working directory? I don't know where the extension is looking for these files.
Within a Jenkins Groovy pipeline I want to do the following:
1. Clone a particular GitLab-based code repo.
2. Within this repo, find all the files that contain a particular string, for example "find_me".
3. Change find_me to found_me in all of those files.
4. Commit these changes to the GitLab repo.
Step 4 I can probably figure out myself, but I am struggling with how to do steps 2 and 3.
Can anyone please suggest the best way to do this?
For step 1: the Pipeline: SCM Step (checkout).
For step 2: findFiles (find files in the workspace).
For step 3: readFile (read a file from the workspace) and writeFile (write a file to the workspace); there is also prependToFile (create a file, if it does not already exist, in the workspace and prepend given content to it).
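Putting those together, a minimal sketch of steps 2 and 3 might look like this (the glob pattern is illustrative, findFiles comes from the Pipeline Utility Steps plugin, and it assumes the repo is already checked out into the workspace):
def files = findFiles(glob: '**/*.txt')  // step 2: locate candidate files
for (def f in files) {
    def text = readFile(file: f.path)
    if (text.contains('find_me')) {
        // step 3: write the file back with the replacement applied
        writeFile(file: f.path, text: text.replace('find_me', 'found_me'))
    }
}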
You can't commit to a GitLab repo directly from within Jenkins. You add/commit/merge locally, then you push. See, for instance: Is it possible to Git merge / push using Jenkins pipeline.
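For step 4, a rough sketch of the local commit and push, assuming a username/password credential with the illustrative id 'gitlab-creds' (the repo URL and branch are also illustrative):
withCredentials([usernamePassword(credentialsId: 'gitlab-creds',
                                  usernameVariable: 'GIT_USER',
                                  passwordVariable: 'GIT_PASS')]) {
    // Single-quoted so the shell, not Groovy, expands the secrets.
    sh '''
        git add .
        git commit -m "Replace find_me with found_me"
        git push "https://${GIT_USER}:${GIT_PASS}@gitlab.example.com/group/repo.git" HEAD:master
    '''
}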
I want to create a GitHub Docker container action that uses files from the repository that uses the action.
Do I need to check out the repository myself, or can I specify a location where the files are copied to if the repository was checked out in an earlier step?
GitHub Actions do not have a concept of repository or branch; each workflow runs in a container and only has that container to interact with.
Actions can have inputs, though. You can create an input that points to the location of the user's repository.
In your action.yml:
inputs:
  repo_path:
    description: 'The path to the repository in the current environment'
    required: true
In your code, you can then read the repo_path input to get the path to the repository on the file system.
You can also check whether GITHUB_WORKSPACE points to a Git repository, which would mean the user has run the actions/checkout action in a prior step.
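A minimal entrypoint sketch combining both checks (Docker container actions receive inputs as INPUT_<NAME> environment variables; the fallback to GITHUB_WORKSPACE is an assumption, not something GitHub requires):
#!/bin/sh
# Prefer the explicit repo_path input, fall back to GITHUB_WORKSPACE.
REPO_PATH="${INPUT_REPO_PATH:-$GITHUB_WORKSPACE}"
if [ -d "$REPO_PATH/.git" ]; then
    echo "Found a checked-out repository at $REPO_PATH"
else
    echo "No repository found; did you run actions/checkout first?" >&2
    exit 1
fi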
I can't find anything in the docs on how to do this - anybody have any ideas?
It seems this is currently not possible, but you can easily init a local Git repository and use it as the SCM without any remote hosting.
To init a Git repo, run the following commands in the root directory of your shared library (for example C:\Users\Jenkins\pipeline-shared-library-test):
git init
git add .
git commit -m "init"
Then, in Manage Jenkins->Configure System->Global Pipeline Libraries, you can point Project Repository to your local repo using a file URI: file:///c:/Users/Jenkins/pipeline-shared-library-test
This approach works fine for me.
You can use the File System SCM plugin to load your library from the file system.
Once you have installed this plugin, use "Legacy SCM" in your library configuration to set the path, and choose "master" as the default version. You cannot use "Load implicitly", so the library has to be loaded explicitly in the pipeline.
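The explicit load in the Jenkinsfile then looks like this ('my-shared-lib' stands in for whatever name you gave the library in the global configuration):
@Library('my-shared-lib@master') _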
For reference, I found this approach in these slides: https://www.slideshare.net/roidelapluie/jenkins-shared-libraries-workshop
I have a Jenkins job that pulls source code from a GitHub public repo. I need to pass the job some files, such as instance-specific configuration files containing secrets, and merge them with the source code prior to running the build, because these files are obviously inappropriate to put in a public SCM. The Jenkins instance is a multi-tenanted shared service.
The config files don't change often, so I don't want to use a file parameter, which would force the user to manually upload the file on every run. Another reason a file parameter doesn't work is that some builds are triggered automatically by SCM.
I don't want to use the Config File Provider Plugin either, because the plugin requires Jenkins admin access, but I want users with job-level privileges to manage the files themselves.
Ideally the uploaded files would be saved alongside the job's config.xml instead of in the workspace, because I would like to delete the workspace after each build. I can write scripts to copy the files from the job config folder to the workspace.
Are there any solutions available? Thanks.
If the "special" files are being placed in a folder with say some access privileges to it, couldn't you either run a Pre-SCM-Buildstep to move the files with shell commands, or introduce a regular build step (i.e. after the SCM stuff and before the other build steps) that would also use shell commands to move files?