CDK codebuild project - put code in a subdirectory? - aws-cdk

I have the following aws-codebuild Project which gets some source from a Github repo, then does some build actions.
const project = new Project(scope, "RaidoDbS3Import", {
  vpc: props.vpc,
  subnetSelection: { subnetType: SubnetType.PRIVATE_WITH_NAT },
  securityGroups: props.securityGroups,
  source: Source.gitHub({ owner: "au-research", repo: "raido-v2" }),
  ...
I would like to change this to get the source code into a sub-directory, so that I can put some
data files in a sibling directory to the codebase.
I've looked around in the props for Project and GitHubSourceProps, but don't see anything that seems relevant.
My workaround will be to put the data files into an ignored subdirectory of the repository; but this question is about "How can I clone the GitHub repository into a subdirectory of the CodeBuild current directory?"

Another workaround is to use git submodules for your data and to fetch them while cloning the repo.
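A minimal local sketch of that workaround (all repo names and paths here are hypothetical stand-ins; with CodeBuild you would additionally need the GitHub source configured to fetch submodules — the CDK GitHub source has a fetchSubmodules option for this, if I remember correctly):

```shell
# Sketch: keep data files in a separate repo and mount it as a
# submodule in a subdirectory of the main repo. Paths are hypothetical.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the remote data repository
git init -q data-repo
echo "sample data file" > data-repo/sample.txt
git -C data-repo add sample.txt
git -C data-repo -c user.email=ci@example.com -c user.name=ci commit -qm "data"

# Main repo: register the data repo as a submodule under ./data
git init -q main-repo
cd main-repo
git -c protocol.file.allow=always submodule add -q ../data-repo data
git -c user.email=ci@example.com -c user.name=ci commit -qm "add data submodule"

# A fresh clone then fetches the data while cloning:
cd "$workdir"
git -c protocol.file.allow=always clone -q --recurse-submodules main-repo checkout
ls checkout/data
```

The protocol.file.allow override is only needed because this sketch uses local file paths; with a real remote submodule URL it is unnecessary.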

Related

How to find a particular text in entire codebase - Groovy Jenkins

Within a Jenkins-Groovy pipelines I want to do the following:
1. Clone a particular GitLab-based code repo.
2. Within this repo, find all the files that contain a particular string, for example "find_me".
3. Change all occurrences in those files from find_me to found_me.
4. Commit these changes to the GitLab repo.
Step 4 I can maybe figure out myself, but I am struggling with how to do steps 2 and 3.
Can anyone please suggest what can be the best way to do this?
Pipeline: SCM Step
findFiles: Find files in the workspace
readFile: Read file from workspace
writeFile: Write file to workspace
prependToFile: Create a file (if it does not already exist) in the workspace, and prepend given content to it
You can't commit to a GitLab repo directly from within Jenkins. You add/commit/merge locally, then you push. See, for instance: Is it possible to Git merge / push using Jenkins pipeline.
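Steps 2 and 3 also reduce to a plain grep/sed pair that a pipeline could run from a sh step; here is a self-contained sketch (the sample files are stand-ins for the cloned repo):

```shell
# Self-contained sketch of steps 2 and 3: the temp directory and sample
# files stand in for the cloned repo; in Jenkins this would run in a sh step.
set -e
repo=$(mktemp -d)
printf 'hello find_me\n' > "$repo/a.txt"
printf 'nothing here\n'  > "$repo/b.txt"

# Step 2: list every file containing the string
grep -rl 'find_me' "$repo"

# Step 3: rewrite the string in exactly those files (GNU sed -i)
grep -rl 'find_me' "$repo" | xargs sed -i 's/find_me/found_me/g'
cat "$repo/a.txt"
```

In a pipeline this pair would typically be wrapped in a single sh step after the checkout.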

Copy file from Docker container to Github repository

I have a Dockerfile and a tex file in my repository. I use GitHub Actions to build an image (Ubuntu 18.10 with packages for PDFLaTeX) and run a container, which takes main.tex and produces main.pdf with PDFLaTeX. So far everything seems to work OK, but the problem is that I can't copy the PDF from the container to the repository. I tried using docker cp:
docker cp pdf-creator:/main.tex .
But it doesn't seem to work, as pdf doesn't appear in my repository. Can you think of any other way to solve this?
The docker cp command copies a file into the local filesystem. In the context of a GitHub action, this is just whatever virtual environment is being used to run your code: it has nothing to do with your repository.
The only way to add something to your repository is to git add the file, git commit the change, and git push the change to your repository (which of course requires providing your Action with the necessary credentials to push changes to your repository, probably using a GitHub Secret).
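For the commit route, the workflow steps might look roughly like this (the container and file names are taken from the question; it assumes the checkout was made with credentials that are allowed to push):

```yaml
# Hypothetical workflow steps: copy the PDF out of the container,
# then commit it back to the repository.
- name: Copy PDF from container
  run: docker cp pdf-creator:/main.pdf .
- name: Commit and push generated PDF
  run: |
    git config user.name "github-actions"
    git config user.email "github-actions@users.noreply.github.com"
    git add main.pdf
    git commit -m "Add generated PDF"
    git push
```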
But rather than adding the file to your repository, maybe you want to look at support for Artifacts? This lets you save files generated as part of your workflow and make them available for Download.
The workflow step would look something like:
- name: Archive generated PDF file
  uses: actions/upload-artifact@v2
  with:
    name: main.pdf
    path: /main.pdf
See the linked docs for more information.

Jenkins - load global pipeline library from filesystem instead of SCM?

I can't find anything in the docs on how to do this - anybody have any ideas?
It seems it is currently not possible. But you can easily init a local git repository and use it as SCM without any remote hostings.
To init a git repo use the following commands in the root directory of your shared library (for example C:\Users\Jenkins\pipeline-shared-library-test):
git init
git add .
git commit -m "init"
Then in Manage Jenkins->Configure System->Global Pipeline Libraries you can point Project Repository to your local repo using a file URI: file:///c:/Users/Jenkins/pipeline-shared-library-test
This approach works fine for me.
You can use File System SCM plugin to load your library from file system.
Once you have installed this plugin, use "Legacy SCM" in your library configuration to set a path, and choose "master" as the default version. "Load implicitly" cannot be used, so the library must be loaded explicitly in the pipeline.
As a reference, I learned this approach from these slides: https://www.slideshare.net/roidelapluie/jenkins-shared-libraries-workshop.

Use local flat file repository instead of remote maven repository

I have no experience with maven, so excuse me if this question is silly...
From another question (How does Grails handle plugin dependencies), I've learned that I can avoid jar-hell in Grails through Maven repositories. But I now have the requirements that...
I am not allowed to use remote maven repositories
I would like to bundle the needed jars with my plugin (but low priority)
I would like to avoid the effort to install a local maven repository
I already worked with a reference to a local folder for plugin resolution. This works great.
But how do I have to structure a local folder in order to use this option:
repositories {
flatDir name:'myRepo', dirs:'/path/to/repo'
}
I mean, I could just drop the jar files into this folder, but how do I then reference those jar files? Do they follow a naming scheme like artifact_version.jar? Or do I have to create an XML configuration for this local repository?
Or is the effort to set up a local Maven repo small, and is Maven perhaps already on my machine through Grails?
In fact, Maven already comes with a local repository (~/.m2 on Linux boxes). If you don't have access to an external repo, you just have to install your jars into the local repo with this command:
mvn install:install-file -Dfile=<path-to-file> -DgroupId=<group-id> -DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>
Here, packaging is 'jar' (without quotes), and the group-id and artifact-id are either already determined, if it's a 3rd-party library (search on mvnrepository.com if you don't know them for a particular library), or you put in your own group and artifact ids.
EDIT: The naming scheme under the repository is as follows: for the library example, version 1.2, from jexample.com, the path is usually com/jexample/example/1.2/example-1.2.jar (groupId: com.jexample, artifactId: example, version: 1.2).
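That mapping can be sketched mechanically: dots in the groupId become directory separators, and the file name is artifactId-version.jar.

```shell
# Sketch: derive the local-repository path from Maven coordinates.
groupId="com.jexample"; artifactId="example"; version="1.2"
path="$(echo "$groupId" | tr . /)/$artifactId/$version/$artifactId-$version.jar"
echo "$path"   # com/jexample/example/1.2/example-1.2.jar
```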

Custom capistrano task for working with scm repository

Is there any way to create a custom capistrano task for performing other actions on a scm repository?
For instance, I would like to create a task that will checkout a particular folder from my repository and then symlink it into the shared/ directory of the main project on my server.
I know this can be done by creating a task and explicitly defining the "svn co ..." command along with the SCM username, password, and repository location. But this would display the password in plain text. Are there any built-in Capistrano variables/methods that would help in this process?
I have found a solution to this problem by setting the svn:externals property on a folder in my repository to point to a folder from another repository. When I run cap deploy, this folder is populated with the HEAD of the other repository.
