How to access source code from within Docker build?

I'm trying to build a GitHub Action that takes a screenshot of a GitHub Pages website (built with Jekyll) and lets the user upload it as an artifact. For my convenience, and since installing Pyppeteer is non-trivial, I wrapped the whole thing in Docker. Here's the source code for the action at the time of writing the question:
https://github.com/hakierspejs/jekyll-screenshot-github-action/tree/8569f1370c5decf5ecfb4bc17a977cad5aa3d2aa
The problem is in accessing the source code of the GitHub Pages website. I noticed that GitHub Actions maps a couple of directories as volumes, but neither /github/workflow nor /github/workspace seems to contain the source code. Is there a path I could mount to access it?

/github/workspace is the right path, but I was missing an actions/checkout@master step. Adding this line to the list of steps made it work:
- uses: actions/checkout@master

Related

Add condition to transition using script runner

I am using the ScriptRunner plugin for Jira.
Is it possible to add a condition to a transition using ScriptRunner?
Currently, my condition is in a script which I have manually added to the workflow.
But I was wondering if there is a way to do it automatically?
I was looking through documentation on: https://docs.atlassian.com/
I came across this method:
replaceConditionInTransition, which is a method of WorkflowManager.
But I'm unsure how to use it.
Any help would be appreciated.
Conditions, like any other scripts, can be added from the file system. You can store your scripts in any VCS (Bitbucket, GitHub, GitLab, etc.) and automatically deploy them to the Jira server's file system through any CI/CD system (TeamCity, Jenkins, Bamboo, GitLab, etc.).
As a result, the process looks like this: 1. commit the change to your script and push it to the VCS; 2. wait a moment for the auto-deploy (e.g. triggered by the commit); 3. done. Additionally, you can write a script/service to commit those changes automatically if needed. A minimal sketch of such a deploy step is shown below.
Also look at script roots; they are a helpful feature that lets you reuse script fragments through helper classes.
This is a rather conceptual answer because the implementation depends on your environment, but I hope it gives you at least one more point of view on the task.
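For illustration, a sketch of such a deploy step in PowerShell, assuming the Jira server's ScriptRunner script root is reachable from the CI agent as a file share (both paths are hypothetical):
# Copy the Groovy condition scripts from the checked-out repo to the
# ScriptRunner script root on the Jira server (hypothetical paths).
Copy-Item -Path ".\scripts\conditions\*.groovy" `
          -Destination "\\jira-server\scriptrunner\scripts\conditions" -Force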
I think that using the Java API to modify Jira workflows is pretty tough. You could dig around in the workflow editor to see how conditions are added there. Remember that you have to do this in a draft workflow and then publish it, which takes some time in large projects.
I like the idea of replacing a script file as the easier approach, if it can be done while no issues are transitioning.

Making sense out of VSTS logging commands task.addattachment, task.uploadfile, artifact.upload and build.uploadlog

I have two artifacts that I download from my Octopus server in order to expose them in my vNext build (we are using an on-premises TFS).
The code is:
# Download each artifact from the Octopus server, then publish it
# with the VSTS logging commands.
$Artifacts.GetEnumerator() | ForEach-Object {
    curl $_.Value -OutFile "$TestResults\$($_.Key)" -Headers @{
        "X-Octopus-ApiKey" = $ApiKey
    }
    Write-Host "##vso[task.addattachment type=NUnitTestResults;name=$($_.Key);]$TestResults\$($_.Key)"
    Write-Host "##vso[task.uploadfile]$TestResults\$($_.Key)"
    Write-Host "##vso[artifact.upload containerfolder=NUnitTestResults2;artifactname=$($_.Key);]$TestResults\$($_.Key)"
    #Write-Host "##vso[build.uploadlog]$TestResults\$($_.Key)"
}
Two files, CSTests.xml and PSTests.xml, are downloaded and placed in a folder; then I issue the VSTS logging commands.
The only documentation I could find for them is https://github.com/Microsoft/azure-pipelines-tasks/blob/master/docs/authoring/commands.md, and it leaves a lot to the imagination.
What I have learned so far:
build.uploadlog
Embeds the contents of the files in the log of the respective task. In my build, the NUnit test results end up prepended to the step log proper. As for what the documentation says about this command, I hope it makes sense to somebody; to me it does not make any. Next:
artifact.upload
This one is easy: it adds the files as artifacts to the build. But each artifact contains ALL the files, so no matter which Explore button I click (for CSTests.xml or PSTests.xml), I always get the same full list. It sounds like I am expected to place the two artifacts in different container folders, but then what is the purpose of having both container folders and artifact names? I am confused.
task.uploadfile
Using this one, I got my NUnit test result files included in the log archive when downloading the logs.
No questions here.
task.addattachment
This one is a mystery to me. It has no apparent effect, and the documentation does not make clear what kind of attachment it is or where to find it.
So, my questions are:
Is there serious documentation for the VSTS logging commands beyond the half-baked markdown page mentioned above?
build.uploadlog - does it always prepend the contents of the files to the step log, or is appending also an option?
artifact.upload - how do I publish files as separate artifacts? Does that mean separate container folders? But then the file name is likely to be mentioned in two places, the container folder and the artifact name. Is that the way?
task.addattachment - what does it do?
I've been similarly frustrated with the documentation for these logging commands.
As for task.addattachment, I discovered the Build/Attachments REST API, and the attachments show up there. It has this format:
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/attachments/{type}?api-version=6.0-preview.2
Note that type is required. You simply use the same type that was specified in the task.addattachment command.
I was able to build a URL and plug it into my browser window, which displays the raw JSON response.
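If you'd rather script the call than paste the URL into a browser, here is a minimal PowerShell sketch; the organization, project, build id, and PAT are placeholders, and the attachment type matches the type=NUnitTestResults used in the question:
# Query the Build/Attachments REST API for attachments of a given type.
# Azure DevOps accepts a PAT via Basic auth with an empty user name.
$org     = "your-organization"   # placeholder
$project = "your-project"        # placeholder
$buildId = 1234                  # placeholder
$pat     = "your-pat"            # placeholder: PAT with Build (read) scope
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$url = "https://dev.azure.com/$org/$project/_apis/build/builds/$buildId/attachments/NUnitTestResults?api-version=6.0-preview.2"
Invoke-RestMethod -Uri $url -Headers $headers | ConvertTo-Json -Depth 10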

Difference between $(Build.Repository.LocalPath) and $(Build.SourcesDirectory) in TFS Build Online 2017

I am trying to figure out whether there is a difference between two pre-defined variables in TFS Online 2017: $(Build.Repository.LocalPath) and $(Build.SourcesDirectory). I have a build that uses both and wanted to know whether I could use them interchangeably.
Looking at Microsoft's documentation the descriptions are as follows:
$(Build.SourcesDirectory): The local path on the agent where your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
$(Build.Repository.LocalPath): The local path on the agent where your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
Are these representing the same thing or am I missing something?
They're synonyms. Most standard templates and tasks use the $(Build.SourcesDirectory), so that is what I tend to use.
They often resolve to the same path, but not necessarily. As described in the docs:
If you check out multiple repositories, the behavior is as follows (and might differ from the value of the Build.SourcesDirectory variable):
The description for Build.SourcesDirectory on the same page contains a similar note.
Basically, if you want to define a custom path for the self checkout and still have a variable that resolves to it without specifying the extra directory, you specifically need Build.Repository.LocalPath.
For clarity, you can still use Build.SourcesDirectory to resolve to the full path if you have the usual
- checkout: self
  path: s
and I'd recommend using it whenever possible in that case. If you have something like
- checkout: self
  path: main_project
then you'd need $(Agent.BuildDirectory)/main_project to reach the same location.
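If in doubt, a quick way to check what the variables resolve to on a given agent is an inline PowerShell step that prints them; pipeline variables are exposed to scripts as uppercase environment variables with dots replaced by underscores:
# Print the agent-side values of the two variables (and the build directory).
Write-Host "Build.SourcesDirectory:     $env:BUILD_SOURCESDIRECTORY"
Write-Host "Build.Repository.LocalPath: $env:BUILD_REPOSITORY_LOCALPATH"
Write-Host "Agent.BuildDirectory:       $env:AGENT_BUILDDIRECTORY"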

GitHub API - create contents API method not triggering GitHub Pages build

I'm working on a Rails app that allows users to publish datasets via GitHub and access them via GitHub Pages. I'm using the GitHub API to act as the user, create a repo, and add the files, and everything works as expected. The only issue is that the GitHub Pages build doesn't seem to be happening: the datasets can be downloaded, but trying to access the index.html page doesn't work at all.
Here's an example repo:
https://github.com/git-data-publisher/Foo
And here's the GitHub Pages site:
http://git-data-publisher.github.io/Foo/
You can see that, for example:
http://git-data-publisher.github.io/Foo/data/June_2014.csv
Works fine.
I can only guess that this means the GitHub Pages build isn't getting triggered. Is there any way I can make this happen without doing a manual Git push?
You can also see my code here if this helps:
https://github.com/theodi/git-data-publisher
GitHub tries to parse your site as a Jekyll site.
You must indicate that you're not using Jekyll by creating an empty .nojekyll file at the root of your repository. Since your app already writes files through the contents API, it can create .nojekyll the same way; see the sketch below.
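For illustration, a minimal sketch of that call, shown in PowerShell to match the other snippets on this page (in the Rails app itself the equivalent Octokit create-contents call would do the same); the token is a placeholder and needs repo scope:
# PUT /repos/{owner}/{repo}/contents/{path} creates the file in one request.
# The content field is Base64-encoded; for an empty file it is an empty string.
$token = "your-github-token"   # placeholder
$body  = @{ message = "Add .nojekyll to disable the Jekyll build"; content = "" } | ConvertTo-Json
Invoke-RestMethod -Method Put `
    -Uri "https://api.github.com/repos/git-data-publisher/Foo/contents/.nojekyll" `
    -Headers @{ Authorization = "token $token" } `
    -Body $body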

Sharing files in Jenkins

My post-commit build process in Jenkins prepares several files that are very useful in everyday development. Currently we zip the necessary files and copy them to a directory that is simply a shared resource.
I'm looking for some kind of plugin that would let me designate a directory for publishing and present its contents (something like the workspace view in a defined job).
Any suggestions?
OK, I solved this problem with Jenkins' default directory JENKINS_HOME\userContent (files there are served from the Jenkins web page) and the side-bar plugin mentioned here. I created the needed symbolic links in userContent and then added the applicable links to the main window. Works great. Thanks for the hints!
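For anyone reproducing this, a minimal sketch of the symlink step, assuming a Windows Jenkins master (both paths are hypothetical); anything under JENKINS_HOME\userContent is served at http://<jenkins>/userContent/:
# Expose a build output directory through Jenkins' userContent folder.
# (Creating symbolic links on Windows requires an elevated shell.)
New-Item -ItemType SymbolicLink `
    -Path "C:\Jenkins\userContent\latest-build" `
    -Target "D:\builds\shared\latest"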
