Sharing files in Jenkins - jenkins

My post-commit build process in Jenkins prepares several files that are very useful in everyday development. Currently we zip the necessary files and copy them to a directory that is simply a shared resource.
I'm looking for some kind of plugin that would let me point at a directory for publishing and present its contents (something like the workspace view in a defined job).
Any suggestions?

OK, I solved this problem with Jenkins' default directory JENKINS_HOME\userContent (its files are served from the Jenkins web page) and the sidebar-link plugin mentioned here. I created the needed symbolic links in userContent and then added the applicable links to the main window. Works great. Thanks for the hints!
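For reference, a minimal sketch of that setup in PowerShell, assuming a Windows-hosted Jenkins controller; the paths and link name are hypothetical, and creating symbolic links requires an elevated shell:
# Hypothetical paths - adjust JENKINS_HOME and the shared build folder to your setup
$jenkinsHome = 'C:\Jenkins'
$shared      = '\\fileserver\builds\dev-files'
# Link the shared folder under userContent so Jenkins serves it at <jenkins-url>/userContent/dev-files/
New-Item -ItemType SymbolicLink -Path (Join-Path $jenkinsHome 'userContent\dev-files') -Target $shared
The Sidebar Link plugin can then point a sidebar entry at that /userContent/dev-files/ URL.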

Related

How to access source code from within Docker build?

I'm trying to build a GitHub Action that takes a screenshot of a GitHub Pages website (built with Jekyll) and lets the user upload it as an artifact. For my convenience, and since installing Pyppeteer is non-trivial, I wrapped the whole thing in Docker. Here is the source code for the action at the time of writing the question:
https://github.com/hakierspejs/jekyll-screenshot-github-action/tree/8569f1370c5decf5ecfb4bc17a977cad5aa3d2aa
The problem is in accessing the source code of the GitHub Pages website. I noticed that GitHub Actions maps a couple of directories into the container as volumes, but neither /github/workflow nor /github/workspace seems to contain the source code. Is there a path I could mount to somehow access it?
/github/workspace is the right path, but I was missing an actions/checkout@master step. Adding this line to the list of steps made it work:
- uses: actions/checkout@master

Making sense out of VSTS logging commands task.addattachment, task.uploadfile, artifact.upload and build.uploadlog

I have two artifacts that I download from my Octopus server in order to expose them in my vNext build (we are using an on-premises TFS).
The code is:
$Artifacts.GetEnumerator() | % {
    # Download each artifact from the Octopus server into the test results folder
    curl $_.Value -OutFile "$TestResults\$($_.Key)" -Headers @{
        "X-Octopus-ApiKey" = $ApiKey
    }
    # Emit the VSTS logging commands for the downloaded file
    Write-Host "##vso[task.addattachment type=NUnitTestResults;name=$($_.Key);]$TestResults\$($_.Key)"
    Write-Host "##vso[task.uploadfile]$TestResults\$($_.Key)"
    Write-Host "##vso[artifact.upload containerfolder=NUnitTestResults2;artifactname=$($_.Key);]$TestResults\$($_.Key)"
    #Write-Host "##vso[build.uploadlog]$TestResults\$($_.Key)"
}
Two files, CSTests.xml and PSTests.xml, are downloaded and placed in a folder. Then I issue the VSTS logging commands.
The only documentation I could find for them is https://github.com/Microsoft/azure-pipelines-tasks/blob/master/docs/authoring/commands.md, and it leaves a lot to the imagination.
What I have learned so far:
build.uploadlog
Embeds the contents of the files in the log of the respective task; as one can see in the build log, the NUnit test results are prepended to the step log proper. I hope the documentation's description of this command makes sense to somebody; to me it does not make any sense. Next:
artifact.upload
This one is easy - it adds the files as artifacts to the build. But each artifact contains ALL the files, so no matter which Explore button I click (for CSTests.xml or PSTests.xml), I always see the same full set of files. It sounds like I am expected to place the two artifacts in different container folders, but then what is the purpose of having both container folders and artifact names? I am confused.
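For illustration only, a hedged sketch of that guess - each file published with its own containerfolder and artifactname (the values below are made up, not something the documentation prescribes):
# Hypothetical: one artifact per file, each in its own container folder
Write-Host "##vso[artifact.upload containerfolder=CSTests;artifactname=CSTests]$TestResults\CSTests.xml"
Write-Host "##vso[artifact.upload containerfolder=PSTests;artifactname=PSTests]$TestResults\PSTests.xml"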
task.uploadfile
Using this one I got my NUnit test result files included in the log archive when downloading the logs.
No questions here.
task.addattachment
This one is a mystery to me. It has no apparent effect, and the documentation does not make clear what kind of attachment it is or where we can find it.
So, my questions are:
Is there serious documentation for the VSTS logging commands beyond the aforementioned half-baked markdown page?
build.uploadlog - does it always prepend the contents of the files to the step log, or is appending also an option?
artifact.upload - how do I publish files as separate artifacts? Does that mean separate container folders? But then the file name is likely to be mentioned in two places - the container folder and the artifact name. Is that the way?
task.addattachment - what does it do?
I've been similarly frustrated with the documentation for these logging commands.
As for task.addattachment, I discovered the Build/Attachments REST API, and the attachments show up there. It has this format:
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/attachments/{type}?api-version=6.0-preview.2
Note that type is required. You simply use the same type that was specified in the task.addattachment command.
I was able to build a URL and just plug it into my browser window; it then displays the raw JSON response.
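For example, a rough sketch of the same call from PowerShell; the organization, project, build id and personal access token are placeholders, and the type segment reuses the NUnitTestResults type from the task.addattachment command above:
# Placeholders - substitute your own organization, project, build id and PAT
$pat   = 'xxxxxxxx'
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$url   = 'https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/attachments/NUnitTestResults?api-version=6.0-preview.2'
Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Basic $token" }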

Jenkins: define custom mime-type for viewing archived artifacts

I am saving certain log files during a Jenkins build as archived artifacts within Jenkins (e.g. via the Jenkinsfile "archiveArtifacts" command). These files are plain text files but unfortunately do not end in ".log". The web.xml of jenkins.war seems to use the octet-stream MIME type for unknown extensions by default. This leads to the "problem" that I cannot easily open these logs in the browser; instead, Chrome downloads the file.
I would like to be able to open the file in the browser for convenience.
Is there a way or plugin to configure/register custom MIME types?
I already tried to modify the web.xml as a quick hack, but stopped when I noticed that the jars are signed and Jenkins does not start when the war file is modified. So I would rather not go this route, also because I would have to patch the war file after each and every Jenkins update.
Is there a smarter way? Renaming my log files to an extension like ".log" (which Jenkins serves as text/plain) is not an option for me.
Thanks,
Daniel

Are Jenkins job templates only accessible downtree from the folder of their definition?

It appears that our Jenkins job templates are only accessible downtree from the folder in which they are created. This means that if we want a universally available template, it must be created in the Jenkins root, which of course potentially clutters the Jenkins root. Is there any way to expose a Jenkins job template outside the folder it's in?
Yes, as documented a template can only be used in or beneath the folder in which it is defined.
There is an RFE CJP-1525 in the internal tracker suggesting that as an alternative, a designated subfolder (like the 3.x Templates/) of the common root folder also be searched. (The extension point to allow this from an outside plugin already exists.)
See the last sentence of the section Using the Folders plugin of the documentation you linked in the comments:
Templates Plugin
You can place a template inside a folder rather than at top level. Then the template is only offered to jobs (or other templates) in that folder or below.

Is it possible to create or edit a Jenkins Project on the fly?

My question was too long so I've shortened it:
Is it possible to create or edit a Jenkins Project on the fly with a template/environment variables/coding/etc., and would this be possible via a trigger from GitLab/GitHub/etc.? If someone could point me in the right direction that would be great! Thanks!
Yes.
Your options are to use the REST API (there is more detailed information in your own Jenkins instance; check the link in the bottom right corner of every page), or to create the job directory structure and config XML files manually and have Jenkins reload its configuration from disk.
I strongly recommend the REST API. It's designed for this sort of thing.
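As an illustration, a minimal sketch of creating a job through that REST API from PowerShell; the Jenkins URL, credentials, job name and config.xml are placeholders, and with a user/API token pair Jenkins does not normally require a CSRF crumb (it may with other authentication setups):
# Placeholders - point these at your own Jenkins instance, user/API token and job config
$jenkinsUrl = 'http://jenkins.example.com'
$auth       = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('user:apitoken'))
$configXml  = Get-Content -Raw 'config.xml'   # e.g. copied from an existing job and edited
# POST the job definition; /createItem?name=... creates a new job from the supplied config.xml
Invoke-RestMethod -Method Post -Uri "$jenkinsUrl/createItem?name=my-new-job" -Headers @{ Authorization = "Basic $auth" } -ContentType 'application/xml' -Body $configXml
# To update an existing job instead, POST the XML to "$jenkinsUrl/job/<job-name>/config.xml"
A trigger from GitLab/GitHub can then be any webhook receiver that runs such a script.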
