Aliasing Jenkins artifact URLs

Jenkins artifact URLs allow abstracting the "last successful build", so that instead of
http://myjenkins.local/job/MyJob/38/artifact/build/MyJob-v1.0.1.zip
we can say
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/MyJob-v1.0.1.zip
Is it possible to abstract this further? My artifacts have their version number in their filename, which can change from build to build. Ideally, I'd like some kind of "alias" URL that looks like this:
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/MyJob-latest.zip
MyJob-latest.zip would then resolve to MyJob-v1.0.1.zip.
If Jenkins itself can't do this, perhaps there's a plugin?

I've never seen such a plugin, but Jenkins already has similar functionality built in.
You can use /*zip*/filename.zip in your artifact path, where filename is anything you choose. It will take all matching artifacts and download them in a zip file (you may end up with a zip inside a zip if your artifact is already a zip file).
In your case, it will be:
http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/*zip*/MyJob-latest.zip
This will get you the contents of /artifact/build/ returned in a zipped archive named MyJob-latest.zip. Note that if you have more than just that zip file in that directory, the other files will be returned too.
You can use wildcards in the path. A single * for a regular wildcard, a double ** for skipping any number of preceding directories.
For example, to get any file that starts with MyJob, ends with .zip, and to look for it in any artifact directory, you could use:
/lastSuccessfulBuild/artifact/**/MyJob*.zip/*zip*/MyJob-latest.zip
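Either form can be fetched from a script, e.g. with curl (host and job names are the ones from the question); the quotes matter, since they keep the shell from globbing the literal *zip*:

curl -L -o MyJob-latest.zip "http://myjenkins.local/job/MyJob/lastSuccessfulBuild/artifact/build/*zip*/MyJob-latest.zip"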
Edit:
You cannot do something like this without some form of a container (a zip in this case). With the container, you are telling the system:
Get any possible [undetermined count] wildcard match and place into this container, then give me the container. This is logical and possible, as there is only one single container, whether it is empty or not.
But you cannot tell the system:
Give me a link to a specific single file, but I don't know which one or how many there are. The system can't guarantee that your wildcards will match one, more than one, or none. This is simply impossible from a logic perspective.
If you need it for some script automation, you can unzip the first-level zip and still be left with your desired zipped artifact.
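For example, a minimal sketch of that automation, assuming the container was downloaded as MyJob-latest.zip and the real artifact inside matches MyJob-v*.zip:

unzip -o MyJob-latest.zip -d unpacked/   # strip the outer container
ls unpacked/MyJob-v*.zip                 # the versioned artifact is inside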
If you need to provide this link to someone else, you need an alternative solution.
Alternative 1:
After your build is complete, execute a post-build step that takes your artifact and renames it to MyJob-latest.zip; however, you lose the versioning in the filename. You can also choose to copy instead of rename, but then you use double the space for storing these artifacts.
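A minimal sketch of such a post-build shell step, assuming the artifact lives in build/ and the job sets a BUILD_VERSION variable:

# copy rather than rename to keep the versioned file as well
cp "build/MyJob-v${BUILD_VERSION}.zip" "build/MyJob-latest.zip"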
Alternative 2 (recommended):
As a post-build action, upload the artifact to a central repository. It can be Artifactory, or even plain SVN. When you upload it, have it renamed to MyJob-latest.zip, overwriting the previous one. This way you have a static link that always points at the latest artifact from lastSuccessfulBuild.
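As a rough sketch, an Artifactory upload can be a single HTTP PUT from a post-build shell step (the repository path and credential variables here are assumptions):

curl -u "$ARTIFACTORY_USER:$ARTIFACTORY_TOKEN" -T "build/MyJob-v${BUILD_VERSION}.zip" "https://artifactory.example.com/artifactory/releases/MyJob/MyJob-latest.zip"

Because the target name is fixed, every upload overwrites the previous "latest", so the download link never changes.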

There is actually a plugin to assign aliases to builds you've run, and I have found it pretty handy: the Build Alias Setter Plugin.
You can use it, for instance, to assign an alias in the form of your own version number for a build, instead of (or rather in addition to) the internal Jenkins-assigned build number.
I found that it is usually most practical to use it in conjunction with the EnvInject plugin (or your favorite variant): you would export an env variable (e.g. MY_VAR=xyz) whose value is the target version or moniker, and then use the form ${ENV,var="MY_VAR"} in the "Token Macro alias" field that the plugin adds to your job config.
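As a concrete illustration (all names made up): EnvInject's "Properties Content" could contain

MY_VAR=v1.0.1

and the "Token Macro alias" field would then be ${ENV,var="MY_VAR"}, so the build should also be addressable by the alias v1.0.1, in addition to its numeric build number.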
You can also use it to assign aliases like "lastSuccessful" if you have such a need, which lets you distinguish between different kinds of successful (or other-state) builds.
Wait, there's more! You can use the /*zip*/ trick in conjunction with the alias setter as well.

Related

Difference between $(Build.Repository.LocalPath) and $(Build.SourcesDirectory) in TFS Build Online 2017

I am trying to figure out if there is a difference between the two pre-defined variables in TFS Online 2017: $(Build.Repository.LocalPath) and $(Build.SourcesDirectory). I have a build that uses these two variables and didn't know if I could use them interchangeably or not.
Looking at Microsoft's documentation the descriptions are as follows:
$(Build.SourcesDirectory): The local path on the agent where your source code files are downloaded. For example: c:\agent\_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
$(Build.Repository.LocalPath): The local path on the agent where your source code files are downloaded. For example: c:\agent\_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
Are these representing the same thing or am I missing something?
They're synonyms. Most standard templates and tasks use $(Build.SourcesDirectory), so that is what I tend to use.
They often resolve to the same path, but not necessarily. As described in the docs:
If you check out multiple repositories, the behavior is as follows (and might differ from the value of the Build.SourcesDirectory variable):
The description for Build.SourcesDirectory on the same page contains a similar note.
Basically, if you want to define a custom path for the self checkout and still not have to spell out that extra directory, you specifically need Build.Repository.LocalPath.
For clarity, you can still use Build.SourcesDirectory to resolve to the full path if you have the usual
- checkout: self
  path: s
and I'd recommend using it whenever possible in that case. If you have something like
- checkout: self
  path: main_project
then you'd need $(Agent.BuildDirectory)/main_project to reach the same location.

How to autoload environment variables specific to one file path?

I am working on a solution that simplifies hands-on debugging of failed Jenkins builds. It involves SSH-ing to the right Jenkins node and going directly to the WORKSPACE, so you can interactively try different changes that could solve your problem.
While I solved the problem of starting the SSH session in the right directory, there is one missing bit: the shell is missing the original environment variables defined by Jenkins, and these are critical for running any commands after that. So now the first command of the build is set > .envrc, which saves them all into this shell file.
My example uses the direnv tool, which is able to auto-load .envrc files. Due to security concerns, this tool does not auto-load these files by default and instead prints direnv: error .envrc is blocked. Run `direnv allow` to approve its content.
So my current solution is to manually run direnv allow after ending up in the right folder.
How can I automate this so I don't have to type the command? A prompt would be OK, as it would involve only pressing one key instead of typing ~12.
Please note that I am not tied to direnv itself; I am open to other solutions.
As of v2.15.0, you can now use direnv's whitelist configuration to achieve what you described:
Specifying whitelist directives marks specific directory hierarchies
or specific directories as "trusted" -- direnv will evaluate any
matching .envrc files regardless of whether they have been
specifically allowed. This feature should be used with great care, as
anyone with the ability to write files to that directory (including
collaborators on VCS repositories) will be able to execute arbitrary
code on your computer.
For example, say that the directory hierarchy that contains the .envrcs you want to be evaluated without having to run direnv allow is located under /home/foo/bar.
Create the file /home/foo/.config/direnv/config.toml so that it contains the following:
[whitelist]
prefix = [ "/home/foo/bar" ]
Alternatively, if there is a fixed list of specific paths you want to whitelist, you can use exact rather than prefix:
[whitelist]
exact = [ "/home/foo/projectA", "/home/foo/projectB" ]
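With the whitelist in place, the debugging flow from the question needs no interactive step (host and workspace path are made up):

ssh jenkins-node-01
cd /home/foo/bar/workspace/my-failed-job
# direnv now evaluates the saved .envrc automatically; the environment
# captured by 'set > .envrc' is restored without running 'direnv allow'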

How to upload a generic file into a Jenkins job?

I am trying to find a way to prompt the user to select and upload a generic file from a local machine to a Jenkins job prior to the build. The input file the user uploads is not necessarily a text or property file.
I am specifically trying to get the user to "select" their desired file, i.e. browse to it; the user should not have to type the file's path.
Thanks
Use the File Parameter:
File parameter allows a build to accept a file, to be submitted by the user when scheduling a new build. The file will be placed inside the workspace at the known location after the check-out/update is done, so that your build scripts can use this file.
If you need to verify that the file has a certain extension, you would have to do that with a script as part of your job, and fail the job if the extension/content type does not match what you need.
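A minimal sketch of such a check as a shell build step, assuming the File Parameter's location was set to input/user_file:

# fail the build unless the upload looks like a zip
case "$(file -b --mime-type "$WORKSPACE/input/user_file")" in
  application/zip) echo "upload OK" ;;
  *) echo "unexpected file type" >&2; exit 1 ;;
esac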
This is kind of annoying to handle when you don't know what the file name will be or need to change its name before it reaches its destination. You kind of need to perform a hack. This is how I do it:
Use the "File parameter" parameter to upload your file
Use the OS-specific script to rename the file from whatever you named your File Parameter to whatever you want it to be, e.g., if my File Parameter had the File location value of file_name instead of an actual relative file-path, I'd then do something like this for say, Windows inside a Build-Step for "Execute Windows Batch Command":
move .\file_name .\%file_name%
And then just use ArtifactDeployer to copy everything there to your desired location.
P.S.: this won't strip digital signatures, so the move operation should be considered mostly safe.
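On a Unix node, the equivalent of the batch step above would be (same assumption: the File Parameter's location is file_name, and Jenkins exposes the original client-side filename as $file_name):

mv ./file_name "./$file_name"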
The Jenkins File Parameter does not work in Jenkins Pipelines. It's ridiculous that this kind of build parameter isn't disabled for pipelines, and even more ridiculous that the parameter's help documentation doesn't at the very least mention this severe limitation.
It would have saved me a couple of hours spent trying to figure out why it would not work in my pipeline.
Refer to this feature request for more details: https://issues.jenkins-ci.org/browse/JENKINS-27413

Using environment variables with Jenkins

I'm building a group of projects from SVN. The SVN location may change from time to time. As there are a bunch of projects, I'd like to give the repository URL via an environment variable so I can change all the URLs easily. Any idea how to do that?
In the Subversion Source Code Management section, you can use a variable in the Repository URL; simply type:
http://my.svn.com/path/to/${VARIABLE}
${VARIABLE} is a job parameter that was defined earlier. I've never heard of anyone wanting to use actual environment variables for this, but you can try the same syntax.
By default, it will give you a red warning that this is not a valid URL. You can disable this warning by going to Manage Jenkins -> Configure System and look for Validate repository URLs up to the first variable name. Put a checkmark there and save.

Get result of a build step in Hudson/Jenkins to re-use it in another one

My question may be silly but I've been trying several ways and I still can't do what I want, i.e.:
use the scp target of Ant to target a remote machine and execute a script there
this script creates a dynamic list of files
get this list of files (only their names) back into Hudson to use it in the next build step (another scp from Ant)
I tried to use environment variables but they are interpreted by Hudson so I'm stuck here...
More generally, my question would be: how do you get a result out of an Ant build step?
Thanks for your ideas,
Emmanuel
You may find the File parameter useful. It allows you to create an input file and pass it to the build. You may need to write a script (or an Ant script) to process the file, though.
In the long term you may evaluate a Hudson farm. This will allow you to create tasks that span multiple machines and pass results around (https://wiki.jenkins-ci.org/display/JENKINS/Plugins).
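For the original scp scenario, a plain ssh/scp sketch of one way out (hosts, paths, and the script name are made up; Ant's scp/sshexec tasks can do the equivalent) is to persist the list as a file instead of passing it through environment variables:

ssh builder@remote.example.com '/opt/scripts/list_files.sh > /tmp/filelist.txt'
scp builder@remote.example.com:/tmp/filelist.txt "$WORKSPACE/filelist.txt"
# the next build step reads $WORKSPACE/filelist.txt; unlike environment
# variables, a file survives across build steps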
You can get the ID(s) of the build(s) that triggered your job via the API and fetch their status.
