Jenkins pipeline - how to read a file from outside the workspace

I have a script that should run on both Linux and Windows agents.
This script reads a config file sitting on a network drive.
It gets worse - we have two different Jenkins masters: one running in Docker on Ubuntu, and one on Windows. They run different jobs, but with the same script.
So now:
Using script.readFile is out of the question because the file is outside of the workspace.
Using Groovy's new File(path).text is also problematic because the path (the mounts) differs between Windows and Linux (the Jenkins masters).
There is a shared environment variable across all machines that gives the right mount, but when using Groovy's File, "${SOME_ENV_VAR}/file" doesn't work - the env var is not expanded.
Is there a way to use a Jenkins pipeline to read a file outside the workspace? That would be the best solution.
Or is there some other solution you can think of?
Thanks

using script.readFile is out of the question because the file is outside of the workspace.
Not really. Assuming you are referring to the Jenkins step readFile, you can still use it. It just takes a whole lot of dots:
def config = readFile "../../../../mnt/config/my_config.txt"
You'd have to figure out the exact number of dots yourself.
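As for the environment variable: Groovy string interpolation only expands Groovy variables, not OS environment variables, which is why "${SOME_ENV_VAR}/file" stays literal. A sketch of resolving it explicitly (SOME_ENV_VAR and my_config.txt are taken from the question; everything else is hypothetical):
// fetch the env var explicitly, then interpolate the Groovy variable
def mount = env.SOME_ENV_VAR                  // inside a pipeline script
// def mount = System.getenv('SOME_ENV_VAR')  // plain-Groovy alternative
def config = new File("${mount}/my_config.txt").text
One caveat: pipeline Groovy such as new File(...) executes on the Jenkins master, while steps like readFile run on the agent, so the mount has to exist on whichever machine actually performs the read.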

Related

How can Cloud Build take dynamic parameters to increment a registry tag?

I want my Cloud Build to push an image to a registry with an incremented tag. So, when the trigger arrives from GitHub, build the image, and if the latest tag was 1.10, tag the new one 1.11. The 1.11 value will also serve in multiple other steps of the build.
Reading the registry and incrementing the tag is easy (in a bash Cloud Build step), but Cloud Build has no built-in way to pass parameters between steps. (Substitutions come from outside the Cloud Build process, for example from Git tags, and are not generated inside the process.)
This StackOverflow question and this article say that Cloud Build steps can communicate by writing files to the workspace directory.
That is clumsy. But worse, it requires using shell steps exclusively, not the native docker-building steps, nor the native image command.
How can I do this?
Sadly, you can't. Each Cloud Build step runs in its own sandboxed container, and only the /workspace directory is mounted into all of them. Likewise, environment variables, installed binaries and so on do not persist from one container to the next.
You have to use a shell script each time :( The easiest way is to have a file in your /workspace directory (for example an env.var file):
# load the environment variables persisted by earlier steps
source /workspace/env.var
# add a variable for later steps to pick up
echo "NEW=Variable" >> /workspace/env.var
For things like this, Cloud Build is tedious...

How to use a GitLab link to apply a jenkins.yml file for Jenkins Configuration as Code

I have a local instance of Jenkins. I have previously tried storing the jenkins.yml on my system and giving its path on http://localhost:8080/configuration-as-code. This worked, but I want to use a GitLab repository to store the jenkins.yml file.
I have already tried giving the GitLab link to my jenkins.yml in the path or URL textbox. Some weird things happened, like:
1. Jenkins broke or showed a huge error console
2. It reapplied the previous configuration (from the system path)
My jenkins.yml is minimal:
jenkins:
  systemMessage: "Hello, world"
Your problem as described: you want the job configuration to be saved in Git and, when a build is triggered, the job should get the current state of its configuration from there and then run the build.
Maybe there is a kind of plugin that does this for you, but I am not aware of any. Anyone?
My suggestion is to define a pipeline job and use a declarative pipeline. It is a file, normally named Jenkinsfile, that can be stored in Git. In the job, you define the Git address, and when you trigger a build, the file is fetched from Git and executed, for example:
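A minimal declarative Jenkinsfile sketch (the stage and step contents are just placeholders):
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // this file lives in Git; Jenkins checks it out and runs it
                echo 'Hello from a Jenkinsfile stored in Git'
            }
        }
    }
}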
There are several flaws in this: the pipeline learning curve is not small, you are confronted with Groovy (not XML!), and your current XML file is barely useful.
Maybe someone will show up and tell us about a (new to me) plugin that solves your problem using the configuration XML file. On the other hand, pipelines are such a beautiful feature that I encourage you to give them a try.

Jenkins drops a letter from file paths

We have a Code Composer Studio (Eclipse) project that uses CMake to generate makefiles and build. The project compiles as expected when the project is manually imported onto the Jenkins slave (Win10 x64) and executed from the command line, but fails when the build is handled by Jenkins. The failure always follows the same pattern: a single letter is dropped from the path of an object file. For example, [Repo directory]/Cockpit_Scaling_and_Exceedance_data.dir becomes [Repo directory]/Cockpit_Scaling_and_Exceedance_ata.dir and linking fails because it cannot find the referenced object file.
I made sure that there are no differences between the account environment variables and the system environment variables and have also configured the Jenkins Service to use the admin account on the slave instead of SYSTEM in order to get rid of as many differences between Jenkins and the command line as possible.
The project will build successfully using one of our other Jenkins slaves (also Win10 x64), so we know that it's not a Windows 10 issue or a problem with our Jenkins configuration. Since I can't find any differences between the configuration of the two slave machines, I was hoping that someone might be able to suggest somewhere to look for this path issue.
I never found out why the paths to object files were being mangled, but I did get the project to build successfully on the slave via Jenkins. All I did was change all of my system environment variables into user environment variables. I copy-pasted, so I know that the variables themselves did not change.
I have no idea why this corrected the issue, as I had inserted a whoami call at the beginning of the build to confirm that Jenkins was indeed running as a user and not SYSTEM. I guess from this point on all of my environment variables will be specific to a user and not SYSTEM...
EDIT: The problem has returned. I have made no further progress in tracking down the cause behind this issue, but I have found that I do not see this symptom when running the scripts in a bash environment instead of a Windows command prompt. Fortunately for me the scripts have all been written in such a way that they can be run in both environments, so I have had my coworkers use bash instead for them.

JENKINS_HOME environment variable used for 2 conflicting purposes

It appears that Jenkins is using the environment variable $JENKINS_HOME for 2 different purposes, and for each purpose it will get a different value.
Purpose #1: First, there is the JENKINS_HOME that is a directory on the local file system where Jenkins stores its files. Jenkins uses this directory for disk space to perform builds and keep archives. A sample value might be:
export JENKINS_HOME=/var/jenkins
That purpose is described here:
https://wiki.jenkins-ci.org/display/JENKINS/Tomcat
https://wiki.jenkins-ci.org/display/JENKINS/Administering+Jenkins
Purpose #2:
There is another instance where Jenkins uses the JENKINS_HOME environment variable, and that is for monitoring external jobs. In this case, JENKINS_HOME is a URL, like so:
export JENKINS_HOME=http://user:pw@myserver.acme.org/path/to/jenkins/
That purpose is described here:
https://wiki.jenkins-ci.org/display/JENKINS/Monitoring+external+jobs
So it seems odd that Jenkins would use the same environment variable with a value that changes depending on the purpose. I would think that the external-job use would get another name for the environment variable, like JENKINS_URL. I suppose as a workaround I can just set the environment variable in the servlet container (Tomcat, in my case) instead of on the operating system, so there is no conflict. Still, the fact that this conflict for the variable exists in the first place seems strange. Is there something I'm missing?
That is pretty confusing, but the second purpose is for monitoring Jenkins jobs in an external process, not within Jenkins itself; so it's not Jenkins that is using the $JENKINS_HOME value in this case and there is no conflict. They could have picked a better name for the variable, though.
In most other cases, the Jenkins master URL is referred to as JENKINS_URL - see the Jenkins CLI documentation for example.

Use Jenkins to compare files in two nodes

I wonder whether Jenkins has features to capture the result/data on a node and persist it on the master.
I have come up with a scenario where I need to check some folders on two machines to see whether they have the same number of files and the same sizes.
If Hudson can save a result like the output of "ls -ltR" on the master, then I can gather the results on both nodes in two jobs and then compare them.
Is there an elegant solution to this simple problem?
Currently I can connect the two machines to each other via SSH and solve the problem, but this connection is not always available.
(With SSH, I believe the best way is to use rsync -an /path/to/ hostB:/path/to/)
Simple problem, only slightly elegant solution :
Write a simple job listdir which does DIR > C:\logs\list1.txt
Go to Post-build Actions
Add Archive the artifacts - for the example above: C:\logs\*.*
Now run a build and go to http://jenkinsservername:8080/job/listdir/
You'll see list1.txt, which you can click on to see the contents.
I have given a Windows example; you can of course replace DIR with ls -ltR.
Or use archived artifacts in combination with the Copy Artifacts Plugin to pull the result of another job into the job where the comparison shall be done - a pipeline sketch follows below.
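A rough declarative-pipeline version of the same idea (a sketch; it assumes the Copy Artifacts Plugin is installed, and the agent label and the listdir-nodeB job name are hypothetical):
pipeline {
    agent { label 'nodeA' }  // hypothetical agent label
    stages {
        stage('List files') {
            steps {
                // capture the directory listing on this node and archive it
                sh 'ls -ltR /path/to > listA.txt'
                archiveArtifacts artifacts: 'listA.txt'
            }
        }
        stage('Compare') {
            steps {
                // pull the listing archived by a job that ran on the other node
                copyArtifacts projectName: 'listdir-nodeB', filter: 'listB.txt'
                sh 'diff listA.txt listB.txt'
            }
        }
    }
}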
