Unable to access workspace on Jenkins Master after build

I am running a Kubernetes cloud hosting Jenkins Master CI instances. My Jenkins Master launches a builder pod inside the cloud and kills that pod after the build is done (nothing unusual here). I am copying the workspace back to the master with the 'copy-to-slave' plugin. The problem is that I am not able to access my workspace after the build is done, even though the workspace sits on the Jenkins Master (I can exec into the JM pod and see the workspace under $JENKINS_HOME/workspace/<job_name>). Is there any workaround for this? How can I access my workspace AFTER the build is done?

You need to run the specific job at least once on the master. I don't know why it behaves this strangely, but after one run on the master you will be able to see the workspace.
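If what you actually need are the build outputs rather than the whole workspace, archiving artifacts sidesteps the problem entirely: archived files are stored on the master and stay browsable after the builder pod is gone. A minimal Declarative Pipeline sketch, where the 'k8s-builder' label, the make step and the output path are hypothetical placeholders:

pipeline {
    agent { label 'k8s-builder' }       // hypothetical pod template label
    stages {
        stage('Build') {
            steps {
                sh 'make'               // hypothetical build step
                // archiveArtifacts copies the matched files to the master,
                // where they remain accessible after the pod is killed
                archiveArtifacts artifacts: 'output/**', fingerprint: true
            }
        }
    }
}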

Related

Jenkins pipeline Workspace got deleted automatically

We have observed that Jenkins Pipeline workspace project folders are getting deleted. We have not configured any cleanup plugin for this, nor any cleanup module in the pipeline.
This behavior is random, and it is deleting old as well as new jobs.
We can see the workspace deletion traces in /var/lib/jenkins/logs/tasks under Workspace clean-up.log. Please let me know if anybody is facing the same issue and how to fix it. Our Jenkins version is 2.289.2.
Try disabling the built-in workspace cleanup. There are two ways to achieve this. I was facing the same issue; I have tried the first approach and am monitoring the workspaces to see if it works (see the verification sketch after this list).
1. Add -Dhudson.model.WorkspaceCleanupThread.disabled=true to the Jenkins system properties.
If Jenkins is running from a terminal:
java -Dhudson.model.WorkspaceCleanupThread.disabled=true -jar jenkins.war
If Jenkins is running as a Linux service:
Stop Jenkins (service jenkins stop). You will need root privileges.
Edit the /etc/default/jenkins file.
Add a line for JAVA_ARGS, or extend it if it already exists:
JAVA_ARGS="-Dhudson.model.WorkspaceCleanupThread.disabled=true"
Start Jenkins (service jenkins start).
2. Disable or uninstall the Workspace Cleanup plugin. (I haven't tried this.)
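To check whether the system property actually took effect after the restart, a one-liner in Manage Jenkins / Script Console can be used. This is just a sanity check, not part of the fix:

// Prints 'true' if the cleanup thread is disabled, or 'null' if the
// system property was not picked up at startup
println System.getProperty('hudson.model.WorkspaceCleanupThread.disabled')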

Is the Jenkins workspace on the master or the worker?

Who does the actual cloning of the project, the master or the agent node? If it is the master, then how does the agent node actually execute the job? If it is the agent node, how can we view the workspace in the browser?
When people ask "where is the workspace", the answer is usually a path, but I am more interested in where that path lives: on the master, on the agent node, or maybe on both?
Edit 1:
I aligned my terminology with https://jenkins.io/doc/book/glossary/ to avoid confusion.
In a Jenkins setup, all machines are considered nodes. The master node connects to one or more agent nodes. Executors can run on both the master and the agent nodes.
In my scenario, no executors run on the master; they run only on the agent nodes.
The answer is: it depends!
First of all, although it is not good practice IMO, some installations let the master be an actual worker and run jobs. In this case, the workspace will be on the master.
If you configured the master not to accept jobs, there are still occasions when a workspace can be created on the master. A good example is when your job is a "pipeline script from SCM". In this case, the master will create a workspace for the job, clone the target repo, and read the pipeline, then start the needed steps on whatever slave is targeted, creating a workspace there to run the actions themselves. If the pipeline targets multiple slaves, there will be a workspace on each of them.
In simple situations (e.g. a Maven or freestyle job), the workspace will only be on the targeted slave.
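A minimal Scripted Pipeline sketch of the multi-slave case; the 'linux-a' and 'linux-b' labels are hypothetical, and checkout scm assumes the pipeline itself came from SCM. Each node() call allocates its own workspace on that agent, so a single build can leave workspaces on several machines:

node('linux-a') {
    checkout scm      // a workspace is created on linux-a for the clone
}
node('linux-b') {
    sh 'ls'           // a separate workspace is created on linux-b
}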
I needed to dig a bit deeper to understand this.
I ran a brand new instance of Jenkins and I attached a single agent node. I used SSH and I set the remote (agent) root directory to: /home/igorski/jenkins
As soon as I attached the node the remoting folder and remoting.jar showed up in that root directory.
I ran a basic Gradle Java pipeline job (Jenkinsfile in the project).
The workspace showed up on the slave, not on the master.
From the Jenkins GUI I can access the workspace and see its contents.
The moment I kill the agent machine, I can no longer view the workspace in Jenkins.
My guess is that remoting.jar somehow does a live sync.
I also ran a freestyle project and can confirm the same: as soon as the agent is killed I can no longer open the workspace, and I get an error stack trace:
hudson.remoting.Channel$CallSiteStackTrace: Remote call to JenkoOne
This was much more obvious with the pipeline job, though. There you get a link to the agent that you need to click in order to see the contents; as soon as the agent is gone, the link is disabled, and you know exactly which agent the workspace is on. With freestyle jobs you just get a Workspace link, with no indication of which agent it is on or whether that agent is reachable at the moment.
So, both Zeitounator and fabian were correct.

View jenkins workspace on slave

We're running tests and producing build files on a Jenkins master and a Jenkins slave for extra parallelisation; our RCPTT tests take ages.
Our problem is that the job's "Show workspace" link in Jenkins only shows the workspace on the master, so we have no way to get the builds except copying files manually over SSH.
We don't want duplication, since different patches run either on the master or on the slave, and we want to be able to get the files from both master and slave nodes.
You can use "Copy To Slave Plugin" to copy any files from master to slave.
If you use the Jenkins pipeline plugin (https://jenkins.io/doc/book/pipeline/),
then you can use stash / unstash:
https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-stash-code-stash-some-files-to-be-used-later-in-the-build
Here is an example: https://www.cloudbees.com/blog/parallelism-and-distributed-builds-jenkins
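A minimal sketch of the stash/unstash approach, collecting results produced on two nodes into one workspace; the node labels, test script, and paths are hypothetical:

node('master-node') {
    sh './run-rcptt-tests.sh'                            // hypothetical test run
    stash name: 'results-master', includes: 'results/**'
}
node('slave-node') {
    sh './run-rcptt-tests.sh'
    stash name: 'results-slave', includes: 'results/**'
}
node('master-node') {
    // unstash into separate directories so the two result sets don't collide
    dir('from-master') { unstash 'results-master' }
    dir('from-slave')  { unstash 'results-slave' }
}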

Jenkins - Copy build log from master to a shared drive

Can someone direct me here? I have a simple job configured in Jenkins on a Windows environment (master and all slaves running on Windows), and the job is supposed to run on a particular slave. When you build the job, the build log (log.log) gets stored in "%JENKINS_HOME%\jobs\\builds\%BUILD_NUMBER%\" on the master.
I do have a Jenkins workspace (which is required when you add a slave node) set on the slave for this job, where nothing gets stored when the job runs.
With this scenario, I would like to copy the build log (the log.log file that's available on the master) to a shared drive. Please advise me on the way to get this done. I have tried a few plugins ("Copy to slave", "Copy Artifact Plugin" and "ArtifactDeployer Plugin") but could not get them working to meet what I need.
Use a second build step with the "Execute Windows batch command" option, and put the copy command there to copy the log to another location.
The following command kind of works:
curl ${BUILD_URL}consoleFull -o ${TargetDir}/Log.txt
where
TargetDir="${WORKSPACE}/Directory/target"
BUILD_URL and WORKSPACE are set by Jenkins. Unfortunately Jenkins doesn't copy the whole log; I've tried consoleText and gotten the same result: partial log files. :-( (Presumably this is because the command runs while the build is still in progress, so whatever it fetches can only ever be the log up to that point.)
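One hedged way around the partial-log problem is to fetch the log from a separate job that is triggered after the first job completes, at which point the consoleText endpoint returns the full log. A minimal sketch, assuming a Windows agent with curl on the PATH; the 'windows' label, upstream job name, and share path are hypothetical:

pipeline {
    agent { label 'windows' }   // hypothetical agent label
    stages {
        stage('Copy upstream log') {
            steps {
                // The upstream build has finished by now, so the log is complete
                bat 'curl -s "%JENKINS_URL%job/my-upstream-job/lastCompletedBuild/consoleText" -o "\\\\fileserver\\share\\upstream-log.txt"'
            }
        }
    }
}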

Jenkins Slave build locations

I have just added a slave to my Jenkins build, with the idea that I can now deploy artefacts to either my dev server or my test server.
However, I've now hit a problem.
When I run the job on the master, the build directory is
$JENKINS_HOME/localmoduledirectory (as defined in the build job)
However when I deploy my job via the slave the build directory is different which breaks my jobs. The build directory is
$JENKINS_HOME/workspace/build job title/localmoduledirectory
I know I can change the workspace root directory location for the master under Configure System / Advanced, so I can change it to $JENKINS_HOME/workspace, but I want to stop the slave from using the build job title in the path.
The end result I'm after is to have Jenkins building/deploying from the same location on both servers, i.e. /opt/jenkins/workspace/localmoduledirectory.
Any ideas?
OK, after lots of head scratching I managed to discover that the Maven plugin has a custom workspace option hidden under Advanced, so I configured all jobs with a custom workspace of /opt/jenkins.
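For pipeline jobs, the equivalent is the customWorkspace agent option. A minimal Declarative sketch, where the 'deploy-slave' label and the mvn step are hypothetical and the path is taken from the question:

pipeline {
    agent {
        node {
            label 'deploy-slave'    // hypothetical agent label
            // Pin the workspace so both servers use the same path layout
            customWorkspace '/opt/jenkins/workspace/localmoduledirectory'
        }
    }
    stages {
        stage('Build') {
            steps { sh 'mvn install' }   // hypothetical build step
        }
    }
}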