Jenkins: Restored job is not visible

We have restored a job on our Jenkins server from our last backup, but it's not visible in the UI.
I have tried reloading the configuration from disk and even restarted the service, but it is still not visible.
Am I missing something?

Did you restore only the config.xml file (and its parent folder)?
Did you check the permissions on the restored file (or folder)?
Did you check the "All" view in Jenkins?
If you use a direct path to your job (like http://your.jenkins.ci/job/myjob), does it work?
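If only the config.xml was restored, it needs to sit in its own job folder under $JENKINS_HOME/jobs/ and be owned/readable by the Jenkins user before a reload will pick it up. A minimal sketch of the whole sequence, assuming a typical Linux package install (the paths, job name, and credentials are placeholders, not taken from the question):

# restore the job folder and hand it to the Jenkins service account
sudo mkdir -p /var/lib/jenkins/jobs/myjob
sudo cp /backup/jobs/myjob/config.xml /var/lib/jenkins/jobs/myjob/config.xml
sudo chown -R jenkins:jenkins /var/lib/jenkins/jobs/myjob
# reload the configuration from disk (equivalent to the button under "Manage Jenkins")
java -jar jenkins-cli.jar -s http://localhost:8080/ -auth admin:APITOKEN reload-configuration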

Related

Set group of Jenkins system user when creating a job

I'm using the Authorize User plugin in Jenkins, and I'm trying to set up a multi-tenant Jenkins with genuine access control. I want one folder with a set of jobs that group A can see, and another folder with a different set of jobs that group B can see.
On the master/controller, a new job folder is created under $JENKINS_HOME/jobs/ when a build is triggered. However, this folder is created as the SYSTEM user, not the build user. The issue is that although I could just put the build user into the SYSTEM user's group, that would ALSO give them access to every job on the filesystem, not just the folder they should have.
Is there a way to configure what user:group is set when a job folder is created?
Perhaps this helps:
https://support.cloudbees.com/hc/en-us/articles/204173600-How-do-I-limit-users-access-to-the-folders-to-which-they-belong-to-
It should also work with vanilla Jenkins.
Warning: untried ;-)
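If the concern is specifically the Unix group that new job folders get on disk (which the linked article does not address), one filesystem-level option, equally untried here, is to give each tenant folder its own group and set the setgid bit so new job directories inherit it. The paths and group name below are made-up examples, and this only controls the group; the owning user will still be the Jenkins process user:

# give group A's folder a dedicated group (names and paths assumed)
sudo chgrp -R tenant-a /var/lib/jenkins/jobs/folderA
# setgid on the directory makes newly created job folders inherit the tenant-a group
sudo chmod g+s /var/lib/jenkins/jobs/folderA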

There are resources Jenkins was not able to dispose automatically - concerning?

After running different jobs I sometimes get this message in Jenkins:
"There are resources Jenkins was not able to dispose automatically."
I can then click the link provided and there is no additional information there. The jobs run fine, the workspace is as expected, the jobs folder looks normal. Is this something I should be concerned with?
You mentioned you believe all the work happens on your master, not an agent. This may negate what I'm about to say, but it might help for troubleshooting anyway:
We have a master/agent setup and often get those warnings. We found it was because one of our jobs created files with permission settings that didn't give Jenkins permission to delete them. Sometimes we could track down the exact files; sometimes it was blank, like you said.
We figured out that the blank ones were happening because the agent was taken offline once it was done with its jobs, and then deleted. No agent = no files. Maybe your master deletes its workspace periodically and creates the same effect?
Either way the solution for us was to change the permissions on the affected files, and we stopped getting the messages.
This error appears when Jenkins tries to delete a ws-cleanup folder but cannot, most likely because of a permission problem.
To check which files Jenkins is trying, but failing, to delete:
sudo find /var/lib/jenkins/workspace/ws-cleanup/ -user root
To delete:
sudo find /var/lib/jenkins/workspace/ws-cleanup/ -user root -delete
To avoid this in the future, add the delete command (or a permission fix, sketched below) to the job whose files are causing the problem.
Regards
DevOpsBro
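One way to act on both answers above is a final shell build step that hands everything in the workspace back to the Jenkins user before cleanup runs. This is only a sketch; it assumes the offending files live under the job's workspace and that the jenkins account is allowed to run chown via sudo, which is a site-specific decision:

# last build step, so the ws-cleanup plugin can remove everything afterwards
sudo chown -R jenkins:jenkins "$WORKSPACE"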

Jenkins: Tracing the history of unsaved new test definition (copied from another test definition)?

Recently, in our enterprise production setup, it seems someone tried to set up a new job / test definition by copying an identical existing job. However, they appear NOT to have saved it properly (and, I am guessing here, probably closed the browser and lost the session).
But the new job got saved even though it was not set to stable or active; we know about this because changes uploaded to Gerrit started failing in this newly set up, partial job (because those changes were in certain repos that matched certain TDD settings).
Question: the Jenkins system has no trace of who set it up in the 'configure versions' option. Is there any way to find out who set up the job and when that was done?
No, Jenkins does not store that information by default.
If your Jenkins instance happens to be running behind an Apache or Nginx web server, there might be access logs that can help you. To find out when the job was created, you could look at when its config.xml file was created/modified.
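For example, something along these lines (untested; the job name, install path, and log location are assumptions for a Linux package install behind Nginx, and access logs typically only give you an IP address and a timestamp rather than a username):

# when was the job's config last written?
stat /var/lib/jenkins/jobs/myjob/config.xml
# which clients touched this job's URLs, including the form submissions that created/configured it?
grep '/job/myjob/' /var/log/nginx/access.log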
However, there are a few plugins that can add this functionality so that you won't have this problem again:
JobConfigHistory Plugin – Tracks changes in your job configurations and gives the ability to restore old versions.
Audit Trail Plugin – Keeps a log of who performed particular Jenkins operations, such as configuring jobs.

Jenkins: prevent from reloading configuration

I am currently doing an analysis of whether Jenkins could fit our needs.
Therefore I need to know something about (NOT) reloading configurations:
I know that there is an explicit way to reload a configuration (via WebGUI and CLI).
BUT:
Is there also a way to PREVENT Jenkins from reloading configs?
One requirement is that the CI system reads in all config files (general and job configs) ONCE at startup, and afterwards a modification of the config files shall take NO effect!
Do you know whether this is already the case (unless I press that button under "Manage Jenkins" | "Reload Configuration from Disk"; the exact options might sound a little different because I only have a German version here)?
Would be very thankful for your help,
Lukas
I keep the config.xml files under a Git repo, so I have experience of the XML files changing while Jenkins is running.
I can confidently state that Jenkins will not reread the config.xml file unless you specifically ask it to via the UI or CLI. In fact, if the config is changed via the UI, any changes that have been made to the config.xml file on disk will be overwritten with the in-memory version.
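A quick way to convince yourself of this behaviour (a rough sketch; the path, job name, URL and credentials are placeholders):

# change a job's description directly in config.xml on disk...
sudo sed -i 's|<description>.*</description>|<description>edited on disk</description>|' /var/lib/jenkins/jobs/myjob/config.xml
# ...then ask the running Jenkins for it: the API keeps returning the old value
# until "Reload Configuration from Disk" is triggered explicitly
curl -s -u admin:APITOKEN "http://localhost:8080/job/myjob/api/json?tree=description"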

Changing a path in a config file stored in TFS

We have a solution stored in TFS that deploys to SharePoint. As part of the solution we have a config file containing a path to a specific site. The problem is that this path changes depending on the user's dev machine, e.g.
<site>devmachine1/somesite</site>
<site>devmachine2/somesite</site>
This can obviously be updated to work locally after a check-out; however, when the file gets checked back in it will be incorrect on the next user's machine if they do a Get. Is there a way that the file can be excluded, or a script run to update the path when it is checked back in or out?
The best option would be to rationalise all of the developer workstations.
I would do this by adding an identical entry to the hosts file of each machine that hard-codes the name of the SharePoint site, allowing you to have the same config file work on every dev machine.
Make it dynamic by having a pre-build instruction that adds the host entry; that way any developer can get and build.
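As a rough illustration (the alias and IP are invented placeholders, not anything from the answer above): every dev machine maps the same alias to its own SharePoint instance, and the checked-in config only ever references the alias.

hosts entry on every dev machine (the IP is wherever that machine's SharePoint answers, e.g. 127.0.0.1 if it runs locally):
127.0.0.1   spdev
checked-in config file, now identical for everyone:
<site>spdev/somesite</site>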
You can use a custom check-in policy to update the file when it is checked in. See here.
