How to create a user-defined task in Synology?

I want to create a task in Synology that runs a script file every day.

There is a Task Scheduler in the Control Panel. Put your script in any folder you want and run it from the scheduler.
Choose "User-defined script".
Mine looks like:
/volume1/MainData/Programs/Scripts/Bash/Synology/rsync_work_backup.sh
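The contents of that script are not shown; a minimal sketch of what an rsync backup script like this could contain (the source and destination paths below are assumptions, not the original script):

#!/bin/bash
# Hypothetical example: mirror a work share to a backup volume.
SRC="/volume1/MainData/Work/"     # assumed source directory
DEST="/volume2/Backup/Work/"      # assumed destination
rsync -a --delete "$SRC" "$DEST"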

Related

Set group of Jenkins system user when creating a job

I'm using the Authorize User plugin in Jenkins, and I'm trying to set up a multi-tenant Jenkins with genuine access control. I want to have a folder with a set of jobs that group A can see, and another folder with another set of jobs that group B can see.
On the master/controller, a new job folder is created under $JENKINS_HOME/jobs/ when a build is triggered. However, this folder is created as the SYSTEM user, not the build user. The issue is that although I could just put the build user into the SYSTEM user's group, this would ALSO give them access to every job on the filesystem, not just the folder they should have access to.
Is there a way to configure what user:group is set when a job folder is created?
Perhaps this helps:
https://support.cloudbees.com/hc/en-us/articles/204173600-How-do-I-limit-users-access-to-the-folders-to-which-they-belong-to-?page=94
It should also work with vanilla Jenkins.
Warning: untried ;-)

Is there a way to overwrite a value contained within a config.properties file via Jenkins?

Is there a way to overwrite a value contained within a config.properties file via Jenkins?
I have the following config.properties file contained within my automation framework:
browser=chrome
url=http://www.example.com
If the value chrome is changed to firefox, then all tests will execute in the Firefox browser.
I can manually change this value by directly editing the config.properties file, but can the value be altered via Jenkins?
I use the Pipeline Utility Steps plugin to read properties files, and it looks like it can write a few other types of files, but not properties files.
It seems to me that you want to change this file so you can run some tests first in one browser, then in another. If that is the case, I think a better way to handle this is to get your tests to point to different files. This is a little cleaner, allows things like parallel execution, and when you find that another thing needs to change in the future, you won't be writing so many things to the file in a script, which gets a little error-prone.
If you can't make your tests execute against a different properties file, you could keep a copy of each file you need and then copy the right one to the appropriate filename before executing your tests, as sketched below.
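A minimal sketch of that copy step, assuming per-browser files named config.chrome.properties and config.firefox.properties and a build parameter named BROWSER (all of these names are assumptions):

#!/bin/sh
# Pick the properties file that matches the requested browser (hypothetical layout).
cp "config.${BROWSER}.properties" config.properties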
But maybe I made poor assumptions as to your setup here. ;)
Yes.
You can create a build parameter such as $browser to accept a value, say "firefox", and use sed inside an "Execute shell" step to replace the value in config.properties.
Once done, execute your scripts.
This is just an overview, as you have not posted details about your config.properties file, its location, or whether you are using Jenkins freestyle jobs or a Jenkinsfile/pipeline.
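A minimal sketch of that sed call, assuming config.properties sits at the workspace root and the job defines a string parameter named browser (both assumptions):

#!/bin/sh
# Rewrite the browser= line in place with the value of the $browser build parameter.
sed -i "s/^browser=.*/browser=${browser}/" config.properties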

How to upload a generic file into a Jenkins job?

I am trying to find a way to prompt the user to select and upload a generic file from a local machine to a Jenkins job prior to the build. The input file that the user is going to upload is not necessarily a text or a properties file.
I am specifically trying to get the user to "select" their desired file - browse to their file; the user should not pass the file's path.
Thanks
Use the File Parameter:
File parameter allows a build to accept a file, to be submitted by the user when scheduling a new build. The file will be placed inside the workspace at the known location after the check-out/update is done, so that your build scripts can use this file.
If you need to verify that the file has a certain extension, you would have to do that with a script as part of your job, and fail the job if the extension/content type does not match what you need.
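A minimal sketch of such a check in an "Execute shell" step, assuming the File Parameter's location was set to input_file (a hypothetical name); Jenkins drops the upload at $WORKSPACE/input_file and exposes the original file name in the $input_file variable:

#!/bin/sh
# Reject the build unless the uploaded file has an expected extension.
case "$input_file" in
  *.zip|*.tar.gz) echo "Accepted upload: $input_file" ;;
  *) echo "Unsupported file type: $input_file" >&2; exit 1 ;;
esac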
This is kind of annoying to handle when you don't know what the file name will be, or when you need to change its name before it reaches its destination. You kind of need to perform a hack. This is how I do it:
Use the "File parameter" to upload your file
Use an OS-specific script to rename the file from whatever you named your File Parameter to whatever you want it to be. For example, if my File Parameter had a file location value of file_name instead of an actual relative file path, I'd do something like this for, say, Windows, inside a build step for "Execute Windows batch command":
move .\file_name .\%file_name%
And then just use ArtifactDeployer to copy everything there to your desired location.
P.S.: this won't remove digital signatures, so the move operation should be considered mostly safe.
The Jenkins File Parameter will not work for Jenkins pipelines. It's ridiculous that they don't disable that kind of build parameter for pipelines. It's even more ridiculous that they don't, at the very least, identify this SEVERE limitation in the help documentation for that parameter.
It would have saved me a couple of hours trying to figure out why it would not work in my pipeline.
Refer to this feature request for more details: https://issues.jenkins-ci.org/browse/JENKINS-27413

Get result of a build step in Hudson/Jenkins to re-use it in another one

My question may be silly but I've been trying several ways and I still can't do what I want, i.e.:
- use the scp target of Ant to target a remote machine and execute a script there
- this script creates a dynamic list of files
- get this list of files (only their names) back in Hudson to use it in the next build step (another scp from Ant)
I tried to use environment variables but they are interpreted by Hudson so I'm stuck here...
Globally, my question would be: how do I get a result from an Ant build step?
Thanks for your ideas,
Emmanuel
You may find the File parameter useful. This allows you to create an input file and pass it to the build. You may need to write a script/Ant script to process the file, though.
In the long term you may evaluate a Hudson farm. This will allow you to create tasks that span multiple machines and pass results around. (https://wiki.jenkins-ci.org/display/JENKINS/Plugins)
You can get the ID(s) of the job that triggered your job via the API and fetch their status.
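A minimal sketch of that API call, assuming a Jenkins/Hudson instance at http://jenkins.example.com with an upstream job named upstream-job and build number 42 (all hypothetical values); the JSON response includes the build's result and its triggering causes:

#!/bin/sh
# Query the build's JSON description and pull out its result field.
curl -s "http://jenkins.example.com/job/upstream-job/42/api/json" | grep -o '"result":"[^"]*"'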

Ant task to remotely delete a directory

Is there a way to do this other than with the sshexec task? I know that you can copy files with the scp task. However, I need to do additional things, like checking whether some folders are there and then deleting them. I would like to use something like the condition task and the delete task for this. For now I have set it up with the sshexec task, but this will most likely not work on a Windows server. And to do something like check if a directory is there and delete it, I would have to write a script instead of using Ant tasks (for now I assume that the directory that should be deleted is actually there, which I don't like, because when it's missing my sshexec task will fail). Thanks in advance for any help.
Yes: you can use sshexec. The documentation describes how to do it: http://ant.apache.org/manual/Tasks/sshexec.html
For example:
<sshexec host="somehost"
         username="dude"
         password="yo"
         command="touch somefile"/>
You can use the 'command' attribute to call a shell script on the remote host that does your checks and deletes, since I'm not sure whether you can put multiple commands in the command attribute, and it would get a bit messy that way anyway.
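A minimal sketch of such a remote script, assuming the directory to clean up is /srv/app/build-output (a hypothetical path); it only deletes the directory when it exists, so the sshexec call does not fail on a missing folder:

#!/bin/sh
# remote_cleanup.sh (hypothetical name), invoked via sshexec's command attribute.
TARGET_DIR="/srv/app/build-output"   # assumed path; adjust to your layout
if [ -d "$TARGET_DIR" ]; then
    rm -rf "$TARGET_DIR"
    echo "Deleted $TARGET_DIR"
else
    echo "$TARGET_DIR not present; nothing to delete"
fi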
