I have a mixture of local tasks and Mantis repository tasks. I'd like to move all my local tasks to the Mantis repository. Is there a way to do this easily? Currently the only way I can see is to recreate the task and specify it to be a task for the Mantis repository which will be cumbersome and time consuming.
Thanks in advance for any answers.
The only way I'm aware of is to Clone each local task and send it to the Mantis repository. You can right-click on the task editor header or on the task in the task list to access the context menu.
Afterwards, in the same menu, you can use Context > Copy To... to copy the task context to the new task, if needed.
I am trying to use "schedule task" in DBeaver, so I created my "database task". When I right-click on a task to set the schedule, there is no "scheduler" button. The options to click on are:
"Run task, Edit task, Create new task, Copy task, Delete, Create new task folder, rename folder, group task by category, group task by type, copy, configure column, auto-size column"
The version is 22.3.1.
Is there anything that I have to download or set up before these steps? What is wrong?
After facing the same problem, I found the reason: the scheduler feature is only available in the Enterprise and Ultimate editions of DBeaver.
On the following link, scroll down and click on the Administration & Security tab to check the differences between editions:
dbeaver.com/edition
I am using the scriptrunner plugin for Jira.
Is it possible to add a condition to a transition using scriptrunner?
Currently, my condition is in a script which I have manually added to the workflow.
But I was wondering if there is a way to do it automatically?
I was looking through documentation on: https://docs.atlassian.com/
I came across this method:
replaceConditionInTransition, which is a method of WorkflowManager.
But I'm unsure how to use this.
Any help would be appreciated.
Conditions, like any other scripts, can be added from the file system. You can store scripts in any VCS (Bitbucket, GitHub, GitLab, etc.) and automatically deploy them to the Jira server's file system through any CI/CD system (TeamCity, Jenkins, Bamboo, GitLab, etc.).
As a result, the process looks like this: 1. commit the change to your script in the VCS; 2. wait a bit for the auto-deploy (e.g. triggered by the commit); 3. done. Additionally, you can write a script/service to commit these changes automatically if needed.
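A minimal sketch of such a deploy step, assuming rsync over SSH; the host, user, directory names, and the DEPLOY guard variable are all hypothetical, so adapt them to your own ScriptRunner script root:

```shell
#!/bin/sh
# CI deploy step: copy condition scripts from the repo checkout into
# ScriptRunner's script root on the Jira server. All names are examples.

# Build the rsync destination from user, host, and script root.
dest() {
    printf '%s@%s:%s/conditions/' "$1" "$2" "$3"
}

JIRA_HOST="jira.example.com"                 # assumed server
SCRIPT_ROOT="/var/atlassian/jira/scripts"    # assumed script root path

# Guarded so the script is harmless outside CI; the CI job sets DEPLOY=1.
if [ "${DEPLOY:-0}" = "1" ]; then
    rsync -av --delete conditions/ "$(dest jira "$JIRA_HOST" "$SCRIPT_ROOT")"
fi
```

The guard variable keeps the same script safe to run locally for a dry check while letting the CI job perform the actual copy.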
Also look at script roots; they are a helpful mechanism for reusing script fragments through helper classes.
This is a rather conceptual answer, because the implementation depends on your environment, but I hope it gives you at least one more point of view on solving this task.
I think that using the Java API to modify Jira workflows is pretty tough. You could dig around in the workflow editor to see how conditions are added there. Remember that you have to do this in a draft workflow and then publish it, which takes some time in large projects.
I like the idea of replacing a script file as the easier option, if it can be done while no issues are transitioning.
I'm trying to set up a build server for a process we're trying to automate, and have run into an issue that I'm hoping there's an easy fix (or plugin) for.
The end goal is to have a set of jobs that each check for changes in a small subset of our Perforce depot, and then perform a task. The issue is that each of these tasks requires the entire depot in order to execute properly, and since the depot is so large (30+GB) and there are so many of these tasks (50+), actually duplicating the depot and syncing it would be an extreme waste of disk space and network bandwidth.
Instead, I'd like to have one "master" job that deals with syncing the depot (which all the child jobs share), and then have each child job use their own workspace and the "Preview Check Only" populate option in the Jenkins Perforce plugin (which syncs with p4 sync -k). Each child job's workspace would then exist only for the job to be able to detect when changes it is interested in have happened, after which they would run their tasks from inside the "master" workspace depot, and everything should just work!
Except I have not been able to figure out how to trigger the child jobs AND have them check for changes to their local workspace before deciding to run. Is there any way to have the children run all the checks they would normally run (if they were just scheduled to run every once in a while) when triggered by another job?
Or maybe there's a better way to do what I'm trying to do? Something that would allow me to share a single Perforce depot, but have child jobs that only run when part of that depot changes? There will also be more child jobs created over time, so being able to set them up and configure them easily would also be a nice thing to have.
I finally figured out a solution here, but it's pretty convoluted. Here goes!
The master job does the Perforce sync and then runs a batch script that uses curl to trigger polling of each child job: curl -X POST http://jenkins/view/Job/polling. When the child jobs have Poll SCM enabled (but no schedule set), this lets them poll the SCM when they receive the polling request from the web API. Kinda messy, but it works!
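For reference, the batch script can be sketched like this; the server URL and job names are placeholders, and the per-job /job/&lt;name&gt;/polling endpoint is assumed to be exposed because Poll SCM is enabled on each child:

```shell
#!/bin/sh
# Master job's post-sync step: ask every child job to poll its SCM.
JENKINS_URL="http://jenkins:8080"   # placeholder server address

# Build the polling endpoint for one job ("Poll SCM" must be enabled
# on the job for this URL to exist).
polling_url() {
    printf '%s/job/%s/polling' "$JENKINS_URL" "$1"
}

for job in child-job-a child-job-b; do
    # --max-time keeps a dead server from hanging the master build;
    # add --user user:apitoken if your Jenkins requires authentication.
    curl -fs --max-time 10 -X POST "$(polling_url "$job")" \
        || echo "poll request failed for $job" >&2
done
```

Each child then runs its own SCM comparison against its p4 sync -k workspace and only builds if its subset of the depot changed.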
I'm looking for a way to automatically add +2 permissions on certain refs for a lot of projects in Gerrit, and unfortunately it seems there are no API calls to modify access rights, only to read them. Do you have any idea how to modify ref permissions for a large number of projects?
I'm using Gerrit 2.9.
Thanks.
One possibility would be to create a batch script that modifies the project.config for those projects and commits it back to Gerrit.
This is how you can checkout the project.config for the All-Projects, it works the same for other projects: http://blog.bruin.sg/2013/04/how-to-edit-the-project-config-for-all-projects-in-gerrit/
Simply put:
Create a list of the projects you want to change
Iterate over the list
Checkout the refs/meta/config ref
Use script to modify project.config
Commit and push back to the server
More information about the project.config: http://gerrit-review.googlesource.com/Documentation/config-project-config.html
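Put together, the loop might look like the sketch below. The host, port, group name, label grant, and the projects.txt file are all assumptions; the edit simply appends an access section in git-config format (repeated sections are merged by the config parser):

```shell
#!/bin/sh
# Batch-edit project.config for many Gerrit projects. Names are examples.

# Append a Code-Review -2..+2 grant for a group to a project.config file.
# Repeated [access] sections are allowed in git-config syntax.
grant_plus2() {
    printf '[access "refs/heads/*"]\n\tlabel-Code-Review = -2..+2 group %s\n' \
        "$2" >> "$1"
}

for project in $(cat projects.txt 2>/dev/null); do
    git clone "ssh://gerrit.example.com:29418/$project" tmp-cfg && (
        cd tmp-cfg
        # project.config lives on the special refs/meta/config ref
        git fetch origin refs/meta/config && git checkout FETCH_HEAD
        grant_plus2 project.config "Project Owners"
        git commit -am "Allow Code-Review +2 on refs/heads/*"
        git push origin HEAD:refs/meta/config
    )
    rm -rf tmp-cfg
done
```

If direct pushes to refs/meta/config are restricted on your server, push to refs/for/refs/meta/config instead and approve the resulting changes.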
I need to build in Jenkins only if there has been a change in the ClearCase stream. I want to check this in nightly builds as well as when someone chooses to build manually, and to stop the build completely if there are no changes.
I tried Poll SCM, but it doesn't seem to work well...
Any suggestion?
If it is possible, you should monitor the update of a snapshot view and, if the log of said update reveals any new files loaded, trigger the Jenkins job.
You can find a similar approach in this thread.
You don't want to do something like that in a checkin trigger. It runs on the user's client and will slow things down, not to mention that you'd somehow have to figure out how to give every client access to that snapshot view.
What can work is a cron or scheduled job that runs lshistory and does something when it finds new checkins.
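A sketch of such a scheduled job, assuming cleartool is on the PATH and a view is set; the VOB path, Jenkins URL, and state-file location are hypothetical:

```shell
#!/bin/sh
# Cron job: look for new checkins since the last run and, if any are
# found, kick a build. All names and paths are examples.
SINCE_FILE="${TMPDIR:-/tmp}/last_cc_check"

# True if the lshistory output on stdin contains any "create version"
# events (i.e. checkins).
has_checkins() {
    grep -q 'create version'
}

since=$(cat "$SINCE_FILE" 2>/dev/null || echo "yesterday")
if cleartool lshistory -since "$since" -nco -recurse /vobs/myvob 2>/dev/null \
        | has_checkins; then
    curl -s "http://jenkins/job/cc-build/build"   # hypothetical job URL
fi
# Remember when we last looked (a format cleartool -since accepts).
date '+%d-%b-%Y.%H:%M:%S' > "$SINCE_FILE"
```

The downside, as noted below, is that this queries the VOB on every run even when nothing has changed.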
Yes, you could do this via a trigger, but I'd suggest a combination of a trigger and an additional script, since updating the snapshot view might be time-consuming and affect checkins.
Create a simple trigger that fires when the files you are concerned about are changed on a stream.
The trigger script should touch/create a file in some well-known network location (or perhaps write to a pipe).
The other script could be a cron (Unix) or AT (Windows) job that runs continually or every minute; if the well-known file is there, it performs the update of the snapshot view.
That script could also read the pipe written to by the trigger, if you go that route.
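The flag-file variant can be sketched as below; the flag path, view path, trigger name, and Jenkins URL are all assumptions for illustration:

```shell
#!/bin/sh
# Cron/AT side of the handshake: the (fast) ClearCase trigger only
# touches a flag file; this job does the slow snapshot-view update.
#
# One-time setup (VOB admin): a post-checkin trigger that touches the
# flag, so the cost on the user's client is negligible:
#   cleartool mktrtype -element -all -postop checkin \
#       -execunix 'touch /net/build/share/cc_changed.flag' notify_build

FLAG="/net/build/share/cc_changed.flag"   # assumed shared location

# Consume the flag: succeed (and remove it) only if it exists, so a
# burst of checkins results in a single view update.
consume_flag() {
    [ -e "$1" ] && rm -f "$1"
}

if consume_flag "$FLAG"; then
    cleartool update -force /views/build_snapshot   # the slow part
    curl -s "http://jenkins/job/cc-build/build"     # kick the build
fi
```

Removing the flag before the update means checkins that arrive during the update simply re-create it, and the next cron run picks them up.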
This is better than a cron job that has to run lshistory each time. Martina was right to suggest not doing the whole thing in a trigger, both for performance and because the snapshot view would have to be accessible from every client; but a trigger that writes to a pipe or touches an empty file is cheap, and the cron/AT job that actually does the update is efficient because it does not have to query the VOB every minute, only check the file (or act only after there is data on the pipe).