In the Gerrit documentation there is no configuration setting that could be used to automatically send e-mails when there has been no activity on a commit for a given period of time.
There is a notification type, abandoned_changes, but it only fires after the change has already been abandoned.
What I want is an e-mail notification when the commit has not been abandoned but has seen no activity, something like "Hey, do you really need this commit?"
You can't configure Gerrit to send this type of notification, but you can search for open changes that haven't been updated for a while, as in the following example:
is:open AND age:1week
See more info about search operators here.
You can write a script (bash, Perl, etc.) to execute the query through the REST API and send the result by e-mail.
See more info about querying changes using REST here.
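As a rough sketch, a Python version could look like the following (the Gerrit host, e-mail addresses and SMTP server are placeholders, and it assumes the server allows anonymous queries; authenticated servers need the /a/ prefix plus credentials):

    import json
    import smtplib
    from email.message import EmailMessage

    import requests

    GERRIT_URL = "https://gerrit.example.com"   # placeholder host
    QUERY = "is:open AND age:1week"             # the query from above

    # Run the query through the REST API (use /a/changes/ plus credentials
    # on servers that do not allow anonymous access).
    resp = requests.get(f"{GERRIT_URL}/changes/", params={"q": QUERY})
    resp.raise_for_status()

    # Gerrit prefixes JSON responses with the magic ")]}'" line; strip it.
    changes = json.loads(resp.text.split("\n", 1)[1])

    if changes:
        body = "\n".join(f"{c['_number']}: {c['subject']}" for c in changes)
        msg = EmailMessage()
        msg["Subject"] = "Stale Gerrit changes - do you still need these?"
        msg["From"] = "gerrit-bot@example.com"   # placeholder addresses
        msg["To"] = "team@example.com"
        msg.set_content(body)
        with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder SMTP host
            smtp.send_message(msg)

Run something like this from cron once a day and you get the reminder mails the question asks for.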
I need to know if there is any way to capture the pull request creation event in Bitbucket so that I can write a custom script around it.
I don't want the PR to trigger an automatic build; I just need to capture the PR creation event so a custom Python script can trigger certain activities.
One option is to continuously poll the Bitbucket PR page to detect the activity, but that sounds like a bad scripting standard and not a good practice.
Is there any git command to identify that a PR has been created from a feature branch to be merged into its parent (say DEV), and also a git command to capture the PR's default reviewer names?
That way, these git commands could be used in a script to track whether any PR has been raised.
I have checked many websites and read many articles, and unfortunately found no way to capture the PR creation event in Bitbucket.
It would be great if anyone could explain how to capture this event so we can write our own custom logic around it.
A PR is not part of git itself. It's a concept specific to code hosting sites like Bitbucket, GitHub, etc. Therefore it's not possible to capture a PR creation event using the git tool alone.
You can poll the Bitbucket REST APIs for the PR list. The response is in JSON format, stable and easy to parse, so it's not really a bad practice (unlike parsing the raw HTML).
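For instance, a minimal polling loop in Python might look like this (the workspace, repository and credentials are placeholders, and the URL assumes Bitbucket Cloud's 2.0 API; Bitbucket Server uses a different /rest/api/1.0/... scheme):

    import time

    import requests

    # Placeholder workspace/repo and credentials (app password).
    API = "https://api.bitbucket.org/2.0/repositories/my-workspace/my-repo/pullrequests"
    AUTH = ("bot-user", "app-password")

    seen = set()

    while True:
        resp = requests.get(API, params={"state": "OPEN"}, auth=AUTH)
        resp.raise_for_status()
        for pr in resp.json().get("values", []):
            if pr["id"] not in seen:
                seen.add(pr["id"])
                # A PR we haven't seen before: run the custom logic here.
                print(f"New PR #{pr['id']}: {pr['title']}")
        time.sleep(60)  # poll interval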
Another way is to set up a Bitbucket webhook that triggers on the PR creation event. You then need to run a small web server to accept the webhook call and run the corresponding scripts.
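A bare-bones receiver using only the Python standard library could be sketched like this (the port is arbitrary, and pullrequest:created is the X-Event-Key value Bitbucket Cloud sends for PR creation; double-check it for your Bitbucket flavour):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WebhookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            # Bitbucket identifies the event type in the X-Event-Key header.
            if self.headers.get("X-Event-Key") == "pullrequest:created":
                pr = payload.get("pullrequest", {})
                print(f"PR created: #{pr.get('id')} {pr.get('title')}")
                # ...call the custom logic here...
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        # Listen on a port reachable by Bitbucket (placeholder port).
        HTTPServer(("", 8080), WebhookHandler).serve_forever()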
I am having an issue with my Jenkins pipeline: it pushes a tag as one of its steps, which ultimately kicks off the build again, causing a loop.
Doesn't GitHub have a way of only sending a webhook with a source commit to the repo and not a tag?
When you register for a given type of webhook with GitHub, you get notifications for every webhook of that type. Filtering is not possible for efficiency reasons, since GitHub sends massive numbers of webhook payloads. The assumption is that your service will discard any events you don't care about.
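For illustration, a receiving service could discard tag pushes by looking at the ref field of the push payload; a minimal sketch (the handler name is made up, and it assumes the standard GitHub push payload layout):

    import json

    def handle_push_event(raw_payload: bytes) -> None:
        """Handle a GitHub 'push' webhook payload, ignoring tag pushes."""
        payload = json.loads(raw_payload)
        ref = payload.get("ref", "")
        if ref.startswith("refs/tags/"):
            # A tag was pushed; discard the event so nothing is rebuilt.
            return
        # Otherwise it's a branch push (refs/heads/...); trigger the build here.
        print(f"Branch push to {ref}, commit {payload.get('after')}")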
If you don't want Jenkins to build when a tag is pushed, then configure it not to do that. From some quick Googling, it appears you can control which refs get built, so you may want to configure it to build just refs/heads/*, which doesn't include tags.
Gerrit allows associating external changes with a change via a "Depends-On" line in the commit message. However, by the looks of it, the REST API does not expose these dependencies.
I can of course get the commit message, parse it, and then fetch the change request for the external change.
Does anyone know of a more streamlined option to achieve the same?
You can get the related changes using the REST API:
'GET /changes/{change-id}/revisions/{revision-id}/related'
Retrieves related changes of a revision. Related changes are changes
that either depend on, or are dependencies of the revision.
Request:
GET /changes/gerrit~master~I5e4fc08ce34d33c090c9e0bf320de1b17309f774/revisions/b1cb4caa6be46d12b94c25aa68aebabcbb3f53fe/related HTTP/1.0
See more info in the Gerrit documentation here.
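For example, the endpoint can be called from a small Python script like this (the host is a placeholder, the change ID is the one from the example request above, and "current" is used as the revision; authenticated servers again need the /a/ prefix plus credentials):

    import json

    import requests

    GERRIT_URL = "https://gerrit.example.com"   # placeholder host
    CHANGE_ID = "gerrit~master~I5e4fc08ce34d33c090c9e0bf320de1b17309f774"
    REVISION = "current"                        # or an explicit revision SHA-1

    resp = requests.get(
        f"{GERRIT_URL}/changes/{CHANGE_ID}/revisions/{REVISION}/related"
    )
    resp.raise_for_status()

    # Strip Gerrit's ")]}'" prefix before parsing the JSON body.
    related = json.loads(resp.text.split("\n", 1)[1])

    for change in related.get("changes", []):
        print(change.get("_change_number"), change["commit"]["subject"])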
I am working on a plugin for TFS to hook certain operations. I'm able to successfully hook code pushes using an ISubscriber on PushNotification, but I'm having trouble finding any type that matches up with the completion of a Pull Request.
A little more on what I'm trying to do. I currently have a PushNotification hook that has some branch specific checks that it does. Some reject a push, others provide notifications to users using some complex rules. I need a way to be able to provide the notifications at a minimum, and ideally prevent the pull request from going through. I can't provide notifications prior to the pull request going through as the notifications should only occur for code being placed in our main repository.
Long term, I want to switch it over to using the webhooks and some async approval, but I don't have the time to adapt the tools to work like that and set up the additional server needed to make that happen. If there's no good solution, I'll simply disable pull requests for now until I can write proper services for it, but if there is a way to reuse the adapted hooks that run on the PushNotification ISubscriber, it would be extremely helpful.
Is it possible to set up TFS/Test Manager so that it sends out an email after a test fails?
Yes, it is possible but it requires quite a lot of changes/additions to the process template and possibly a custom-made activity.
After the tests have run, we check whether BuildDetail.BuildPhaseStatus has the status Failed.
We send mail to everyone who has changesets committed to this build: the build goes through BuildDetail.AssociatedChangesets (you need to have AssociateChangesetsAndWorkItems turned on) and gets the committer usernames.
Unfortunately for us, there's no good correlation between TFS username and e-mail address at our place, so we had to create a custom activity that looks it up in AD.
The actual e-mail is sent with the BuildReport action from the Community TFS Build Extensions. We modified the XSLT, but that's not really necessary. We also wanted to include a listing of the failed tests, and that required modification of the action itself (test data isn't included by default).
Looking at this description and all the work made to get this working, I'm beginning to wonder if it was worth it ;).