Is it possible to programmatically create and register a runner in Bitbucket Pipelines, i.e. without having to create it first via the Bitbucket UI?
The docker command provided requires a runner UUID, which is only generated when the runner is created through the UI. Is there a way to create it programmatically through the Bitbucket API? It seems a bit backward to have to create the runner first just to get the UUID so you can then deploy it.
With GitHub Actions self-hosted runners, a runner can be created and registered to GitHub using a temporary token, but it does not seem like Bitbucket has adopted this approach, at least not yet.
At the time of writing the Bitbucket API does not allow for this. There are two open feature requests for Bitbucket Runner APIs, BCLOUD-21708 and BCLOUD-21309, that may benefit from some votes.
I want to implement pre-receive hooks on the GitLab server side, but we don't have access to the file system. Is there any way I can handle this with GitLab CI? I want control over what can and can't be pushed to the repository.
One possible workaround would be for developers to:
push to a gateway repository
pull from an official one.
(both on GitLab)
You can then associate a job with the first one, triggered on push, in order to validate what has been pushed.
If validation passes, the job pushes the commit to the second, official repository.
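A minimal sketch of what that validation step could run, assuming the job is defined in the gateway repository's .gitlab-ci.yml and that an OFFICIAL_REPO_URL CI/CD variable holds the official repository's URL with push credentials (the policy checks here are only placeholders):

```python
#!/usr/bin/env python3
"""Sketch of a validation step run by a GitLab CI job on the gateway repo.

Assumptions: the job runs on push, OFFICIAL_REPO_URL is a CI/CD variable
containing the official repository URL with push credentials, and the
policy below (no secrets/, no .exe files) is only a placeholder.
"""
import os
import subprocess
import sys


def changed_files():
    # Compare the pushed commit against its parent (simplified; a real job
    # might diff against CI_COMMIT_BEFORE_SHA instead).
    out = subprocess.run(
        ["git", "diff", "--name-only", "HEAD~1..HEAD"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def validate(files):
    for path in files:
        if path.startswith("secrets/") or path.endswith(".exe"):
            print(f"Rejected: {path} is not allowed", file=sys.stderr)
            return False
    return True


def forward_to_official():
    # Push the validated commit to the official repository, same branch name.
    official = os.environ["OFFICIAL_REPO_URL"]
    branch = os.environ.get("CI_COMMIT_REF_NAME", "main")
    subprocess.run(["git", "push", official, f"HEAD:{branch}"], check=True)


if __name__ == "__main__":
    if not validate(changed_files()):
        sys.exit(1)
    forward_to_official()
```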
It's possible to implement pre-receive hooks in GitLab, but only when we have access to the file system. For now, I have added a GitLab CI job that checks every merge request to the protected branches and lets all developers push to temp branches.
Just to quote an example: with a tool like TeamCity (similar to Jenkins), one can submit a remote run, where the server applies a delta/patch of what the user is trying to commit and reports whether the changes pass the set of checks configured for that project.
With GitHub and Jenkins, can such validation be achieved with any plugins out there, so as to avoid breaking a build?
I know that with pull requests and status checks one can achieve a similar end result. But without committing/pushing to the remote Git repository, is there a way Jenkins can handle this validation and produce an initial result?
It isn't possible to have GitHub perform checks on data it doesn't have, so if you don't push the data to the remote server, GitHub won't know anything about it and therefore will do nothing.
Jenkins does have a REST API that you could use to do this, provided you equipped each developer with appropriate credentials. However, this is not a common situation and wouldn't be a recommended configuration.
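As a rough illustration of that REST API route (not a recommendation), a developer-side script could submit the local changes to a parameterized Jenkins job before pushing. The job name pre-push-validation and its PATCH parameter are hypothetical; only the /job/<name>/buildWithParameters endpoint is standard Jenkins:

```python
#!/usr/bin/env python3
"""Sketch: trigger a parameterized Jenkins job from a developer machine.

Assumptions: a job named "pre-push-validation" with a string parameter PATCH
exists, and the developer has a Jenkins user name and API token.
"""
import os
import subprocess

import requests

jenkins_url = os.environ.get("JENKINS_URL", "https://jenkins.example.com")
auth = (os.environ["JENKINS_USER"], os.environ["JENKINS_API_TOKEN"])

# Capture the local, not-yet-pushed changes as a patch.
patch = subprocess.run(
    ["git", "diff", "origin/main...HEAD"],
    capture_output=True, text=True, check=True,
).stdout

# Queue the build; Jenkins returns a Location header pointing at the queue item.
resp = requests.post(
    f"{jenkins_url}/job/pre-push-validation/buildWithParameters",
    auth=auth,
    data={"PATCH": patch},
)
resp.raise_for_status()
print("Build queued:", resp.headers.get("Location"))
```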
You'd be better off with a script in the repository that users could install as a hook or invoke from a hook that would perform the testing you want. If your CI jobs run a script in your repository, then sharing code between them should be easy.
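For the in-repository approach, a minimal sketch of a pre-push hook is below, assuming a hypothetical scripts/run_checks.py that both the hook and CI would call; developers would copy it to .git/hooks/pre-push themselves:

```python
#!/usr/bin/env python3
"""Sketch of a pre-push hook (copied to .git/hooks/pre-push, made executable).

It runs a hypothetical scripts/run_checks.py from the repository so the same
checks can also be run by CI. Users can bypass it with `git push --no-verify`,
which is why CI must remain the authoritative gate.
"""
import subprocess
import sys

result = subprocess.run([sys.executable, "scripts/run_checks.py"])
if result.returncode != 0:
    print("Push aborted: checks failed.", file=sys.stderr)
    sys.exit(1)
```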
Note that you shouldn't mandate pre-commit hooks, since they can interfere with advanced users (who may make intentionally incomplete temporary commits) and they can be disabled by users. Any sort of required checks should be done as part of CI, where policy can be enforced appropriately.
I am using the GitHub pull request builder plugin in Jenkins to make pull requests on GitHub automatically trigger Jenkins jobs.
I am using GitHub Enterprise, and when I try to get the values of the environment variables ghprbActualCommitAuthor and ghprbActualCommitAuthorEmail, I get incorrect values:
ghprbActualCommitAuthor : GitHub Enterprise
ghprbActualCommitAuthorEmail : noreply#github.***.com
Please help, thanks!
This behavior is seen in GitHub Enterprise when users commit changes directly through the web UI or have not set their email addresses.
According to GHE support:
This is by design, since the commit is actually done by the GitHub Enterprise instance. This is because we do not impersonate users when creating commits.
You can fix this by ensuring that users make commits only through Git clients, using their own SSH credentials or personal access tokens.
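If you want the build itself to flag the situation, a minimal sketch of a build step is below; the environment variable names come from the ghprb plugin, while the exact author/email values treated as "web UI" commits are assumptions you may need to adjust for your instance:

```python
#!/usr/bin/env python3
"""Sketch of a build step that flags commits made via the GHE web UI.

The environment variable names come from the GitHub pull request builder
plugin; the author/email values checked here are assumptions.
"""
import os
import sys

author = os.environ.get("ghprbActualCommitAuthor", "")
email = os.environ.get("ghprbActualCommitAuthorEmail", "")

if author == "GitHub Enterprise" or email.startswith("noreply"):
    print(f"Commit appears to be authored via the web UI ({author} <{email}>); "
          "please commit through a Git client instead.", file=sys.stderr)
    sys.exit(1)
```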
Bamboo and Bitbucket are two products from the same vendor, and there should be no problem integrating them with each other, but I have a weird situation.
Here is what I get when trying to add a Bitbucket repo to my Bamboo.
See attached screenshot.
I'm pretty sure my repo is public and that I'm using the correct Bitbucket user account name.
Thanks in advance.
I have installed Bamboo on AWS using the Java script provided by Atlassian.
At the end, it gave me web UI links, as in the following screenshot.
It worked; however, some functionality was blocked by XSRF checks (which are not enabled by default in Atlassian products).
It works fine when I use the native Bamboo URL (HTTP on port 8085) instead of HTTPS.
Be careful with that... I just wasted about 2 days trying to fix something that didn't have to be fixed at all.
Have you linked your Bitbucket repo to the Bamboo server? If not, see https://confluence.atlassian.com/bamboo/linking-to-another-application-360677713.html
Maybe this can help.
I have a scenario where I'm setting up Jenkins for my app. I have Bitbucket set up and firing the appropriate webhooks.
I want to start a build whenever a push is made to the repo as well as whenever someone creates/updates a pull request.
I've looked at the Bitbucket Plugin. It works well if I have the Bitbucket webhook fire for all pushes.
Then I added the Bitbucket Pull Request plugin to build on every pull request create/update, so I changed the Bitbucket webhook preferences to fire on pushes and PR creates/updates.
Unfortunately, these plugins have conflicting settings, so they cannot be used at the same time (as far as I can tell, the minute I send custom webhooks from Bitbucket, the first one stops working, but the second one works).
Has anyone been able to set this up correctly? Maybe there's a plugin for what I want, but I couldn't find it.
I want to keep writing a proxy in front of Jenkins to manage the webhooks (roughly sketched below) as a last resort, only if there really is nothing else I can do.
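In case it helps to clarify what I mean, a hedged sketch of such a router, which forwards Bitbucket webhooks to different Jenkins endpoints based on the X-Event-Key header so that each plugin only receives the events it understands; the Jenkins endpoint paths are assumptions to be replaced with whatever the installed plugins actually listen on:

```python
#!/usr/bin/env python3
"""Sketch of a tiny webhook router in front of Jenkins (standard library only).

Bitbucket sends every webhook here; the router forwards push events and
pull-request events to different Jenkins endpoints. JENKINS_URL and the two
endpoint paths are assumptions.
"""
import os
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

JENKINS_URL = os.environ.get("JENKINS_URL", "http://localhost:8080")
ROUTES = {
    "repo:push": "/bitbucket-hook/",               # assumed push endpoint
    "pullrequest:created": "/bitbucket-pr-hook/",  # assumed PR endpoint
    "pullrequest:updated": "/bitbucket-pr-hook/",
}


class Router(BaseHTTPRequestHandler):
    def do_POST(self):
        event = self.headers.get("X-Event-Key", "")
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        path = ROUTES.get(event)
        if path is None:
            # Ignore events neither plugin cares about.
            self.send_response(204)
            self.end_headers()
            return
        req = request.Request(
            JENKINS_URL + path,
            data=body,
            headers={
                "Content-Type": self.headers.get("Content-Type", "application/json"),
                "X-Event-Key": event,
            },
        )
        with request.urlopen(req) as resp:
            self.send_response(resp.status)
            self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), Router).serve_forever()
```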
Thanks for the help!