I'm currently running Jenkins LTS in Docker and I wanted to try the Docker Swarm Plugin. However, I can't seem to find the "Docker Host Certificate Authentication" credentials anywhere when adding a cloud provider. See image: Credentials
Is it a plugin that I need to install?
My current docker plugins:
docker-commons 1.16
docker-java-api 3.0.14
docker-plugin 1.1.9
docker-swarm 1.8
docker-workflow 1.21
I'm at a complete loss; any help would be appreciated!
As described in the changelog of the docker-commons plugin, "Docker Host Certificate Authentication Credentials" was renamed to "X.509 Client Certificate" in 1.16.
Link: https://github.com/jenkinsci/docker-commons-plugin/releases/tag/docker-commons-1.16
When I try to link a Jenkins project to my GitLab project, I get the following error:
Here is the form and the error message.
The Jenkins project "test" does exist and the credentials are correct.
The issue is probably not the credentials; rather, GitLab does not seem to be able to read the fields.
I tried with both Chrome and Firefox.
I also tried using a webhook, but the documentation says that may be hazardous for services hosted on the same network, so I'd rather use the first method.
Some information about my environment:
Linux CentOS 7
Jenkins and GitLab are run with docker-compose
hosted on localhost
Jenkins and GitLab use different ports (8080 and 8081); see the reachability check sketched below
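Since both services run on the same host via docker-compose, one quick thing to verify is whether the GitLab container can reach the Jenkins container at all. This is only a sketch, assuming the compose services are named gitlab and jenkins in docker-compose.yml (adjust the names and port to your setup):
# run from the host; if this times out, GitLab cannot reach Jenkins either
docker-compose exec gitlab curl -sI --max-time 10 http://jenkins:8080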
I found only one thread on the internet about it, here, but no answer was given.
Any ideas?
Thanks
This is my first post, I hope I did not make any mistakes.
I want to use OAuth 2.0 to authenticate and authorize the MQTT protocol in RabbitMQ.
I found the rabbitmq-auth-backend-oauth2 plugin, which is meant for this purpose.
However, I cannot install this plugin on the RabbitMQ server.
OS: CentOS
RabbitMQ 3.7.14
Erlang 21
I tried to install the plugin with the following command, but it always fails:
make run-broker RABBITMQ_CONFIG_FILE=demo/symmetric_keys/rabbitmq
Please let me know the deployment model for this integration and the configuration for RabbitMQ, as well as how to install that plugin (if needed, of course).
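For reference, make run-broker is a development target used from a source checkout of the plugin, not on a packaged server. On a packaged RabbitMQ 3.7 install, the usual route is to drop a pre-built .ez archive for the plugin into the server's plugins directory and enable it. This is only a sketch; the archive name and plugins path are assumptions based on the RPM install layout and would need to match your RabbitMQ 3.7.14 / Erlang 21 versions:
# copy the pre-built plugin archive into the broker's plugins directory (path and file name are examples)
cp rabbitmq_auth_backend_oauth2-3.7.x.ez /usr/lib/rabbitmq/lib/rabbitmq_server-3.7.14/plugins/
# enable the plugin and restart the broker
rabbitmq-plugins enable rabbitmq_auth_backend_oauth2
systemctl restart rabbitmq-server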
I am trying to trigger a Jenkins build whenever there is a push to GitLab.
I am referring to https://github.com/jenkinsci/gitlab-plugin.
When I test the connection for the webhook, it shows "execution expired".
I am using:
Jenkins ver. 2.60.1
GitLab version 9.4.0-rc2-ee
GitLab plugin 1.4.6
The exact error message when clicking "Test setting" from GitLab:
We tried to send a request to the provided URL but an error occurred: execution expired
As mentioned in issue 128:
This looks and sounds like a configuration or network error.
Maybe your machine is not publicly available on the webhook address (firewall etc).
For instance, on a DigitalOcean server, you would need to open up the port (mentioned in git-auto-deploy.conf.json) in the firewall:
sudo ufw allow 8866/tcp
Double-check, though, what you put in Manage Jenkins > Configure in terms of GitLab information (connection name, host URL, credentials), as mentioned in jenkinsci/gitlab-plugin issue 391.
See GitLab Integration Jenkins: Configure the Jenkins server
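A quick way to tell a reachability problem from a plugin problem is to call the webhook URL from the GitLab server itself; if it times out there, the webhook will time out too. A rough sketch only; the host, port and job name are placeholders for your own setup:
# run on the GitLab host; --max-time turns a silent hang into an obvious failure
curl -sv --max-time 10 http://JENKINS_HOST:8080/project/YOUR_JOB_NAME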
It usually means there is a connectivity issue between the Jenkins server and the GitLab (or GitHub) server.
Here is what happened in my case:
I had set my local IP and port in the webhook URL, local-IP:port/project/jenkins_project_name:
http://192.168.1.21:8080/project/jenkins_project_name
and set that URL in the GitLab webhook; that shouldn't work, right?
Because it's a private, non-routable IP.
So later I realized this, set the public IP, and then the hook worked:
http://public_IP:8080/project/jenkins_project_name
Note: for the public IP to be reachable, you need to forward the port on your router (e.g. 8080 in my case, or whichever port you use); it also helps to first confirm that Jenkins is listening on that port locally, as sketched below.
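A small check for that, before touching the router (the port is an example; adjust to yours):
# on the Jenkins host: list listening TCP sockets and look for the Jenkins port
sudo ss -tlnp | grep 8080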
Hope this works.
I have faced the same issue.
In my case, Jenkins was running on an AWS EC2 instance. I resolved the issue by whitelisting the public IP addresses of GitLab on port 443 in the instance's security group.
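If you manage the security group from the CLI rather than the console, the rule looks roughly like this; it is only a sketch, and the security group ID and CIDR are placeholders you would replace with your instance's group and GitLab's published IP ranges:
# allow inbound HTTPS from a GitLab IP range into the instance's security group (ID and CIDR are examples)
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 443 --cidr 203.0.113.0/24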
We recently created a Docker registry in Artifactory. In order to set it up properly, we had to configure a reverse proxy. Instructions here: https://www.jfrog.com/confluence/display/RTF/Configuring+a+Reverse+Proxy
After setting up the proxy, when we try to log in to Artifactory using the following command:
docker login
we get a 404 error. When we try to use the base64-encoded form of our password, the password is not recognized.
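For context, with a reverse proxy in front of the registry, docker login is normally pointed at the registry hostname that the proxy serves rather than at the bare Artifactory URL; the hostname below is only an illustration of that form, not our actual setup:
# log in against the registry hostname exposed by the reverse proxy (hostname is an example)
docker login docker.artifactory.example.com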
Has anyone else experienced this issue? If you have and resolved it, can you please explain how you did it?
Artifactory Version: 4.14.2
Docker Version: 1.12.3
Thank you
I have installed Docker on my Ubuntu 14.04 OS. In Docker containers I'm running a Puppet master and a Puppet agent, but I'm getting errors during the certificate exchange.
The Puppet agent is not requesting certificates, and it also shows an error saying the name cannot be resolved.
I checked the IP and hostname in /etc/hosts and /etc/hostname.
root@55fe460464d3:/# puppet agent --test
Error: Could not request certificate: getaddrinfo: Name or service not known
Exiting; failed to retrieve certificate and waitforcert is disabled
root@f7d7516d720e:/# puppet cert list --all
+ "f7d7516d720e" (SHA256) D1:6C:50:5B:BD:F6:AA:91:C4:B2:FD:4D:58:B8:DF:18:32:F4:EB:D7:B2:75:FF:E4:AF:7B:F6:F6:FE:0D:84:54
The puppet cert list --all command shows only the master's certificate, not the agent's certificate.
What it looks like is happening is that the puppet agent can't talk to or find the puppetmaster to ask for a certificate.
The first thing to check would be that they can talk to each other over the network; the second is that the short hostname puppet resolves to the puppet master when looked up on the agent's host. Unless you've specified a different DNS name in /etc/puppet/puppet.conf by setting a server = directive in the [main] section, or specified it on the command line with puppet agent -t --server <foo>, the agent will look for a host called puppet and rely on your /etc/resolv.conf search domains to find it.
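As a rough sketch of those two checks from inside the agent container (the master's hostname below is an example; use whatever name your master container actually has):
# check that the master's name resolves from inside the agent container
getent hosts puppet
# run the agent once against an explicitly named master, bypassing the default "puppet" lookup
puppet agent --test --server puppet.example.com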