I've installed the Skip Certificate Check plugin on our Jenkins server and re-downloaded slave.jar and the JNLP file to the agent, but when I start the agent I still get the message about the untrusted SSL certificate.
Any advice?
Thanks in advance,
Christian
Add -noCertificateCheck to the arguments for slave.jar.
If I understand the Skip Certificate Check Plugin correctly (I haven't used it), it makes the Jenkins master process skip certificate checks, e.g. to ignore the warning when checking for updates if you have a self-signed certificate on your Subversion server.
If I understand your problem correctly, it is about getting the slave to connect to a master that has a self-signed certificate. The switch above makes the slave process ignore certificate warnings when it connects to the master.
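For reference, a JNLP slave launch with that switch could look like the following sketch; the host name, agent name and secret are placeholders for your own setup:
# illustrative only: jenkins.example.com, my-agent and SECRET are placeholders
java -jar slave.jar -jnlpUrl https://jenkins.example.com/computer/my-agent/slave-agent.jnlp -secret SECRET -noCertificateCheck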
Azure DevOps Server 2020 with self hosted agents on a different server
I have a build that creates an artifact in Azure Artifacts; a release pipeline is then triggered which should download this artifact and do something with it. Previously, the output of a build was on a file share and it worked as expected. When I switched to Azure Artifacts, the download artifact task (which is added automatically by Azure DevOps) fails with the following:
All other build tasks work on this server without a problem, and this release pipeline also works as expected on other servers. How can I start troubleshooting this issue, given that I don't see any meaningful error message?
According to your screenshot, I could reproduce a similar issue in Azure DevOps Server.
In the failing task, I could see the error message:
Failed in getBuildApi with error: Error: self signed certificate.
If you have the same issue, you could try the following methods:
1. You could re-configure the agent with the self-signed certificate:
.\config.cmd --sslcacert ca.pem xxxx
There is a ticket with the detailed steps that you could refer to.
2. You could check your firewall settings; firewalls can block the download of artifacts.
3. You could set the system environment variable NODE_TLS_REJECT_UNAUTHORIZED=0 and restart the agent service (a sketch for a Windows agent follows below).
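To illustrate method 3 on a Windows agent (the service name is a placeholder; check your services list for the actual agent entry):
rem illustrative only: sets the variable machine-wide, then restarts the agent service
setx NODE_TLS_REJECT_UNAUTHORIZED 0 /M
net stop vstsagent.myorg.myagent
net start vstsagent.myorg.myagent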
We had a similar problem with self-hosted agents, but with a minor difference: our agents were deployed in a vnet.
If that's your case, make sure the Microsoft.Storage service endpoint is enabled on the vnet's subnet (see the example below).
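With the Azure CLI, enabling the endpoint on the subnet looks roughly like this (the resource names are placeholders):
az network vnet subnet update --resource-group my-rg --vnet-name my-vnet --name my-subnet --service-endpoints Microsoft.Storage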
Hope this will save someone tons of time.
I'm new to Jenkins.
I started by creating a Jenkins job that pulls code from GitLab into Jenkins. This job did not work: I got an issue saying that Jenkins does not trust the self-signed certificate used by the GitLab server.
The issue is shown in the screenshot.
Can I do some configuration from the Jenkins interface (the web UI) to allow cloning from a server that uses a self-signed certificate?
Any help is really appreciated.
Thank you
Caution: use this only if you understand the security issues this behaviour introduces.
My company hosts GitHub Enterprise with a custom certificate, so I faced a similar situation. The following is the workaround I used to mitigate the issue.
Log in to the Jenkins server as the jenkins user (I used sudo su jenkins in my case).
Add the following lines to ~/.gitconfig and save.
[http]
sslVerify = false
Restart the Jenkins server.
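Equivalently, the same entry can be written from the command line while logged in as the jenkins user; this edits ~/.gitconfig for you:
git config --global http.sslVerify false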
From the image, you haven't specified any credentials when trying to access your repo; generally you would provide some form of credentials.
Apart from that, your computer doesn't trust the certificate from GitLab; you have to add the certificate to your Git installation, or point Git at it, as in the example below.
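For example, instead of disabling verification outright, Git can be pointed at the GitLab server's certificate through its configuration; the path below is a placeholder:
git config --global http.sslCAInfo /path/to/gitlab-server.crt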
This link will give you a detailed explanation :)
server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none
I'm trying to connect GitLab CE 8.16 with Jenkins 2.46.1 using the GitLab hook plugin 1.4 to trigger builds on push or merge.
So I checked "Build when a change is pushed to GitLab", copied the GitLab CI Service URL (http://server:port/project/my-project) and the security token into the GitLab webhook, disabled SSL verification, and when I clicked Test I got this error:
Hook execution failed: execution expired
What am I doing wrong, please? How can I make it work?
There are a few things needed to make it work; there is documentation here:
https://github.com/jenkinsci/gitlab-plugin#global-plugin-configuration
So:
Make sure the Jenkins user that you use on the GitLab side has the proper permissions: it needs project access, and its APITOKEN needs to be available.
Create the webhook on the project in GitLab that corresponds to the project in Jenkins (the Jenkins project that uses the git repo you are working with)
In GitLab, when you create webhooks to trigger Jenkins jobs, use this format for the URL and do not enter anything for 'Secret Token': https://USERID:APITOKEN@JENKINS_URL/project/YOUR_JOB
You can use a non-https link too and skip SSL verification if the certificate is not valid. Either way, the gitlab server has to be able to connect to the name and port you are using there.
Hit Test and it should work. If not, you might not be able to connect to the server: make sure your Jenkins server is listening on the URL and port that you are using; the error suggests that this is what is wrong.
It's possible that the GitLab server is not allowed to connect to the internet, or to the network your Jenkins server is on, or there might be a firewall blocking the port you are trying to reach (80/443) on the Jenkins machine.
Try, for example, a curl to the Jenkins server and see what comes back:
curl http://your.jenkins.fqdn/
If you don't get something like:
<html><head><meta http-equiv='refresh' content='1;url=/loginEntry?from=%2F'/><script>window.location.replace('/loginEntry?from=%2F');</script></head><body style='background-color:white; color:white;'>
Authentication required
<!--
You are authenticated as: anonymous
Groups that you are in:
Permission you need to have (but didn't): hudson.model.Hudson.Read
... which is implied by: hudson.security.Permission.GenericRead
... which is implied by: hudson.model.Hudson.Administer
-->
</body></html>
then you cannot connect.
If it's not the Jenkins server where the issue is, you need to ask the network people that manage the server about it.
Hope that helps, good luck!
Make sure to use the latest 1.4 GitLab hook plugin (1.4.3, March 2016)
Look into your GitLab production.log, as in this issue, and see if this is a proxy configuration problem.
That should at least give you the context of that error message.
Here is what worked for me:
Ensure there is a merge request, even if you don't intend to actually merge any branches.
Go to branches -> select 'merge request' for a branch to merge -> create the request
Now try to test the integration.
I need to run Jenkins over HTTPS.
I created the certificate and put it into the keystore. Then I launched Jenkins with the following options:
--httpsPort=8443 --httpsKeyStore=/etc/pki/java/cacerts --httpsKeyStorePassword=changeit
So far so good, but I got a "peer not authenticated" error when deploying a file to Artifactory.
According to a solution here, I added my Artifactory certificate to the keystore.
Now I can deploy files to Artifactory, but Jenkins picks up the wrong entry from the keystore (the Artifactory one), so the wrong certificate is associated with Jenkins.
I was not able to specify which alias Jenkins should use.
According to Jenkins docs it's possible to run Jenkins with the following options:
--httpsPort=443 --httpsCertificate=path/to/cert --httpsPrivateKey=path/to/privatekey
But in this case I'm not able to use my Artifactory certificate.
How can I run Jenkins with both the Jenkins and Artifactory certificates?
Trying different things, I came up with the idea of renaming the alias of the Jenkins certificate.
The new alias is jenkins.
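For reference, renaming an alias inside a keystore can be done with keytool; this is an illustrative command using the keystore and password from the question and a placeholder for the old alias:
keytool -changealias -keystore /etc/pki/java/cacerts -storepass changeit -alias OLD_ALIAS -destalias jenkins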
Strangely enough, that solved my problem. To me it looks more like a hack than a solution; post your real solution here and I'll accept it.
There are lots of questions on here about Permission denied (publickey) errors when using the Jenkins git plugin.
Can someone explain the authentication flow this plugin uses to check out a repository? I can't find a good description on the plugin page.
I want to just SSH into the build slave, checkout the repository there, then run my job, but clearly that is not how it works.
I guess I could add my credentials to the Jenkins master, but I don't want any code there. I want it on my build slave.
The issue has nothing to do with Git itself, really. As the documentation states, the plugin relies on the git runtime, which in turn relies on the system environment when it comes to secure connections. SSH requires the client to present a valid key to connect and fails with that message if the client does not provide one. Without any additional steps, no key is injected into the environment, so the client cannot offer a valid key.
What you can actually use is the SSH Agent plugin. It lets you add a key to an ssh-agent on the slave, which git then picks up (see the sketch below).
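In essence, the plugin does on the slave what this shell sketch does by hand (the key path and repository URL are placeholders):
# start an agent and export SSH_AUTH_SOCK for this shell
eval "$(ssh-agent -s)"
# load the private key so git's ssh client can present it
ssh-add /path/to/private_key
git clone git@gitlab.example.com:group/repo.git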