I have been trying to get this to work for days. Can anyone point me in the right direction (a tutorial, etc.) for getting Let's Encrypt certs to work with Solr?
I have a Rails site running on 443 with Let's Encrypt and have added Solr on the same machine, on a different port. Solr is currently running SSL with self-signed certs.
Can I use the same certs as the main site, since both the site and Solr can be accessed on the same URL?
Obviously, the Solr instance is secured via iptables, as that seemed an easier setup than reverse proxying while trying to use the same cert.
Any pointers or hints greatly appreciated!
Thanks
Sometimes I overlook the obvious!
As I already have a key for the domain, and Solr responds on mydomain.com:8983, all that is needed is to create a Java KeyStore (JKS) from the existing keys on the system.
So all that was needed was:
openssl pkcs12 -export -in /etc/letsencrypt/live/mydomain.com/fullchain.pem -inkey /etc/letsencrypt/live/mydomain.com/privkey.pem -out pkcs.p12 -name NAME
specifying the location of the Let's Encrypt cert (on my system, /etc/letsencrypt/live/mydomain.com/).
Then convert the PKCS12 keystore to a JKS:
keytool -importkeystore -deststorepass PASSWORD_STORE -destkeypass PASSWORD_KEYPASS -destkeystore keystore.jks -srckeystore pkcs.p12 -srcstoretype PKCS12 -srcstorepass STORE_PASS -alias NAME
replacing the passwords where needed (the -srcstorepass must match the export password used in the previous step).
I would have thought the best practice here would be to automate this in a bash script that runs when the Let's Encrypt certs are renewed.
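For example, here is a minimal sketch of such a renewal hook, assuming certbot handles the renewal and that Solr runs as a systemd service and reads its keystore from /var/solr/etc; the paths, alias, and passwords are placeholders:
#!/usr/bin/env bash
# Hypothetical renewal hook, e.g. /etc/letsencrypt/renewal-hooks/deploy/solr-keystore.sh
# Paths, alias and passwords below are placeholders; adjust them for your system.
set -euo pipefail
LIVE=/etc/letsencrypt/live/mydomain.com
OUT=/var/solr/etc
ALIAS=mydomain
PASS=changeit
# Re-export the renewed cert and key into a PKCS12 bundle
openssl pkcs12 -export -in "$LIVE/fullchain.pem" -inkey "$LIVE/privkey.pem" -out "$OUT/pkcs.p12" -name "$ALIAS" -passout "pass:$PASS"
# Rebuild the JKS that Solr is configured to use
rm -f "$OUT/keystore.jks"
keytool -importkeystore -noprompt -srckeystore "$OUT/pkcs.p12" -srcstoretype PKCS12 -srcstorepass "$PASS" -destkeystore "$OUT/keystore.jks" -deststorepass "$PASS" -destkeypass "$PASS" -alias "$ALIAS"
# Restart Solr (assumed to be a systemd service named "solr") so it picks up the new keystore
systemctl restart solr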
We need to migrate the Jenkins URL from http to https.
We have a server where Jenkins is installed and the jobs are running fine.
Now, as a security enhancement, we need to migrate to HTTPS.
We received a PKCS12 certificate. To install the certificate, the following steps were followed:
Using keytool, converted the PKCS12 to JKS format.
Command used:
keytool -v -importkeystore -srckeystore D:\Installations\JENKINS_HOME\httpsCertificate\certificate.p12 -srcstoretype PKCS12 -destkeystore D:\Installations\JENKINS_HOME\httpsCertificate\certificate.jks -deststoretype JKS
To include the JKS certificate, changed the jenkins.xml file with the following arguments:
--httpPort=-1 --httpsPort=8443 --httpsKeyStore="D:\Installations\JENKINS_HOME\httpsCertificate\certificate.jks" --httpsKeyStorePassword=***** --httpsListenAddress="0.0.0.0"
Restarted the Jenkins service.
But after the restart, even though the site is mapped to HTTPS, the browser shows it as not secure.
[screenshot: the Jenkins URL after the migration process]
The IT team sent me a "cert.pfx" SSL certificate file to use for our "subdomain.domain.com" web site.
I have to add this SSL cert to Jenkins, which as far as I know uses *.jks files, but I'm not sure how to convert the PFX to a JKS.
I used the command below; it creates a JKS file, but Jenkins gives an error.
keytool -importkeystore -srckeystore mypfxfile.pfx -srcstoretype pkcs12 -destkeystore clientcert.jks -deststoretype JKS
Any help appreciated.
Thanks!
The above command to generate a .jks from a .pfx looks fine; make sure that you have given a password to your .jks file (it is best practice to protect the .jks file with a password).
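For example, the same conversion with the passwords supplied explicitly (the passwords shown are placeholders):
keytool -importkeystore -srckeystore mypfxfile.pfx -srcstoretype pkcs12 -srcstorepass your-pfx-password -destkeystore clientcert.jks -deststoretype JKS -deststorepass your-keystore-password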
The next step is to make sure that %Jenkins_Home%\jenkins.xml has the correct configuration for the .jks file.
Here is an example of the Jenkins HTTPS connection settings:
--httpPort=-1 (to stop Jenkins from listening over plain HTTP)
--httpsPort=8080 (or 8181 or whatever SSL port you want Jenkins to listen on)
--httpsKeyStore="<JavaKeystore_path>\clientcert.jks"
--httpsKeyStorePassword="<cleartext-password-to-keystore>"
After modifying jenkins.xml, restart the Windows Jenkins service and confirm it is running.
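For context, these flags live inside the <arguments> element of jenkins.xml; a hypothetical excerpt, with the keystore path and password as placeholders and other installed arguments left as they were:
<arguments>-Xrs -Xmx256m -jar "%BASE%\jenkins.war" --httpPort=-1 --httpsPort=8443 --httpsKeyStore="%BASE%\httpsCertificate\clientcert.jks" --httpsKeyStorePassword=your-keystore-password --httpsListenAddress="0.0.0.0"</arguments>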
I am in the process of creating an app using the following stack:
Python 3.6 + Flask
uwsgi
Nginx
all executing inside a Docker container.
This app in turn calls the Jira API to gather and manipulate data. Most of the inbound requests to the app are working fine, but when the app tries to call the Jira API, it throws the following error:
[SSL: CERTIFICATE_VERIFY_FAILED]
I believe this issue is occurring due to the presence of a self-signed certificate in the chain (which is unavoidable).
I did import the certs into the Docker image, and the curl command then worked fine (initially curl was also throwing the insecure warning).
Also, in order to isolate the issue, I switched off nginx and launched the app directly using uwsgi (uwsgi --socket 0.0.0.0:8080 --protocol=http -w [module]:[app]) and saw the same error in the uwsgi console.
Does this mean that I need to import the SSL certificate into uwsgi?
If so, how exactly do I do that? I don't intend to secure my app with its own certificate or keys.
The program works if I run the Flask app on its own, without uwsgi, nginx, or Docker.
[edit] Adding Nginx config
server {
    listen 8080;
    location / {
        try_files $uri @app;
    }
    location @app {
        include uwsgi_params;
        uwsgi_pass unix:///tmp/uwsgi.sock;
    }
    location /static {
        alias /app/static;
    }
    location = / {
        index /static/index.html;
    }
}
[screenshot: uwsgi invoked directly from the CLI for debugging]
[Edit2]
So I did some more troubleshooting:
Created one simple script that just calls the Jira URL.
Ran the script on my local machine (macOS) using python3 [scriptname]. This worked fine and printed a 200 OK.
Copied the same file into my container and ran the same code.
Got the same issue.
Used the same URL with curl and it got the response.
It seems that even though curl works fine, Python itself is throwing the SSL error.
So now the question may be: how do I handle the SSL error in Python?
I was finally able to resolve the issue. It was related to the requests call being made by the code.
By default, requests calls have verify set to True, and requests uses its own bundled CA certificates for verification.
In my case, since I was using custom certs, requests was failing to validate the Jira site.
In order to solve the issue, I created an ENV variable in the Dockerfile and pointed requests at my image's certs directory.
Exact line:
ENV REQUESTS_CA_BUNDLE=/etc/ssl/certs/
Do note, the custom certs still need to be imported into the Docker image during the build; a sketch of that follows below.
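A minimal Dockerfile fragment showing both steps, assuming a Debian/Ubuntu-based image with the ca-certificates package installed; the certificate filename is a placeholder:
# Hypothetical Dockerfile fragment; the CA file name is a placeholder.
COPY internal-ca.crt /usr/local/share/ca-certificates/internal-ca.crt
RUN update-ca-certificates
# Point the requests library at the system certificate directory
ENV REQUESTS_CA_BUNDLE=/etc/ssl/certs/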
This error also happens when the .crt or .pem file is not a bundled (full-chain) certificate file.
To make a bundle, issue this command:
openssl pkcs12 -in my_certificate.pfx -nodes -nokeys -out server-bundle.pem
If you also need your server key, then issue:
openssl pkcs12 -in my_certificate.pfx -nocerts -nodes -out server.key
Then update the SSL certificate entries in your Apache or nginx config file (an nginx sketch follows after the commands below), restart, and verify with:
http https://yourdomain.com
or
http --debug -j --verify server-bundle.pem https://yourdomain.com
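For reference, the nginx entries mentioned above might look something like this; a sketch only, with the server name and file paths as placeholders:
# Hypothetical nginx server block; server_name and file paths are placeholders.
server {
    listen 443 ssl;
    server_name yourdomain.com;
    # The bundled certificate (server cert plus intermediates) and the private key
    ssl_certificate /etc/nginx/ssl/server-bundle.pem;
    ssl_certificate_key /etc/nginx/ssl/server.key;
    location / {
        # ... your usual proxy / uwsgi configuration ...
    }
}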
I'm trying to make requests to a private Docker registry, but it requires me to log in and responds with a 401. I've checked the docs, but they don't say anything about the authorization process. So my question is: how do I successfully make HTTP requests to a private Docker registry with authorization enabled, using the REST API?
https://www.digitalocean.com/community/tutorials/how-to-set-up-a-private-docker-registry-on-ubuntu-14-04
That article was extremely helpful for me in setting up a secure private Docker registry. Goes through everything you'll need.
(This part, https://www.digitalocean.com/community/tutorials/how-to-set-up-a-private-docker-registry-on-ubuntu-14-04#step-four-—-secure-your-docker-registry-with-nginx, talks about securing the registry with basic HTTP authentication.)
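Once basic auth is in place, requests to the registry's REST API just need to carry the credentials. A sketch, assuming the nginx basic-auth setup from that tutorial; the hostname and credentials are placeholders, and the /v2/_catalog path applies to v2 registries (older v1 registries expose endpoints such as /v1/search instead):
# Hypothetical example; registry hostname, user and password are placeholders.
curl -u myuser:mypassword https://registry.example.com/v2/_catalog
# Equivalent with an explicit Authorization header (base64 of "user:password")
curl -H "Authorization: Basic $(echo -n 'myuser:mypassword' | base64)" https://registry.example.com/v2/_catalog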
The Docker registry requires SSL to be set up; you will have to configure SSL to get it to work.
I tried following the tutorial on digitalocean (https://www.digitalocean.com/community/tutorials/how-to-set-up-a-private-docker-registry-on-ubuntu-14-04)
But there are a number of little problems with it, and I feel it doesn't quite do what I need it to. I tried the SSL instructions verbatim and they didn't work for me. Here is what I had to do to get set up with SSL (using fairly generic names) using a self-signed certificate:
Make a directory to store the SSL cert:
mkdir /etc/nginx/ssl
Create a certificate and key file:
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/nginx/ssl/nginx.key -out /etc/nginx/ssl/nginx.crt
Remember to set the common name to your domain name, as per the instructions everywhere else.
Create a new directory under the ca-certificates directory:
mkdir /usr/share/ca-certificates/nginx
Copy the certificate file to that directory:
cp /etc/nginx/ssl/nginx.crt /usr/share/ca-certificates/nginx/
Append the following to the /etc/ca-certificates.conf file:
nginx/nginx.crt
Then update the certificates:
update-ca-certificates --fresh
I'm working on an app for a client that requires an SSL connection with an API. I've been provided with three files: a trusted root certificate (.cer) file, an intermediate certificate (.cer) file, and a signed response file. The instructions I've been given to install these relate to either IIS or the Java keytool program; I'm building the app in Ruby on Rails, so neither is an option (as far as I am aware).
The certificates are self-signed by the organisation that runs the API service, and it appears I am given client certificates to mutually authenticate an HTTPS connection. I'm unsure:
how to use the certificates in my application to connect to and use the API
what the signed response file does
I've read "Using a self-signed certificate" and this article on OpenSSL in Ruby but neither seems to quite hit the spot (and both have some reliance on Java/JRuby which confuses things).
Any pointers would be greatly appreciated.
Based on your comments, I'm assuming that the certificates are in DER format, which you can convert to PEM with the openssl x509 command (see the openssl x509 documentation):
openssl x509 -inform DER -outform PEM -in certfile.cer -out certfile.pem
After that, you can instruct the Ruby OpenSSL library to use the trusted root certificate to authenticate the SSL connection with something like this:
require 'socket'
require 'openssl'
tcp_sock = TCPSocket.new("my.host.tld", 443)
ctx = OpenSSL::SSL::SSLContext.new
ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER|OpenSSL::SSL::VERIFY_FAIL_IF_NO_PEER_CERT
# You may need to specify the absolute path to the file
ctx.ca_file = "certfile.pem"
ssl_sock = OpenSSL::SSL::SSLSocket.new(tcp_sock, ctx)
ssl_sock.sync_close = true
ssl_sock.connect
begin
  ssl_sock.post_connection_check('my.host.tld')
rescue
  puts "Certificate host did not match expected hostname"
end
After that, you should be able to read and write to ssl_sock like any other Ruby IO object. If you are given a client certificate to use to allow the server to authenticate you, you can configure the SSL context with:
ctx.cert = OpenSSL::X509::Certificate.new(File.read("my_cert.pem"))
ctx.key = OpenSSL::PKey::RSA.new(File.read("my_key.rsa"))
before you create ssl_sock. The OpenSSL library also supports other key types besides RSA, such as DSA (see the OpenSSL::PKey module).
Finally, a last piece of advice, if you are accessing a RESTful API, you may want to consider using a gem like rest-client instead of handling all of the HTTP/S connection stuff directly. Whether or not such a library is appropriate or useful will depend on the service you are using, of course.