I am currently working on a Rails app and I want to use OpenStack object storage from OVH.
My error is:
connect_nonblock': SSL_connect returned=1 errno=0 state=unknown state:
certificate verify failed (OpenSSL::SSL::SSLError)
Unable to verify certificate. This may be an issue with the remote host or with Excon. Excon has certificates bundled, but these can be customized.
`Excon.defaults[:ssl_ca_path] = path_to_certs`,
`ENV['SSL_CERT_DIR'] = path_to_certs`,
`Excon.defaults[:ssl_ca_file] = path_to_file`,
`ENV['SSL_CERT_FILE'] = path_to_file`,
`Excon.defaults[:ssl_verify_callback] = callback` (see OpenSSL::SSL::SSLContext#verify_callback),
or `Excon.defaults[:ssl_verify_peer] = false` (less secure). (Excon::Errors::CertificateError)
Does anyone have tips on how to fix this?
I followed this tutorial in French:
https://gist.github.com/BaptisteDixneuf/85dc4419a0398446d2d3
and here is my CarrierWave config file:
CarrierWave.configure do |config|
  config.fog_provider = 'fog/openstack'
  config.fog_credentials = {
    :provider           => 'OpenStack',
    :openstack_username => ENV['OS_USERNAME'],
    :openstack_api_key  => ENV['OS_USER_MDP'],
    :openstack_auth_url => ENV['OS_AUTH_URL'],
    :openstack_region   => 'GRA1'
  }
end
As the message says, your app has trouble connecting to OpenStack because it cannot verify the certificate.
It then provides several ways to overcome the problem.
The following four are used to provide the certificate bundle manually:
`Excon.defaults[:ssl_ca_path] = path_to_certs`,
`ENV['SSL_CERT_DIR'] = path_to_certs`,
`Excon.defaults[:ssl_ca_file] = path_to_file`,
`ENV['SSL_CERT_FILE'] = path_to_file`,
The other two bypass the standard verification, respectively by checking it manually and by ignoring it:
`Excon.defaults[:ssl_verify_callback] = callback` (see OpenSSL::SSL::SSLContext#verify_callback),
or `Excon.defaults[:ssl_verify_peer] = false` (less secure).
OVH's OpenStack cloud uses valid certificates. Ensure your server has the common ca-certificates list installed, and update the OpenSSL library.
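If the system bundle is installed but Excon still cannot find it, you can point Excon at it explicitly in an initializer. This is a minimal sketch, assuming a Debian/Ubuntu-style bundle at /etc/ssl/certs/ca-certificates.crt (that path, and the initializer file name, are assumptions; adjust them for your distribution):
# config/initializers/excon_ssl.rb
# Point Excon at the system CA bundle so fog/openstack can verify OVH's certificate.
require 'excon'

ca_file = ENV.fetch('SSL_CERT_FILE', '/etc/ssl/certs/ca-certificates.crt') # assumed path
Excon.defaults[:ssl_ca_file] = ca_file if File.exist?(ca_file)

# Last resort for local debugging only, never in production:
# Excon.defaults[:ssl_verify_peer] = false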
resource "azurerm_analysis_services_server" "server" {
name = "analysisservicesserver"
location = "northeurope"
resource_group_name = azurerm_resource_group.rg.name
sku = "S0"
admin_users = ["myuser#domain.tld"]
enable_power_bi_service = true
backup_blob_container_uri = ("https://${STORAGE ACCOUNT NAME}.blob.core.windows.net/${CONTAINER NAME}%s", Blob SAS TOKEN)
*Storage firewall is disabled
*Original error: Code="BadRequest" Message="Invalid backup blob container 'The remote server returned an error: (403) Forbidden.'. Azure blob storage documentation can be found here: https://go.microsoft.com/fwlink/?linkid=2106906"
*I am able to add the same container via the portal without any error
*I also tried to copy and paste the "Blob SAS URL" directly from storage; still the same error
Tested in my environment and I get the same error, even after changing the provider version. azurerm_analysis_services_server might be outdated with respect to backup_blob_container_uri.
You can refer to this GitHub discussion, where the same error was reported a while back; you can follow up with them there, as a new release may be needed to sort out this error.
I have MQTT (VerneMQ) set up with TLS authentication. I have also set up frequent pulling of the
CRL (certificate revocation list) from a private CA. I am able to revoke a specific client certificate to block it from connecting to MQTT.
There is one certificate which is shared and which I don't want to revoke, but I also don't want that client to be able to authenticate with MQTT. The following is my configuration:
DOCKER_VERNEMQ_ACCEPT_EULA = "yes"
MY_POD_NAME = "vernemq"
DOCKER_VERNEMQ_KUBERNETES_APP_LABEL = "vernemq"
DOCKER_VERNEMQ_LOG__CONSOLE__LEVEL = "debug"
DOCKER_VERNEMQ_KUBERNETES_LABEL_SELECTOR = "app=vernemq"
DOCKER_VERNEMQ_LISTENER__TCP__ALLOWED_PROTOCOL_VERSIONS = "3,4,5"
DOCKER_VERNEMQ_ALLOW_ANONYMOUS = "on"
DOCKER_VERNEMQ_KUBERNETES_INSECURE = "1"
DOCKER_VERNEMQ_MAX_ONLINE_MESSAGES = "-1"
DOCKER_VERNEMQ_MAX_OFFLINE_MESSAGES = "-1"
DOCKER_VERNEMQ_MAX_INFLIGHT_MESSAGES = "0"
DOCKER_VERNEMQ_LISTENER__TCP__DEFAULT = "0.0.0.0:1883"
DOCKER_VERNEMQ_LISTENER__SSL__DEFAULT = "0.0.0.0:8883"
DOCKER_VERNEMQ_LISTENER__WS__DEFAULT = "0.0.0.0:8080"
DOCKER_VERNEMQ_LISTENER__HTTP__METRICS = "0.0.0.0:8888"
DOCKER_VERNEMQ_LISTENER__HTTP__DEFAULT = "0.0.0.0:8888"
DOCKER_VERNEMQ_LISTENER__SSL__REQUIRE_CERTIFICATE = "on"
# DOCKER_VERNEMQ_LISTENER__SSL__USE_IDENTITY_AS_USERNAME = "on"
DOCKER_VERNEMQ_LISTENER__SSL__CAFILE = "/vernemq/cert/ca.crt"
DOCKER_VERNEMQ_LISTENER__SSL__CERTFILE = "/vernemq/cert/server.crt"
DOCKER_VERNEMQ_LISTENER__SSL__KEYFILE = "/vernemq/cert/server.key"
DOCKER_VERNEMQ_LISTENER__SSL__CRLFILE = "/tmp/shared/ca.crl"
DOCKER_VERNEMQ_ALLOW_REGISTER_DURING_NETSPLIT = "on"
DOCKER_VERNEMQ_ALLOW_PUBLISH_DURING_NETSPLIT = "on"
DOCKER_VERNEMQ_ALLOW_SUBSCRIBE_DURING_NETSPLIT = "on"
DOCKER_VERNEMQ_ALLOW_UNSUBSCRIBE_DURING_NETSPLIT = "on"
Is there any way I can block that specific client certificate?
I'm not familiar with VerneMQ's specific options, but why not just set up an ACL to block the user represented by that certificate from subscribing or publishing to any topics?
Clients would still be able to connect with that shared certificate, but they would not be able to receive or publish any messages.
To make this work you would probably have to use the certificate identity as the username (but you appear to have commented that out of the env vars shown in the question), as in the sketch below.
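As an illustration only, here is a minimal sketch of a file-based ACL, assuming the default vmq_acl plugin is in use and DOCKER_VERNEMQ_LISTENER__SSL__USE_IDENTITY_AS_USERNAME is "on" so the certificate's common name becomes the username; the identities trusted-device and shared-device and the topic names are hypothetical, not taken from your setup:
# /etc/vernemq/vmq.acl (sketch)
# vmq_acl is an allow-list: only the listed topics are granted.
# Grant topics to the trusted certificate identity...
user trusted-device
topic write devices/trusted-device/status
topic read devices/trusted-device/commands

# ...and give the shared identity no grants at all,
# so it can connect but cannot publish or subscribe.
user shared-device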
I am trying to sync my mail from Gmail to my local server using OfflineImap v7.2.1. I followed this tutorial: Using Offlineimap with the Gmail IMAP API, and got it working!
Here is my .offlineimaprc file:
[general]
accounts = ExampleCompany
[Account ExampleCompany]
localrepository = ExampleCompanyLocal
remoterepository = ExampleCompanyRemote
postsynchook = notmuch new
#newer versions don't need this
#status_backend = sqlite
[Repository ExampleCompanyRemote]
type = IMAP
remotehost = imap.gmail.com
remoteuser = my-username@gmail.com
ssl = yes
starttls = no
sslcacertfile = /etc/ssl/certs/ca-certificates.crt
### You'll need to configure the gmail API stuff here:
auth_mechanisms = XOAUTH2
oauth2_client_id = XXXX7-eXXXX.apps.googleusercontent.com
oauth2_client_secret = 9XXXXXP
oauth2_request_url = https://accounts.google.com/o/oauth2/token
#oauth2_refresh_token = 1/ZXXXXXw
oauth2_access_token = ya29.XXXXXIHbcS
## remove Gmail prefix on IMAP folders
nametrans = lambda f: f.replace('[Gmail]/', '') if
    f.startswith('[Gmail]/') else f
[Repository ExampleCompanyLocal]
type = Maildir
localfolders = ~/mail
restoreatime = no
# Do not sync this folder
folderfilter = lambda folder: folder not in ['2007-2011-inbox']
## Remove GMAIL prefix on Google-specific IMAP folders that are pulled down.
nametrans = lambda f: '[Gmail]/' + f if f in ['Drafts', 'Starred', 'Important', 'Spam', 'Trash', 'All Mail', 'Sent Mail'] else f
I currently generate my access token and refresh token using this Python script from Google. I would, however, like these tokens to be generated from an iOS app and then sent to the backend to start syncing. I am using AppAuth to do this, but OfflineImap always errors out when using the credentials obtained from the iOS app. Error:
ERROR: All authentication types failed:
XOAUTH2: [AUTHENTICATIONFAILED] Invalid credentials (Failure)
Any idea why these credentials would be invalid? I am using the same client_id and client_secret when running the script and the app. I think I am missing something obvious.
Here is the authorization request in the app, in Swift:
// builds the authentication request
let request = OIDAuthorizationRequest(configuration: configuration,
                                      clientId: "XXXX7-eXXXX.apps.googleusercontent.com",
                                      clientSecret: "9XXXXXP",
                                      scopes: [OIDScopeEmail],
                                      redirectURL: redirectURI,
                                      responseType: OIDResponseTypeCode,
                                      additionalParameters: nil)
Thank you
I want to use the timezone gem to convert lat and long to a time zone.
I have added the gem to my Gemfile
I have created timezone.rb in config/initializers
I enabled the Google Maps Time Zone API, and added a server key
timezone.rb file:
Timezone::Lookup.config(:google) do |c|
  c.api_key = 'server_key'
end
When I type this in the Rails console:
> timezone = Timezone['America/Los_Angeles']
=> #<Timezone::Zone name: "America/Los_Angeles">
This works, but when I try to use lat and long, Google gives a connection error:
> timezone = Timezone.lookup(-34.92771808058, 138.477041423321)
Timezone::Error::Google: SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
Try checking this GitHub repository; it explains everything about using the Google Time Zone API from Ruby.
Just make sure you follow these steps:
Ensure you have a Google API key (a server key)
Enable the Google Maps Time Zone API
Configure your lookup. NOTE: in Rails it is recommended that you add this code to an initializer:
Timezone::Lookup.config(:google) do |c|
  c.api_key   = 'your_google_api_key_goes_here'
  c.client_id = 'your_google_client_id' # if using 'Google for Work'
end
For more information, you can also check this thread and SO question.
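As a quick sanity check that the key and the Time Zone API are wired up, here is a minimal sketch of a lookup using the coordinates from your question; GOOGLE_API_KEY is an assumed environment variable name, not something the gem requires:
# config/initializers/timezone.rb
require 'timezone'

Timezone::Lookup.config(:google) do |c|
  c.api_key = ENV['GOOGLE_API_KEY'] # server key with the Time Zone API enabled
end

# Then, in the Rails console:
zone = Timezone.lookup(-34.92771808058, 138.477041423321)
puts zone.name                        # should print something like "Australia/Adelaide"
puts zone.utc_to_local(Time.now.utc)  # local time in that zone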
This example code fails:
require("socket")
require("ssl")
-- TLS/SSL server parameters
local params = {
  mode = "server",
  protocol = "sslv23",
  key = "./keys/server.key",
  certificate = "./keys/server.crt",
  cafile = "./keys/server.key",
  password = "123456",
  verify = {"peer", "fail_if_no_peer_cert"},
  options = {"all", "no_sslv2"},
  ciphers = "ALL:!ADH:@STRENGTH",
}
local socket = require("socket")
local server = socket.bind("*", 8888)
local client = server:accept()
client:settimeout(10)
-- TLS/SSL initialization
local conn,emsg = ssl.wrap(client, params)
print(emsg)
conn:dohandshake()
--
conn:send("one line\n")
conn:close()
request
https://localhost:8888/
output
error loading CA locations ((null))
lua: a.lua:25: attempt to index local 'conn' (a nil value)
stack traceback:
a.lua:25: in main chunk
[C]: ?
Not very much info. Any idea how to track down the problem?
Update
I got it working now: the cafile parameter is not necessary for server mode:
local params = {
  mode = "server",
  protocol = "sslv23",
  key = "./keys/server.key",
  certificate = "./keys/server.crt",
  password = "123456",
  options = {"all", "no_sslv2"},
  ciphers = "ALL:!ADH:@STRENGTH",
}
LuaSec is a binding for OpenSSL, so the error you are getting (error loading CA locations) means that the OpenSSL library cannot read your CA files. Are you sure they are in the current directory and have the proper permissions?
EDIT: According to the LuaSec sources, it currently supports only the PEM format for the private key. Ensure that the private key is stored as PEM, not DER.
cafile should contain the set of CA certificates (.crt) that your server or client trusts. You passed the key (.key) instead.