How can I import a certificate, like the command below, but using the AWS CDK?
aws acm import-certificate --certificate file://Certificate.pem
--certificate-chain file://CertificateChain.pem
--private-key file://PrivateKey.pem
Since the CDK generates a CloudFormation stack behind the scenes, this method is not supported.
Currently, you can achieve this by importing the certificate into ACM first, and then referencing it in your CDK code by its certificate ARN.
Usage:
import cert = require('@aws-cdk/aws-certificatemanager');
const certificateArn = "arn:aws:...";
const certificate = cert.Certificate.fromCertificateArn(this, 'CertificateImported', certificateArn);
More info about cert.Certificate.
Related
I'm learning Google One Tap sign-in, following https://codelabs.developers.google.com/codelabs/credential-management-api/index.html?index=..%2F..index#1
https://developers.google.com/identity/one-tap/web/guides/get-google-api-clientid
and
https://developers.google.com/web/fundamentals/security/credential-management
With 'password' login it works like a charm (without using a Python script),
but I'm stuck on 'federated' login with Google or Facebook.
I found the demo server-side script (main.py) in this repo: https://github.com/GoogleChromeLabs/credential-management-sample
from google.appengine.ext import vendor
vendor.add('lib')
import os
import sys
import binascii
import json
import urllib
from bcrypt import bcrypt
from flask import Flask, request, make_response, render_template,\
session, redirect, url_for
from oauth2client import client
from google.appengine.ext import ndb
from google.appengine.api import urlfetch
...
I'm just curious about the server-side script right now: is google.appengine.ext only supported on the Google App Engine Python runtime? Can I run this project on another server or on localhost? Is there a demo in other languages? I couldn't find an example (there is little documentation).
I tried to run the demo project on localhost and got this error:
Traceback (most recent call last):
  File "working/main.py", line 17, in <module>
    from google.appengine.ext import vendor
ModuleNotFoundError: No module named 'google'
Can anyone help me, please?
How to use Apache Beam Direct runner to authenticate with GOOGLE_APPLICATION_CREDENTIALS?
I don't want to authenticate using a gcloud account. I have a service account key (JSON) which I set as a system variable. How do I make an Apache Beam program (running as DirectRunner) authenticate using that GOOGLE_APPLICATION_CREDENTIALS?
My use case is accessing GCP Pub/Sub resources in the Apache Beam program, so I need to authenticate.
Preface
As of today, at the current version of the Python SDK (2.9.0), the Google Cloud Pub/Sub functionality is still under active development and is designed for streaming processing only, so you cannot use it as part of your pipeline to publish to Pub/Sub.
You can still install a Google Pub/Sub client, but you will stumble upon some limitations and nuances due to specifics of the Apache Beam runtime:
You will have to install google-cloud-pubsub and declare it as a dependency for Apache Beam.
You will have to overcome the serialization incompatibility between the installed Pub/Sub client and the Apache Beam pipeline.
You will somehow have to explicitly provide authentication credentials to the Pub/Sub client.
Here is a basic walkthrough to get a pipeline capable of publishing elements from a processed collection to the Google Cloud Pub/Sub service.
Let's assume a basic structure:
├── my_stuff
│ ├── __init__.py
│ └── my_package.py
├── .gitignore
├── main.py
├── README.md
└── setup.py
Get The PubSub Client
Install google-cloud-pubsub (assuming pip: pip install google-cloud-pubsub). You will now hit the bump of providing dependencies; I suggest following the last section of the documentation and providing a setup.py with some metadata and your package dependency:
from setuptools import find_packages, setup

setup(
    name="my_stuff",
    version="0.1.0",
    description="My Pipeline for DirectRunner that publishes to Google Cloud PubSub",
    long_description=open("README.md").read(),
    classifiers=[
        "Programming Language :: Python",
    ],
    author="John Doe",
    author_email="john.doe@example.com",
    url="https://example.com/",
    license="proprietary",
    packages=find_packages(),
    include_package_data=True,
    zip_safe=True,
    install_requires=['google-cloud-pubsub==0.35.4'],
)
You can use pip freeze | grep google-cloud-pubsub to find the exact installed version.
Make PubSub Client Serializable
If you just try to Map a publishing function that holds an instance of the Pub/Sub client, you will get an error from Apache Beam saying it cannot deserialize (pickle) it.
To work around this, you can make a callable class and implement a few methods following the pickle documentation to control how the instance is serialized.
Here is a basic example of PubSub publisher instantiating a client:
class Publisher(object):
    """
    PubSub publisher for the beam pipeline.
    """

    @staticmethod
    def init_client():
        return pubsub_v1.PublisherClient(credentials='TODO: Get credentials')

    def __init__(self, topic):
        self.topic = topic
        self.client = self.init_client()

    def __getstate__(self):
        return self.topic

    def __setstate__(self, topic):
        self.topic = topic
        self.client = self.init_client()

    def __call__(self, item, *args, **kwargs):
        self.client.publish(self.topic, '{}'.format(item).encode('utf-8'))
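To see why the __getstate__/__setstate__ pair matters, here is a dependency-free sketch of the same pattern. The FakeClient class is a hypothetical stand-in for the real Pub/Sub client; the pickle round-trip mimics what Beam does when it ships the transform to a worker:

```python
import pickle


class FakeClient:
    """Hypothetical stand-in for pubsub_v1.PublisherClient."""
    def __init__(self):
        self.published = []

    def publish(self, topic, data):
        self.published.append((topic, data))


class Publisher(object):
    """Same pattern: drop the client on pickle, rebuild it on unpickle."""
    @staticmethod
    def init_client():
        return FakeClient()

    def __init__(self, topic):
        self.topic = topic
        self.client = self.init_client()

    def __getstate__(self):
        # Only the topic is serialized; the client is not picklable state
        return self.topic

    def __setstate__(self, topic):
        # Recreate a fresh client after deserialization
        self.topic = topic
        self.client = self.init_client()

    def __call__(self, item, *args, **kwargs):
        self.client.publish(self.topic, '{}'.format(item).encode('utf-8'))


# Round-trip through pickle, as Beam would, then publish
publisher = pickle.loads(pickle.dumps(Publisher("projects/p/topics/t")))
publisher("hello")
print(publisher.client.published)  # → [('projects/p/topics/t', b'hello')]
```

The unpickled instance works because __setstate__ rebuilds the client rather than trying to restore the original one.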
Specify Credentials
It was not clear from the question whether the credentials of the pipeline run-time should be reused or need to be specified separately, so here are a few options.
You can construct credentials explicitly using service_account.Credentials, or reuse the run-time credentials using GoogleCredentials.
Hard-coded credentials
from google.cloud import pubsub_v1
from google.oauth2 import service_account
client1 = pubsub_v1.PublisherClient(
    credentials=service_account.Credentials.from_service_account_info({
        "type": "service_account",
        "project_id": "****",
        "private_key_id": "****",
        "private_key": "-----BEGIN PRIVATE KEY-----\n****\n-----END PRIVATE KEY-----\n",
        "client_email": "****",
        "client_id": "****",
        "auth_uri": "https://accounts.google.com/o/oauth2/auth",
        "token_uri": "https://oauth2.googleapis.com/token",
        "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/***"
    })
)
Credentials from JSON file (via OS environment variable)
from google.cloud import pubsub_v1
from google.oauth2 import service_account
import os
client2 = pubsub_v1.PublisherClient(
    credentials=service_account.Credentials.from_service_account_file(
        os.environ["GOOGLE_APPLICATION_CREDENTIALS"]
    )
)
Reusing run-time credentials (used for pipeline execution)
from google.cloud import pubsub_v1
from oauth2client.client import GoogleCredentials
client3 = pubsub_v1.PublisherClient(
    credentials=GoogleCredentials.get_application_default()
)
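A small helper combining the last two options: prefer an explicit service-account file when GOOGLE_APPLICATION_CREDENTIALS is set, otherwise fall back to application-default credentials. This is a sketch; the function name is my own, and it assumes the google-auth package is installed:

```python
import os

import google.auth
from google.oauth2 import service_account


def resolve_credentials():
    """Prefer an explicit service-account key file; fall back to ADC."""
    key_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if key_path:
        return service_account.Credentials.from_service_account_file(key_path)
    credentials, _project = google.auth.default()
    return credentials
```

Note that google.auth.default() itself also honors GOOGLE_APPLICATION_CREDENTIALS, so the explicit branch mainly makes the intent (and the failure mode when the file is missing) obvious.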
Usage
Now you can use Publisher in your pipeline like any other transformation:
published = (pipeline | "Publish" >> beam.Map(Publisher("pub/sub/topic")))
Just don't forget that you will need to add the --setup_file /absolute/path/to/setup.py argument for the pipeline runner.
When you run locally, your Apache Beam pipeline always runs as the GCP account that you configured with the gcloud command-line tool. You can change the account used by gcloud using gcloud auth login and then gcloud config set.
Note that when you use Java and Maven, you can use the environment variable GOOGLE_APPLICATION_CREDENTIALS as detailed in the Quickstart.
And when you use Java and Eclipse, you can also configure it to use your service account key.
I have enabled signing in my ClickOnce application.
But the build server (TFS online services) does not have the certificate. Is there any way I can include the certificate in the repository and have the build server sign it, or do I have to disable signing and do it manually afterwards?
Instead of picking a cert from the store, I checked a file into the repository and picked the certificate from file instead.
Build server then gives me:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (2482): Cannot import the following key file: . The key file may be password protected. To correct this, try to import the certificate again or import the certificate manually into the current user’s personal certificate store.
(Is it intended that I check one in without a password, and is that what others do? Isn't it dangerous to have private keys out there with no password?)
I created the following powershell script in the repository:
[CmdletBinding()]
param(
    [Parameter(Mandatory=$True)]
    [string]$signCertPassword
)

$SecurePassword = $signCertPassword | ConvertTo-SecureString -AsPlainText -Force
Write-Output "Importing pfx"
Import-PfxCertificate -FilePath "YourPfxFile.pfx" -Password $SecurePassword -CertStoreLocation Cert:\CurrentUser\My
and passed the password as a variable (locked)
I created CodeActivites for using Mage.exe and Signtool.exe and build the signing into my build template instead.
After /t:publish, all files had a .deploy extension, and I had to create an XML task that updates the manifest XML to support this.
I'm working on an MDM Node.js server for iOS. In the Apple docs, the following Ruby code is given:
p7sign = OpenSSL::PKCS7::PKCS7.new(req.body)
store = OpenSSL::X509::Store.new
p7sign.verify(nil, store, nil, OpenSSL::PKCS7::NOVERIFY)
signers = p7sign.signers
What would be the equivalent in NodeJS?
The idea is to access p7sign.data, which contains an XML plist.
Is this possible using either crypto or an external node lib (ursa, etc)?
A good option would be to use child_process to invoke openssl directly. I do that to validate iOS .mobileprovision files.
$ openssl smime -verify -in FILE -inform der
The openssl command needs to be the Apple-provided one (not from MacPorts or Homebrew) so that it can find the signing certificates and CAs in the keychain.
I haven't tried this myself, but the node-forge library contains an implementation of many cryptographic algorithms.
https://npmjs.org/package/node-forge#pkcs7
Closely related to How to generate CSR when IIS is not installed.
I also do not have this installed. I am developing a mobile application for iOS, and I am trying to obtain a provisioning file so I can test my app locally. In the process of acquiring this, I am asked for a .csr file, and the instructions explain how to build one on a Mac. Except I don't have a Mac; I have a PC, and my company exclusively uses PCs. I need this certificate without having access to a Mac.
I have seen and used this CSR generator, but it gives me the key and request as long strings of characters, and I need a .csr file to upload to Apple.
Pasting them into Notepad and changing the extension to .csr didn't work either. :/
Does anyone have any insights on this?
You can install OpenSSL for Windows and generate a CSR file with this command:
openssl req -nodes -newkey rsa:2048 -keyout private_key.key -out cer_sign_request.csr
You'll be asked a few questions, which are optional (press ENTER to skip them).
This will generate a private key (like the one Keychain Access would create) and a certificate signing request as a .csr file.
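If you'd rather avoid the interactive prompts, the same key and CSR can be produced with Python's cryptography package. This is a sketch; the subject attributes are placeholder values you should replace with your own:

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

# 2048-bit RSA key, matching the openssl command above
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Build and sign the CSR; the subject values are placeholders
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, u"Your Name"),
        x509.NameAttribute(NameOID.EMAIL_ADDRESS, u"you@example.com"),
    ]))
    .sign(key, hashes.SHA256())
)

# Write the unencrypted private key and the PEM-encoded CSR to disk
with open("private_key.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("cer_sign_request.csr", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.PEM))
```

The resulting cer_sign_request.csr is a standard PEM file that Apple's portal accepts for upload.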
For those who want an easy-to-use graphical interface, DigiCert has a "DigiCert Utility" that is pretty solid. You can use it to create a CSR. It doesn't give you back a private key, so you need to import your self-signed or CA certificate to complete the installation of the certificate. Once installed, you can export it as a .pfx or a .crt/.key bundle.
If OpenSSL is saved in c:\OpenSSL, point it at its configuration file first:
set OPENSSL_CONF=c:\OpenSSL\openssl.cnf
You can download this example file: openssl-dem-server-cert-thvs.cnf
Then rename it to openssl.cnf:
ren openssl-dem-server-cert-thvs.cnf openssl.cnf