Integrate Cloud Run with existing service mesh - google-cloud-run

We have an existing service mesh built with Envoy and an internal service-control and discovery stack. We want to offer Cloud Run to our developers. How can we integrate Cloud Run into the mesh so that:
1. The Cloud Run containers can talk to the mesh services.
2. The services built on Cloud Run can be discovered and used by other mesh services (each of which has an Envoy sidecar).

The GCP docs cover this scenario for Cloud Run for Anthos services using Istio.
In a nutshell, you will need to (a sketch of the first two steps follows the list):
1. Create a GKE cluster with Cloud Run enabled.
2. Deploy a sample service to Cloud Run for Anthos on Google Cloud.
3. Create an IAM service account.
4. Create an Istio authentication policy.
5. Create an Istio authorization policy.
6. Test the solution.
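As a rough sketch of the first two steps, assuming placeholder names (my-mesh-cluster, us-central1-a, gcr.io/my-project/hello) and gcloud flags as they existed for Cloud Run for Anthos (exact flags may vary by gcloud version):

```
# Step 1: create a GKE cluster with the Cloud Run addon enabled.
gcloud container clusters create my-mesh-cluster \
  --addons=HttpLoadBalancing,CloudRun \
  --machine-type=n1-standard-4 \
  --enable-stackdriver-kubernetes \
  --zone=us-central1-a

# Step 2: deploy a sample service to Cloud Run for Anthos on that cluster.
gcloud run deploy hello \
  --image=gcr.io/my-project/hello \
  --platform=gke \
  --cluster=my-mesh-cluster \
  --cluster-location=us-central1-a
```

The Istio authentication and authorization policies (steps 4 and 5) are then applied to the cluster with kubectl, the same way you would manage policies for any other workload in the mesh.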
But the details depend on how your existing service mesh is configured; elaborating on that portion would allow the community to assist you better.

Related

Create service or container from another container, on Google Cloud Run or Cloud Run on GKE

Can I create a service or container from another container on Google Cloud Run or Cloud Run on GKE?
I basically want to manage my containers/services dynamically from another container, and I'm not sure how to go about this.
Adding more details:
One of my microservices needs to create new isolated containers that will run some user-land code. I would like to have full life-cycle control of these containers: run the code, then destroy them as needed.
I also looked at the Cloud Run APIs, but I'm not sure how to run something like 'kubectl create ...' through them. Is that the right approach?
Yes, you should be able to deploy Cloud Run services from Cloud Run services.
- On Cloud Run (hosted): services run with Editor permissions by default, so this should be possible without any extra configuration.
- Note that if you deploy apps with --allow-unauthenticated, which requires setting IAM permissions, the Editor role will not be enough; you need the Owner role on the GCP project for that.
- On Cloud Run on GKE: services run with limited scopes by default (as they inherit the GKE node's permissions/scopes). You should add a service account to the Kubernetes Pod and use it to authenticate.
From there, you have several options:
- Use the REST API directly: since run.googleapis.com behaves like a Kubernetes API server, you can directly apply JSON objects of Knative Services. (You can use gcloud ... --log-http to learn how deployments are made as REST API requests.)
- Use gcloud: you can ship gcloud in your container image and invoke it from your process (see the sketch after this list).
- Use Google Cloud Client Libraries: you can use the client libraries available for Cloud Run (for example this Go library) to construct in-memory Service objects and send them to the API using a higher-level client library (recommended approach).
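For the gcloud route, a minimal sketch of the create/destroy life cycle the question asks about, assuming managed Cloud Run and illustrative names (worker-abc123, gcr.io/my-project/user-code-runner, us-central1):

```
# From inside a Cloud Run service, deploy a new isolated worker service.
gcloud run deploy worker-abc123 \
  --image=gcr.io/my-project/user-code-runner \
  --platform=managed \
  --region=us-central1 \
  --no-allow-unauthenticated

# Later, destroy the worker when its job is done.
gcloud run services delete worker-abc123 \
  --platform=managed \
  --region=us-central1 \
  --quiet
```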

Connect external workers to Cloud Composer airflow

Is it possible to connect an external worker that is not part of the Cloud Composer Kubernetes cluster? The use case would be connecting a box in a non-cloud data center to a Composer cluster.
Hybrid clusters are not currently supported in Cloud Composer. If you attempt to roll your own solution on top of Composer, I'd be very interested in hearing what did or didn't work for you.

How to deploy docker app using docker-compose.yml in cloud foundry

I have a docker-compose.yml file which has environment variables and certificates. I would like to deploy these to a Cloud Foundry dev environment.
I want to deploy Microgateway on Cloud Foundry; the link for Microgateway is below:
https://github.com/CAAPIM/Microgateway
In the cloud-native world, you instantiate services on your foundation beforehand. You can use prebuilt services (e.g. the auto-scaler) available from the marketplace.
If the service you want is not available, you can install a tile (e.g. Redis, MySQL, RabbitMQ), which adds services to the marketplace. Lots of vendors provide tiles that can be installed on PCF (check network.pivotal.io for the full list).
If you have services that live outside of Cloud Foundry (e.g. Oracle, Mongo, or MS SQL Server) and you wish to inject them into your Cloud Foundry foundation, you can do that by creating User-Provided Services (CUPS).
Once you have a service, you have to create a service instance. Think of it as provisioning the service for you. After you have provisioned it, i.e. created a service instance, you can bind it to one or more apps.
A service instance is scoped to an org and a space. All apps within that org and space can be bound to the service instance.
You deploy your app individually, by itself, to Cloud Foundry (jar, war, zip). You then bind any needed services to your app (e.g. db, scaling, caching, etc.).
Use a manifest file to do all these steps in one deployment; a sketch follows below.
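A minimal sketch of that flow with the cf CLI; the service, plan, and app names here are illustrative assumptions, not from the Microgateway docs:

```
# Provision a service instance from the marketplace (service/plan names vary per foundation).
cf create-service p-redis shared-vm my-redis

# Inject an external service via a user-provided service (CUPS).
cf create-user-provided-service my-oracle -p '{"uri":"oracle://user:pass@host:1521/db"}'

# Describe the app and its service bindings in a manifest, then push everything in one go.
cat > manifest.yml <<'EOF'
applications:
- name: my-app
  path: target/my-app.jar
  services:
  - my-redis
  - my-oracle
EOF
cf push -f manifest.yml
```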
PCF 2.0 is introducing PKS - the Pivotal Container Service. It is an implementation of Kubo within PCF, and it is still not GA.
Kubo, Kubernetes, and PKS let you deploy your containerized applications.
I have played with Minikube and a little bit of Kubo, and I'm still getting my feet wet with PKS.
Hope this helps!

Spring Cloud Data Flow support of Swarm

Currently I can see that Spring Cloud Data Flow has these servers: Local, YARN, Cloud Foundry, Mesos, and Kubernetes; is there any plan for Swarm support?
Caleb: Spring Cloud Data Flow's deployer implementation is based on Spring Cloud Deployer's service provider interface (SPI), so there are currently SPI implementations for Local, Cloud Foundry, YARN, Kubernetes, and Mesos. These implementations are managed in separate repos, too.
This decoupling provides flexibility and makes it easy to add new deployers. Though we haven't attempted a Docker Swarm deployer yet, we would love to review contributions from the community.

Is it possible to use logmet service for Cloud Foundry app?

For VMs and containers (Docker), we can use the logmet service (logging and metrics) as described in the Bluemix documentation. I wonder whether we can use this service for a Cloud Foundry app via a log drain ( https://docs.cloudfoundry.org/devguide/services/log-management.html ).
Ref: https://developer.ibm.com/bluemix/2015/12/11/sending-logs-to-bluemix-using-logstash-forwarder/
For Cloud Foundry applications, the Monitoring & Analytics service in the catalog provides similar functionality.
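If you do want to experiment with the generic Cloud Foundry log-drain mechanism the question links to, a minimal sketch; the drain URL and app name are placeholders:

```
# Create a user-provided service that drains app logs to an external syslog endpoint.
cf create-user-provided-service my-log-drain -l syslog://logs.example.com:5000

# Bind the drain to the app and restage so the binding takes effect.
cf bind-service my-app my-log-drain
cf restage my-app
```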
