How to integrate backend apps from Azure Spring Cloud with an Angular frontend? Is AKS one solution?
Unless you are already running an AKS cluster, there are more time- and cost-efficient ways to deploy an Angular front end on Azure.
You can deploy the app as a static site hosted from an Azure Storage account. There is a tool called ng-deploy which streamlines the process.
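A minimal sketch of the ng-deploy flow, run from inside the Angular workspace (the exact Azure resources created depend on your subscription setup):

```shell
# Adds @azure/ng-deploy to the workspace, signs you in to Azure,
# and writes an azure.json deployment configuration
ng add @azure/ng-deploy

# Builds the app and uploads the output to the storage account's static hosting
ng deploy
```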
If you prefer a container-based approach, you can create a container that runs nginx to serve your built Angular application and deploy it to Azure using Container Instances or App Service.
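For the container route, a multi-stage Dockerfile along these lines is a reasonable starting point; the node version and the dist/my-app output path are assumptions that depend on your project (check the outputPath in your angular.json):

```dockerfile
# Stage 1: build the Angular app
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the static build with nginx
FROM nginx:alpine
# "my-app" is a placeholder; match your angular.json outputPath
COPY --from=build /app/dist/my-app /usr/share/nginx/html
EXPOSE 80
```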
I have a SPA dockerized with a single Dockerfile (the server side is Kotlin with Spring Boot, the front end is TypeScript with React) and am trying to host that Docker image on GCP as a web app.
At first I thought Cloud Run could be appropriate, but it seems that Cloud Run is a serverless service and not meant for hosting a web app. I understand there are several options: App Engine (flexible environment), Compute Engine, and Kubernetes Engine.
Considering the story above, can I ask the GCP community which one to choose for these purposes:
Hosting a Docker image stored in Container Registry
The app should be publicly deployed, i.e. everyone can access it via a browser like every other website
The deployed Docker image needs to connect to Cloud SQL to persist its data
Planning to use Cloud Build for the CI/CD environment
Any help would be much appreciated. Thank you!
IMO, you should avoid what you propose (Kubernetes, Compute Engine, and App Engine flex) and (re)consider Cloud Run and App Engine standard.
If you have a container, App Engine standard isn't compliant, but you can simply deploy your code and let App Engine standard build and deploy its own container (with your code inside).
My preference is Cloud Run; it's perfectly suited to a web app, as long as:
You only perform processing on request (no background processes, no long-running operations of more than 60 minutes)
You don't store data locally (instead you store data in external services: databases or object storage)
I also recommend that you split your front end and your back end:
Deploy your front end on App Engine standard or on Cloud Storage
Deploy your back end on Cloud Run (and thus in a container)
Put an HTTPS load balancer in front of both to remove CORS issues and to expose only one URL (behind your own domain name)
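A minimal sketch of that pattern with hypothetical project, region, service, and bucket names (using Container Registry as the image store):

```shell
# Back end: build the image with Cloud Build, then deploy it to Cloud Run
# with the Cloud SQL instance attached for data persistence
gcloud builds submit --tag gcr.io/my-project/backend
gcloud run deploy backend \
  --image gcr.io/my-project/backend \
  --region us-central1 \
  --allow-unauthenticated \
  --add-cloudsql-instances my-project:us-central1:my-sql-instance

# Front end: upload the static build to a Cloud Storage bucket
gsutil mb -l us-central1 gs://my-frontend-bucket
gsutil -m rsync -r ./build gs://my-frontend-bucket
```

The HTTPS load balancer then fronts both: a serverless NEG backend pointing at the Cloud Run service, and a backend bucket pointing at the Cloud Storage bucket.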
The main advantages are:
If you serve your files from Cloud Storage, you can leverage caching and thus reduce cost and latency. The same goes if you use the CDN capability of the load balancer. If you host your front end on Cloud Run or any other compute platform, you will burn CPU just to serve static files, and you will pay for that CPU/memory -> wasteful.
Separating the front end and the back end gives you the ability to evolve both parts independently, without redeploying the whole application, only the part that has changed.
The proposed pattern is an enterprise-grade pattern. Starting from $16 per month, you can scale high and globally. You can also activate a WAF on the load balancer to increase security and attack prevention.
So now, if you agree with that, what are your next questions?
My distributed dapr.io application is growing very quickly and contains several Dapr app-ids, and running all the applications locally for development purposes is becoming difficult.
Is it possible for a local self-hosted app in development to invoke a production app running on an AKS cluster?
When you want to mix local development with existing services on AKS, check out Bridge to Kubernetes:
https://devblogs.microsoft.com/visualstudio/bridge-to-kubernetes-ga/
https://channel9.msdn.com/Shows/Visual-Studio-Toolbox/Bridge-to-Kubernetes
This information is intended as preliminary; I will try to bring up a Dapr sample scenario.
My question concerns a Kubernetes setup for development purposes. The app consists of 4 services (React + Express backend, nginx for routing, Neo4j as a DB). The Neo4j DB is deployed in Google Cloud (not by me, and it is not maintained by me either), but all the other services are currently running locally as I develop the app. What I want to achieve is to start up and run all those services at once, together, with a single command, as is possible in the Docker Compose world (through docker-compose up).
I have a docker-compose.yml file which has environment variables and certificates. I would like to deploy these on the Cloud Foundry dev version.
I want to deploy Microgateway on Cloud Foundry; the link for Microgateway is below:
https://github.com/CAAPIM/Microgateway
In the cloud-native world, you instantiate services on your foundation beforehand. You can use prebuilt services (e.g. the auto-scaler) available from the marketplace.
If the service you want is not available, you can install a tile (e.g. Redis, MySQL, RabbitMQ), which will add services to the marketplace. Lots of vendors provide tiles that can be installed on PCF (check network.pivotal.io for the full list).
If you have services that are outside of Cloud Foundry (e.g. Oracle, Mongo, or MS SQL Server) and you wish to inject them into your Cloud Foundry foundation, you can do that by creating a User-Provided Service (cups).
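For instance, injecting a hypothetical external MySQL database as a user-provided service (all connection details are placeholders):

```shell
cf create-user-provided-service my-external-db \
  -p '{"uri":"mysql://user:password@db.example.com:3306/mydb"}'
```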
Once you have a service, you have to create a service instance. Think of it as provisioning the service for you. After you have provisioned it, i.e. created a service instance, you can bind it to one or more apps.
A service instance is scoped to an org and a space. All apps within that org/space can be bound to that service instance.
You deploy your app individually, by itself, to Cloud Foundry (jar, war, zip). You then bind any needed services to your app (e.g. DB, scaling, caching, etc.), as in the sketch below.
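A minimal sketch with placeholder service, plan, and app names:

```shell
cf marketplace                             # list services available on the foundation
cf create-service p.mysql db-small my-db   # provision a service instance
cf push my-app -p target/my-app.jar        # deploy the app artifact
cf bind-service my-app my-db               # bind the service instance to the app
cf restage my-app                          # restage so the binding takes effect
```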
Use a manifest file to do all these steps in one deployment.
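A sketch of such a manifest.yml (names and values are placeholders; the environment variables from your docker-compose.yml go under env):

```yaml
applications:
- name: my-app
  path: target/my-app.jar
  memory: 1G
  instances: 2
  services:
    - my-db            # service instances to bind
  env:
    SOME_VAR: some-value
```

With this in place, a single cf push picks up the manifest and performs the deploy and the bindings in one step.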
PCF 2.0 is introducing PKS (Pivotal Container Service). It is an implementation of Kubo within PCF. It is still not GA.
Kubo, Kubernetes, and PKS allow you to deploy your containerized applications.
I have played with Minikube and a little bit of Kubo. I'm still getting my feet wet with PKS.
Hope this helps!
I am trying to create Spring Cloud microservices using Spring and Spring Boot. I developed a Spring Cloud Eureka server as a separate Spring Boot project, and I also created a Spring Boot project for a Zuul server for gateway proxying and routing.
Now I have 3 Spring Boot projects, as mentioned above. Here I am bringing Docker containers into my projects, so I explored pom.xml-based containerization, e.g. using the fabric8 tool.
So when I deploy my microservices, do I need to deploy all projects by containerization? Does the cloud platform treat all of the microservices as separate containers?
Is there any problem in the cloud with deploying all my microservices in separate containers, since they may communicate with/call each other?
So when I deploy my microservices, do I need to deploy all projects by containerization? Does the cloud platform treat all of the microservices as separate containers?
You are not forced to dockerize all your microservices. You can dockerize each microservice on its own and keep some non-dockerized. However, there is no obvious benefit in doing that; if you are adopting Docker, it is better to dockerize all your microservices.
The Docker Maven plugin configuration is better done in each project separately. I recommend using the Maven plugin only to build the images and optionally push them to a registry; then you can deploy each image separately. For production, I would recommend using Docker Compose or Docker Swarm to deploy the different microservice containers.
Is there any problem in the cloud with deploying all my microservices in separate containers, since they may communicate with/call each other?
No, there shouldn't be any problems. But you should be careful about the hostnames used when containers communicate. If you are using Docker Compose, containers can talk to each other directly by referring to the service name as a hostname. For instance, other microservices can register with Eureka by using http://eureka:8761. So make sure to set the correct application.properties values to reach the other services.
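As an illustration, a minimal docker-compose.yml sketch (the image names are placeholders; EUREKA_CLIENT_SERVICEURL_DEFAULTZONE is Spring Boot's environment-variable form of eureka.client.serviceUrl.defaultZone):

```yaml
version: "3"
services:
  eureka:
    image: myorg/eureka-server        # placeholder image
    ports:
      - "8761:8761"
  zuul:
    image: myorg/zuul-gateway         # placeholder image
    ports:
      - "8080:8080"
    environment:
      # reach Eureka via its compose service name, not localhost
      - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://eureka:8761/eureka
  orders:
    image: myorg/orders-service       # placeholder image
    environment:
      - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://eureka:8761/eureka
```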