Swagger UI not showing when Quarkus API deployed to Kubernetes

I have deployed an API developed with Quarkus to Kubernetes, enabling Swagger with the following properties:
quarkus.swagger-ui.always-include=true
quarkus.swagger-ui.path=/swagger-ui.html
Locally Swagger comes up, but when deployed to Kubernetes the Swagger UI doesn't show any APIs; it shows the error below.
[screenshot of the error]
Any suggestions on how to fix the issue?

Related

Google Cloud Vision Python client times out when request comes from Cloud Run service

I have a Python application (using Flask) which uses the Google Cloud Vision API client (via the pip package google-cloud-vision) to analyze images for text using OCR (via the TEXT_DETECTION feature in the API). This works fine when run locally, providing Google credentials on the command line via the GOOGLE_APPLICATION_CREDENTIALS environment variable pointing to the JSON file I got from a service account in my project with access to the Vision API. It also works fine locally in a Docker container, when the same JSON file is injected via a volume (following the recommendations in the Cloud Run docs).
However, when I deploy my application to Cloud Run, the same code fails to make a request to the Cloud Vision API in a timely manner and eventually times out (the Flask app returns an HTTP 504). Then the container seems to become unhealthy: all subsequent requests (even those not interacting with the Vision API) also time out.
In the Cloud Run logs, the last thing logged appears to be related to Google Cloud authentication:
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
I believe my project is configured correctly to access this API: as already stated, I can use the API locally via the environment variable. And the service is running in Cloud Run using this same service account (at least I believe it is: the serviceAccountName field in the YAML tab matches, and I'm deploying it via gcloud run deploy --service-account ...).
Furthermore, the application can access the same Vision API without using the official Python client (locally and in Cloud Run), when accessed using an API key and a plain HTTP POST via the requests package. So the Cloud Run deployment is able to make this API call and the project has access, but something is wrong with the combination of Cloud Run and the official Python client.
I admit this is basically a duplicate of this 4-year-old question. But aside from being old, that question has no resolution, and I hope I can provide more details that might help get a good answer. I would much prefer to use the official client.
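One way to narrow this down is to bypass the metadata-server credential fetch (the last thing seen in the logs above) by passing explicit service-account credentials and a request timeout. This is only a minimal sketch, not the poster's code; the key path, image file, and timeout value are placeholder assumptions.

# Minimal sketch: explicit credentials + timeout for the Vision client.
from google.cloud import vision
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json"  # hypothetical path to the service-account key
)
client = vision.ImageAnnotatorClient(credentials=credentials)

with open("image.jpg", "rb") as f:  # hypothetical input image
    image = vision.Image(content=f.read())

# text_detection corresponds to the TEXT_DETECTION feature; the timeout makes a
# stuck call fail fast instead of holding the Flask worker until the 504.
response = client.text_detection(image=image, timeout=30)
if response.text_annotations:
    print(response.text_annotations[0].description)

If this version works on Cloud Run while the default-credentials version hangs, the problem is likely in the automatic credential lookup rather than in the Vision call itself.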

Trigger deployment of Docker container on demand

I have a web application that helps my client launch an API with a button "Launch my API".
Under the hood, I have a Docker image that runs on two Google Cloud Run services (one for a debug environment and one for production).
My challenge is the following: how can I trigger the deployment of new Docker containers on demand?
Naively, I would like this button to call an API that triggers the launch of these services based on my Docker image (which is already in Google Cloud or available to download from a certain URL).
Ultimately, I'll need to use Kubernetes to manage all of my clients' containers. Maybe I should look into that for triggering new container deployments?
I tried to glue together (I'm very new to the cloud) a Google Cloud Function that triggers a new service on Google Cloud Run based on my Docker image, but with no success.
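For the "launch on button press" part, one option is to create the Cloud Run service programmatically. The sketch below assumes the google-cloud-run client library (pip install google-cloud-run); the project, region, service name, and image are placeholders, and this is not a verified production deployment.

# Rough sketch: create a Cloud Run service from an existing image on demand.
from google.cloud import run_v2

def launch_api(project_id: str, region: str, service_id: str, image: str) -> str:
    client = run_v2.ServicesClient()
    service = run_v2.Service(
        template=run_v2.RevisionTemplate(
            containers=[run_v2.Container(image=image)]
        )
    )
    operation = client.create_service(
        parent=f"projects/{project_id}/locations/{region}",
        service=service,
        service_id=service_id,
    )
    # Blocks until the service is created; returns its resource name.
    return operation.result().name

# The "Launch my API" button could call a small backend endpoint that runs e.g.:
# launch_api("my-project", "europe-west1", "client-api-debug", "gcr.io/my-project/api:latest")

The Cloud Function approach mentioned above could run the same kind of code; Kubernetes is not required just to start new Cloud Run services.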

How to deploy a Flask Backend and React Front End on Google Cloud

I know this may seem like an opinion-based question, but I can't seem to find any answers anywhere. I'm having trouble figuring out how to deploy my Flask backend and React front end on Google Cloud. I am using docker-compose on my local machine, but I can't seem to find a way to deploy that on Google Cloud.
My question is: is there a way to deploy them using a docker-compose file with Cloud Build and Cloud Run? Or do I have to create two different Cloud Run instances to run the frontend and backend? Or is it better to create a VM instance and run the docker-compose setup there (and how would one even do this)? I am very new to deployment, so any help is appreciated.
For reference, I saw this but it didn't exactly answer my question. Thanks in advance!
You use docker-compose for multi-container applications. In your case it wouldn't make much sense.
You have a Python backend. You can containerize it and deploy it to Cloud Run, Cloud Functions, App Engine, Google Kubernetes Engine, or even a Compute Engine VM. In my opinion the most convenient option would be Cloud Run.
If your React frontend is a Single Page App, it communicates with your Python backend via HTTP requests. You build the HTML/CSS/JS files and host them somewhere, like a Cloud Storage bucket or Cloud CDN.
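To illustrate that split, here is a minimal sketch (not your actual app) of the kind of Flask backend that would be containerized for Cloud Run, with CORS enabled so the React SPA served from a Cloud Storage bucket or Cloud CDN origin can call it. The frontend origin URL is a placeholder.

# Minimal Flask backend sketch for Cloud Run, with CORS for the static frontend.
import os
from flask import Flask, jsonify
from flask_cors import CORS  # pip install flask-cors

app = Flask(__name__)
# Allow the static frontend origin to call this API; replace with your bucket/CDN domain.
CORS(app, origins=["https://your-frontend.example.com"])

@app.route("/api/health")
def health():
    return jsonify(status="ok")

if __name__ == "__main__":
    # Cloud Run tells the container which port to listen on via the PORT variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))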

How to deploy Prefect workflow on Azure Web App service?

Does anyone have an idea how to deploy the Prefect UI and backend on an Azure Web App server using Docker/docker-compose?
I need to deploy a Prefect workflow on the Azure Web App service.
Thanks for your help.
The Prefect blog has a post about deploying the server to Google Cloud Compute; the steps should be similar.
Prefect blog: Prefect Server 101
Edit: As the OP points out, GCP is not a Docker instance. Adding links I found previously related to Docker.
PrefectHQ GitHub issue: docker-compose on a VM
A GitHub user's K8s Helm chart configuration might help
I can't now find the issue on the PrefectHQ GitHub that specifically covered Prefect server configs when running on Docker; it's a good place to look.

How to integrate Azure Spring Cloud with Angular frontend

How to integrate backend apps from Azure Spring Cloud with an Angular frontend? Is AKS one solution?
Unless you are already running an AKS cluster, there are more time and cost-efficient ways to deploy an Angular front-end on Azure.
You can deploy static sites which are hosted from a storage account. There is a tool called ng-deploy which streamlines the process.
If you prefer a container-based approach, you can create a container that runs nginx to serve your built Angular application and deploy it to Azure using Container Instances or App Service.
