Combining/Integrating Swagger & Slate with Spring Microservices - Jenkins

I have a set of Dockerized microservices (Spring Boot with Jersey applications) deployed on EC2 instances. Each service exposes REST APIs for which I need to create a spec.
As a first step, I have included swagger-core in all the Spring Boot apps; it generates a .json/.yaml file and serves it over HTTP as a REST resource when the service is up.
I want to use Slate to serve the API specification for all the services as a single resource (aggregated from the individual JSON/YAML files) over HTTP(S).
Slate (https://github.com/lord/slate) serves static content authored in Markdown.
The Swagger2Markup library can be used to generate the Markdown file by aggregating the API specs (JSON/YAML files) from all the services (each has its own generated JSON/YAML file).
Using this Markdown file, I can use Slate's default Middleman setup to build the static content and serve it over HTTP.
Question: what can be considered a best practice for seamlessly integrating this process into the Jenkins build?
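For the conversion step, this is roughly what I picture a Jenkins build step invoking (a minimal sketch; the service names, spec URLs, and output paths are made up, and it assumes the Swagger2Markup 1.x Java API):

import java.net.URL;
import java.nio.file.Paths;

import io.github.swagger2markup.Swagger2MarkupConfig;
import io.github.swagger2markup.Swagger2MarkupConverter;
import io.github.swagger2markup.builder.Swagger2MarkupConfigBuilder;
import io.github.swagger2markup.markup.builder.MarkupLanguage;

public class SpecAggregator {
    public static void main(String[] args) throws Exception {
        // Emit Markdown instead of Swagger2Markup's default AsciiDoc.
        Swagger2MarkupConfig config = new Swagger2MarkupConfigBuilder()
                .withMarkupLanguage(MarkupLanguage.MARKDOWN)
                .build();
        // Hypothetical spec endpoints; a real pipeline could pass these in
        // as Jenkins parameters or discover them from a service registry.
        String[][] services = {
                {"orders", "http://orders-service:8080/swagger.json"},
                {"users",  "http://users-service:8080/swagger.json"}};
        for (String[] svc : services) {
            // One Slate include file per service, e.g. source/includes/_orders.md,
            // which the subsequent Middleman build then picks up.
            Swagger2MarkupConverter.from(new URL(svc[1]))
                    .withConfig(config)
                    .build()
                    .toFile(Paths.get("slate/source/includes/_" + svc[0]));
        }
    }
}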

Related

.NET 5.0 Web API project returning 404 when {domain}/swagger/index.html is requested after deployment

I am new to .NET Core 5.0 and currently working on a project where we have developed a multi-project Web API solution. Each Web API project has a Startup.cs file containing the Swagger setup, i.e. UseSwagger(...) and UseSwaggerUI(...).
Now the issue is, Swagger works fine locally (when I run the API on my local machine and request http://localhost:5000/userService/swagger/index.html, it shows the API endpoints). But when the API is deployed to AWS and a request is made to https://{domain}/userService/swagger/index.html, it returns 404.
Domain: (provided by AWS)
Deployment is done through a Jenkins pipeline which internally uses the Dockerfile (each API project has one) to build and publish the .NET Core project.
Am I missing something?

Spring Cloud Data Flow Stream Deployment to Cloud Foundry

I am new to Spring Cloud Data Flow. I am trying to build a simple stream with an http source and a rabbitmq sink using the SCDF stream apps. The stream should be deployed on OSCF (Cloud Foundry). Once deployed, the stream should be able to receive HTTP POST requests and send the request data to RabbitMQ.
So far, I have downloaded the Data Flow server using the link below and pushed it to Cloud Foundry. I am using the Shell application from my local machine.
https://dataflow.spring.io/docs/installation/cloudfoundry/cf-cli/
I also have the HTTP source and RabbitMQ sink applications deployed in CF, and the RabbitMQ service is bound to the sink application.
My question: how can I create a stream using applications deployed in CF? Registering an app requires an HTTP/File/Maven URI, but I am not sure how an app already deployed on CF can be registered.
Appreciate your help. Please let me know if more details are needed.
Thanks
If you're using the out-of-the-box apps that we ship, the relevant Maven repo configuration is already set within SCDF, so you can deploy the http app right away; SCDF will resolve it, pull it from the Spring Maven repository, and then deploy that application to CF.
However, if you're building custom apps, you can configure your internal/private Maven repositories in SCDF/Skipper and then register your apps using the coordinates from your internal repo.
If Maven is not a viable solution for you on CF, I have seen customers resolve artifacts from S3 buckets and persistent-volume services in CF.
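For example, a custom app published to an internal Maven repository could be registered programmatically through the Data Flow REST client (a minimal sketch; the server URL and Maven coordinates below are placeholders):

import java.net.URI;

import org.springframework.cloud.dataflow.core.ApplicationType;
import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;

// Point the REST client at the SCDF server route on CF (placeholder URL).
DataFlowTemplate dataFlow =
        new DataFlowTemplate(URI.create("https://my-dataflow-server.example.com"));
// Register a custom source app by its Maven coordinates; Data Flow resolves
// the artifact from the configured Maven repositories when the stream deploys.
dataFlow.appRegistryOperations().register(
        "my-http",                                   // app name
        ApplicationType.source,                      // app type
        "maven://com.example:my-http-source:1.0.0",  // artifact URI
        null,                                        // no metadata artifact
        false);                                      // don't force-overwrite

The same registration can of course be done from the Shell with app register; the URI identifies the artifact to resolve at deployment time, not a running CF application.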

Spring Cloud Data Flow java DSL: get logs for stream components

I'm using SCDF 2.5.1 deployed locally via docker-compose, and my software sends commands to the SCDF server via the Java DSL.
Let's say I create a stream such that
file > :queue
:queue > ftp
where file and ftp are docker deployed apps.
My question is, how can I get the logs for file and ftp?
So far the closest thing I've come up with is
import java.util.Collections;
import java.util.Map;
import org.springframework.cloud.dataflow.rest.resource.AppInstanceStatusResource;

// Drill down from the stream's runtime status to the attributes of the
// first instance of the named app.
Map<String, String> attributes = scdf.runtimeOperations().streamStatus(streamName).getContent()
        .stream().flatMap(stream -> stream.getApplications().getContent().stream()
                .filter(app -> app.getName().equals(appName))
                .flatMap(appStatus -> appStatus.getInstances().getContent().stream()
                        .map(AppInstanceStatusResource::getAttributes)))
        .findFirst().orElse(Collections.emptyMap());
String logLocation = attributes.get("stdout");
and then mounting logLocation and reading it as a file.
Is there a more elegant solution?
The Java DSL (and consequently the SCDF REST client) doesn't have a log-retrieval operation as part of its REST operations. There is, however, a REST endpoint on the SCDF server that you can hit to get the logs of a stream.
If you would like to contribute, you can submit a proposal/PR here: https://github.com/spring-cloud/spring-cloud-dataflow/pulls
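As a sketch of that approach, assuming the stream-logs endpoint introduced in recent SCDF releases (the exact path is my assumption based on SCDF 2.4+ and should be verified against your server version):

import org.springframework.web.client.RestTemplate;

// Fetch the aggregated logs of all apps in the stream straight from the
// SCDF server instead of mounting the stdout file.
RestTemplate rest = new RestTemplate();
String logs = rest.getForObject(
        "http://localhost:9393/streams/logs/{streamName}", String.class, streamName);
System.out.println(logs);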

Serving Swagger-UI in JHipster servers and microservices

JHipster servers and microservices don't serve their own Swagger UI. Instead, the microservice gateway aggregates all api-docs and serves them in one place, using a custom Swagger UI within the Angular frontend.
But when using standalone JHipster servers or microservices, there is no frontend and thus no Swagger-UI at all.
In those instances, how can you force the server or microservice to serve its own Swagger-UI?
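One possible approach (a sketch, not an official JHipster recipe) is to have the standalone service serve the stock swagger-ui webjar itself, assuming the org.webjars:swagger-ui dependency has been added to the classpath:

import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

// Minimal sketch: expose the swagger-ui webjar contents under /swagger-ui/
// so the standalone service can render its own api-docs. The UI still needs
// to be pointed at the service's api-docs endpoint (e.g. via its url
// query parameter); paths here are assumptions, not JHipster defaults.
@Configuration
public class SwaggerUiConfiguration implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/swagger-ui/**")
                .addResourceLocations("classpath:/META-INF/resources/webjars/swagger-ui/");
    }
}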

Spring Cloud Dataflow: how to persist stream definitions

I am using the local server of spring cloud dataflow. Each time I restart the server, all deployed apps and stream definitions are lost. How can I persist my stream definitions so that they survive server restarts?
As of RC1, the stream/task/job definitions, among other metadata, can be configured to persist in an RDBMS, and there's support for many of the commonly used databases. If nothing is provided, the default embedded H2 database is used, which is in-memory and recommended only for development purposes.
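For example, the local server could be started against MySQL roughly like this (a sketch; the JDBC URL, credentials, and jar name are placeholders):

java -jar spring-cloud-dataflow-server-local-<version>.jar \
    --spring.datasource.url=jdbc:mysql://localhost:3306/dataflow \
    --spring.datasource.username=scdf \
    --spring.datasource.password=secret \
    --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver

With a persistent datasource configured, the stream definitions survive server restarts; deployed apps still need to be redeployed.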
