Spring Cloud Data Flow in OpenShift - spring-cloud-dataflow

I want to deploy SCDF in OpenShift. Are the following components compatible with OpenShift?
Spring Cloud Data Flow Server
Skipper Server.

SCDF and Skipper are compatible with upstream Kubernetes. OpenShift and other K8s distributions should work as well, assuming they themselves stay compatible with upstream.
Here's the latest K8s version compatibility chart; there's an update to this chart via: spring-io/dataflow.spring.io#359.
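As a rough sketch, assuming the standard SCDF-for-Kubernetes deployment manifests (the file names below are placeholders, not the real artifact names), the same manifests used on upstream K8s can usually be applied on OpenShift with the oc CLI:
# create a project (OpenShift's equivalent of a namespace) and apply the upstream manifests
oc new-project scdf
oc apply -f skipper-deployment.yaml -f skipper-svc.yaml
oc apply -f server-deployment.yaml -f server-svc.yaml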

Related

Spring Cloud Data Flow Stream Deployment to Cloud Foundry

I am new to Spring Cloud Data Flow. I am trying to build a simple stream with an http source and a RabbitMQ sink using the SCDF stream apps. The stream should be deployed on OSCF (Cloud Foundry). Once deployed, the stream should be able to receive an HTTP POST request and send the request data to RabbitMQ.
So far, I have downloaded the Data Flow Server using the link below and pushed it to Cloud Foundry. I am using the Shell application from my local machine.
https://dataflow.spring.io/docs/installation/cloudfoundry/cf-cli/.
I also have the HTTP source and RabbitMQ sink applications deployed in CF. The RabbitMQ service is also bound to the sink application.
My question: how can I create a stream using applications already deployed in CF? Registering an app requires an HTTP/File/Maven URI, but I am not sure how an app that is already deployed on CF can be registered.
Appreciate your help. Please let me know if more details are needed.
Thanks
If you're using the out-of-the-box apps that we ship, the relevant Maven repo configuration is already set within SCDF, so you can deploy the http app right away; SCDF will resolve and pull it from the Spring Maven repository and then deploy the application to CF.
However, if you're building custom apps, you can configure your internal/private Maven repositories in SCDF/Skipper and then register your apps using the coordinates from your internal repo.
If Maven is not a viable option for you on CF, I have also seen customers resolve artifacts from S3 buckets and persistent-volume services in CF.
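For illustration, here's a minimal sketch of registering the out-of-the-box http source from the SCDF shell and of pointing SCDF/Skipper at an internal Maven repository; the version, repository name, and URL are placeholders:
# register the out-of-the-box http source by its Maven coordinates (version is illustrative)
dataflow:> app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.0.RELEASE
# point SCDF/Skipper at an internal Maven repository (name and URL are placeholders)
maven.remote-repositories.mycorp.url=https://nexus.example.com/repository/maven-releases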

Stream apps not using the buildpack provided in SCDF server environment variable (SCDF ver 2.1.2)

Recently, I upgraded from SCDF 1.7.3 to SCDF 2.1.2 for Cloud Foundry. I am also using Skipper (required with 2.x). There are two main problems I am facing:
The buildpack given as a property in the SCDF server environment is not being used to deploy stream applications. The following is the env key that I am using:
SPRING_CLOUD_DATAFLOW_STREAM_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_BUILDPACK. This has no effect at all.
Even though I set SPRING_CLOUD_DATAFLOW_STREAM_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_ENABLE_RANDOM_APP_NAME_PREFIX to false, Skipper still generates a random prefix for these applications.
I am not sure what I am doing wrong. Any advice would be of great help.
There are no stream platform properties with the prefix SPRING_CLOUD_DATAFLOW_STREAM_PLATFORM_CLOUDFOUNDRY in Spring Cloud Data Flow, as stream deployments are managed by Spring Cloud Skipper. Hence, you need to use the Skipper properties for stream deployment-related configuration.
The correct properties to use in this case are:
SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_ENABLERANDOMAPPNAMEPREFIX: false
SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_BUILDPACK:
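As an illustration, here is how these could be wired into the Skipper server's manifest.yml when pushing it to CF; the account name xxx and the buildpack value are placeholders:
applications:
- name: skipper-server
  env:
    # note the SKIPPER prefix; SCDF-prefixed stream platform properties are ignored
    SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_ENABLERANDOMAPPNAMEPREFIX: false
    SPRING_CLOUD_SKIPPER_SERVER_PLATFORM_CLOUDFOUNDRY_ACCOUNTS[xxx]_DEPLOYMENT_BUILDPACK: java_buildpack_offline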

How to integrate RabbitMQ authentication (using MQTT) with OAuth2

I want to use OAuth2 to authenticate and authorize the MQTT protocol in RabbitMQ.
I found the rabbitmq-auth-backend-oauth2 plugin, which serves this purpose.
However, I cannot install this plugin on the RabbitMQ server.
OS: CentOS
RabbitMQ 3.7.14
Erlang 21
I tried to install the plugin with the following command, but it always fails:
make run-broker RABBITMQ_CONFIG_FILE=demo/symmetric_keys/rabbitmq
Please let me know the deployment model for this integration and the configuration for RabbitMQ, as well as how to install that plugin (if needed).
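(For reference only, a sketch rather than a verified answer for 3.7.x: once the plugin's .ez archives are built and copied into the broker's plugins directory, enabling it follows the standard rabbitmq-plugins flow; the path below is the typical RPM-install location and may differ on your system.)
# copy the built plugin archive into the broker's plugins directory (path may differ per install)
cp rabbitmq_auth_backend_oauth2-*.ez /usr/lib/rabbitmq/lib/rabbitmq_server-3.7.14/plugins/
# enable the plugin, then restart the broker
rabbitmq-plugins enable rabbitmq_auth_backend_oauth2
systemctl restart rabbitmq-server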

Spring Cloud Data Flow Local Server + Skipper Server: Error after undeploying streams

I am trying to manage my streams on Spring Cloud Data Flow with the Skipper server.
I followed the instruction here:
https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#getting-started-deploying-spring-cloud-dataflow
The app registration and stream definition/deployment go quite well, but after I undeploy the deployed stream, I can't see any streams on the dashboard anymore.
The dashboard shows an error instead:
Could not parse Skipper Platform Status JSON:null
I have to restart the SCDF server and the Skipper server in order to see my stream definitions again.
The versions of the components are:
scdf local server: 1.6.0.RELEASE
skipper server: 1.0.8.RELEASE
metrics collector: kafka-10-1.0.0.RELEASE
Some operation details:
I registered my apps using the SCDF shell in Skipper mode.
I defined and deployed my stream on the SCDF dashboard. I also undeployed the stream via the stop button on the dashboard.
How should I solve this problem?
We have recently observed this on our side, too, and it has been fixed! [see spring-cloud/spring-cloud-dataflow#2361]
We are preparing for a 1.6.1 GA release, but in the meantime, please feel free to pull the 1.6.1.BUILD-SNAPSHOT from Spring repo and give it a go.
If you see any other anomaly, it'd be great to have a bug report on the matter.
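If it helps, here is a minimal sketch of trying the snapshot locally; the jar name follows the 1.6.x local-server naming and the snapshot is published to the Spring snapshot repository (repo.spring.io), both of which you should verify:
# after downloading the snapshot jar from the Spring snapshot repo, start it locally
java -jar spring-cloud-dataflow-server-local-1.6.1.BUILD-SNAPSHOT.jar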

Access Parse Server Dashboard using Bitnami VM

I have used Bitnami VM to deploy Parse Server on Azure but I cannot seem to be able to access Parse Server Dashboard. What URL is it available on? Do I need to open any ports?
Just an update on this. A new version of Parse Server provided by Bitnami is now available in the Azure Marketplace. The new version does include the Dashboard.
Have you been following the Bitnami instructions?
They state you can access the dashboard using this URL: http://[server-IP-address]/parse
This means only TCP port 80 needs to be open (on your Network Security Group if you use one, or in your VM ACL if you don't).
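For example, a sketch of opening port 80 with the Azure CLI, assuming the VM sits behind an NSG; the resource group and NSG names are placeholders:
# allow inbound HTTP on TCP/80 (resource group and NSG names are placeholders)
az network nsg rule create --resource-group my-rg --nsg-name my-nsg --name allow-http --priority 100 --direction Inbound --access Allow --protocol Tcp --destination-port-ranges 80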
You now have other (probably easier) options to deploy Parse Server on Azure:
using a dedicated ARM template leveraging Azure services (App Service, DocumentDB, Notification hub, ...).
using Azure App Service with the original Facebook/Parse version with MongoDB.
I've got it. The Bitnami folks were kind enough to reply to me on this topic:
You can launch the latest Parse version that ships the Dashboard from https://vmdepot.msopentech.com/Vhd/Show?vhdId=64574&version=66817. It could take some time to become available in the Azure Marketplace.
So, bottom line: use the image from VM Depot and not the one on the Azure Marketplace, as the latter is an old one and doesn't include the Dashboard.
