Spring Cloud Data Flow Stream Deployment to Cloud Foundry - spring-cloud-dataflow

I am new to Spring Cloud Data Flow. I am trying to build a simple stream with an HTTP source and a RabbitMQ sink using SCDF stream apps. The stream should be deployed on OSCF (Cloud Foundry). Once deployed, the stream should be able to receive HTTP POST requests and send the request data to RabbitMQ.
So far, I have downloaded the Data Flow Server using the link below and pushed it to Cloud Foundry. I am using the Shell application from my local machine.
https://dataflow.spring.io/docs/installation/cloudfoundry/cf-cli/
I also have HTTP source and RabbitMQ sink applications deployed in CF. A RabbitMQ service is also bound to the sink application.
My question: how can I create a stream using applications that are already deployed in CF? Registering an app requires an HTTP/File/Maven URI, but I am not sure how an app already deployed on CF can be registered.
Appreciate your help. Please let me know if more details are needed.
Thanks

If you're using the out-of-the-box apps that we ship, the relevant Maven repo configuration is already set within SCDF, so you can deploy the http app right away; SCDF will resolve and pull it from the Spring Maven repository and then deploy that application to CF.
However, if you're building custom apps, you can configure your internal/private Maven repositories in SCDF/Skipper and then register your apps using the coordinates from your internal repo.
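As a concrete sketch, registration and stream creation from the SCDF shell could look like this. The custom app name and Maven coordinates below are hypothetical placeholders for an app in an internal repo, not real artifacts; the rabbit sink coordinates just follow the App Starters naming convention, so verify the exact version against the release train you're on:

app register --name my-http --type source --uri maven://com.example.streams:my-http-source:1.0.0
app register --name rabbit --type sink --uri maven://org.springframework.cloud.stream.app:rabbit-sink-rabbit:1.3.1.RELEASE
stream create --name http-to-rabbit --definition "my-http | rabbit"
stream deploy --name http-to-rabbit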
If Maven is not a viable solution for you on CF, I have seen customers resolve artifacts from S3 buckets and persistent-volume services in CF.

Related

Comparison StreamPipes vs Spring Cloud Dataflow

I'm comparing Apache StreamPipes and SCDF (Spring Cloud Dataflow).
I found out that there are some similarities:
Components of the stream are executed as microservices via wrappers (Flink/standalone)
Internally, a message broker is used to automatically create the required topics and connect the pipeline components through them
I found nothing about support for using Kubernetes as an execution engine. Is something planned for the future? Does anyone know of other differences/similarities?

Access Pivotal SSO tile in local development

Our OPS team has configured an SSO tile that connects to ADFS. I am building a sample application that utilizes an SSO service instance. I can deploy my application to PCF and remotely debug my SSO configuration. These things work.
What I need is a way to access the SSO service instance while I am developing on my PC. Otherwise, the only way to verify that my code really works is to deploy my application to PCF and either add log statements or configure remote debugging. Both of these are pretty time consuming.
I looked into configuring SSH access to Pivotal services. That works for database service instances, but not for the SSO service instance. Has anyone figured it out?
After repeated trial and error, I found the solution. Posting it here in case someone else has a similar issue.
In PCF, add a new application for your SSO plan. The auth redirect URL for this application should point to your localhost. In my case it is http://localhost:8080.
Run cf env <your-app-name>. Copy the p-identity section only and save it to vcap_services.json. Then update the clientId and clientSecret with the values from the new application created in the previous step.
Use the following command to start your application:
VCAP_APPLICATION=true VCAP_SERVICES=$(cat vcap_services.json) SPRING_PROFILES_ACTIVE=... ./gradlew bootRun
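For reference, here is a rough sketch of what the trimmed vcap_services.json could look like. The structure and key names are based on how the p-identity tile typically exposes credentials and may differ in your environment; all values are placeholders, so match whatever cf env actually prints (the keys may be client_id/client_secret rather than clientId/clientSecret depending on the tile version):

{
  "p-identity": [
    {
      "label": "p-identity",
      "name": "my-sso-instance",
      "credentials": {
        "auth_domain": "https://your-plan.login.example.com",
        "client_id": "client-id-from-the-new-application",
        "client_secret": "client-secret-from-the-new-application"
      }
    }
  ]
}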

Spring Cloud Data Flow Local Server + Skipper Server: Error after undeploying streams

I am trying to manage my streams on Spring Cloud Data Flow with the Skipper server.
I followed the instruction here:
https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#getting-started-deploying-spring-cloud-dataflow
The app registration and stream definition/deployment go quite well, but after I undeploy the deployed stream, I can't see any streams on the dashboard any more.
The dashboard shows an error instead:
Could not parse Skipper Platform Status JSON:null
I have to restart the SCDF server and the Skipper server in order to see my stream definitions again.
The version of the components are:
scdf local server: 1.6.0.RELEASE
skipper server: 1.0.8.RELEASE
metrics collector: kafka-10-1.0.0.RELEASE
Some operation details:
I registered my apps using the SCDF shell in skipper mode.
I defined and deployed my stream on the SCDF dashboard. I undeployed the stream via the stop button on the dashboard, too.
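For context, the shell-side equivalent of those operations looks roughly like this (the stream name, definition, and app coordinates are illustrative placeholders, not the ones actually used):

dataflow:> app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-kafka:1.3.1.RELEASE
dataflow:> stream create --name my-stream --definition "http | log"
dataflow:> stream deploy --name my-stream
dataflow:> stream undeploy --name my-stream

It is after the last step (or its stop-button equivalent on the dashboard) that the JSON parse error appears.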
How should I solve this problem?
We have recently observed this on our side, too, and it has been fixed! [see spring-cloud/spring-cloud-dataflow#2361]
We are preparing for a 1.6.1 GA release, but in the meantime, please feel free to pull the 1.6.1.BUILD-SNAPSHOT from Spring repo and give it a go.
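For anyone who wants to try it, pulling and running the snapshot locally would look something like this. The artifact path below follows the usual Spring snapshot repository layout and is an assumption; verify it against repo.spring.io before relying on it:

wget https://repo.spring.io/libs-snapshot/org/springframework/cloud/spring-cloud-dataflow-server-local/1.6.1.BUILD-SNAPSHOT/spring-cloud-dataflow-server-local-1.6.1.BUILD-SNAPSHOT.jar
java -jar spring-cloud-dataflow-server-local-1.6.1.BUILD-SNAPSHOT.jar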
If you see any other anomaly, it'd be great to have a bug report on the matter.

Access Parse Server Dashboard using Bitnami VM

I have used Bitnami VM to deploy Parse Server on Azure but I cannot seem to be able to access Parse Server Dashboard. What URL is it available on? Do I need to open any ports?
Just an update on this. A new version of Parse Server provided by Bitnami is now available in the Azure Marketplace. The new version does include the Dashboard.
Have you been following the Bitnami instructions?
It states you can access the dashboard using this URL: http://[server-IP-address]/parse
This means only the TCP/80 port needs to be open (in your Network Security Group if you use one, or in your VM ACL if you don't).
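If the port is not already open, a hedged example of adding the rule with the Azure CLI (the resource group and NSG names are placeholders for your own):

az network nsg rule create --resource-group my-rg --nsg-name my-nsg \
  --name allow-http --priority 100 --direction Inbound --access Allow \
  --protocol Tcp --destination-port-ranges 80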
You have now other (probably easier) options to deploy Parse Server on Azure:
using a dedicated ARM template leveraging Azure services (App Service, DocumentDB, Notification hub, ...).
using Azure App Service with the original Facebook/Parse version with MongoDB.
I've got it. The Bitnami guys were kind enough to reply to me on this topic:
You can launch the latest Parse version that ships with the Dashboard from https://vmdepot.msopentech.com/Vhd/Show?vhdId=64574&version=66817 It could take some time to become available in the Azure Marketplace.
So, bottom line: use the image from VM Depot and not the one in the Azure Marketplace, as the latter is an old one and doesn't include the Dashboard.

Deploy Grails app to AppFog using Eclipse CloudFoundry plugin

I'm trying to deploy my Grails application to AppFog using the Cloud Foundry plugin (ver. 1.1) in SpringSource Tool Suite (STS 2.9.2).
I'm using https://api.appfog.com as server address and MYAPPNAME.aws.af.cm for application address when deploying app.
The application is pushed and started, and services are bound, but after that I receive an error saying:
Communication with server failed: I/O error: Server returned HTTP response code: 405 for URL: https://api.appfog.com/apps/MYAPPNAME/application
Also, when I try to create a Caldecott tunnel to the database, I receive the response "The URIs: caldecott-85393a.appfog.com have already been taken or reserved. (404 Not Found)", which I also saw when I (by mistake) tried to deploy the application to MYAPPNAME.api.appfog.com (the default AF name instead of the particular infrastructure address).
I suppose that the Cloud Foundry plugin uses the default server address to reach the application and also tries to create the Caldecott tunnel on the default server address (caldecott-85393a.appfog.com instead of caldecott-85393a.aws.af.cm).
Does anybody have an idea how to circumvent this situation?
BR
Zoran
This doesn't sound like an issue with the plugin itself but with the response coming back from AppFog's cloud. I would take this up as an issue with them and confirm that you can use the plugin with their cloud.
AppFog recently had issues with their Java deployments that were specifically affecting Grails apps. This has been resolved and should be working seamlessly as expected now. You can always reach out for more information in the active Google group as well: https://groups.google.com/forum/#!forum/appfog-users
