Spring Cloud composed task events - spring-cloud-dataflow

I'm after a composed-task execution listener that publishes events to middleware, basically the same behavior as documented for a custom task here.
Is there any way to enable this feature for composed tasks run via the SCDF REST API?
Thanks
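
For context, the per-task behavior being referenced can be wired up with Spring Cloud Task's TaskExecutionListener; below is a minimal sketch, in which MessagePublisher is a hypothetical stand-in for whatever middleware client (Rabbit, Kafka, etc.) is in play:

```java
import org.springframework.cloud.task.listener.TaskExecutionListener;
import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.stereotype.Component;

// Hypothetical middleware abstraction; swap in your actual client.
interface MessagePublisher {
    void send(String destination, String payload);
}

@Component
public class TaskEventPublisher implements TaskExecutionListener {

    private final MessagePublisher publisher;

    public TaskEventPublisher(MessagePublisher publisher) {
        this.publisher = publisher;
    }

    @Override
    public void onTaskStartup(TaskExecution execution) {
        publisher.send("task-events", "startup: " + execution.getTaskName());
    }

    @Override
    public void onTaskEnd(TaskExecution execution) {
        publisher.send("task-events", "end: " + execution.getTaskName());
    }

    @Override
    public void onTaskFailed(TaskExecution execution, Throwable throwable) {
        publisher.send("task-events", "failed: " + execution.getTaskName());
    }
}
```

The open question above is whether SCDF attaches anything equivalent to the composed-task runner when it is launched through the REST API.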

Related

Creating a Spring Cloud Data Flow Task in PCF using the UI

I am using the following Spring doc and trying to create a Task using the UI in PCF; however, the UI is not available in my PCF. How do I enable the UI on PCF?
- https://dataflow.spring.io/docs/batch-developer-guides/getting-started/task/
You may want to update the description with the version of SCDF in use and how you have provisioned SCDF on PCF.
That said, it could be that you have explicitly disabled the Task feature toggle, so please verify the feature toggles either from the About tab on the Dashboard or from SCDF's http://<SCDF_CF_ROUTE>/about endpoint.
You can learn more about the feature-toggles from the reference guide.
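
For a quick check outside the browser, the /about endpoint can be queried directly; a minimal sketch using Java's built-in HTTP client, with the SCDF route below as a placeholder for your own:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FeatureToggleCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder route; substitute your SCDF route on PCF.
        String aboutUrl = "https://my-scdf.example.com/about";
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(aboutUrl)).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body lists the feature toggles (e.g. a tasksEnabled flag).
        System.out.println(response.body());
    }
}
```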

Enqueue a task in Google TaskQueue / Cloud Tasks from a Dataflow Pipeline

I need to read in a GCS file of 750K records.
For each record I need to compare it to a corresponding record in Google Datastore. If the record from the file does not match the record in Datastore, I need to update the Datastore record and enqueue a Taskqueue task.
The part I'm stuck on is launching this taskqueue task.
The only way seems to be via Google Cloud Tasks' HTTP API (https://cloud.google.com/tasks/docs/creating-http-target-tasks), but issuing an HTTP call from within a DoFn feels inefficient.
I looked into using Pub/Sub for the task since Dataflow has an adapter for it, but you can only use Pub/Sub in streaming pipelines.
Yes, Beam doesn't seem to have a dedicated IO connector for Cloud Tasks, so creating the tasks from inside a DoFn is the practical option; you can use the Cloud Tasks client library rather than hand-rolling the HTTP calls.
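
A minimal sketch of such a DoFn, assuming the google-cloud-tasks v2 client library; the project, location, queue name, and handler URL are placeholders for your own values:

```java
import com.google.cloud.tasks.v2.CloudTasksClient;
import com.google.cloud.tasks.v2.HttpMethod;
import com.google.cloud.tasks.v2.HttpRequest;
import com.google.cloud.tasks.v2.QueueName;
import com.google.cloud.tasks.v2.Task;
import com.google.protobuf.ByteString;
import org.apache.beam.sdk.transforms.DoFn;

// Enqueues one Cloud Task per mismatched record.
public class EnqueueTaskFn extends DoFn<String, Void> {

    private transient CloudTasksClient client;

    @Setup
    public void setup() throws Exception {
        // Created once per DoFn instance and reused across bundles.
        client = CloudTasksClient.create();
    }

    @ProcessElement
    public void processElement(@Element String payload) {
        QueueName queue = QueueName.of("my-project", "us-central1", "my-queue");
        Task task = Task.newBuilder()
                .setHttpRequest(HttpRequest.newBuilder()
                        .setUrl("https://example.com/task-handler") // hypothetical handler
                        .setHttpMethod(HttpMethod.POST)
                        .setBody(ByteString.copyFromUtf8(payload))
                        .build())
                .build();
        client.createTask(queue, task);
    }

    @Teardown
    public void teardown() {
        if (client != null) {
            client.close();
        }
    }
}
```

Creating the client in @Setup and closing it in @Teardown reuses the underlying gRPC channel across bundles, which removes most of the per-element overhead that makes this feel inefficient.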

Spring Cloud Dataflow REST API: deploying Spring Batch-specific REST API and Console standalone?

I need a Spring Batch Admin-like application to embed in my own SB-powered Spring Boot application.
The Spring website says it's deprecated and has been moved to the Spring Attic. They recommend making use of the Spring Cloud Data Flow Console.
I investigated this, and it appears that there is a lot of additional functionality I don't need -- all I want to do is inspect and retry batch job executions.
Is there a means of getting only this functionality, short of carving the Jobs controllers out of the REST API implementation and building my own admin screens?
Yes, it is possible; however, you'd still have to use SCDF to gain access to the REST APIs.
Once you have SCDF running, you get access to the Task- and Batch-job-specific REST endpoints, which you can use in your custom dashboard tooling.
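
For example, the job-execution endpoints can be consumed with any HTTP client; a minimal sketch against a local SCDF server, assuming the default base URL and the restart parameter described in SCDF's REST API guide:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class JobAdminClient {

    private static final String SCDF_BASE = "http://localhost:9393"; // placeholder
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    // Inspect batch job executions (the data behind custom admin screens).
    static String listJobExecutions() throws Exception {
        HttpRequest req = HttpRequest
                .newBuilder(URI.create(SCDF_BASE + "/jobs/executions"))
                .GET().build();
        return HTTP.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }

    // Retry a failed batch job execution by its id.
    static int restartJobExecution(long executionId) throws Exception {
        HttpRequest req = HttpRequest
                .newBuilder(URI.create(
                        SCDF_BASE + "/jobs/executions/" + executionId + "?restart=true"))
                .PUT(HttpRequest.BodyPublishers.noBody())
                .build();
        return HTTP.send(req, HttpResponse.BodyHandlers.ofString()).statusCode();
    }
}
```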

Spring Cloud Data Flow does not show Spring Cloud Task execution details

The Spring Cloud Data Flow documentation mentions:
When executing tasks externally (i.e. from the command line) and you wish for Spring Cloud Data Flow to show the TaskExecutions in its UI, be sure that common datasource settings are shared between the two. By default Spring Cloud Task uses a local H2 instance, and the execution will not be recorded to the database used by Spring Cloud Data Flow.
I am new to Spring Cloud Data Flow and Spring Cloud Task. Can somebody help me set up a common datasource for both? For development purposes I'm using the embedded H2 database. Can I use the embedded one to see task execution details in Spring Flo/Dashboard?
A common datasource must be shared between Spring Cloud Data Flow (SCDF) and your Spring Cloud Task (SCT) applications in order to track and monitor task executions. If the datasource is not shared, SCDF and each SCT application default to their own individual H2 database, and because they are separate databases, SCDF has no visibility into the execution history of the SCT microservice applications.
Make sure to supply common DB properties to both. In your case, you can supply the same H2 DB properties; it is as simple as Spring Boot datasource property overrides, as shown below.
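
For example, the local SCDF server exposes its embedded H2 instance over TCP, so an externally launched task can record into the same database with plain datasource overrides (the URL below is the local server's documented default; adjust host and port if yours differs):

```properties
# Shared datasource for the task application (same values SCDF itself uses)
spring.datasource.url=jdbc:h2:tcp://localhost:19092/mem:dataflow
spring.datasource.username=sa
spring.datasource.password=
spring.datasource.driver-class-name=org.h2.Driver
```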

How can I order a job in Control-M using a message queue?

I am trying to find a way to order a Control-M job via a message from an external application. We are using Control-M v8. We are able to send messages to the queue, but we have been unsuccessful in receiving messages that perform some sort of action in Control-M.
Erick, look at the documentation for the Control-M Business Process Integration Suite Manual. This suite provides the capability that you are looking for.
Our application back-end is on Unix, and we use Control-M's built-in utilities to call jobs from Unix. The jobs should be created in Control-M Desktop and uploaded to the Control-M database without any specific schedule. A utility called 'ctmorder' can then be used to call these jobs as and when required.
