I am trying to create a workflow solution in ActiveVOS that connects to Informatica P360. I am trying to create two BPEL processes, one for the actual workflow and another one that performs a specific task. How do I invoke the second BPEL from the main process?
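In ActiveVOS, the usual pattern is to expose the task BPEL as a service (or as an ActiveVOS subprocess) and call it from the main process with an invoke activity through a partner link; ActiveVOS can bind this locally so there is no network hop. A minimal standard-BPEL sketch, where all names (partner link type, role, operation, variables) are hypothetical:

```xml
<!-- Main process: partner link bound to the task process's service -->
<bpel:partnerLink name="taskProcessPL"
                  partnerLinkType="task:TaskProcessPLT"
                  partnerRole="taskProvider"/>

<!-- Later in the flow: synchronous call into the second BPEL -->
<bpel:invoke name="InvokeTaskProcess"
             partnerLink="taskProcessPL"
             operation="performTask"
             inputVariable="taskRequest"
             outputVariable="taskResponse"/>
```

The task process needs a matching receive/reply pair on its side of the partner link for the synchronous call to complete.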
Let's say we have a REST app with its REST end-points that we can wrap in a Docker container.
Let's say our Spring batch's Item Processor likes to use the above app to get business logic information about a record it is handling.
We'd like this REST app to be used solely by the Spring Batch process: the batch process shouldn't talk to the production REST app, but should instead have its own instance of it.
We'd also like this app instance to be created automatically when the Spring Batch process starts, so no extra human intervention or configuration is needed.
Is it possible for the Spring Batch job to run and use the REST service as a Docker container (maybe as a Testcontainer, or via Docker Compose?) so it can use it "internally" in some way?
Is this a reasonable architecture?
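To make the processor-calls-sidecar idea concrete, here is a stdlib-only Java sketch (no Spring or Testcontainers dependency; the endpoint path, class names, and enrichment logic are all made up for illustration). The dockerized "REST app" is stood in for by a tiny local HTTP server, and the processor enriches each record with the service's response:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch of an "item processor" that enriches each record via a REST call
// to a private side-car service (here simulated with a local HTTP server).
public class EnrichingProcessor {
    private final HttpClient client = HttpClient.newHttpClient();
    private final String baseUrl;

    public EnrichingProcessor(String baseUrl) { this.baseUrl = baseUrl; }

    // Analogue of ItemProcessor.process(item): ask the service about the record.
    public String process(String record) throws Exception {
        HttpRequest req = HttpRequest
                .newBuilder(URI.create(baseUrl + "/classify?record=" + record))
                .build();
        HttpResponse<String> resp = client.send(req, HttpResponse.BodyHandlers.ofString());
        return record + ":" + resp.body();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the dockerized REST app: bind an ephemeral port.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/classify", exchange -> {
            byte[] body = "OK".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();
        String base = "http://localhost:" + server.getAddress().getPort();
        System.out.println(new EnrichingProcessor(base).process("rec1")); // prints rec1:OK
        server.stop(0);
    }
}
```

In the real setup the `baseUrl` would point at the container started for the job (Testcontainers' `GenericContainer` can hand you the mapped host/port at runtime), but the processor code stays the same.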
I would recommend that you:
create a custom Docker network and attach both your Spring Batch app container and your REST app container to it. These two containers will only be visible on that network.
point your Spring Batch app at the REST app container's hostname on that network, so that the item processor can reach the REST API endpoint and make calls
You can find the official tutorial about how to create custom docker networks here.
Hope this helps.
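As a concrete sketch of that setup, a Docker Compose file can declare the private network and attach both containers to it (image names and the port are hypothetical). The batch container reaches the REST app by service name over Docker's internal DNS, and nothing is published to the host:

```yaml
# docker-compose.yml sketch: both services share a private bridge network
version: "3.8"
services:
  rest-app:
    image: myorg/rest-app:latest      # hypothetical image, no host ports exposed
    networks: [batch-net]
  batch-job:
    image: myorg/spring-batch-job:latest
    environment:
      REST_APP_URL: http://rest-app:8080   # resolved via Docker DNS on batch-net
    depends_on: [rest-app]
    networks: [batch-net]
networks:
  batch-net:
    driver: bridge
```

`docker compose up` then starts the side-car automatically with the batch job, which matches the "no extra human intervention" requirement.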
I'm after the composed task execution listener that publishes events to middleware: basically, the same behavior as documented for custom tasks here.
Is there any way to enable this feature for composed tasks run via the SCDF REST API?
Thanks
I'm working on a POC to automate downstream processes in external systems based on JIRA workflows, and I've hit a wall with the API. It appears to have great support for pulling ticket data out of JIRA and for creating tickets in JIRA from external systems.
However, I don't see how to trigger external calls as part of my workflows. For example, if a ticket should be prevented from being routed to the next stage of a workflow until a database has been checked to confirm inventory is available, how could I do that in JIRA?
Upon final completion of the workflow, based on attributes in the JIRA ticket, we'd like to send a JMS or REST message, or possibly update an external database. Is this possible?
Thanks all in advance for the help!
If you want to do a "before" check, use a Validator on the Workflow Transition.
I strongly suggest deploying the (free) Script Runner add-on. There you can implement a ton of things. For example, you'll get a new validator option "Script Validator", where you can specify a Groovy script that decides if it lets through the transition or aborts it.
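As an illustration of what such a Script Validator might look like (the custom field name, JDBC details, and inventory table are all hypothetical; this is a sketch, not a drop-in script), the Groovy script returns true to allow the transition and false to block it:

```groovy
import com.atlassian.jira.component.ComponentAccessor
import groovy.sql.Sql

// Hypothetical custom field holding the quantity the ticket requests.
def cf = ComponentAccessor.customFieldManager.getCustomFieldObjectByName("Requested Quantity")
def requested = (issue.getCustomFieldValue(cf) ?: 0) as int

// Hypothetical inventory database; a falsy result blocks the transition.
def sql = Sql.newInstance("jdbc:postgresql://inventory-db/stock", "user", "secret",
                          "org.postgresql.Driver")
try {
    def row = sql.firstRow("select available from inventory where sku = ?", [issue.summary])
    return row != null && (row.available as int) >= requested
} finally {
    sql.close()
}
```

For the "on final completion" side of the question, the same add-on offers post functions and event listeners, which is where an outbound REST or JMS call would live.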
I am trying to find a way to order a Control-M job via a message from an external application. We are using Control-M v8. We are able to send messages to the queue, but we have been unsuccessful in receiving messages that perform some sort of action in Control-M.
Erick, look at the documentation for the Control-M Business Process Integration Suite Manual. This suite provides the capability that you are looking for.
We have our application back end on Unix, and we use Control-M's built-in utilities to call jobs from Unix. The jobs are created in the Control-M desktop client and uploaded to the Control-M database without any specific schedule. A utility called 'ctmorder' can then be used to order these jobs as and when required.
We have an existing method in an existing controller that we'd like to call at a specific schedule (eg "daily at 2am"). The application is an MVC3 application running on Azure as a web role and we don't want to create, maintain and pay for an entire new role (worker role) just to run one small piece of identical logic.
Is it possible to schedule a controller method to fire at a specific time in the future? Also, would the same technique work in regular ASP.NET Web Forms?
Assuming you can just call this controller action with a URL, you can just...
1) create a PowerShell script to "ping" the website:
http://learn-powershell.net/2011/02/11/using-powershell-to-query-web-site-information/
2) Schedule that PowerShell script in a Scheduled Task (set up via Remote Desktop on the role instance) that runs at 2am
You could also write a deployment script that automates #2.
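A sketch of both steps (the URL, secret, and task name are hypothetical; `System.Net.WebClient` is used since it works on the PowerShell 2.0 that shipped with Windows at the time):

```powershell
# ping-site.ps1: hit the controller action; a shared-secret key keeps
# random visitors from triggering the job
(New-Object System.Net.WebClient).DownloadString(
    "http://yourapp.cloudapp.net/maintenance/run?key=SECRET") | Out-Null

# One-off setup over Remote Desktop: register the daily 2am task
schtasks /create /tn "NightlyPing" `
    /tr "powershell -ExecutionPolicy Bypass -File C:\scripts\ping-site.ps1" `
    /sc daily /st 02:00
```

Note that Azure web role instances can be re-imaged, which wipes manually created scheduled tasks; that is why automating step #2 in a startup/deployment script is worthwhile.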
You could use Phil Haack's WebBackgrounder as described here.
I've successfully used cron jobs in a shared hosting environment where Scheduled Tasks/PowerShell weren't available.
Here's a website explaining more about it