How to implement a client for file uploading to a WSO2-ESB proxy service? - vfs

I configured an ESB proxy:
ESB Proxy --> inSequence --> outSequence --> clone --> VFS
How to implement a client for file uploading to a WSO2-ESB proxy service?

You can configure the ESB to upload files to a location using the VFS transport listener and sender. Refer to Sample 254: Using the File System as Transport Medium (VFS), where in the outSequence the response is saved to a directory using an endpoint with a VFS file address. To clone the messages going through the ESB, you can use the Clone mediator.
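A minimal Synapse sketch of that idea (the proxy name, sequence content, and file path are illustrative, not taken from the question) might look like:

```xml
<proxy name="FileProxy" transports="http,https">
  <target>
    <inSequence>
      <!-- Clone mediator: one copy continues, one copy is written out over VFS -->
      <clone>
        <target>
          <sequence>
            <!-- fire-and-forget so no response is expected from the file endpoint -->
            <property name="OUT_ONLY" value="true"/>
            <!-- file name to write; illustrative value -->
            <property name="transport.vfs.ReplyFileName" value="out.xml" scope="transport"/>
            <send>
              <endpoint>
                <!-- VFS file address; directory is illustrative -->
                <address uri="vfs:file:///home/user/out"/>
              </endpoint>
            </send>
          </sequence>
        </target>
      </clone>
    </inSequence>
  </target>
</proxy>
```

A client then simply posts the file content to the proxy's HTTP endpoint; the clone target writes a copy to the VFS directory.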

Related

incorrect swagger file path for hosted URL in Informatica cloud (mass ingestion)

I am trying to set up mass ingestion in IICS. This will be a streaming ingestion task. The source is REST V2.
 
According to the documentation, I can't provide an absolute path to the Swagger file for this connection. Instead, I need a hosted URL.
I tried hosting the Swagger file on a server that has the Informatica Cloud secure agent installed. When I create a connection, everything works.
But when I try to add a connection for mass ingestion, I get the following error:
Interestingly, I also tried hosting this file on a VM in Azure. When I access the file from a server in an internet browser it works, and I can see the requests on the web server. But when I create the mass ingestion task and define the source, I still get the error and see no requests for the Swagger file on the web server.
What is wrong?

How to set credentials in bindings file for JMSInput Node

I am using the JMSInput node in an IIB flow to connect with Rabbitmq. Locally it's working fine with a binding file, but how/where to set login credentials for a remote rabbitmq server?
You will have to use the JMS nodes and create your own JMSProviders configurable service.
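For the credentials specifically, one common approach (node name, provider name, and identity name below are illustrative) is to store them under a security identity with `mqsisetdbparms` and then point the JMSProviders configurable service at that identity:

```shell
# Store user/password for JMS connections under a security identity
# (integration node name and credentials are illustrative)
mqsisetdbparms IIB_NODE -n jms::rabbitSecId -u rabbitUser -p rabbitPass

# Associate that identity with the JMS provider used by the flow
mqsichangeproperties IIB_NODE -c JMSProviders -o RabbitMQProvider \
  -n securityIdentity -v rabbitSecId
```

The JMSInput node then references the configurable service, so the bindings file itself never has to contain the remote server's credentials.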

Spring Cloud Data Flow Stream Deployment to Cloud Foundry

I am new to Spring Cloud Data Flow. I am trying to build a simple http-source and RabbitMQ-sink stream using the SCDF stream apps. The stream should be deployed on OSCF (Cloud Foundry). Once deployed, the stream should be able to receive an HTTP POST request and send the request data to RabbitMQ.
So far, I have downloaded the Data Flow Server using the link below and pushed it to Cloud Foundry. I am using the Shell application from my local machine.
https://dataflow.spring.io/docs/installation/cloudfoundry/cf-cli/.
I also have the HTTP source and RabbitMQ sink applications deployed in CF, and the RabbitMQ service is bound to the sink application.
My question: how can I create a stream using applications already deployed in CF? Registering an app requires an HTTP/File/Maven URI, but I am not sure how an app that is already deployed on CF can be registered.
Appreciate your help. Please let me know if more details are needed.
Thanks
If you're using the out-of-the-box apps that we ship, the relevant Maven repo configuration is already set within SCDF, so you can deploy the http app right away; SCDF will resolve and pull it from the Spring Maven repository and then deploy that application to CF.
However, if you're building custom apps, you can configure your internal/private Maven repositories in SCDF/Skipper and then register your apps using the coordinates from your internal repo.
If Maven is not a viable solution for you on CF, I have seen customers resolve artifacts from s3 buckets and persistent-volume services in CF.
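Assuming the out-of-the-box apps built with the Rabbit binder, a session in the Data Flow shell could look like this (app coordinates, versions, and the stream name are illustrative; with Rabbit-binder apps, the pipe between apps already flows through RabbitMQ):

```shell
# Register the apps by Maven coordinates (versions are illustrative)
app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.0.RELEASE
app register --name log --type sink --uri maven://org.springframework.cloud.stream.app:log-sink-rabbit:2.1.0.RELEASE

# Create and deploy the stream; SCDF pushes the resolved apps to CF for you
stream create --name httpToRabbit --definition "http | log"
stream deploy --name httpToRabbit
```

The key point is that you register the artifact URI, not the running CF app: SCDF resolves the artifact and performs the `cf push` itself at deploy time.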

How to keep log file in Spring Cloud Data Flow?

My Spring Cloud Data Flow deleted the log files in its folder after I stopped it.
Why does SCDF do that, and how can I keep these log files?
You can customize the logging configuration in a logback config file and pass it as a configuration property to the SCDF server. Assuming you are trying this with the local Data Flow server, you can refer to its documentation for logback configuration.
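As a sketch, a logback configuration that writes to rolling files on disk (paths and retention below are illustrative) can be passed to the server with the standard Spring Boot `logging.config` property:

```xml
<!-- logback sketch: keep rolling log files on disk instead of losing them -->
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/scdf/dataflow-server.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- one file per day, kept for 14 days (illustrative values) -->
      <fileNamePattern>/var/log/scdf/dataflow-server.%d{yyyy-MM-dd}.log</fileNamePattern>
      <maxHistory>14</maxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```

Starting the server with `--logging.config=/path/to/logback.xml` then keeps the logs under your own directory, independent of the server's lifecycle.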

WSO2 API Manager in Docker

I am trying to deploy API Manager and Enterprise Integrator using Docker Compose on a cloud server.
Everything works locally when using localhost as the host, but when I deploy it on the cloud server, I cannot access the API Manager using the public IP of the server. The Enterprise Integrator works, though. I've modified some configuration parameters as shown below, but the problem persists:
<APIStore>
<!--GroupingExtractor>org.wso2.carbon.apimgt.impl.DefaultGroupIDExtractorImpl</GroupingExtractor-->
<!--This property is used to indicate how we do user name comparison for token generation https://wso2.org/jira/browse/APIMANAGER-2225-->
<CompareCaseInsensitively>true</CompareCaseInsensitively>
<DisplayURL>false</DisplayURL>
<URL>https://<PUBLIC IP HERE>:${mgt.transport.https.port}/store</URL>
<!-- Server URL of the API Store. -->
<ServerURL>https://<PUBLIC IP HERE>:${mgt.transport.https.port}${carbon.context}services/</ServerURL>
I've also whitelisted the said public IP:
"whiteListedHostNames" : ["localhost","PUBLIC IP HERE"]
Meanwhile, please check the reference.
