Can we transfer a file from SFTP to a server through Spring Cloud Data Flow? - spring-cloud-dataflow

My requirement is to get a file from SFTP and make it available for a client to download. I am required to do this using Spring Cloud Data Flow.
In the documentation, I saw that there is an SFTP to JDBC File Ingest tutorial (https://dataflow.spring.io/docs/recipes/batch/sftp-to-jdbc/).
So my question is: can we transfer a file through Spring Cloud Data Flow rather than reading the file and inserting it into the database?
Thanks,
Dasun.

Yes, you can. It's similar to the SFTP-to-JDBC example, which downloads the file to a shared file system from which the batch job reads it. You can create a simple pipeline like sftp | s3, sftp | file, or sftp | sftp, depending on your specific use case.
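For example, a stream like sftp | file could be created with the Java DSL roughly as in the sketch below; the server URL, stream name, and all app properties (host, credentials, directories) are placeholders rather than values from your environment, and the exact property names should be checked against the docs for the versions of the sftp source and file sink you register.

import java.net.URI;
import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;
import org.springframework.cloud.dataflow.rest.client.dsl.Stream;

// Connect to the SCDF server (local default URL shown).
DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));

// Define and deploy an sftp -> file stream; all property values below are illustrative.
Stream.builder(dataFlow)
        .name("sftp-to-file")
        .definition("sftp --host=sftp.example.com --username=user --password=secret "
                + "--remote-dir=/outbound | file --directory=/tmp/downloads")
        .create()
        .deploy();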

Related

Incorrect Swagger file path for hosted URL in Informatica Cloud (mass ingestion)

I am trying to set up mass ingestion in IICS. This will be a streaming ingestion task. The source is REST V2.
 
According to the documentation, I can't provide an absolute path to the Swagger file for this connection. Instead, I need a hosted URL.
I tried hosting the Swagger file on a server that has the Informatica Cloud secure agent installed. When I create a connection everything works.
But when I try to add a connection for mass ingestion, I get the following error:
Interestingly, I also tried hosting this file on a VM in Azure, and when I access the file from a server in a web browser, it works. I can also see the requests on the web server, but when I create the mass ingestion task and define the source, I still get an error and see no requests for the Swagger file on the web server.
What is wrong?

Spring Cloud Data Flow Stream Deployment to Cloud Foundry

I am new to Spring Cloud Data Flow. I am trying to build a simple stream with an http source and a rabbitmq sink using SCDF stream apps. The stream should be deployed on OSCF (Cloud Foundry). Once deployed, the stream should be able to receive HTTP POST requests and send the request data to RabbitMQ.
So far, I have downloaded the Data Flow Server using the link below and pushed it to Cloud Foundry. I am using the Shell application from my local machine.
https://dataflow.spring.io/docs/installation/cloudfoundry/cf-cli/
I also have the HTTP source and RabbitMQ sink applications deployed in CF. The RabbitMQ service is also bound to the sink application.
My question: how can I create a stream using applications already deployed in CF? Registering an app requires an HTTP/File/Maven URI, but I am not sure how an app deployed on CF can be registered.
Appreciate your help. Please let me know if more details are needed.
Thanks
If you're using the out-of-the-box apps that we ship, the relevant Maven repo configuration is already set within SCDF, so you can deploy the http app right away; SCDF will resolve and pull it from the Spring Maven repository and then deploy that application to CF.
However, if you're building custom apps, you can configure your internal/private Maven repositories in SCDF/Skipper and then register your apps using the coordinates from your internal repo.
If Maven is not a viable solution for you on CF, I have seen customers resolve artifacts from s3 buckets and persistent-volume services in CF.
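For custom apps, a rough Java sketch of registering one from your own Maven repo via the SCDF REST client is shown below; the app name, Maven coordinates, and server URL are hypothetical, and the register(name, type, uri, metadataUri, force) operation should be checked against the REST client version you are using.

import java.net.URI;
import org.springframework.cloud.dataflow.core.ApplicationType;
import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;

// Point the client at the SCDF server running on Cloud Foundry (URL is a placeholder).
DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("https://my-scdf-server.example.com"));

// Register a custom source app resolved from a Maven repository configured in SCDF/Skipper.
dataFlow.appRegistryOperations().register(
        "my-http-source",                            // app name (hypothetical)
        ApplicationType.source,                      // source, processor, sink, task, or app
        "maven://com.example:my-http-source:1.0.0",  // Maven coordinates (hypothetical)
        null,                                        // optional metadata URI
        true);                                       // overwrite an existing registration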

Remote connection using SwiftNIO SSH

I am working on a solution that would read/write server files from a remote gateway system to the local storage of an iOS device using SwiftNIO SSH. This way I would be able to execute shell commands. I checked Swift's website but couldn't find a specific implementation:
https://swift.org/blog/swiftnio-ssh/
How should I proceed or is there any other workaround?
The implementation is here: https://github.com/apple/swift-nio-ssh. There are some examples in the repository.

Spring Cloud Data Flow java DSL: get logs for stream components

I'm using SCDF 2.5.1 deployed locally via docker-compose, and my software sends commands to the SCDF server via the Java DSL.
Let's say I create a stream such that
file > :queue
:queue > ftp
where file and ftp are Docker-deployed apps.
My question is, how can I get the logs for file and ftp?
So far the closest thing I've come up with is
Map<String, String> attributes = scdf.runtimeOperations().streamStatus(streamName).getContent()
.stream().flatMap(stream -> stream.getApplications().getContent().stream()
.filter(app -> app.getName().equals(appName))
.flatMap(appStatus -> appStatus.getInstances().getContent().stream()
.map(AppInstanceStatusResource::getAttributes)))
.findFirst().orElse(Collections.emptyMap());
String logLocation = attributes.get("stdout");
and then mounting logLocation and reading it as a file.
Is there a more elegant solution?
The Java DSL (and, by extension, the SCDF REST client) doesn't have a log-retrieval operation as part of its REST operations. There is a REST endpoint you can hit on the SCDF server to get the logs of a stream.
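As a rough sketch, hitting that endpoint directly from Java could look like the snippet below; the /streams/logs/{streamName} path is an assumption based on recent SCDF releases, so verify it against the REST API guide for your server version.

import org.springframework.web.client.RestTemplate;

// Fetch the aggregated logs of a deployed stream straight from the SCDF server.
// Both the server URL and the endpoint path are assumptions; adjust for your setup.
RestTemplate rest = new RestTemplate();
String logs = rest.getForObject(
        "http://localhost:9393/streams/logs/{streamName}", String.class, "mystream");
System.out.println(logs);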
If you would like to contribute, you can submit a proposal/PR here: https://github.com/spring-cloud/spring-cloud-dataflow/pulls

How to keep log file in Spring Cloud Data Flow?

My Spring Cloud Data Flow deleted the log files in the folder after I stopped it.
Why does SCDF do that, and how can I keep these log files?
You can customize the logging configuration in a logback config file and pass it as a configuration property to the SCDF server. Assuming you are trying this with the local Data Flow server, you can refer to its documentation for the logback configuration.
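As an illustration only, a minimal logback file that keeps the server log in a file could look like the sketch below (the path and pattern are placeholders); it can then be supplied to the local server through the standard Spring Boot logging.config property, e.g. --logging.config=/path/to/logback.xml.

<configuration>
  <!-- Write the server log to a file instead of only the console; the path is a placeholder. -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>/var/log/scdf/dataflow-server.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>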
