I want to write to a log file on a remote server (as opposed to a local file, which I've successfully written to).
In the Dropwizard documentation there are only examples of how to write to a local file.
How do I configure this in the application's yml file?
As it turns out, the requirement was to use Graylog.
So here is the reference: GELF
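For anyone landing here: a minimal sketch of what the yml might look like with the dropwizard-gelf addon, which registers a gelf appender type. The host and port below are placeholders, and the exact field names may vary between addon versions, so check its README:

    logging:
      level: INFO
      appenders:
        # Ship logs to a remote Graylog input over GELF
        - type: gelf
          host: graylog.example.com
          port: 12201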
I am trying to set up mass ingestion in IICS. This will be a streaming ingestion task. The source is REST V2.
According to the documentation, I can't provide an absolute path to the Swagger file for this connection. Instead, I need a hosted URL.
I tried hosting the Swagger file on a server that has the Informatica Cloud Secure Agent installed. When I create a connection, everything works.
But when I try to add a connection for mass ingestion, I get the following error:
Interestingly, I also tried hosting this file on a VM in Azure. When I access the file from a server in an internet browser it works, and I can see the requests on the web server. But when I create the mass ingestion task and define the source, I still get an error, and I don't see any requests for the Swagger file on the web server.
What is wrong?
A quick question: in the Java/Spring/Spring Boot world, I know it is possible to have your server load its configuration from a "config server" instead of from files on the instance or from environment variables.
I am talking about spring-cloud-config-server.
Would it be possible to do the same thing in Rails?
Is there an extension or gem that would allow me to pull config from a KV store, or maybe from remote storage (S3, for example), when instances start?
Thanks in advance.
My requirement is to get a file from SFTP and make it available for a client to download. I am required to do this using Spring Cloud Data Flow.
In the documentation, I saw that there is an SFTP to JDBC file ingest tutorial (https://dataflow.spring.io/docs/recipes/batch/sftp-to-jdbc/).
So my question is: can we transfer a file through Spring Cloud Data Flow rather than reading the file and inserting it into the database?
Thanks,
Dasun.
Yes you can. It's similar to the SFTP to JDBC example, which downloads the file to a shared file system from which the batch job reads it. You can create a simple pipeline like sftp | s3, sftp | file, or sftp | sftp, depending on your specific use case.
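For illustration, a sketch of such a stream from the SCDF shell, assuming the sftp source and s3 sink apps are registered. Host, credentials, and bucket are placeholders, and the exact property names depend on the app-starter versions you use, so check their docs:

    stream create --name sftp-to-s3 --definition "sftp --sftp.factory.host=sftp.example.com --sftp.factory.username=demo --sftp.factory.password=secret --sftp.remote-dir=/outbound | s3 --s3.bucket=downloads" --deploy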
My Spring Cloud Data Flow server deleted the log files in its folder after I stopped it.
Why does SCDF do that, and how can I keep these log files?
You can customize the logging configuration in the logback config file and pass it as a configuration property to the SCDF server. Assuming you are trying this with the local Data Flow server, you can refer to this documentation for logback configuration.
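For example, a minimal logback override that keeps the server log in a fixed file could look like the snippet below; the path and pattern are placeholders, and you would point the server at it with something like -Dlogging.config=/path/to/logback.xml:

    <configuration>
      <!-- Write server logs to a fixed location instead of a temp folder -->
      <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>/var/log/scdf/dataflow-server.log</file>
        <encoder>
          <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
      </appender>
      <root level="INFO">
        <appender-ref ref="FILE"/>
      </root>
    </configuration>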
I am currently running a Rails application and a Spring Boot configuration service on the same local network. Is it possible to configure Rails to use the config files provided by the Spring Boot service?
More specifically, I am looking to fetch the database connection and user data via the service and let Rails connect to a remote database.
The service provides these files via HTTP as JSON or YAML.
Thank you.
Edit: Solved it by using a bash script with wget to pull and assemble the config files manually, via container scripts that are executed before each deploy.
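For reference, a stripped-down sketch of that kind of pre-deploy script. The host and file names are placeholders; Spring Cloud Config serves each app/profile combination at /{application}-{profile}.yml:

    #!/usr/bin/env bash
    # Pre-deploy hook: fetch the YAML the config service exposes for this
    # app/profile and drop it where Rails expects it.
    set -euo pipefail

    CONFIG_HOST="http://config-service.local:8888"

    # Pull the rendered config document for the production profile
    wget -q -O config/settings.yml "${CONFIG_HOST}/myapp-production.yml"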