I have tried to view a task's log in the Spring Cloud Data Flow UI, but I got the message "Log could not be retrieved as the task instance is not running" when I used the REST endpoint: localhost:9393/dashboard/#tasks/executions/33
When I launch my task for the first time (task id 33), the log is shown in the UI. But when I relaunch the same task (task id 34), the log for id 34 is shown while the log for id 33 is gone.
I'm using SCDF version 2.2.2 and spring-cloud-deployer-local version 2.0.6.
How can I keep the logs of all task executions available in the UI?
When the Spring Cloud Data Flow server uses the local deployer to handle task lifecycle management (launch, stop, etc.), the corresponding task execution log can be obtained only while the task execution status is RUNNING.
This is by design: the local task launcher prunes the task instance history every time a new task instance is launched, and hence access to the earlier log is lost. If you want to explore the code, you can check it here, which is based on this GitHub issue.
I have an SQS queue that receives a message with the file name that has been created in a target bucket. The process to send the message is:
A CSV file is inserted into target_bucket.
A message is sent to an SNS topic.
The SNS topic triggers a Lambda function, and this Lambda function posts a message into an SQS queue that includes the name of the file that was just created.
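The Lambda step above can be sketched as follows. This is a minimal sketch assuming the SNS topic is subscribed to S3 event notifications (so the SNS message body is the S3 event JSON); the `QUEUE_URL` variable, `extract_file_name` helper, and handler name are illustrative, not from the original setup:

```ruby
require "json"

# Extract the S3 object key from the JSON body of an S3 event
# notification that was delivered through SNS.
def extract_file_name(sns_message_body)
  s3_event = JSON.parse(sns_message_body)
  s3_event.dig("Records", 0, "s3", "object", "key")
end

# Lambda entry point: pull the file name out of the SNS payload and
# forward it to the SQS queue named by the (hypothetical) QUEUE_URL
# environment variable.
def handler(event:, context:)
  require "aws-sdk-sqs" # only needed inside the Lambda runtime
  file_name = extract_file_name(event.dig("Records", 0, "Sns", "Message"))
  Aws::SQS::Client.new.send_message(
    queue_url: ENV.fetch("QUEUE_URL"),
    message_body: file_name
  )
end
```

If the queue later feeds Matillion, sending just the object key as the message body keeps the downstream pattern-matching simple.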
To check whether messages are arriving in my queue, I do a simple poll from the console.
I know all the components are working fine because, by polling from the AWS web console, I can see the messages. This is an example:
However, the intention is to connect this SQS queue to Matillion so that every time a new file is uploaded into my target_bucket, a job is executed. This job should read the data from the new file and load it into a Snowflake table.
I have connected my SQS queue to my Matillion project but every time I load a new file into my target_bucket nothing happens. Here are the project configurations needed for SQS:
I know my queue has access to Matillion because as you can see from the final cell, I have a success message when testing the connection.
Also, I added an environment variable (from Project > Manage Environment Variables) called file_to_load:
And finally, in the S3 Load component (from my job), I also added the file_to_load in the pattern section as shown in the image below:
We are using a Jenkins pipeline to run JMeter tests against one of our application APIs. Everything is working OK, but there are cases where the application returns an error. We would like to log the request payload and timestamp for such failures so that we can investigate the corresponding failures in the application.
Is there a way I can instruct JMeter to log the request data for cases which result in failure?
The easiest option is adding a Listener like Simple Data Writer to your test plan.
The configuration to save the timestamp and payload would look like:
Once the test finishes you will be able to observe requests details (if any) using View Results Tree listener.
More information: How to Save Response Data in JMeter
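If you configure the Simple Data Writer via its "Configure" button (or set the flags globally), the relevant entries in user.properties would look something like the following. Note that XML output is needed if you want the request payload stored in the results file:

```properties
# Use XML so sampler (request) data can be stored in the results file
jmeter.save.saveservice.output_format=xml

# Save the timestamp and elapsed time of each sample
jmeter.save.saveservice.timestamp=true
jmeter.save.saveservice.time=true

# Save the request payload and request headers
jmeter.save.saveservice.samplerData=true
jmeter.save.saveservice.requestHeaders=true

# Save the success/failure flag and response message
jmeter.save.saveservice.successful=true
jmeter.save.saveservice.message=true
```

To keep the file limited to failures only, you can additionally tick "Log/Display Only: Errors" in the Simple Data Writer itself.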
I'll start this by saying I'm new to using Google Cloud Tasks, so please forgive me if this is an obvious issue.
I've created a new Cloud Task queue using gcloud with the command:
gcloud tasks queues create default
I've then proceeded to add tasks to the queue from a Ruby on Rails application, and from the command line using this command:
gcloud tasks create-http-task --queue=default --url=https://google.com --method GET
I then see the tasks being added to the queue, but the HTTP requests are never made. The queue itself also says there are no "Tasks In Queue", even though the ones I've created are listed in the tasks list right below this message:
I've enabled logging with:
gcloud tasks queues update default --log-sampling-ratio=1.0
and can see the tasks being created in the logs, but there are no logs for the individual tasks.
The Cloud Run service I'm invoking has been made publicly accessible, and if I manually POST the task payload to the URL in the task, it works. I'm using GET google.com as I assume it's reliably accessible.
Is anyone able to tell me what I'm doing wrong? This is the last item I need to sort out to wrap up our project's move to Google Cloud! Thank you!
In case anyone else runs into this, there's one more trick to enabling Google Cloud Tasks.
After making sure that App Engine is set up on your project, you also need to make sure that the application itself has not been disabled! It turns out the project I was on had used App Engine many years ago, and its only application had been disabled in the App Engine settings. Enabling it again made everything work as you'd expect.
You can find the setting by going to "App Engine", "Settings", then checking the "Disable Application" setting.
Good luck to anyone reading this!
I need to run a shell script (e.g. df) on a server, say Client. The call to this script should be made from another independent Rails application, say Monitor, via a REST API, and the output should be returned in the response to the Monitor application.
This shell command should run on all application server instances of Client. I'm researching this, but it would be quite helpful if anyone has done it before.
I need to get following information from Client servers to Monitor application:
Disk space left on each Client server instance,
Processes running on each Client server instance,
Ability to terminate a non-responsive Client instance.
Thanks
A simple command can be executed via:
result = `df -h /`
But this does not fulfill the requirement to run on all application server instances of Client. For that, you need to call every instance independently.
Another way is to run your checks from a cron job and let the Client call Monitor. If cron is not suitable, you can create an ActiveJob on every Client, collect the data, and call Monitor.
You should also look for ruby libraries providing the data you need.
For instance, sys/filesystem can provide data about your disk stats.
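As a sketch of the pull-based approach, each Client instance could expose a small status endpoint that shells out to df and returns JSON, and Monitor would then call every instance independently. This assumes standard `df -h` output; the endpoint name and the ps invocation are illustrative:

```ruby
require "json"

# Parse the output of `df -h /` into a hash. The first line is the
# header; the second holds the values for the root filesystem.
def parse_df(output)
  header, values = output.lines.first(2).map(&:split)
  # "Mounted on" splits into two header tokens, so only align the
  # left-hand columns, which line up one-to-one with the values.
  %w[Filesystem Size Used Avail Use%].each_with_object({}) do |col, stats|
    stats[col.downcase] = values[header.index(col)]
  end
end

# In a Rails controller on each Client, this could back a /status
# endpoint that Monitor polls per instance:
#
#   def status
#     render json: parse_df(`df -h /`).merge(
#       processes: `ps -e | wc -l`.strip.to_i
#     )
#   end
```

Returning structured JSON rather than raw df text keeps the Monitor side trivial: it just aggregates one hash per instance.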
I'm implementing an email reminder to remind students to return their books before the due date.
I know a Windows Service can achieve that, but since the machine has restrictions on installing Windows Services, are there any other ideas?
You don't need a Windows Service to perform background tasks; just use Windows scheduled tasks. You can write an exe or batch file and schedule it as frequently as you like.
If the server won't let you run scheduled tasks, you can always expose the task as an action in an MVC controller and create a scheduled task on another machine that calls your action.