Is there any way to terminate a user events import bulk load job - google-cloud-recommendation

After the bulk load job has started, I need to terminate it on the Google side. Is there a method to do this?

Related

Pusher in background job is outpacing my app - missing JS notifications

I'm moving imports of Excel files (some large, some small) into the background using Sidekiq, and alerting the user when the import is complete using Pusher.
In the current 'synchronous' flow (this is a Rails app), the request kicks off the Excel import and then redirects to a dashboard page, which receives notifications from Pusher when the import is done.
The problem is that the Sidekiq/Pusher flow sometimes finishes before the redirect to the dashboard page does. My JavaScript subscriber isn't initialized in time to receive the message published from the background job, so the client gets nothing.
Does Pusher offer a way to delay publishing a message until there is a subscriber? Or does Pusher offer a way to 'stockpile' messages until the subscriber springs to life to consume them? Or maybe there is a simpler solution here I have not thought of?
FYI, I don't want the background job to sleep for a few seconds to make sure the client is ready, and I don't want to use Pusher to trigger a refresh (i.e. save something in a DB, then refresh to display it).
I am happy to provide code samples if desired.
EDIT:
I'm certainly open to using something else besides Pusher if something else can solve my problem.
Schedule the worker to run a second or two in the future, so that it fires after the user has been redirected to the landing page:
TestProcess.perform_at(2.seconds.from_now, parameters)
It will work as expected. Hope it helps :)
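If you'd rather not rely on timing, the 'stockpile' idea can be built yourself: keep a small server-side buffer of recent events that the dashboard fetches on page load before subscribing. A minimal, framework-free sketch (the class and its methods are hypothetical, not part of Pusher's API):

```ruby
# Server-side buffer of events published before the JS subscriber existed.
# The Sidekiq worker appends here in addition to triggering Pusher; the
# dashboard page calls a JSON endpoint backed by `drain` on load to catch up.
class EventBuffer
  def initialize
    # channel name => list of pending events
    @events = Hash.new { |h, k| h[k] = [] }
  end

  # Called from the background job alongside the Pusher trigger.
  def push(channel, event)
    @events[channel] << event
  end

  # Called once by the dashboard on page load; returns and clears
  # anything the subscriber missed.
  def drain(channel)
    @events.delete(channel) || []
  end
end
```

In a real app the buffer would live in Redis or the database rather than in-process memory, but the flow is the same: catch up first, then subscribe for live events.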

Triggering a SWF Workflow based on SQS messages

Preamble: I'm trying to put together a proposal for what I assume to be a very common use-case, and I'd like to use Amazon's SWF and SQS to accomplish my goals. There may be other services that will better match what I'm trying to do, so if you have suggestions please feel free to throw them out.
Problem: The need at its most basic is for a client (mobile device, web server, etc.) to post a message that will be processed asynchronously without a response to the client - very basic.
The intended implementation is for the client to post a message to a pre-determined SQS queue. At that point, the client is done. We would also have a defined SWF workflow responsible for picking the message up off the queue and (after some manipulation) placing it in DynamoDB - again, all fairly straightforward.
What I can't seem to figure out though, is how to trigger the workflow to start. From what I've been reading a workflow isn't meant to be an indefinite process. It has a start, a middle, and an end. According to the SWF documentation, a workflow can run for no longer than a year (Setting Timeout Values in SWF).
So, my question is: If I assume that a workflow represents one message-processing flow, how can I start the workflow whenever a message is posted to the SQS?
Caveat: I've looked into using SNS instead of SQS as well. This would allow me to run a server that could subscribe to SNS, and then start the workflow whenever a notification is posted. That is certainly one solution, but I'd like to avoid setting up a server for a single web service which I would then have to manage / scale according to the number of messages being processed. The reason I'm looking into using SQS/SWF in the first place is to have an auto-scaling system that I don't have to worry about.
Thank you in advance.
I would create a worker process that listens to the SQS queue. Upon receiving a message, it calls the SWF API to start a workflow execution. The workflow execution id should be generated from the message content, to ensure that duplicate messages do not result in duplicate workflow executions.
You can use AWS Lambda for this purpose. A Lambda function can be invoked by an SQS event, so you don't have to write a queue poller explicitly. The Lambda function can then call the SWF StartWorkflowExecution API to initiate the workflow.
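The deduplication idea from the first answer is the key detail: if the workflow id is a pure function of the message body, a duplicate SQS delivery produces the same id, and SWF will reject the second start rather than run the work twice. A sketch of such an id function (the `order-processing-` prefix is a made-up example):

```ruby
require "digest"

# Derive a deterministic workflow execution id from the SQS message body.
# The same body always yields the same id, so duplicate deliveries collide
# instead of spawning duplicate workflow executions.
def workflow_id_for(message_body)
  digest = Digest::SHA256.hexdigest(message_body)
  "order-processing-#{digest[0, 32]}"
end
```

The worker (or Lambda function) would pass this id as the `workflow_id` when starting the execution, and treat an "already started" error as a successful no-op.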

Long Running Task Rails

I am building a website, and I have an administrator page. The admin will have to run a reporting task: the task will iterate over all the records, fetch information, and generate a PDF file. This will be heavy on both the app and the database.
What is the usual approach here? Should I have a button that calls a method on a class, or should I use a rake task? I have heard that HTTP requests are subject to a timeout, and if the report generation takes longer than that, the request is killed.
I would like to use send_data(....) so the user is given a nice download pop-up box when the report is done. Would it be better to use a mailer and email it?
Thanks
We have similar functionality in our Rails apps at my job.
We have one URL/action that initiates the request to generate the PDF file, and returns right away saying the request was started successfully.
Then we have another action that we can poll with AJAX that returns whether or not the report is complete, and when it is complete, it gives the user the PDF.
The actual generation is done by a Sidekiq worker which is not subject to the webserver timeout.
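Stripped of the Rails plumbing, the two actions above reduce to a token plus a status record. A minimal sketch, assuming an in-memory hash stands in for wherever the job state really lives (the database or Redis); all names are hypothetical:

```ruby
require "securerandom"

# Stand-in for persistent job state (in production: a DB row or Redis key).
REPORT_STATUS = {}

# Action 1: start the report and return immediately with a token
# the client can poll with. The Sidekiq worker is enqueued here.
def start_report
  token = SecureRandom.hex(8)
  REPORT_STATUS[token] = { state: :running, pdf_path: nil }
  token # in Rails this would be rendered back as JSON
end

# Called by the background worker once the PDF has been written.
def finish_report(token, pdf_path)
  REPORT_STATUS[token] = { state: :done, pdf_path: pdf_path }
end

# Action 2: the AJAX poll endpoint; keeps answering :running until
# the worker has recorded the finished file.
def report_status(token)
  REPORT_STATUS.fetch(token)
end
```

Once `report_status` reports `:done`, the poll response can include a download URL and the controller can serve the file with send_data as the question intended.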

How can I execute code after rendering a response to the client in Grails?

I am writing a controller in Grails 2.X which kicks off a longish job. I would like to be able to render something to the web page which states that the process has started, with the id of the job that is in progress, and have that response actually show up in the user's web browser, and then continue processing on the job.
I have tried just using render without returning, and the user's browser just hangs until the entire job has completed, then renders that the job started.
I tried redirecting to a different action that renders my message, but that also hangs the browser until the job is complete.
I have looked into using filters and the afterInterceptor, but as best as I can tell these take effect and do their processing before the final page is sent back to the client. I need to send my final page back to my client and then continue processing.
You will want to kick off a background job. You can use Quartz or look at the Grails 2.3 async features. If the job is fairly long-running, Quartz is probably your best option.
You will want to return something the client can use to query the state of the job, such as a job id or a record that you update once processing has finished.
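The shape of this answer is framework-agnostic: respond with a job id immediately, do the work off the request thread, and let the client query by id. A minimal Ruby sketch of that contract, using a bare thread purely for illustration (in Grails you would use Quartz or the async features instead; all names are hypothetical):

```ruby
require "securerandom"

# Stand-in for persistent job state keyed by job id.
JOBS = {}

# Start the long job in the background and return its id at once,
# so the HTTP response can be rendered before the work finishes.
def start_long_job(&work)
  id = SecureRandom.hex(8)
  JOBS[id] = { done: false, result: nil, thread: nil }
  JOBS[id][:thread] = Thread.new do
    result = work.call
    JOBS[id][:result] = result
    JOBS[id][:done] = true
  end
  id
end

# What the client's follow-up request would check.
def job_finished?(id)
  JOBS.fetch(id)[:done]
end
```

The controller renders the id and returns; a second action exposes `job_finished?` for the client to poll.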

WebSockets server that will complete the job after the connection is made - Ruby, Rails

I want to use something like EventMachine websockets to push status updates to the client as they happen.
My application crawls a section of a website, screen-scraping details relevant to a user's search. I want to push each capture to the client as it happens. I also want to persist these results to the database, and I want the job to complete even if the user closes the browser.
At the moment, the job is initiated from the client (browser) and placed on a Resque queue, which completes the job. The client polls the database and displays the results.
I want to have a play around with WebSockets, but I don't think I can get the same behaviour. It is more important that the results are persisted and the job completes than that the pushes happen in real time.
Am I wrong in the assumption that this cannot be done?
Have you looked at Faye? See Messaging with Faye (RailsCasts). You can keep using the Resque queue to complete the job, and push a message to the subscriber (your web client) as and when you find the results.
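Since the question ranks persistence above real-time delivery, the job should write to the database first and treat the push as a best-effort hint. A sketch of that ordering, where `store` and `publisher` are hypothetical collaborators (anything with `save`, and anything with `publish`, e.g. a Faye client):

```ruby
# Persist first, then notify: the capture survives even if no subscriber
# is connected, and a dead socket cannot fail the job.
def record_capture(store, publisher, capture)
  store.save(capture)                       # source of truth
  begin
    publisher.publish("/results", capture)  # real-time hint; may fail
  rescue StandardError
    # Swallow publish errors: the client can still fall back to polling
    # the database, exactly as the current Resque flow does.
  end
  capture
end
```

With this ordering you get the best of both: the existing poll-the-database behaviour as a guarantee, plus live pushes whenever the browser happens to be connected.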
