Is it possible to empty the queue of tasks on Google Cloud Run where I run a plumber API? I will get a lot of pushes in a batch, but I only want to run the script once. So the best solution would be to delete the POST requests waiting in the queue at the end of the calculation.
I have not found a good solution yet. Do you have any tips or tricks? Any help is appreciated.
I asked a similar question on RStudio Community HERE.
The question has been answered on RStudio Community.
We have a task to gather Lighthouse metrics periodically (once a minute for several pages).
We want to use the PageSpeed API.
Is there perhaps a paid version of it we could use in such a case?
What is the pricing policy, if one exists?
Thanks!
You can use paid services for this; for example, I'm working on pagespeed.green - scheduled tests will be supported soon. It would be helpful to know more about your needs.
I highly suggest integrating Lighthouse with a reputable automation tool (such as Protractor) and auditing your webpage through it. You can then run those Lighthouse + Protractor integrated test cases from Jenkins/TeamCity and publish your report periodically. This way you can keep track of your website's performance absolutely free.
And if you are wondering how to integrate Lighthouse with Protractor, you can refer here.
It's not only Protractor; you can integrate Lighthouse with Puppeteer as well.
Let me know if you have any success.
Check out Run chrome lighthouse's audit from command line
and run it on hot reload, so that it runs on every reload. You can also add this to your build script and check its values.
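If you would rather drive Lighthouse from code than from the command line, here is a minimal sketch, assuming the npm packages lighthouse and chrome-launcher are installed; the page URLs and the once-a-minute interval mirror the question and are otherwise illustrative:

    // Run a Lighthouse audit programmatically and log the performance score.
    import lighthouse from 'lighthouse';
    import * as chromeLauncher from 'chrome-launcher';

    async function audit(url: string): Promise<void> {
      // Launch a headless Chrome instance for Lighthouse to drive.
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      try {
        const result = await lighthouse(url, { port: chrome.port, output: 'json' });
        // Category scores are reported on a 0..1 scale.
        console.log(url, result?.lhr.categories.performance.score);
      } finally {
        await chrome.kill();
      }
    }

    // Poll several pages once a minute, as described in the question above.
    const pages = ['https://example.com/', 'https://example.com/pricing'];
    setInterval(() => pages.forEach(p => void audit(p)), 60_000);

Launching a fresh Chrome per audit keeps runs isolated; if you poll several pages, consider running the audits sequentially rather than in parallel so they don't skew each other's timings.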
I'm not clear on how to create a pull queue in GCP from an outside application. I've found documentation about pulling messages, but not about creating queues.
Can somebody point me to some information about this?
Best regards
Creating queues from outside of an App Engine app is currently not possible.
Queue management features are coming in the new Cloud Tasks API, which is available in Alpha. You can request to join the Alpha here.
I have been looking at using projects built with spring-cloud-task within spring-cloud-dataflow. Having looked at the example projects and the documentation, the indication seems to be that tasks are launched manually through the dashboard or the shell. Does spring-cloud-dataflow provide any way of scheduling task definitions so that they can run, for example, on a cron schedule? That is, can you create a spring-cloud-task app which itself has no knowledge of a schedule, but deploy it to the dataflow server and configure the scheduling there?
Among the posts and blogs I have looked at, I noticed the following:
https://spring.io/blog/2016/01/27/introducing-spring-cloud-task
Some of the Q&A afterwards hints at this being a possibility, with a reference to triggers, but I think that discussion took place before release.
Any advice would be greatly appreciated, many thanks.
There are a few ways you could launch Tasks in Spring Cloud Data Flow. The following options are available today:
Launch it using TriggerTask; with this, you could choose to launch it either with a fixedDelay or via a cron expression - example here.
Launch it via an event in a streaming pipeline. Imagine a use case where you want to create a "thumbnail" whenever there's a new image (event) in an S3 bucket or in a file-system directory; the "thumbnail" operation could be a task in this case - example here.
Lastly, in the upcoming releases, we will port the "scheduler" functionality over from Spring XD to Spring Cloud Data Flow.
Yes, Spring Cloud Data Flow does provide a scheduling option. To enable it, you need to add the argument below when starting the server:
--spring.cloud.dataflow.features.schedules-enabled=true
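For example, assuming the server is started from its executable jar (the jar name below is illustrative):

    java -jar spring-cloud-dataflow-server.jar --spring.cloud.dataflow.features.schedules-enabled=true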
I would really appreciate some help resolving my issue.
I'm using RabbitMQ, and there are a lot of generated queues (with names like amq.gen-pMJVWygd3iLb_buXp1oUyw) which are durable and live forever.
The problem is that these queues are bound to the exchange core.timeout, but there is also a queue that is supposed to handle core.timeout.
So I'm stuck at this point and can't find where these queues are generated.
Based on your clarifications, the problem seems to be that your code is letting RabbitMQ create durable queues automatically when connecting to an exchange.
Try debugging your MQ class to see where queue creation is being triggered on the core.timeout exchange.
Check the docs for more info about RabbitMQ.
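For illustration, here is a minimal consumer sketch, assuming Node.js with the amqplib package (the question does not say which client library is in use, so apart from the exchange name core.timeout everything here is an assumption). Server-named amq.gen-* queues are created when a queue is declared with an empty name, and the options passed at declaration decide whether they outlive the connection:

    // Hypothetical consumer; only the exchange name comes from the question.
    import * as amqp from 'amqplib';

    async function main(): Promise<void> {
      const conn = await amqp.connect('amqp://localhost');
      const ch = await conn.createChannel();

      // An empty queue name asks the broker for a server-named queue --
      // this is where the amq.gen-* names come from. Declared exclusive
      // and non-durable like this, it is deleted with the connection;
      // declaring it with { durable: true, exclusive: false } instead is
      // what leaves such queues behind forever.
      const { queue } = await ch.assertQueue('', { exclusive: true, durable: false });

      await ch.bindQueue(queue, 'core.timeout', '');
      await ch.consume(queue, msg => {
        if (msg) {
          console.log(msg.content.toString());
          ch.ack(msg);
        }
      });
    }

    main().catch(console.error);

So the place to look is wherever your code declares a queue with an empty name and durable set to true.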
Hope this helps.
I am writing a Web app that will need to run a background process that will poll a web service every minute or so and then query my Rails db and send out alerts to users of the app via Twitter. I have researched this a lot but I feel I am just going around in circles. I have come across delayed_job, background_job and a few other options like creating a custom daemon suggested in a Railscast. Does anyone have any suggestions for the best way to do this? The process will have to run constantly in the background and won't be triggered by an event in the front end. Any help or guidance would be appreciated.
Why don't you just create a rake task and add it to your cron schedule?
You can even use Whenever to configure this for you.
I used Beanstalkd for this and can recommend it.
http://railscasts.com/episodes/243-beanstalkd-and-stalker
You can simply use cron for tasks that have to be executed every X minutes, hours, etc.
The whenever gem is useful for setting this up with Rails: https://github.com/javan/whenever
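For illustration, a crontab entry that runs a rake task every minute could look like this (the task name alerts:poll and the application path are hypothetical):

    * * * * * cd /path/to/app && bundle exec rake alerts:poll RAILS_ENV=production

Whenever generates entries like this for you from a Ruby schedule file, so you rarely have to edit the crontab by hand.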
I don't know much about delayed_job, but you can check out some tutorials, for example this article on Heroku: http://devcenter.heroku.com/articles/delayed-job
We used delayed_job for our application.
While working on it, we researched many sites and were finally able to apply it.
We wrote up our experience at the following link:
http://www.kyybaventures.com/blog/rails-delayed-job#more-2916
Hope this helps you get started with background processes in Rails 3.
You can use either BackgrounDRb or a Unix crontab.
Crontab will do the job if you don't need to hand off any heavy processing to run asynchronously outside the application's request cycle.
BackgrounDRb consumes a lot of memory and CPU in a production environment if any of its processes hang. You also need to configure a monitoring tool to make sure the background process is running.