I have built a Rails app for storing events. I want to generate a report of the number of events that happened in a day, and I want to do it asynchronously. I am new to Sidekiq and Redis. Can anyone suggest a good resource to study?
My suggestion would be to do this in a rake task that runs on the server once a day.
You can find good resources on how to create rake tasks online, and then use this simple gem to make sure the rake task runs once a day on the server:
https://github.com/javan/whenever
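With whenever, the once-a-day part is only a few lines in config/schedule.rb. The rake task name below is a placeholder for whatever you call yours:

```ruby
# config/schedule.rb -- whenever's DSL; run `whenever --update-crontab`
# on the server to write the corresponding cron entry.
every 1.day, at: "1:00 am" do
  rake "daily_report" # placeholder task name
end
```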
I am assuming you have an Event model. You could use its created_at timestamp to get all the events created on a given day. You could then build a CSV or whatever you like from that data and email it to whoever needs the report (how you handle the data is up to you).
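As a sketch of what that rake task could look like (the model, attribute, and task names are assumptions; the Struct below stands in for the ActiveRecord model so the snippet runs on its own, and in a real Rails app the task would depend on :environment and use an Event.where query):

```ruby
require "rake"
require "date"

# Stand-in for the ActiveRecord Event model (created_at is the timestamp
# Rails maintains for you on every record).
Event = Struct.new(:created_at)
EVENTS = [Event.new(Time.new(2024, 1, 1, 9)),
          Event.new(Time.new(2024, 1, 1, 23)),
          Event.new(Time.new(2024, 1, 2, 0))]

# With ActiveRecord this would be:
#   Event.where(created_at: day.to_time...(day + 1).to_time).count
def events_on(day)
  range = day.to_time...(day + 1).to_time
  EVENTS.count { |e| range.cover?(e.created_at) }
end

# Equivalent of lib/tasks/daily_report.rake; in Rails, also depend on
# :environment so the models are loaded.
desc "Count events created on a given day (YYYY-MM-DD, defaults to today)"
task :daily_report, [:date] do |_t, args|
  day = args[:date] ? Date.parse(args[:date]) : Date.today
  puts "#{events_on(day)} events on #{day}"
end

Rake::Task[:daily_report].invoke("2024-01-01") # prints "2 events on 2024-01-01"
```

Note the exclusive range (`...`): an event at midnight of the next day is not counted in the previous day's report.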
You can do all of the above in Sidekiq if you wish. I would recommend reading through the gem docs and this getting-started guide from the official wiki: https://github.com/mperham/sidekiq/wiki/Getting-Started
It's fairly straightforward and once you get your first process working it will start to make more sense.
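A Sidekiq version of the same report is just a worker class. This is a sketch with invented names; Sidekiq::Worker is stubbed out below only so the snippet is self-contained, whereas in your app `require "sidekiq"` provides the real module:

```ruby
require "date"

# Stub so this sketch runs without the sidekiq gem installed; in your app,
# `require "sidekiq"` provides the real Sidekiq::Worker module instead.
module Sidekiq; module Worker; end; end unless defined?(Sidekiq)

class DailyReportWorker
  include Sidekiq::Worker

  # Sidekiq serializes job arguments to Redis as JSON, so pass the date
  # as a string ("2024-01-01"), not as a Date object.
  def perform(date_string)
    day = Date.parse(date_string)
    range = day.to_time...(day + 1).to_time
    # In the real app this would be an ActiveRecord query, e.g.
    #   Event.where(created_at: range).count
    count = count_events_in(range)
    puts "#{count} events on #{day}"
    count
  end

  private

  # Placeholder standing in for the database query.
  def count_events_in(range)
    0
  end
end

# Enqueue from anywhere in the app (the job runs in the Sidekiq process):
#   DailyReportWorker.perform_async(Date.today.to_s)
```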
I would also highly recommend this video before you start working with Sidekiq and Redis; it gives an overall background of how Sidekiq works and the use cases where it may be helpful to you.
https://www.youtube.com/watch?v=GBEDvF1_8B8
I'm learning full-stack development through the free course at App Academy, if you're familiar with it. I am almost finished with the Rails section and have also done the SQL and Ruby sections. I have yet to start JavaScript or React.
To test my abilities, I want to create an app. It will mostly be backend + HTML since I haven't covered front end yet.
The app's function: let someone provide a list of URLs for reddit posts and then track the number of upvotes. I want to do this by scraping the reddit content using this. This is just to test my ability, not a real use case.
I only want to scrape once a day to keep the scraping function minimal. I also want to add a refresh button next to each post, so the user can refresh when they want to.
Questions:
I'll be creating a database that stores the upvote counts. However, from what I have learned in the program, I don't know how to keep my daily scraper function running with my Ruby app. If I create a file scraper.rb with the function on a timed loop, where does that file go in my Rails app: the models, views, or controllers folder? Will it run automatically when I start my Rails server, or is this part entirely separate? Do I run two Ruby apps at once, one for the backend and one for the actual page?
For the refresh button, I think I would need to call the scraper function in my controller before rendering the refreshed text. Does that sound right?
From your description, I would say the scraper should be a model method, wrapped in a rake task.
There are many ways you can achieve the periodic execution. Some of them are:
Whenever gem
Plain old cron job for running the rake task
If you deploy to a PaaS like Heroku, render.com, or something similar, there are add-ons that can be configured to do the cron, like Heroku Scheduler or render.com cron jobs.
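Concretely, the model-method-plus-rake-task shape could look like the sketch below. All names are invented, Post stands in for your ActiveRecord model, and the actual HTTP scraping is reduced to a placeholder:

```ruby
require "rake"

# Stand-in for the ActiveRecord Post model; in the real app update_upvotes!
# would fetch the reddit page, parse the upvote count, and save the record.
class Post
  attr_reader :url, :upvotes

  def initialize(url)
    @url = url
    @upvotes = 0
  end

  def self.all
    @all ||= [new("https://www.reddit.com/r/ruby/example")] # placeholder URL
  end

  # The scraper belongs on the model, not in a controller or view. The
  # refresh button's controller action can call this method directly too.
  def update_upvotes!
    @upvotes = fetch_upvotes_from_reddit
  end

  private

  # Placeholder for the real HTTP request and parsing.
  def fetch_upvotes_from_reddit
    42
  end
end

# Equivalent of lib/tasks/scrape.rake; in Rails, declare
# `task scrape: :environment` so the models are loaded.
desc "Refresh upvote counts for all tracked posts"
task :scrape do
  Post.all.each(&:update_upvotes!)
end

Rake::Task[:scrape].invoke
```

This also answers the folder question: the scraper does not run inside your web server at all. The rake task (driven by cron, whenever, or a PaaS scheduler) is a separate process from the Rails server, while the refresh button reuses the same model method inside a normal request.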
I'm making a blog application and want it to automatically generate a new blog post every day, starting from a user-defined start date. I've heard of a few gems (Clockwork, Whenever, Rufus-Scheduler) and I'm not sure which is best for this, or whether any of them can even do it.
Has anyone had experience with using any of these gems for something like this? I'm feeling very confused at the moment.
Thanks!
I can't think of any gem as such that will generate/create a post for your user. You have to write your own script for creating posts; the schedulers and other gems you mentioned will just help you run that script, in the form of a rake task, at regular intervals of your choice.
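For example, the script could be a single method that the daily job calls. Everything below is a sketch (Post is a stand-in for your ActiveRecord model), written so the task is safe to re-run:

```ruby
require "date"

# Stand-in for the ActiveRecord Post model.
Post = Struct.new(:title, :published_on)
POSTS = []

# Creates the day's post, but only once the user-defined start date has
# passed, and only if today's post doesn't already exist. That makes the
# scheduled task idempotent, so an accidental second run does no harm.
def generate_daily_post(start_date, today = Date.today)
  return nil if today < start_date
  return nil if POSTS.any? { |p| p.published_on == today }
  POSTS << Post.new("Post for #{today}", today)
  POSTS.last
end

generate_daily_post(Date.new(2024, 1, 1), Date.new(2024, 1, 5))
```

Whichever gem you pick (Whenever, Clockwork, Rufus-Scheduler, or a platform scheduler) then only supplies the "call this once a day" part.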
Given an existing rails app with background processes managed by Sidekiq.
How can I migrate calls such as:
Model.delay.some_action()
From Sidekiq's syntax to Active Job's back-end agnostic syntax?
Update:
@craig.karminsky has pointed out the well-written Sidekiq-to-Active-Job wiki page. That page addresses mailers.
Old Sidekiq syntax:
MyMailer.delay.send_message()
Active Job syntax:
MyMailer.send_message().deliver_later
That's a good solution for mailers, but how can I migrate non-mailer calls to .delay, such as:
NotAMailer.delay.do_something()
Mike Perham, the creator of Sidekiq, has a very good Wiki page up for just such a situation.
I've made a gem for that on top of Active Job: https://github.com/cristianbica/activejob-perform_later/. However, the gem isn't very well tested, and this pattern is not a good one: having code throughout your application that gets executed with a delay will make your app hard to maintain and trigger all kinds of bugs.
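The maintainable alternative to ad-hoc .delay calls is an explicit job class per operation. A sketch with made-up names; ApplicationJob is stubbed below so the snippet is self-contained, whereas in Rails it inherits from ActiveJob::Base:

```ruby
# Stub so the sketch runs outside Rails; the real ApplicationJob inherits
# from ActiveJob::Base, and perform_later enqueues rather than running inline.
class ApplicationJob
  def self.perform_later(*args)
    new.perform(*args)
  end
end

# The class whose method used to be invoked as NotAMailer.delay.do_something
class NotAMailer
  def self.do_something(name)
    "did something for #{name}"
  end
end

class DoSomethingJob < ApplicationJob
  # Job arguments must be serializable: the real Active Job round-trips
  # them through the queue backend (Sidekiq, in this migration).
  def perform(name)
    NotAMailer.do_something(name)
  end
end

# Before: NotAMailer.delay.do_something("alice")
# After:
DoSomethingJob.perform_later("alice")
```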
In my Rails app, the user can upload a file, and when the file is uploaded I want to start a rake task which parses the file and feeds all the tables in the database. Depending on how large the file is, this can take some time. I assume something like system "rake task_name" in my controller would do the work.
However, is this best practice? Is it safe? Because that way, any user would be starting a rake process. In this Railscast they recommend running rake in the background. What would be an alternative, or the common practice?
I was using Rails 3.2, so I couldn't use Active Job. I used Sidekiq and it worked fine.
Check a very helpful tutorial here
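As a sketch of the Sidekiq approach for the upload case (all names invented; Sidekiq::Worker is stubbed so the snippet runs standalone), the controller enqueues a job with the stored file's path instead of shelling out to rake:

```ruby
# Stub so this runs without the sidekiq gem; in the app, `require "sidekiq"`
# provides the real module, and perform_async enqueues the job into Redis.
module Sidekiq
  module Worker
    def self.included(base)
      def base.perform_async(*args)
        new.perform(*args) # the real Sidekiq runs this later, in a worker process
      end
    end
  end
end unless defined?(Sidekiq)

class FileImportWorker
  include Sidekiq::Worker

  # Pass the saved file's path (a plain string), not the upload object:
  # Sidekiq serializes job arguments to Redis as JSON.
  def perform(path)
    rows = 0
    File.foreach(path) do |line|
      rows += 1
      # parse `line` and insert a row into the database here
    end
    rows
  end
end

# In the controller, after saving the upload to disk:
#   FileImportWorker.perform_async(saved_path)
```

The request returns immediately, and the slow parsing happens in the Sidekiq process, so a user can never tie up your web dynos or spawn arbitrary rake processes.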
I have to call external APIs every hour to fill my database, which is hosted on Heroku.
For this purpose, I have a Ruby script that gets all the data from the external APIs and outputs it to stdout. Now I would like to store those results in my database. I have different ways to do it (please leave a comment if you know a better way).
What I have (Constraints) :
Ruby on Rails Application running on Heroku
PG Database hosted on Heroku
"Cars" model, with "Title", "Description", "Price" attributes, and 1 other nested attribute from "Users" Model (So same schema in PG).
Ruby Script that query the differents externals API
Ruby Script have to be called each hours / 2 hours / days. The script is going to run for about 10 minutes -> 2 hours depending of the number of results
My 3 different ways to do it:
Running the script on an EC2 instance and filling my database by connecting directly to it, not through the Ruby on Rails REST API.
The problem is that this never runs the Ruby on Rails validators, so for example if my schema changes, or if I have to validate some data, it won't happen.
Running the script on an EC2 instance and filling my database with calls to my RoR REST API, i.e. sending the data as JSON / XML. The problem is that if I have more than 1000 API calls, I think it can make my dynos suffer under high load.
Running my script on a specific dyno on Heroku (I need more information here; I can't find much about this on Heroku).
What do you think? I need something really flexible: if tomorrow I change my "Cars" model, switching between the old and new model has to be easy.
Thank you.
I would think that the best approach would be to use a background process to perform the work. Gems like http://sidekiq.org/ and DelayedJob both have the ability to schedule jobs (which could then reschedule themselves for 2 hours later in your case).
On Heroku, workers run separately from your web dynos, so they won't interfere with web performance. It also keeps things simple in that you don't need to expose an API, since you'll have direct access to your models from the worker.
There are plenty of Heroku docs on this subject:
https://devcenter.heroku.com/articles/background-jobs-queueing
https://devcenter.heroku.com/articles/delayed-job
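The "reschedule themselves" pattern can be done with Sidekiq's perform_in. A sketch (the worker name and sync body are invented, and Sidekiq is stubbed so the snippet is self-contained):

```ruby
# Stub standing in for `require "sidekiq"` so this runs without the gem;
# the real perform_in enqueues the job into Redis with a delay.
module Sidekiq
  module Worker
    def self.included(base)
      def base.perform_in(interval, *args)
        "scheduled in #{interval}s"
      end
    end
  end
end unless defined?(Sidekiq)

class SyncCarsWorker
  include Sidekiq::Worker

  INTERVAL = 2 * 60 * 60 # two hours, in seconds

  def perform
    sync_cars_from_external_apis
  ensure
    # Re-enqueue ourselves so the sync keeps running every two hours,
    # even if this run raised.
    self.class.perform_in(INTERVAL)
  end

  private

  def sync_cars_from_external_apis
    # Call the external APIs and upsert Car records here. Going through
    # the models keeps the Rails validations in play (your concern with
    # writing directly to the database from EC2).
  end
end

# Kick off the cycle once (e.g. from a console or a deploy task):
#   SyncCarsWorker.perform_in(0)
```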
You can do this by writing your scripts as rake tasks and then using Heroku Scheduler to run your task(s) at specific intervals:
https://devcenter.heroku.com/articles/scheduler
You can separate your tasks by schedule if you have multiple, and just add multiple scheduler entries. They run in one-off dynos (which you pay for at the normal rate), and since they run from the same code base, they can leverage all your existing app code (models, libs, etc.).
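The scheduled task itself stays small; everything below is illustrative (the namespace, task, and CarSync names are invented):

```ruby
# lib/tasks/sync.rake -- in the Heroku Scheduler dashboard, set the job's
# command to `rake cars:sync` at your chosen frequency (10 min / hourly / daily).
namespace :cars do
  desc "Pull fresh data from the external APIs into the Car model"
  task sync: :environment do
    CarSync.run # hypothetical service object that upserts Car records via the models
  end
end
```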