How to handle Redis pub/sub in Rails

I have a Rails app which talks to a socket.io app via Redis.
I can create a message on the Rails app and socket.io broadcasts it, but I have no idea how to handle the incoming messages (i.e. I want a service that is always listening to Redis and processes the incoming messages).
Can you please tell me how I can achieve this?

You can use the sidekiq gem here, which uses Redis by default: you enqueue a job (via Sidekiq's perform_async, or ActiveJob's perform_later with the Sidekiq adapter), and a Sidekiq worker will run it whenever one is free.
For the pub/sub you mention in the title, you can go for the Action Cable feature provided by Rails.
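If you want the dedicated, always-on listener described in the question, a minimal sketch using the redis gem could look like the following; the channel name and IncomingMessageJob are assumptions, and it should run as its own long-lived process (e.g. via bundle exec rails runner).

require "redis"

redis = Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379/0"))

# subscribe blocks this connection, so this loop owns the process
redis.subscribe("socketio.events") do |on|
  on.message do |_channel, payload|
    # Hand the payload off to a background job so the loop keeps listening
    IncomingMessageJob.perform_later(payload)
  end
end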

Related

Is there a way to start a Rails Puma web server without a DB connection?

I have a microservice that takes in webhooks to process, but it is currently getting pounded by the sender of said webhooks. Right now I insert the webhooks into the DB for processing, but the data is so bursty at times that I don't have the bandwidth to manage the flood of requests, and I cannot scale any further because I'm out of DB connections.

The current thought is to throw the webhooks into a Kafka queue for processing instead: with Kafka I can scale the number of front-end workers up to whatever I need to handle the deluge of requests, and I get Kafka's replayability. Since the webhooks go straight into Kafka, the front-end web server no longer needs a pool of DB connections; it literally just takes each request and throws it onto the queue.

Does anyone have any knowledge on removing the DB connectivity from Puma, or an alternative way to do what's being asked?
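A sketch of that Kafka hand-off, assuming the ruby-kafka gem (broker address, topic name, and controller are made up for illustration):

require "kafka"

# One producer per process; brokers and client_id are assumptions
KAFKA = Kafka.new(["kafka1:9092"], client_id: "webhook-frontend")

class WebhooksController < ActionController::API
  def create
    # No DB involved: the raw request body goes straight onto the topic
    KAFKA.deliver_message(request.raw_post, topic: "webhooks")
    head :accepted
  end
end

For sustained bursts, ruby-kafka's async_producer buffers messages and delivers them in the background instead of making one synchronous round-trip per request.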
Currently running
ruby 2.6.3
rails 6.0.1
puma 3.11
Ended up using Puma's before_fork and on_worker_boot hooks in the config to not re-establish the database connection for those particular web workers.
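A sketch of what that config can look like; the worker count is arbitrary, and the empty on_worker_boot is the point, since it deliberately skips re-establishing the ActiveRecord connection:

# config/puma.rb
workers 2
preload_app!

before_fork do
  # Close the master process's DB connections before forking workers
  ActiveRecord::Base.connection_pool.disconnect! if defined?(ActiveRecord)
end

on_worker_boot do
  # Deliberately do NOT call ActiveRecord::Base.establish_connection here,
  # so these web workers never hold a database connection
end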

It seems that Sidekiq doesn't support Action Cable in Rails 5?

So I'm trying to broadcast a message in one of the jobs. The job is being executed, but it's not broadcasting the message.
Is it because Action Cable requires Puma to perform message broadcasting?
So I guess the reason it wasn't working was that the cable adapter wasn't set to redis in config/cable.yml.
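For anyone hitting the same thing: the async adapter (the development default) only broadcasts within a single process, so jobs running in Sidekiq can't reach subscribers connected to Puma. Pointing both environments at Redis fixes that; the URL below is an assumption:

# config/cable.yml
development:
  adapter: redis
  url: redis://localhost:6379/1

production:
  adapter: redis
  url: <%= ENV.fetch("REDIS_URL", "redis://localhost:6379/1") %>

With that in place, an ActionCable.server.broadcast call from inside a job goes through Redis and reaches clients connected to the web process.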

How does Redis work with Rails and Sidekiq

Problem: need to send e-mails from Rails asynchronously.
Environment: Windows 7, Ruby 2.0, Rails 4.1, Sidekiq, Redis
After setting everything up, starting Sidekiq and starting Redis, I can see the mail request queued to Redis through the monitor:
1414256204.699674 "exec"
1414256204.710675 "multi"
1414256204.710675 "sadd" "queues" "default"
1414256204.710675 "lpush" "queue:default" "{\"retry\":true,\"queue\":\"default\",\"class\":\"Sidekiq::Extensions::DelayedMailer\",\"args\":[\"---\\n- !ruby/class 'UserMailer'\\n- :async_reminder\\n- - 673\\n\"],\"jid\":\"d4024c0c219201e5d1649c54\",\"enqueued_at\":1414256204.709674}"
But the mailer method never seems to get executed. The mail doesn't get sent and none of the log messages show up.
How does Redis know to execute the job on the queue, and does something else need to be set up in the environment for it to know where the application resides?
Is delayed_job a better solution?
I started redis in one window, bundle exec sidekiq in another window, and rails server in a third window.
How does an item on the redis queue get picked up and processed? Is sidekiq both putting things on the redis queue and checking to see if something was added that needs to be processed?
Redis is used just for storage. It stores jobs to be done. It does not execute anything. DelayedJob uses your database for job storage instead of Redis.
Rails process pushes new jobs to Redis.
Sidekiq process pops jobs from Redis and executes them.
In your MONITOR output, you should see LPUSH commands when Rails sends mail. You should also see BRPOP commands from Sidekiq.
You need to make sure that both Rails and Sidekiq processes use the same Redis server, database number, and namespace (if any). It's a frequent problem that they don't.
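A sketch of pinning both sides to the same Redis; the URL is an assumption, and if you use redis-namespace the same :namespace key has to appear on both sides:

# config/initializers/sidekiq.rb
redis_settings = { url: ENV.fetch("REDIS_URL", "redis://localhost:6379/0") }

Sidekiq.configure_server do |config|
  config.redis = redis_settings # used by the sidekiq process
end

Sidekiq.configure_client do |config|
  config.redis = redis_settings # used by the Rails process when enqueuing
end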

rails: deploy workers for delayed_job

Are there any good practices for setting up a queue to work with delayed_job in Rails?
To be more precise: I intend to ping some webhooks from my Rails API. Using delayed_job, the pseudocode could look like:
get :ping do
  present ping: :pong # grape style

  # Bad, synchronous idea: this waits for the tracker's answer before going on
  # (track_event stands in for the original `send`, which clashes with Ruby's built-in Object#send)
  MyAwesomeTracker.track_event(event: "ping")

  # Better: put it on the queue using delayed_job's .delay proxy
  MyAwesomeTracker.delay.track_event(event: "ping")
end
Now whether I use delayed_job or Resque, I'm able to send events into the queue, which is great.
The actual question: are there any good practices for deploying workers whenever I deploy my API?
What about worker failures? Is there a setup where a worker can be restarted after a crash/failure?
I've seen that a worker can be launched by running rake some_command, but what I'm wondering is how to set up an environment where a simple cap production deploy would set up both the API application and some workers that listen to the queue.
Thanks in advance!
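One way to wire this into cap production deploy is a Capistrano task that restarts the delayed_job daemon after each release; this is a sketch assuming Capistrano 3 and delayed_job's bundled bin/delayed_job script (which needs the daemons gem), with the role name and worker count as assumptions:

# config/deploy.rb
namespace :delayed_job do
  desc "Restart delayed_job workers"
  task :restart do
    on roles(:worker) do
      within current_path do
        with rails_env: fetch(:rails_env) do
          # -n sets the number of worker processes
          execute :bundle, :exec, "bin/delayed_job", "-n", "2", "restart"
        end
      end
    end
  end
end

after "deploy:published", "delayed_job:restart"

For the crash/failure part, supervising the daemon with monit or a systemd unit set to restart on failure will bring workers back up automatically.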

Enqueue and run jobs in Sidekiq from two Rails servers

I have two servers: a web server (front-end) and an analytics server (back-end). I need to pass a job from the front-end server to the back-end server through Sidekiq.
My hack is:
Install Sidekiq on both the web server and the back-end server. I now have a front-end Sidekiq and a back-end Sidekiq.
Configure the front-end Sidekiq so that it points to the Redis server of the back-end Sidekiq. In other words, the two Sidekiqs share the same Redis database server.
Now I need to enqueue a job from the front-end Sidekiq, then execute code on the back-end Sidekiq.
How should I go about doing it?
Sidekiq is a distributed messaging queue, and the whole purpose of it is use cases like the one you describe. Just set up a queue for the front-end to read and a queue for the back-end to read; when you read a message from the front-end queue, insert it into the back-end queue.
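A sketch of that hand-off; the class name, queue name, and arguments are assumptions. The front-end can enqueue by class name without loading the worker code, and the back-end defines the worker and processes the shared queue:

# Front-end app: push a job by name onto the shared Redis
require "sidekiq"

user_id = 42                      # placeholder arguments
payload = { "event" => "signup" }

Sidekiq::Client.push(
  "class" => "AnalyticsWorker",
  "queue" => "analytics",
  "args"  => [user_id, payload]
)

# Back-end app: define the worker and run it against the shared Redis
# with: bundle exec sidekiq -q analytics
class AnalyticsWorker
  include Sidekiq::Worker
  sidekiq_options queue: "analytics"

  def perform(user_id, payload)
    # back-end analytics processing happens here
  end
end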
