ActionController::InvalidAuthenticityToken coming suddenly - ruby-on-rails

I have an old project in Rails 5. I had to add an API, and everything was working fine 3 days ago, but now it has suddenly started giving me ActionController::InvalidAuthenticityToken. I have made no changes to any web-facing controller; I only added a few gems, including rspec-rails and jwt, while building the API. Suddenly Chrome is giving me this error.
When I started work I tested and it was working fine, and it still works fine in Safari, but Chrome gives this error. The following line is in my application; if I remove it the error goes away, but I think that would make the app insecure.
protect_from_forgery with: :exception, prepend: true
I checked a few answers with long lists saying this is an old issue, but I have worked on many Rails projects and I have never seen it before! Some posts directed me to use HTTPS, so I did, but the issue is still there in Chrome.
Any help?

I originally had only a me-too comment, but by sheer luck I happen to know the answer.
It is not your code that changed; it's the browser.
Check the news about the SameSite cookie policy changes from Google.
Basically, the session cookie no longer works in your environment because of changes in the browser, rendering the CSRF token unusable.
You have to configure Rails.application.config.session_store in an initializer; unfortunately there is no one-line fix-all here, as it depends on the environment and the situation.
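There is no single right configuration, but as a rough sketch of the kind of change involved (the cookie name here is a placeholder, and the same_site option needs a Rack/Rails version recent enough to understand it):
# config/initializers/session_store.rb
# Sketch only: whether you need SameSite=None depends on whether the session
# cookie is used cross-site. Chrome only accepts SameSite=None cookies that
# are also Secure, i.e. served over HTTPS.
Rails.application.config.session_store :cookie_store,
  key: "_my_app_session",
  same_site: :none,
  secure: true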

Just put the line below in your ApplicationController:
skip_before_action :verify_authenticity_token
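If you do this, a less drastic variant is to skip verification only in the API controllers instead of app-wide, so browser-facing forms keep their CSRF protection (the Api::BaseController name below is just an illustration, not from the question):
# app/controllers/api/base_controller.rb (hypothetical API namespace)
module Api
  class BaseController < ApplicationController
    # JWT-authenticated API requests don't rely on the session cookie,
    # so CSRF verification can be skipped for them without affecting web forms.
    skip_before_action :verify_authenticity_token
  end
end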

Related

ActiveAdmin taking wrong HTTP method for update and destroy actions

Rails version - 5.2
Active admin version - 2.9.0
I have installed and configured Active Admin in my Rails API application. Everything is working fine except the update and delete actions of any controller, and logging out the admin user.
Here is my application.rb file.
I have added method override in the application.rb file, yet it still uses the POST request method for any update or delete request. It works fine locally even though POST is used, but when I deployed the code to the staging environment I found the problem: on staging, that route is not present, so it returns a 404 error.
Below is a screenshot of the update admin user request.
Can someone please help me to fix this issue?
I have finally fixed the issue. I assume the problem was with my staging web server configuration, since the same code worked fine on my local machine in both the local and staging environments.
Posting the answer here in case it helps people in the future.
By default, browsers only support GET and POST requests for HTML forms. If we want to use any other request method, we need to pass it in the _method parameter. You can read more about it here.
That wasn't happening in my case even though I had added config.middleware.use Rack::MethodOverride in application.rb.
To resolve the issue, I added use Rack::MethodOverride to my config.ru file, so the method is rewritten before the Rails application handles the request. I added this and everything is working fine now.
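For reference, a minimal sketch of that config.ru, assuming an otherwise standard Rails setup:
# config.ru
require_relative "config/environment"

# Rewrite POST requests carrying _method=PUT/PATCH/DELETE before Rails sees them
use Rack::MethodOverride
run Rails.application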

Getting a white page when testing new gem/engine. Logs show views rendering, but the page is empty

I've been developing a gem/engine for a little bit, and using an existing app of mine to test it out. Everything has worked great. Then I went to add it to another app and suddenly it wasn't working. I figured it was because that app had some weird set up, but I've tried on a couple of other apps now and they all do the same thing.
When I visit the engine in the browser, the page is completely empty: the browser receives nothing from the server. No head, style, script, body. Just a blank page. I can visit other routes in these apps just fine and those pages load as expected, but every route added by the engine has this same problem. The strangest part is that in my server logs I see the correct controller hit, I see the views being rendered, and I get the 200 OK at the end. I can add puts calls in the gem's views and they show up in the logs during the request. There is absolutely no sign that anything went wrong anywhere, and yet... white page. I'm at a loss as to where to even start debugging this. Does anybody have any experience with anything of the sort?
I created a new test app to verify and things worked as expected, but for some reason other existing apps have this issue.
Working apps:
Rails 5.0.2
ruby 2.7.2p137
Fresh/blank app: (Working)
Rails 6.1.4
ruby 2.7.2p137
Apps that don't work:
Rails 6.1.3.1
ruby 2.7.2p137
Rails 6.1.4
ruby 3.0.2p107
I'm not even sure where to start debugging this as there is no "error" so I'm at a bit of a loss.
The repo for the gem is here: https://github.com/Rockster160/command_proposal but I'm mostly looking for thoughts on how to work through why this issue is happening and how to debug it.
It turns out this was caused by the app the engine was added to having links inside the layout (like a nav bar). I had a rescue block that caught those missing-route errors and attempted to look the route up in main_app; however, that rescue still registered that an error had been thrown, so it rendered an empty file. It seems odd that this would be the result, but I finally discovered that was the case.
To fix the routing issue, I had to include the main app's route helpers in my engine's base controller:
helper Rails.application.routes.url_helpers
However, this broke my engine routes, so I had to go through each of my engine's routes and explicitly call them off the engine.
tasks_path # Old way
command_proposal.tasks_path # New way
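Putting the pieces together, a rough sketch of how this looks in the engine (the CommandProposal module name comes from the gem above; the view examples are illustrative):
# app/controllers/command_proposal/application_controller.rb
module CommandProposal
  class ApplicationController < ActionController::Base
    # Expose the host app's URL helpers so layout links (e.g. a nav bar) resolve
    helper Rails.application.routes.url_helpers
  end
end

# In engine views, call engine routes off the engine proxy and host routes off main_app:
#   command_proposal.tasks_path   # engine route
#   main_app.root_path            # host-app route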

Delayed_Job on Heroku showing "Forbidden" error

I am using the delayed_job gem on a Heroku deployment. It has been working fine for a few months, but suddenly when I try to access mydomain.com/delayed_job it shows one word: "Forbidden".
When I check heroku logs it doesn't show an error, but does show that the page was requested.
Any idea why this would happen? It is especially confusing since it has been working fine until now.
In the end I tried accessing the same page in a different browser and it worked, so the error likely has something to do with the local browser cache.
I had this too; it seems to be similar to the problem described in this issue. The CSRF protection gets false positives.
In my case, I only use DelayedJobWeb in development, so I created an initializer to disable session protection. This would probably be a bad idea in production, but here's how to disable it in development:
config/initializers/delayed_job_web.rb:
if Rails.env.development?
  class DelayedJobWeb
    disable :sessions
    set :protection, false
  end
end
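For context, DelayedJobWeb is the Sinatra app that the delayed_job_web gem typically has you mount in your routes, which is how /delayed_job gets served in the first place:
# config/routes.rb (typical delayed_job_web mount, shown for context)
match "/delayed_job" => DelayedJobWeb, anchor: false, via: [:get, :post]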

Missing cookie in Rails 4

For some reason the session cookie on my app is not being set properly in production. This problem seemed to have just appeared overnight, with no changes on my end that I can think of. There is only one domain involved.
A session cookie is set when I run the app in development on localhost, so there is something strange happening with the server. If I inspect the cookies on the server side, it gives me a list, but the cookie is not being set in the browser. Also, I can manually create a test cookie on the server side, and it shows up on the browser. It's only the session cookie that is not showing up.
I tried changing the session store from memcached to cookiestore, which doesn't seem to have helped - still no session cookie. So I don't think it's the session_store code.
Using Rails 4.0.2 and passenger 4.0.19 with whatever version of nginx it installs. ruby 1.9.3. Any help would be appreciated - I'm completely stumped.
They already fixed this in the GitHub repo, and it will be released at any moment.
Anyway, if someone is on Rails 2 and still has this bug, or doesn't want to update Passenger, it can be fixed like this:
class ApplicationController < ActionController::Base
  after_filter :set_headers

  def set_headers
    response.headers["Date"] = "#{Time.now.utc}"
  end
end
UPDATE
Here is the official post explaining what happened.
I've just updated the Phusion Passenger gem to 4.0.30. It is quite straightforward and includes the fix for this bug. Official instructions here.

Cookies not saving

I'm working on an application at localhost:3000. I just started working with cookies and can't get them to stay saved after I quit Chrome. I checked my preferences and they were fine; cookies from other websites like Stack Overflow are being retained. I've tried multiple ways of saving the cookies, including
cookies.permanent[:guest_user_id] = create_guest_user.id
and it's not working (create_guest_user is a method for implementing a guest_user, taken from https://github.com/plataformatec/devise/wiki/How-To:-Create-a-guest-user). Also, Devise isn't saving anything either when I check remember me at the login page. I even added
Devise::TRUE_VALUES << ["on"]
as was recommended by another post, and that didn't work for me either. I'm using Rails 3.1.1, Formtastic 2.0.2, and Devise 1.5.1. I'm running Mac OS 10.6.8 and Chrome 15.0.874.121. Thanks for the help.
UPDATE: I even tried changing my hosts file as was recommended here: Can I use localhost as the domain when setting an HTTP cookie? It still isn't working. Am I missing something obvious?
I guess there just isn't a relatively easy way of solving this problem in Chrome. I tried a bunch of solutions from various sources with little success. I'm not sure why Google designed Chrome this way, since its developer tools are great and I'm reluctant to switch browsers.
