Is there a good way to enforce and/or detect (at runtime) that a specific page rendered by rails is only using absolute paths?
For context, we'd like to set up a reverse proxy (not a redirect!) such that www.my-website.com/some-path actually loads a page from a Rails application on a subdomain - say, blog.my-website.com/some-path.
The problem is that if any URL on the page (say, the CSS/JS assets, a link, a form path, whatever) is relative (e.g. /packs/css/some-path/index-fe88bb82.css), then the page won't load correctly and/or will have broken functionality. This can easily be fixed by updating everything to use absolute URLs (e.g. https://blog.my-website.com/packs/css/some-path/index-fe88bb82.css). However, this feels fragile and error-prone without some tooling to support it.
What I'd ideally like is some tooling to either enforce "absolute paths only" (e.g. could calling any of the _path methods raise an exception?!), or detect if this has happened (I could scan the rendered HTML for relative paths, but that seems expensive!).
The solution only needs to work on a specific controller action, not the entire application.
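For example, something along these lines is the kind of check I'm imagining (untested - the controller/action names and the regex are just placeholders):

    # app/controllers/blog_controller.rb - hypothetical names, rough sketch
    class BlogController < ApplicationController
      # only check the one action that is served through the reverse proxy
      after_action :warn_on_relative_urls, only: :show

      def show
        # ...normal rendering...
      end

      private

      # Scan the rendered HTML for href/src/action attributes that start with "/"
      # (host-relative paths) and complain so it gets noticed in development/tests.
      def warn_on_relative_urls
        return unless response.content_type.to_s.include?("text/html")

        relative = response.body.scan(/\b(?:href|src|action)=["'](\/[^\/"'][^"']*)["']/)
        return if relative.empty?

        message = "Relative URLs found: #{relative.flatten.uniq.join(', ')}"
        Rails.env.production? ? Rails.logger.warn(message) : raise(message)
      end
    end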
Does anyone have a suggestion for what strategy I could try? (Maybe even some tool outside the rails ecosystem to just monitor the live pages??)
Related
I'm trying to host multiple Rails apps for my blog - something like www.blog.com/app1 serving one Rails app and www.blog.com/app2 serving another. How do I do it?
Although I agree with the downvotes, as pointed out in the first comment, I had this problem myself several months ago and actually didn't even try to solve it once I realized how many implications it has. Existing answers on Stack Overflow address either a slightly different or a narrower issue, so they may use some of the things mentioned here, but they don't elaborate on the implications or alternatives; there is, however, an interesting overview (and also another answer to that question). Anyway, I took it as a challenge and dived in.
First, there are multiple approaches depending on your scenario:
All applications are code which you maintain – it's probably best to explore something called engines. They are like mini RoR applications mountable at a certain path within a normal RoR application. This has many benefits, like sharing the same runtime and simple isolation configured in one place (see the sketch after this list).
If there is no AJAX with URLs or similar dynamism – or if it is actually AHAH (asynchronous HTML and HTTP, i.e. returning HTML fragments instead of XML or JSON data), which is very natural for Rails although often not used – you can use a sophisticated proxy module like mod_proxy_html, which rewrites links inside HTML documents while proxying. Similar modules exist for nginx but are not part of the standard distribution.
RoR has a configuration option, relative_url_root, which allows deployment to a subdirectory. It's very fragile and often buggy – many gems or engines break when you use it – so beware. When you get it right, it looks like magic. However, the configuration related to the subdirectory will be scattered throughout different software configs and your code.
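As a minimal illustration of the first option (the engine names here are made up), engines are mounted in the host application's routes:

    # In the host application's Gemfile, point at the engines' code, e.g.:
    #   gem "app1", path: "engines/app1"

    # config/routes.rb of the host application
    Rails.application.routes.draw do
      # All of each engine's routes become available under its prefix,
      # and its URL helpers stay isolated (e.g. App1::Engine.routes.url_helpers).
      mount App1::Engine, at: "/app1"
      mount App2::Engine, at: "/app2"
    end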
I created an example repository while exploring the last option. The README should say everything necessary to run the code.
The most important observation from this small project is that when using a relative URL root, you almost certainly want to scope all your routes. Different setups are possible, but they are even more complicated (which doesn't mean they don't make sense). For examples, see the answer with the overview mentioned above.
By default (without scoped routes), only asset paths are prefixed with the relative URL root, not action route paths, which makes the URLs generated by route helpers useless unless they are translated by mod_proxy_html or some more custom solution.
Another important observation, which relates to the official guide, code “out there” and answers to similar questions here on Stack Overflow, is that it's good to avoid a forward slash at the beginning of the relative URL root – it behaves inconsistently between tests and the rest of the code. The value can still be used cleanly throughout your code; see the scope definition in the routes config or the dummy controller test case.
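To make the route-scoping point concrete, here is a minimal sketch (the prefix "app1" is a placeholder, not the repository's actual value; note it carries no leading slash – scope adds that itself):

    # config/routes.rb – minimal sketch
    Rails.application.routes.draw do
      scope "app1" do                  # "app1" stands in for the relative URL root
        root "dummy#action"            # root_path => "/app1"
        # ...all other routes go inside the same scope...
      end
    end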
I got to these and other observations by creating two very simple and almost identical Rails 5.2 applications. Each has one action (dummy#action) with a route scoped to the relative URL root. This action, or rather its view, does two important things to verify that everything works:
it outputs the result of calling the root_path helper, which shows that the URL/path helpers are set up correctly (thanks to the scoped route in config/routes.rb)
it loads a static asset which isn't served by the Rails application but directly by Apache HTTP Server, and which is referenced via the image_path helper
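Such a view could be as small as something like the following (an approximation, not the repository's exact code):

    <%# app/views/dummy/action.html.erb – approximate sketch %>
    <p>root_path resolves to: <%= root_path %></p>
    <%# image_path builds the asset URL; the file itself is served by Apache %>
    <img src="<%= image_path('static.png') %>" alt="static asset">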
You can see that the virtual host configuration has a rather extensive list of URLs which shouldn't be passed through the proxy and instead rely on aliased directories. However, this is application-specific and very configurable, so a simpler setup with a different directory layout is definitely achievable – but that's an entirely separate topic.
If you like Passenger and don’t want to use proxying in your HTTP server, you can find more information in their deployment tutorial.
My goal is to embed a full Ruby on Rails app in a Wordpress site. Ideally, the staff should be able to edit the non-Rails parts of the site just like any Wordpress site, including content, theme, menus, etc. When the user clicks a link to a certain page, it should show the Rails content within the Wordpress template (headers, menus, sidebars, etc.), all of which should look the same as the rest of the site. I'd much prefer to do the presentation of the Rails content within Rails, where I can use Slim, CoffeeScript, SASS, and all of the other built-in presentational magic, rather than setting Rails up as just a JSON server and having to muck around with PHP to retrieve and format the data.
I've tried a few techniques so far, but each has its downsides:
First, I tried embedding the Rails app in an iframe, but iframes are clunky, and I couldn't get the frame to expand or contract to the height of the content.
Second, I tried creating a special Wordpress template file that loads the Rails content using PHP's file_get_contents function. That worked okay, but required a lot of jankiness to get URLs to transfer over, including having to add a question mark into the URL for a certain subpage to get Wordpress to ignore it and pass it through. It's also a little slow because it's loading content from two different dynamic systems. And I never could figure out how to get cookies to pass through to Rails, which is essential for this app.
My third solution was to create a blank page on Wordpress (not linked to anywhere) and write a method in Rails that pulls that page, creates a layout file, and then uses that as the layout for the content. So the whole page is actually being served by Rails, but to the user it looks just like Wordpress, and because Wordpress doesn't use relative URLs in links, they all work just fine. The trouble there is that changes to the Wordpress template (including menus, template files, theme options) don't take effect on the Rails pages until that method is run. I set up a cron job to run it every 15 minutes, but that's not ideal, and something tells me there must be a better way.
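For reference, that layout-generating method is conceptually something like the following rough sketch (the URL, the placeholder marker and the file path here are made up, not my actual code):

    # lib/tasks/wordpress_layout.rake – rough sketch of the idea
    require "net/http"

    namespace :wordpress do
      desc "Fetch the blank Wordpress page and save it as a Rails layout"
      task refresh_layout: :environment do
        html = Net::HTTP.get(URI("https://www.example.com/blank-rails-page/"))

        # The blank page contains a marker where the Rails content belongs.
        layout = html.sub("RAILS_CONTENT_PLACEHOLDER", "<%= yield %>")

        File.write(Rails.root.join("app/views/layouts/wordpress.html.erb"), layout)
      end
    end

The controller then uses it via layout "wordpress", and the cron job runs rake wordpress:refresh_layout every 15 minutes.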
This seems like something that would be pretty common, but I haven't been able to find any solutions online. Has anyone else made this work?
I am taking over development on a Rails 3.2 application and am looking for the best way to improve page loading time. The site itself is more of a large dynamic website than an actual web application (the site is http://korduroy.tv/, a surf lifestyle community site), and while there are a couple of small pieces that differ from user to user, most of the site is the same experience for everyone.
Page loading time is fairly slow, and from looking at the server logs, it seems to be because each page is loading so much dynamic content (for example, most pages load resources from 10+ models). While I hope to go through and refactor what I can, I am looking for some basic performance wins. Knowing that most of the site is the same for every user, is there a way to aggressively cache the content on the server, or even serve static content that has been generated through some kind of background job?
My initial thought was to create a job that uses a static site generator, maybe something like Jekyll, and basically creates a static copy of the site, which could then be served from a CDN. My gut is telling me this is probably not the way to do it, plus there are some pages (such as the user profile page) that need to be served dynamically.
Any recommendations would be great. Disclaimer: I come from front-end land and have very little knowledge of best practices when it comes to server-side optimizations. Thanks!
From what you write, I believe your biggest gain will be in implementing a fragment cache using a memcache store. See http://guides.rubyonrails.org/caching_with_rails.html as the definitive guide on rails caching.
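A rough sketch of what that can look like (the cache store host, the instance variable and the cache key are invented here, not specific to your app):

    # config/environments/production.rb – memcached-backed cache store (dalli gem)
    config.cache_store = :mem_cache_store, "localhost:11211"

    <%# in a view – wrap an expensive, user-independent section in a fragment cache %>
    <% cache ["homepage-videos", @videos.maximum(:updated_at)] do %>
      <%= render partial: "videos/video", collection: @videos %>
    <% end %>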
You might be able to get away with a page cache or action caches for some of the content that doesn't depend on the user (like the homepage), but unless you're serving up millions of requests a day, I'm not sure this is necessary.
I notice that while the JavaScript and CSS seem to be compiled according to the Rails asset pipeline, the images are missing the fingerprint (digest) hashes that allow aggressive browser caching of those resources (since you don't have to worry about the contents changing – they get new hashes whenever you change the images). The key here is enabling the asset pipeline, making sure you compile your assets as part of deployment (rake assets:precompile), and using the image_tag and asset_path helpers (or the image-url Sass helper). Also make sure that nginx responds with 304 (Not Modified) to your browser when you're refreshing a page. This won't affect the load on the Rails server (unless you have both nginx and Rails running on the same server), but it will reduce the average page load time.
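In the templates, the concrete change is from hard-coded image paths to the asset helpers, for example (file names here are illustrative):

    <%# before: a hard-coded path that never gets a fingerprint %>
    <img src="/images/logo.png">

    <%# after: image_tag emits the fingerprinted path from the asset pipeline, %>
    <%# e.g. /assets/logo-9c0a079b....png, which browsers can cache aggressively %>
    <%= image_tag "logo.png" %>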
You can look into more advanced techniques like caching your SQL queries or optimizing them, but this can lead to increased complexity and make your codebase harder to maintain. Try the view caching first, and see if that gets the load time to an acceptable level.
I'm just learning about SproutCore now, and it seems great. But I can't find a good answer on deployment options.
I'm starting small. Just implementing a single page of a complex site with SproutCore. Right now, that page is dynamically generated and served from my django based server. I serve all of my static files (.js, .css, images, etc) off of a CDN.
The page represents one customer.
So, on that dynamic page, it knows:
What customer we should be looking at, the ID, name, etc.
Where my media should be loaded from (absolute HTTP path)
How do I get a SproutCore based app to deploy and run in an environment like this?
I imagine I can upload the built SproutCore app to my CDN and then somehow reference it from my HTML page. But how does that SproutCore app know what server to request backend data from (I'd rather not hard-code it)? It can't be installed in the root of the CDN, so how does it know how to load things relative to itself? I could tell it an absolute URL to load from at run time. With some pain, I could even tell it an absolute URL to load from at build time.
No answers on this one; here's what I did...
Ended up moving to Ember.js (aka SproutCore 2). That follows a completely normal "add .js to a page and serve it normally" model and doesn't have any interesting deployment worries, so it's a no-brainer.
I'd love to use page caching on a Rails site I run. The information on each page is mostly constant, but the queries that need to be run to collect the information are complicated and can be slow in some cases. The only obstacle to using page caching is that the administrative interface is built into the main site, so that admin operations can be performed without leaving the page of interest.
I'm using Apache + mod_rails (Passenger). Is there a way to indicate to Apache that .html files should be ignored when the current user has either a session variable or a cookie named 'admin'? The session variable need not be evaluated by Apache for validity (since it will be evaluated by Rails in this case).
Is there a way to indicate to Apache that .html files should be ignored when the current user has either a session variable or a cookie named 'admin'?
I believe it is not really possible. Even if it is, I guess it would be very tricky.
Instead, you can use Action Caching. A quote from the docs:
"One of the issues with page caching is that you cannot use it for pages that require checking code to determine whether the user should be permitted access."
This sounds like exactly your case.
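In the controller, that could look something like this (a sketch – the controller/action names are assumptions, and the condition is whatever session/cookie check you already have):

    # app/controllers/pages_controller.rb – sketch
    class PagesController < ApplicationController
      # Action caching still routes the request through Rails, so the admin
      # check runs on every request and admins simply bypass the cache.
      caches_action :show,
        unless: -> { session[:admin].present? || cookies["admin"].present? }

      def show
        # the slow queries only run on a cache miss (and always for admins)
      end
    end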
But if you still really need Page Caching via the web server, I think you'd be better off implementing separate pages for admin and non-admin users.
The reason is that with Page Caching, Rails doesn't process the request at all, so there is no way to know whether the user is authenticated.
You can probably overcome this using Dynamic Page Caching. The idea is basically to add the "admin" parts with JavaScript. I personally don't like this much, though.
One more update: a quick search brought me to this article.
The idea is to cache the page conditionally and plug in mod_rewrite to serve the admin pages.
It will work for you, but it's a pretty dirty solution.
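The Rails half of that approach (writing the cached .html file only for non-admin requests) could be sketched roughly like this – the controller name and the admin check are placeholders for whatever the app already uses:

    # sketch – conditional page caching from an after_filter
    class PagesController < ApplicationController
      after_filter :cache_page_unless_admin, only: :show

      def show
        # ...
      end

      private

      # Write the static .html copy only for anonymous visitors; mod_rewrite on
      # the Apache side then decides per request whether that file may be served.
      def cache_page_unless_admin
        cache_page unless session[:admin].present? || cookies["admin"].present?
      end
    end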