What are the pros and cons of rails page caching vs a 3rd party vendor? - ruby-on-rails

There are a few 3rd party addons you can add to a Heroku app to manage caching. Why would you use them vs using the built in caching framework?

I am assuming these 3rd party addons work like CloudFlare or at least work on the same basic principles.
Caching Framework
You control when the cache expires, allowing for fresher, more relevant content.
Your site doesn't go down, break, or look ugly when their service goes down.
You can permanently cache things that will never change.
Can make your own CDN with your own logic and setup.
Fragment caching, meaning only part of the page expires instead of the whole thing, leading to fewer dog-piles (cache stampedes).
3rd Party Service
Fire and forget.
Cheap
They usually pull all of your images, JS, and CSS files into their 'CDN' as well.
Some claim added security because your site basically sits behind their servers, though I haven't read anything suggesting this is more than marketing double-talk.

Related

Is it possible to build a Progressive Web App without client side rendering?

My company's site is mostly server rendered (we make some use of Structured Page Fragments) but we'd like to look into building a Progressive Web App.
Does it make sense to build a Progressive Web App by implementing service worker caching for server rendered pages? Alternatively, should we rather explore moving to client side rendering?
Note that we'd like to do as much rendering as we can on the server as we support many very slow devices.
Yes: service workers are definitely not restricted to client-side rendering.
You can cache whatever you want. For example, this WordPress plugin caches WordPress content.
Server-rendered pages imply that there is some degree of dynamic or personalized content on your site—otherwise, you'd just be serving static HTML.
I'd encourage you to initially think about it from the perspective of how you want to deal with your dynamic content, both when you're online and offline. Reading through Jake Archibald's Offline Cookbook will give you a good overview of the different strategies that could be implemented.
Once you're set on a caching strategy for your dynamic content, the next step is to implement it. The "gold standard" approach is to use an App Shell + dynamic content architecture, but it can take some refactoring to get an existing application onto this architecture, especially one where the initial HTML returned by the server contains dynamic/personalized elements.
If refactoring is too daunting a task, or if server-only rendering is a hard requirement, then you can still make use of service worker caching, but you'll probably end up treating your would-be shell as if it were dynamic content. This means a pure cache-first strategy might not be "safe", but a cache/network race might work, or at least a network-falling-back-to-cache strategy.
With either of those strategies, you'd end up with a web app that works offline, but you'll likely end up caching duplicated data (i.e. if /page1 and /page2 share some common HTML structure, you'd cache that structure twice). You'd also take a performance and bandwidth hit, since you'd have to go to the network more often than you would with an App Shell, but that can be mitigated somewhat via proper HTTP browser caching headers (which you should think about anyway, for browsers that lack service worker support).
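As a rough sketch of the "network, falling back to cache" strategy mentioned above, here is the core logic factored into a plain function. The real service worker pieces (the fetch event and the Cache Storage API) are shown only as comments; the function and cache name are illustrative, not a specific library's API.

```javascript
// "Network, falling back to cache": try the network first, keep a copy of
// successful responses, and serve the last cached copy when offline.
// fetchFn and cache are passed in so the logic stands alone.
async function networkFallingBackToCache(request, fetchFn, cache) {
  try {
    const response = await fetchFn(request);
    // Keep a copy of successful responses for offline use.
    if (response.ok) await cache.put(request, response.clone());
    return response;
  } catch (err) {
    // Network failed (e.g. offline): serve the last cached copy, if any.
    const cached = await cache.match(request);
    if (cached) return cached;
    throw err;
  }
}

// In a real service worker you would wire it up roughly like this:
// self.addEventListener('fetch', (event) => {
//   event.respondWith(
//     caches.open('pages-v1').then((cache) =>
//       networkFallingBackToCache(event.request, fetch, cache)));
// });
```

A cache/network race differs only in that it resolves with whichever of the two sources answers first, rather than waiting for the network to fail.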

Best solution for client side templating

I am working on a project with very strict company security rules, which means I am unable to create CMS pages using a local server. As a result the company still uses old technologies such as SHTML includes, so Node.js is out of the picture. I have been researching Angular.js, Handlebars.js, and various other client-side templating solutions, but most require some sort of third-party tool (outside of Node) to get working. I am only allowed to use CSS/JavaScript libraries on flat pages.
Any suggestions?

Why doesn't the same-origin policy affect native mobile apps?

We were trying to create a mobile HTML5 web app. We call a service hosted on domain xyz.com using JavaScript and run into the same-origin policy. We have to use CORS to make the cross-domain requests. But if I make the same request from a native iOS app, it works fine even without the Access-Control headers that CORS requires.
This may seem like a noob question, but why does the same-origin policy only apply when making calls from JavaScript in web apps, and not from native apps?
To be honest it's somewhat arbitrary, but there is some thought behind the policy. In JavaScript land you have much less direct control over what's being executed and from where; take a look at this page alone and you'll see several different server sources. The same-origin policy was implemented to minimise the risk of running arbitrary, untrusted code from third parties within your browser.
In native land you have more control and must actively choose to instantiate a JS context and run whatever you've received, so it seems reasonable to suspend the same-origin requirement in that case.
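To make the mechanism concrete, here is a deliberately simplified model of the read check the browser performs on behalf of page scripts (real CORS also covers credentials, allowed methods and headers, and preflight requests; the function name is made up for this sketch):

```javascript
// Simplified model of the browser's CORS read check: the response is only
// exposed to script if the server opted in via Access-Control-Allow-Origin.
function browserAllowsRead(requestOrigin, allowOriginHeader) {
  return allowOriginHeader === '*' || allowOriginHeader === requestOrigin;
}
```

A native app is its own HTTP client, so nothing performs this check for it; the browser applies it only to requests made on behalf of a page's scripts.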

How can I retrieve updated records in real-time? (push notifications?)

I'm trying to create a ruby on rails ecommerce application, where potential customers will be able to place an order and the store owner will be able to receive the order in real-time.
The finalized order will be recorded in the database (at the moment SQLite), and the store owner will have a browser window open where new orders appear just after they are finalized.
(Application info: I'm using the HOBO rails framework, and planning to host the app in Heroku)
I'm now considering the best technology to implement this, as the application is expected to have a lot of users sending in a lot of orders:
1) Each browser window refreshes the page every X minutes, polling the server continuously for new records (new orders). Of course, this puts a heavy load on the server.
2) As above, but poll the server with some kind of AJAX framework.
3) Use some kind of server-push technology, like 'comet' asynchronous messaging. I found Juggernaut; the only problem is that it uses Flash and custom ports, which could be an issue as my app should be accessible behind corporate firewalls and NAT.
4) I'm also looking at the node.js framework; it seems efficient for this kind of asynchronous messaging, though it is not supported on Heroku.
Which is the most efficient way to implement this kind of functionality? Is there perhaps another method that I have not thought of?
Thank you for your time and help!
Node.js would probably be a nice fit - it's fast, built for realtime, and has great comet support. The only downside is that you are introducing another technology into your solution. It's pretty fun to program in though, and a lot of the libraries have been inspired by Rails and Sinatra.
I know Heroku has been running a node.js beta for a while, and people used it as part of the recent Node Knockout competition. See this blog post. If that's not an option, you could definitely host it elsewhere. If you host it at Heroku, you might be able to proxy requests. Otherwise, you could happily run it off a subdomain so you can share cookies.
Also check out socket.io. It does a great job of choosing the best way to do comet based on the browser's capabilities.
To share data between Node and Rails, you could share cookies and then store the session data in your database where both applications can get at it. A more involved architecture might use Redis to publish messages between them. Or you might be able to get away with passing everything you need in the HTTP requests.
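The Redis idea above is a publish/subscribe pattern: the Rails side publishes a message when an order is saved, and the Node side subscribes and pushes it to connected browsers. A rough model of that flow, using an in-process stand-in for the Redis client since the real wiring depends on your Redis library (all names here are made up for the sketch):

```javascript
// In-process stand-in for Redis pub/sub, just to show the message flow:
// the Rails side would PUBLISH on "orders" and the Node side SUBSCRIBE.
class PubSubStub {
  constructor() { this.subscribers = {}; }
  subscribe(channel, handler) {
    if (!this.subscribers[channel]) this.subscribers[channel] = [];
    this.subscribers[channel].push(handler);
  }
  publish(channel, message) {
    (this.subscribers[channel] || []).forEach((h) => h(message));
  }
}

// Node side: forward each new order to every connected browser socket.
function forwardOrders(pubsub, sockets) {
  pubsub.subscribe('orders', (orderJson) => {
    sockets.forEach((s) => s.send(orderJson));
  });
}
```

With real Redis, the Rails app would publish the order JSON in an after-save hook, and the Node process would hold the open browser connections.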
In HTTP, requests can only come from the client. Thus the best options are what you already mentioned (polling and HTTP streaming).
Polling is the easier option to implement; it will use quite a bit of bandwidth, though. That's why you should keep the requests and responses as small as possible, so you should definitely use XHR (Ajax) for this.
Your other option is HTTP streaming (Comet); it will require more work on the set up, but you might find it worth the effort. You can give Realtime on Rails a shot. For more information and tips on how to reduce bandwidth usage, see:
http://ajaxpatterns.org/Periodic_Refresh
http://ajaxpatterns.org/HTTP_Streaming
Actually, if you can have your store owner run Chrome (other browsers will follow soon), you can use WebSockets (just for the store owner's notifications), which give you a constant open connection, so you can send data to the browser without the browser requesting anything.
There are a few WebSocket libraries for node.js; you could also implement it yourself over a regular TCP connection, though you'd have to handle the HTTP upgrade handshake and message framing, so a library is usually the easier route.
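As a minimal sketch of the Ajax polling option (option 2 in the question), asking the server only for orders newer than the last id seen keeps responses small, as recommended above. The endpoint and field names are hypothetical, and the fetch helper is injected rather than hard-coded:

```javascript
// One polling tick: ask for orders newer than lastId, hand any new ones to
// the callback, and return the updated lastId for the next tick.
// fetchJson is assumed to return the parsed JSON body of a GET request.
async function pollOnce(fetchJson, lastId, onNewOrders) {
  const orders = await fetchJson('/orders.json?since=' + lastId);
  if (orders.length > 0) {
    onNewOrders(orders);
    lastId = orders[orders.length - 1].id;
  }
  return lastId;
}

// Example wiring in the store owner's page (fetchJson would wrap XHR/fetch):
// let lastId = 0;
// setInterval(async () => {
//   lastId = await pollOnce(fetchJson, lastId, renderOrders);
// }, 15000);
```

Because only unseen orders cross the wire, most polls return an empty array and cost very little bandwidth.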

What components of your site do you typically "offload" or embed?

Here's what I mean. In developing my ASP.NET MVC based site, I've managed to offload a great deal of the static file hosting and even some of the "work". Like so:
jQuery for my JavaScript framework - instead of hosting it on my site, I use the Google CDN
Google Maps - obviously "offloaded"; no real work performed on my server, Google hosts it
jQueryUI framework - Google CDN
jQueryUI CSS framework - Google CDN
jQueryUI CSS framework themes - Google CDN
So what I'm asking is this, other than what I've got listed...
What aspects of your sites have you been able to offload, or embed, from outside services?
Couple others that come to mind...
OpenAuth - takes much of the authentication process work off your site
Google Wave - when it comes out, it could take communication work off of your site
In the past I've used Amazon AWS.
Their S3 service was cheap for hosting images and video.
The EC2 service is also good for extra computational power, or just for taking load off your server.
In addition to paid hosted services, you can use YouTube or Vimeo to host videos; their APIs allow you to upload and host them.
There are also APIs for many other services, depending on exactly what you're trying to do. If you're looking at adding functionality to your site without hosting the service yourself, it's worth checking out http://www.programmableweb.com/
Even though Google's CDN has smaller files and faster response times, I'm now using Microsoft's CDN for jQuery. Why? Big Brother.
In some high-security companies, access is only allowed to known domains. Users at those companies had problems because their firewalls didn't know googleapis.com and blocked jQuery. They knew microsoft.com, so ajax.microsoft.com worked.
I've suggested to Google that they change the URL from ajax.googleapis.com to something.google.com to avoid this issue in the future.
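If blocked CDNs are a concern, a common pattern is to fall back to a locally hosted copy when the CDN script fails to load. A minimal sketch follows; the local path is made up, and the window/document parameters are injected only so the logic is easy to exercise in isolation:

```javascript
// If the CDN script was blocked (e.g. by a corporate firewall), the library's
// expected global won't exist; inject a locally hosted copy instead.
function loadLocalFallback(win, doc, globalName, localSrc) {
  if (win[globalName]) return false; // CDN copy loaded fine
  const script = doc.createElement('script');
  script.src = localSrc;
  doc.head.appendChild(script);
  return true; // fallback injected
}

// Usage, placed right after the CDN <script> tag:
// loadLocalFallback(window, document, 'jQuery', '/js/jquery.min.js');
```

This way the CDN still serves most users, and users behind restrictive firewalls quietly fall back to your own copy.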
