I am looking for appropriate resources on how to replicate the session (the user session in a stateful app) and cached objects (retrieved from an underlying database through transactional operations) of an application server. The app server can preferably be Rails or any other popular one that fully supports MVC (using the ActiveRecord or DataMapper design patterns).
It would also be helpful to know the same about Memcached's replication internals, if it supports this kind of replication.
If anyone can further suggest how to integrate a NoSQL key-value store to keep the session and cached objects generated by an app server like Rails, that would be another appreciated answer.
My goal is to find a suitable way to replicate an app server instance for performance (for either local or geo-distributed users) and high availability. Any pointers to current industry practice and available solutions would be much appreciated in this regard.
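For concreteness, the kind of Rails setup I have in mind looks roughly like this (a minimal sketch; the :mem_cache_store backend needs the dalli gem, and the server names and session key are illustrative):

```ruby
# config/environments/production.rb -- a sketch, not a real deployment.
Rails.application.configure do
  # Point every app server instance at the same Memcached pool, so
  # cached objects are shared rather than tied to one instance.
  config.cache_store = :mem_cache_store, "cache-1.internal", "cache-2.internal"

  # Keep user sessions in that shared store too; any instance can then
  # serve any request, which is what makes replication/failover work.
  config.session_store :cache_store, key: "_myapp_session"
end
```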
Thanks a lot in advance.
Related
I am building a Rails app that will live on a subdomain of a Symfony 2 app (PHP). They will be used by the same type of users, so we want them to go back and forth between the apps and ideally log in only once.
I've seen some solutions for sharing sessions between subdomains for the same kind of app, but no solution for sharing a session between Symfony and Rails. Is it possible?
I was particularly intrigued that the Rails session_store can use a database backend -- the default is the cookie store. That makes me wonder: if both apps were to use the database backend, would they be able to share sessions?
What other alternatives can I use to make this work, if it can work?
Sure it is possible, but I don't know how much work you'll have to put in ;)
The main thing is the session ID; it all depends on that. You must make both applications use the same session storage, otherwise it's impossible.
It doesn't matter whether you store the session data in a database, in a file (which would be pretty slow) or somewhere else, as long as both applications use the same store.
As you mentioned, Rails supports sessions inside a database out of the box. There's also another way in Rails: Memcached storage. It's more or less a database of its own, optimized for fast key-value lookups.
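On the Rails side that could look roughly like this (a sketch; in Rails 4+ the database-backed store was extracted into the activerecord-session_store gem, and the cookie key shown is illustrative):

```ruby
# Gemfile
# gem "activerecord-session_store"  # extracted from Rails core in 4.0

# config/initializers/session_store.rb
# `key` is the session cookie name; Symfony must be configured to
# read/write the same cookie and the same sessions table.
Rails.application.config.session_store :active_record_store,
  key: "_shared_session"
```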
So you should look for a Symfony plugin that supports keeping the session either in a database or in Memcached storage.
Have a look here: http://watsonbox.github.com/blog/2012/05/01/sharing-session-between-rails-and-php/
Forgot to mention: both applications must also use the same session ID name inside the cookie ;)
I'm currently designing a new website built on MVC, and I wonder what the right way to manage state is.
The state should contain the user ID and some structures of user info, and it should be kept for the whole session while the user is logged in (across HTTP requests).
The important criteria:
1) Support scalability
2) Performance
The easy way is to use the Session object, but it doesn't scale. If different requests during the session go through different IIS servers, the session won't be kept. Although I've heard of load-balancing tools which route all requests of a single session through the same machine (sticky sessions), I'm not sure it's good practice to rely on that (or is it?).
Another option I've read about is keeping the state data on dedicated state servers running an in-memory database (like Redis or Cassandra). But that seems like overkill at this stage of the development.
Do you have any other suggestions?
I would like to start with something simple at the moment, but keep the design ready for a more advanced solution at the future.
Any best practice or code/design suggestions will be appreciated.
Thanks,
Edi.
(1) Use Sql Server to Store Session State
(2) Use Memcached as a Session State Provider
(3) Cook up your own solution using Caching on an external caching provider: look into using something like the ServiceStack Caching Framework. Using this, you can use Redis, Memcached, Azure or AWS to handle caching.
Next, create a KeyFactory to handle generating keys for specific items. The item keys would include the UserId, which you would always have from FormsAuthentication (assuming that is what you're using). Then store any session data for the user in the cache. With this approach you are using caching in place of Session, and the cache can be shared across multiple servers.
Note: there are several ways to clear out the user's data whenever they begin a new session. Potential approaches include:
Include the user's session start dateTime in the cache key, and auto-expire entries when they are no longer fresh (sketched after this list)
Clear out all potential entries for a user when they begin a new session
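To illustrate that first approach in a language-agnostic way, here is a sketch in Ruby (matching the rest of this page; the key factory class and the Rails.cache calls are illustrative stand-ins for whatever cache client you actually use):

```ruby
# Hypothetical key factory: every cache key is scoped by user id and
# session start time, so a new login automatically orphans old entries.
class SessionKeyFactory
  def initialize(user_id, session_started_at)
    @user_id = user_id
    @stamp   = session_started_at.to_i
  end

  def key(item)
    "user:#{@user_id}:#{@stamp}:#{item}"
  end
end

# Usage: per-user "session" data lives in a shared cache with a TTL,
# instead of in server-local Session state.
keys = SessionKeyFactory.new(42, Time.now)
Rails.cache.write(keys.key("profile"), { name: "Edi" }, expires_in: 30 * 60)
Rails.cache.read(keys.key("profile"))
```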
If you are using .NET 4.5 then, depending on the type and amount of information you are keeping on users, you may want to look at using claims to store information about the user. In .NET 4.5 all principals inherit from ClaimsPrincipal, which already uses claims to store the user name, roles and other information. You can create your own service to transform claims, which allows you to add additional information to the principal user.
I have built two Rails apps that need to communicate and send files between each other. For example, one Rails app would send a request to view a table in the other app's database. The other app would then render JSON of that table and send it back. I would also like one app to send a text file stored in its public directory to the other app's public directory.
I have never done anything like this so I don't even know where to begin. Any help would be appreciated. Thanks!
Your requirement is common to almost all web apps, irrespective of Rails; most modern web apps need to communicate with each other. But there is one small point you need to get hold of:
Web apps should not directly access each other's internal data (such as tables), even if they are built in the same language (in this case Rails) by the same developer.
That is where web services come into play. You should expose your data through web services so that not only your Rails application can consume it, but also any app that knows how to consume a web service can benefit.
Coming back to your question with Rails: Rails supports REST web services out of the box, so do some googling about web services and REST web services with Rails.
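A minimal sketch of what that looks like in Rails (the model and route names are illustrative):

```ruby
# config/routes.rb
#   resources :articles, only: [:index, :show], defaults: { format: :json }

# app/controllers/articles_controller.rb
class ArticlesController < ApplicationController
  # GET /articles.json -- the other app consumes this instead of
  # reading the table directly.
  def index
    render json: Article.all
  end

  # GET /articles/1.json
  def show
    render json: Article.find(params[:id])
  end
end
```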
HTH
As a starting point, look at ActiveResource.
Railscast
docs
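A quick sketch of how the consuming app might use it (the site URL and resource name are illustrative; older Rails versions fetch .xml instead of .json):

```ruby
require "active_resource"

# Maps a client-side model onto the other app's REST endpoints.
class Order < ActiveResource::Base
  self.site = "https://other-app.example.com"
end

orders = Order.all      # GET https://other-app.example.com/orders.json
order  = Order.find(1)  # GET https://other-app.example.com/orders/1.json
```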
Message queuing systems such as RabbitMQ can be used to communicate things internally between different apps, such as a "mailer" app and a main "hub" application.
Alternatively, you can use a shared connection to something like Redis: stick things onto a "queue" in one app and read them for processing from the other.
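A sketch of that with the plain redis gem (the queue name and payload are illustrative):

```ruby
require "redis"
require "json"

redis = Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379"))

# App A: push a unit of work onto the shared list.
redis.lpush("jobs", { type: "copy_file", path: "public/report.txt" }.to_json)

# App B: block until something arrives, then process it.
_queue, payload = redis.brpop("jobs")
job = JSON.parse(payload)
puts "processing #{job["type"]} for #{job["path"]}"
```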
In recent Rails versions, it is rather easy to develop API-only applications. In Rails core master there was even, briefly, a special application type for these apps (until it got yanked again). But it is still available as a plugin and will probably one day become part of Rails core again. See http://blog.wyeworks.com/2012/4/20/rails-for-api-applications-rails-api-released for more information.
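With that plugin the controllers inherit from a slimmed-down base class, roughly like this (a sketch, with an illustrative resource):

```ruby
# rails-api strips out middleware and modules an API-only app doesn't
# need; controllers inherit from ActionController::API instead of ::Base.
class ApplicationController < ActionController::API
end

class OrdersController < ApplicationController
  def index
    render json: Order.all
  end
end
```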
To actually develop and maintain the API of the backend service, and to make sure backend and frontend share the same understanding of the resources, you can use ROAR, which is a great way to build APIs.
Generally, you should fully define your backend application with an API. Trying to be clever and to skip some of the design steps will only bring you headaches in the long run...
Check out Morpheus. It lets you create RESTful services and use familiar ActiveRecord syntax in the client.
I'm trying to create a Ruby on Rails ecommerce application where potential customers will be able to place an order and the store owner will be able to receive it in real time.
The finalized order will be recorded in the database (at the moment SQLite), and the store owner will have a browser window open where new orders appear just after an order is finalized.
(Application info: I'm using the HOBO Rails framework, and planning to host the app on Heroku)
I'm now considering the best technology to implement this, as the application is expected to have a lot of users sending in a lot of orders:
1) Each browser window refreshes the page every X minutes, polling the server continuously for new records (new orders). Of course, this puts a heavy load on the server.
2) As above, but poll the server with some kind of AJAX framework.
3) Use some kind of server-push technology, like 'comet' asynchronous messaging. I found Juggernaut; the only problem is that it uses Flash and custom ports, which could be an issue as my app should be accessible behind corporate firewalls and NAT.
4) I'm also looking at the node.js framework; it seems efficient for this kind of asynchronous messaging, though it is not supported on Heroku.
Which is the most efficient way to implement this kind of functionality? Is there perhaps another method that I have not thought of?
Thank you for your time and help!
Node.js would probably be a nice fit - it's fast, loves realtime and has great comet support. The only downside is that you are introducing another technology into your solution. It's pretty fun to program in, though, and a lot of the libraries have been inspired by Rails and Sinatra.
I know Heroku has been running a node.js beta for a while, and people were using it as part of the recent Node Knockout competition. See this blog post. If that's not an option, you could definitely host it elsewhere. If you host it at Heroku, you might be able to proxy requests. Otherwise, you could happily run it off a subdomain so you can share cookies.
Also check out socket.io. It does a great job of choosing the best way to do comet based on the browser's capabilities.
To share data between Node and Rails, you could share cookies and then store the session data in your database, where both applications can get to it. A more involved architecture might use Redis to publish messages between them. Or you might be able to get away with passing everything you need in the HTTP requests.
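The Redis route can be as small as this on the Rails side (the channel name and payload are illustrative; the node process would SUBSCRIBE to the same channel):

```ruby
require "redis"
require "json"

# After an order is saved, tell the node process about it.
Redis.new.publish("orders:new", { order_id: 123 }.to_json)
```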
In HTTP, requests can only come from the client. Thus the best options are the ones you already mentioned: polling and HTTP streaming.
Polling is the easier option to implement; it will use quite a bit of bandwidth, though. That's why you should keep the requests and responses as small as possible, so you should definitely use XHR (Ajax) for this.
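One way to keep the responses small is to have the client send the id of the last order it has seen and return only newer rows (a Rails sketch; the action and parameter names are illustrative):

```ruby
class OrdersController < ApplicationController
  # GET /orders/poll?since_id=42 -- returns only orders the store
  # owner's page hasn't seen yet, so most polls are near-empty.
  def poll
    since = params[:since_id].to_i
    render json: Order.where("id > ?", since).order(:id)
  end
end
```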
Your other option is HTTP streaming (Comet); it will require more work to set up, but you might find it worth the effort. You can give Realtime on Rails a shot. For more information and tips on how to reduce bandwidth usage, see:
http://ajaxpatterns.org/Periodic_Refresh
http://ajaxpatterns.org/HTTP_Streaming
Actually, if you have your store owner run Chrome (other browsers will follow soon), you can use WebSockets (just for the store owner's notifications, though), which allow you to keep a constant connection open and send data to the browser without the browser requesting anything.
There are a few WebSocket libraries for node.js, but I believe you can do it easily yourself using just a regular TCP connection.
Our current project at work is a new MVC web site that will use a WCF service primarily to access a 3rd party billing system via a web service as well as a small SQL database for user personalization. The WCF service uses nHibernate for the SQL database.
We'd like to implement some sort of web farm for load balancing as well as failover and maintenance. I'm trying to decide the best way to handle nHibernate's caching and database concurrency if there are multiple WCF services running.
Some scenarios I've been thinking about...
1) Multiple IIS servers, one WCF server. With this setup, the WCF server would be a single point of failure, but there would be no issues with nHibernate caching or database concurrency.
2) Multiple IIS servers, each with its own WCF service. This removes the single point of failure, but now nHibernate on one machine would not know about database changes made by another machine.
One solution to number 2 would be to use an IStatelessSession so we're not doing any caching and nHibernate always fetches directly from the database. This might be the most feasible option, as our personalization database has very few objects in it. I'm also considering a second-level cache such as memcached or Velocity, but it may be overkill for this system.
I'm putting this out there to see if anyone has experience doing this sort of architecture and to get some ideas for a solution. Thanks!
Am I missing something here? I don't see a problem with nHibernate on the web servers.
Application cache would not be a problem, as each nHibernate box would keep its own cache, populated from the datastore. Look at creating a table that can be monitored for reasons to do a cache refresh. We used to do this using the CacheDependency class in .NET 2.0, which would detect changes to a column and then remove the relevant item from the cache. So if a user inserts a new product, the cache entry would be dropped, and the next call to get the products would load the cache again. It's old, but check out http://msdn.microsoft.com/en-us/magazine/cc163955.aspx#S2 for the concept. Cheers.
I would suggest not doing caching until not doing caching becomes a problem. Your DB will do its own caching to save you searching for the same data repeatedly, so the only thing you have to worry about is data across the wire. Judging by your description, you're not going to have a problem there. If you ever do, use a distributed cache - letting your servers cache separately will give you inconsistent data as requests bounce between them.