Store Selected User Info in Database - symfony1

Using Symfony 1.4.x (with Propel), I've been given a task that requires me to share specific user info with multiple external systems. This is currently stored as a session (in memory) attribute, but I need to get it into a database so that I can create an API that will provide that info to authorized consumers.
I'd rather not overhaul the system to store all session data in the database (unless it's trivial and can handle namespaces), but I can't find any information on a recommended way for the myUser class to write data to the database. Is it possible to do this since the class doesn't have a model, per se (that I'm aware of)? Are there any recommended solutions or best practices for doing this?
Thanks.
UPDATE
If I were to boil this all the way down to its bare essentials, I guess the key question is this: What's the "best" way to read from/write to a database from the myUser class? Or, alternatively, is there another path that's recommended to accomplish the same end result?
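For illustration, writing to the database from myUser would presumably look something like the sketch below: symfony autoloads Propel model classes, so they can be called from any method on the user object. The shared_user_info table (user_id, payload), the class names, and the 'user_id' session attribute are all hypothetical, made up here for the example.

// apps/frontend/lib/myUser.class.php (sketch only; SharedUserInfo and
// SharedUserInfoPeer are hypothetical Propel classes, and the user id is
// assumed to live in the 'user_id' session attribute)
class myUser extends sfBasicSecurityUser
{
  // Write the info that external systems need into the database.
  public function saveSharedInfo(array $info)
  {
    if (!$record = $this->findSharedInfoRecord())
    {
      $record = new SharedUserInfo();
      $record->setUserId($this->getAttribute('user_id'));
    }
    $record->setPayload(serialize($info));
    $record->save();
  }

  // Read it back, e.g. for the API exposed to authorized consumers.
  public function getSharedInfo()
  {
    $record = $this->findSharedInfoRecord();

    return $record ? unserialize($record->getPayload()) : array();
  }

  protected function findSharedInfoRecord()
  {
    $c = new Criteria();
    $c->add(SharedUserInfoPeer::USER_ID, $this->getAttribute('user_id'));

    return SharedUserInfoPeer::doSelectOne($c);
  }
}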

Will storing the result of json_encode-ing or serialize-ing
$myUserInstance->getAttributeHolder()->getAll()
do the job?
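If the consumers only need a dump of the session attributes, the write could be as small as this (reusing the hypothetical SharedUserInfo record from the sketch above; json_encode is probably the safer choice if non-PHP systems will read the column):

// Inside myUser: persist the whole attribute holder as one JSON blob.
$record->setPayload(json_encode($this->getAttributeHolder()->getAll()));
$record->save();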

In the absence of a good means of accessing a database from the myUser class, I opted for a slightly different path. I installed memcached in a location accessible by both apps and extended PHP's Memcached class to apply a few customizations. The apps can now share information by writing specially formatted keys to the memcached instance.
I opted not to overhaul my existing cache storage mechanism (why upset the apple cart?) and am reading from/writing to memcached selectively for information that truly needs to be shared.
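For reference, the extension amounts to very little code. Here is a rough sketch of the idea, not the exact class I wrote; the key prefix, server address, and TTL are assumptions:

// A small Memcached subclass that namespaces keys so both apps can share
// entries without colliding (prefix and server details are made up).
class SharedUserCache extends Memcached
{
  const PREFIX = 'shared.user.';

  public function __construct()
  {
    parent::__construct();
    $this->addServer('127.0.0.1', 11211);
  }

  public function setUserInfo($userId, array $info, $ttl = 3600)
  {
    return $this->set(self::PREFIX . $userId, json_encode($info), $ttl);
  }

  public function getUserInfo($userId)
  {
    $raw = $this->get(self::PREFIX . $userId);

    return false === $raw ? null : json_decode($raw, true);
  }
}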

Related

What is the accepted method for a global state?

I was wondering what you would consider the best way to store some simple state across sessions without using the DB.
I'm looking to have something like 'modes' for a website, so it can be in mode a or b, and depending on the mode, buttons would do different things.
Would using Rails.cache.read and Rails.cache.write be the best option? I've heard it has issues on Heroku if you leave the cache store as the filesystem, and then has problems as memcache because of multi-threading.
I'm really trying to avoid having a whole database table used just so that every request can check the global state of the site.
In order to have a "global" state, you need a single dependency shared by each instance of your application.
In fact, you can't rely on cookies or sessions, as they are client-oriented and are not shared between clients.
The database is the most common approach. You may be able to use the file system in some cases, but not for Heroku as there may be several different instances of your app running under different file systems.
Any solution that can easily be shared across instances will work:
Memory database like Redis
SQL or NoSQL database
Cache systems, as long as they are not specific to one instance. Memcached may work, but it's not persistent (hence you may lose the state)
External storage (such as Amazon S3)
To me, a relational database or a memory database such as Redis seems to be the most plausible solution.
If you want a per-user setting, consider storing it in the session (which can be stored in cookies), or directly in cookies.
Both methods end up storing some data inside clients' browsers (but not a lot of it, because cookies are passed by the browser with each request).
You could put it in a table, just so that you have it, but then make the value available via the ApplicationController, with a simple cache method in between.
So, something like this (not tested!):
def get_mode
  # Memoize the mode and re-read it from the database after 60 seconds
  if @mode_timeout.nil? or @mode_timeout < Time.now
    @mode = ModeModel.first.mode
    @mode_timeout = Time.now + 60.seconds
  end
  @mode
end
You'll have to create a model, or you could update it via a controller with a set_mode method instead, but that would be more transient.
Then you can just call get_mode from your controller.

Is this an OK design decision? Is there a better way?

So, for the sake of performance, I'm using database sessions. I figure that while the sessions are server-side, I might as well store commonly accessed objects in the session. So, I'm storing serialized versions of the current_user, current_account, and the current_user's permissions.
The User model handles a lot of the permissions methods (things like user.can_do_whatever), but since I'm trying to be more efficient and store commonly accessed things in the session (this allows for far fewer DB accesses), does it make sense / break any design standards to (upon each request) store the session in an instance variable in the current_user?
As of right now, I can't think of any alternatives.
RoR applications have a RESTful design by default. One of the rules of REST is statelessness: each request from client to server must contain all of the information necessary to understand the request, and cannot take advantage of any stored context on the server.
If you have trouble with database performance, use a cache system like memcached, which is already integrated in Rails (Caching with Rails).
I found a couple of references warning against storing non-primitive data types in the session, but they were all just warnings, and boiled down to: Storing complex objects is "Expecially discouraged" [sic], but if you decide you need to... well, just be careful.
Anyway, I'm kinda taken by the idea of having the users table double as the sessions table, but serialization still seems a bit sketchy. If you're just trying to cut down the number of DB requests, what about storing IDs and using :joins when looking up your user (might require a bit of hackery to get that worked into the default session loading). That avoids synchronization problems and serialization sketchiness, and still only generates a single DB query. Just make sure to use :joins and not :include, as the latter generates a query for each table.
Hope that helps!

How should I go about using an RDBMS and MongoDB in a Rails app?

I'm currently testing the waters with Mongoid and have so far begun on an ecommerce store. Now, of course, Mongoid doesn't have transactions; ideally, I'd like to use Mongoid for most of the app, including authentication, authorization, product information, etc.
However, the lack of transactions necessitates a return to an RDBMS. The RDBMS would be used purely to record financial transactions.
Is this possible in Rails, and has anyone done it?
I have limited experience with Rails in general, but I imagine having the secure part mounted as an engine, with URLs scoped under secure.myapp.com or myapp.com/secure/, and the user being redirected to SSL while Rack takes care of things like shared sessions.
Would this work? Or has anyone found a better way of implementing this?
It is possible to mix MongoDB and a traditional RDBMS, but you may have to do some extra coding on your part if you want ActiveRecord objects to communicate with MongoDB objects, since the ORMs are different. Keep in mind that while it is true that MongoDB does not support transactions across multiple documents, it does support 'transactional' atomic updates - which means that if all the data you are updating is contained within a single document, you don't have to worry about transactions. MongoDB also supports safe updates, allowing you to verify that data has been written to n different replica servers and has been persisted to disk.
As for shared sessions between HTTPS and HTTP - this is not something you have to worry about. You'll define your session store as either MongoDB, MySQL, Memcached or, my recommendation, Cookies. As long as you define your domain as '.myapp.com' the cookies will be shared across all subdomains of your application regardless of the protocol.
While I can't comment directly on the Rails aspect of the question, as noted in the first poster's response, MongoDB does support transactional (atomic) updates within a single document. It's probably simpler to implement your entire system in Mongo, or entirely in an RDBMS.
The real question is: what is the motivation behind using Mongo here? What are you hoping to gain from a document database model? Do you just want to rip RoR objects directly to Mongo?
Just a suggestion (abstractly), but you could strictly define your objects up front and represent that definition in your RDBMS. It will probably save you a lot of time if you don't have a clear motivation for using Mongo. Mongo is an awesome technology, but it's best for sorting through and cataloging data, rather than representing strict data structures (not that it's incapable of doing so, necessarily, but with a document database you have a lot more flexibility with the content of each object within your db).
Good luck!

Configure Symfony for use with Memcached

I have 2 Symfony applications (1 using 1.2.x, another using 1.4.x and both using Propel) that need to share some specific session information. Although I have no experience with memcached, my sense--after some reading--is that it may be able to serve as an external (FAST) repository that each app could read and write to. Unfortunately, I can't find much information about how to use it with Symfony in any capacity, much less in the quasi-cache, quasi-messaging server I'm envisioning.
My questions, I suppose, are:
Am I mistaken in believing that memcached can be used in this manner and accessed by multiple systems?
How can I configure Symfony to access a memcached repository?
Thanks.
This explains one approach fairly well (you don't need the view cache stuff, just the second half about making a singleton available and configuring it):
http://dev.esl.eu/blog/2009/06/05/memcached-as-singleton-in-symfony/
edit: now 404, but still available here
You can then use:
sfMemcache::getInstance()->set()
and
sfMemcache::getInstance()->get()
(same as the methods here as sfMemcache subclasses Memcache).
As long as both apps point to the same memcache, you should be able to share data between them like this.
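Since the original article is gone, here is a rough reconstruction of the singleton it described; the class name matches the answer above, but the server address and where you put the file (anywhere symfony autoloads, e.g. lib/) are assumptions:

// Rough reconstruction of the sfMemcache singleton from the 404'd article.
class sfMemcache extends Memcache
{
  protected static $instance = null;

  public static function getInstance()
  {
    if (null === self::$instance)
    {
      self::$instance = new self();
      self::$instance->addServer('127.0.0.1', 11211);  // assumed server
    }

    return self::$instance;
  }
}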

Generate new models and schema at runtime

Let's say your app enables users to create their own tables in the database to hold their own custom data. Each table would have its own schema. What are some good approaches?
My first stab involved dynamically creating migration files and model files, but I'd like to run this on Heroku, where you can't write to the filesystem.
I'm thinking eval may be the way to go to create and run the migration class and the model class. But I want to make sure the model class exists when a new process of the app is spawned. I can probably do this by storing these class definitions with each user as they create new tables and then running through them all at startup. But now it's convoluted enough that I may be missing something obvious.
It's probably a better idea not to generate new classes at runtime. Besides all of the security risks, each thread's startup time will be abominable if you ever get a significant number of users.
I would suggest rethinking your app design and aiming for generic tables to hold the users' custom data. If you have examples of data structures that users can create, we might be able to help.
Have you thought about a non-SQL database for those tables? Look at CouchDB - there are several plugins on GitHub that integrate it with Rails. Records in the database are JSON documents with an arbitrary key-value structure. That may be perfect for a user-defined schema.
There is (was?) a cool Wiki project, called Informl. It was a Wiki, not just for web pages but for web applications. (Get it? It's informal because it's a Wiki, it's got forms because it is an application, and it's user-generated, thus Web 2.0, which means that according to an official UN resolution it is legally required to have a name which is missing a vwl.)
So, in other words, it was not just about user-generated content, but also user-generated structured data.
They did this by generating PostgreSQL-specific SQL at runtime to create new tables and then having ActiveRecord reload the schemas.
The code is up on RubyForge. It's based on Rails 1.2.3. I guess you could do much better than that today, especially with the upcoming extensibility interfaces in Rails 3.
