I have a Symfony app that populates the "widgets" of a portal application, and I'm noticing something that seems odd. The portal app has iframes that make calls to the Symfony app. On each of those calls, a random user key is passed on the query string. The Symfony app stores that key in its session using myUser->setAttribute(). If the incoming value is different from what it has in session, it overwrites the session value.
Roughly, in the widget action (written as if the requests were synchronous for clarity, even though they may not be):
// Widget request arrives with ?foo=bar
$incoming = $request->getParameter('foo');
if ($this->getUser()->getAttribute('foo') !== $incoming) {
  $this->getUser()->setAttribute('foo', $incoming);
}
What I'm noticing is that, on a portal page with multiple widgets (read: multiple requests arriving more or less simultaneously) where the value needs to be overwritten, every request tries to overwrite it. Is this a timing problem? From the log output, I'd expect the first request to arrive to do the overwrite, and subsequent requests to see that the value they received already matches what was just put into the session by the initial request.
In this scenario, it could be that subsequent requests begin (and are checked) even before the first one--the one that should overwrite the cached value--has completely finished. Are session values not really available to subsequent requests until one request has completed entirely or could there be something else that I'm missing?
Thanks.
Attributes of the user do not get written to storage until the end of the request (in sfUser::shutdown). Attributes get loaded into sfUser at the beginning of a request. So in this case, the second request would have to be initiated after the first request has finished in order to see the new value. Your best options are probably:
Add hardRead and hardWrite methods to sfUser (look at what sfUser::initialize and sfUser::shutdown do respectively).
Use another method of storing the information that has better support for concurrency. The database or potentially the caching system you're using could work. For example, I think this could be done using APC cache.
Note that depending on what class you're using for storage, user attributes may not get written to $_SESSION at all. Symfony supports using many methods for storing user attributes (e.g. database, cache).
Currently we are using Breeze.js and Angular to develop our applications. Due to some persistent legacy issues, we have two databases ('Kenya' and 'Rwanda') that cannot be merged at this time, but have the same schema and metadata. Most of the time, the client knows which database to hit and passes the request through the .withParameters() function or the .saveOptions() function. Sometimes we want to request the same query from both databases (for example, if we are requesting a list of all available countries), and we use an EntityManager wrapper on the client to manage this and request the same query from each database. This is implemented through a custom EFContextProvider which uses the data returned to determine the appropriate database and creates the appropriate context in CreateContext().
To further complicate things, in some instances one or the other database won't exist (these are local deployments created through filtered replication), but the client won't know this. Therefore, when querying for a list of all countries, it issues two requests and one will cause failures because the context cannot be instantiated properly.
This is easy enough to detect on the Server. What I would like to do is to detect whether the requested context is available and, if not, return a 200 response and an empty set.
I can detect this in the Breeze DBContextProvider CreateContext() method, but cannot figure out how to make the request fall back gracefully to an empty-set response.
Thanks
Not exactly what I was looking for, but it probably makes more sense since most of the work is being done on the client-side:
Instead of trying to change the controller, I added a getAvailableDatabases action to the C# controller and use that on the client to determine which of the databases I will query.
This is a Rails 3 project.
Am I abusing the use of cookies if I store query values there? I have a dataset that a user can "drill-down" through, so as the user clicks through the data, he amasses a bunch of query values that further limit the data presented on the next request.
Right now I'm doing this with a cookie, and it works great, except that I can't figure out how to check whether cookies are enabled. So some people using IE are giving me fits because the app just fails with no errors.
I used to put values like this in a session variable, which worked great until it mysteriously didn't, i.e. when memcached aged them out or cleared them. I wouldn't want to keep the values in a database-backed session because I don't want the extra hits on every request.
So I suppose I could put the values either in hidden form fields, or append them to the links on the page that I'm presenting each time. Is there a conventional Rails Way to do this that I'm missing?
If you're showing a different set of results, the URL should reflect this, which makes URL query parameters the natural choice (see the sketch after this list). This provides several benefits:
There is no state at all. You don't have to store anything or break the stateless nature of HTTP.
There is a one-to-one correspondence between sets of query results and URLs.
You can link to query results.
Works on everything, ever.
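As a rough illustration of drilling down via query parameters in Rails 3 (the Product model, column names, and view here are assumptions, not from the question):
# app/controllers/products_controller.rb
class ProductsController < ApplicationController
  def index
    scope = Product.scoped
    # Each drill-down click adds another query parameter; apply only the ones present.
    scope = scope.where(:category => params[:category]) if params[:category].present?
    scope = scope.where(:color    => params[:color])    if params[:color].present?
    @products = scope
  end
end

# app/views/products/index.html.erb
<%# Each drill-down link merges one more filter into the current query string. %>
<%= link_to "Only red items", params.merge(:color => "red") %>
Because the whole filter state lives in the URL, the back button, bookmarks, and shared links keep working, and there is nothing to expire from a cookie or session.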
My application (ASP.NET MVC) has a lot of user-interface interaction (jQuery/JS): for example, configuring searches and charts, moving gadgets around the screen, and more. I naturally want to keep all of this data for each user, so that it is available from any page in the domain and the user gets his preferences back.
Right now I keep all the data in a cookie, because making an asynchronous call to the server every time the user changes something did not seem sensible, and it happens a lot. When the user logs out of the application, I save the cookie to the database.
The question is how to save the settings back to the database, from the client to the server, because there are a lot of interactions I want to record. Example scenarios: closing a widget, moving a widget, resizing menus, reordering columns. I want to record those actions, but firing an AJAX save routine for every single action would be too cumbersome. Maybe I have no choice, or maybe I should save everything asynchronously at some fixed interval of seconds.
The problem is that the cookie becomes very large, and the thought that this huge cookie is attached to every server request makes me feel that my approach is wrong.
Another problem is that cookies have a size limit. It varies by browser, but I have definitely been close to the limit; my cookie easily reaches 4 KB.
Is there another solution?
Without knowing your code: have you considered storing the user's preferences in your database? A UserPreference table with columns for the various settings is one possibility.
You could update it via AJAX/JSON if you had a 'Save Preferences' option, or just update it on postback.
EDIT 1: After thinking about it, I think having an explicit 'save preferences' button would be beneficial and practical.
Somewhere on your page, where the user edits the things that generate the cookie, put a button called Save, then hook up a jQuery click handler. On click, build a CSV string (or some other representation of the preferences) to post back to the server, then use $.post to send it to an action method in a controller.
Once there, store it in the database somehow (up to you exactly how), then return a JSON array with a success attribute, to denote whether the preference storing was successful.
When the page is loading, get the preferences out of the database and perform your manipulation.
Another solution would be to store the user preferences in the session and write some server-side logic (like an action filter) that writes those preferences as a JSON-encoded string on each page (in a script tag towards the end of the markup), making them available to client scripts.
What I want to accomplish is this:
I want to "synchronize web browsers". my site has a "wait URL" where when browser gets there it will be kept waiting till another browser also go there and then both will be presented with a quiz-like game.
Right now the wait URL polls Rails every second to check whether another player has arrived. How can I detect, within the Rails framework, a different client connecting to the same URL?
Since the controller is recreated on every request, it doesn't look like the right place; the view certainly isn't; and storing this in the model seems really clumsy.
Also, after the pairing I need to check and compare every answer from the paired users, so that information must somehow be retained.
What you're trying to do is share information between users, so the database or memcached are the most sensible options.
Simplest: I'd create an ActiveRecord object, perhaps called Quiz, instances of which people join by virtue of going to a URL, e.g. using default routes:
http://yoursite.com/quizes/join/3434
You'd need an AJAX poller to notify the others; use periodically_call_remote for this -- to keep it efficient, you could render :nothing => true by default and render something else if there was an error. You can also use the polling frequency as a basis for deciding whether people have left the quiz (e.g. if the frequency is 1s, assume someone has left if they haven't pinged for 5-10s).
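To make that concrete, here is a minimal sketch under a few assumptions: the quizes table (player_one_session, player_two_session columns), controller, and partial names are all hypothetical, not from the answer above.
# app/models/quiz.rb -- hypothetical schema: quizes(player_one_session, player_two_session)
class Quiz < ActiveRecord::Base
  def full?
    player_one_session.present? && player_two_session.present?
  end
end

# app/controllers/quizes_controller.rb
class QuizesController < ApplicationController
  # GET /quizes/join/3434 -- both browsers are sent to this URL
  def join
    @quiz = Quiz.find(params[:id])
    me = request.session_options[:id]   # current session id as a cheap player handle
    if @quiz.player_one_session.blank?
      @quiz.update_attribute(:player_one_session, me)
    elsif @quiz.player_two_session.blank? && @quiz.player_one_session != me
      @quiz.update_attribute(:player_two_session, me)
    end
  end

  # Polled every second from the wait view, e.g.:
  #   <%= periodically_call_remote :url => { :action => 'poll', :id => @quiz.id },
  #                                :frequency => 1, :update => 'game' %>
  def poll
    @quiz = Quiz.find(params[:id])
    if @quiz.full?
      render :partial => 'game'      # both players have arrived, show the quiz
    else
      render :nothing => true        # still waiting, keep the response cheap
    end
  end
end
The same table (or a separate answers table keyed by quiz_id) can also hold each player's answers, which covers the later requirement of comparing the paired users' answers.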
Assuming these users are not registered with the site, and so don't have some kind of user id you could store, I would suggest using the session. It is a per-user data store. By default the session is stored in a signed cookie on the user's machine; however, you can use ActiveRecord as the session store and could maybe query that table directly?
Store the URL in the session and search for it at a later time. You can normally only access the current user's session using the Rails 'session' hash, but maybe (untested) if you created a model called Session (or maybe something more specific like 'WaitingGamers') which used the sessions table, you could look up the information you need.
I would guess that when using ActiveRecord as the session store, the session data is stored as a serialised hash. Use Marshal to turn it back into a regular hash and find the data you stored in there.
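A rough, untested sketch of that idea. The exact encoding of the data column depends on the Rails version, but with the ActiveRecord session store it is typically a Base64-encoded Marshal dump; the model name here is made up:
require 'base64'

# Hypothetical model sitting on top of the sessions table created by the
# ActiveRecord session store migration (columns: session_id, data, updated_at).
class WaitingGamer < ActiveRecord::Base
  set_table_name 'sessions'

  # Turn the serialized data column back into a plain Ruby hash.
  def decoded_data
    Marshal.load(Base64.decode64(read_attribute(:data)))
  end
end

# Example: find every session that stored the wait URL
# (session keys may be strings or symbols depending on the Rails version).
waiting = WaitingGamer.all.select do |s|
  data = s.decoded_data
  (data['wait_url'] || data[:wait_url]) == '/wait'
end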
I'm not a rails expert, but since all the state resides in your database that would be the place to keep this information.
You could keep a "waiting users" table, and in the "wait URL" view check if the user is already in the table. If not, add him to the table. Then, check if there is another user waiting (maybe there's more than one?) and if so, match them up and delete them from the table.
Another improvement would be to keep a timestamp for each user in the "waiting users" table, which gets updated in the view - this would serve as a keep-alive that will enable you to detect users that left the "wait URL" page or closed the browser.
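A minimal sketch of that waiting-users idea, including the keep-alive timestamp; the table name, columns, and JSON response shape are assumptions, not from the question:
# Hypothetical table: waiting_users(session_id, matched_with, updated_at)
class WaitingUser < ActiveRecord::Base
end

class WaitController < ApplicationController
  # Polled by the "wait URL" page every second or so.
  def poll
    me = WaitingUser.find_or_create_by_session_id(request.session_options[:id])
    me.update_attribute(:updated_at, Time.now)   # keep-alive: lets stale rows be detected

    if me.matched_with.blank?
      # Look for someone else who is still alive and not yet matched.
      partner = WaitingUser.first(:conditions =>
        ['session_id != ? AND matched_with IS NULL AND updated_at > ?',
         me.session_id, 10.seconds.ago])
      if partner
        partner.update_attribute(:matched_with, me.session_id)
        me.update_attribute(:matched_with, partner.session_id)
      end
    end

    render :json => { :matched => me.matched_with.present? }
  end
end
In production you'd want the matching step wrapped in a transaction (or guarded by a uniqueness constraint) so two users can't grab the same partner, plus a periodic cleanup of rows whose updated_at has gone stale.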
Warning: some of this may be very wrong-headed, so please let me know if my assumptions are incorrect.
Here's what I'm trying to accomplish:
I'm using restful-authentication for login. However, as I am using flex/ruby_amf for my UI, I have to separately authenticate each connection from flex.
The way I decided to do that was by having the log-in screen redirect to the embedded flash page, inserting the session-id as a flashvar. The flash app sends the session-id with every request, and a before filter on all of the relevant controllers checks to see if the user associated with the session identified by the session-id is logged on.
The way I associate a user with a session is by adding a 'user_id' column to the sessions table and running an SQL "update sessions set user_id...'"-type query from the login function.
However, the user_id only gets updated the 2nd time the user logs in. A little investigating showed that the record in the sessions table does not yet exist during execution of the login function.
So, if everything up to this point makes sense, and conforms to best-practices, etc., then my question is:
At what point in time is the record in the sessions table created? Is there a way to update the session object in the login function and have rails write the user_id to the database for me?
The behavior of sessions in rails is a real mystery to me. I'd appreciate any help.
Thank you.
In Rails 2.3, the session is saved after the Rack application has finished its processing. In traditional Rails applications, this will be after the request is fully processed: before filters, controller action, view rendering, and after filters. Look in actionpack/lib/action_dispatch/vendor/rack-1.1.pre/rack/session/abstract/id.rb.
If you think about it, this makes perfect sense. Writing the session to its store every time you place something in the session would incur a lot of extra overhead.
It's Rails, so if you want to mess with it enough, sure, you can monkeypatch yourself a way to write the session to store anytime you wish. I don't recommend it. You'll end up having to rework the code constantly as Rails evolves.
You are right that for ActiveRecord::SessionStore, one row does map to one session. The data column is an encoded form of every object you put in the session. Each time a request comes in, Rails has to reconstitute the session as it existed by creating new instances of all the objects you previously stored in it.
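If the goal is to have Rails keep that user_id column up to date instead of issuing raw SQL from the login action, one common pattern with the Rails 2.3 ActiveRecord session store is to point the store at a custom session class that copies the value out of the data hash before each save. This is not from the answer above; the class name and session key are assumptions, and it relies on the user_id column already added to the sessions table:
# app/models/user_session_record.rb (hypothetical name)
class UserSessionRecord < ActiveRecord::SessionStore::Session
  before_save :copy_user_id

  private

  # The session data hash is marshaled into the data column on save;
  # mirror the user_id into its own column so it can be queried directly.
  def copy_user_id
    self.user_id = data['user_id'] || data[:user_id]
  end
end

# config/initializers/session_store.rb
# (or config.action_controller.session_store in environment.rb)
ActionController::Base.session_store = :active_record_store
ActiveRecord::SessionStore.session_class = UserSessionRecord

# In the login action, just write to the session as usual; the row, including
# the user_id column, is persisted when the request finishes:
#   session[:user_id] = user.id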