Why does saving a news record time out in TYPO3 7.6?

Saving a news record in TYPO3 7.6 times out. In addition to saving only part of the news content, the backend (BE) is very slow and associating tags is practically blocked, while the frontend (FE) works normally.

Related

Cache layer to json api (on rails)

I have a small social website on Rails (planning to port to Phoenix) that uses React for the view; the backend is just a JSON API,
with more or less 3,000 users online at any moment. It runs on Postgres/memcached.
When a user, for example, visits their feed page, I do:
Select activities from the database (20 per page)
Select the last 4 comments for each activity from the database (just 1 select)
Select all users referenced by an activity or comment from the database (select users.* from users where id in (1,3,4,5,...100))
I have a cache layer (memcached): when I load users, I first try to load them from memcached; if they are not there, I read from the database and put them in the cache.
BUT I also have some "listeners" on the users model (and on other referenced models like address and profile) to invalidate the cache if any field changes.
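To make that concrete, here is a minimal TypeScript sketch of the read-through-plus-invalidation flow described above (a plain Map stands in for memcached, the loader is a stand-in for the users query, and all names are illustrative; the TTL from the alternatives below is included too):

```typescript
// Minimal read-through cache sketch; a Map stands in for memcached.
interface User { id: number; name: string; }

type Entry = { value: User; expiresAt: number };

const cache = new Map<string, Entry>();
const TTL_MS = 2 * 60 * 1000; // the optional "expires_in" from the alternatives

// Stand-in for "select users.* from users where id in (...)".
async function loadUsersFromDb(ids: number[]): Promise<User[]> {
  return ids.map((id) => ({ id, name: `user-${id}` })); // hypothetical rows
}

async function getUsers(ids: number[]): Promise<User[]> {
  const hits: User[] = [];
  const misses: number[] = [];
  for (const id of ids) {
    const entry = cache.get(`user:${id}`);
    if (entry && entry.expiresAt > Date.now()) hits.push(entry.value);
    else misses.push(id);
  }
  // Read-through: fetch only the missing users, then populate the cache.
  const loaded = await loadUsersFromDb(misses);
  for (const user of loaded) {
    cache.set(`user:${user.id}`, { value: user, expiresAt: Date.now() + TTL_MS });
  }
  return [...hits, ...loaded];
}

// The "listener" side effect: call this whenever a user row changes.
function invalidateUser(id: number): void {
  cache.delete(`user:${id}`);
}
```

With a short TTL like this, the listeners become optional: dropping them trades the invalidation code and its side effects for a couple of minutes of possibly stale data.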
The problem:
This cache demands a lot of code.
Sometimes the cache runs out of sync.
I hate having these listeners; they are "side effects".
My question is: is anyone doing something like this?
I searched A LOT all over Google for cache layers for JSON APIs, and it looks like everyone is just using the database directly.
I know that Rails has its own solution (and I guess that Phoenix doesn't have one), but it always ends up using the updated_at column, which means I have to go to the database anyway.
Alternatives:
Live with it; life is not pretty.
Buy a more powerful Postgres instance... if no one is using memcached like that.
Remove the listeners, set some expires_in (1 or 2 minutes... or more) and let the app show out-of-sync data for a couple of minutes.
Thanks for any help!

Rails best way to record lots of client side events

I want to record a whole lot of Vimeo events by using their JS API and AJAX posts. The data will be saved in Postgres via a Rails app. I want to record events like 'play', 'pause', 'seek', '1% played', '50% played' along with where in the video they occur, all in chronological order. This is so that I can then piece together exactly what is happening in each video (e.g. if the user keeps rewinding, the video is not explaining concepts properly; if they skip to the end, the video is boring them, etc.).
I would like the system to be able to scale to handle 5,000 customers, not all obviously watching the videos at once. Will Postgres and Rails be able to handle this barrage of HTTP posts without bringing the site to a standstill, using no more than a few gigs of RAM?
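One way to keep that barrage manageable, sketched below in TypeScript, is to buffer events in the browser and POST them in batches; the '/player_events/batch' endpoint and the field names are hypothetical, and the server side would do a single bulk INSERT per batch:

```typescript
// Buffer player events client-side and POST them in batches,
// so 5,000 viewers don't each generate one request per event.
interface PlayerEvent {
  videoId: string;
  kind: string;      // 'play', 'pause', 'seek', '50% played', ...
  position: number;  // seconds into the video
  at: number;        // client timestamp, for chronological ordering
}

const buffer: PlayerEvent[] = [];
const FLUSH_EVERY_MS = 10_000;

function record(event: PlayerEvent): void {
  buffer.push(event);
}

async function flush(): Promise<void> {
  if (buffer.length === 0) return;
  const batch = buffer.splice(0, buffer.length);
  // Hypothetical Rails endpoint that inserts the whole batch in one statement.
  await fetch("/player_events/batch", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ events: batch }),
  });
}

setInterval(flush, FLUSH_EVERY_MS);
// Also flush on page hide so trailing events aren't lost.
window.addEventListener("pagehide", () => { void flush(); });
```

Batching like this turns one HTTP post per event into one post every few seconds per viewer, which a single Rails/Postgres instance is far more likely to absorb.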
Is there an alternative way of saving this data?

"State" management for asp.net mvc multi partial view + ajax app

I am trying to convert my asp.net mvc4 app, which had fairly heavy use of SessionState, into a stateless app. I understand that I can store this information in the DB, and intend to do so.
My question, though, is about my particular architecture. My app has a main 'page' consisting of a number of partial view panels, each of which has actions that can affect the other panels. What I've been doing up to now is storing the entire state of the viewModel (lots of inter-related EF list collections and 'record' objects) in the session, and it's been working great. Except when the session just randomly dies.
So, I need to get this data out of the session and into the DB, where I can rebuild the thing as needed. My concern is that, if I store the info in the database, every single action done on screen might affect 3-5 different panels, each with its own state updates; that's a minimum of 10 round trips to the DB for every interaction!
What are some strategies I can use to make this idea more scalable?
EXTRA INFO
The view in question here is a sort of POS shopping cart system. There are panels for selecting events, selecting/adding items to the cart, editing cart items, selecting contacts, editing contacts, displaying the cart items, displaying the cart 'subtotals', and finally, a panel with a [checkout] button.
Selecting a new event will change the list of available items. Selecting an item to add to the cart will change the cart item list, subtotals, as well as the checkout panel. Same for editing a cart item.
The main concern is how to recover from a lost session, as I've found the built-in ASP.NET session code too unreliable. My testers have encountered issues with sessions timing out and my app not having any kind of recovery process. When it's installed on 1,500 sites, each with an average of 10 users, it's going to be a plague of lost-session issues, and I need to combat that before it becomes a real problem.
I agree that I'm not going stateless... wrong choice of words, used in a rush. I'm just trying to move that state into a form I can rely on past a session failure. My main idea at present is to continue using the session as the local cache for the viewModel data, but to have a fallback operation that can rebuild the viewModel from the DB if the session copy is lost somehow.
You shouldn't necessarily be using a database to store (what sounds like) data that only needs to be persisted in the short term.
If these changes to the other partials are only relevant in the context of the current "master view," then I would suggest using jQuery AJAX to send off the requests, parse the response JSON and update the other views. Tutorials on jQuery AJAX and ASP.NET MVC are easy to find, if you don't already have the knowledge:
http://www.codeproject.com/Articles/41828/JQuery-AJAX-with-ASP-NET-MVC
This way, you don't need to make a bunch of round trips. If the changes need to be persisted beyond the context of the current view, make ONE round trip to the database to perform the update and then simply update all of the other partials from the in-memory response from the AJAX call.
You don't need to read from secondary storage multiple times when you already have all of the information you need in-memory. Just do the reading and writing once.
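As a sketch of that idea in TypeScript (using plain fetch rather than jQuery for brevity; the endpoint, panel ids, and response shape are all hypothetical), one request applies the change and the client fans the single response out to every affected panel:

```typescript
// One round trip: the server applies the change and returns fresh
// content for every affected panel in a single JSON payload.
interface CartUpdateResponse {
  cartItemsHtml: string;   // re-rendered partials from the server
  subtotalsHtml: string;
  checkoutHtml: string;
}

async function addItemToCart(itemId: number): Promise<void> {
  const response = await fetch("/cart/add", {  // hypothetical MVC action
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ itemId }),
  });
  const panels: CartUpdateResponse = await response.json();

  // Fan the single response out to each partial; no extra requests.
  document.querySelector("#cart-items")!.innerHTML = panels.cartItemsHtml;
  document.querySelector("#subtotals")!.innerHTML = panels.subtotalsHtml;
  document.querySelector("#checkout")!.innerHTML = panels.checkoutHtml;
}
```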
I decided to go with a hybrid approach. I'm still using session, but I'm building out a DB 'recovery' option, so that if the session portion is lost, the DB will be able to provide the values needed to rebuild the session seamlessly.
Seems to be working well, so far.
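In outline, the fallback read looks something like this (a TypeScript sketch with hypothetical store interfaces, not the actual ASP.NET code):

```typescript
// Hybrid state store: the session is the fast cache, the DB is the
// durable fallback used to rebuild state after a session loss.
interface ViewModel { cartId: string; /* panels' state ... */ }

interface SessionStore {
  get(key: string): ViewModel | undefined;
  set(key: string, vm: ViewModel): void;
}
interface DbStore {
  loadViewModel(userId: string): Promise<ViewModel | null>;
}

async function getViewModel(
  userId: string, session: SessionStore, db: DbStore
): Promise<ViewModel | null> {
  const cached = session.get(`vm:${userId}`);
  if (cached) return cached;                          // normal case: session survived

  const rebuilt = await db.loadViewModel(userId);     // recovery path
  if (rebuilt) session.set(`vm:${userId}`, rebuilt);  // re-warm the session
  return rebuilt;
}
```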

backbone js store (large) remote collection locally

I'm working on a Backbone.js HTML5 application for use on an iPad that has to work in both online and offline mode.
The dataset I'm working with is relatively large:
a remote collection of about 300 categories (each with about 3 properties)
a remote collection of about 4000 items (each with about 150 properties and a foreign key to a single category)
When in online mode, the collection of categories should be fetched first and stored locally. This information rarely changes, so I only need to update it every now and then.
When a category is selected, the app should remotely fetch the items for that category (max 100). The user can also make a selection of items to store locally for use in offline mode. The number of items that can be stored locally has to be limited for performance purposes.
And now for my question ;-):
I would like to know the best way to go about fetching the remote collections and storing them in an offline database (probably WebSQL?). Are there performance caveats I should be on the lookout for?
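For the category half, one possible shape is sketched below in TypeScript; localStorage stands in for WebSQL to keep the sketch self-contained, and the endpoint is hypothetical:

```typescript
// Cache the category collection locally with a fetch timestamp, so the
// app can decide when the data is stale enough to refresh.
// localStorage stands in for WebSQL here; the endpoint is hypothetical.
interface Category { id: number; name: string; }

const CATEGORY_KEY = "categories";
const MAX_AGE_MS = 24 * 60 * 60 * 1000; // refresh at most once a day

async function getCategories(online: boolean): Promise<Category[]> {
  const raw = localStorage.getItem(CATEGORY_KEY);
  const cached = raw
    ? (JSON.parse(raw) as { fetchedAt: number; data: Category[] })
    : null;

  const fresh = cached && Date.now() - cached.fetchedAt < MAX_AGE_MS;
  if (!online || fresh) return cached ? cached.data : [];

  // Online and stale (or never fetched): refresh the local copy.
  const response = await fetch("/api/categories");
  const data: Category[] = await response.json();
  localStorage.setItem(
    CATEGORY_KEY,
    JSON.stringify({ fetchedAt: Date.now(), data })
  );
  return data;
}
```

The 4,000-item collection (150 properties each) is the part to watch: serialized, it may run into the few-megabyte quotas of localStorage-style stores, which is one more reason to keep the locally stored item selection small, as you already plan to.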
If anyone is interested I can share some of the code I am working on. Currently the project is a combination of Backbone.js, Require.js and jQuery Mobile.
Kind regards,
Jasper

How can I persist objects between requests with ASP.NET MVC?

I'm just starting to learn ASP.NET MVC, and I'd like to know how I can retain model objects between subsequent requests to controller action methods.
For example, say I'm creating a contact-list web app. Users can create, update, rename, and delete contacts in their list. However, I also want users to be able to upload a contact list exported from other programs. Yet I don't want to just automatically add all the contacts in the uploaded file; I want to give the user a secondary form where they can pick which uploaded contacts should actually be added to their list.
So first I have a ContactController.Upload() method which shows an upload form. This submits to ContactController.Upload(HttpPostedFileBase file) which reads the file that was posted into a set of Contact model objects. Then I want to display a list of all the names of the contacts in the list and allow the user to select those that should be added to their contact list. This might be a long list that needs to be split up into multiple pages, and I might also want to allow the user to edit the details of the contacts before they are actually added to their contact list.
Where should I save the model objects between when a user uploads a file and when they finally submit the specific contacts they want? I'd rather not immediately load all the uploaded contacts into the back end database, as the user may end up only selecting a handful to actually add. Then the rest would need to be deleted. Also I would have to account for the case when a user uploads a file, but never actually completes the upload.
From what I understand, an instance of a controller only lasts for one request. So should I create a static property on my Contact controller that contains all the latest uploaded contact model object collections? And then have some process that periodically checks the age of these collections and clears out any that are older than some specified expiration time?
A static property on the controller is trouble. First off, it won't work in a web farm, and second, you'd have to deal with multiple requests from different users. If you really don't want to use your database, you could use the ASP.NET Session.
No, you don't want a static property, as that would be static to all instances of the controller, even for other users.
Instead, you should create a table used to upload the data to. This table would be used as an intermediary between when the user uploads the data, and completes the process. Upon completion, you copy the contacts you want to keep into your permanent table, then delete the temporary data. You can then run a process every so often that purges incomplete data that is older than a specified time limit.
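In outline (a sketch with illustrative table and column names, Postgres-flavoured SQL, and a `query` stand-in for whatever DB client the app uses):

```typescript
// Staging-table flow: uploaded contacts land in a temporary table keyed
// by an upload batch id; chosen rows are copied out, the rest purged.
declare function query(sql: string, params?: unknown[]): Promise<void>;

async function stageUpload(batchId: string, userId: number): Promise<void> {
  // Each parsed contact is inserted with the batch id and a timestamp.
  await query(
    "INSERT INTO staged_contacts (batch_id, user_id, name, email, uploaded_at) " +
      "VALUES ($1, $2, $3, $4, now())",
    [batchId, userId, "Jane Doe", "jane@example.com"] // example row
  );
}

async function commitSelection(batchId: string, ids: number[]): Promise<void> {
  // Copy only the contacts the user ticked, then drop the whole batch.
  await query(
    "INSERT INTO contacts (user_id, name, email) " +
      "SELECT user_id, name, email FROM staged_contacts " +
      "WHERE batch_id = $1 AND id = ANY($2)",
    [batchId, ids]
  );
  await query("DELETE FROM staged_contacts WHERE batch_id = $1", [batchId]);
}

async function purgeAbandoned(): Promise<void> {
  // Scheduled job: remove uploads never completed within 24 hours.
  await query(
    "DELETE FROM staged_contacts WHERE uploaded_at < now() - interval '24 hours'"
  );
}
```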
You could also use the HttpContext.Cache, which supports expiration (and sliding expiration) out of the box.
Alternatively, and perhaps even better (but more work), you could use cookies and have the user modify the data using JavaScript in her browser before finally posting it to you.
However, I'd strongly recommend storing the uploaded information in the database instead.
As you pointed out, it might be a lot of data and the user might want to edit it before clicking 'confirm'. What happens if the user's machine (or browser) crashes or she has to leave urgently?
Depending on the way you store it, the data in this scenario will probably be lost. Even if you used the user ID as a cache key, a server restart, cache expiration, or cache overflow would cause data loss.
The best solution is probably a combination of database and cookie storage where the DB keeps the information in a temporary collection. Every n minutes, or upon pagination, the modified data is sent to the server and updated in the DB.
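A sketch of that periodic sync in TypeScript (the endpoint is hypothetical; edits are coalesced per contact so each interval sends at most one row per contact):

```typescript
// Periodically push locally edited contacts to the server so a crash
// loses at most the last interval's edits.
interface ContactEdit { id: number; name: string; email: string; }

const pending = new Map<number, ContactEdit>();
const SYNC_EVERY_MS = 5 * 60 * 1000; // "every n minutes"

function trackEdit(edit: ContactEdit): void {
  pending.set(edit.id, edit); // last write wins per contact
}

async function syncPending(): Promise<void> {
  if (pending.size === 0) return;
  const edits = [...pending.values()];
  pending.clear();
  await fetch("/contacts/staged", {  // hypothetical endpoint
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ edits }),
  });
}

setInterval(syncPending, SYNC_EVERY_MS);
```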
The problem with storing the data in the session or in memory is what happens if the user uploads 50k contacts or more. You then have a very large dataset in memory to deal with, which, depending on your platform, may affect application performance.
If this is never going to be an issue and the size of the imported contacts list is manageable, you can use either the session or the cache to store the dataset for further modifications. Just remember to clear it when the user has committed the changes; you don't want a few heavy datasets hanging around in the session.
If you store the dataset in session using your application controller then it will be available to all controllers while it is needed.
