In my project, a user uploads a text file that needs to be read.
The file can be of any size; the one I am testing with is 1 MB and has ~1500 lines, but it can be bigger. Instead of putting everything in the DB, I thought of processing the file and keeping the data in an instance variable.
But instance variables are not available across HTTP requests, so what options do I have for retaining those values between requests? The other reason for not choosing the DB is that I don't need the data to be persisted. It only needs to exist while the user is logged in; once the user logs out, I can discard it.
Please let me know if you need further information.
As #xyious advises, I would say avoid storing that much data in the session; it is just not good practice. You could, however, do the following:
Set up a system-wide configuration setting that holds a path where you store temporary files, in this case the files uploaded by the user
Generate a random filename (maybe with SecureRandom.hex) when the user uploads the file, and store the file under the path from step 1
Store this random filename in the user's session; that way, even across requests you can still access the filename
On each request, whenever you need to process the data, pull the filename from the user's session, join it with the path from step 1, read the file from the filesystem and do the processing as necessary
Add a callback to your login/sessions controller so that when a user logs out you find the file and delete it before logging them out; that way you don't keep unused files around
I would advise against it, but you could store the data in a session variable or in a cookie.
Why would you need that much data to be stored while the user is logged in? Is it possible to save just the important bits?
Using instance variables to store the content is not the right approach, since you don't have a limit on the size of the uploaded file and you end up passing the data around on every request.
First, decide on a size limit, since you expect a text file from users, and then store the file temporarily with a reference path in the DB. The file can be cleaned up when required, and this keeps accessing the content simple. To improve this further, enable caching and set up a caching server for the uploaded files.
If you are not happy with that, the other option I can think of is session variables, which have already been suggested. The data will then live per session, which fits your requirement: you can just do session[:file_Data] = "put parsed content here"
In my application, I have a textarea input where users can type a note.
When they click Save, there is an AJAX call to Web Api that saves the note to the database.
I would like users to be able to attach multiple files to this note (Gmail style) before saving the note. It would be nice if the upload could start as soon as a file is attached, before the note is saved.
What is the best strategy for this?
P.S. I can't use jQuery fineuploader plugin or anything like that because I need to give the files unique names on the server before uploading them to Azure.
Is what I'm trying to do possible, or do I have to make the whole 'Note' a normal form post instead of an API call?
Thanks!
This approach is file-based, but you can apply the same logic to Azure Blob Storage containers if you wish.
What I normally do is give the user a unique GUID when they GET the AddNote page. I create a folder called:
C:\TemporaryUploads\UNIQUE-USER-GUID\
Then any files the user uploads at this stage get assigned to this folder:
C:\TemporaryUploads\UNIQUE-USER-GUID\file1.txt
C:\TemporaryUploads\UNIQUE-USER-GUID\file2.txt
C:\TemporaryUploads\UNIQUE-USER-GUID\file3.txt
When the user does a POST and I have confirmed that all validation has passed, I simply copy the files to the completed folder, with the newly generated note ID:
C:\NodeUploads\Note-100001\file1.txt
Then delete the C:\TemporaryUploads\UNIQUE-USER-GUID folder
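Roughly, that flow could look like the following in an ASP.NET MVC controller. This is only a sketch based on the folder names above; the action names and the SaveNoteToDatabase helper are placeholders, not part of the original answer:

using System;
using System.IO;
using System.Web;
using System.Web.Mvc;

public class NotesController : Controller
{
    private const string TempRoot = @"C:\TemporaryUploads";
    private const string FinalRoot = @"C:\NodeUploads";

    [HttpGet]
    public ActionResult AddNote()
    {
        // Hand the page a GUID that identifies this editing session.
        ViewBag.UploadGuid = Guid.NewGuid().ToString();
        return View();
    }

    [HttpPost]
    public ActionResult UploadAttachment(string uploadGuid, HttpPostedFileBase file)
    {
        // Every file uploaded before the note is saved lands in the per-user GUID folder.
        var folder = Path.Combine(TempRoot, uploadGuid);
        Directory.CreateDirectory(folder);
        file.SaveAs(Path.Combine(folder, Path.GetFileName(file.FileName)));
        return new HttpStatusCodeResult(200);
    }

    [HttpPost]
    public ActionResult SaveNote(string uploadGuid, string noteText)
    {
        int noteId = SaveNoteToDatabase(noteText); // placeholder for your own persistence
        var tempFolder = Path.Combine(TempRoot, uploadGuid);
        var finalFolder = Path.Combine(FinalRoot, "Note-" + noteId);
        Directory.CreateDirectory(finalFolder);
        foreach (var path in Directory.GetFiles(tempFolder))
            System.IO.File.Copy(path, Path.Combine(finalFolder, Path.GetFileName(path)));
        Directory.Delete(tempFolder, recursive: true); // nothing left behind
        return RedirectToAction("Index");
    }

    private int SaveNoteToDatabase(string noteText) { return 100001; }
}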
Cleaning Up
Now, that's all well and good for users who actually go ahead and save a note, but what about the ones who upload a file and then close the browser? There are two options at this stage:
Have a background service clean up these files on a schedule (daily, weekly, etc.). This is a good job for Azure WebJobs
Clean up the old files from the web app each time a new note is saved. Not a great approach, as you're doing file I/O when there are potentially no files to delete
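If you go the background-service route, the cleanup itself can be as simple as this sketch (the 24-hour cutoff is an arbitrary choice, not something from the answer):

using System;
using System.IO;

public static class TemporaryUploadCleaner
{
    // Delete any per-user GUID folder that hasn't been touched in the last 24 hours.
    public static void Run(string tempRoot = @"C:\TemporaryUploads")
    {
        var cutoff = DateTime.UtcNow.AddHours(-24);
        foreach (var folder in Directory.GetDirectories(tempRoot))
        {
            if (Directory.GetLastWriteTimeUtc(folder) < cutoff)
                Directory.Delete(folder, recursive: true);
        }
    }
}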
Building on RGraham's answer, here's another approach you could take:
Create a blob container for storing note attachments. Let's call it note-attachments.
When the user comes to the note-creation screen, assign a GUID to the note.
When the user uploads a file, you just prefix the file name with this note id. So if a user uploads, say, file1.txt, it gets saved into blob storage as note-attachments/{note id}/file1.txt.
Depending on your requirements, once you save the note you may move this blob to another blob container or leave it where it is. Since the blob has the note id in its name, searching for a note's attachments is easy.
For uploading files, I would recommend going directly from the browser to blob storage, making use of AJAX, CORS and a Shared Access Signature. That way the data never has to pass through your servers. You may find these blog posts useful:
Revisiting Windows Azure Shared Access Signature
Windows Azure Storage and Cross-Origin Resource Sharing (CORS) – Lets Have Some Fun
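For illustration, the server-side piece of that could look roughly like this with the classic WindowsAzure.Storage SDK; the method name, the 15-minute expiry and the connection-string handling are assumptions, and newer Azure SDKs expose a different API:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class AttachmentSas
{
    public static string GetUploadUrl(string connectionString, Guid noteId, string fileName)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("note-attachments");
        var blob = container.GetBlockBlobReference(noteId + "/" + fileName);

        // Short-lived, write-only SAS so the browser can PUT the file itself.
        var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Write,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
        });

        return blob.Uri + sas; // the SAS token already starts with '?'
    }
}

The browser then uploads the file bytes straight to the returned URL (with CORS enabled on the storage account), so nothing passes through the web server.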
I have an object that I want to store for a moment. The object lives in a controller for now, and that controller renders a view. An AJAX request is then made from the view to another controller, and at that point I need the object I stored earlier. Previously I used the session and it worked well, but I'm not sure it is the right thing to do. Is session the answer for this, or is there something else?
I have also used the cache, but the cache is shared across all users, so one user's data overrides another's and the cached object ends up changing under the same key. I need to handle this temporary storage per user, independently.
How is that possible? If there is another approach, please share it.
In the controller I have used HttpContext.Cache["key"] = dataset;
but someone suggested doing it like this, and it is not showing up for me.
To explain:
In the controller, HttpContext.Current.Cache does not come up.
Only the HttpContext.CurrentHandler and HttpContext.CurrentNotification properties come up. So how can we handle temporary data storage in MVC?
Please help me.
You could use TempData if you want to store data for the next request only. If the data should be accessible across multiple requests, then use Session. Here is a short explanation of each one with examples.
As Alex said, you could use TempData, but if you want to use the data across multiple requests you can call TempData.Keep("YourKey") after reading the value, to retain it for the next request too. For your information, TempData internally uses Session to store your data (temporarily).
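A minimal sketch of that behaviour (the key and action names are just examples):

using System.Web.Mvc;

public class DemoController : Controller
{
    public ActionResult First()
    {
        TempData["YourKey"] = "some value"; // survives into the next request
        return RedirectToAction("Second");
    }

    public ActionResult Second()
    {
        var value = TempData["YourKey"]; // reading normally marks the entry for removal
        TempData.Keep("YourKey");        // keep it alive for one more request
        return RedirectToAction("Third");
    }

    public ActionResult Third()
    {
        var stillThere = TempData["YourKey"]; // available here because of Keep()
        return Content(stillThere as string);
    }
}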
I would recommend URL parameters for an HTTP GET, or hidden form fields for an HTTP POST, if this is short-lived. This is mostly about avoiding the session.
But if it should really persist, then a database might be a reasonable location. Imagine a shopping cart that you don't want to dump just because a session timed out, because you'd like to remind the user next time about items they still haven't purchased.
Why not use the session? I don't generally recommend using the session, as you could find yourself with a global variable that two different browser windows are manipulating. Imagine a glass. One window is trying to fill it with Ice Tea. Another window is trying to fill it with Lemonade. But what do you have? Is it Lemonade? Is it Ice Tea? Or is it an Arnold-Palmer? If you try to put too much stuff on the session, and overly expect it to just be there, you might create an application that is non-deterministic if heaven forbid a user opens a second window or tab, and switches back and forth between the windows.
I'm more ok with Temp Data, if you truly have no other options. But this is not for persisting data for more than a second. Temp data will disappear after the first request reads it, as in, it's meant for a very temporary usage.
I personally only use TempData if I have to do a redirect where I can't otherwise keep it with me, or if I need to have that data for say generating a PDF or image that is going to be called via a HTTP Get by a viewer on the actual page, and then only if the model data is too large for the GET url ( many browsers only support just over 2000 characters, which long description or many fields could fill up.)
But again, pushing items around in hidden form variables, or in url parameters can be safe, because you have no multiple window use conflicts (each carries around its own data for peace of mind.)
I'd like to hold a collection of uploaded files for a user (where there might be multiple requests for each file, or even multiple requests per file for chunking), but I'm struggling to find the appropriate scope. Once they're done, another request will say so, and the collection will dump its data to physical files and a DB entry and empty itself.
Ben here: http://buildstarted.com/2011/07/17/asp-net-mvc-3-file-uploads-using-the-fileapi/ uses a static collection, but that would be inappropriate for multiple users.
You need to store the files somewhere semi-permanent. Session could be reset along with the app domain, so you can't rely on it 100%.
Just have a separate file/DB location or a flag which lets you know that the whole set of files is not yet complete.
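One way to sketch that flag idea, assuming a per-batch folder and a marker file (both names are made up for illustration): each upload batch gets its own folder plus an .incomplete marker that only disappears once the client says the set is finished.

using System.IO;

public static class UploadBatch
{
    public static void SaveChunk(string root, string batchId, string fileName, Stream data)
    {
        var folder = Path.Combine(root, batchId);
        Directory.CreateDirectory(folder);
        // The marker tells any cleanup or processing job that this set isn't done yet.
        File.Create(Path.Combine(folder, ".incomplete")).Dispose();
        using (var target = File.Open(Path.Combine(folder, fileName), FileMode.Append, FileAccess.Write))
            data.CopyTo(target);
    }

    public static void MarkComplete(string root, string batchId)
    {
        // Removing the flag signals that the whole set of files is ready to use.
        File.Delete(Path.Combine(root, batchId, ".incomplete"));
    }
}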
I'm just starting to learn ASP.NET MVC and I'd like to know how I can retain model objects between subsequent requests to controller action methods.
For example, say I'm creating a contact list web app. Users can create, update, rename, and delete contacts in their list. However, I also want users to be able to upload a contact list exported from other programs. I don't want to just automatically add all the contacts in the uploaded file; I want to give the user a secondary form where they can pick which uploaded contacts should actually be added to their list.
So first I have a ContactController.Upload() method which shows an upload form. This submits to ContactController.Upload(HttpPostedFileBase file), which reads the posted file into a set of Contact model objects. Then I want to display a list of the names of all the contacts and let the user select those that should be added to their contact list. This might be a long list that needs to be split across multiple pages, and I might also want to let the user edit contact details before they are actually added to their contact list.
Where should I keep the model objects between when a user uploads a file and when they finally submit the specific contacts they want? I'd rather not immediately load all the uploaded contacts into the back-end database, as the user may end up selecting only a handful to actually add; the rest would then need to be deleted. I would also have to account for the case where a user uploads a file but never actually completes the process.
From what I understand, an instance of a controller only lasts for one request. So should I create a static property on my ContactController that contains all the latest uploaded contact collections, and then have some process that periodically checks the age of these collections and clears out any that are older than a specified expiration time?
A static property on the controller is trouble. First off, it won't work in a web farm, and second, you'd have to deal with multiple requests from different users. If you really don't want to use your database, you could use the ASP.NET Session.
No, you don't want a static property, as that would be shared across all instances of the controller, even for other users.
Instead, you should create a table to upload the data into. This table acts as an intermediary between the user uploading the data and completing the process. Upon completion, you copy the contacts you want to keep into your permanent table and then delete the temporary data. You can then run a process every so often that purges incomplete data older than a specified time limit.
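A rough sketch of that intermediary table using Entity Framework; the StagedContact entity, the context and the 24-hour purge window are hypothetical, not something from the question:

using System;
using System.Data.Entity;
using System.Linq;

public class StagedContact
{
    public int Id { get; set; }
    public Guid ImportBatchId { get; set; }   // groups rows that came from one uploaded file
    public string Name { get; set; }
    public string Email { get; set; }
    public DateTime UploadedAtUtc { get; set; }
}

public class ContactsDb : DbContext
{
    public DbSet<StagedContact> StagedContacts { get; set; }
}

public static class StagingCleanup
{
    // Purge staged rows that were never confirmed within the time limit.
    public static void PurgeIncomplete(ContactsDb db)
    {
        var cutoff = DateTime.UtcNow.AddHours(-24);
        db.StagedContacts.RemoveRange(db.StagedContacts.Where(c => c.UploadedAtUtc < cutoff));
        db.SaveChanges();
    }
}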
You could also use HttpContext.Cache, which supports expiration (and sliding expiration) out of the box.
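For example, something along these lines; the per-user key format, the 20-minute sliding window and the Contact stub are assumptions:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public class Contact { public string Name { get; set; } public string Email { get; set; } }

public static class UploadCache
{
    public static void Store(HttpContextBase ctx, string userId, List<Contact> contacts)
    {
        ctx.Cache.Insert(
            "uploaded-contacts:" + userId,   // key the entry per user
            contacts,
            null,                            // no cache dependency
            Cache.NoAbsoluteExpiration,      // expire on inactivity instead of at a fixed time
            TimeSpan.FromMinutes(20));       // sliding expiration window
    }

    public static List<Contact> Load(HttpContextBase ctx, string userId)
    {
        return ctx.Cache["uploaded-contacts:" + userId] as List<Contact>;
    }
}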
Alternatively, and perhaps even better (but more work), you could use cookies and have the user modify the data using JavaScript in her browser before finally posting it to you.
However, I'd strongly recommend storing the uploaded information in the database instead.
As you pointed out, it might be a lot of data, and the user might want to edit it before clicking 'confirm'. What happens if the user's machine (or browser) crashes, or she has to leave urgently?
Depending on how you store the data, it will probably be lost in that scenario. Even if you used the user id as a cache key, a server restart, cache expiration or cache overflow would cause data loss.
The best solution is probably a combination of database and cookie storage, where the DB keeps the information in a temporary collection. Every n minutes, or upon pagination, the modified data is sent to the server and updated in the DB.
The problem with storing the data in session or in memory is what happens if the user uploads 50k contacts or more. You then have a very large data set in memory to deal with, which, depending on your platform, may affect application performance.
If this is never going to be an issue and the size of the imported contact list is manageable, you can use either the session or the cache to store the dataset for further modification. Just remember to clear it when the user has committed the changes; you don't want a few heavy datasets hanging around in session.
If you store the dataset in session via your application controller, it will be available to all controllers for as long as it is needed.
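A small sketch of that session option; the key name, the Contact stub and the ParseContacts helper are placeholders. Stash the parsed list while the user reviews it, and remove it as soon as the import is committed:

using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Web.Mvc;

public class Contact { public string Name { get; set; } public string Email { get; set; } }

public class ContactImportController : Controller
{
    private const string SessionKey = "ImportedContacts";

    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        List<Contact> parsed = ParseContacts(file.InputStream); // placeholder parser
        Session[SessionKey] = parsed;              // held only while the user reviews it
        return RedirectToAction("Review");
    }

    [HttpPost]
    public ActionResult Commit(int[] selectedIndexes)
    {
        var contacts = Session[SessionKey] as List<Contact>;
        // ... persist only the selected contacts to the real table here ...
        Session.Remove(SessionKey);                // don't leave a heavy dataset in session
        return RedirectToAction("Index");
    }

    private List<Contact> ParseContacts(Stream s) { return new List<Contact>(); }
}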
I need to run a periodic background task that fetches data from a weather site.
How, in Rails, can I fetch JSON from a remote URL and store it on the server? Also, there doesn't seem to be any point in storing this in a DB, so how can I store a variable in Rails that is available to all users?
Why don't you store the data in the cache (local or memcached)?
Write/update the cache when you retrieve the feed data:
def load_feed
  Rails.cache.write("feed_data", get_data_from_feed)
end
Read from the cache when you need to access the data:
def read_feed
  Rails.cache.fetch("feed_data") { get_data_from_feed }
end
I will only partially answer your question.
If you want to store it as a variable available to all users, you can create something like my_new_variable.rb in config/initializers and put code there that defines and initializes your variable (a constant may be better). The bad part of this approach is that you have to restart your server for changes to take effect. If you are using Passenger, just touch tmp/restart.txt and it's done.
You can also store it in a YAML file and load it on server start.
Even if you stored it in a different way, the easiest way to pick up a new value is probably still to restart the server. Otherwise, on every request you would have to check whether there is a new value available (for example, by checking a file's last update time) or reload the file.
So, to me it looks like the easiest way is to store it in the DB.