I'm using NSURLConnection to interact with the server side, and I have observed that when the server takes a long time to respond, the system allocates about 40 MB.
I don't know if I'm the only one to have this problem.
Thanks in advance.
Yes, this can happen when the response data is large. Generally we create an instance of NSMutableData and append all downloaded data to it. That works fine when the data is comparatively small. If the response is large, the better approach is to create a file in the Documents directory and append each chunk to that file as the connection receives data, then read the data back after the connection finishes loading.
The same approach to saving data applies on Android as well.
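A minimal sketch of the stream-to-disk approach described above, using the NSURLConnection delegate callbacks (the property names and file name are illustrative):

```objectivec
// Stream response bytes straight to a file in Documents instead of
// accumulating them in memory. Assumes self.fileHandle and self.filePath
// are properties declared on the delegate class.
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (!self.fileHandle) {
        NSString *dir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                            NSUserDomainMask, YES)[0];
        self.filePath = [dir stringByAppendingPathComponent:@"response.dat"];
        [[NSFileManager defaultManager] createFileAtPath:self.filePath
                                                contents:nil
                                              attributes:nil];
        self.fileHandle = [NSFileHandle fileHandleForWritingAtPath:self.filePath];
    }
    [self.fileHandle writeData:data]; // append each chunk as it arrives
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    [self.fileHandle closeFile];
    self.fileHandle = nil;
    // Read the complete payload back from self.filePath when you need it.
}
```

This keeps peak memory usage close to the size of a single chunk rather than the size of the whole response.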
Related
I am developing an iOS app in Objective-C. My 'GET' request returns a large chunk of JSON data in the format below.
[ { "value": 1, "day": 2, "hour": 1 }, { "value": 1, "day": 1, "hour": 1 } ... ]
Note: this array always contains a fixed number of 168 objects.
Inside my app I have different UI controls that are supposed to show different chunks of the obtained data. For example, tapping 'Button1' should show obj1 to obj10, and so on.
In theory it all works, but I am not happy with my design approach, because on each button press I call the API to fetch the entire data set again and extract the required portion.
Ideally, I think I should store the data locally upon the first 'GET' request, and the different classes within my app should be able to extract the required information from that local copy.
The same should apply to my 'POST' requests. I am confused about what options I have and what the best practice is in this situation. I can think of the following:
Store the data in an array?
Store the data in a database like SQLite?
Finally, plists?
Would using Core Data be a bit of an overkill?
168 records of roughly 40 bytes each gives about 7 KB of data, probably reduced to something like 2 KB if your API server supports gzip compression. That is nothing by modern web standards.
You could download it completely, parse it, and save it to a variable in your view controller. I recommend creating a "model" class (as in MVC) to wrap your data nicely and to factor out the row-selection logic.
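One way such a model class could be sketched (the class and method names are illustrative, not from any library):

```objectivec
// A thin model wrapper around the 168 parsed records. The row-selection
// logic lives here instead of in the view controller.
@interface HourlyValues : NSObject
@property (nonatomic, copy) NSArray *records; // NSDictionary objects from the JSON
- (NSArray *)recordsInRange:(NSRange)range;
@end

@implementation HourlyValues
- (NSArray *)recordsInRange:(NSRange)range {
    return [self.records subarrayWithRange:range];
}
@end

// Usage, e.g. in Button1's action: show the first ten records.
// NSArray *chunk = [model recordsInRange:NSMakeRange(0, 10)];
```

Each button handler then asks the model for its slice instead of re-fetching from the API.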
Core Data (or SQLite) is definitely overkill for this amount of data.
You can use something like a plist or NSKeyedArchiver to cache the data to disk if you need offline support, i.e. you want the app to work without an internet connection.
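The plist variant can be as simple as the following (the file name is illustrative, and the records must be plist-compatible objects such as dictionaries, strings, and numbers):

```objectivec
// Cache the parsed array to disk for offline use.
NSString *cachesDir = NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                          NSUserDomainMask, YES)[0];
NSString *path = [cachesDir stringByAppendingPathComponent:@"records.plist"];

// Save after the first successful 'GET'.
[records writeToFile:path atomically:YES];

// Load on a later launch; returns nil if no cache exists yet.
NSArray *cached = [NSArray arrayWithContentsOfFile:path];
```

If the objects are custom classes rather than plain dictionaries, NSKeyedArchiver with NSCoding is the equivalent route.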
I have been developing an app with a cloud/server data source, so naturally the process is one thing at a time. At present, to fill my tables, I query the server, which returns an array. The array contains URLs to images, and I then load the images from those URLs using SDWebImage.
Now I am entering the stage of development where I need to implement Core Data for all the data in my tables (i.e. texts and images). So far I am considering the following approaches:
I can load the array from the server into Core Data (imagine properties such as firstName, lastName, photoUrl, shortBio) and then pass the photo URL from Core Data to SDWebImage to display the image in the table cells. OR
I can load both the array and the images into Core Data (i.e. load the array into Core Data in the background and then, for each row, load the image into Core Data).
Of course the point here is that if I use SDWebImage, it will save the image in its own caching system, which from my limited understanding may or may not be consistent with what is in Core Data. On the other hand, I don't understand Core Data well enough to know whether it handles saving images well in terms of performance (i.e. whether it knows it's an image and therefore handles the file linking).
So what is the best way to do this? Can SDWebImage work in harmony with Core Data? Or is SDWebImage redundant because Core Data is good enough all by itself?
Another thing to note is that presently my data loads from the server immediately, and the images then appear as SDWebImage loads each one into its UIImageView. This may not be a problem with Core Data, since ideally the image would be in the local DB. Any thoughts?
Based on your question and comments, it seems you are trying to locally cache images that were retrieved through an HTTP request.
The URL loading system is already caching the images. There is no need to implement another layer of caching on top of that, whether it be SDWebImage or Core Data. When an HTTP response is received from the server, the server includes "freshness" information that informs the client how long and under what conditions that response is valid. The URL loading system, by default, obeys those rules. You can check the freshness information of responses using a tool like Charles or REDBot. The server is the only party in this conversation that can know how long a response is valid for.
The URL loading system does not, by default, cache to the filesystem, only in memory. This is easy to change:
NSURLCache *cache = [[NSURLCache alloc] initWithMemoryCapacity:(1024 * 1024 * 512) diskCapacity:(1024 * 1024 * 1024 * 100) diskPath:@"Cache.db"];
[NSURLCache setSharedURLCache:cache];
Or when using NSURLSession:
NSURLCache *cache = [[NSURLCache alloc] initWithMemoryCapacity:(1024 * 1024 * 512) diskCapacity:(1024 * 1024 * 1024 * 100) diskPath:@"Cache.db"];
[sessionConfiguration setURLCache:cache];
NSURLSession *session = [NSURLSession sessionWithConfiguration:sessionConfiguration];
Storing image data in Core Data is generally something to be avoided, at least with the NSSQLiteStoreType persistent store. Even small images take up a lot of room in the SQLite database and cause fragmentation, etc., that impacts performance. If you are going to store image data in Core Data, it's preferable to use external storage: either Core Data's own external records storage, or storing the image data on the filesystem and referencing it from managed objects by URL.
However, if you are using Core Data OR SDWebImage to "cache" images, you are ignoring the freshness information returned in the server's response unless you implement your own validation layer, which the URL loading system is already doing for you!
Regardless of the question semantics, I think you need some more hard information to inform your decision.
It would be useful for you to know that storing large images (larger than, say, thumbnails) in Core Data is not recommended and will lead to performance issues. Core Data has an option for large data where you can check "Store in External Record File". Alternatively, you can administer your own cached data; this way you can flexibly update the images via download on a per-need basis on each device sharing the same data. The accepted best practice is to store only the image URL in Core Data (typically relative to the application directory) and handle the file storage and display separately.
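A minimal sketch of the store-the-path-not-the-bytes practice described above (the `person` object, its `photoPath` attribute, and `imageData` are illustrative names, not from any framework):

```objectivec
// Write the image bytes to the filesystem and keep only a relative
// file name in the Core Data attribute.
NSString *docsDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                        NSUserDomainMask, YES)[0];
NSString *fileName = [[[NSUUID UUID] UUIDString] stringByAppendingPathExtension:@"jpg"];
[imageData writeToFile:[docsDir stringByAppendingPathComponent:fileName] atomically:YES];
person.photoPath = fileName; // only the relative path goes into Core Data

// When displaying:
UIImage *image = [UIImage imageWithContentsOfFile:
                  [docsDir stringByAppendingPathComponent:person.photoPath]];
```

Storing the path relative to the application directory matters because the absolute container path can change between app updates.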
I do not know SDWebImage, but this might provide some of the functionality that you need in the light of the above.
I hope this helps you make your decision about the data architecture.
My IOS App periodically collects data that is time and location stamped (from GPS). That data is sent to a server if the app is online at the time the data is collected. BUT, if a data connection is not available I want the data cached to the iPhone/iPad until a data connection is obtained. When the connection is restored the data is uploaded and the cache is cleared. The data is in the form of an array of strings of SQL. Typically the array might contain a dozen or few dozen strings--nothing too large. The data should be persistent until it is cleared even if the app or device is shut down. Suggestions?
Have you thought about using NSUserDefaults? For small amounts of data it is a simple key-value solution. For larger amounts of data, something like JSON is good: build the JSON when the data is created, and then, when the app detects a connection, parse the JSON and send it to the server.
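For the small array of SQL strings described in the question, the NSUserDefaults route could look like this (the key name is illustrative):

```objectivec
// Queue pending SQL strings in NSUserDefaults; they persist across app
// and device restarts.
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];

// Append a new entry to the cached queue when offline.
NSMutableArray *pending =
    [[defaults arrayForKey:@"pendingSQL"] mutableCopy] ?: [NSMutableArray array];
[pending addObject:sqlString];
[defaults setObject:pending forKey:@"pendingSQL"];

// When connectivity returns: upload everything, then clear the cache.
// [self uploadStatements:[defaults arrayForKey:@"pendingSQL"]];
[defaults removeObjectForKey:@"pendingSQL"];
</imports-omitted>
```

Only clear the key after the server has confirmed receipt, so a failed upload does not lose data.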
I think the best way is to work with Core Data, or to store the values in a local file.
I have to download thousands or millions of hot posts from a web service and store them locally in Core Data. The JSON response or file is about 20 or 30 MB, so downloading will take time. I guess mapping and storing it in Core Data will also take time.
Can I do this in RestKit, or has it been designed only for reasonably sized responses?
I see that I can track progress when downloading a large file, and even know when mapping starts or finishes: http://restkit.org/api/latest/Protocols/RKMapperOperationDelegate.html
I can probably also encapsulate the Core Data operation to avoid blocking the UI.
What do you think? Is this feasible, or should I take a more manual approach? I would appreciate your opinion. Thanks in advance.
Your problem is not encapsulation or threading, it's memory usage.
For a start, thousands or millions of 'hot posts' are likely to cause issues on a mobile device. You should usually use a web service that lets you obtain a filtered set of content. If you don't have one already, consider creating it (possibly by uploading the data to a service like Parse.com).
RestKit isn't designed to use a streaming parser, so the full JSON will need to be deserialised into memory before it can be processed. You can try it, but I suspect the mobile device will be unhappy if the JSON is 20 / 30 MB.
So, create a nicer web service, or use a streaming parser and process the results yourself (which could technically be done using RestKit mapping operations).
My situation is that I have a universal app that talks to an SQL database via OData. When the user retrieves data over the wire, I want to save it to the device, so that if the user stops the app or the app crashes, I can rehydrate the saved data and we will not have to re-retrieve it when the app starts again.
My question is: for this situation, is it more beneficial to use Core Data to save the data to an SQLite DB, or should I save the data to the Documents directory? The data can be serialized into an NSData object, which could be saved straight to the device from what I have read, whereas saving NSData objects to SQLite is not what it is designed for.
I'm looking for the more performant of the two options, and also the one that imposes fewer size restrictions.
Looking forward to any advice that you can give me.
Thanks in advance
If the size of the data is small enough to fit in memory with no problems, then you will probably get the best performance from serializing an NSData object.
If, however, the data reaches the point where it strains memory usage, you will want to use something like Core Data or SQLite to persist it to disk and only load into memory the objects you are using at the moment.
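For the small-data case, the NSData serialization route can be sketched as follows (the file name and the `model` object, which must conform to NSCoding, are illustrative):

```objectivec
// Serialize the in-memory state to a single file in Documents, and
// restore it on the next launch.
NSString *docsDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                        NSUserDomainMask, YES)[0];
NSString *path = [docsDir stringByAppendingPathComponent:@"state.archive"];

// Save: any object graph conforming to NSCoding can be archived.
NSData *blob = [NSKeyedArchiver archivedDataWithRootObject:model];
[blob writeToFile:path atomically:YES];

// Restore on next launch (nil if the file does not exist).
id restored = [NSKeyedUnarchiver unarchiveObjectWithData:
               [NSData dataWithContentsOfFile:path]];
```

Writing atomically ensures a crash mid-save leaves the previous archive intact rather than a truncated file.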