Scenario:
As a user I am able to take (an unlimited number of) photos and videos, which are stored in the app's Documents folder. Each of these media files gets a record in a SQLite database with additional information (for example a caption). All of this works completely offline.
Back online, I get a dialog with a list of all the videos and photos I took and a button that starts an upload process.
The files are uploaded one after the other, each together with its metadata, by making a multipart POST request to the server. The server's response is stored alongside the metadata in the SQLite database (so there is no fire-and-forget).
Reliable solutions?
If I am reading and understanding this chart correctly, the simplest solution would be to wrap each of these uploads in a Task. Side effect: after 10 minutes every task would be cancelled, which becomes a problem with a slow connection or very large files (for example a very long video).
The recommended way would be to use NSUrlSession/Background transfer service.
Which leads me to my question:
Is it possible to wrap multipart POSTs in NSURLSessionDataTasks, and would this be reliable even if the task runs longer than 10 minutes or the user suspends the app?
As I am a Xamarin/C# guy I would really appreciate some sample snippets for a working multipart upload, even if it's in Objective-C ;-).
Almost and... yes.
Background Transfer service works with NSUrlSessionDownloadTasks and NSUrlSessionUploadTasks only. Not NSUrlSessionDataTasks, as described here.
Other than this "basic" limitation, it's safe to use background transfer service with upload tasks.
The 10-minute free pass in the background no longer applies on iOS 7 (basically it's still there, but it works differently); however, with NSURLSession and the background transfer service you do not need it.
I have a blog post here about the background transfer service, based on download tasks.
An important thing to note is that starting a task basically means it will actually start at some point and finish at some other point. When that happens depends on whether the device is on cellular or Wi-Fi, and on other factors that are (probably) known only to iOS (and Apple).
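To make that concrete: a background session can only upload from a file, so the usual pattern is to serialize the multipart body to disk and hand it to uploadTaskWithRequest:fromFile:. Below is a rough Objective-C sketch of the idea; the endpoint URL, the field names and the caption parameter are placeholders, and for very long videos you would stream the file into the body file rather than building it in memory.

    // Build the multipart body in a temporary file, then hand it to a background
    // upload task. URL, field names and metadata are placeholders for illustration.
    - (void)enqueueUploadForFileAtURL:(NSURL *)fileURL caption:(NSString *)caption
    {
        NSString *boundary = [[NSUUID UUID] UUIDString];

        NSMutableData *body = [NSMutableData data];
        void (^append)(NSString *) = ^(NSString *s) {
            [body appendData:[s dataUsingEncoding:NSUTF8StringEncoding]];
        };

        // Metadata part (the caption from the SQLite record).
        append([NSString stringWithFormat:@"--%@\r\n", boundary]);
        append(@"Content-Disposition: form-data; name=\"caption\"\r\n\r\n");
        append([NSString stringWithFormat:@"%@\r\n", caption]);

        // File part. For huge videos, write the body file incrementally instead
        // of holding the whole file in an NSMutableData.
        append([NSString stringWithFormat:@"--%@\r\n", boundary]);
        append([NSString stringWithFormat:
            @"Content-Disposition: form-data; name=\"file\"; filename=\"%@\"\r\n",
            fileURL.lastPathComponent]);
        append(@"Content-Type: application/octet-stream\r\n\r\n");
        [body appendData:[NSData dataWithContentsOfURL:fileURL]];
        append([NSString stringWithFormat:@"\r\n--%@--\r\n", boundary]);

        // Background sessions can only upload from a file, so write the body to disk.
        NSURL *bodyURL = [[NSURL fileURLWithPath:NSTemporaryDirectory()]
                          URLByAppendingPathComponent:[[NSUUID UUID] UUIDString]];
        [body writeToURL:bodyURL atomically:YES];

        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
                                        [NSURL URLWithString:@"https://example.com/upload"]];
        request.HTTPMethod = @"POST";
        [request setValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary]
       forHTTPHeaderField:@"Content-Type"];

        // iOS 8+ name; on iOS 7 the factory method was +backgroundSessionConfiguration:.
        NSURLSessionConfiguration *config = [NSURLSessionConfiguration
            backgroundSessionConfigurationWithIdentifier:@"com.example.uploads"];
        // `self` implements the session delegate, so it receives
        // URLSession:task:didCompleteWithError: when the upload finishes.
        NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                              delegate:self
                                                         delegateQueue:nil];

        [[session uploadTaskWithRequest:request fromFile:bodyURL] resume];
    }

The server's response then arrives through the session delegate, where you can store it next to the metadata in the SQLite database, just as in the foreground case.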
Related
I have to load a lot of data onto an iOS device with 128 GB of storage, for use by my app. The data is around 2,000 files of around 40 MB each, for a total of around 80-100 GB.
I control the iOS device and the load machine/program and the local network they're on, and time is not critically important (if it takes a week to load, that's OK). I can format the data as required to facilitate the load.
I've done some iOS programming, but I'm not sure where to start looking for a solution to this. If you can outline the broad approach to use and which iOS docs to read up on, that's all I need.
Hoping for a solution where I can format the data and write the program, plug the iPad in to the Mac and say 'start loading' and come back when it's done.
We discovered that it is possible to load data one device at a time using iTunes, but that isn't a good answer for us.
We added a 'load data' button to the application. When triggered, it loads a configuration file from a hard-coded local network address, then retrieves all the data files listed in that configuration file with REST GETs from the local server, storing them in the app's /Documents directory.
This is a good approach for us; it is secure, allows multiple devices to load data at the same time, and doesn't need any manual file loading: start the app, press 'load', and wait for it to finish.
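In case it helps anyone building the same thing, a minimal Objective-C sketch of that flow might look like the code below. The server address and the JSON layout of the configuration file are made-up examples rather than what we actually use, and for thousands of large files you would start the downloads in smaller batches instead of all at once.

    // Fetch a configuration file that lists the data files, then GET each file
    // from the local server into the app's Documents directory.
    // Server address and JSON format are placeholders for illustration.
    - (void)loadDataFromLocalServer
    {
        NSURL *configURL = [NSURL URLWithString:@"http://192.168.1.10/config.json"];
        NSURLSession *session = [NSURLSession sharedSession];

        [[session dataTaskWithURL:configURL completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error != nil) { return; }

            // Assume the configuration file is a JSON array of file names.
            NSArray *fileNames = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
            NSURL *documentsURL = [[[NSFileManager defaultManager]
                URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] firstObject];

            for (NSString *name in fileNames) {
                NSURL *fileURL = [[NSURL URLWithString:@"http://192.168.1.10/files/"]
                                  URLByAppendingPathComponent:name];
                [[session downloadTaskWithURL:fileURL completionHandler:^(NSURL *location, NSURLResponse *resp, NSError *err) {
                    if (err != nil) { return; }
                    // Move the temporary download into /Documents under its own name.
                    NSURL *destination = [documentsURL URLByAppendingPathComponent:name];
                    [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:NULL];
                }] resume];
            }
        }] resume];
    }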
I have successfully implemented sync logic to sync my local data (text-related data) with the web server (I was referring to this link), and that is working.
But I have to sync images too. Here is the way I want to do that:
The user will create a product (title, 5 images (1 compulsory, 4 optional), description).
Now, I could follow the same approach as above, but I'm confused about the image uploading. It may not be feasible to upload everything at the same time, and if the data is saved on the server but the connection drops before the images are uploaded, there is a chance other users won't see the images.
Can anyone suggest the best approach? Please guide me.
Thanks,
The challenge here is that you are dealing with a long transfer time, due to a large chunk of data, am I right? If that's the case, then I would recommend the protocol below (a rough code sketch follows the steps). I would also use the same protocol for the small data syncs, since there is a risk that those are interrupted too.
1. Set a flag on the local device that a sync is in progress.
2. Upload to a temporary location on the server.
3. Once the upload is complete, call a web service on the server to notify it that the transfer has completed. A web service is preferable, since it lets you receive a response once it has finished. The web service moves the files to their final destination.
4. When you receive the response, update the flag set in step 1; you then know that the files are where they should be on the server.
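A rough Objective-C sketch of those four steps, assuming a hypothetical temporary-upload endpoint, a hypothetical commit web service and an NSUserDefaults flag (your own persistence and URLs will differ):

    // Sketch of the four steps above. Endpoints and flag storage are placeholders.
    - (void)syncImageAtURL:(NSURL *)imageURL forProductId:(NSString *)productId
    {
        // 1. Flag that a sync is in progress for this product.
        NSString *flagKey = [NSString stringWithFormat:@"syncInProgress_%@", productId];
        [[NSUserDefaults standardUserDefaults] setBool:YES forKey:flagKey];

        // 2. Upload the image to a temporary location on the server.
        NSMutableURLRequest *upload = [NSMutableURLRequest requestWithURL:
            [NSURL URLWithString:@"https://example.com/uploads/temp"]];
        upload.HTTPMethod = @"POST";

        NSURLSession *session = [NSURLSession sharedSession];
        [[session uploadTaskWithRequest:upload fromFile:imageURL
                      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error != nil) { return; } // retry later; the flag stays set

            // 3. Notify the server that the transfer completed, so it can move
            //    the file to its final destination.
            NSURL *commitURL = [NSURL URLWithString:
                [NSString stringWithFormat:@"https://example.com/products/%@/commit", productId]];
            NSMutableURLRequest *commit = [NSMutableURLRequest requestWithURL:commitURL];
            commit.HTTPMethod = @"POST";

            [[session dataTaskWithRequest:commit completionHandler:^(NSData *d, NSURLResponse *r, NSError *e) {
                if (e != nil) { return; }
                // 4. Only clear the flag once the server confirms the move.
                [[NSUserDefaults standardUserDefaults] removeObjectForKey:flagKey];
            }] resume];
        }] resume];
    }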
I have an app on the App Store that uses AFNetworking 2.x to download large files in the background with NSURLSession-based downloads, because the user will often put the app in the background (it gets terminated after a while of course, but the downloads finish all the same. Wonderful). This app is working well. Usually users are only downloading a few files at a time.
Now I need to make another, similar app, but this time instead of a few large files, it is very likely that the user will want to download a large number of smallish files: for example, 500 files of 1-5 MB each. Again, the app will often be put in the background, so I want to stay with NSURLSessionDownloadTask unless there's a really good reason not to.
My question is: can I simply create 500 NSURLSessionDownloadTasks all at once? Does AFNetworking do some clever throttling so as not to overload the system? Or does iOS do it? Or does nothing do it, and I have to painfully track and organize the state of transfers across restarts of my app (i.e. because it gets put in the background and eventually terminated)?
If anyone knows the limits of how many NSURLSessionDownloadTasks you can create reliably simultaneously, that would be awesome...
thanks!
p.s. I greatly prefer obj-c to swift, thx :)
Last I checked (haven't looked at the iOS 9 betas), task creation was unexpectedly expensive and also superlinear. On my test runs:
50 tasks -> ~1.5s
200 tasks -> ~11.5s
500 tasks -> ~55s
Since my file count was often a five-digit number, scheduling everything at once wasn't a solution for me. My approach (which isn't in production yet; I stopped working on the feature in favour of other things) combines persistence with NSURLSessionDownloadTask and uses the session identifiers to work out which logical download a particular file belongs to. Further downloads are scheduled from one of the delegate callbacks, depending on whether I'm in the normal lifecycle or coming from -application:handleEventsForBackgroundURLSession:completionHandler: (debugging this situation can get painful; NSUserDefaults is your friend). The theory seems sound, and I can see that tasks do get scheduled, but I'm currently stuck getting the iOS downloader daemon to conform to my will.
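For illustration, here is a condensed sketch of that top-up pattern: keep a small window of tasks in flight and schedule more from the session delegate as tasks finish. The batch size, the session identifier and the in-memory pendingURLs array are placeholders; real code would persist the pending list and rebuild its in-flight count with getTasksWithCompletionHandler: after a relaunch.

    // Keep a small window of download tasks in flight and top it up from the
    // session delegate, instead of creating thousands of tasks at once.
    static const NSInteger kMaxTasksInFlight = 20; // placeholder batch size

    @interface DownloadScheduler : NSObject <NSURLSessionDownloadDelegate>
    @property (nonatomic, strong) NSURLSession *session;
    @property (nonatomic, strong) NSMutableArray<NSURL *> *pendingURLs; // persist this between launches
    @property (nonatomic, assign) NSInteger tasksInFlight;
    @end

    @implementation DownloadScheduler

    - (instancetype)initWithURLs:(NSArray<NSURL *> *)urls
    {
        if ((self = [super init])) {
            _pendingURLs = [urls mutableCopy];
            NSURLSessionConfiguration *config = [NSURLSessionConfiguration
                backgroundSessionConfigurationWithIdentifier:@"com.example.bulk-downloads"];
            _session = [NSURLSession sessionWithConfiguration:config delegate:self delegateQueue:nil];
        }
        return self;
    }

    - (void)topUp
    {
        while (self.tasksInFlight < kMaxTasksInFlight && self.pendingURLs.count > 0) {
            NSURL *url = self.pendingURLs.firstObject;
            [self.pendingURLs removeObjectAtIndex:0];
            [[self.session downloadTaskWithURL:url] resume];
            self.tasksInFlight++;
        }
    }

    - (void)URLSession:(NSURLSession *)session
          downloadTask:(NSURLSessionDownloadTask *)downloadTask
    didFinishDownloadingToURL:(NSURL *)location
    {
        // Move `location` somewhere permanent and record it in your persistent store.
    }

    - (void)URLSession:(NSURLSession *)session
                  task:(NSURLSessionTask *)task
    didCompleteWithError:(NSError *)error
    {
        self.tasksInFlight--;
        [self topUp]; // schedule the next download(s) as each one finishes
    }

    @end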
If the server-side zip as suggested by Benjamin Jimenez is an option for you, do yourself a favour and use that instead.
The Apple staff member "eskimo" on the Apple Developer Forums helped me find the answer, which you can see in this forum post:
https://forums.developer.apple.com/thread/11621
Pasting here the relevant parts:
(me) I've read through this thread and the one you linked to here (https://devforums.apple.com/message/938057#938057), and I have a question about best practices for downloading 10,000-20,000 files via NSURLSessionDownloadTasks. (Disclaimer: I'm using AFNetworking 2.x.) I'm targeting iOS 8 and newer, so answers do not have to work on iOS 7. How can we compute a reasonable batch (group) size? I understand the resume-rate limiter means one wants the batch size to be higher, but there's an unknown maximum number of simultaneous task requests that will crash the daemon.
(me) My assumption here is that when the user opens my app and it runs for some time in the foreground, the rate limiter is "reset" or similar, so things will flow nicely again. Is this assumption correct?
(eskimo) Yes. Also, starting with iOS 8, if the user brings your app to the front then iOS will automatically give tasks a 'kick'. I've forgotten the exact mechanics of this, but I'm pretty sure it's covered in WWDC 2014 Session 707, "What's New in Foundation Networking".
Problem statement:
I want to upload a large binary (such as an audio clip) from an iOS app to S3, and I'd like to make the app's handling of disconnects (or low connectivity) as robust as possible, preferably by uploading the binary as a series of chunks.
Unfortunately, neither the AWSiOS SDK nor ASI's S3 framework seems to support multipart uploads or indicate that they plan to add support. I realize that I can initiate a 'longish' upload using beginBackgroundTaskWithExpirationHandler:, and that'll give me a window of time to complete the upload (currently 600 seconds, I believe), but what's to be done if I'm not in a position to complete the upload within that timeframe?
Aside from worrying about completing tasks within that time frame, is there a 'best practice' for how an app should resume uploads, or even just break a larger upload into smaller chunks?
I've thought about writing a library to talk to S3's REST API specifically for multipart uploads, but this seems like a problem others have either already solved or realized needn't be solved (perhaps because it's completely inappropriate for the platform).
Another (overly complicated) solution would be chunking the file on the device, uploading the chunks to S3 (or elsewhere), and having them reassembled on S3 by a server process. This seems even more unpalatable than rolling my own library for multipart uploads.
How are others handling this problem?
Apparently I was looking at some badly out of date documentation.
In AmazonS3Client, see:
- (S3MultipartUpload *)initiateMultipartUploadWithKey:(NSString *)theKey withBucket:(NSString *)theBucket
This will give you an S3MultipartUpload, which contains an uploadId.
You can then put together an S3UploadPartRequest using initWithMultipartUpload:(S3MultipartUpload *)multipartUpload and send it as you usually would.
S3UploadPartRequest contains an int property, partNumber, where you specify which part you're uploading.
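Pieced together from those calls, a rough sketch of chunking a local file into parts might look like this. Only initiateMultipartUploadWithKey:withBucket:, S3UploadPartRequest, initWithMultipartUpload: and partNumber come from the answer above; the data property, the uploadPart: send method and the completion step are assumptions about the v1 SDK, so check them against the AmazonS3Client headers you actually have.

    // Rough, unverified sketch of a chunked upload with the v1 AWS SDK for iOS.
    - (void)uploadFileAtPath:(NSString *)path
                    toBucket:(NSString *)bucket
                         key:(NSString *)key
                  withClient:(AmazonS3Client *)s3
    {
        S3MultipartUpload *upload = [s3 initiateMultipartUploadWithKey:key withBucket:bucket];

        NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
        NSData *chunk;
        int partNumber = 1;

        // S3 requires every part except the last one to be at least 5 MB.
        while ((chunk = [handle readDataOfLength:5 * 1024 * 1024]).length > 0) {
            S3UploadPartRequest *part = [[S3UploadPartRequest alloc] initWithMultipartUpload:upload];
            part.partNumber = partNumber++;
            part.data = chunk;          // assumed property name
            [s3 uploadPart:part];       // assumed send method; retry just this part on failure
        }
        [handle closeFile];

        // After all parts succeed, send the SDK's "complete multipart upload"
        // request (S3CompleteMultipartUploadRequest in the v1 SDK, which also
        // needs each part's ETag) so S3 assembles the final object.
    }

The appeal of this approach for flaky connectivity is that a dropped connection only costs you the part that was in flight, not the whole upload.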
You can write some code to do this yourself; refer to the code at http://dextercoder.blogspot.in/2012/02/multipart-upload-to-amazon-s3-in-three.html. It is core Java code, but the steps can be used for iOS.
My app needs to encode a large amount of audio data to an M4A file. I am currently using AVAssetWriter, which works fine, except that it takes a few minutes to encode all the data.
Instead of asking the user to keep the app running until the process has finished, I would like to pause the encoding when the app terminates and continue on relaunch.
Unfortunately, AVAssetWriter doesn't seem to support this, as it always creates a new file when initializing.
Do you know any other APIs that I could use? Maybe a third-party library?
This is exactly what background processing is intended for. As long as you can complete within 10 minutes, you can use beginBackgroundTaskWithExpirationHandler: to ask the system to let you keep running after the user switches applications. See Completing a Finite-Length Task in the Background in the iOS Application Programming Guide. Not only is this the easiest to use, but it'll give the best user experience.
The only issue you'd face is if it's possible for an audio file to take longer than 10 minutes in a non-resumable way. If that's a real possibility, then you'll need another solution.
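For reference, the finite-length task pattern looks roughly like this; encodeAudio and cancelEncoding are placeholders standing in for the actual AVAssetWriter work.

    // Wrap the encoding in a finite-length background task so it can keep running
    // for a while after the user switches apps.
    - (void)encodeInBackground
    {
        UIApplication *app = [UIApplication sharedApplication];
        __block UIBackgroundTaskIdentifier taskId =
            [app beginBackgroundTaskWithExpirationHandler:^{
                // Time is about to run out: stop cleanly and remember how far you got.
                [self cancelEncoding];
                [app endBackgroundTask:taskId];
                taskId = UIBackgroundTaskInvalid;
            }];

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self encodeAudio];               // the long-running AVAssetWriter work
            [app endBackgroundTask:taskId];   // always end the task when finished
            taskId = UIBackgroundTaskInvalid;
        });
    }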