Using AVURLAsset on a custom NSURLProtocol - ios

I have written a custom NSURLProtocol (called "memory:") that allows me to fetch stored NSData items from an NSDictionary by name. For example, this code registers the NSURLProtocol class and adds some data:
[VPMemoryURLProtocol register];
[VPMemoryURLProtocol addData:data withName:@"video"];
This allows me to refer to the NSData via a url like "memory://video".
Below is my custom NSURLProtocol implementation:
NSMutableDictionary* gMemoryMap = nil;
@implementation VPMemoryURLProtocol
{
}
+ (void)register
{
static BOOL inited = NO;
if (!inited)
{
[NSURLProtocol registerClass:[VPMemoryURLProtocol class]];
inited = YES;
}
}
+ (void)addData:(NSData *)data withName:(NSString *)name
{
if (!gMemoryMap)
{
gMemoryMap = [NSMutableDictionary new];
}
gMemoryMap[name] = data;
}
+ (BOOL)canInitWithRequest:(NSURLRequest *)request
{
NSLog(@"URL: %@, Scheme: %@",
[[request URL] absoluteString],
[[request URL] scheme]);
NSString* theScheme = [[request URL] scheme];
return [theScheme caseInsensitiveCompare:@"memory"] == NSOrderedSame;
}
+ (NSURLRequest *)canonicalRequestForRequest:(NSURLRequest *)request
{
return request;
}
- (void)startLoading
{
NSString* name = [[self.request URL] path];
NSData* data = gMemoryMap[name];
NSURLResponse* response = [[NSURLResponse alloc] initWithURL:[self.request URL]
MIMEType:@"video/mp4"
expectedContentLength:-1
textEncodingName:nil];
id<NSURLProtocolClient> client = [self client];
[client URLProtocol:self didReceiveResponse:response
cacheStoragePolicy:NSURLCacheStorageNotAllowed];
[client URLProtocol:self didLoadData:data];
[client URLProtocolDidFinishLoading:self];
}
- (void)stopLoading
{
}
@end
I am not sure whether this code works, but that is not the problem I'm asking about. Despite registering the custom protocol, canInitWithRequest: is never called when I try to use the URL in this code:
NSURL* url = [NSURL URLWithString:@"memory://video"];
AVURLAsset* asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator* imageGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
CMTime time = CMTimeMakeWithSeconds(0, 600);
NSError* error;
CMTime actualTime;
CGImageRef image = [imageGen copyCGImageAtTime:time
actualTime:&actualTime
error:&error];
UIImage* uiImage = [UIImage imageWithCGImage:image];
CGImageRelease(image);
image is always nil if I use "memory://video" but works fine if I use "file:///...". What am I missing? Why isn't canInitWithRequest: being called? Does AVFoundation only support specific URL protocols and not custom ones?
Thanks

Certainly the underpinnings used to support only particular URL schemes. As an eBook developer I've seen this happen with any media type loaded through a URL such as epub:// or zip://. In those cases, on iOS 5.x and earlier, tracing through the relevant code would wind up in a QuickTime method that compared the URL scheme against a small number of supported ones: file, http, https, ftp, and whatever scheme it is that iTunes uses (I forget what it's called).
In iOS 6+ there is a new API in AVFoundation, however, which is designed to help here. While I've not used it personally, this is how it should work:
NSURL* url = [NSURL URLWithString:@"memory://video"];
AVURLAsset* asset = [[AVURLAsset alloc] initWithURL:url options:nil];
////////////////////////////////////////////////////////////////
// NEW CODE START
AVAssetResourceLoader* loader = [asset resourceLoader];
id<AVAssetResourceLoaderDelegate> delegate = [SomeClass newInstanceWithNSURLProtocolClass: [VPMemoryURLProtocol class]];
[loader setDelegate: delegate queue: some_dispatch_queue];
// NEW CODE END
////////////////////////////////////////////////////////////////
AVAssetImageGenerator* imageGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
CMTime time = CMTimeMakeWithSeconds(0, 600);
With this in place, you need only implement the AVAssetResourceLoaderDelegate protocol somewhere, which is very simple as it contains only one method. Since you already have an NSURLProtocol implementation, all your real work is done and you can simply hand off to the Cocoa URL loading system or to your protocol class directly.
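As a purely hypothetical sketch (the class name is invented, it assumes the iOS 7 resource-loading API with contentInformationRequest/dataRequest, and it is untested), such a delegate could serve requests straight from the same dictionary the NSURLProtocol uses:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <MobileCoreServices/MobileCoreServices.h>

extern NSMutableDictionary *gMemoryMap; // the map from the question

// Hypothetical delegate; names and details are assumptions, not tested code.
@interface MemoryResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>
@end

@implementation MemoryResourceLoaderDelegate

- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    // For memory://video, the name is the URL's host.
    NSString *name = loadingRequest.request.URL.host;
    NSData *data = gMemoryMap[name];
    if (!data)
        return NO; // not ours; AVFoundation will fail the request

    loadingRequest.contentInformationRequest.contentType = (__bridge NSString *)kUTTypeMPEG4;
    loadingRequest.contentInformationRequest.contentLength = data.length;
    loadingRequest.contentInformationRequest.byteRangeAccessSupported = YES;

    // Serve the requested byte range, clamped to the data we actually have.
    NSUInteger offset = (NSUInteger)loadingRequest.dataRequest.requestedOffset;
    NSUInteger length = MIN((NSUInteger)loadingRequest.dataRequest.requestedLength,
                            data.length - offset);
    [loadingRequest.dataRequest respondWithData:[data subdataWithRange:NSMakeRange(offset, length)]];
    [loadingRequest finishLoading];
    return YES;
}

@end
```

Note that AVAssetResourceLoader only consults its delegate for schemes AVFoundation cannot load itself, which a custom scheme like memory:// satisfies.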
Again, I'll point out that I've yet to actually make use of this, so the above is entirely theoretical.

Related

File loading doesn't work with NSURLSession via FTP

FTP download using NSURLSession is not working. I tried the code below.
Approach 1
NSURL *url_upload = [NSURL URLWithString:@"ftp://user:pwd@121.122.0.200:/usr/path/file.json"];
NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url_upload];
[request setHTTPMethod:@"PUT"];
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *docsDirURL = [NSURL fileURLWithPath:[docsDir stringByAppendingPathComponent:@"prova.zip"]];
NSURLSessionConfiguration *sessionConfig = [NSURLSessionConfiguration defaultSessionConfiguration];
sessionConfig.timeoutIntervalForRequest = 30.0;
sessionConfig.timeoutIntervalForResource = 60.0;
sessionConfig.allowsCellularAccess = YES;
sessionConfig.HTTPMaximumConnectionsPerHost = 1;
NSURLSession *upLoadSession = [NSURLSession sessionWithConfiguration:sessionConfig delegate:self delegateQueue:nil];
NSURLSessionUploadTask *uploadTask = [upLoadSession uploadTaskWithRequest:request fromFile:docsDirURL];
[uploadTask resume];
Approach 2
NSURL *url = [NSURL URLWithString:@"ftp://121.122.0.200:/usr/path/file.json"];
NSString * utente = @"xxxx";
NSString * codice = @"xxxx";
NSURLProtectionSpace * protectionSpace = [[NSURLProtectionSpace alloc] initWithHost:url.host port:[url.port integerValue] protocol:url.scheme realm:nil authenticationMethod:nil];
NSURLCredential *cred = [NSURLCredential
credentialWithUser:utente
password:codice
persistence:NSURLCredentialPersistenceForSession];
NSURLCredentialStorage * cred_storage ;
[cred_storage setCredential:cred forProtectionSpace:protectionSpace];
NSURLSessionConfiguration *sessionConfiguration = [NSURLSessionConfiguration defaultSessionConfiguration];
sessionConfiguration.URLCredentialStorage = cred_storage;
sessionConfiguration.allowsCellularAccess = YES;
NSURLSession *session = [NSURLSession sessionWithConfiguration:sessionConfiguration delegate:self delegateQueue:nil];
NSURLSessionDownloadTask *downloadTask = [session downloadTaskWithURL:url];
[downloadTask resume];
The error I get is as follows:
the requested url is not found on this server
But the same URL works in Terminal with the scp command, and the file downloads successfully.
First of all, you should consider switching from FTP to the SFTP or HTTPS protocol, since they are much more secure and address some other problems as well.
Having said that, the FTP protocol is not strictly prohibited on iOS (unlike, say, plain HTTP), and you can still use it freely. However, NSURLSession is not designed to handle FTP upload tasks out of the box, so you either have to implement a custom NSURLProtocol that adopts such requests or use other means without NSURLSession.
Either way you will have to rely on the deprecated Core Foundation API for FTP streams. First create a CFWriteStream which points to the destination url on your ftp-server like this:
CFWriteStreamRef writeStream = CFWriteStreamCreateWithFTPURL(kCFAllocatorDefault, (__bridge CFURLRef)uploadURL);
NSOutputStream *_outputStream = (__bridge_transfer NSOutputStream *)writeStream;
And specify the user's login and password in the newly created object:
[_outputStream setProperty:login forKey:(__bridge NSString *)kCFStreamPropertyFTPUserName];
[_outputStream setProperty:password forKey:(__bridge NSString *)kCFStreamPropertyFTPPassword];
Next, create an NSInputStream with the URL of the source file you want to upload (it's not strictly necessary to bind the input side to the streams API as well, but I find it consistent, since you have to deal with streams anyway):
NSInputStream *_inputStream = [NSInputStream inputStreamWithURL:fileURL];
Now the complicated part. When it comes to streams with a remote destination, you have to work with them asynchronously, but this part of the API is quite old, so it never adopted blocks or other conveniences of the modern Foundation framework. Instead you have to schedule the stream in an NSRunLoop and wait until it reports the desired status to the stream's delegate object:
_outputStream.delegate = self;
NSRunLoop *loop = NSRunLoop.currentRunLoop;
[_outputStream scheduleInRunLoop:loop forMode:NSDefaultRunLoopMode];
[_outputStream open];
Now the delegate object will be notified about any updates in the status of the stream via the stream:handleEvent: method. You should track the following statuses:
NSStreamEventOpenCompleted - the output stream has just established a connection with the destination point. Here you can open the input stream or do other preparations that become relevant shortly before writing the data to the FTP server;
NSStreamEventHasSpaceAvailable - the output stream is ready to receive the data. This is where you actually write the data to the destination;
NSStreamEventErrorOccurred - any kind of error that may occur during the data transfer / connection. Here you should halt processing the data.
Be advised that you don't want to upload a whole file in one go: first, because you may easily run into memory pressure on a mobile device, and second, because the remote end may not immediately consume every byte sent. In my implementation I'm sending the data in chunks of 32 KB:
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
switch (eventCode) {
case NSStreamEventOpenCompleted:
[_inputStream open];
return;
case NSStreamEventHasSpaceAvailable:
if (_dataBufferOffset == _dataBufferLimit) {
NSInteger bytesRead = [_inputStream read:_dataBuffer maxLength:kDataBufferSize];
switch (bytesRead) {
case -1:
[self p_cancelWithError:_inputStream.streamError];
return;
case 0:
[aStream removeFromRunLoop:NSRunLoop.currentRunLoop forMode:NSDefaultRunLoopMode];
// The work is done
return;
default:
_dataBufferOffset = 0;
_dataBufferLimit = bytesRead;
}
}
if (_dataBufferOffset != _dataBufferLimit) {
NSInteger bytesWritten = [_outputStream write:&_dataBuffer[_dataBufferOffset]
maxLength:_dataBufferLimit - _dataBufferOffset];
if (bytesWritten == -1) {
[self p_cancelWithError:_outputStream.streamError];
return;
} else {
self.dataBufferOffset += bytesWritten;
}
}
return;
case NSStreamEventErrorOccurred:
[self p_cancelWithError:_outputStream.streamError];
return;
default:
break;
}
}
At the line with // The work is done comment, the file is considered uploaded completely.
Given how complex this approach is, and that it's not really feasible to fit all parts of it in a single SO answer, I made a helper class available in the gist here.
You can use it in client code as simply as this:
NSURL *filePathURL = [NSBundle.mainBundle URLForResource:@"895971" withExtension:@"png"];
NSURL *uploadURL = [[NSURL URLWithString:@"ftp://ftp.dlptest.com"] URLByAppendingPathComponent:filePathURL.lastPathComponent];
TDWFTPUploader *uploader = [[TDWFTPUploader alloc] initWithFileURL:filePathURL
uploadURL:uploadURL
userLogin:@"dlpuser"
userPassword:@"rNrKYTX9g7z3RgJRmxWuGHbeu"];
[uploader resumeWithCallback:^(NSError *_Nullable error) {
if (error) {
NSLog(@"Error: %@", error);
} else {
NSLog(@"File uploaded successfully");
}
}];
It doesn't even need to be retained, because the class spawns a thread which retains the instance until the work is done. I didn't pay too much attention to corner cases, so feel free to let me know if it has errors or doesn't meet the required behaviour.
EDIT
For GET requests, the only difference from any other protocol is that you pass the login and password as part of the URL and cannot use any secure means to do the same. Apart from that, it works straightforwardly:
NSURLComponents *components = [NSURLComponents componentsWithString:@"ftp://121.122.0.200"];
components.path = @"/usr/path/file.json";
components.user = @"user";
components.password = @"pwd";
[[NSURLSession.sharedSession dataTaskWithURL:[components URL] completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
NSLog(@"%@", response);
}] resume];

NSURLSession - use uploadTaskWithStreamedRequest with AWS IOS SDK

I've searched for days for a way to upload an iOS asset without creating a copy of the file in the temp directory, without luck. I got the code working with a temp copy, but copying a video file that could be anywhere from 10 MB to 4 GB is not realistic.
The closest I have come to reading the asset in read-only mode is the code below. Per the Apple documentation this should work; see the following link:
https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html
I have enabled these keys:
<key>com.apple.security.assets.movies.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.music.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.pictures.read-write</key>
<string>YES</string>
<key>com.apple.security.files.downloads.read-write</key>
<string>YES</string>
Here is the code:
// QueueController.h
#import <AVFoundation/AVFoundation.h>
#import <AWSS3.h>
#import <Foundation/Foundation.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import "Reachability1.h"
#import "TransferType.h"
#import "TransferModel.h"
#import "Util.h"
@interface QueueController : NSObject<NSURLSessionDelegate>
@property(atomic, strong) NSURLSession* session;
@property(atomic, strong) NSNumber* sessionCount;
@property(atomic, strong) NSURLSessionConfiguration* configuration;
+ (QueueController*)sharedInstance;
- (void)transferMediaViaQueue:(MediaItem*)mediaItem
withTransferType:(TransferType)transferType;
@end
@implementation QueueController {
NSOperationQueue* copyQueue;
NSOperationQueue* transferQueue;
NSMutableArray* inProcessTransferArray;
NSMutableArray* pendingTransferArray;
bool isTransferring;
}
static QueueController* sharedInstance = nil;
// Get the shared instance and create it if necessary.
+ (QueueController*)sharedInstance {
@synchronized(self) {
if (sharedInstance == nil) {
sharedInstance = [[QueueController alloc] init];
}
}
return sharedInstance;
}
- (id)init {
if (self = [super init]) {
appDelegate =
(RootViewControllerAppDelegate*)[UIApplication sharedApplication]
.delegate;
copyQueue = [[NSOperationQueue alloc] init];
transferQueue = [[NSOperationQueue alloc] init];
transferQueue.maxConcurrentOperationCount = MAX_CONCURRENT_TRANSFERS;
inProcessTransferArray = [[NSMutableArray alloc] init];
pendingTransferArray = [[NSMutableArray alloc] init];
isTransferring = false;
if (self.session == nil) {
self.configuration = [NSURLSessionConfiguration
backgroundSessionConfigurationWithIdentifier:@"transferQueue"];
self.session = [NSURLSession sessionWithConfiguration:self.configuration
delegate:self
delegateQueue:transferQueue];
}
}
return self;
}
- (void)transferMediaViaQueue:(MediaItem*)mediaItem
withTransferType:(TransferType)transferType {
// Create a transfer model
NSUserDefaults* defaultUser = [NSUserDefaults standardUserDefaults];
NSString* user_id = [defaultUser valueForKey:@"UserId"];
TransferModel* transferModel = [[TransferModel alloc] init];
transferModel.mediaItem = mediaItem;
transferModel.transferType = transferType;
transferModel.s3Path = user_id;
transferModel.s3file_name = mediaItem.mediaName;
transferModel.assetURL =
[[mediaItem.mediaLocalAsset defaultRepresentation] url];
ALAssetRepresentation* mediaRep =
[mediaItem.mediaLocalAsset defaultRepresentation];
transferModel.content_type =
(__bridge_transfer NSString*)UTTypeCopyPreferredTagWithClass(
(__bridge CFStringRef)[mediaRep UTI], kUTTagClassMIMEType);
@synchronized(pendingTransferArray) {
if ((!isTransferring) &&
(transferQueue.operationCount < MAX_CONCURRENT_TRANSFERS)) {
isTransferring = true;
if (transferModel.transferType == UPLOAD) {
/**
* Read ALAsset from NSURLRequestStream
*/
NSInvocationOperation* uploadOP = [[NSInvocationOperation alloc]
initWithTarget:self
selector:@selector(uploadMediaViaLocalPath:)
object:transferModel];
[transferQueue addOperation:uploadOP];
[inProcessTransferArray addObject:transferModel];
}
} else {
// Add to pending
[pendingTransferArray addObject:transferModel];
}
}
}
- (void)uploadMediaViaLocalPath:(TransferModel*)transferModel {
@try {
/**
* Fetch readable asset
*/
NSURL* assetURL =
[[transferModel.mediaItem.mediaLocalAsset defaultRepresentation] url];
NSData* fileToUpload = [[NSData alloc] initWithContentsOfURL:assetURL];
NSURLRequest* assetAsRequest =
[NSURLRequest requestWithURL:assetURL
cachePolicy:NSURLRequestUseProtocolCachePolicy
timeoutInterval:60.0];
/**
* Fetch signed URL
*/
AWSS3GetPreSignedURLRequest* getPreSignedURLRequest =
[AWSS3GetPreSignedURLRequest new];
getPreSignedURLRequest.bucket = BUCKET_NAME;
NSString* s3Key = [NSString stringWithFormat:@"%@/%@", transferModel.s3Path, transferModel.s3file_name];
getPreSignedURLRequest.key = s3Key;
getPreSignedURLRequest.HTTPMethod = AWSHTTPMethodPUT;
getPreSignedURLRequest.expires = [NSDate dateWithTimeIntervalSinceNow:3600];
// Important: must set contentType for PUT request
// getPreSignedURLRequest.contentType = transferModel.mediaItem.mimeType;
getPreSignedURLRequest.contentType = transferModel.content_type;
NSLog(@"mimeType: %@", transferModel.content_type);
/**
* Upload the file
*/
[[[AWSS3PreSignedURLBuilder defaultS3PreSignedURLBuilder]
getPreSignedURL:getPreSignedURLRequest]
continueWithBlock:^id(BFTask* task) {
NSURLSessionUploadTask* uploadTask;
transferModel.sessionTask = uploadTask;
if (task.error) {
NSLog(#"Error: %#", task.error);
} else {
NSURL* presignedURL = task.result;
NSLog(@"upload presignedURL is: \n%@", presignedURL);
NSMutableURLRequest* request =
[NSMutableURLRequest requestWithURL:presignedURL];
request.cachePolicy = NSURLRequestReloadIgnoringLocalCacheData;
[request setHTTPMethod:@"PUT"];
[request setValue:transferModel.content_type
forHTTPHeaderField:@"Content-Type"];
uploadTask =
[self.session uploadTaskWithStreamedRequest:assetAsRequest];
[uploadTask resume];
}
return nil;
}];
} @catch (NSException* exception) {
NSLog(@"exception: %@", exception);
} @finally {
}
}
- (void)URLSession:(NSURLSession*)session
task:(NSURLSessionTask*)task
didSendBodyData:(int64_t)bytesSent
totalBytesSent:(int64_t)totalBytesSent
totalBytesExpectedToSend:(int64_t)totalBytesExpectedToSend {
// Calculate progress
double progress = (double)totalBytesSent / (double)totalBytesExpectedToSend;
NSLog(@"UploadTask progress: %lf", progress);
}
- (void)URLSession:(NSURLSession*)session
task:(NSURLSessionTask*)task
didCompleteWithError:(NSError*)error {
NSLog(@"(void)URLSession:session task:(NSURLSessionTask*)task "
@"didCompleteWithError:error called...%@",
error);
}
- (void)URLSessionDidFinishEventsForBackgroundURLSession:
(NSURLSession*)session {
NSLog(@"URLSessionDidFinishEventsForBackgroundURLSession called...");
}
// NSURLSessionDataDelegate
- (void)URLSession:(NSURLSession*)session
dataTask:(NSURLSessionDataTask*)dataTask
didReceiveResponse:(NSURLResponse*)response
completionHandler:
(void (^)(NSURLSessionResponseDisposition disposition))completionHandler {
//completionHandler(NSURLSessionResponseAllow);
}
@end
But I'm receiving this error:
(void)URLSession:session task:(NSURLSessionTask*)task didCompleteWithError:error called...Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo=0x17166f840 {NSErrorFailingURLStringKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV}
userInfo: {
NSErrorFailingURLKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSErrorFailingURLStringKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSLocalizedDescription = cancelled;
}
Thanks in advance for your help.
Regards,
-J
A couple of comments regarding using NSURLSessionUploadTask:
If you implement didReceiveResponse, you must call the completionHandler.
If you call uploadTaskWithStreamedRequest, the documentation for the request parameter warns us that:
The body stream and body data in this request object are ignored, and NSURLSession calls its delegate’s URLSession:task:needNewBodyStream: method to provide the body data.
So you must implement needNewBodyStream if implementing an NSInputStream-based request.
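A minimal sketch of that delegate method, assuming the body comes from a local file URL (self.uploadFileURL is an invented property here, and this is untested):

```objectivec
// Hypothetical: supply a fresh body stream whenever the session asks for one.
// Assumes self.uploadFileURL points at a local file copy of the upload body.
- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
 needNewBodyStream:(void (^)(NSInputStream *bodyStream))completionHandler
{
    completionHandler([NSInputStream inputStreamWithURL:self.uploadFileURL]);
}
```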
Be forewarned that using a stream-based request like this creates a request with a "chunked" transfer encoding, and not all servers can handle that.
At one point in the code, you appear to load the contents of the asset into an NSData. If you have assets that large, you cannot reasonably load them into an NSData object. Besides, that's inconsistent with using uploadTaskWithStreamedRequest.
You either need to create an NSInputStream or upload from a file.
You appear to be using the asset URL for the NSURLRequest. That URL should be the URL for your web service.
When using image picker, you have access to two URL keys: the media URL (a file:// URL for movies, but not pictures) and the assets library reference URL (an assets-library:// URL). If you're using the media URL, you can use that for uploading movies. But you cannot use the assets library reference URL for uploading purposes. You can only use that in conjunction with ALAssetsLibrary.
ALAssetPropertyURL is purely a URL identifier for the asset, i.e. it identifies assets and asset groups, and I don't think you can use it directly to upload to a service.
You could use AVAssetExportSession to export the asset to a temp URL if the other methods are tedious, i.e.:
[AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil] presetName:AVAssetExportPresetPassthrough];
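For completeness, a hypothetical sketch of actually driving that export and then uploading from the resulting file (tempURL and the output file type are assumptions, untested):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Assumption: tempURL is a writable file URL you choose (e.g. in NSTemporaryDirectory()).
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil]
                                      presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = tempURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie; // assumed; match your asset
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // Now upload from exportSession.outputURL, e.g. with uploadTaskWithRequest:fromFile:
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];
```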

XCDYouTubeVideoPlayer opens and close

I've started working with XCDYouTubeVideoPlayer. It does seem to have some small issues. When I use the method below to call the player, it opens and then closes right away.
I've imported the following frameworks:
MediaPlayer
AVFoundation
and added XCDYouTubeVideoPlayerViewController.h and .m.
In the viewController.m I've added this method:
- (IBAction) play:(id)sender
{
[self.view endEditing:YES];
NSString *link = @"m01MYOpbdIk";
XCDYouTubeVideoPlayerViewController *videoPlayerViewController = [[XCDYouTubeVideoPlayerViewController alloc] initWithVideoIdentifier:link];
[self presentMoviePlayerViewControllerAnimated:videoPlayerViewController];
}
At the moment it's opening and closing the XCDYouTubeVideoPlayer right away. What am I doing wrong?
I don't understand why, but for me, on iOS 8, line 79 of XCDYouTubeVideoPlayerViewController.m was calling the method:
- (id) initWithContentURL:(NSURL *)contentURL
{
@throw [NSException exceptionWithName:NSGenericException reason:@"Use the `initWithVideoIdentifier:` method instead." userInfo:nil];
}
I just commented it out and it works!
The XCDYouTubeVideoPlayer library builds the YouTube "streaming link" by appending the provided signature to the URL. It seems that the "url" now carries the signature already and the "sig" key comes back null, which thereby negates the if statement in the function
(NSURL *) videoURLWithData:(NSData *)data error:(NSError * __autoreleasing *)error
Go to the file XCDYouTubeVideoPlayerViewController.m at line 248 and
you will see a for-loop
for (NSString *streamQuery in streamQueries)
{
NSDictionary *stream = DictionaryWithQueryString(streamQuery, queryEncoding);
NSString *type = stream[@"type"];
NSString *urlString = stream[@"url"];
NSString *signature = stream[@"sig"];
if (urlString && signature && [AVURLAsset isPlayableExtendedMIMEType:type])
{
NSURL *streamURL = [NSURL URLWithString:[NSString stringWithFormat:@"%@&signature=%@", urlString, signature]];
streamURLs[@([stream[@"itag"] integerValue])] = streamURL;
}
}
Change it to this (remove the signature variable from the if statement and modify the URLWithString call):
for (NSString *streamQuery in streamQueries)
{
NSDictionary *stream = DictionaryWithQueryString(streamQuery, queryEncoding);
NSString *type = stream[@"type"];
NSString *urlString = stream[@"url"];
if (urlString && [AVURLAsset isPlayableExtendedMIMEType:type])
{
NSURL *streamURL = [NSURL URLWithString:urlString];
streamURLs[@([stream[@"itag"] integerValue])] = streamURL;
}
}

Downloading data and processing asynchronous with NSURLSession

I am using NSURLSession to download XML files, and then I want to do different processing on these files, like parsing them:
-(void)parseFeed:(NSURL *)url
{
NSMutableURLRequest* request = [NSMutableURLRequest requestWithURL:url];
NSURLSessionDataTask* task = [FeedSessionManager.sharedManager.session dataTaskWithRequest:request completionHandler:^(NSData* data, NSURLResponse* response, NSError* error)
{
Parser* parser = [[Parser alloc] initWithData:data];
[self.feeds addObjectsFromArray:[parser items]];
}];
[task resume];
}
The Parser object parses the XML file using NSXMLParser. parseFeed:(NSURL*)url is called from the view controller:
Downloader* downloader = [[Downloader alloc] init];
[downloader parseFeed:[NSURL URLWithString:@"http://www.engadget.com/tag/features/rss.xml"]];
NSArray* items = [downloader feeds];
And this is how I create the NSURLSession object:
-(id)init
{
if(self = [super init])
{
_session = [NSURLSession sessionWithConfiguration:FeedSessionConfiguration.defaultSessionConfiguration delegate:self delegateQueue:nil];
}
return self;
}
Of course this approach doesn't work for me. Inside parseFeed method I want to wait until all data is downloaded and processed. Only then I want to access the self.feeds array in the ViewController.
Can someone point me in the right direction for doing this? Or maybe suggest a different approach?
I have used ASIHTTPRequest, but it is no longer maintained; you can use AFHTTPClient's operation queue instead:
AFHTTPClient *client = [[AFHTTPClient alloc] initWithBaseURL:nil];
// Important if only downloading one file at a time
[client.operationQueue setMaxConcurrentOperationCount: 1];
NSArray *videoURLs; // An array of strings you want to download
for (NSString * videoURL in videoURLs) {
// …setup your requests as before
[client enqueueHTTPRequestOperation:downloadRequest];
}
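Alternatively, sticking with NSURLSession: an untested sketch of the asker's parseFeed: reworked to take a completion block (it reuses the poster's FeedSessionManager and Parser classes), so the view controller reads feeds only after parsing finishes:

```objectivec
// Hypothetical variant of the poster's method with a completion block.
- (void)parseFeed:(NSURL *)url completion:(void (^)(NSArray *items))completion
{
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    NSURLSessionDataTask *task =
        [FeedSessionManager.sharedManager.session dataTaskWithRequest:request
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error)
    {
        Parser *parser = [[Parser alloc] initWithData:data];
        [self.feeds addObjectsFromArray:[parser items]];
        // Hop to the main queue before touching UI or shared state.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) completion(self.feeds);
        });
    }];
    [task resume];
}
```

The caller then accesses the feeds inside the block instead of immediately after the call.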

Cached images in UIWebView take longer to load? (NSURLProtocol?)

I'm serving up local images to my UIWebView via NSURLProtocol (which means the image is returned almost immediately), but I'm experiencing an issue where cached images (images being displayed again after their first load) take longer to load. Is there something in my NSURLProtocol causing this?
@implementation URLProtocol
+ (BOOL) canInitWithRequest:(NSURLRequest *)request {
return [request.URL.scheme isEqualToString:@"file"] ||
[request.URL.scheme isEqualToString:@"http"];
}
+ (NSURLRequest*) canonicalRequestForRequest:(NSURLRequest *)request {
return request;
}
- (void) startLoading {
id<NSURLProtocolClient> client = self.client;
NSURLRequest* request = self.request;
NSString *fileToLoad = request.URL.absoluteString;
NSURLResponse *response;
if([fileToLoad hasPrefix:@"http://app-fullpath/"]){
fileToLoad = [fileToLoad stringByReplacingOccurrencesOfString:@"http://app-fullpath/" withString:@""];
} else {
fileToLoad = [[NSURL URLWithString:fileToLoad] path];
}
NSData* data = [NSData dataWithContentsOfFile:fileToLoad];
response = [[NSHTTPURLResponse alloc] initWithURL:[request URL] statusCode:200 HTTPVersion:@"HTTP/1.1" headerFields:[NSDictionary dictionary]];
[client URLProtocol:self didReceiveResponse:response cacheStoragePolicy:NSURLCacheStorageNotAllowed];
[client URLProtocol:self didLoadData:data];
[client URLProtocolDidFinishLoading:self];
}
- (void) stopLoading { }
@end
Any speed suggestions, JavaScript/HTML or iOS?
My problem was that UIWebView gives text a much higher priority than images, so text is laid out first and then images are processed. To fix that, I created a DOM representation of my HTML and images, then replaced all images with images loaded via JavaScript (new Image()), and they show instantly.
