YouTube Live API Stream Status and Quality Callback - youtube-api

In the "Live Control Room" of a YouTube Live broadcast, I can see a "Stream Status" view which shows me details of the video being sent to YouTube's RTMP endpoint.
I hit the liveStreams endpoint to get the "status" of the stream, but that only returns active, meaning the video stream is being successfully sent to YouTube's RTMP endpoint; it includes no information about the video data or quality.
Is this information exposed somewhere in the API? Can I also see additional details about the video being sent to YouTube, such as the bitrate, fps, etc., so I can verify my encoder is working correctly? Or does that check need to be done on the client side, inspecting the video right after it leaves the encoder and before it hits the RTMP endpoint? I'm writing an iOS application, so using the "Live Control Room" on the web isn't a viable solution for me.
Here's what I'm doing on the broadcasting side to check the liveStream status:
- (void)checkStreamStatus {
    [self getRequestWithURL:[NSString stringWithFormat:@"https://www.googleapis.com/youtube/v3/liveStreams?part=id,snippet,cdn,status&id=%@", self.liveStreamId] andBlock:^(NSDictionary *responseDict) {
        NSLog(@"response: %@", responseDict);
        // if stream is active, youtube is receiving data from our encoder
        // ready to transition to live
        NSArray *items = [responseDict objectForKey:@"items"];
        NSDictionary *itemsDict = [items firstObject];
        NSDictionary *statusDict = [itemsDict objectForKey:@"status"];
        if ([[statusDict objectForKey:@"streamStatus"] isEqualToString:@"active"]) {
            NSLog(@"stream ready to go live!");
            if (!userIsLive) {
                [self goLive]; // transition the broadcastStatus from "testing" to "live"
            }
        } else {
            NSLog(@"keep refreshing, broadcast object not ready on youtube's end");
        }
    }];
}
getRequestWithURL is just a generic method I created to do GET requests:
- (void)getRequestWithURL:(NSString *)urlStr andBlock:(void (^)(NSDictionary *responseDict))completion {
    NSURL *url = [NSURL URLWithString:urlStr];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    [request addValue:[NSString stringWithFormat:@"Bearer %@", [[NSUserDefaults standardUserDefaults] objectForKey:@"accessToken"]] forHTTPHeaderField:@"Authorization"];
    [request setHTTPMethod:@"GET"];
    // Set the content type
    [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
    [NSURLConnection sendAsynchronousRequest:request queue:[[NSOperationQueue alloc] init] completionHandler:^(NSURLResponse *response, NSData *data, NSError *connectionError) {
        [self parseJSONwithData:data andBlock:completion];
    }];
}
- (void)parseJSONwithData:(NSData *)data andBlock:(void (^)(NSDictionary *responseDict))completion {
    NSError *error = nil;
    NSDictionary *responseDict = [NSJSONSerialization JSONObjectWithData:data
                                                                 options:kNilOptions
                                                                   error:&error];
    if (error) {
        NSLog(@"error: %@", [error localizedDescription]);
    }
    completion(responseDict);
}
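Note that NSURLConnection's sendAsynchronousRequest: has since been deprecated; an equivalent helper using NSURLSession would look roughly like this (a sketch, reusing the same parseJSONwithData:andBlock: method):
- (void)getRequestWithURL:(NSString *)urlStr andBlock:(void (^)(NSDictionary *responseDict))completion {
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:urlStr]];
    [request setValue:[NSString stringWithFormat:@"Bearer %@", [[NSUserDefaults standardUserDefaults] objectForKey:@"accessToken"]] forHTTPHeaderField:@"Authorization"];
    NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithRequest:request completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        // Same JSON parsing helper as above.
        [self parseJSONwithData:data andBlock:completion];
    }];
    [task resume];
}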
Here's what I'm doing on the consumer side to check the video quality:
I am using the YTPlayerView library from Google.
- (void)notifyDelegateOfYouTubeCallbackUrl:(NSURL *)url {
    NSString *action = url.host;
    // We know the query can only be of the format http://ytplayer?data=SOMEVALUE,
    // so we parse out the value.
    NSString *query = url.query;
    NSString *data;
    if (query) {
        data = [query componentsSeparatedByString:@"="][1]; // data here is "auto", meaning auto quality
    }
    ...
    if ([action isEqual:kYTPlayerCallbackOnPlaybackQualityChange]) {
        if ([self.delegate respondsToSelector:@selector(playerView:didChangeToQuality:)]) {
            YTPlaybackQuality quality = [YTPlayerView playbackQualityForString:data];
            [self.delegate playerView:self didChangeToQuality:quality];
        }
        ...
    }
But the quality "auto" doesn't seem to be a supported quality constant in this library:
// Constants representing playback quality.
static NSString *const kYTPlaybackQualitySmallQuality = @"small";
static NSString *const kYTPlaybackQualityMediumQuality = @"medium";
static NSString *const kYTPlaybackQualityLargeQuality = @"large";
static NSString *const kYTPlaybackQualityHD720Quality = @"hd720";
static NSString *const kYTPlaybackQualityHD1080Quality = @"hd1080";
static NSString *const kYTPlaybackQualityHighResQuality = @"highres";
static NSString *const kYTPlaybackQualityUnknownQuality = @"unknown";
...
@implementation YTPlayerView
...
/**
 * Convert a quality value from NSString to the typed enum value.
 *
 * @param qualityString A string representing playback quality. Ex: "small", "medium", "hd1080".
 * @return An enum value representing the playback quality.
 */
+ (YTPlaybackQuality)playbackQualityForString:(NSString *)qualityString {
    YTPlaybackQuality quality = kYTPlaybackQualityUnknown;
    if ([qualityString isEqualToString:kYTPlaybackQualitySmallQuality]) {
        quality = kYTPlaybackQualitySmall;
    } else if ([qualityString isEqualToString:kYTPlaybackQualityMediumQuality]) {
        quality = kYTPlaybackQualityMedium;
    } else if ([qualityString isEqualToString:kYTPlaybackQualityLargeQuality]) {
        quality = kYTPlaybackQualityLarge;
    } else if ([qualityString isEqualToString:kYTPlaybackQualityHD720Quality]) {
        quality = kYTPlaybackQualityHD720;
    } else if ([qualityString isEqualToString:kYTPlaybackQualityHD1080Quality]) {
        quality = kYTPlaybackQualityHD1080;
    } else if ([qualityString isEqualToString:kYTPlaybackQualityHighResQuality]) {
        quality = kYTPlaybackQualityHighRes;
    }
    return quality;
}
I created an issue for this on the project's GitHub page.

I received a reply from Ibrahim Ulukaya about this issue:
We are hoping to have more information to that call, but basically active indicates the good streaming, and your streaming info is https://developers.google.com/youtube/v3/live/docs/liveStreams#cdn.format where you set, and can see the format.
So the answer, for the time being, is no: this information is not available from the YouTube Live Streaming API. I will update this answer if/when the API is updated.

It seems the YouTube Live Streaming API has since been updated to expose stream health via the status.healthStatus.status property.
See their latest API reference for more info.
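Reading the health status is just one more key path in the same liveStreams response used in checkStreamStatus above; a sketch (the part=status request shown earlier already returns this object):
// Inside the completion block of checkStreamStatus:
NSDictionary *statusDict = [[[responseDict objectForKey:@"items"] firstObject] objectForKey:@"status"];
NSDictionary *healthDict = [statusDict objectForKey:@"healthStatus"];
// Per the liveStreams reference, healthStatus.status is one of
// "good", "ok", "bad", or "noData".
NSLog(@"stream health: %@", [healthDict objectForKey:@"status"]);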

Related

iOS / MapKit cache management issue with MKTileOverlay and PINCache library

I am using MapKit to create a satellite and radar animation by adding MKTileOverlays over the mapView.
With a UISlider and a play button I was able to create an animation, like a GIF, by playing with the alpha of each MKOverlayRenderer (setting it to 0 or 0.75 according to the position of my slider).
The animation is quite smooth, and all my satellite and radar tiles load properly over the mapView.
I am encountering one issue with the cache management.
I realized that MapKit doesn't cache my tile overlays, so I used the PINCache library to save my tiles so that the images aren't requested and downloaded every time I play the animation.
My implementation:
I override the method URLForTilePath in order to build my URL to get my tile images.
- (NSURL *)URLForTilePath:(MKTileOverlayPath)path {
    double latMin, latMax, longMin, longMax;
    path.contentScaleFactor = 1.0;
    NSMutableArray *result = getBBoxForCoordinates((int)path.x, (int)path.y, (int)path.z);
    longMin = [[result objectAtIndex:0] doubleValue];
    latMin = [[result objectAtIndex:1] doubleValue];
    longMax = [[result objectAtIndex:2] doubleValue];
    latMax = [[result objectAtIndex:3] doubleValue];
    NSString *finalURL = self.url;
    finalURL = [finalURL stringByReplacingOccurrencesOfString:@"DATE"
                                                   withString:_date];
    NSString *bbox = [NSString stringWithFormat:@"bbox=%f,%f,%f,%f", longMin, latMin, longMax, latMax];
    finalURL = [finalURL stringByReplacingOccurrencesOfString:@"BBOX"
                                                   withString:bbox];
    return [NSURL URLWithString:finalURL];
}
And the key method that calls URLForTilePath is my implementation of loadTileAtPath:
- (void)loadTileAtPath:(MKTileOverlayPath)path
                result:(void (^)(NSData *data, NSError *error))result
{
    if (!result) {
        return;
    }
    NSString *str = self.isRadar ? [NSString stringWithFormat:@"Radar%@", self.date] : [NSString stringWithFormat:@"Satellite%@", self.date];
    NSData *cachedData = [[PINCache sharedCache] objectForKey:str];
    if (cachedData) {
        result(cachedData, nil);
    } else {
        NSURLRequest *request = [NSURLRequest requestWithURL:[self URLForTilePath:path]];
        [NSURLConnection sendAsynchronousRequest:request queue:[NSOperationQueue mainQueue] completionHandler:^(NSURLResponse *response, NSData *data, NSError *connectionError) {
            NSString *str = self.isRadar ? [NSString stringWithFormat:@"Radar%@", self.date] : [NSString stringWithFormat:@"Satellite%@", self.date];
            [[PINCache sharedCache] setObject:data forKey:str block:nil];
            result(data, connectionError);
        }];
    }
}
Basically, what I'm trying to achieve is:
Check if I have cached data; if so, get the object.
If not, make a request to download the tile with the URL given by URLForTilePath.
Then set the object in the cache.
The string str is my key for the cache management.
I have two important values to sort and differentiate the tiles: the type (Radar or Satellite, different image, different URL) and the date.
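Note that this key varies only by type and date, not by tile, so every tile of a given layer and date maps to the same cache entry. A key that uniquely identifies each tile would also need to fold in the tile path; a minimal sketch (hypothetical, reusing the variables from loadTileAtPath above):
// Hypothetical: include x/y/z so each tile gets its own cache entry.
NSString *layer = self.isRadar ? @"Radar" : @"Satellite";
NSString *key = [NSString stringWithFormat:@"%@%@-%ld-%ld-%ld",
                 layer, self.date, (long)path.x, (long)path.y, (long)path.z];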
I can see that the cache management is working and the overlays render much faster, but the main issue I'm encountering is that the tiles aren't loaded and built at the correct coordinates.
My mapView looks like a puzzle of the world put together wrong.
From my code, can you see anything wrong that I did?
I couldn't find the reason for these memory/cache management issues with MapKit.
I decided to try Google Maps; it took me less than 2 hours to implement, and the results are amazing.
I wanted to respect the Apple community by using the Apple solution, but Google Maps gave me far better performance.

iOS Share Extension unable to get shared URL from Chrome

I am trying to implement a share extension for my app. It's working well in the Safari browser and the YouTube app, i.e., when I share from these apps I get public.url, which is the URL to be shared.
When I tried the same in Chrome, it was not showing the extension. When I set NSExtensionActivationSupportsText to true under the NSExtensionActivationRule, it started showing, but when I try to share the contents I am unable to fetch the URL string to be shared. I get only contentText.
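For reference, the activation rule in the extension's Info.plist would then look something like this (a sketch using Apple's documented activation-rule keys; the web-URL entry and its count are assumptions):
<key>NSExtensionAttributes</key>
<dict>
    <key>NSExtensionActivationRule</key>
    <dict>
        <!-- Accept plain text (what Chrome provides) as well as web URLs. -->
        <key>NSExtensionActivationSupportsText</key>
        <true/>
        <key>NSExtensionActivationSupportsWebURLWithMaxCount</key>
        <integer>1</integer>
    </dict>
</dict>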
I have followed the approach shown in this link: https://stackoverflow.com/a/31942744/6199038.
I have also added demoProcessor.js:
var MyPreprocessor = function() {};
MyPreprocessor.prototype = {
    run: function(arguments) {
        arguments.completionFunction({"URL": document.URL, "pageSource": document.documentElement.outerHTML, "title": document.title, "selection": window.getSelection().toString()});
    }
};
var ExtensionPreprocessingJS = new MyPreprocessor;
I am using SLComposeServiceViewController.
In my ShareViewController.m I am trying to get the data as shown below.
For Safari I used this, which works fine:
NSExtensionItem *item = self.extensionContext.inputItems.firstObject;
NSItemProvider *itemProvider = [[item.userInfo valueForKey:NSExtensionItemAttachmentsKey] objectAtIndex:0];
if ([itemProvider hasItemConformingToTypeIdentifier:@"public.url"]) {
    [itemProvider loadItemForTypeIdentifier:@"public.url" options:nil completionHandler:^(NSURL *url, NSError *error) {
        urlString = url.absoluteString;
        NSLog(@"urlString %@", urlString);
    }];
}
Then I modified my code as follows to get the URL from Chrome, which is not working:
for (NSExtensionItem *item in self.extensionContext.inputItems) {
    for (NSItemProvider *itemProvider in item.attachments) {
        if ([itemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypePropertyList]) {
            [itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypePropertyList options:nil completionHandler:^(NSDictionary *jsDict, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    NSDictionary *jsPreprocessingResults = jsDict[NSExtensionJavaScriptPreprocessingResultsKey];
                    NSString *selectedText = jsPreprocessingResults[@"selection"];
                    NSString *pageTitle = jsPreprocessingResults[@"title"];
                    NSString *URL = jsPreprocessingResults[@"URL"];
                    NSLog(@"selectedText %@", selectedText);
                    NSLog(@"pageTitle %@", pageTitle);
                    NSLog(@"URL %@", URL);
                });
            }];
            break;
        }
    }
}
Please advise.
I have solved it myself. Since execution never entered the loadItemForTypeIdentifier completion handler, I modified my method to the code below:
for (NSExtensionItem *item in self.extensionContext.inputItems) {
    for (NSItemProvider *itemProvider in item.attachments) {
        NSString *URLinPlainText = [item.attributedContentText string];
        if ([itemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeURL]) {
            [itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeURL options:nil completionHandler:^(NSURL *url, NSError *error) {
                urlString = url.absoluteString;
                NSLog(@"<< URL >> %@", urlString);
                return;
            }];
        }
        else if (URLinPlainText) {
            // In some apps I got the URL as plain text
            if ([URLinPlainText containsString:@"http"]) {
                urlString = [URLinPlainText stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                NSLog(@"<< URL >> %@", urlString);
                return;
            }
        }
    }
}
I have also removed demoProcessor.js.
Now I am able to get the URL from almost all the news apps and browsers (Chrome, Firefox, Safari); in some apps the shared URL arrives as plain text, in which case I trim the whitespace and use the resulting NSString.
As I understand it, demoProcessor.js is only needed if you want to access page properties of the URL, e.g. title, baseURL, og:image, og:description, etc., and it works only in the Safari browser.

NSURLSession - use uploadTaskWithStreamedRequest with AWS iOS SDK

I've searched for days for a way to upload an iOS asset without creating a copy of the file in the temp directory, without luck. I got the code working with a temp copy, but copying a video file that could be anywhere from 10MB to 4GB is not realistic.
The closest I have come to reading the asset in read-only mode is the code below. Per the Apple documentation this should work; see the following link:
https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html
I have enabled these keys:
<key>com.apple.security.assets.movies.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.music.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.pictures.read-write</key>
<string>YES</string>
<key>com.apple.security.files.downloads.read-write</key>
<string>YES</string>
Here is the code:
// QueueController.h
#import <AVFoundation/AVFoundation.h>
#import <AWSS3.h>
#import <Foundation/Foundation.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import "Reachability1.h"
#import "TransferType.h"
#import "TransferModel.h"
#import "Util.h"

@interface QueueController : NSObject <NSURLSessionDelegate>
@property(atomic, strong) NSURLSession *session;
@property(atomic, strong) NSNumber *sessionCount;
@property(atomic, strong) NSURLSessionConfiguration *configuration;
+ (QueueController *)sharedInstance;
- (void)transferMediaViaQueue:(MediaItem *)mediaItem
             withTransferType:(TransferType)transferType;
@end

@implementation QueueController {
    NSOperationQueue *copyQueue;
    NSOperationQueue *transferQueue;
    NSMutableArray *inProcessTransferArray;
    NSMutableArray *pendingTransferArray;
    bool isTransferring;
}

static QueueController *sharedInstance = nil;

// Get the shared instance and create it if necessary.
+ (QueueController *)sharedInstance {
    @synchronized(self) {
        if (sharedInstance == nil) {
            sharedInstance = [[QueueController alloc] init];
        }
    }
    return sharedInstance;
}

- (id)init {
    if (self = [super init]) {
        appDelegate =
            (RootViewControllerAppDelegate *)[UIApplication sharedApplication].delegate;
        copyQueue = [[NSOperationQueue alloc] init];
        transferQueue = [[NSOperationQueue alloc] init];
        transferQueue.maxConcurrentOperationCount = MAX_CONCURRENT_TRANSFERS;
        inProcessTransferArray = [[NSMutableArray alloc] init];
        pendingTransferArray = [[NSMutableArray alloc] init];
        isTransferring = false;
        if (self.session == nil) {
            self.configuration = [NSURLSessionConfiguration
                backgroundSessionConfigurationWithIdentifier:@"transferQueue"];
            self.session = [NSURLSession sessionWithConfiguration:self.configuration
                                                         delegate:self
                                                    delegateQueue:transferQueue];
        }
    }
    return self;
}

- (void)transferMediaViaQueue:(MediaItem *)mediaItem
             withTransferType:(TransferType)transferType {
    // Create a transfer model
    NSUserDefaults *defaultUser = [NSUserDefaults standardUserDefaults];
    NSString *user_id = [defaultUser valueForKey:@"UserId"];
    TransferModel *transferModel = [[TransferModel alloc] init];
    transferModel.mediaItem = mediaItem;
    transferModel.transferType = transferType;
    transferModel.s3Path = user_id;
    transferModel.s3file_name = mediaItem.mediaName;
    transferModel.assetURL =
        [[mediaItem.mediaLocalAsset defaultRepresentation] url];
    ALAssetRepresentation *mediaRep =
        [mediaItem.mediaLocalAsset defaultRepresentation];
    transferModel.content_type =
        (__bridge_transfer NSString *)UTTypeCopyPreferredTagWithClass(
            (__bridge CFStringRef)[mediaRep UTI], kUTTagClassMIMEType);

    @synchronized(pendingTransferArray) {
        if ((!isTransferring) &&
            (transferQueue.operationCount < MAX_CONCURRENT_TRANSFERS)) {
            isTransferring = true;
            if (transferModel.transferType == UPLOAD) {
                /**
                 * Read ALAsset from NSURLRequestStream
                 */
                NSInvocationOperation *uploadOP = [[NSInvocationOperation alloc]
                    initWithTarget:self
                          selector:@selector(uploadMediaViaLocalPath:)
                            object:transferModel];
                [transferQueue addOperation:uploadOP];
                [inProcessTransferArray addObject:transferModel];
            }
        } else {
            // Add to pending
            [pendingTransferArray addObject:transferModel];
        }
    }
}

- (void)uploadMediaViaLocalPath:(TransferModel *)transferModel {
    @try {
        /**
         * Fetch readable asset
         */
        NSURL *assetURL =
            [[transferModel.mediaItem.mediaLocalAsset defaultRepresentation] url];
        NSData *fileToUpload = [[NSData alloc] initWithContentsOfURL:assetURL];
        NSURLRequest *assetAsRequest =
            [NSURLRequest requestWithURL:assetURL
                             cachePolicy:NSURLRequestUseProtocolCachePolicy
                         timeoutInterval:60.0];
        /**
         * Fetch signed URL
         */
        AWSS3GetPreSignedURLRequest *getPreSignedURLRequest =
            [AWSS3GetPreSignedURLRequest new];
        getPreSignedURLRequest.bucket = BUCKET_NAME;
        NSString *s3Key = [NSString stringWithFormat:@"%@/%@", transferModel.s3Path, transferModel.s3file_name];
        getPreSignedURLRequest.key = s3Key;
        getPreSignedURLRequest.HTTPMethod = AWSHTTPMethodPUT;
        getPreSignedURLRequest.expires = [NSDate dateWithTimeIntervalSinceNow:3600];
        // Important: must set contentType for PUT request
        // getPreSignedURLRequest.contentType = transferModel.mediaItem.mimeType;
        getPreSignedURLRequest.contentType = transferModel.content_type;
        NSLog(@"mimeType: %@", transferModel.content_type);
        /**
         * Upload the file
         */
        [[[AWSS3PreSignedURLBuilder defaultS3PreSignedURLBuilder]
            getPreSignedURL:getPreSignedURLRequest]
            continueWithBlock:^id(BFTask *task) {
                NSURLSessionUploadTask *uploadTask;
                transferModel.sessionTask = uploadTask;
                if (task.error) {
                    NSLog(@"Error: %@", task.error);
                } else {
                    NSURL *presignedURL = task.result;
                    NSLog(@"upload presignedURL is: \n%@", presignedURL);
                    NSMutableURLRequest *request =
                        [NSMutableURLRequest requestWithURL:presignedURL];
                    request.cachePolicy = NSURLRequestReloadIgnoringLocalCacheData;
                    [request setHTTPMethod:@"PUT"];
                    [request setValue:transferModel.content_type
                        forHTTPHeaderField:@"Content-Type"];
                    uploadTask =
                        [self.session uploadTaskWithStreamedRequest:assetAsRequest];
                    [uploadTask resume];
                }
                return nil;
            }];
    } @catch (NSException *exception) {
        NSLog(@"exception: %@", exception);
    } @finally {
    }
}

- (void)URLSession:(NSURLSession *)session
                        task:(NSURLSessionTask *)task
             didSendBodyData:(int64_t)bytesSent
              totalBytesSent:(int64_t)totalBytesSent
    totalBytesExpectedToSend:(int64_t)totalBytesExpectedToSend {
    // Calculate progress
    double progress = (double)totalBytesSent / (double)totalBytesExpectedToSend;
    NSLog(@"UploadTask progress: %lf", progress);
}

- (void)URLSession:(NSURLSession *)session
                    task:(NSURLSessionTask *)task
    didCompleteWithError:(NSError *)error {
    NSLog(@"(void)URLSession:session task:(NSURLSessionTask*)task "
          @"didCompleteWithError:error called...%@",
          error);
}

- (void)URLSessionDidFinishEventsForBackgroundURLSession:
    (NSURLSession *)session {
    NSLog(@"URLSessionDidFinishEventsForBackgroundURLSession called...");
}

// NSURLSessionDataDelegate
- (void)URLSession:(NSURLSession *)session
              dataTask:(NSURLSessionDataTask *)dataTask
    didReceiveResponse:(NSURLResponse *)response
     completionHandler:
         (void (^)(NSURLSessionResponseDisposition disposition))completionHandler {
    //completionHandler(NSURLSessionResponseAllow);
}
@end
But I'm receiving this error:
(void)URLSession:session task:(NSURLSessionTask*)task didCompleteWithError:error called...Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo=0x17166f840 {NSErrorFailingURLStringKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV}
userInfo: {
NSErrorFailingURLKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSErrorFailingURLStringKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSLocalizedDescription = cancelled;
}
Thanks in advance for your help.
Regards,
-J
A couple of comments regarding using NSURLSessionUploadTask:
If you implement didReceiveResponse, you must call the completionHandler.
If you call uploadTaskWithStreamedRequest, the documentation for the request parameter warns us that:
The body stream and body data in this request object are ignored, and NSURLSession calls its delegate’s URLSession:task:needNewBodyStream: method to provide the body data.
So you must implement needNewBodyStream if implementing an NSInputStream-based request.
Be forewarned that using a stream-based request like this creates a request with a "chunked" transfer encoding, and not all servers can handle that.
At one point in the code, you appear to try to load the contents of the asset into an NSData. If you have assets that large, you cannot reasonably load them into an NSData object. Besides, that's inconsistent with using uploadTaskWithStreamedRequest.
You either need to create an NSInputStream or upload from a file, as sketched below.
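A minimal sketch of the two delegate pieces a streamed upload needs (assumption: filePath points at a plain file copy of the asset, since NSInputStream cannot read an assets-library:// URL):
// Provide the body stream on demand; NSURLSession may call this again
// if the request has to be retried.
- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
 needNewBodyStream:(void (^)(NSInputStream *bodyStream))completionHandler {
    completionHandler([NSInputStream inputStreamWithFileAtPath:filePath]);
}

// The data-task delegate must invoke its completion handler or the task stalls.
- (void)URLSession:(NSURLSession *)session
          dataTask:(NSURLSessionDataTask *)dataTask
didReceiveResponse:(NSURLResponse *)response
 completionHandler:(void (^)(NSURLSessionResponseDisposition))completionHandler {
    completionHandler(NSURLSessionResponseAllow);
}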
You appear to be using the asset URL for the NSURLRequest. That URL should be the URL for your web service.
When using the image picker, you have access to two URL keys: the media URL (a file:// URL for movies, but not pictures) and the assets library reference URL (an assets-library:// URL). If you're using the media URL, you can use that for uploading movies, but you cannot use the assets library reference URL for uploading purposes; you can only use that in conjunction with ALAssetsLibrary.
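For example, with the media URL from the image picker, a file-based upload avoids the chunked-encoding issue entirely (a sketch; request stands in for the presigned PUT request built above):
// NSURLSession streams the file from disk and sets Content-Length itself;
// this form also works with background sessions.
NSURL *mediaURL = info[UIImagePickerControllerMediaURL]; // file:// URL for movies
NSURLSessionUploadTask *uploadTask = [self.session uploadTaskWithRequest:request fromFile:mediaURL];
[uploadTask resume];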
ALAssetPropertyURL is purely a URL identifier for the asset, i.e. it identifies assets and asset groups, and I don't think you can use it directly to upload to a service.
You could use AVAssetExportSession to export the asset to a temp URL if the other methods are tedious, e.g.:
[AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil] presetName:AVAssetExportPresetPassthrough];
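A sketch of completing that export (the output URL and file type are assumptions; AVAssetExportPresetPassthrough avoids re-encoding):
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil]
                                      presetName:AVAssetExportPresetPassthrough];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = tempFileURL; // assumed file:// URL in NSTemporaryDirectory()
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // Upload from tempFileURL, e.g. with uploadTaskWithRequest:fromFile:,
        // and delete the temp copy afterwards.
    }
}];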

Using AVURLAsset on a custom NSURLProtocol

I have written a custom NSURLProtocol (called "memory:") that allows me to fetch stored NSData items from an NSDictionary based on a name. For example, this code registers the NSURLProtocol class and adds some data:
[VPMemoryURLProtocol register];
[VPMemoryURLProtocol addData:data withName:@"video"];
This allows me to refer to the NSData via a URL like "memory://video".
Below is my custom NSURLProtocol implementation:
NSMutableDictionary *gMemoryMap = nil;

@implementation VPMemoryURLProtocol
{
}

+ (void)register
{
    static BOOL inited = NO;
    if (!inited)
    {
        [NSURLProtocol registerClass:[VPMemoryURLProtocol class]];
        inited = YES;
    }
}

+ (void)addData:(NSData *)data withName:(NSString *)name
{
    if (!gMemoryMap)
    {
        gMemoryMap = [NSMutableDictionary new];
    }
    gMemoryMap[name] = data;
}

+ (BOOL)canInitWithRequest:(NSURLRequest *)request
{
    NSLog(@"URL: %@, Scheme: %@",
          [[request URL] absoluteString],
          [[request URL] scheme]);
    NSString *theScheme = [[request URL] scheme];
    return [theScheme caseInsensitiveCompare:@"memory"] == NSOrderedSame;
}

+ (NSURLRequest *)canonicalRequestForRequest:(NSURLRequest *)request
{
    return request;
}

- (void)startLoading
{
    NSString *name = [[self.request URL] path];
    NSData *data = gMemoryMap[name];
    NSURLResponse *response = [[NSURLResponse alloc] initWithURL:[self.request URL]
                                                        MIMEType:@"video/mp4"
                                           expectedContentLength:-1
                                                textEncodingName:nil];
    id<NSURLProtocolClient> client = [self client];
    [client URLProtocol:self didReceiveResponse:response
     cacheStoragePolicy:NSURLCacheStorageNotAllowed];
    [client URLProtocol:self didLoadData:data];
    [client URLProtocolDidFinishLoading:self];
}

- (void)stopLoading
{
}

@end
I am not sure whether this code works or not, but that is not what I have a problem with. Despite registering the custom protocol, canInitWithRequest: is never called when I try to use the URL in this code:
NSURL *url = [NSURL URLWithString:@"memory://video"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *imageGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
CMTime time = CMTimeMakeWithSeconds(0, 600);
NSError *error;
CMTime actualTime;
CGImageRef image = [imageGen copyCGImageAtTime:time
                                    actualTime:&actualTime
                                         error:&error];
UIImage *uiImage = [UIImage imageWithCGImage:image];
CGImageRelease(image);
image is always nil if I use "memory://video" but works fine if I use "file:///...". What am I missing? Why isn't canInitWithRequest: being called? Does AVFoundation only support specific URL protocols and not custom ones?
Thanks
Certainly the underpinnings used to support only particular URL schemes; as an eBook developer I've seen this happen for any media type loaded through a URL such as epub:// or zip://. In those cases, on iOS 5.x and earlier, tracing through the relevant code would wind up in a QuickTime method which compared the URL scheme against a small number of supported ones: file, http, https, ftp, and whatever it is that iTunes uses (I forget what it's called).
In iOS 6+ there is a new API in AVFoundation, however, which is designed to help here. While I've not used it personally, this is how it should work:
NSURL *url = [NSURL URLWithString:@"memory://video"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];

////////////////////////////////////////////////////////////////
// NEW CODE START
AVAssetResourceLoader *loader = [asset resourceLoader];
id<AVAssetResourceLoaderDelegate> delegate = [SomeClass newInstanceWithNSURLProtocolClass:[VPMemoryURLProtocol class]];
[loader setDelegate:delegate queue:some_dispatch_queue];
// NEW CODE END
////////////////////////////////////////////////////////////////

AVAssetImageGenerator *imageGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
CMTime time = CMTimeMakeWithSeconds(0, 600);
With this in place, you need only implement the AVAssetResourceLoaderDelegate protocol somewhere, which is very simple as it contains only one method. Since you already have an NSURLProtocol implementation, all your real work is done, and you can simply hand off to the Cocoa loading system or your protocol class directly.
Again, I'll point out that I've yet to actually make use of this, so the above is entirely theoretical.
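For illustration, a delegate that serves the in-memory data directly might look roughly like this (entirely a sketch: VPMemoryLoaderDelegate is a hypothetical class reusing gMemoryMap from the protocol above, and the content-information handling is simplified):
@interface VPMemoryLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>
@end

@implementation VPMemoryLoaderDelegate

- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
    shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    // Look up the stored data by host name, e.g. "video" for memory://video.
    NSData *data = gMemoryMap[loadingRequest.request.URL.host];
    if (!data) {
        return NO; // not ours; let AVFoundation fail the request
    }
    loadingRequest.contentInformationRequest.contentType = @"public.mpeg-4";
    loadingRequest.contentInformationRequest.contentLength = data.length;
    loadingRequest.contentInformationRequest.byteRangeAccessSupported = YES;

    // Hand back the requested byte range.
    NSRange range = NSMakeRange((NSUInteger)loadingRequest.dataRequest.requestedOffset,
                                (NSUInteger)loadingRequest.dataRequest.requestedLength);
    [loadingRequest.dataRequest respondWithData:[data subdataWithRange:range]];
    [loadingRequest finishLoading];
    return YES;
}

@end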

Download PDF file from server, save it in Application Support directory. iOS

I'm a newbie in iOS development, and I want to know how I can download a file from a server and save it in my Application Support folder. I want to keep it as a .pdf file, to be able to display it in a UIWebView.
After a long time on different websites, I think I should use NSURLConnection (asynchronous) to download it, or NSData (I tried that already, but it didn't work).
So, is there someone who can help me by showing me some sample code for this?
Thank you so much :)
Have a look at this S.O. question for an example of how to do it.
This example uses ASIHTTPRequest, which is an alternative to NSURLRequest and NSURLConnection. I strongly suggest you use this framework, which will make your life much easier.
If you are really willing to use NSURLRequest and NSURLConnection, see this other topic.
[self.productListArray enumerateObjectsUsingBlock:^(NSDictionary *productDictionary, NSUInteger idx, BOOL *stop)
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if (![fileManager fileExistsAtPath:pdfString])
    {
        dispatch_async(serialQueue, ^()
        {
            NSURLRequest *request = [[NSURLRequest alloc] initWithURL:url cachePolicy:NSURLRequestReloadIgnoringLocalCacheData timeoutInterval:120];
            NSURLResponse *response = nil;
            NSError *connectionError = nil;
            NSData *data = [NSURLConnection sendSynchronousRequest:request
                                                 returningResponse:&response
                                                             error:&connectionError];
            if (connectionError)
            {
                NSLog(@"Pdf Connection Error==>%@", connectionError.userInfo);
                [AMSharedClass showAlertMessge:@"Request timeout"];
            }
            else if ([response.MIMEType isEqualToString:@"application/pdf"])
            {
                NSLog(@"pdfFilePathURLString==>%@", pdfString);
                [data writeToFile:pdfString atomically:YES];
            }
            else
            {
                [AMSharedClass showAlertMessge:@"Pdf not found."];
                if (idx + 1 == [self.productListArray count])
                {
                    [self.btnSetting setEnabled:NO];
                }
            }
            if (idx + 1 == [self.productListArray count])
            {
                [[[AMSharedClass object] sharedHUD] hideOnWindow];
                self.pdfURLString = [self joinPDF:self.productFilePathUrlArray WithDetails:self.pdfInfoArray];
                [self initialConfiguration];
                NSLog(@"%@", self.productFilePathUrlArray);
            }
        });
        // Long running task
    }
    else
    {
        if (idx + 1 == [self.productListArray count])
        {
            self.pdfURLString = [self joinPDF:self.productFilePathUrlArray WithDetails:self.pdfInfoArray];
            [self initialConfiguration];
            NSLog(@"%@", self.productFilePathUrlArray);
            [[[AMSharedClass object] sharedHUD] hideOnWindow];
        }
    }
}];
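For reference, the same download-and-save flow with NSURLSession, which has since replaced NSURLConnection (a sketch; the URL and file name are placeholders):
NSURL *remoteURL = [NSURL URLWithString:@"https://example.com/some.pdf"]; // placeholder
NSURLSessionDownloadTask *task = [[NSURLSession sharedSession]
    downloadTaskWithURL:remoteURL
      completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
          if (error) {
              NSLog(@"download failed: %@", error);
              return;
          }
          // Move the temporary file into Application Support, creating the directory if needed.
          NSFileManager *fm = [NSFileManager defaultManager];
          NSURL *supportDir = [[fm URLsForDirectory:NSApplicationSupportDirectory
                                          inDomains:NSUserDomainMask] firstObject];
          [fm createDirectoryAtURL:supportDir withIntermediateDirectories:YES attributes:nil error:NULL];
          NSURL *destURL = [supportDir URLByAppendingPathComponent:@"file.pdf"]; // placeholder name
          [fm removeItemAtURL:destURL error:NULL]; // replace any previous copy
          [fm moveItemAtURL:location toURL:destURL error:NULL];
      }];
[task resume];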
