Get link to media from AVFullScreenViewController in iOS 8

I'm trying to get the link to a video when something begins to play (for example, any YouTube video).
First I catch when the video begins to play:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoStarted:) name:@"UIWindowDidBecomeVisibleNotification" object:nil];
Then, after a delay, I try to get the link:
- (void)videoStarted:(NSNotification *)notification
{
    NSLog(@"notification dic = %@", [notification userInfo]);
    [self performSelector:@selector(detectModal) withObject:nil afterDelay:2.5f];
}

- (void)detectModal
{
    CUViewController *controller = (CUViewController *)[appDelegate.window rootViewController].presentedViewController;
    NSLog(@"Presented modal = %@", [appDelegate.window rootViewController].presentedViewController);
    if (controller && [controller respondsToSelector:@selector(item)])
    {
        id currentItem = [controller item];
        if (currentItem && [currentItem respondsToSelector:@selector(asset)])
        {
            AVURLAsset *asset = (AVURLAsset *)[currentItem asset];
            if ([asset respondsToSelector:@selector(URL)] && asset.URL)
                [self newLinkDetected:[[asset URL] absoluteString]];
            NSLog(@"asset find = %@", asset);
        }
    }
    else
    {
        for (UIWindow *window in [UIApplication sharedApplication].windows) {
            if ([window.rootViewController.presentedViewController isKindOfClass:NSClassFromString(@"AVFullScreenViewController")])
            {
                controller = (CUViewController *)window.rootViewController.presentedViewController;
                for (int i = 0; i < [controller.view.subviews count]; i++)
                {
                    UIView *topView = [controller.view.subviews objectAtIndex:i];
                    NSLog(@"top view = %@", topView);
                    for (int j = 0; j < [topView.subviews count]; j++)
                    {
                        UIView *subView = [topView.subviews objectAtIndex:j];
                        NSLog(@"sub view = %@", subView);
                        for (int k = 0; k < [subView.subviews count]; k++)
                        {
                            CUPlayerView *subsubView = (CUPlayerView *)[subView.subviews objectAtIndex:k];
                            NSLog(@"sub sub view = %@ class = %@", subsubView, NSStringFromClass([subsubView class]));
                            if ([NSStringFromClass([subsubView class]) isEqualToString:@"AVVideoLayerView"])
                            {
                                NSLog(@"player view controller = %@", subsubView.playerController);
                                CUPlayerController *pController = subsubView.playerController;
                                NSLog(@"item = %@", [pController player]);
                                CUPlayerController *proxyPlayer = pController.playerControllerProxy;
                                if (proxyPlayer)
                                {
                                    AVPlayer *player = [proxyPlayer player];
                                    NSLog(@"find player = %@ chapters = %@", player, proxyPlayer.contentChapters);
                                    break;
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
CUViewController, CUPlayerView, and CUPlayerController are fake (stand-in) classes; they look like this:
@interface CUPlayerController : UIViewController

@property(nonatomic, retain) id playerControllerProxy;
@property(nonatomic, retain) id player;
@property(nonatomic, retain) id item;

- (id)contentChapters;

@end
Everything is okay until this line:
NSLog(#"find player = %# chapters = %#", player, proxyPlayer.contentChapters);
player is always nil. Maybe there is a simpler way to get the link to the media?

First off, I'd like to focus on your AVPlayer, which plays an AVPlayerItem. An AVPlayerItem object carries a reference to an AVAsset object and presentation settings for that asset. When you use the playerWithURL: method of AVPlayer, it automatically creates an AVPlayerItem backed by an asset that is a subclass of AVAsset named AVURLAsset. AVURLAsset has a URL property, so if you use that you can get the NSURL of the currently playing item. Here's an example function for getting the URL:
- (NSURL *)urlOfCurrentlyPlayingInPlayer:(AVPlayer *)player {
    // get current asset
    AVAsset *currentPlayerAsset = player.currentItem.asset;
    // make sure the current asset is an AVURLAsset
    if (![currentPlayerAsset isKindOfClass:AVURLAsset.class]) return nil;
    // return the NSURL
    return [(AVURLAsset *)currentPlayerAsset URL];
}
I think it's just a matter of how you fit this thing around your code. Hope this helps.
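As a quick sanity check of that helper, here is a minimal sketch of calling it (the bundled file name sample.mp4 is just an assumption for illustration; in your case the player would instead be whichever AVPlayer you manage to dig out of the presented controller):
#import <AVFoundation/AVFoundation.h>

// Hypothetical usage: build a player from a known URL, then read the URL back out.
NSURL *sourceURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
AVPlayer *player = [AVPlayer playerWithURL:sourceURL];
NSURL *mediaURL = [self urlOfCurrentlyPlayingInPlayer:player];
if (mediaURL) {
    NSLog(@"Currently playing: %@", [mediaURL absoluteString]);
}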

Related

AVPlayerItemVideoOutput never gets a pixelBuffer

I've been expanding the testing of my video rendering code and noticed something unusual while testing with AVPlayerItemVideoOutput. In trying to test my rendering code, I check for new pixel buffers using hasNewPixelBufferForItemTime:.
While testing, this method never returns YES. But the rendering code works just fine in my app using the same setup, rendering frames to OpenGL textures.
I set up a GitHub project with the very basics showing the error. In the app you can load the video by tapping the button (it is not loaded immediately, to avoid any conflict with the test). This proves at least that the video loads and plays.
The project also has a test that attempts to set up the AVPlayerItemVideoOutput and check for new pixel buffers. This test always fails, but I can't see what I'm doing wrong, or why the exact same steps work in my own app.
The GitHub project is here.
And this is the test method to peruse:
#import <XCTest/XCTest.h>
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

@interface AVPlayerTestTests : XCTestCase
@end

@implementation AVPlayerTestTests

- (void)setUp {
    [super setUp];
    // Put setup code here. This method is called before the invocation of each test method in the class.
}

- (void)tearDown {
    // Put teardown code here. This method is called after the invocation of each test method in the class.
    [super tearDown];
}

- (void)testAVPlayer
{
    NSURL *fileURL = [[NSBundle bundleForClass:self.class] URLForResource:@"SampleVideo_1280x720_10mb" withExtension:@"mp4"];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:fileURL];
    [self keyValueObservingExpectationForObject:playerItem
                                        keyPath:@"status"
                                        handler:^BOOL(id _Nonnull observedObject, NSDictionary * _Nonnull change) {
        AVPlayerItem *oPlayerItem = (AVPlayerItem *)observedObject;
        switch (oPlayerItem.status) {
            case AVPlayerItemStatusFailed:
            {
                XCTFail(@"Video failed");
                return YES;
            }
                break;
            case AVPlayerItemStatusUnknown:
                return NO;
                break;
            case AVPlayerItemStatusReadyToPlay:
            {
                return YES;
            }
            default:
                break;
        }
        return NO; // keep waiting until the status is decisive
    }];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    NSDictionary *pbOptions = @{
        (NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
        (NSString *)kCVPixelBufferIOSurfacePropertiesKey : [NSDictionary dictionary],
        (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES
    };
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pbOptions];
    XCTAssertNotNil(output);
    [self waitForExpectationsWithTimeout:100 handler:nil];
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        [playerItem addOutput:output];
        player.rate = 1.0;
        player.muted = YES;
        [player play];
        CMTime vTime = [output itemTimeForHostTime:CACurrentMediaTime()];
        // This is what we're testing
        BOOL foundFrame = [output hasNewPixelBufferForItemTime:vTime];
        XCTAssertTrue(foundFrame);
        if (!foundFrame) {
            // Cycle over for ten seconds
            for (int i = 0; i < 10; i++) {
                sleep(1);
                vTime = [output itemTimeForHostTime:CACurrentMediaTime()];
                foundFrame = [output hasNewPixelBufferForItemTime:vTime];
                if (foundFrame) {
                    NSLog(@"Got frame at %i", i);
                    break;
                }
                if (i == 9) {
                    XCTFail(@"Failed to acquire");
                }
            }
        }
    }
}

@end
EDIT 1: This seems to be a bug and I've filed a Radar.

How can I use this code to play more sounds?

//Action to play Audio//
- (IBAction)playAudio:(id)sender {
    [self.loopPlayer play];
}

//Action to stop Audio//
- (IBAction)stopAudio:(id)sender {
    if (self.loopPlayer.isPlaying) {
        [self.loopPlayer stop];
        self.loopPlayer.currentTime = 0;
        self.loopPlayer.numberOfLoops = -1;
        [self.loopPlayer prepareToPlay];
    }
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    //Code that gets audio file "trapsynth"//
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:@"trapsynth" withExtension:@"wav"];
    self.loopPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:nil];
}
This is the code I'm using with one button to play the sound when the button is tapped and stop the sound when the button is released. How would I go about adding more sounds to more buttons? I want to have more buttons that play and stop different sounds just like this.
@property (nonatomic, strong) AVAudioPlayer *loopPlayer;
This line is also in my ViewController.h file.
OK, although the answer provided by Miro is on the right track, the code example given has issues. It should be this in viewDidLoad:
- (void)viewDidLoad {
    [super viewDidLoad];
    NSURL *audioFileURL1 = [[NSBundle mainBundle] URLForResource:@"trapsynth" withExtension:@"wav"];
    self.loopPlayer1 = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL1 error:nil];
    NSURL *audioFileURL2 = [[NSBundle mainBundle] URLForResource:@"other_audio_file" withExtension:@"wav"];
    self.loopPlayer2 = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL2 error:nil];
}
Also, the stopAudio: method should be this:
- (IBAction)stopAudio:(id)sender {
    if (self.loopPlayer1.isPlaying && ([sender tag] == 1)) {
        [self.loopPlayer1 stop];
        self.loopPlayer1.currentTime = 0;
        self.loopPlayer1.numberOfLoops = -1;
        [self.loopPlayer1 prepareToPlay];
    }
    if (self.loopPlayer2.isPlaying && ([sender tag] == 2)) {
        [self.loopPlayer2 stop];
        self.loopPlayer2.currentTime = 0;
        self.loopPlayer2.numberOfLoops = -1;
        [self.loopPlayer2 prepareToPlay];
    }
}
And finally for playAudio:
- (IBAction)playAudio:(id)sender {
    if ([sender tag] == 1) {
        [self.loopPlayer1 play];
    }
    if ([sender tag] == 2) {
        [self.loopPlayer2 play];
    }
}
If you want to play different sounds at the same time you should look into creating separate AVAudioPlayers - if you create a different one for each sound, then you can easily control (play/stop) each of them separately with a specific button.
On the simplest level, you could do something like this, which allows you to use the same button handlers for all your audio. playAudio: checks the tag of the Play button you press (be sure to set the tag value in IB to 1, 2, etc.). There really only needs to be one Stop button.
You could enhance this in many ways, like attempting to reuse the AVAudioPlayer somehow, loading the audio on the fly instead of all at the beginning, or storing your audio file info in an array and creating an array of AVAudioPlayers for management (see the sketch at the end of this answer). But this is a start.
- (IBAction)playAudio:(id)sender {
    // first, stop any already playing audio
    [self stopAudio:sender];
    if ([sender tag] == 1) {
        [self.loopPlayer1 play];
    } else if ([sender tag] == 2) {
        [self.loopPlayer2 play];
    }
}

- (IBAction)stopAudio:(id)sender {
    if (self.loopPlayer1.isPlaying) {
        [self.loopPlayer1 stop];
        self.loopPlayer1.currentTime = 0;
        self.loopPlayer1.numberOfLoops = -1;
        [self.loopPlayer1 prepareToPlay];
    } else if (self.loopPlayer2.isPlaying) {
        [self.loopPlayer2 stop];
        self.loopPlayer2.currentTime = 0;
        self.loopPlayer2.numberOfLoops = -1;
        [self.loopPlayer2 prepareToPlay];
    }
}
- (void)viewDidLoad {
    [super viewDidLoad];
    NSURL *audioFileURL1 = [[NSBundle mainBundle] URLForResource:@"trapsynth" withExtension:@"wav"];
    self.loopPlayer1 = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:nil];
    NSURL *audioFileURL2 = [[NSBundle mainBundle] URLForResource:@"trapsynth" withExtension:@"wav"];
    self.loopPlayer2 = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:nil];
}
And, in the .h file:
@property (nonatomic, strong) AVAudioPlayer *loopPlayer1;
@property (nonatomic, strong) AVAudioPlayer *loopPlayer2;
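As a rough illustration of the array-based management mentioned above, here is a minimal sketch (the loopPlayers property, the file names, and the tags-start-at-1 convention are all assumptions, not code from either answer):
// Sketch: one AVAudioPlayer per file, indexed by button tag.
- (void)loadPlayers {
    NSArray *fileNames = @[@"trapsynth", @"other_audio_file"]; // placeholder names
    NSMutableArray *players = [NSMutableArray array];
    for (NSString *name in fileNames) {
        NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"wav"];
        AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
        [player prepareToPlay];
        [players addObject:player];
    }
    self.loopPlayers = players; // assumed NSArray property in the .h file
}

- (IBAction)playAudio:(id)sender {
    NSInteger index = [sender tag] - 1; // tag 1 -> first player, tag 2 -> second, etc.
    if (index >= 0 && index < (NSInteger)self.loopPlayers.count) {
        [self.loopPlayers[index] play];
    }
}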

Record video in iOS app without leaving app

PhoneGap offers a plug-in for recording video, but this plug-in seems to force users to leave the app for the Camera app in order to record video.
Is it possible to initiate and stop video recording without leaving the app?
The two methods below are the primary methods we are using to capture video (the rest of the plug-in is viewable from the link above). How should we modify this to allow recording from the front camera, without leaving the app, when the user taps a button?
Can anyone provide guidance please? Thanks.
Capture.m:
- (void)captureVideo:(CDVInvokedUrlCommand*)command
{
    NSString* callbackId = command.callbackId;
    NSDictionary* options = [command.arguments objectAtIndex:0];

    if ([options isKindOfClass:[NSNull class]]) {
        options = [NSDictionary dictionary];
    }

    // options could contain limit, duration and mode
    // taking more than one video (limit) is only supported if provide own controls via cameraOverlayView property
    NSNumber* duration = [options objectForKey:@"duration"];
    NSString* mediaType = nil;

    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        // there is a camera, it is available, make sure it can do movies
        pickerController = [[CDVImagePicker alloc] init];

        NSArray* types = nil;
        if ([UIImagePickerController respondsToSelector:@selector(availableMediaTypesForSourceType:)]) {
            types = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
            // NSLog(@"MediaTypes: %@", [types description]);
            if ([types containsObject:(NSString*)kUTTypeMovie]) {
                mediaType = (NSString*)kUTTypeMovie;
            } else if ([types containsObject:(NSString*)kUTTypeVideo]) {
                mediaType = (NSString*)kUTTypeVideo;
            }
        }
    }
    if (!mediaType) {
        // don't have video camera return error
        NSLog(@"Capture.captureVideo: video mode not available.");
        CDVPluginResult* result = [CDVPluginResult resultWithStatus:CDVCommandStatus_ERROR messageToErrorObject:CAPTURE_NOT_SUPPORTED];
        [self.commandDelegate sendPluginResult:result callbackId:callbackId];
        pickerController = nil;
    } else {
        pickerController.delegate = self;
        pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
        pickerController.allowsEditing = NO;
        // iOS 3.0
        pickerController.mediaTypes = [NSArray arrayWithObjects:mediaType, nil];

        if ([mediaType isEqualToString:(NSString*)kUTTypeMovie]) {
            if (duration) {
                pickerController.videoMaximumDuration = [duration doubleValue];
            }
            // NSLog(@"pickerController.videoMaximumDuration = %f", pickerController.videoMaximumDuration);
        }

        // iOS 4.0
        if ([pickerController respondsToSelector:@selector(cameraCaptureMode)]) {
            pickerController.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
            // pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
            // pickerController.cameraDevice = UIImagePickerControllerCameraDeviceRear;
            // pickerController.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto;
        }

        // CDVImagePicker specific property
        pickerController.callbackId = callbackId;

        SEL selector = NSSelectorFromString(@"presentViewController:animated:completion:");
        if ([self.viewController respondsToSelector:selector]) {
            [self.viewController presentViewController:pickerController animated:YES completion:nil];
        } else {
            // deprecated as of iOS >= 6.0
            [self.viewController presentModalViewController:pickerController animated:YES];
        }
    }
}
- (CDVPluginResult*)processVideo:(NSString*)moviePath forCallbackId:(NSString*)callbackId
{
    // save the movie to photo album (only avail as of iOS 3.1)
    /* don't need, it should automatically get saved
     NSLog(@"can save %@: %d ?", moviePath, UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath));
     if (&UIVideoAtPathIsCompatibleWithSavedPhotosAlbum != NULL && UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath) == YES) {
         NSLog(@"try to save movie");
         UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
         NSLog(@"finished saving movie");
     }*/

    // create MediaFile object
    NSDictionary* fileDict = [self getMediaDictionaryFromPath:moviePath ofType:nil];
    NSArray* fileArray = [NSArray arrayWithObject:fileDict];

    return [CDVPluginResult resultWithStatus:CDVCommandStatus_OK messageAsArray:fileArray];
}
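For what it's worth, the UIImagePickerController above is presented inside the app (it is a modal view controller, not a hand-off to the Camera app), so the front-camera part of the question may only need the cameraDevice property set before presenting. A minimal sketch of that tweak, assuming it is placed where the commented-out cameraDevice line sits in captureVideo above:
// Prefer the front camera when it is available.
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    pickerController.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}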

iOS - NSJSONSerialization causing a memory leak? (according to Instruments)

Below is the method with the code where I am getting a memory leak using manual memory management. The memory leak is detected by using Xcode Instruments, and it specifically points to the line where I am using NSJSONSerialization. I am running the target app on a device with iOS 6.1.
The first time that I tap on the refreshButton there is no leak. Any subsequent tap generates the leak (and more leaks on top of that if I continue tapping the button). Below is the code. This is basic stuff for consuming JSON web services (the web service link is bogus, but the real one that I am using works). You will notice that I am using Grand Central Dispatch so that I can update the UI without waiting for the parsing of the JSON to finish.
The line detected by Instruments is surrounded by the asterisks. I would appreciate help from anyone who might have an idea of what is going on here. The full stack trace (as mentioned in the comments below) is:
+[NSJSONSerialization JSONObjectWithData:options:error:] -> -[_NSJSONReader parseData:options:] -> -[_NSJSONReader parseUTF8JSONData:skipBytes:options:] -> newJSONValue -> newJSONString -> -[NSPlaceholderString initWithBytes:length:encoding:]
- (void)parseDictionary:(NSDictionary *)dictonary {
    self.transactions = [dictonary objectForKey:@"transactions"];
    if (!self.transactions) {
        NSLog(@"Expected 'transactions' array");
        return;
    }
    for (int arrayIndex = 0; arrayIndex < [self.transactions count]; arrayIndex++) {
        TransactionResult *result = [[[TransactionResult alloc] init] autorelease];
        result.transactionID = [[self.transactions objectAtIndex:arrayIndex] objectForKey:@"ID"];
        result.transactionDescription = [[self.transactions objectAtIndex:arrayIndex] objectForKey:@"description"];
        result.transactionPrice = [[self.transactions objectAtIndex:arrayIndex] objectForKey:@"price"];
        self.totalPrice += [result.transactionPrice doubleValue];
        NSLog(@"total price: %f", self.totalPrice);
        [self.transactionResults addObject:result];
        result = nil;
    }
}
- (IBAction)refreshButtonPressed:(UIBarButtonItem *)sender {
    __block id resultObject;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSURL *url = [NSURL URLWithString:@"http://mywebservice.php"];
        NSData *data = [NSData dataWithContentsOfURL:url];
        NSError *error;
        ***resultObject = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingAllowFragments error:&error];***
        if (!error) {
            if ([resultObject isKindOfClass:[NSDictionary class]]) {
                NSDictionary *dictonary = resultObject;
                [self parseDictionary:dictonary];
                NSLog(@"Done parsing!");
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.isLoading = NO;
                    [self.transactionsTableView reloadData];
                });
            }
            else {
                NSLog(@"JSON Error: Expected Dictionary");
                resultObject = nil;
                return;
            }
        }
        else {
            NSLog(@"JSON Error: %@", [error localizedDescription]);
            dispatch_async(dispatch_get_main_queue(), ^{
                resultObject = nil;
                [self.transactionsTableView reloadData];
                [self showError];
            });
            return;
        }
    });
}
I used ARC as soon as it came out with 4.3, and put an app in the store with it; the point being that you could switch to ARC. That said, I tried to reproduce your problem by creating a class/file that has the no-arc flag applied to it, but I cannot reproduce the problem. This makes me believe your problem is elsewhere. In the code below, I create a Tester object in another file, retain it, and send it the test message. No matter what I set "i" to, it always deallocs the object:
#import "Tester.h"
#interface Obj : NSObject <NSObject>
#end
#implementation Obj
- (id)retain
{
NSLog(#"retain");
id i = [super retain];
return i;
}
- (oneway void)release
{
NSLog(#"release");
[super release];
}
- (void)foo
{
}
- (void)dealloc
{
NSLog(#"Obj dealloced");
[super dealloc];
}
#end
#implementation Tester
- (void)test
{
int i = 2;
__block Obj *obj;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),^{
obj = [[Obj new] autorelease];
if(i == 0) {
Obj *o = obj;
dispatch_async(dispatch_get_main_queue(), ^
{
[o foo];
} );
} else if(i == 1) {
obj = nil;
} else if(i == 2) {
dispatch_async(dispatch_get_main_queue(), ^
{
obj = nil;
} );
}
} );
}
#end

AVAssetReader failing after one frame on H.264 .mov file

I'm trying to render an H.264 QuickTime movie to an OpenGL texture on iOS. I am stuck decoding frame buffers from the input file. One frame decodes correctly and displays. All subsequent calls to [AVAssetReaderTrackOutput getNextSample] return NULL, however, and AVAssetReader.status == AVAssetReaderStatusFailed. If I do not specify a value for kCVPixelBufferPixelFormatTypeKey in the settings dict, the status remains AVAssetReaderStatusReading, but the buffer objects returned are empty. The AVAsset in question plays without issue in AVPlayer. Is there anything obviously wrong with my code?
- (id)initWithAsset:(AVAsset *)asset {
    if (!(self = [super init])) return nil;
    _asset = [asset copy];
    [self initReaderToTime:kCMTimeZero];
    return self;
}

- (void)initReaderToTime:(CMTime)readStartTime {
    _readStartTime = readStartTime;

    NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary];
    [outputSettings setObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    _trackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:[[_asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings];

    NSError *error = nil;
    _assetReader = [[AVAssetReader alloc] initWithAsset:_asset error:&error];
    if (error) return;
    if (![_assetReader canAddOutput:_trackOutput]) return;
    [_assetReader addOutput:_trackOutput];
    if (![_assetReader startReading]) return;

    [self getNextSample];
}

- (void)getNextSample {
    if (_assetReader.status != AVAssetReaderStatusReading) {
        Log(@"Reader status %d != AVAssetReaderStatusReading. Ending...", _assetReader.status);
        return;
    }
    CMSampleBufferRef sampleBuffer = [_trackOutput copyNextSampleBuffer];
    /*
     Do things with buffer
     */
    [self performSelector:_cmd withObject:nil afterDelay:0];
}
[_trackOutput copyNextSampleBuffer] should appear somewhere in your code.
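One detail worth adding, independent of that answer: copyNextSampleBuffer follows the Core Foundation copy rule, so each buffer it returns should be released when you are done with it; holding on to all of them can leak memory and may eventually stall the reader. A minimal sketch of a pull loop with that in place (the processing step is a placeholder, not code from the question):
// Sketch: drain the track output, releasing each sample buffer after use.
while (_assetReader.status == AVAssetReaderStatusReading) {
    CMSampleBufferRef sampleBuffer = [_trackOutput copyNextSampleBuffer];
    if (!sampleBuffer) break; // end of track, or the reader failed
    // ... hand the buffer to the rendering code here ...
    CFRelease(sampleBuffer); // copyNextSampleBuffer returns a +1 reference
}
if (_assetReader.status == AVAssetReaderStatusFailed) {
    NSLog(@"Reader failed: %@", _assetReader.error);
}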
