I am making a video app in which I create a new video using AVAssetExportSession. While the video is being created, I want to give the user the ability to cancel the export. The problem is that I do not know how to send a cancellation request to AVAssetExportSession, as I assume it is running on the main thread. Once the export starts, how can I send a stop request?
I tried this, but it doesn't work:
- (IBAction)startBtn
{
    ....
    // Export
    exportSession = [[AVAssetExportSession alloc] initWithAsset:[composition copy] presetName:AVAssetExportPresetHighestQuality];
    [exportSession setOutputFileType:@"com.apple.quicktime-movie"];
    exportSession.outputURL = outputMovieURL;
    exportSession.videoComposition = mainComposition;
    //NSLog(@"Went Here 7 ...");
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status])
        {
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Cancelled ...");
                break;
            case AVAssetExportSessionStatusCompleted:
            {
                NSLog(@"Complete ... %@", outputURL); // movie url
                break;
            }
            case AVAssetExportSessionStatusFailed:
            {
                NSLog(@"Failed = %@ ...", exportSession.error);
                break;
            }
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting.....");
                break;
        }
    }];
}
- (IBAction)cancelBtn
{
    exportSession = nil;
}
You can cancel an export session by sending it the message cancelExport.
To accomplish this, you simply need an ivar (or property) that holds the currently active export session:
@property (nonatomic, strong) AVAssetExportSession *exportSession;
Initialize the property:
- (IBAction)startBtn {
    if (self.exportSession == nil) {
        self.exportSession = [[AVAssetExportSession alloc] initWithAsset:[composition copy]
                                                              presetName:AVAssetExportPresetHighestQuality];
        ...
        [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
            self.exportSession = nil;
            ....
        }];
    }
    else {
        // there is an export session already
    }
}
In order to cancel the session:
- (IBAction)cancelBtn
{
    [self.exportSession cancelExport];
    self.exportSession = nil;
}
Hint: For a better user experience, you should disable/enable the "Cancel" and "Start Export" buttons accordingly.
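As a rough sketch of how the two actions might fit together (assuming startButton and cancelButton outlets, plus the composition and outputMovieURL variables from the question, all exist in your controller):

// Sketch only: startButton/cancelButton are assumed outlets; composition and
// outputMovieURL come from the question's setup code.
- (IBAction)startBtn
{
    if (self.exportSession != nil) {
        return; // an export is already running
    }

    self.exportSession = [[AVAssetExportSession alloc] initWithAsset:[composition copy]
                                                          presetName:AVAssetExportPresetHighestQuality];
    self.exportSession.outputFileType = @"com.apple.quicktime-movie";
    self.exportSession.outputURL = outputMovieURL;

    self.startButton.enabled = NO;
    self.cancelButton.enabled = YES;

    __weak typeof(self) weakSelf = self;
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
        // The handler may run on a background queue; hop to the main queue for UI work.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (weakSelf.exportSession.status == AVAssetExportSessionStatusCancelled) {
                NSLog(@"Export cancelled by the user");
            }
            weakSelf.exportSession = nil;
            weakSelf.startButton.enabled = YES;
            weakSelf.cancelButton.enabled = NO;
        });
    }];
}

- (IBAction)cancelBtn
{
    [self.exportSession cancelExport]; // the completion handler then fires with status Cancelled
}

Because cancelExport makes the completion handler run with AVAssetExportSessionStatusCancelled, the session can be released and the buttons reset in one place.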
I successfully enabled my app to play audio and video in the background after the screen is locked. However, for a better user experience, I want to show play and pause controls for the running media on the lock screen. After following a couple of blogs online, I added the following code:
@interface MyControllerClass () <UIGestureRecognizerDelegate, UIApplicationDelegate>
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:avAsset];

    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    NSError *activationError = nil;
    BOOL success = [[AVAudioSession sharedInstance] setActive:YES error:&activationError];
}

- (void)viewWillDisappear:(BOOL)animated {
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    NSLog(@"received event %@", receivedEvent);
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause: {
                if ([self isVideoPlaying]) {
                    [self.avPlayer pause];
                } else {
                    [self.avPlayer play];
                }
                break;
            }
            case UIEventSubtypeRemoteControlPlay: {
                [self.avPlayer play];
                break;
            }
            case UIEventSubtypeRemoteControlPause: {
                [self.avPlayer pause];
                break;
            }
            default:
                break;
        }
    }
}
I added the background modes in Info.plist.
Even though I can see the control screen, my app receives no events when the buttons are tapped.
I believe I am missing out on something very obvious. Any pointers would be helpful.
EDIT 1: The accepted answer in "iOS - UIEventTypeRemoteControl events not received" says that your app must be the "Now Playing" app. How do I do this?
I found the answer to my question: to receive the events, the code from my question needs to be implemented in the AppDelegate instead of in the view controller.
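As a rough sketch of that arrangement (the player property on the AppDelegate is hypothetical; the handler itself is just the one from the question moved into the application delegate):

// Sketch only: the AppDelegate sits at the end of the responder chain, so it
// can receive remote-control events once the app has begun receiving them.
@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            [self.player play];   // self.player is a hypothetical shared AVPlayer
            break;
        case UIEventSubtypeRemoteControlPause:
            [self.player pause];
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if (self.player.rate > 0) {
                [self.player pause];
            } else {
                [self.player play];
            }
            break;
        default:
            break;
    }
}

@end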
In my app I have a timeline like Facebook's and want to implement autoplay the way Facebook does. I am able to play a video whenever the user stops scrolling on a video post inside a UITableViewCell, but it takes 5-10 seconds for the video to start playing, and that's the problem.
I need expert guidance on how to pre-buffer at least 5 seconds of video from the URLs to give the user a better experience, or another way to play videos from a URL instantly over a 3G network. A user might have 100 video posts.
I can't figure out which class buffers the video while playing: AVURLAsset, AVPlayerItem, or AVPlayer.
I am loading the AVURLAsset with loadValuesAsynchronouslyForKeys:, creating AVPlayerItems, and then saving them to an NSDictionary keyed by URL.
Below is my code. For security reasons, the URL expires after 15 minutes.
- (void)setContentURL:(NSURL *)contentURL
{
    if (contentURL)
    {
        [self.moviePlayer replaceCurrentItemWithPlayerItem:nil];
        [_activityIndicator startAnimating];
        __block AVPlayerItem *playerItem = [_appDelegate.dictAVPlayerItems objectForKey:[contentURL.absoluteString stringByAppendingString:self.postIdOrBlogId]];
        if (!playerItem)
        {
            AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
            NSArray *keys = [NSArray arrayWithObject:@"playable"];
            [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^()
            {
                NSLog(@"keys %@", keys);
                [self checkAssestStatus:asset];
                if (asset == nil) return;
                playerItem = [AVPlayerItem playerItemWithAsset:asset];
                [_appDelegate.dictAVPlayerItems setObject:playerItem forKey:[contentURL.absoluteString stringByAppendingString:self.postIdOrBlogId]];
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self addPlayerItem:playerItem isNewAsset:NO];
                });
            }];
            _contentURL = contentURL;
        }
        else
            [self addPlayerItem:playerItem isNewAsset:YES];
    }
}
- (void)checkAssestStatus:(AVURLAsset *)asset
{
    NSError *error = nil;
    AVKeyValueStatus tracksStatus = [asset statusOfValueForKey:@"playable" error:&error];
    NSLog(@"AVURLAsset error = %@", error);
    if (!asset.isPlayable)
    {
        NSLog(@"asset is not playable");
        [self.activityIndicator stopAnimating];
        return;
    }
    switch (tracksStatus) {
        case AVKeyValueStatusLoaded:
        {
            NSLog(@"loaded");
            break;
        }
        case AVKeyValueStatusFailed:
        {
            if (error && (error.code == AVErrorUnknown
                          || error.code == AVErrorFailedToLoadMediaData))
            {
                [_appDelegate.dictAVPlayerItems removeObjectForKey:[asset.URL.absoluteString stringByAppendingString:self.postIdOrBlogId]];
                NSLog(@"url expired");
                NSString *expiredURLString = asset.URL.absoluteString; // capture the URL before discarding the asset
                asset = nil;
                [CommonTimelineAPI requestFreshURLFor:expiredURLString withCompletionBlock:^(NSString *freshURL, NSError *error) {
                    if (freshURL)
                    {
                        NSURL *url = [NSURL URLWithString:freshURL];
                        if (url)
                        {
                            self.contentURL = url;
                        }
                    }
                }];
            }
            break;
        }
        case AVKeyValueStatusCancelled:
        {
            NSLog(@"cancelled");
            break;
        }
        case AVKeyValueStatusUnknown:
        {
            NSLog(@"unknown");
            break;
        }
        case AVKeyValueStatusLoading:
        {
            NSLog(@"loading");
            break;
        }
    }
}
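For the pre-buffering itself, here is a rough sketch of the idea described above: pre-creating AVPlayerItems for upcoming posts and caching them before their cells appear. preloadVideoAtURL:forKey: and itemCache are hypothetical names, not part of the code above.

// Sketch only: pre-warm an AVPlayerItem for an upcoming post so playback can
// start as soon as its cell stops scrolling. itemCache is a hypothetical
// NSMutableDictionary owned by whoever drives the timeline.
- (void)preloadVideoAtURL:(NSURL *)url forKey:(NSString *)cacheKey
{
    if ([self.itemCache objectForKey:cacheKey] != nil) {
        return; // already preloaded
    }

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset loadValuesAsynchronouslyForKeys:@[@"playable"] completionHandler:^{
        NSError *error = nil;
        if ([asset statusOfValueForKey:@"playable" error:&error] != AVKeyValueStatusLoaded) {
            NSLog(@"preload failed: %@", error);
            return;
        }
        AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.itemCache setObject:item forKey:cacheKey];
        });
    }];
}

Calling this for the next few posts while the current one is on screen removes the asset-loading delay; note that most of the actual media buffering still happens only once the item is attached to an AVPlayer.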
I am getting a duplicate peer name if I connect and disconnect Bluetooth multiple times on both of my iOS devices.
Is there any way to get a single name for each unique peer in GKPeerPickerController for a Bluetooth chat application?
I have also attached a screenshot of it.
I am using the code below to show the GKPeerPickerController.
- (IBAction)btnConnectClicked:(id)sender
{
    [self openPeerPickerController];
}

- (IBAction)btnDisconnectClicked:(id)sender
{
    [currentSession disconnectFromAllPeers];
}

- (void)openPeerPickerController
{
    if (!currentSession)
    {
        GKPeerPickerController *peerPicker2 = [[GKPeerPickerController alloc] init];
        peerPicker2.delegate = self;
        peerPicker2.connectionTypesMask = GKPeerPickerConnectionTypeNearby;
        [peerPicker2 show];
    }
}

- (void)peerPickerController:(GKPeerPickerController *)picker didConnectPeer:(NSString *)peerID toSession:(GKSession *)session
{
    NSLog(@"Peer session connected");
    // set session delegate and dismiss the picker
    session.delegate = self;
    currentSession = session;
    picker.delegate = nil;
    [picker dismiss];
}

- (GKSession *)peerPickerController:(GKPeerPickerController *)picker sessionForConnectionType:(GKPeerPickerConnectionType)type
{
    // create ID for session
    NSString *sessionIDString = @"MTBluetoothSessionID";
    // create GKSession object
    GKSession *session = [[GKSession alloc] initWithSessionID:sessionIDString displayName:nil sessionMode:GKSessionModePeer];
    return session;
}

- (void)peerPickerControllerDidCancel:(GKPeerPickerController *)picker
{
    NSLog(@"Peer cancelled");
    [currentSession disconnectFromAllPeers];
    currentSession = nil;
    picker.delegate = nil;
}
- (void)session:(GKSession *)session peer:(NSString *)peerID didChangeState:(GKPeerConnectionState)state
{
    switch (state)
    {
        case GKPeerStateAvailable:
        {
            // not connected to session, but available for connectToPeer:withTimeout:
            break;
        }
        case GKPeerStateUnavailable:
        {
            // no longer available
            break;
        }
        case GKPeerStateConnected:
        {
            // connected to the session
            [currentSession setDataReceiveHandler:self withContext:nil]; // set ViewController to receive data
            break;
        }
        case GKPeerStateDisconnected:
        {
            // disconnected from the session
            currentSession.delegate = nil;
            currentSession = nil; // allow session to reconnect if it gets disconnected
            break;
        }
        case GKPeerStateConnecting:
        {
            // waiting for accept, or deny response
            break;
        }
        default:
            break;
    }
}
Your GKPeerPickerControllerDelegate method
- (GKSession *)peerPickerController:(GKPeerPickerController *)picker sessionForConnectionType:(GKPeerPickerConnectionType)type
returns a new session every time. In your case it is being called twice, and hence two sessions are created.
From the documentation:
When the peer picker needs a session, it calls this method. Your application can either create a new session or return a previously created session to the peer picker.
So you can declare the session as a property, write a lazy getter, and simply return that property from the delegate method, which avoids creating multiple sessions:
@property (nonatomic, strong) GKSession *session;

#define sessionIDString @"MTBluetoothSessionID"

- (GKSession *)session {
    if (!_session) {
        // create the GKSession object only once
        _session = [[GKSession alloc] initWithSessionID:sessionIDString displayName:nil sessionMode:GKSessionModePeer];
    }
    return _session;
}
And change the delegate method to:
- (GKSession *)peerPickerController:(GKPeerPickerController *)picker sessionForConnectionType:(GKPeerPickerConnectionType)type {
    return self.session;
}
Make sure to nullify the session when you're done.
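For example, building on the didChangeState: handler already in the question, a sketch of where that might happen (assuming the session now lives in the self.session property):

// Sketch only: clear the lazily created session once the peer disconnects,
// so the next connection starts from a clean state.
- (void)session:(GKSession *)session peer:(NSString *)peerID didChangeState:(GKPeerConnectionState)state
{
    if (state == GKPeerStateDisconnected)
    {
        self.session.delegate = nil;
        self.session = nil;
    }
}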
This piece of code works fine on the simulator. However, when I try to run the export on my iPad, it always hangs at a progress value of roughly 0.14583. Can somebody help me figure out why? I've been stuck on this for quite a while.
Here is my code:
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality]) {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                           initWithAsset:composition presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = [NSURL fileURLWithPath:[[ShowDAO getUserDocumentDir] stringByAppendingString:exportFilename]];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    CMTime start = CMTimeMakeWithSeconds(0, 1);
    CMTime duration = CMTimeMakeWithSeconds(1000, 1);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status]) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Completed");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export cancelled");
                break;
            default:
                break;
        }
    }];

    while (exportSession.progress != 1.0) {
        NSLog(@"loading... : %f", exportSession.progress);
        sleep(1);
    }
    [exportSession release];
}
while (exportSession.progress != 1.0) {
    NSLog(@"loading... : %f", exportSession.progress);
    sleep(1);
}
This while loop is blocking the main thread, which is likely why the export appears to hang; the NSLog may not be able to fire properly either. Try it without the while loop.
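If you still want the progress logging, a non-blocking alternative is to poll from a timer on the main run loop instead. A sketch, assuming the session is kept in a self.exportSession property; call startPollingExportProgress right after exportAsynchronouslyWithCompletionHandler::

// Sketch only: poll progress from the main run loop instead of blocking it.
// Assumes the controller keeps the session in a self.exportSession property.
- (void)startPollingExportProgress
{
    [NSTimer scheduledTimerWithTimeInterval:1.0
                                     target:self
                                   selector:@selector(logExportProgress:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)logExportProgress:(NSTimer *)timer
{
    NSLog(@"loading... : %f", self.exportSession.progress);
    if (self.exportSession.status != AVAssetExportSessionStatusExporting &&
        self.exportSession.status != AVAssetExportSessionStatusWaiting) {
        [timer invalidate]; // stop once the export completes, fails, or is cancelled
    }
}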
I am using AVAssetExportSession to export audio files. It is working, though not at a speed that is practical for use. I am setting up my exporter, getting my AVAsset, and starting the export. Here is the code; any suggestions or insight would help.
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"we are now exporting");
    int exportStatus = exporter.status;
    switch (exportStatus) {
        case AVAssetExportSessionStatusFailed: {
            // log error to text view
            NSError *exportError = exporter.error;
            NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"AVAssetExportSessionStatusCompleted");
            // set up AVPlayer
            NSData *data = [NSData dataWithContentsOfURL:exportURL];
            break;
        }
        case AVAssetExportSessionStatusUnknown:   { NSLog(@"AVAssetExportSessionStatusUnknown"); break; }
        case AVAssetExportSessionStatusExporting: { NSLog(@"AVAssetExportSessionStatusExporting"); break; }
        case AVAssetExportSessionStatusCancelled: { NSLog(@"AVAssetExportSessionStatusCancelled"); break; }
        case AVAssetExportSessionStatusWaiting:   { NSLog(@"AVAssetExportSessionStatusWaiting"); break; }
        default: { NSLog(@"didn't get export status"); break; }
    }
    [exporter release];
    [exportURL release];
}];
You're probably causing some kind of conversion, and that will be slow (not that much faster than realtime). Make sure you're using the passthrough preset, AVAssetExportPresetPassthrough.
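If the source and destination container formats are compatible, the setup might look like this ARC-style sketch; asset and outputURL stand in for whatever your exporter currently uses:

// Sketch only: passthrough copies the media samples without re-encoding,
// which is usually far faster than a conversion preset.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetPassthrough];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;

[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"passthrough export finished");
    } else {
        NSLog(@"passthrough export ended with status %ld, error %@",
              (long)exporter.status, exporter.error);
    }
}];

If passthrough isn't compatible with your asset (check supportedFileTypes or exportPresetsCompatibleWithAsset:), some re-encoding is unavoidable and the export will take closer to realtime.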