I am using AVPlayer to stream audio. Before beginning to play from a URL, I call
[self.currentlyLoadingAsset loadValuesAsynchronouslyForKeys:@[@"playable", @"duration"] completionHandler:^{
    [self evaluateAssetStatusForEpisode:episode asset:self.currentlyLoadingAsset];
}];
Only when that completionHandler executes will I reevaluate the item to see if it's ready to play.
But for one user, the completionHandler has suddenly stopped firing. He cannot stream audio from the web (but can still play downloaded tracks). Logging
AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
, I find that the status is perpetually AVKeyValueStatusLoading. It never updates, and no error is ever returned.
I have tried putting loadValuesAsynchronouslyForKeys on the main thread, removing the duration key, and making currentlyLoadingAsset a property to ensure it's not getting released.
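For reference, the evaluation step looks roughly like this (a simplified sketch; the episode handling is elided and Episode stands in for my model class):
- (void)evaluateAssetStatusForEpisode:(Episode *)episode asset:(AVURLAsset *)asset {
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            // Ready: build the AVPlayerItem and start playback.
            break;
        case AVKeyValueStatusFailed:
            NSLog(@"playable failed to load: %@", error);
            break;
        default:
            // For the affected user this stays AVKeyValueStatusLoading forever.
            break;
    }
}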
I was trying to play a haptic "AHAP" pattern from a file with the following code:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
The haptic does play successfully, but I have an issue where, whenever the pattern is played, the first key press on the keyboard makes a very loud flick sound (the default iOS flick sound) and then goes back to normal on the second press. I thought it was because the engine was still active, so I called
[engine stopWithCompletionHandler:nil];
but then the haptic doesn't play anymore (the flick sound, however, is normal on the first key press). playPatternFromURL:error: is supposed to play synchronously, which means it should finish playing before stopWithCompletionHandler: executes (per the Apple documentation). I honestly have no idea why and how this happens. CoreHaptics is rarely seen implemented in the wild or on GitHub outside the official Apple documentation, so I have no useful references (except maybe this one on GitHub).
Any idea on this particular issue? Thanks in advance.
EDIT:
For future readers: I managed to mitigate this issue by playing it on another thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
});
Perhaps this is because the framework is beta software as of this writing.
EDIT 2:
The above mitigation, however, doesn't solve it if your pattern contains CHHapticEventTypeAudioCustom events.
I managed to solve it using the code below:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
[engine notifyWhenPlayersFinished:^CHHapticEngineFinishedAction(NSError * _Nullable error) {
    [engine stopWithCompletionHandler:nil];
    return CHHapticEngineFinishedActionStopEngine;
}];
It seems I need to observe when the pattern has stopped playing and then stop the engine (so it's not the framework being beta software, my bad). However, for the method playPatternFromURL:error:, quoting from the Apple documentation:
This method blocks processing on the current thread until the pattern
has finished playing.
doesn't seem to mean what it says, at least to my understanding. That's why calling stopWithCompletionHandler: right after playPatternFromURL:error: failed to trigger any haptics.
Solution:
engine.playsHapticsOnly = YES;
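For context, a minimal sketch of where that property is set, assuming the same engine setup as above (the pattern path is illustrative):
CHHapticEngine *engine = [[CHHapticEngine alloc] initAndReturnError:nil];
// Restrict the engine to haptic playback; audio events in a pattern are ignored.
engine.playsHapticsOnly = YES;
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];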
I have an app that needs to preload a bunch of streamed videos as soon as possible so that they play instantly when the user clicks on them.
I am able to achieve this with a collection of AVPlayer objects, initialized right when the app is launched:
-(void)preloadVideos {
    for (Video* video in arrayOfVideos) {
        NSString *streamingURL = [NSString stringWithFormat:@"https://mywebsite.com/%@.m3u8", video.fileName];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:streamingURL] options:nil];
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        pthread_mutex_lock(&mutex_videoPlayers);
        [_videoPlayers setObject:player forKey:videoKey];
        pthread_mutex_unlock(&mutex_videoPlayers);
    }
}
The lock is defined in init as:
pthread_mutex_init(&mutex_videoPlayers, NULL);
My problem is that when I invoke this function, the app freezes for about 1 minute, then continues on with no problem. This is obviously because there is a lot of processing going on: according to the debug dashboard in Xcode, CPU usage spikes to about 67% during the freeze.
So I thought I could solve this by putting the operation into a background thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    [self preloadVideos];
});
but the app still froze briefly in exactly the same way, and CPU usage showed the same pattern. I thought maybe it's because the task is too intensive and needs to be broken up into smaller tasks, so I tried serializing the loop as distinct tasks:
preloadQueue = dispatch_queue_create("preloadQueue", NULL);
...
-(void)preloadVideos {
    for (Video* video in arrayOfVideos) {
        dispatch_async(preloadQueue, ^(void){
            [self preloadVideo:video]; // a new function with the logic above
        });
    }
}
but that seemed to make the freeze period longer, even though max CPU usage went down to 48%.
Am I missing something with these GCD functions? Why does the AVPlayer creation block the main thread when put into background tasks?
I know it's not that there are too many AVPlayers created, because there are only 6 of them, and the app runs fine after they are created.
After adding log messages, I noticed that (in all implementations) the setObject call runs for every single video player before the interface's viewDidAppear method is called. Also, 5 of the videos load instantly, and the last one (a longer video) takes a while; the freeze ends right when it completes.
Why is the app waiting for background tasks to finish before updating the views?
Update:
The app accesses videoPlayers while these tasks are running, but since I use a lock while writing, I don't lock while reading. Here is the definition:
@property (atomic, retain) NSMutableDictionary *videoPlayers;
Update: updated preloadVideos with mutex locks, still seeing the freezing
Turns out the background thread was locking a resource that the main thread was accessing elsewhere. The main thread had to wait for the resource to be freed, which caused the interface to freeze.
Your dispatch_async code should not be freezing the main thread. That should be creating the asset objects in the background. It will take time before the assets become available, but that should be ok.
What do you mean "...the app still froze briefly..." Froze how? And for how long?
How are you using the _videoPlayers dictionary once you've loaded it? What are you doing to handle the fact that it may only be partially loaded? (If you are looping through _videoPlayers when it gets written to from the background, you may crash.) At the very least you should make videoPlayers an atomic property of your class and always reference it (read and write) using property notation (self.videoPlayers or [self videoPlayers], never _videoPlayers). You will probably need better protection than that, like using @synchronized for the code that accesses the dictionary.
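For example, a minimal sketch of the @synchronized approach (videoKey stands for whatever key you already use):
// Writing, on the background queue:
@synchronized (self.videoPlayers) {
    [self.videoPlayers setObject:player forKey:videoKey];
}

// Reading, on the main thread or elsewhere:
AVPlayer *player;
@synchronized (self.videoPlayers) {
    player = [self.videoPlayers objectForKey:videoKey];
}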
Quoting from the Xcode documentation:
AVAudioSessionSilenceSecondaryAudioHintNotification Posted on the main
thread when the primary audio from other applications starts and
stops.
Subscribe to this notification to ensure that your app is notified
when optional secondary audio muting should begin or end.
However, when my app's audio is playing and I press the remote control to start playing music from the Music app, this notification is not triggered in my observer callback. I believe the registration was successful.
Am I having the wrong expectation? Is it supposed to be triggered in a different scenario? Any examples?
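For reference, a rough sketch of the kind of registration I mean (method and selector names are illustrative):
- (void)subscribeToSecondaryAudioHint {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleSecondaryAudioHint:)
                                                 name:AVAudioSessionSilenceSecondaryAudioHintNotification
                                               object:[AVAudioSession sharedInstance]];
}

- (void)handleSecondaryAudioHint:(NSNotification *)note {
    NSUInteger type = [note.userInfo[AVAudioSessionSilenceSecondaryAudioHintTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionSilenceSecondaryAudioHintTypeBegin) {
        // Another app's primary audio started: mute or duck my secondary audio.
    } else {
        // AVAudioSessionSilenceSecondaryAudioHintTypeEnd: resume my secondary audio.
    }
}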
You need to set your AVAudioSession category to AVAudioSessionCategoryAmbient; then your app will allow background apps (such as Music or Podcasts) to keep playing. If you set it to AVAudioSessionCategorySoloAmbient, it will not allow background music.
You can do it like this:
NSError *categoryError = nil;
if ([[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&categoryError]) {
    printf("Setting AVAudioSession CategoryAmbient Succeeded\n");
} else {
    printf("Setting AVAudioSession CategoryAmbient Failed\n");
}
Check out Audio Session Categories for more details
Can you point me to design pattern guides to adapt my style to AVFoundation's async approach?
I'm working on an app where you create an image and place audio onto hotspots on it. I'm implementing export to a movie that shows the image, with effects (the glow of a hotspot), playing under the audio.
I can reliably create the video and audio tracks, and I can correctly get audio into an AVMutableComposition and play it back. The problem is with the video. I've narrowed it down to my having written a synchronous solution to a problem that requires AVFoundation's asynchronous writing methods.
The current approach and where it fails (each step is own method):
1. Create an array of dictionaries, with 2 objects per dictionary: one object is an image representing a keyframe, the other is the URL of the audio that ends on that keyframe. The first dictionary has the start keyframe but no audio URL.
2. For each dictionary in the array, replace the UIImage with an array of images (start image -> animation tween images -> end-state image), with the proper count for the FPS and the duration of the audio.
3. For each dictionary in the array, convert the image array into a soundless mp4 and save it using [AVAssetWriter finishWritingWithCompletionHandler:], then replace the image array in the dictionary with the URL of the mp4. Each dictionary of mp4 and audio URL represents a segment of the final movie, where the order of the dictionaries in the array dictates the insert order for the final movie.
-- all of the above works; everything gets made and ordered right, and the videos and audio play back --
4. For each dictionary with an mp4 and audio URL, load them into AVAssets and insert them into an AVMutableComposition, one track for audio and one for video. The audio load and insert works and plays back. But the video fails, and it appears to fail because step 4 starts before step 3's finishWritingWithCompletionHandler: finishes for all of the mp4 tracks.
One approach would be to pause via a while loop and wait for the AVAssetWriter's status to say it's done. This smacks of working against the framework. In practice it also leads to ugly and sometimes seemingly infinite waits for the loops to end.
But simply making step 4 the completion handler for finishWritingWithCompletionHandler: is non-trivial, because I am writing multiple tracks and I want step 4 to launch only after the last track is written. Because step 3 is basically a for-each processor, I think all the completion handlers would need to be the same. I guess I could use bools or counters to vary the completion handler, but it just feels like a kluge.
If any of the above made sense, can someone give me, or point me to, a primer on design patterns for async handling like this? TIA.
You can use GCD dispatch groups for that sort of problem.
From the docs:
Grouping blocks allows for aggregate synchronization. Your application
can submit multiple blocks and track when they all complete, even
though they might run on different queues. This behavior can be
helpful when progress can’t be made until all of the specified tasks
are complete.
The basic idea is, that you call dispatch_group_enter for each of your async tasks. In the completion handler of your tasks, you call dispatch_group_leave.
Dispatch groups work similarly to counting semaphores: you increment a counter (with dispatch_group_enter) when you start a task, and you decrement it (with dispatch_group_leave) when a task finishes.
dispatch_group_notify lets you install a completion handler block for your group. This block gets executed when the counter reaches 0.
This blog post provides a good overview and a complete code sample: http://amro.co/post/48248949039/using-gcd-to-wait-on-many-tasks
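For example, a minimal sketch of that pattern in Objective-C (the per-track export method and the final assembly step are placeholders for your step 3 and step 4):
dispatch_group_t group = dispatch_group_create();

for (NSURL *trackURL in trackURLs) {                  // trackURLs is illustrative
    dispatch_group_enter(group);                      // +1 before starting the async work
    [self exportTrackAtURL:trackURL completion:^{     // stands in for your finishWritingWithCompletionHandler: call
        dispatch_group_leave(group);                  // -1 when that piece of work is done
    }];
}

// Runs once every enter has been balanced by a leave; no thread is blocked.
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    [self assembleComposition];                       // your step 4
});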
@weichsel Thank you very much. That seems like it should work. But I'm using dispatch_group_wait and it seems to not wait. I've been banging against it for several hours since you first replied, but no luck. Here's what I've done:
Added property that is a dispatch group, called videoDispatchGroup, and call dispatch_group_create in the init of the class doing the video processing
In the method that creates the video tracks, use dispatch_group_async(videoDispatchGroup, dispatch_get_global_queue( DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ [videoWriter finishWritingWithCompletionHandler:^{
The video track writing method is called from a method chaining together the various steps. In that method, after the call to write the tracks, I call dispatch_group_wait(videoProcessingGroup, DISPATCH_TIME_FOREVER);
In the dealloc, call dispatch_release(videoDispatchGroup)
That's all elided a bit, but essentially the call to dispatch_group_wait doesn't seem to be waiting. My guess is it has something to do with the dispatch_group_async call, but I'm not sure exactly what.
I've found another means of handling this, using my own int counter that I decrement via the async handler on finishWritingWithCompletionHandler:. But I'd really like to up my skills by understanding GCD better.
Here's the code -- dispatch_group_wait never seems to return, but the movies themselves get made. The code is elided a bit for brevity, but nothing relevant to the GCD behavior was removed.
@implementation MovieMaker
// This is the dispatch group
@synthesize videoProcessingGroup = _videoProcessingGroup;
-(id)init {
    self = [super init];
    if (self) {
        _videoProcessingGroup = dispatch_group_create();
    }
    return self;
}

-(void)dealloc {
    dispatch_release(self.videoProcessingGroup);
}
-(id)convert:(MTCanvasElementViewController *)sourceObject {
    // code fails in same way with or without this line
    dispatch_group_enter(self.videoProcessingGroup);
    // This method works its way down to writeImageArrayToMovie
    _tracksData = [self collectTracks:sourceObject];
    NSString *fileName = @"";
    // The following seems to never stop waiting, the movies themselves get made though
    // Wait until dispatch group finishes processing temp tracks
    dispatch_group_wait(self.videoProcessingGroup, DISPATCH_TIME_FOREVER);
    // never gets to here
    fileName = [self writeTracksToMovie:_tracksData];
    // Wait until dispatch group finishes processing final track
    dispatch_group_wait(self.videoProcessingGroup, DISPATCH_TIME_FOREVER);
    return fileName;
}
// @param videoFrames should be NSArray of UIImage, all of same size
// @return path to temp file
-(NSString *)writeImageArrayToMovie:(NSArray *)videoFrames usingDispatchGroup:(dispatch_group_t)dispatchGroup {
    // elided a bunch of stuff, but it all works
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:result]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    // elided stuff

    // Finish the session:
    [writerInput markAsFinished];
    dispatch_group_async(dispatchGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [videoWriter finishWritingWithCompletionHandler:^{
            dispatch_group_leave(dispatchGroup);
            // not sure I ever get here? NSLogs don't write out.
            CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
        }];
    });
    return result;
}
I'm working on a video editing application for iPhone/iPod touch. My app simply asks the user to choose one of the already existing videos in the Camera Roll directory, then frame by frame changes pixel values and saves it again under a different name in the Camera Roll directory. Because video processing might take quite a long time, I really need to implement some way to resume a previously started session (i.e. if video processing has reached 30% of the total video length and the user presses the home button, or there is a phone call, then when my application is brought back to the foreground, video processing should resume from 30%, not from the beginning).
The most important parts of my code (simplified a bit to be more readable):
AVAssetReader* assetReader = [[AVAssetReader alloc] initWithAsset:sourceAsset error:&error];
NSArray* videoTracks = [sourceAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack* videoTrack = [videoTracks objectAtIndex:0];
AVAssetReaderTrackOutput* assetReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack
                                                                               outputSettings:nil]; // output settings elided
[assetReader addOutput:assetReaderOutput];

[mVideoWriter startWriting];
[mAssetReader startReading];
[mVideoWriter startSessionAtSourceTime:kCMTimeZero];

mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
    while (mWriterInput.readyForMoreMediaData) {
        CMSampleBufferRef nextBuffer = [mAssetReaderOutput copyNextSampleBuffer];
        if (nextBuffer) {
            // frame processing here
        } // if
    } // while
}]; // block
// when done saving my video to camera roll directory
I'm listening to my app delegate's callback methods like applicationWillResignActive and applicationWillEnterForeground, but I'm not sure how to handle them in the proper way. What I've tried already:
1) [AVAssetWriter start/endSessionAtSourceTime:]; unfortunately, stopping and restarting at the last frame's presentation time did not work for me
2) saving "by hand" every part of the movie that was processed and, when processing reaches 100%, merging all of them using AVMutableComposition; this solution, however, sometimes crashes in the dispatch queue
It would be really great if someone could give me some hints on how this should be done correctly...
I'm pretty sure AVAssetWriter can't append, so something along the lines of 2), saving the pieces then merging them, is probably the best solution if you must make your export restartable.
But first you have to resolve that crash.
Also, before you start creating hundreds of movie pieces, you should have a look at AVAssetWriter.movieFragmentInterval, as with careful management of presentation timestamps/durations you may be able to use it to minimise the number of pieces you have to merge.
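For illustration, a sketch of how that property might be set (the output URL and interval are placeholders; note that movie fragments apply to QuickTime movie output):
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
// Periodically write out movie fragments so a partially written file stays usable.
writer.movieFragmentInterval = CMTimeMakeWithSeconds(5.0, 600);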
Have you tried -[UIApplication beginBackgroundTaskWithExpirationHandler:]? This seems like a good place to request extra operating time in the background to finish the heavy lifting.
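A rough sketch of that call (the bookkeeping around your export is up to you):
__block UIBackgroundTaskIdentifier taskID =
    [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        // Time is about to run out: persist progress here so the export can resume later.
        [[UIApplication sharedApplication] endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    }];

// ... kick off (or continue) the export work here ...

// When the export finishes normally:
[[UIApplication sharedApplication] endBackgroundTask:taskID];
taskID = UIBackgroundTaskInvalid;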