CHHapticEngine: Properly stop engine after playback - iOS

I was trying to play a haptic "AHAP" pattern from a file with the following code:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
The haptic does play successfully, but I have an issue: whenever a pattern is played, the first key press on the keyboard makes a very loud flick sound (the default iOS flick sound), and it goes back to normal on the second press. I thought it was because the engine was still active, so I called
[engine stopWithCompletionHandler:nil];
but then the haptic doesn't play anymore (although the flick sound is normal on the first key press). playPatternFromURL:error: is supposed to play synchronously, which means it should finish playing before stopWithCompletionHandler: executes (per the Apple documentation). I honestly have no idea why or how this happens. Core Haptics is rarely seen implemented in the wild or on GitHub outside the official Apple documentation, so I have no useful references (maybe except this on GitHub).
Any idea on this particular issue? Thanks in advance.
EDIT:
For future readers, I managed to mitigate this issue by playing it on another thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
});
Perhaps this is due to the framework being beta software as of writing.
EDIT 2:
The above mitigation, however, doesn't help if your pattern contains CHHapticEventTypeAudioCustom events.

I managed to solve it using the code below:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
[engine notifyWhenPlayersFinished:^CHHapticEngineFinishedAction(NSError * _Nullable error) {
    [engine stopWithCompletionHandler:nil];
    return CHHapticEngineFinishedActionStopEngine;
}];
It seems I need to observe when the pattern has stopped playing and only then stop the engine (so it was not due to the framework being beta software, my bad). However, for the method playPatternFromURL:error:, the following quote from the Apple documentation:
This method blocks processing on the current thread until the pattern
has finished playing.
doesn't seem to mean what I thought it meant, at least to my understanding. That's why calling stopWithCompletionHandler: right after playPatternFromURL:error: failed to trigger any haptics.

Solution:
engine.playsHapticsOnly = YES;
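For context, a minimal sketch of where that property would sit relative to the code above (this is my own illustration, not verbatim from the question; the path is a placeholder). playsHapticsOnly makes the engine ignore audio events in a pattern, and it should be set before the engine is started:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
// Ignore audio events in patterns; set this before starting the engine.
engine.playsHapticsOnly = YES;
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];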

Related

CallKit: No sound when I use WebRTC

Our project uses WebRTC for VoIP calls, and it worked fine before we integrated the CallKit framework. Once I added CallKit, there were calls where neither side could hear the other's speech. When I removed CallKit, everything returned to normal.
The CallKit answer button performs the same function as the original answer button in the project.
What puzzled me is that the audio doesn't always fail: sometimes everything is normal, but more often than not there is a problem.
I found a flowchart of the expected call setup, and I suspect the problem lies in the order of function calls, but I do not know how the WebRTC calls correspond to the functions in the diagram.
In addition, I am curious whether socket instability can cause the CallKit framework to work abnormally.
Please forgive my English. This problem has haunted me for several days, and I do not know exactly where the problem is or whether something conflicts with the CallKit framework.
Hope you can help me, thank you very much!
A few steps are needed to connect WebRTC and CallKit properly:
First of all, you should use RTCAudioTrack and let RTCAudioSession handle the audio. The old legacy approach of adding the audio session directly to RTCPeerConnection works, but it is not the preferred way to do it.
The second thing is to use manual audio. When the app boots, set the useManualAudio flag on RTCAudioSession:
[RTCAudioSession sharedInstance].useManualAudio = YES;
which gives you the possibility of postponing the audio until CallKit informs you that the audio session has been activated, so inside your CXProviderDelegate you should implement the following method:
- (void)provider:(CXProvider *)provider didActivateAudioSession:(AVAudioSession *)audioSession {
    [[RTCAudioSession sharedInstance] audioSessionDidActivate:audioSession];
    [RTCAudioSession sharedInstance].isAudioEnabled = YES;
}
and for the second audio delegate method don't forget to add:
- (void)provider:(CXProvider *)provider didDeactivateAudioSession:(AVAudioSession *)audioSession {
    [[RTCAudioSession sharedInstance] audioSessionDidDeactivate:audioSession];
    [RTCAudioSession sharedInstance].isAudioEnabled = NO;
}
Apple suggests that we wait until the connection is established and only then fulfill the answer action. Below is the source:
Apple Suggestion for CallKit Documentation
If the recipient of a call answers before the app establishes a connection to your server, don't fulfill the CXAnswerCallAction object sent to the provider:performAnswerCallAction: method of your delegate immediately. Instead, wait until you establish a connection and then fulfill the object. While it waits for your app to fulfill the request, the incoming call interface lets the user know that the call is connecting, but not yet ready.
So we need to wait a second or two, until the connection is up, before we fulfill the action in provider:performAnswerCallAction:.
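As an illustration of that advice (a sketch only, not the original answer's code): hold on to the answer action and fulfill it once your signaling layer reports that the connection is established. The pendingAnswerAction property and the startConnectingCall / connectionDidEstablish methods are hypothetical names.
// Hypothetical holder for the deferred action (declare in a class extension).
@property (nonatomic, strong) CXAnswerCallAction *pendingAnswerAction;

- (void)provider:(CXProvider *)provider performAnswerCallAction:(CXAnswerCallAction *)action {
    // Keep the action around instead of fulfilling it immediately.
    self.pendingAnswerAction = action;
    [self startConnectingCall]; // hypothetical: kick off WebRTC signaling here
}

// Hypothetical callback from the signaling/WebRTC layer once the connection is up.
- (void)connectionDidEstablish {
    [self.pendingAnswerAction fulfill];
    self.pendingAnswerAction = nil;
}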
In the end I solved the problem, but I still do not understand why this solves it. Below is my solution:
First of all, I delay the call to fulfill by 1 second (note that this time cannot be less than 1 second):
- (void)provider:(CXProvider *)provider performAnswerCallAction:(CXAnswerCallAction *)action {
    if (self.delegate && [self.delegate respondsToSelector:@selector(callKitManager:refreshCurrentCallStatus:)]) {
        [self.delegate callKitManager:self refreshCurrentCallStatus:EUCCallKitStatusAnswerAccept];
    }
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [action fulfill];
    });
}
Second, I also delayed my network request call by one second (here, longer than the previous one):
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    [self.peerConnection offerForConstraints:[self offerConstraintsRestartIce:NO] completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        [self peerConnection:self.peerConnection didCreateSessionDescription:sdp error:error];
    }];
});
In this way, my problem is solved.
If you know why this solves the problem, please leave a comment, thank you!

iOS interface freeze caused by background thread

I have an app that needs to preload a bunch of streamed videos as soon as possible so that they play instantly when the user clicks on them.
I am able to achieve this with a collection of AVPlayer objects, initialized right when the app is launched:
- (void)preloadVideos {
    for (Video *video in arrayOfVideos) {
        NSString *streamingURL = [NSString stringWithFormat:@"https://mywebsite.com/%@.m3u8", video.fileName];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:streamingURL] options:nil];
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        pthread_mutex_lock(&mutex_videoPlayers);
        [_videoPlayers setObject:player forKey:videoKey];
        pthread_mutex_unlock(&mutex_videoPlayers);
    }
}
The lock is defined in init as:
pthread_mutex_init(&mutex_videoPlayers, NULL);
My problem is that when I invoke this function, the app freezes for about 1 minute, then continues on with no problem. This is obviously because there is a lot of processing going on; according to the debug dashboard in Xcode, CPU usage spikes to about 67% during the freeze.
So I thought I could solve this by putting the operation into a background thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    [self preloadVideos];
});
but the app still froze briefly in exactly the same way, and CPU usage showed the same pattern. I thought maybe the task was too intensive and needed to be broken up into smaller pieces, so I tried dispatching each iteration of the loop as a distinct task:
preloadQueue = dispatch_queue_create("preloadQueue", NULL);
...
- (void)preloadVideos {
    for (Video *video in arrayOfVideos) {
        dispatch_async(preloadQueue, ^(void){
            [self preloadVideo:video]; // a new method containing the per-video logic above
        });
    }
}
but that seemed to make the freeze period longer, even though maximum CPU usage went down to 48%.
Am I missing something with these GCD functions? Why does the AVPlayer creation block the main thread when it is put into background tasks?
I know it's not that there are too many AVPlayers being created, because there are only 6 of them, and the app runs fine after they are created.
After adding log messages I noticed that (in all implementations) the setObject call runs for every single video player before the interface's viewDidAppear method is called. Also, 5 of the videos load instantly, while the last, a longer one, takes a while, and the freeze ends right when it completes.
Why is the app waiting for the background tasks to finish before updating the views?
Update:
The app accesses videoPlayers while these tasks are running; I use a lock while writing, but I don't lock while reading. Here is the definition:
@property (atomic, retain) NSMutableDictionary *videoPlayers;
Update: I updated preloadVideos with mutex locks and am still seeing the freezing.
It turned out the background thread was locking a resource that the main thread was accessing elsewhere. The main thread had to wait for the resource to be freed, which caused the interface to freeze.
Your dispatch_async code should not be freezing the main thread. It should be creating the asset objects in the background. It will take time before the assets become available, but that should be OK.
What do you mean by "...the app still froze briefly..."? Froze how? And for how long?
How are you using the _videoPlayers dictionary once you've loaded it? What are you doing to handle the fact that it may only be partially loaded? (If you loop through _videoPlayers while it is being written to from the background, you may crash.) At the very least you should make videoPlayers an atomic property of your class and always reference it (for both reads and writes) using property notation (self.videoPlayers or [self videoPlayers], never _videoPlayers). You will probably need better protection than that, such as using @synchronized in the code that accesses the dictionary.
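A minimal sketch of that suggestion, assuming the videoPlayers property from the question (the accessor method names are just illustrative):
// Writer, called from the background queue; keep the critical section small.
- (void)storePlayer:(AVPlayer *)player forKey:(NSString *)key {
    @synchronized (self.videoPlayers) {
        [self.videoPlayers setObject:player forKey:key];
    }
}

// Reader, called from the main thread, e.g. when a cell is tapped.
- (AVPlayer *)playerForKey:(NSString *)key {
    @synchronized (self.videoPlayers) {
        return [self.videoPlayers objectForKey:key];
    }
}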

AVURLAsset status stuck on AVKeyValueStatusLoading; loadValuesAsynchronouslyForKeys:completionHandler never completes

I am using AVPlayer to stream audio. Before beginning to play from a URL, I call
[self.currentlyLoadingAsset loadValuesAsynchronouslyForKeys:@[@"playable", @"duration"] completionHandler:^{
    [self evaluateAssetStatusForEpisode:episode asset:self.currentlyLoadingAsset];
}];
Only when that completionHandler executes do I reevaluate the item to see whether it's ready to play.
But for one user, the completionHandler suddenly stopped firing. He cannot stream audio from the web (but can still play downloaded tracks). Logging
AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
I find that the status is continually AVKeyValueStatusLoading. It never updates, yet no error is ever returned.
I have tried calling loadValuesAsynchronouslyForKeys on the main thread, removing the duration key, and making currentlyLoadingAsset a property to ensure it isn't getting released.
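For reference, a rough sketch of what the evaluation step mentioned above might look like; evaluateAssetStatusForEpisode:asset: is the question's own method, so this body (and the Episode type) is only an assumed illustration of checking statusOfValueForKey:error::
- (void)evaluateAssetStatusForEpisode:(Episode *)episode asset:(AVURLAsset *)asset {
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            // Safe to build the AVPlayerItem and start playback here.
            break;
        case AVKeyValueStatusFailed:
            NSLog(@"Failed to load 'playable': %@", error);
            break;
        case AVKeyValueStatusLoading:
        default:
            // Still loading; in the reported bug the status never leaves this state.
            break;
    }
}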

iOS audio system. Start & stop or just start?

I have an app where audio recording is the main and most important part. However, the user can switch to a table view controller where all recordings are displayed and no recording is performed.
The question is which approach is better: "start & stop the audio system" or "just start it". It may seem obvious that the first one is more correct, along the lines of "allocate when you need it, deallocate when you're done with it". I will lay out my thoughts on this question, and I hope to find approval or disapproval, with arguments, from more experienced people.
When I first built AudioController.m I implemented methods to open/close the audio session and to start/stop the audio unit. I wanted to stop the audio system when recording is not active. I used the following code:
- (BOOL)startAudioSystem {
    // open audio session
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    if (![audioSession setActive:YES error:&err]) {
        NSLog(@"Couldn't activate audio session: %@", err);
    }
    // start audio unit
    OSStatus status;
    status = AudioOutputUnitStart([self audioUnit]);
    BOOL noErrors = err == nil && status == noErr;
    return noErrors;
}
and
- (BOOL)stopAudioSystem {
    // stop audio unit
    BOOL result;
    result = AudioOutputUnitStop([self audioUnit]) == noErr;
    HANDLE_RESULT(result);
    // close audio session
    NSError *err;
    HANDLE_RESULT([[AVAudioSession sharedInstance] setActive:NO withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&err]);
    HANDLE_ERROR(err);
    BOOL noErrors = err == nil && result;
    return noErrors;
}
I found this approach problematic for the following reasons:
The audio system starts with a delay. That means recording_callback() is not called for some time. I suspect AudioOutputUnitStart is responsible for that; when I commented out that call and moved it to initialization, the delay was gone.
If the user switches between the recording view and the table view very quickly (so the audio system starts and stops in rapid succession), it causes the media services to die (I know that observing AVAudioSessionMediaServicesWereResetNotification could help here, but that is not the point).
To resolve these issues I modified AudioController.m with the other approach I discovered: start the audio system when the application becomes active and do not stop it until the app is terminated. In this case there are also several issues:
CPU usage
If the audio category is set to record only, no other audio can be played while the user is in the table view controller.
The first one, surprisingly, is not a big deal if you skip all processing in recording_callback() like this:
static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData) {
    AudioController *input = (__bridge AudioController *)inRefCon;
    if (!input->shouldPerformProcessing)
        return noErr;
    // processing
    // ...
    //
    return noErr;
}
By doing this, CPU usage is effectively 0% on a real device when no recording is needed and no other actions are performed.
And the second issue can be solved by switching the audio category to PlayAndRecord and enabling mixing, or by simply ignoring the problem. In my case, for example, the app requires the mini-jack to be used by an external device, so no headphones can be used in parallel anyway.
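For illustration, a minimal sketch of that category change with mixing enabled (not from the original question):
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
// PlayAndRecord keeps recording available; MixWithOthers lets other audio keep playing.
if (![session setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionMixWithOthers
                    error:&error]) {
    NSLog(@"Failed to set audio session category: %@", error);
}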
Despite all this, the first approach appeals to me more, since I like to close/clean up every stream/resource when it is no longer needed. And I want to be sure that there is indeed no other option than to just keep the audio system started. Please reassure me that I'm not the only one who came to this conclusion and that it is the correct one.
The key to solving this problem is to note that the audio system actually runs in another (real-time) thread. You can't really stop and deallocate something running in another thread exactly when you (or the app's main UI thread) "don't need it"; you have to allow time for the other thread to realize it needs to do something, finish, and clean up after itself. For audio this can potentially take up to several hundred milliseconds.
Given that, strategy 2 (just start) is safer and more realistic.
Alternatively, wait many seconds of non-use before attempting to stop the audio, and possibly add another short delay after that before attempting any restart.
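As a sketch of that delayed-stop idea (stopAudioSystem is the method from the question; the pendingStop property and the 30-second figure are assumptions of mine):
// Hypothetical holder for the scheduled stop (declare in a class extension).
@property (nonatomic, strong) dispatch_block_t pendingStop;

// Schedule a stop only after a generous period of non-use.
- (void)scheduleAudioStop {
    if (self.pendingStop) dispatch_block_cancel(self.pendingStop);
    self.pendingStop = dispatch_block_create(0, ^{
        [self stopAudioSystem];
    });
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(30 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), self.pendingStop);
}

// Call this when recording resumes so the pending stop never fires.
- (void)cancelAudioStop {
    if (self.pendingStop) {
        dispatch_block_cancel(self.pendingStop);
        self.pendingStop = nil;
    }
}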

iOS AVPlayer EXC_BAD_ACCESS when accessing properties

I'm working on a plugin for Apache Cordova that will allow audio streaming from a remote URL through the Media API.
The issue I'm experiencing is that I get an EXC_BAD_ACCESS signal whenever I try to access certain properties on the AVPlayer instance; currentTime and isPlaying are the worst offenders. The player will be playing sound through the speakers, but as soon as my code reaches player.currentTime or [player currentTime], it throws the bad access signal.
[player play];
double position = round([player duration] * 1000) / 1000;
[player currentTime]; //This will cause the signal
I am using ARC, so I'm not releasing anything that shouldn't be.
Edit:
Everything that I've done has been hacking around on the Cordova 3 CDVSound class as a proof of concept for actual streaming on iOS.
The original code can be found here: https://github.com/apache/cordova-plugin-media/tree/master/src/ios
My code can be found here:
CDVSound.h
CDVSound.m
The method startPlayingAudio will trip an EXC_BAD_ACCESS at line 346. Removing line 346 allows audio to play, but a bad access is tripped later on when getCurrentPositionAudio is called and line 532 executes.
Edit / Solution
So it turns out that the best way to handle this is to use an AVPlayerItem and then access the time via player.currentItem.currentTime. The real question then becomes: why isn't this behavior documented for AVPlayer, and why does it behave like this?
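For completeness, a minimal sketch of reading the position through the current item, which is what the fix above amounts to (the rounding mirrors the snippet earlier in the question):
#import <AVFoundation/AVFoundation.h>

// Returns the playback position in seconds, read from the current AVPlayerItem
// rather than the AVPlayer itself.
static double currentPositionSeconds(AVPlayer *player) {
    CMTime time = player.currentItem.currentTime;
    if (!CMTIME_IS_VALID(time)) {
        return 0;
    }
    return round(CMTimeGetSeconds(time) * 1000) / 1000; // rounded to millisecond precision
}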
