I want to check programmatically whether my speaker is producing sound.
Currently I play audio, record it at the same time, and check whether any sound was captured using peakPower:
recorder?.updateMeters()
let peakRecordedValue: Float = recorder?.peakPower(forChannel: 1) ?? 0.0
if peakRecordedValue <= 0 && peakRecordedValue >= -30 {
    // Speaker is working
}
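For context, the recorder is set up roughly like this (a sketch with placeholder names such as recordingURL; the important detail is that isMeteringEnabled must be true before updateMeters() and peakPower(forChannel:) report real values):
import AVFoundation

let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
try? session.setActive(true)

let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 2
]
// recordingURL is a placeholder for wherever the temporary file lives
recorder = try? AVAudioRecorder(url: recordingURL, settings: settings)
recorder?.isMeteringEnabled = true   // required for the metering calls above to return useful values
recorder?.record()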
It was working fine for me, but there is a problem: if the mic is not working, or there is some other issue with recording, no sound will be captured and I can't tell whether the problem is with the mic or the speaker.
Is there any other way to check whether sound is coming from the speakers without relying on other resources like the mic?
PS: AVAudioSessionPortBuiltInSpeaker is the current port.
I am making a chat app with WebRTC on iOS.
I set up the AVAudioSession like this:
let configuration = RTCAudioSessionConfiguration()
configuration.category = AVAudioSession.Category.playAndRecord.rawValue
configuration.categoryOptions = [.mixWithOthers]
configuration.mode = AVAudioSession.Mode.voiceChat.rawValue
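The configuration is then applied with the usual RTCAudioSession locking pattern (a sketch; the exact call site in my code may differ):
import WebRTC

let rtcSession = RTCAudioSession.sharedInstance()
rtcSession.lockForConfiguration()
do {
    try rtcSession.setConfiguration(configuration)
} catch {
    print("Failed to apply the RTC audio session configuration: \(error)")
}
rtcSession.unlockForConfiguration()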
This configuration works well: the background audio keeps playing while I use the microphone. But the outgoing stream carries only the microphone sound. I want to stream the microphone sound mixed with the background audio.
Does anybody know how to do this?
Thank you.
Context:
I am using AudioKit and GPUImage2 in my app.
I have a first screen (Record) that shows a camera preview and records it.
I have a second screen (Play) that plays the recorded movie and processes the audio (with AudioKit).
On the Record screen, GPUImage2 uses a simple AVAssetWriter to capture video and audio. The audio is recorded with this trivial code:
guard assetWriterAudioInput.isReadyForMoreMediaData else {
    return
}
if !assetWriterAudioInput.append(sampleBuffer) {
    print("Trouble appending audio sample buffer")
}
This code is called as it should and assetWriterAudioInput is not nil.
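For reference, the audio input itself is created in the usual way before the session starts (a sketch, not the exact GPUImage2 code; assetWriter stands for the AVAssetWriter instance):
let assetWriterAudioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)   // nil keeps the source format (pass-through)
assetWriterAudioInput.expectsMediaDataInRealTime = true   // required for live capture
if assetWriter.canAdd(assetWriterAudioInput) {
    assetWriter.add(assetWriterAudioInput)
}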
On the Play screen, I launch two players: a muted AVPlayer for the video and an AKAudioPlayer for the audio (to be processed), both with the same file as source.
Two notes which, if removed, do not affect my issue:
During recording, I play music on top of the user's voice, with an AKAudioPlayer.
During playback, I put the music on a mixer with the recorded voice (so 2 AKAudioPlayers into 1 output AKMixer).
My issue is:
When I record the first time, I get only the video on the Play screen; no audio has been recorded. The mixer itself works fine, since I can hear the music I add to it.
When I start again (going back to the Record screen), I record normally, and on the Play screen everything works: the video plays and voice+music is heard.
Can someone help me to resolve this?
If it helps, the audio session category I use is Playback; it never changes after launch, so it is the same for the first and subsequent recordings. The only switch is from Ambient to Playback at app launch, so that audio keeps playing through the speakers even when the physical mute switch is on.
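For completeness, the Play screen graph looks roughly like this (a sketch with placeholder names such as recordedMovieURL, recordedAudioFile and musicFile, written against the AudioKit 4-style API; my actual code differs slightly):
import AVFoundation
import AudioKit

let videoPlayer = AVPlayer(url: recordedMovieURL)   // same file as the audio source
videoPlayer.isMuted = true                          // video only; the audio goes through AudioKit

do {
    let voicePlayer = try AKAudioPlayer(file: recordedAudioFile)
    let musicPlayer = try AKAudioPlayer(file: musicFile)
    let mixer = AKMixer(voicePlayer, musicPlayer)   // 2 AKAudioPlayers into 1 output AKMixer

    AudioKit.output = mixer
    try AudioKit.start()

    videoPlayer.play()
    voicePlayer.play()
    musicPlayer.play()
} catch {
    print("AudioKit setup failed: \(error)")
}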
I'm developing a live social app that lets users text chat while they watch gameplay.
I found that AVPlayer stops the video when the user taps the voice-input (dictation) button on the keyboard.
How do I fix this issue?
This happens because, when you record your voice, the sound coming from the video would interfere with the recording (voice coming out of the phone speaker heavily distorts the sound going through the microphone). Thus the video is automatically stopped.
Please try adding the code below (you only need to set it once):
try? AVAudioSession.sharedInstance().setCategory(.ambient, mode: .default, options: [])
try? AVAudioSession.sharedInstance().setActive(true)
My solution is to add an observer for
UITextInputCurrentInputModeDidChange
I have no idea how to keep the video playing while using voice input at the same time (just like the Facebook iOS app does);
however, I can resume the video after voice input is done (just like the YouTube iOS app does):
func textInputMethodDidChange(notification: NSNotification) {
    print("textInputMethodDidChange")
    if self.textInputMode == nil || self.textInputMode?.primaryLanguage != "dictation" {
        self.player.play()
    }
}
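The observer itself is registered like this (a sketch; on recent SDKs the notification constant is spelled UITextInputMode.currentInputModeDidChangeNotification, and the handler above must be reachable by the Objective-C runtime, i.e. marked @objc):
NotificationCenter.default.addObserver(
    self,
    selector: #selector(textInputMethodDidChange(notification:)),
    name: UITextInputMode.currentInputModeDidChangeNotification,
    object: nil
)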
I have a recorded video clip that I want to play in reverse.
Playing forward is fine, but as soon as I seek to the end of the video file and set the playback rate to -1.0, the clip seeks to the end (I can see this in the timeline bar above the video) but does not play in reverse.
After I present the player view controller I check if it is ready to use:
print("readyForDisplay = \(playerViewController.readyForDisplay)")
This tells me that all is ready to prepare to play.
I check if reverse play is possible:
let reversePlay = playerViewController.player!.currentItem?.canPlayReverse
print("reversePlay = \(reversePlay)")
This returns TRUE
I then seek to the end of the clip and set the playback rate to -1.0 for reverse play.
playerViewController.player!.seekToTime(playerViewController.player!.currentItem!.asset.duration)
playerViewController.player!.rate = -1.0
I believe that, having got this far, it is ready to play, because if I add the following:
let status : AVPlayerItemStatus? = playerViewController.player!.currentItem?.status
if status == AVPlayerItemStatus.ReadyToPlay {
print("Ready to play")
}
It shows me that the clip is ready to play, so I am assuming that seeking to the end of clip (which is only 2 seconds long) has completed.
I then play the clip:
playerViewController.player!.play()
The clip plays fine if I don't seek to the end and don't attempt to change the rate for reverse play.
What am I missing here?
I decided to add some logging after launching the video and found that the rate was -1 before launching and 1 immediately after, so it seems the rate is reset to 1.
Could this be a bug?
Anyway, setting the rate immediately after the request to play the video has the desired effect.
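Putting it together, the working order of operations looks roughly like this (a sketch using the current AVFoundation spellings; the snippets above use the older ones):
let player = playerViewController.player!
let end = player.currentItem!.asset.duration

player.seek(to: end, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
    player.play()        // play() resets the rate to 1.0...
    player.rate = -1.0   // ...so set the reverse rate immediately afterwards
}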
I am writing an app that plays an audio track for the user to listen to whilst recording from the camera and microphone (using headphones).
The audio track is played using AVAudioPlayer.
The camera/microphone is recorded using AVCaptureSession.
The output uses AVCaptureMovieFileOutput.
It all works perfectly on an iPhone 5, but my iPad 4 shows an odd side effect: when playing back the recording from the iPad, you can hear the audio track as if it had also been recorded. This is all done while using headphones, and the audio is too quiet to be picked up by the microphone.
When testing on the iPhone, only the audio from the mic is recorded, as expected. Both use the same code. Any ideas would be appreciated!
In case anyone else is having the same problem: I couldn't solve it. Instead I used the setPreferredDataSource:error: method in AVAudioSession to use the device microphone instead of the one on the headset.
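A sketch of the same idea, in case it helps; this uses setPreferredInput(_:), which selects the input port and is a close relative of the data-source approach mentioned above:
let session = AVAudioSession.sharedInstance()
// Look for the built-in microphone among the available inputs and prefer it over the headset mic
if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
    try? session.setPreferredInput(builtInMic)
}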