In my application, I am using MPMusicPlayerController.systemMusicPlayer to play Apple Music songs, and that works fine. But when I then play back a Spotify track using playSpotifyURI, nothing plays. I have checked the logs, but no error shows up anywhere.
Scenario
Step 1. Play a track using playSpotifyURI. It plays fine:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Step 2. Stop the track using:
SPTAudioStreamingController.sharedInstance().setIsPlaying(false, callback: { (error) in
})
Step 3. Play an Apple Music song using MPMusicPlayerController.systemMusicPlayer:
func beginPlayback(itemID: String) {
    if musicPlayerController.playbackState == .playing {
        musicPlayerController.stop()
    }
    musicPlayerController.setQueue(with: [itemID]) // e.g. "1324456545"
    musicPlayerController.prepareToPlay { (error) in
        print("prepareToPlay----------------")
    }
    musicPlayerController.play()
}
Step 4. Stop the Apple Music song using:
if musicPlayerController.playbackState == .playing {
    musicPlayerController.stop()
}
Step 5. Play a track again using playSpotifyURI with the code below, but it does not play, and I couldn't find any error:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Is there any issue in the above code? Please help me solve this issue. Any help will be appreciated.
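One hypothesis worth testing (my assumption, not something the logs confirm): MPMusicPlayerController may leave the shared AVAudioSession deactivated or reconfigured in a way the Spotify SDK does not expect, so explicitly reactivating the session before Step 5 could help. A minimal sketch:

```swift
import AVFoundation

// Hypothetical fix: reactivate the shared audio session before handing
// playback back to the Spotify SDK in Step 5.
func resumeSpotifyPlayback(itemID: String) {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setActive(true)
    } catch {
        print("Failed to reactivate audio session: \(error)")
    }
    SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
        if let error = error {
            print("*** failed to play: \(error)")
        } else {
            print("Playing!!")
        }
    }
}
```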
AudioKit 4.9.3
iOS 11+
I am working on a project where the user records on the device using the microphone, and recording continues even when the app is in the background. This works fine, but when a phone call comes in I get an AudioKit error. I assume it has something to do with the phone taking over the mic. Here is the error:
[avae] AVAEInternal.h:109
[AVAudioEngineGraph.mm:1544:Start: (err = PerformCommand(*ioNode,
kAUStartIO, NULL, 0)): error 561017449
AudioKit+StartStop.swift:restartEngineAfterRouteChange(_:):198:error
restarting engine after route change
Basically, everything that I have recorded up until that point is lost.
Here is my AudioKit setup code:
func configureAudioKit() {
    AKSettings.audioInputEnabled = true
    AKSettings.defaultToSpeaker = true
    do {
        try audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: AVAudioSession.CategoryOptions.mixWithOthers)
        try audioSession.setActive(true)
        audioSession.requestRecordPermission({ allowed in
            DispatchQueue.main.async {
                if allowed {
                    print("Audio recording session allowed")
                    self.configureAudioKitSession()
                } else {
                    print("Audio recording session not allowed")
                }
            }
        })
    } catch let error {
        print("Audio recording session not allowed: \(error.localizedDescription)")
    }
}
func configureAudioKitSession() {
    isMicPresent = AVAudioSession.sharedInstance().isInputAvailable
    if !isMicPresent {
        return
    }
    print("mic present and configuring audio session")
    mic = AKMicrophone()
    do {
        let _ = try AKNodeRecorder(node: mic)
        let recorderGain = AKBooster(mic, gain: 0)
        AudioKit.output = recorderGain
        //try AudioKit.start()
    } catch let error {
        print("configure audioKit error: ", error)
    }
}
And the code that runs when tapping the record button:
do {
    audioRecorder = try AVAudioRecorder(url: actualRecordingPath, settings: audioSettings)
    audioRecorder?.record()
    //print("Recording: \(isRecording)")
    do {
        try AudioKit.start()
    } catch let error {
        print("Cannot start AudioKit", error.localizedDescription)
    }
}
Current audio settings:
private let audioSettings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
]
What can I do to ensure that I can get a proper recording, even when receiving a phone call? The error happens as soon as you receive the call - whether you choose to answer it or decline.
Any thoughts?
I've done work in this area; I'm afraid you cannot access the microphone(s) while a phone call or a VoIP call is in progress.
This is a basic privacy measure that is enforced by iOS for self-evident reasons.
AudioKit handles only the basic route change handling for an audio playback app. We've found that when an app becomes sufficiently complex, the framework can't effectively predetermine the appropriate course of action when interruptions occur. So, I would suggest turning off AudioKit's route change handling and responding to the notifications yourself.
Also, I would suggest putting the AudioKit activation code behind a button.
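A sketch of handling the interruption yourself. The `AKSettings.enableRouteChangeHandling` flag name is my assumption for AudioKit 4.x; check the version you use. The idea is to stop the recorder cleanly when the call begins, so the audio captured so far is preserved on disk:

```swift
import AVFoundation

// Assumed AudioKit 4.x flag that disables the built-in route change handling.
AKSettings.enableRouteChangeHandling = false

NotificationCenter.default.addObserver(forName: AVAudioSession.interruptionNotification,
                                       object: AVAudioSession.sharedInstance(),
                                       queue: .main) { notification in
    guard let info = notification.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }
    switch type {
    case .began:
        // Phone call started: stop the AVAudioRecorder so the file
        // written so far is finalized rather than lost.
        audioRecorder?.stop()
    case .ended:
        let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        if AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
            // Restart the engine and begin a new recording segment.
            try? AudioKit.start()
        }
    default:
        break
    }
}
```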
My app allows use of HFP (Hands-Free Profile) for its "spoken" prompts (like a navigation app).
The function below, which sets up audio before TextToSpeech or AVAudioPlayer, has worked fairly well since iOS 9.x.
I didn't test it against a running podcast very often, so I'm not sure when things broke. The function below works perfectly if music is streaming from the phone over Bluetooth A2DP, or if music is playing on the FM radio in the car (i.e. it will interrupt the radio, play the prompt, and then resume the radio). But it will NOT work if I'm streaming a podcast: the podcast pauses, silence comes out for the prompt, and then the podcast resumes.
I recently checked Waze, Google Maps and Apple Maps (all of which also offer this HFP option).
Waze is broken (but again, I don't test against podcasts often).
Google Maps still works perfectly.
Apple Maps is just weird: the option for HFP is greyed out while streaming, and when it tries to pause and play it also fails.
But again, Google Maps works, so it can be done.
When I call setPreferredInput with the HFP route, my route change handler (also shown below) is NOT called if a podcast is streaming. If music is streaming, my route change handler is called and audio from my app comes over HFP correctly.
Background or Foreground doesn't matter.
Any suggestions to solve would be greatly appreciated.
func setupSound(_ activate: Bool)
{
    if !Settings.sharedInstance.soundOn && !Settings.sharedInstance.voiceOn
    {
        return
    }
    var avopts: AVAudioSessionCategoryOptions = [
        .mixWithOthers,
        .duckOthers,
        .interruptSpokenAudioAndMixWithOthers,
        .allowBluetooth
    ]
    if #available(iOS 10.0, *)
    {
        avopts.insert(AVAudioSessionCategoryOptions.allowBluetoothA2DP)
        avopts.insert(AVAudioSessionCategoryOptions.allowAirPlay)
    }
    var HFP = false
    if Settings.sharedInstance.HFPOn && callCenter.currentCalls == nil
    {
        do
        {
            if #available(iOS 10.0, *)
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, mode: AVAudioSessionModeSpokenAudio, options: avopts)
            }
            else
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: avopts)
                try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeSpokenAudio)
            }
        }
        catch
        {
            Settings.vprint("Failed Setup HFP Bluetooth Audio, ", error.localizedDescription)
        }
        if let inputs = AVAudioSession.sharedInstance().availableInputs
        {
            for route in inputs
            {
                if route.portType == AVAudioSessionPortBluetoothHFP
                {
                    do
                    {
                        try AVAudioSession.sharedInstance().setPreferredInput(route)
                        // Only mark HFP as active once setPreferredInput succeeds.
                        HFP = true
                        break
                    }
                    catch
                    {
                        Settings.vprint("Failed Set Route HFP Bluetooth Audio, ", error.localizedDescription)
                    }
                }
            }
        }
    }
    lastHFPStatus = HFP
    if !HFP
    {
        var avopts: AVAudioSessionCategoryOptions = [
            .mixWithOthers
        ]
        if Settings.sharedInstance.duckingSoundOn
        {
            avopts.insert(.duckOthers)
            avopts.insert(.interruptSpokenAudioAndMixWithOthers)
        }
        if Settings.sharedInstance.speakerOnlyOn
        {
            avopts.insert(.defaultToSpeaker)
            do
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: avopts)
            }
            catch
            {
                Settings.vprint("Failed setCategory, ", error.localizedDescription)
            }
        }
        else
        {
            do
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: avopts)
            }
            catch
            {
                Settings.vprint("Failed setCategory, ", error.localizedDescription)
            }
        }
    }
    do
    {
        try AVAudioSession.sharedInstance().setActive(activate, with: .notifyOthersOnDeactivation)
        if ((Settings.sharedInstance.debugMask & 2) != 0)
        {
            Settings.vprint(activate)
        }
    }
    catch
    {
        Settings.vprint("Could not setActive, ", error.localizedDescription)
    }
}
@objc func handleRouteChange(notification: Notification)
{
    Settings.vprint(notification)
    let currentRoute = AVAudioSession.sharedInstance().currentRoute
    for route in currentRoute.outputs
    {
        Settings.vprint("Change Output: ", route.portName)
    }
    for route in currentRoute.inputs
    {
        Settings.vprint("Change Input: ", route.portName)
    }
}
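For reference, the handler above is assumed to be registered roughly like this (a sketch; observer lifetime management is up to your app):

```swift
// Sketch: register for route change notifications so
// handleRouteChange(notification:) fires when the audio route changes.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleRouteChange(notification:)),
                                       name: AVAudioSession.routeChangeNotification,
                                       object: AVAudioSession.sharedInstance())
```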
I am having an issue with the music player: most of the songs give the error
Error Domain=MPErrorDomain Code=4
The testing device has an Apple Music subscription, and the tracks that give this error in my app play fine in the Apple Music app!
Here is the code:
let applicationMusicPlayer = MPMusicPlayerController.systemMusicPlayer()
applicationMusicPlayer.setQueueWithStoreIDs([ID])
if #available(iOS 10.1, *)
{
    applicationMusicPlayer.prepareToPlay { (error) in
        if (error != nil)
        {
            print("[MUSIC PLAYER] Error preparing : \(String(describing: error))")
            return
        }
        else
        {
            self.start_timer()
            self.applicationMusicPlayer.play()
        }
    }
}
else
{
    // Play directly on iOS versions below 10.1
    self.applicationMusicPlayer.play()
}
Here is what I've tried: when a track gives this error, I went to the Apple Music app and played it from there, and it worked. Then I came back to my app, played it from my app, and it also worked fine. So I need to go to the Apple Music app and play the tracks that aren't playing in my app in order to make them work in my app! That's so weird; any idea why?
PS: the testing device has an Apple Music subscription.
I had some similar problems when adding songs to a playlist and solved them by using:
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(5)) {
// Code
}
I would play around with waiting a bit before or after preparing.
5 seconds may be too much, but you can start from there.
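Applied to the code in the question, the delay idea might look like this (a sketch; 5 seconds is just a starting point to tune down):

```swift
// Sketch: queue the store ID, then wait before preparing and playing.
applicationMusicPlayer.setQueueWithStoreIDs([ID])
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(5)) {
    self.applicationMusicPlayer.prepareToPlay { error in
        if let error = error {
            print("[MUSIC PLAYER] Error preparing: \(error)")
        } else {
            self.applicationMusicPlayer.play()
        }
    }
}
```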
I am recording video (the user can also switch to audio only) with AVAssetWriter. I start the recording when the app is launched.
But the first frames are black (or very dark). This also happens when I switch from audio to video.
It feels like the AVAssetWriter and/or AVAssetWriterInput are not yet ready to record. How can I avoid this?
I don't know if this is useful info, but I also use a GLKView to display the video.
func start_new_record() {
    do {
        self.file_writer = try AVAssetWriter(url: self.file_url!, fileType: AVFileTypeMPEG4)
        if video_on {
            if file_writer.canAdd(video_writer) {
                file_writer.add(video_writer)
            }
        }
        if file_writer.canAdd(audio_writer) {
            file_writer.add(audio_writer)
        }
    } catch let e as NSError {
        print(e)
    }
}
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    guard is_recording else {
        return
    }
    guard CMSampleBufferDataIsReady(sampleBuffer) else {
        print("data not ready")
        return
    }
    guard let w = file_writer else {
        print("video writer nil")
        return
    }
    if w.status == .unknown && start_recording_time == nil {
        if (video_on && captureOutput == video_output) || (!video_on && captureOutput == audio_output) {
            print("START RECORDING")
            file_writer?.startWriting()
            start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            file_writer?.startSession(atSourceTime: start_recording_time!)
        } else {
            return
        }
    }
    if w.status == .failed {
        print("failed /", w.error ?? "")
        return
    }
    if captureOutput == audio_output {
        if audio_writer.isReadyForMoreMediaData {
            if !video_on || (video_on && video_written) {
                audio_writer.append(sampleBuffer)
                //print("write audio")
            }
        } else {
            print("audio writer not ready")
        }
    } else if video_output != nil && captureOutput == video_output {
        if video_writer.isReadyForMoreMediaData {
            video_writer.append(sampleBuffer)
            if !video_written {
                print("added 1st video frame")
                video_written = true
            }
        } else {
            print("video writer not ready")
        }
    }
}
SWIFT 4
SOLUTION #1:
I resolved this by calling file_writer?.startWriting() as soon as possible after launching the app. Then, when you want to start recording, call file_writer?.startSession(atSourceTime:).
When you are done recording, call finishWriting; once its completion callback fires, set up a new writing session again.
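A sketch of this approach (names like `file_writer` follow the question's code; error handling omitted for brevity):

```swift
// At app launch: create the writer, add its inputs, and start writing
// immediately so the writer is warmed up before the first frame arrives.
func prepareWriter() {
    file_writer = try? AVAssetWriter(url: file_url!, fileType: AVFileTypeMPEG4)
    // ... add video/audio inputs here, as in start_new_record() ...
    file_writer?.startWriting()
}

// Later, on the first sample buffer you actually want to keep:
func beginSession(with sampleBuffer: CMSampleBuffer) {
    let ts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    file_writer?.startSession(atSourceTime: ts)
}

// When done: finish, then rebuild a fresh writer for the next take.
func endSession() {
    file_writer?.finishWriting {
        self.prepareWriter() // recreate so the next recording starts instantly
    }
}
```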
SOLUTION #2:
I resolved this by adding half a second to the starting time when calling AVAssetWriter.startSession, like this:
start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
let startingTimeDelay = CMTimeMakeWithSeconds(0.5, 1000000000)
let startTimeToUse = CMTimeAdd(start_recording_time!, startingTimeDelay)
file_writer?.startSession(atSourceTime: startTimeToUse)
SOLUTION #3:
A better solution here is to record the timestamp of the first frame you receive and decide to write, and then start your session with that. Then you don't need any delay:
// Initialization, elsewhere:
var is_session_started = false
var videoStartingTimestamp = CMTime.invalid

// In code where you receive frames that you plan to write:
if !is_session_started {
    // Start writing at the timestamp of our earliest sample
    videoStartingTimestamp = currentTimestamp
    print("First video sample received: Starting avAssetWriter Session: \(videoStartingTimestamp)")
    avAssetWriter?.startSession(atSourceTime: videoStartingTimestamp)
    is_session_started = true
}
// add the current frame
pixelBufferAdapter?.append(myPixelBuffer, withPresentationTime: currentTimestamp)
OK, stupid mistake...
When launching the app, I init my AVCaptureSession, add inputs, outputs, etc. I was just calling start_new_record a bit too soon, just before commitConfiguration() was called on my capture session.
At least my code might be useful to some people.
This is for future users: none of the above worked for me, and then I tried changing the camera preset to medium, which worked fine.
I have a streaming video app, and I would like to know how I can detect whether the app is buffering or not.
In AVPlayer, there is the currentItem.isPlaybackLikelyToKeepUp boolean that tells you when the playback buffer is likely to keep up at the current download speed, and currentItem.isPlaybackBufferEmpty that tells you when the playback buffer is empty.
The problem occurs when the video is playing and then pauses because the internet is too slow. If I then press the play button, the rate of the player is 1, but it is not playing.
How can I detect that the video is paused because it is buffering? currentItem.isPlaybackBufferEmpty is true even when the video is playing...
EDIT: I have combined these two, and the loader I show to indicate buffering is now only displayed if currentItem.isPlaybackBufferEmpty && !currentItem.isPlaybackLikelyToKeepUp; the loader now only shows for a few seconds after the video starts playing.
This works fine for me; maybe it can help. Call self?.bufferState() inside addPeriodicTimeObserver:
private func bufferState() {
    if let currentItem = self.avPlayer.currentItem {
        if currentItem.status == AVPlayerItemStatus.readyToPlay {
            if currentItem.isPlaybackLikelyToKeepUp {
                print("Playing")
            } else if currentItem.isPlaybackBufferEmpty {
                print("Buffer empty - show loader")
            } else if currentItem.isPlaybackBufferFull {
                print("Buffer full - hide loader")
            } else {
                print("Buffering")
            }
        } else if currentItem.status == AVPlayerItemStatus.failed {
            print("Failed")
        } else if currentItem.status == AVPlayerItemStatus.unknown {
            print("Unknown")
        }
    } else {
        print("avPlayer.currentItem is nil")
    }
}
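For completeness, the periodic observer mentioned above might be installed like this (a sketch; the half-second interval is an arbitrary choice):

```swift
// Sketch: call bufferState() twice a second while the player is alive.
let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
timeObserverToken = avPlayer.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] _ in
    self?.bufferState()
}
```

Remember to remove the observer (avPlayer.removeTimeObserver(timeObserverToken)) when tearing the player down.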