In my application I use MPMusicPlayerController.systemMusicPlayer to play Apple Music songs, and that works fine. But when I then play a Spotify track using playSpotifyURI, nothing plays, and I can't find an error anywhere in the logs.
Scenario
Step 1: Play a track using playSpotifyURI. It plays fine:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Step 2: Stop the track:
SPTAudioStreamingController.sharedInstance().setIsPlaying(false, callback: { (error) in
})
Step 3: Play an Apple Music song using MPMusicPlayerController.systemMusicPlayer:
func beginPlayback(itemID: String) {
    if musicPlayerController.playbackState == .playing {
        musicPlayerController.stop()
    }
    musicPlayerController.setQueue(with: [itemID])
    musicPlayerController.prepareToPlay { (error) in
        print("prepareToPlay----------------")
    }
    musicPlayerController.play()
}
Step 4: Stop the Apple Music song:
if musicPlayerController.playbackState == .playing {
    musicPlayerController.stop()
}
Step 5: Play a Spotify track again using the code below (the same as in Step 1), but this time it doesn't play, and I couldn't find any error:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Is there any issue in the above code? Please help me solve this. Any help will be appreciated.
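One thing worth checking (an assumption on my part, not something shown in the question): systemMusicPlayer plays in the Music app's process and can reconfigure the shared AVAudioSession, so after stopping Apple Music playback you may need to reactivate your app's audio session before handing playback back to the Spotify SDK. A minimal sketch, with the function name being hypothetical:

```swift
import AVFoundation

func resumeSpotifyTrack(itemID: String) {
    // Reclaim the shared audio session; after systemMusicPlayer ran,
    // our app's session may no longer be active.
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("audio session error: \(error)")
    }
    SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
        if let error = error {
            print("*** failed to play: \(error)")
        } else {
            print("Playing!!")
        }
    }
}
```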
I am trying to use the library to get data from a Bluetooth device and play a sound live the moment I hit the button.
The Bluetooth input is fast and arrives as soon as I hit the button, but the library is a bottleneck: it takes about 80-100 ms or more until I hear the sound.
Also, if I hit twice quickly, it plays only the first note and waits for it to finish before I can play the next one.
What is the best way to use it to play a live instrument? This is my implementation:
First I load; then, every time I have Bluetooth input, I play:
func load() {
    do {
        let file = try AKAudioFile(readFileName: "A.wav", baseDir: .resources)
        player = try AKAudioPlayer(file: file)
        player.looping = false
        AudioKit.output = player
        AudioKit.start()
    } catch {
        print("failed to load: \(error)")
    }
}
func play() {
    player.play()
}
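Two things that may help (both are my suggestions, not from the original post): shorten the audio IO buffer, and trigger a sampler instead of restarting a single AKAudioPlayer, since a sampler retriggers instantly and overlapping hits each get their own voice. A sketch assuming AudioKit 4 and that A.wav ships in the app bundle; exact API signatures vary between AudioKit versions:

```swift
import AudioKit

let sampler = AKMIDISampler()

func load() {
    AKSettings.bufferLength = .short   // smaller IO buffer, lower latency
    try? sampler.loadWav("A")          // load the sample once, up front
    AudioKit.output = sampler
    AudioKit.start()                   // try AudioKit.start() on newer versions
}

func play() {
    // Retriggers immediately; a second hit starts a new voice instead of
    // waiting for the first note to finish.
    try? sampler.play(noteNumber: 60, velocity: 127, channel: 0)
}
```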
I am recording video (the user can also switch to audio only) with AVAssetWriter. I start the recording when the app is launched.
But the first frames are black (or very dark). This also happens when I switch from audio to video.
It feels like the AVAssetWriter and/or AVAssetWriterInput are not yet ready to record. How can I avoid this?
I don't know if this is useful info, but I also use a GLKView to display the video.
func start_new_record() {
    do {
        self.file_writer = try AVAssetWriter(url: self.file_url!, fileType: AVFileTypeMPEG4)
        if video_on {
            if file_writer.canAdd(video_writer) {
                file_writer.add(video_writer)
            }
        }
        if file_writer.canAdd(audio_writer) {
            file_writer.add(audio_writer)
        }
    } catch let e as NSError {
        print(e)
    }
}
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    guard is_recording else {
        return
    }
    guard CMSampleBufferDataIsReady(sampleBuffer) else {
        print("data not ready")
        return
    }
    guard let w = file_writer else {
        print("video writer nil")
        return
    }
    if w.status == .unknown && start_recording_time == nil {
        if (video_on && captureOutput == video_output) || (!video_on && captureOutput == audio_output) {
            print("START RECORDING")
            file_writer?.startWriting()
            start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            file_writer?.startSession(atSourceTime: start_recording_time!)
        } else {
            return
        }
    }
    if w.status == .failed {
        print("failed /", w.error ?? "")
        return
    }
    if captureOutput == audio_output {
        if audio_writer.isReadyForMoreMediaData {
            if !video_on || (video_on && video_written) {
                audio_writer.append(sampleBuffer)
                //print("write audio")
            }
        } else {
            print("audio writer not ready")
        }
    } else if video_output != nil && captureOutput == video_output {
        if video_writer.isReadyForMoreMediaData {
            video_writer.append(sampleBuffer)
            if !video_written {
                print("added 1st video frame")
                video_written = true
            }
        } else {
            print("video writer not ready")
        }
    }
}
SWIFT 4
SOLUTION #1:
I resolved this by calling file_writer?.startWriting() as soon as possible after launching the app. Then, when you actually want to start recording, call file_writer?.startSession(atSourceTime:).
When you are done recording, call finishRecording; in the callback that tells you it's complete, set up a new writing session again.
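A sketch of that sequence (the call sites are assumptions; the writer and its inputs are configured as in the question):

```swift
// At launch, right after the writer and its inputs are configured:
file_writer?.startWriting()   // warm up the writer early so it is ready

// Later, when the user actually starts recording, open the session at the
// timestamp of the first sample you receive:
func startRecording(firstSampleBuffer: CMSampleBuffer) {
    let ts = CMSampleBufferGetPresentationTimeStamp(firstSampleBuffer)
    file_writer?.startSession(atSourceTime: ts)
}

// When done, finish and rebuild the writer for the next take:
file_writer?.finishWriting {
    // Safe to create a new AVAssetWriter here and call startWriting() again.
}
```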
SOLUTION #2:
I resolved this by adding half a second to the starting time when calling AVAssetWriter.startSession, like this:
start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
let startingTimeDelay = CMTimeMakeWithSeconds(0.5, 1000000000)
let startTimeToUse = CMTimeAdd(start_recording_time!, startingTimeDelay)
file_writer?.startSession(atSourceTime: startTimeToUse)
SOLUTION #3:
A better solution here is to record the timestamp of the first frame you receive and decide to write, and then start your session with that. Then you don't need any delay:
// Initialization, elsewhere:
var is_session_started = false
var videoStartingTimestamp = CMTime.invalid

// In code where you receive frames that you plan to write:
if !is_session_started {
    // Start writing at the timestamp of our earliest sample
    videoStartingTimestamp = currentTimestamp
    print("First video sample received: Starting avAssetWriter Session: \(videoStartingTimestamp)")
    avAssetWriter?.startSession(atSourceTime: videoStartingTimestamp)
    is_session_started = true
}

// Add the current frame
pixelBufferAdapter?.append(myPixelBuffer, withPresentationTime: currentTimestamp)
OK, stupid mistake...
When launching the app I initialize my AVCaptureSession, add inputs, outputs, etc. I was simply calling start_new_record a bit too soon, just before commitConfiguration was called on my capture session.
At least my code might be useful to some people.
This is for future users...
None of the above worked for me, and then I tried changing the camera preset to medium, which worked fine.
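For reference, "changing the camera preset to medium" looks like the line below (assuming a captureSession property and the Swift 3 era constant):

```swift
// A lower preset reduces per-frame work on the capture pipeline, which can
// let the earliest frames arrive properly exposed.
captureSession.sessionPreset = AVCaptureSessionPresetMedium
```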
So I'm making a music player using MPMusicPlayerController.systemMusicPlayer, since iPodMusicPlayer was deprecated in iOS 8. Anyway, I have pause and play buttons that work perfectly when the stock Music app is open in the background, but not when it is terminated (i.e. removed from multitasking). Before adding the if condition the app would crash, and now nothing happens at all, since I don't really know what to add. Any advice? I am truly lost.
@IBAction func playButtonAction(sender: AnyObject) {
    if let mediaItem = musicPlayer.nowPlayingItem {
        if self.isPlay {
            musicPlayer.pause()
            self.isPlay = false
        } else {
            musicPlayer.play()
            self.isPlay = true
            setupCurrentMediaItem()
            handleShuffleAndReplayMode()
        }
    } else {
        // music
    }
}
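When the Music app has been terminated, nowPlayingItem is nil, so the empty else branch is where something has to happen. One option (an assumption on my part, not from the original post) is to set a queue before playing, for example:

```swift
import MediaPlayer

// Hypothetical fallback for the empty else branch: with no now-playing
// item, queue something first, then play.
let musicPlayer = MPMusicPlayerController.systemMusicPlayer()
musicPlayer.setQueue(with: MPMediaQuery.songs())  // e.g. the whole library
musicPlayer.prepareToPlay()
musicPlayer.play()
```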
I am working on playing live streaming video using MPMoviePlayerController, and I also display the elapsed time of the playing video. Playback works, but when the player buffers and then regains its playing state, I am not able to detect the buffering state, so the timer is not updated properly and instead jumps ahead by some amount of time. Is there any way to keep it accurate? Please help me resolve this. Thanks in advance.
Here is my code:
NSNotificationCenter.defaultCenter().addObserver(self, selector: "moviePlayBackStateChanged:", name: MPMoviePlayerLoadStateDidChangeNotification, object: moviePlayer)

if moviePlayer?.playableDuration > 0 {
    currentTime = moviePlayer!.currentPlaybackTime
} else {
    currentTime = 0
}
func moviePlayBackStateChanged(notification: NSNotification) {
    if moviePlayer?.loadState == MPMovieLoadState.Playable {
        currentTime = moviePlayer!.currentPlaybackTime
        println("currentTime \(currentTime)")
        lblTime?.text = stringFromTimeInterval(currentTime!)
    } else if moviePlayer?.loadState == MPMovieLoadState.Stalled {
        lblTime?.text = stringFromTimeInterval(currentTime!)
    }
}
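Rather than updating the label only from load-state notifications, one workaround (my suggestion, written in the same pre-Swift-3 style as the question) is to poll currentPlaybackTime on a repeating timer and only advance the label while the player reports that it is playing; currentPlaybackTime generally holds still while the stream stalls, so the label pauses instead of jumping:

```swift
var displayTimer: NSTimer?

func startTimeUpdates() {
    displayTimer = NSTimer.scheduledTimerWithTimeInterval(0.5, target: self,
        selector: "updateTime", userInfo: nil, repeats: true)
}

func updateTime() {
    // Only advance the label while playback is actually progressing.
    if moviePlayer?.playbackState == MPMoviePlaybackState.Playing {
        lblTime?.text = stringFromTimeInterval(moviePlayer!.currentPlaybackTime)
    }
}
```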