How can I detect buffering in AVPlayer?

I have a streaming video app, and I would like to know how I can detect whether the app is buffering or not.
In AVPlayer, there is the currentItem.isPlaybackLikelyToKeepUp boolean that tells you when the playback buffer is likely to keep up at the current download speed, and currentItem.isPlaybackBufferEmpty that tells you when the playback buffer is empty.
The problem occurs while the video is playing and it pauses because the connection is too slow. If I then press the play button, the rate of the player is 1, but it is not actually playing.
How can I detect that the video is paused because it is buffering? currentItem.isPlaybackBufferEmpty is true even when the video is playing...
EDIT: I have combined the two flags, so the loader I show to indicate buffering only appears if currentItem.isPlaybackBufferEmpty && !currentItem.isPlaybackLikelyToKeepUp. The loader now only shows for a few seconds after the video starts playing.
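For reference, a minimal sketch of that combined check driven by block-based KVO (Swift 4), assuming hypothetical player and loader properties; both AVPlayerItem flags are documented as key-value observable:

import AVFoundation
import UIKit

var observations = [NSKeyValueObservation]()

func observeBuffering(player: AVPlayer, loader: UIActivityIndicatorView) {
    guard let item = player.currentItem else { return }

    // Show the loader only when the buffer is empty AND playback is
    // unlikely to keep up -- the combination from the edit above.
    let refresh = {
        if item.isPlaybackBufferEmpty && !item.isPlaybackLikelyToKeepUp {
            loader.startAnimating()
        } else {
            loader.stopAnimating()
        }
    }
    observations.append(item.observe(\.isPlaybackBufferEmpty) { _, _ in refresh() })
    observations.append(item.observe(\.isPlaybackLikelyToKeepUp) { _, _ in refresh() })
}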

This works fine for me; maybe it can help. Call self?.bufferState() from inside an addPeriodicTimeObserver block (a sketch of the observer setup follows the function):
private func bufferState() {
    if let currentItem = self.avPlayer.currentItem {
        if currentItem.status == AVPlayerItemStatus.readyToPlay {
            if currentItem.isPlaybackLikelyToKeepUp {
                print("Playing")
            } else if currentItem.isPlaybackBufferEmpty {
                print("Buffer empty - show loader")
            } else if currentItem.isPlaybackBufferFull {
                print("Buffer full - hide loader")
            } else {
                print("Buffering")
            }
        } else if currentItem.status == AVPlayerItemStatus.failed {
            print("Failed")
        } else if currentItem.status == AVPlayerItemStatus.unknown {
            print("Unknown")
        }
    } else {
        print("avPlayer.currentItem is nil")
    }
}
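The periodic observer could be wired up like this; a sketch that assumes it lives in the same class as bufferState(), with an arbitrary 0.5 s interval:

var timeObserverToken: Any?

func startObservingBuffer() {
    let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
    timeObserverToken = avPlayer.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] _ in
        self?.bufferState()
    }
}

Remember to balance it on teardown with avPlayer.removeTimeObserver(token).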

Related

Spotify track is not playing after I stopped `MPMusicPlayerController.systemMusicPlayer` track

In my application I am using MPMusicPlayerController.systemMusicPlayer to play Apple Music songs, and that works fine. But when I then play a Spotify track using playSpotifyURI, it does not play. I have checked the logs, but no error shows up anywhere.
Scenario
Step 1. Play a track using playSpotifyURI. It plays fine:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Step 2. Stop the track using:
SPTAudioStreamingController.sharedInstance().setIsPlaying(false, callback: { (error) in
})
Step 3. Play an Apple Music song using MPMusicPlayerController.systemMusicPlayer:
func beginPlayback(itemID: String) {
    if musicPlayerController.playbackState == .playing {
        musicPlayerController.stop()
    }
    //musicPlayerController.setQueue(with: [itemID]) //1324456545
    musicPlayerController.setQueue(with: [itemID])
    musicPlayerController.prepareToPlay { (error) in
        print("prepareToPlay----------------")
    }
    musicPlayerController.play()
}
Step 4. Stop the Apple Music song using:
if musicPlayerController.playbackState == .playing {
    musicPlayerController.stop()
}
Step 5. Play a track again using playSpotifyURI with the code below. It does not play, and I cannot find any error:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Is there any issue in the above code? Please help me solve this. Any help will be appreciated.
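One thing worth checking (an assumption on my part, not something confirmed by the Spotify SDK docs): systemMusicPlayer plays through the separate Music app process and can leave your app's shared AVAudioSession deactivated, so the Spotify SDK may have nothing to play into. Reactivating the session just before the Step 5 call is a cheap experiment:

import AVFoundation

// Hypothetical helper: reactivate our own audio session before handing
// control back to the Spotify SDK. Whether this helps depends on what
// systemMusicPlayer did to the session (unverified assumption).
func reactivateAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Could not reactivate audio session: \(error)")
    }
}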

AudioKit is slow

I am trying to use the library to get data from a Bluetooth device and play a sound live when I hit a button.
The Bluetooth input is fast and arrives the moment I hit the button, but the library is a bottleneck: it takes about 80-100 ms or more until I hear the sound.
Also, if I hit the button twice quickly, it plays only the first note and waits for it to end before I can play the next one.
What is the best way to use it to play a live instrument? This is my implementation:
first I load the file, then I play it every time Bluetooth input arrives:
func load() {
    do {
        let file = try AKAudioFile(readFileName: "A.wav", baseDir: .resources)
        player = try AKAudioPlayer(file: file)
        player.looping = false
        AudioKit.output = player
        AudioKit.start()
    } catch {
        // Errors were silently swallowed here before; at least log them.
        print("AudioKit setup failed: \(error)")
    }
}

func play() {
    player.play()
}
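One way to cut both the latency and the one-note-at-a-time limitation (a sketch assuming AudioKit 4, where AKMIDISampler and AKSettings.bufferLength exist) is to shrink the hardware I/O buffer and trigger a sampler instead of restarting a single AKAudioPlayer:

import AudioKit

let sampler = AKMIDISampler()

func setupSampler() {
    do {
        AKSettings.bufferLength = .short   // smaller I/O buffer -> lower trigger-to-sound latency
        try sampler.loadWav("A")           // loads A.wav from the bundle
        AudioKit.output = sampler
        try AudioKit.start()
    } catch {
        print("AudioKit setup failed: \(error)")
    }
}

func playNote() {
    // Unlike AKAudioPlayer.play(), retriggering does not wait for the
    // previous hit to finish, so fast repeated notes can overlap.
    try? sampler.play(noteNumber: 60, velocity: 127, channel: 0)
}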

Record video with AVAssetWriter: first frames are black

I am recording video (the user can also switch to audio only) with AVAssetWriter, and I start the recording when the app is launched.
But the first frames are black (or very dark). This also happens when I switch from audio to video.
It feels like the AVAssetWriter and/or AVAssetWriterInput is not yet ready to record. How can I avoid this?
I don't know if this is useful info, but I also use a GLKView to display the video.
func start_new_record() {
    do {
        try self.file_writer = AVAssetWriter(url: self.file_url!, fileType: AVFileTypeMPEG4)
        if video_on {
            if file_writer.canAdd(video_writer) {
                file_writer.add(video_writer)
            }
        }
        if file_writer.canAdd(audio_writer) {
            file_writer.add(audio_writer)
        }
    } catch let e as NSError {
        print(e)
    }
}
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    guard is_recording else {
        return
    }
    guard CMSampleBufferDataIsReady(sampleBuffer) else {
        print("data not ready")
        return
    }
    guard let w = file_writer else {
        print("video writer nil")
        return
    }
    if w.status == .unknown && start_recording_time == nil {
        if (video_on && captureOutput == video_output) || (!video_on && captureOutput == audio_output) {
            print("START RECORDING")
            file_writer?.startWriting()
            start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            file_writer?.startSession(atSourceTime: start_recording_time!)
        } else {
            return
        }
    }
    if w.status == .failed {
        print("failed /", w.error ?? "")
        return
    }
    if captureOutput == audio_output {
        if audio_writer.isReadyForMoreMediaData {
            if !video_on || (video_on && video_written) {
                audio_writer.append(sampleBuffer)
                //print("write audio")
            }
        } else {
            print("audio writer not ready")
        }
    } else if video_output != nil && captureOutput == video_output {
        if video_writer.isReadyForMoreMediaData {
            video_writer.append(sampleBuffer)
            if !video_written {
                print("added 1st video frame")
                video_written = true
            }
        } else {
            print("video writer not ready")
        }
    }
}
SWIFT 4
SOLUTION #1:
I resolved this by calling file_writer?.startWriting() as soon as possible after launching the app, and then calling file_writer?.startSession(atSourceTime:) only when the recording should actually start.
When you are done recording, call finishWriting(completionHandler:); in the completion callback, set up a new writing session again.
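Condensed into code, reusing the question's file_writer property and hypothetical call sites, the ordering looks roughly like this:

import AVFoundation

var file_writer: AVAssetWriter?   // same property as in the question

func prepareWriter() {
    // Call as early as possible (e.g. at launch, after adding the inputs),
    // not at the moment the user taps record.
    file_writer?.startWriting()
}

func beginSession(with sampleBuffer: CMSampleBuffer) {
    // Start the session only when the first real sample arrives.
    let ts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    file_writer?.startSession(atSourceTime: ts)
}

func endRecording() {
    file_writer?.finishWriting {
        // Build a fresh AVAssetWriter here and call startWriting() again,
        // so the next take also starts without black frames.
    }
}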
SOLUTION #2:
I resolved this by adding half a second to the starting time when calling AVAssetWriter.startSession, like this:
start_recording_time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
let startingTimeDelay = CMTimeMakeWithSeconds(0.5, 1000000000)
let startTimeToUse = CMTimeAdd(start_recording_time!, startingTimeDelay)
file_writer?.startSession(atSourceTime: startTimeToUse)
SOLUTION #3:
A better solution here is to record the timestamp of the first frame you receive and decide to write, and then start your session with that. Then you don't need any delay:
//Initialization, elsewhere:
var is_session_started = false
var videoStartingTimestamp = CMTime.invalid

// In code where you receive frames that you plan to write:
if !is_session_started {
    // Start writing at the timestamp of our earliest sample
    videoStartingTimestamp = currentTimestamp
    print("First video sample received: Starting avAssetWriter Session: \(videoStartingTimestamp)")
    avAssetWriter?.startSession(atSourceTime: videoStartingTimestamp)
    is_session_started = true
}

// add the current frame
pixelBufferAdapter?.append(myPixelBuffer, withPresentationTime: currentTimestamp)
Ok, stupid mistake...
When launching the app, I init my AVCaptureSession, add inputs, outputs, etc. And I was just calling start_new_record a bit too soon, just before commitConfiguration was called on my capture session.
At least my code might be useful to some people.
This is for future users...
None of the above worked for me, but then I tried changing the camera preset to medium, which worked fine (see the sketch below).
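For reference, this is roughly what that preset change looks like; captureSession stands in for whatever AVCaptureSession is configured at launch:

import AVFoundation

let captureSession = AVCaptureSession()   // stand-in for the existing session

func lowerPreset() {
    captureSession.beginConfiguration()
    if captureSession.canSetSessionPreset(.medium) {
        captureSession.sessionPreset = .medium   // the change that fixed it here
    }
    captureSession.commitConfiguration()
}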

How do I play music using systemMusicPlayer() after the stock Music app has been terminated?

So I'm making a music player using systemMusicPlayer(), since iPodMusicPlayer() was deprecated in iOS 8. Anyway, I have pause and play buttons that work perfectly when the stock Music app is open in the background, but not when it has been terminated (i.e. removed from multitasking). Before adding the if condition, the app would crash; now nothing happens at all, since I don't really know what to add there. Any advice? I am truly lost.
@IBAction func playButtonAction(sender: AnyObject) {
    if musicPlayer.nowPlayingItem != nil {
        if self.isPlay {
            musicPlayer.pause()
            self.isPlay = false
        } else {
            musicPlayer.play()
            self.isPlay = true
            setupCurrentMediaItem()
            handleShuffleAndReplayMode()
        }
    } else {
        //music - this is the branch that runs after the Music app is killed
    }
}
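When the Music app has been terminated, nowPlayingItem is typically nil, so the code above falls into the empty else branch and does nothing. A sketch of what that branch could do instead; the MPMediaQuery here is only an example source of items:

import MediaPlayer

func startFreshPlayback() {
    let musicPlayer = MPMusicPlayerController.systemMusicPlayer()
    // There is no now-playing item to resume, so seed a queue first.
    musicPlayer.setQueue(with: MPMediaQuery.songs())
    musicPlayer.prepareToPlay()   // spins the player process back up
    musicPlayer.play()
}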

Not able to get buffering state of live video using MPMoviePlayerController on iOS

I am working on playing live streaming video using MPMoviePlayerController, and I also display the elapsed playback time. I can play the video, but when it buffers and then regains its playing state, I am not able to detect the buffering state, so the timer is not updated properly and instead jumps ahead by some amount of time. Is there any way to get this right? Please help me resolve it. Thanks in advance.
My code is here:
NSNotificationCenter.defaultCenter().addObserver(self, selector: "moviePlayBackStateChanged:", name: "MPMoviePlayerLoadStateDidChangeNotification", object: moviePlayer)

if moviePlayer?.playableDuration > 0 {
    currentTime = moviePlayer!.currentPlaybackTime
} else {
    currentTime = 0
}

func moviePlayBackStateChanged(notification: NSNotification) {
    if moviePlayer?.loadState == MPMovieLoadState.Playable {
        currentTime = moviePlayer!.currentPlaybackTime
        println("currentTime \(currentTime)")
        lblTime?.text = stringFromTimeInterval(currentTime!)
    } else if moviePlayer?.loadState == MPMovieLoadState.Stalled {
        lblTime?.text = stringFromTimeInterval(currentTime!)
    }
}
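One more signal worth watching (a sketch in the same era-style as the code above, not verified against this exact app): stalls are also reported through MPMoviePlayerPlaybackStateDidChangeNotification, so the timer can be frozen and resumed explicitly rather than inferred from load-state changes alone:

NSNotificationCenter.defaultCenter().addObserver(self, selector: "playbackStateChanged:", name: MPMoviePlayerPlaybackStateDidChangeNotification, object: moviePlayer)

func playbackStateChanged(notification: NSNotification) {
    switch moviePlayer!.playbackState {
    case .Playing:
        // Resume updating the label from currentPlaybackTime.
        lblTime?.text = stringFromTimeInterval(moviePlayer!.currentPlaybackTime)
    case .Interrupted:
        // Buffering/stalled: keep showing the last known time.
        break
    default:
        break
    }
}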
