I'm having a hard time understanding how to use SKVideoNode properly. I'm making a simple game where the player has to compete against the "computer", whose moves are rendered in the form of a pre-recorded video.
When the game begins, the video should be paused, so I call pause() on it. Once the human player begins his/her moves, the artificial player's video starts playing. This part is working fine.
Once the time is up, the game scene is paused via setting isPaused = true on my SKScene subclass (let's call it GameScene).
However, the player also has the chance to restart the game by pressing a button. When the game is restarted, the game scene is "unpaused" and the game state is reset like this:
gameView.isPaused = false // gameView is an SKView presenting the scene
gameScene.resetGameState()
The resetGameState method will simply reset players' scores and pause and rewind the video:
public func resetGameState() {
    isGameStarted = false
    isGameStopped = false
    artificialPlayerScoreboard.score = 0
    humanPlayerScoreboard.score = 0
    video.pause()
    videoPlayer.seek(to: CMTime(seconds: 0, preferredTimescale: 1))
}
Here is how I create the video and video player:
if let videoPath = Bundle.main.path(forResource: "video", ofType: "mp4") {
    videoPlayer = AVPlayer(url: URL(fileURLWithPath: videoPath))
    video = SKVideoNode(avPlayer: videoPlayer)
    video.size = playerScreenSize
    video.anchorPoint = CGPoint(x: 0, y: 0)
    video.position = CGPoint.zero
    video.pause()
    artificialPlayerScreen.addChild(video)
}
The problem is that the video starts playing immediately after the game is restarted despite being paused. I've been trying to figure out why and came up with this hack:
gameView.isPaused = false
// Can't do this immediately because the video will start playing.
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 0.1, execute: {
    self.gameScene.resetGameState()
})
Basically it does the same thing but pauses the video after a small delay.
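An alternative I'm considering (just a sketch; I haven't confirmed it addresses the underlying behaviour) is to drive playback through the underlying AVPlayer instead of the SKVideoNode, on the theory that SpriteKit resumes video nodes when the scene is unpaused:

```swift
// Sketch: pause/seek via the AVPlayer backing the node, not via SKVideoNode,
// so SpriteKit's scene unpause doesn't restart playback on its own.
public func resetGameState() {
    isGameStarted = false
    isGameStopped = false
    artificialPlayerScoreboard.score = 0
    humanPlayerScoreboard.score = 0
    videoPlayer.pause()         // pause the player directly
    videoPlayer.seek(to: .zero) // rewind to the start
}
```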
Is this normal behaviour of SKVideoNode? Am I using it right?
Related
I'm working on an iOS app which lets users play a song with multiple tracks. All tracks are played simultaneously, and the volume of each track can be managed independently. The latency between the players is fine when the song is played for the first time within the lifecycle of the app. However, the latency between the players increases when the user stops the playback of all tracks and resumes it afterwards.
I create an AVAudioPlayer for every track in my MusicPlayer class:
do {
    let data = try Data.init(contentsOf: url, options: NSData.ReadingOptions.alwaysMapped)
    let player = try AVAudioPlayer(data: data)
    player.prepareToPlay()
    self.audioPlayers.append(player)
}
After loading the players, playback of the song is started by the playSong method:
playerSemaphore.wait()
NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
for item in self.audioPlayers {
    item.prepareToPlay()
}
let time = self.audioPlayers[0].deviceCurrentTime + 0.2
for item in self.audioPlayers {
    item.play(atTime: time)
}
DispatchQueue.main.async {
    NotificationCenter.default.post(playerFinishedCommand.getNotification())
}
playerSemaphore.signal()
When the user decides to pause the song, or the AudioSession is interrupted, the pauseSong method pauses all the players by setting their volume to 0, pausing them, synchronizing their playback times, and resetting the proper volume:
func pauseSong() {
    playerSemaphore.wait()
    NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
    DispatchQueue.global().async {
        // Set volumes of all audio tracks to 0, so delay on pause is not noticed
        for audioTrackPlayer in self.audioPlayers {
            audioTrackPlayer.volume = 0
        }
        // Update times and reset the volumes now the tracks are paused
        for (index, audioTrackPlayer) in self.audioPlayers.enumerated() {
            audioTrackPlayer.pause()
            audioTrackPlayer.currentTime = self.audioPlayers[0].currentTime
            if let currentSongID = self.loadedSongID, let currentVolume = sharedSongManager.getVolumeOfTrackOfSongID(id: currentSongID, track: index), let currentActivation = sharedSongManager.getActivationStatusOfTrackOfSongID(id: currentSongID, track: index) {
                if currentActivation == true {
                    audioTrackPlayer.volume = currentVolume
                }
            }
        }
        DispatchQueue.main.async {
            NotificationCenter.default.post(playerFinishedCommand.getNotification())
        }
    }
    playerSemaphore.signal()
}
For logging the player times I wrote a method which returns all the player times and the difference between the max value and the min value of that array. It works with a time offset to get exact values for the player times:
let startTime = CACurrentMediaTime()
var playerTimes = [TimeInterval]()
var delay: Double?
for item in self.audioPlayers.enumerated() {
    let playerTime = item.element.currentTime
    let currentTime = CACurrentMediaTime()
    let timeOffset = currentTime - startTime
    let correctedPlayerTime = playerTime - timeOffset
    playerTimes.append(correctedPlayerTime)
}
if let maxTime = playerTimes.max(), let minTime = playerTimes.min() {
    delay = Double(maxTime) - Double(minTime)
}
return (delay, playerTimes)
Here is a list of the delay (max value minus min value of the player times) before and after pausing:
And here are the player times for one log before and after pausing:
All logs were created on an iPhone X with the most recent iOS.
After loading a new song the lifecycle restarts, and I have low latency until the first time the user pauses the song. I don't like the idea of creating new AVAudioPlayers after every interaction, but I have tried a lot of things, such as calculating the delay as I did in the logging function, or calling the player instances asynchronously. However, most of the time this even increased the delay. Does anyone know a method to synchronize the players properly after resuming a song?
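For what it's worth, one thing I'd expect to help (a sketch, not verified against this exact setup; resumeSong is a hypothetical name) is to resume with the same play(atTime:) scheduling used for the initial start, rather than calling play() directly, so all players get a common deadline on the shared device clock:

```swift
// Sketch: resume all paused AVAudioPlayers against a common host-clock
// deadline, mirroring the play(atTime:) scheduling of the first start.
func resumeSong() {
    // Align every player to the same position first (as pauseSong already does).
    let referenceTime = audioPlayers[0].currentTime
    for player in audioPlayers {
        player.currentTime = referenceTime
        player.prepareToPlay()
    }
    // Schedule all players slightly in the future on the shared device clock,
    // instead of calling play() one by one and accumulating drift.
    let startTime = audioPlayers[0].deviceCurrentTime + 0.2
    for player in audioPlayers {
        player.play(atTime: startTime)
    }
}
```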
I am building a karaoke app with the ability to sing with video, so here is my problem:
I am recording the user's video (video only, from the front camera) while applying voice filters with AudioKit to a separate audio recording.
Now in my playback I want to play the video and the audio in sync, but I haven't succeeded: the video and audio are out of sync.
I am using AKPlayer for the audio, so I can apply the voice mod, and VLCKit for playing the user's video.
do {
    //MARK: VLC kit part of the video setup
    Vlc_VideoPlayer = VLCMediaPlayer()
    Vlc_VideoPlayer.media = VLCMedia(url: recordVideoURL)
    Vlc_VideoPlayer.addObserver(self, forKeyPath: "time", options: [], context: nil)
    Vlc_VideoPlayer.addObserver(self, forKeyPath: "remainingTime", options: [], context: nil)
    Vlc_VideoPlayer.drawable = self.CameraView

    //MARK: Audiokit with AKPlayer Setup
    file = try AKAudioFile(forReading: recordVoiceURL)
    player = AKPlayer(audioFile: file)
    self.player.preroll()
    delay = AKVariableDelay(player)
    delay.rampTime = 0.5
    delayMixer = AKDryWetMixer(player, delay)
    reverb = AKCostelloReverb(delayMixer)
    reverbMixer = AKDryWetMixer(delayMixer, reverb)
    booster = AKBooster(reverbMixer)
    tracker = AKAmplitudeTracker(booster)
    AudioKit.output = tracker
    try AudioKit.start()
} catch {
    print(error)
}
self.startPlayers()
Now the startPlayers function:
func startPlayers() {
    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        if AudioKit.engine.isRunning {
            self.Vlc_VideoPlayer.audio.isMuted = true
            self.Vlc_VideoPlayer.play()
            self.player.isLooping = false
            self.player.play()
        } else {
            self.startPlayers()
        }
    }
}
I don't know anything about the VLC player, but with the built in AVPlayer there is an option to sync to a clock:
var time: TimeInterval = 1 // 1 second in the future
videoPlayer.masterClock = CMClockGetHostTimeClock()
let hostTime = mach_absolute_time()
let cmHostTime = CMClockMakeHostTimeFromSystemUnits(hostTime)
let cmVTime = CMTimeMakeWithSeconds(time, preferredTimescale: videoPlayer.currentTime().timescale)
let futureTime = CMTimeAdd(cmHostTime, cmVTime)
videoPlayer.setRate(1, time: CMTime.invalid, atHostTime: futureTime)
AKPlayer then supports syncing to the mach_absolute_time() hostTime using its scheduling functions. As you have above, the two will start close together but there is no guarantee of any sync.
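A sketch of what that AKPlayer scheduling might look like (AudioKit's API has shifted across versions, so treat the exact calls as assumptions):

```swift
// Sketch: schedule the AKPlayer against the same host-time reference used
// for the AVPlayer above, so both start at the same wall-clock moment.
let startDelay = 1.0 // seconds, matching the video's start offset above
let startHostTime = mach_absolute_time() + AVAudioTime.hostTime(forSeconds: startDelay)
player.play(at: AVAudioTime(hostTime: startHostTime))
```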
Trying to start two players together will only work out of pure luck, and unless you have a means to synchronize playback after it has started, it will not be perfect. Ideally, you should play the audio with VLC as well, to make use of its internal synchronization tools.
To iterate on what you have right now, I would suggest starting playback with VLC until it has decoded the first frame, pausing it, starting your audio, and resuming playback with VLC as soon as you have decoded the first audio sample. This will still not be perfect, but probably better.
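Sketched very roughly (and hedged heavily: the delegate method and state names are assumptions from MobileVLCKit, and PlaybackController/didPrimeVideo are hypothetical), that staggered start could look like:

```swift
// Rough sketch of the staggered start: let VLC decode its first frame,
// pause it, start the audio, then resume VLC once audio is rolling.
// Delegate/state names are assumptions from MobileVLCKit; verify locally.
extension PlaybackController: VLCMediaPlayerDelegate {
    func beginSyncedPlayback() {
        Vlc_VideoPlayer.delegate = self
        Vlc_VideoPlayer.play() // start decoding; wait for the first frame
    }

    func mediaPlayerStateChanged(_ aNotification: Notification) {
        if Vlc_VideoPlayer.state == .playing && !didPrimeVideo {
            didPrimeVideo = true         // hypothetical one-shot flag
            Vlc_VideoPlayer.pause()      // first frame is up; hold the video
            player.play()                // start the AKPlayer audio
            Vlc_VideoPlayer.play()       // resume video once audio rolls
        }
    }
}
```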
I'm working on implementing a video player in Swift that will detect if a video has stopped playing, and then play the second one. When the second one has stopped playing, the first video should play again.
Here's where I set up the player, assets, and player items:
//Create URLs
let movieOneURL: URL = URL(fileURLWithPath: movieOnePath)
let movieTwoURL: URL = URL(fileURLWithPath: movieTwoPath)
//Create Assets
let assetOne = AVAsset(url: movieOneURL)
let assetTwo = AVAsset(url: movieTwoURL)
//Create Player Items
avPlayerItemOne = AVPlayerItem(asset: assetOne)
avPlayerItemTwo = AVPlayerItem(asset: assetTwo)
avplayer = AVPlayer(playerItem: avPlayerItemOne)
let avPlayerLayer = AVPlayerLayer(player: avplayer)
avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
avPlayerLayer.frame = UIScreen.main.bounds
movieView.layer.addSublayer(avPlayerLayer)
//Config player
avplayer.seek(to: kCMTimeZero)
avplayer.volume = 0.0
And here's where I set up a notification to detect if the player reached the end of the video file:
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avplayer.currentItem)
...which calls this selector:
func playerItemDidReachEnd(_ notification: Notification) {
    // avplayer.seek(to: kCMTimeZero)
    changePlayerAsset()
    // avplayer.play()
}
...which will then switch out the asset:
func changePlayerAsset() {
    if avplayer.currentItem == avPlayerItemOne {
        avplayer.replaceCurrentItem(with: avPlayerItemTwo)
        avplayer.play()
    } else if avplayer.currentItem == avPlayerItemTwo {
        avplayer.replaceCurrentItem(with: avPlayerItemOne)
        avplayer.play()
    }
}
This works perfectly the first time through - when the first movie has finished playing, the next one will then start playing.
The problem I'm having is that my notification observer only seems to fire once, at the end of the first video; the notification isn't fired at all when the second video stops playing.
Anyone have an idea why that would be the case?
The reason your notification handler isn’t getting called for the second item is this bit, at the end of where you register the notification handler: object: avplayer.currentItem. The handler gets called once, when that item finishes playing, but then when the next item finishes, the notification gets posted with a different object—the other item—which doesn’t match what you registered for, and so your handler doesn’t get called. If you change object to nil when you register the handler, it’ll get called when any item finishes, which is closer to what you’re after.
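Concretely, the registration with object: nil might look like this (a sketch reusing the selector from the question):

```swift
// object: nil → the handler fires when ANY AVPlayerItem finishes,
// not just the item that was current when the observer was added.
NotificationCenter.default.addObserver(
    self,
    selector: #selector(self.playerItemDidReachEnd),
    name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
    object: nil
)
```

Inside the handler, notification.object is the AVPlayerItem that finished, should you need to distinguish between the two.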
That said, this isn’t a great way to do what you want—manually swapping out items is likely to incur the cost and delay of loading each item each time it’s about to play. You’d be much better off using the built-in functionality for playing videos in sequence and looping them, namely AVQueuePlayer and AVPlayerLooper. There’s an example of how to use both in the answer to this question.
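A minimal sketch of that approach (assuming the same movieOneURL/movieTwoURL as above; AVPlayerLooper loops a single template item, so the two videos are first concatenated into one composition):

```swift
import AVFoundation

// Concatenate the two movies into a single composition...
let composition = AVMutableComposition()
for url in [movieOneURL, movieTwoURL] {
    let asset = AVAsset(url: url)
    try? composition.insertTimeRange(
        CMTimeRange(start: .zero, duration: asset.duration),
        of: asset,
        at: composition.duration
    )
}

// ...and loop the combined item with AVQueuePlayer + AVPlayerLooper.
let queuePlayer = AVQueuePlayer()
let looper = AVPlayerLooper(player: queuePlayer,
                            templateItem: AVPlayerItem(asset: composition))
let playerLayer = AVPlayerLayer(player: queuePlayer)
playerLayer.videoGravity = .resizeAspectFill
playerLayer.frame = UIScreen.main.bounds
movieView.layer.addSublayer(playerLayer)
queuePlayer.play()
```

Note that you need to keep a strong reference to the looper (e.g. as a property), or looping stops as soon as it's deallocated.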
I'm making an iOS app in Swift that plays a video in a loop in a small layer in the top right corner of the screen, showing a video of a specific coloured item. The user then taps the corresponding coloured item on the screen. When they do, the videoName variable is randomly changed to the next colour and the corresponding video is triggered. I have no problem raising, playing and looping the video with AVPlayer and AVPlayerItem, as seen in the attached code.
Where I'm stuck is that whenever the next video is shown, the previous ones stay open behind it. Also, after 16 videos have played, the player disappears altogether on my iPad. I've tried many suggestions presented on this and other sites, but either Swift finds a problem with them or they just don't work.
So, question: within my code here, how do I tell it "hey, the next video has started to play; remove the previous video and its layer and free up the memory so I can play as many videos as needed"?
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    let playerLayer = AVPlayerLayer(player: player!)
    //set player location in uiview and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    // Add notification to know when the video ends, then replay it again. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}
I adapted #Anupam Mishra's Swift code suggestion. It wasn't working at first, but I finally figured out I had to take the playerLayer outside the function and close the playerLayer after I set the player to nil. Also, instead of using 'if (player!.rate > 0 ...)', which would no doubt still work, I created a variable switch to indicate when to say "kill the player AND the layer", as seen below. It may not be pretty but it works! The following is for absolute newbies like myself.

WHAT I LEARNED FROM THIS EXPERIENCE: it seems to me that an iOS device only allows 16 layers to be added to a viewController (or superLayer), so each layer needs to be deleted before opening the next layer with its player, unless you really want 16 layers running all at once.

WHAT THIS CODE BELOW ACTUALLY DOES FOR YOU: this code creates a re-sizable layer over an existing viewController and plays a video from your bundle in an endless loop. When the next video is about to be called, the current video and its layer are totally removed, freeing up the memory, and then a new layer with the new video is played. The video layer size and location are totally customizable using the playerLayer.frame = CGRectMake(left, top, width, height) parameters.

HOW TO MAKE IT ALL WORK: assuming you've already added your videos to your bundle, create another function for your button tap. In that function, first call the 'closePlayer()' function, change the 'videoName' variable to the new video name you desire, then call the 'showVideo()' function. (If you need to change the video extension, change 'videoExtension' from a let to a var.)
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?
var playerLayer = AVPlayerLayer() //NEW playerLayer var location

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"
var createLayerSwitch = true /*NEW switch to say whether or not to create the layer when referenced by the closePlayer and showVideo functions*/

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    playerLayer = AVPlayerLayer(player: player!) //NEW: remove 'let' from playerLayer here.
    //set player location in uiview and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    createLayerSwitch = false //NEW switch to tell if a layer is already created or not. I set the switch to false so that when the next tapped item/button references the closePlayer() function, the condition is triggered to close the player AND the layer
    // Add notification to know when the video ends, then replay it again without a pause between replays. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}

//NEW function to kill the current player and layer before playing the next video
func closePlayer() {
    if createLayerSwitch == false {
        player!.pause()
        player = nil
        playerLayer.removeFromSuperlayer() //fixed: Swift method is removeFromSuperlayer(), not removefromsuperlayer()
    }
}
Why not just use replaceCurrentItem(with:) on your player? You will keep the same player for all your videos. I think it's a better way to do it.
Edit: replaceCurrentItem(with:) has to be called on the main thread.
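A sketch of that idea, keeping one player and one layer for every video (assuming the same bundle-path setup as the question; showVideo(named:) is a hypothetical reworking of the question's function):

```swift
// Sketch: one AVPlayer + one AVPlayerLayer for all videos;
// only the AVPlayerItem is swapped when the colour changes.
let player = AVPlayer()
let playerLayer = AVPlayerLayer(player: player) // add to the view hierarchy once

func showVideo(named videoName: String) {
    let url = URL(fileURLWithPath: videoPath + "/Base.lproj/" + videoName + videoExtension)
    let item = AVPlayerItem(url: url)
    DispatchQueue.main.async { // swap the item on the main thread, per the note above
        player.replaceCurrentItem(with: item)
        player.play()
    }
}
```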
Before moving on to the next track, first check whether the player is playing any video or music; to stop it, do the following checks:
Swift Code-
if player!.rate > 0 && player!.error == nil {
    player!.pause()
    player = nil
}
Objective-C Code-
if (player.rate > 0 && !player.error) {
    [player setRate:0.0];
}
I am using an AVPlayer object to play a live stream of an MP3, just using an HTTP link.
As with normal live video playing in iOS, there is a button that you can press to skip to live.
Is it possible to do this with AVPlayer?
E.g. I am listening live, pause, then after however long press play again. It continues from where I left off. But what if I want to skip to live?
I had the same need and made this extension to AVPlayer:
extension AVPlayer {
    func seekToLive() {
        if let items = currentItem?.seekableTimeRanges, !items.isEmpty {
            let range = items[items.count - 1]
            let timeRange = range.timeRangeValue
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            seek(to: CMTimeMakeWithSeconds(startSeconds + durationSeconds, 1))
        }
    }
}