Player-swift library, how to adjust playback rate - iOS

I am using the Player library https://github.com/piemonte/Player for video playback in my app.
I'm trying to figure out how to add the functionality to change the playback speed/rate of the video, like this: https://developer.apple.com/reference/avfoundation/avplayer/1388846-rate
I didn't see a playback function to allow this type of direct control in the Player docs.
Is there a way to change the "rate" of the underlying AVPlayer?

This library has a Player.swift file, where the underlying AVPlayer is stored in the "_avplayer" variable.
You can make _avplayer public and access it from anywhere, or you can simply expose a computed property with a getter and setter, like:
open var rate: Float {
    get {
        return self._avplayer.rate
    }
    set {
        self._avplayer.rate = newValue
    }
}
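For illustration, once that property is added, adjusting the speed from outside the library might look like the sketch below (the "player" instance name and the playFromBeginning() call are assumptions about your setup, not part of the original answer):
// Hypothetical usage: "player" is your Player instance and the "rate"
// property above has been added to Player.swift.
player.playFromBeginning()   // start playback via the Player API
player.rate = 1.5            // forwarded to the underlying AVPlayer
Keep in mind that setting a non-zero rate on a paused AVPlayer starts playback at that rate, so set it once playback has begun or use it in place of play.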

Related

How to fast switch Audio Input in AVCaptureSession?

I am using AVCaptureSession to develop a camera app.
I need to trigger some haptic (vibration) feedback when the user touches certain UI elements, using code like the following:
private var _impact = UIImpactFeedbackGenerator()
private var _select = UISelectionFeedbackGenerator()

private init() {
    _impact.prepare()
    _select.prepare()
}

func impact() {
    _impact.impactOccurred()
}

func select() {
    _select.selectionChanged()
}
This lets me invoke select() or impact() to trigger feedback.
But if I add an audio device to the AVCaptureSession, all feedback stops working. To keep the feedback effect, I have to remove the audio device from the AVCaptureSession first and only add it back when I need to record a video; that operation stalls captureOutput and freezes the camera preview for a short time.
So I tried another way: always keep the audio device added to the AVCaptureSession, then fetch the audio-related AVCaptureConnection objects from AVCaptureSession.connections and set isEnabled to false or true. But this did not work either: even with AVCaptureConnection.isEnabled set to false, the feedback is still disabled (see the sketch below).
I think the only way to keep the feedback working may be to not add the audio device to the AVCaptureSession at all.
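For reference, the connection-toggling attempt described above looks roughly like this (a sketch, assuming "session" is your configured AVCaptureSession; in my testing this still left the haptics disabled):
// Disable every connection that carries audio instead of removing the
// audio device input from the session.
for connection in session.connections where !connection.audioChannels.isEmpty {
    connection.isEnabled = false
}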

Detect when the user has muted AVPlayer

I am playing a live stream using AVPlayer and AVPlayerItem.
Is there a way to know whether the user has muted/unmuted the video while it is playing?
You can check the isMuted value of the AVPlayer.
Documentation:
https://developer.apple.com/documentation/avfoundation/avplayer/1387544-ismuted
You can use a combination of KVO and Combine to observe changes to the isMuted property:
let player: AVPlayer
private var cancellables = Set<AnyCancellable>()
//...
player
    .publisher(for: \.isMuted)
    .sink { isMuted in
        // here you can handle the fact that the user has muted or unmuted the player
    }
    .store(in: &cancellables)
Alternatively, you can implement classic KVO to detect mute/unmute changes.
Note: in Swift the property is named isMuted, but KVO uses the Objective-C property name, so observe the "muted" key and you are set.
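A minimal sketch of the block-based KVO approach (the "muteObservation" property name is just an example; keep a strong reference to the observation for as long as you need updates):
// Assumes "player" is the AVPlayer you are already using for the stream.
var muteObservation: NSKeyValueObservation?

muteObservation = player.observe(\.isMuted, options: [.new]) { _, change in
    if let isMuted = change.newValue {
        print("User \(isMuted ? "muted" : "unmuted") the player")
    }
}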

How to get animation to work at exact points during playback of a music file?

Question:
In Swift code, apart from using an NSTimer, how can I get animations to start at exact points during playback of a music file played using AVFoundation?
Background
I have a method that plays a music file using AVFoundation (below). I also have UIView animations that I want to start at exact points during the music file being played.
One way I could achieve this is using an NSTimer, but that has the potential to get out of sync or not be exact enough.
Is there a way to tap into AVFoundation and read the music file's elapsed time (time counter), so that animations start when certain points in the playback arrive?
Is there an event / notification that AVFoundation triggers that gives a constant stream of time elapsed since the music file has started playing?
For example
At 0:52.50 (52 seconds and 1/2), call startAnimation1(), at 1:20.75 (1 minute, 20 seconds and 3/4), call startAnimation2(), and so on?
switch musicPlayingTimeElapsed {
case 0:52.50:
    startAnimation1()
case 1:20.75:
    startAnimation2()
default:
    ()
}
Playing music using AVFoundation
import AVFoundation
var myMusic : AVAudioPlayer?
func playMusic() {
    if let musicFile = self.setupAudioPlayerWithFile("fileName", type: "mp3") {
        self.myMusic = musicFile
    }
    myMusic?.play()
}

func setupAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer? {
    let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
    let url = NSURL.fileURLWithPath(path!)
    var audioPlayer: AVAudioPlayer?
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: url)
    } catch {
        print("AVAudioPlayer not available")
    }
    return audioPlayer
}
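(For readers on current Swift, the same setup in modern syntax would look roughly like this; the behavior is unchanged, only the API names differ:)
// Modern-Swift equivalent of setupAudioPlayerWithFile above.
func setupAudioPlayer(file: String, type: String) -> AVAudioPlayer? {
    guard let url = Bundle.main.url(forResource: file, withExtension: type) else {
        return nil
    }
    do {
        return try AVAudioPlayer(contentsOf: url)
    } catch {
        print("AVAudioPlayer not available: \(error)")
        return nil
    }
}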
If you use AVPlayer instead of AVAudioPlayer, you can use the (TBH slightly awkward) addBoundaryTimeObserverForTimes method:
let times = [
    NSValue(CMTime: CMTimeMake(...)),
    NSValue(CMTime: CMTimeMake(...)),
    NSValue(CMTime: CMTimeMake(...)),
    // etc
]

var observer: AnyObject? = nil // instance variable

self.observer = self.player.addBoundaryTimeObserverForTimes(times, queue: nil) {
    switch self.player.currentTime() {
    case 0:52.50:
        startAnimation1()
    case 1:20.75:
        startAnimation2()
    default:
        break
    }
}

// call this to stop the observer
self.player.removeTimeObserver(self.observer)
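For a more concrete picture, here is a hedged sketch with the question's example times filled in, using current Swift API names (the "player" instance and the startAnimation functions are assumed to exist in your class):
import AVFoundation

// Assumes "player" is an AVPlayer playing your music file and that
// startAnimation1()/startAnimation2() are defined elsewhere.
let boundaryTimes = [
    NSValue(time: CMTime(seconds: 52.5, preferredTimescale: 600)),   // 0:52.50
    NSValue(time: CMTime(seconds: 80.75, preferredTimescale: 600))   // 1:20.75
]

var timeObserver: Any?

timeObserver = player.addBoundaryTimeObserver(forTimes: boundaryTimes, queue: .main) {
    // The callback does not tell you which boundary fired, so compare the current time.
    let seconds = player.currentTime().seconds
    if abs(seconds - 52.5) < 0.1 {
        startAnimation1()
    } else if abs(seconds - 80.75) < 0.1 {
        startAnimation2()
    }
}

// Later, when the observations are no longer needed:
if let observer = timeObserver {
    player.removeTimeObserver(observer)
    timeObserver = nil
}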
The way I solve this is to divide the music up into separate segments beforehand. I then use one of two approaches:
I play the segments one at a time, each in its own audio player. The audio player's delegate is notified when a segment finishes, and so starting the next segment — along with accompanying action — is up to me.
Alternatively, I queue up all the segments onto an AVQueuePlayer. I then use KVO on the queue player's currentItem. Thus, I am notified exactly when we move to a new segment.
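A minimal sketch of that second approach, assuming the segment AVPlayerItems have already been created (the names below are placeholders, not from the original answer):
import AVFoundation

// Assumes "segmentItems" is an array of AVPlayerItem, one per pre-cut segment.
let queuePlayer = AVQueuePlayer(items: segmentItems)
var segmentObservation: NSKeyValueObservation?

segmentObservation = queuePlayer.observe(\.currentItem, options: [.new]) { player, _ in
    // Fires each time playback advances to the next queued segment.
    if let current = player.currentItem,
       let index = segmentItems.firstIndex(where: { $0 === current }) {
        print("Segment \(index) started; trigger the matching animation here.")
    }
}

queuePlayer.play()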
You might try using Key-Value Observing to observe the duration property of your sound as it plays. When it reaches one of your time thresholds, you'd trigger the corresponding animation. You'd need to treat each threshold as "time elapsed >= trigger time", since you will likely never see an exact match with your desired time.
I don't know how well that would work however. First, I'm not sure if the sound player's duration is KVO-compliant.
Next, KVO is somewhat resource-intensive, and if your KVO listener gets called thousands of times a second it might bog things down. It would at least be worth a try.

MonoGame MediaPlayer plays only one song

I use:
MediaPlayer.Play(song1);
to play a song.
Then I use
MediaPlayer.Play(song2);
to play the second song, but MediaPlayer keeps playing song1. I tried stopping the player and playing song2 again, but it doesn't work. When I swap song1 and song2, it plays only song2.
Edit:
I have this class:
public class SoundHelper
{
    public static void PlaySong(Song song)
    {
        MediaPlayer.Stop();
        MediaPlayer.Play(song);
    }

    public static void StopSong()
    {
        MediaPlayer.Stop();
    }
}
I use:
SoundHelper.PlaySong(Content.Load<Song>("Sounds/Songs/MenuTheme"));
to play a song when the game starts, and it works.
Then I use:
SoundHelper.PlaySong(Content.Load<Song>("Sounds/Songs/Battle"));
to play the next song in battle, but then MenuTheme plays from the beginning.
You have to stop the currently playing song first:
if (PlaySong1)
{
    MediaPlayer.Stop();
    MediaPlayer.Play(Song1);
}
else
{
    MediaPlayer.Stop();
    MediaPlayer.Play(Song2);
}
EDIT
Your code looks fine. Make sure you call SoundHelper only once, when loading the menu or the battle.
Also try wrapping MediaPlayer.Stop() in a try/catch; I'm not sure, but it may throw an exception if no song is loaded.
It's better to load the songs in your LoadContent instead:
song1 = Content.Load<Song>("song1");
song2 = Content.Load<Song>("song2");
Also, adding MediaPlayer.IsRepeating = true; might solve it.
That's all I can remember so far.
Unfortunately it seems to be a bug in MonoGame; I had the same issue.
With version 3.3 your approach should work.

How to scrub audio with AVPlayer?

I'm using seekToTime for an AVPlayer. It works fine, however, I'd like to be able to hear audio as I scrub through the video, much like how Final Cut or other video editors work. Just looking for ideas or if I've missed something obvious.
The way to do this is to scrub a second, simultaneous AVPlayer asynchronously alongside the video. I did it this way (in Swift 4):
// Create the simultaneous player and feed it the same URL:
let videoPlayer2 = AVPlayer(url: sameUrlAsVideoPlayer!)
videoPlayer2.volume = 5.0

// Set the simultaneous player to exactly the same point as the video player:
videoPlayer2.seek(to: sameSeekTimeAsVideoPlayer)

// Create a variable (letsScrub) that lets you activate the audio scrubber
// when the video is being scrubbed:
var letsScrub: Bool?

// When you are scrubbing the video you will set letsScrub to true:
if letsScrub == true { audioScrub() }

// Create this function to scrub audio asynchronously:
func audioScrub() {
    DispatchQueue.main.async {
        // Set the variable to false so the scrubbing does not get interrupted:
        self.letsScrub = false
        // Play the simultaneous player:
        self.videoPlayer2.play()
        // Make sure it plays for at least 0.25 of a second before it stops,
        // to get the scrubbing effect:
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.25) {
            // Now that a quarter of a second has passed (you can make it longer
            // or shorter as you please), pause the simultaneous player:
            self.videoPlayer2.pause()
            // Move the simultaneous player back to the same point as the
            // original video player:
            self.videoPlayer2.seek(to: self.sameSeekTimeAsVideoPlayer)
            // Setting this variable back to true allows the process to repeat
            // for as long as the video is being scrubbed:
            self.letsScrub = true
        }
    }
}
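For context, here is a hedged sketch of how this could be wired to a UISlider scrubber; the handler name, the "videoPlayer" property, and the 0...1 value mapping are assumptions, not part of the original answer:
// Hypothetical scrubber handler inside your player view controller.
// "videoPlayer" is the main player; "videoPlayer2", "letsScrub",
// "sameSeekTimeAsVideoPlayer" and audioScrub() come from the answer above.
@objc func scrubberValueChanged(_ slider: UISlider) {
    guard let duration = videoPlayer.currentItem?.duration, duration.isNumeric else { return }

    // Map the slider's 0...1 value onto the asset's timeline.
    let target = CMTime(seconds: Double(slider.value) * duration.seconds,
                        preferredTimescale: 600)

    // Scrub the video precisely...
    videoPlayer.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)

    // ...and let the second player play a short audio burst at the same spot.
    // (Assumes letsScrub was set to true when the drag began, e.g. on .touchDown.)
    sameSeekTimeAsVideoPlayer = target
    if letsScrub == true { audioScrub() }
}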
