Changing the audioTimePitchAlgorithm parameter for an AVPlayerItem - iOS

I'm trying to change the audioTimePitchAlgorithm for playback at lower rates than 0.5 but don't seem to be having much luck. I've scoured SO for solutions but every implementation I try seems to still limit me to a rate of 0.5 or above...
var player: AVPlayer = AVQueuePlayer()

@IBAction func play(sender: AnyObject) {
    let assetQueue = [aVItem1, aVItem2, aVItem3, aVItem4, aVItem5, aVItem6, aVItem7, aVItem8, aVItem9, aVItem0]
    var itemQueue: [AVPlayerItem] = []
    for index in 0...9 {
        let nextItem: AVPlayerItem = AVPlayerItem(asset: assetQueue[index])
        nextItem.audioTimePitchAlgorithm = "AVAudioTimePitchAlgorithmSpectral"
        itemQueue.append(nextItem)
    }
    player = AVQueuePlayer(items: itemQueue)
    player.play()
    player.rate = 0.25
}
I apologise in advance if this is actually a straightforward process and I am perhaps too new to this to grasp some of the underlying concepts. I have also tried creating an AVMutableAudioMixInputParameters object, assigning it to an AVMutableAudioMix object, and in turn assigning that mix object to the AVPlayerItem, but this yielded the same result (playback at a rate of 0.5), so I have only included my first, more simplified, code attempt. Any help would be greatly appreciated :)
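A likely culprit, judging from the snippet: the value assigned is the constant's name as a string literal, not the constant itself. A minimal sketch of the fix inside the loop above (Swift 2-era syntax to match the question; in current Swift the property takes AVAudioTimePitchAlgorithm.spectral):
nextItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmSpectral
// The constant's raw value is the short string "Spectral"; assigning the
// literal "AVAudioTimePitchAlgorithmSpectral" is not equivalent, and the item
// appears to fall back to the default algorithm, whose supported rate range
// is roughly 0.5-2.0. Both the Spectral and Varispeed algorithms support
// rates well below 0.5.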

Related

how to change the playing speed of AVPlayerItem in swift?

I am using a library named Jukebox to play audio files, and I want to make the player faster or slower.
It inherits from AVPlayerItem, but I can't find out how.
Can anyone help me?
I think you can use the AVPlayer rate property:
var rate: Float { get set }
For instance:
let playerItem = AVPlayerItem(URL: yourUrl)
let player = AVPlayer(playerItem: playerItem)
// This will make your player play faster or slower accordingly
player.rate = Float(rateValue)
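Two caveats worth adding here: setting a nonzero rate on an AVPlayer also starts playback, so a separate play() call is redundant; and rates outside roughly 0.5-2.0 generally require picking a suitable audioTimePitchAlgorithm on the AVPlayerItem first, as discussed in the question above.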

Build a simple Equalizer

I would like to make a 5-band audio equalizer (60Hz, 230Hz, 910Hz, 4kHz, 14kHz) using AVAudioEngine. I would like to have the user input gain per band through a vertical slider and accordingly adjust the audio that is playing. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I tried to hardcode in values to specify a gain at each frequency, but it still does not work. Here is the code I have:
var audioEngine: AVAudioEngine = AVAudioEngine()
var equalizer: AVAudioUnitEQ!
var audioPlayerNode: AVAudioPlayerNode = AVAudioPlayerNode()
var audioFile: AVAudioFile!

// in viewDidLoad():
equalizer = AVAudioUnitEQ(numberOfBands: 5)
audioEngine.attach(audioPlayerNode)
audioEngine.attach(equalizer)
let bands = equalizer.bands
let freqs = [60, 230, 910, 4000, 14000]
audioEngine.connect(audioPlayerNode, to: equalizer, format: nil)
audioEngine.connect(equalizer, to: audioEngine.outputNode, format: nil)

for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}

bands[0].gain = -10.0
bands[0].filterType = .lowShelf
bands[1].gain = -10.0
bands[1].filterType = .lowShelf
bands[2].gain = -10.0
bands[2].filterType = .lowShelf
bands[3].gain = 10.0
bands[3].filterType = .highShelf
bands[4].gain = 10.0
bands[4].filterType = .highShelf

do {
    if let filepath = Bundle.main.path(forResource: "song", ofType: "mp3") {
        let filepathURL = NSURL.fileURL(withPath: filepath)
        audioFile = try AVAudioFile(forReading: filepathURL)
        audioEngine.prepare()
        try audioEngine.start()
        audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        audioPlayerNode.play()
    }
} catch _ {}
Since the low frequencies have a gain of -10 and the high frequencies have a gain of 10, there should be a very noticeable difference when playing any media. However, when the media starts playing, it sounds the same as if played without any equalizer attached.
I'm not sure why this is happening, but I tried several different things to debug. I thought that it might be the order of the functions so I tried switching it so that audioEngine.connect is called after adjusting all of the bands, but that did not make a difference either.
I tried this same code using an AVAudioUnitTimePitch, and it worked perfectly, so I am dumbfounded as to why it does not work with AVAudioUnitEQ.
I do not want to use any third-party libraries or CocoaPods for this project; I would like to do it using AVFoundation alone.
Any help would be greatly appreciated!
Thanks in advance.
Looking through the AVAudioUnitEQFilterParameters documentation, I noticed that I had set every parameter except bypass, and it seems that changing this flag fixed everything!
So, I believe the main issue here is that each AVAudioUnitEQ band is bypassed by default; you must explicitly un-bypass it yourself rather than relying on the system-provided values.
So, I changed
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}

to

for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
    bands[i].bypass = false
    bands[i].filterType = .parametric
}
and everything started working. Furthermore, to make an effective equalizer that lets the user modify individual frequencies, the filterType for each band should be set to .parametric.
I am still unsure what I should set the bandwidth to, but I can probably check online for that or just experiment until the sound matches a different equalizer application.
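Pulling the fix together, here is a minimal sketch of the corrected setup (a sketch only: the file loading from the question and the slider wiring are omitted, and the 1.0-octave bandwidth is a starting guess rather than a documented recommendation):
import AVFoundation

let engine = AVAudioEngine()
let equalizer = AVAudioUnitEQ(numberOfBands: 5)
let playerNode = AVAudioPlayerNode()

engine.attach(playerNode)
engine.attach(equalizer)
engine.connect(playerNode, to: equalizer, format: nil)
engine.connect(equalizer, to: engine.outputNode, format: nil)

let freqs: [Float] = [60, 230, 910, 4000, 14000]
for (i, band) in equalizer.bands.enumerated() {
    band.filterType = .parametric   // one adjustable peak per band
    band.frequency = freqs[i]
    band.bandwidth = 1.0            // in octaves; tune by ear
    band.gain = 0                   // later driven by the user's slider, in dB
    band.bypass = false             // bands are bypassed by default
}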

How to close previous AVPlayer and AVPlayerItem

I'm making an iOS app in Swift that plays a video in a loop in a small layer in the top right corner of the screen, showing a video of a specific coloured item. The user then taps the corresponding coloured item on the screen. When they do, the videoName variable is randomly changed to the next colour and the corresponding video is triggered. I have no problem raising, playing and looping the video with AVPlayer and AVPlayerItem, as seen in the attached code.
Where I'm stuck is that whenever the next video is shown, the previous ones stay open behind it. Also, after 16 videos have played, the player disappears altogether on my iPad. I've tried many suggestions presented on this and other sites, but either Swift finds a problem with them or they just don't work.
So, the question: within my code here, how do I tell it "hey, the next video has started to play; remove the previous video and its layer and free up the memory so I can play as many videos as needed"?
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    let playerLayer = AVPlayerLayer(player: player!)
    //set player location in UIView and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    // Add notification to know when the video ends, then replay it again. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}
I adapted @Anupam Mishra's Swift code suggestion. It wasn't working at first, but I finally figured out I had to take the playerLayer outside the function and remove the playerLayer after I set the player to nil. Also, instead of using `if player!.rate > 0 ...`, which would no doubt still work, I created a Boolean switch to indicate when to say "kill the player AND the layer", as seen below. It may not be pretty but it works! The following is for absolute newbies like myself.

WHAT I LEARNED FROM THIS EXPERIENCE: it seems to me that an iOS device only allows 16 such layers to be added to a viewController (or superlayer), so each layer needs to be deleted before opening the next layer with its player, unless you really want 16 layers running all at once.

WHAT THIS CODE BELOW ACTUALLY DOES FOR YOU: this code creates a re-sizable layer over an existing viewController and plays a video from your bundle in an endless loop. When the next video is about to be called, the current video and its layer are totally removed, freeing up the memory, and then a new layer with the new video is played. The video layer's size and location are fully customizable via the playerLayer.frame = CGRectMake(left, top, width, height) parameters.

HOW TO MAKE IT ALL WORK: assuming you've already added your videos to your bundle, create another function for your button tap. In that function, first call the closePlayer() function, change the videoName variable to the new video name you desire, then call the showVideo() function. (If you need to change the video extension, change videoExtension from a let to a var.)
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?
var playerLayer = AVPlayerLayer() //NEW playerLayer var location

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"
var createLayerSwitch = true /*NEW switch to say whether or not to create the layer when referenced by the closePlayer and showVideo functions*/

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    playerLayer = AVPlayerLayer(player: player!) //NEW: 'let' removed from playerLayer here
    //set player location in UIView and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    createLayerSwitch = false //NEW switch to tell whether a layer is already created. Set to false so that when the next tapped item/button calls closePlayer(), the condition is triggered to close the player AND the layer
    // Add notification to know when the video ends, then replay it again without a pause between replays. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}

//NEW function to kill the current player and layer before playing the next video
func closePlayer() {
    if createLayerSwitch == false {
        player!.pause()
        player = nil
        playerLayer.removeFromSuperlayer()
        createLayerSwitch = true //reset so a second call doesn't hit a nil player
    }
}
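One further cleanup that the code above leaves out (a suggestion, not part of the original answer): the block-based NSNotificationCenter observer registered in showVideo() is never removed, so each new video leaves behind an observer whose closure retains self. Keeping the token returned by addObserverForName and passing it to removeObserver inside closePlayer() avoids that.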
Why not just use replaceCurrentItemWithPlayerItem: on your player? You keep the same player for all your videos. I think it's a better way to do it.
Edit: replaceCurrentItemWithPlayerItem: has to be called on the main thread.
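A minimal sketch of that approach, reusing the question's bundle layout for the file path (Swift 2-era syntax to match the rest of this thread; the path building is copied from the question and may need adjusting):
let player = AVPlayer() // created once and reused for every video

func showVideo(videoName: String) {
    let videoPath = NSBundle.mainBundle().resourcePath!
    let url = NSURL(fileURLWithPath: videoPath + "/Base.lproj/" + videoName + ".mp4")
    let item = AVPlayerItem(URL: url)
    // replaceCurrentItemWithPlayerItem must run on the main thread
    dispatch_async(dispatch_get_main_queue()) {
        player.replaceCurrentItemWithPlayerItem(item)
        player.play()
    }
}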
Before moving on to the next track, first check whether the player is currently playing any video or music; to stop it, do the following checks:
Swift code:
if player!.rate > 0 && player!.error == nil {
    player!.pause()
    player = nil
}
Objective-C code:
if (player.rate > 0 && !player.error) {
    [player setRate:0.0];
}

AVPlayer not synchronized

I'm really out of ideas so I'll have to ask you guys again...
I'm building an iPhone application which uses three instances of AVPlayer. They all play at the same time and it's very important that they do so. I used to run this code:
CMClockRef syncTime = CMClockGetHostTimeClock();
CMTime hostTime = CMClockGetTime(syncTime);
[self.playerOne setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerTwo setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerThree setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
which worked perfectly. But a few days ago it just stopped working: the three players are delayed by about 300-400 ms (which is way too much; everything under 100 ms would be okay). Two of these AVPlayers do some audio processing, which takes somewhat longer than a "normal" AVPlayer, but it used to work before, and the currentTime property tells me that these players are delayed, so the syncing seems to fail.
I have no idea why it stopped working; I didn't really change anything. I'm using an observer through which I can query the self.playerX.currentTime property, and it reports a delay of about 0.3-0.4 seconds. I already tried to resync the players whenever the delay exceeds 0.1 s, but the delay is still there. So I don't think the audio processing of players one and two can be responsible for the delay, since the currentTime property does know they are delayed (I hope you know what I mean). Maybe one of you knows why I'm getting such a horrible delay, or can offer another idea.
Thanks in advance!
So, I found the solution. I forgot to call [self.playerX prerollAtRate:completionHandler:]. I thought that once the status observer reported readyToPlay, the player was "really" ready. In fact, it is not: after an AVPlayer is readyToPlay, it still has to be prerolled. Once that is done you can sync your players. The delay is now somewhere around 0.000006 seconds.
Full function to sync AVPlayers across multiple iOS devices:
private func startTribePlayer() {
    let dateFormatterGet = DateFormatter()
    dateFormatterGet.dateFormat = "yyyy-MM-dd"
    guard let refDate = dateFormatterGet.date(from: "2019-01-01") else { return }
    let tsRef = Date().timeIntervalSince(refDate)
    //currentDuration is avplayeritem.duration().seconds
    let remainder = tsRef.truncatingRemainder(dividingBy: currentDuration)
    let ratio = remainder / currentDuration
    let seekTime = ratio * currentDuration
    let bufferTime = 0.5
    let bufferSeekTime = seekTime + bufferTime
    let mulFactor = 10000.0
    let timeScale = CMTimeScale(mulFactor)
    let seekCMTime = CMTime(value: CMTimeValue(CGFloat(bufferSeekTime * mulFactor)), timescale: timeScale)
    let syncTime = CMClockGetHostTimeClock()
    let hostTime = CMClockGetTime(syncTime)
    tribeMusicPlayer?.seek(to: seekCMTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { [weak self] (successSeek) in
        guard let tvc = self, tvc.tribeMusicPlayer?.currentItem?.status == .readyToPlay else { return }
        tvc.tribeMusicPlayer?.preroll(atRate: 1.0, completionHandler: { [tvc] (successPreroll) in
            tvc.tribePlayerDidPlay = true
            tvc.tribeMusicPlayer?.setRate(1.0, time: seekCMTime, atHostTime: hostTime)
        })
    })
}
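One caveat that may save others some debugging (an addition based on Apple's AVPlayer documentation, not part of the original answer): on iOS 10 and later, setRate(_:time:atHostTime:) is documented to require automaticallyWaitsToMinimizeStalling to be false; leaving the default can raise an exception.
tribeMusicPlayer?.automaticallyWaitsToMinimizeStalling = false // before setRate(_:time:atHostTime:)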

AVPlayer "freezes" the app at the start of buffering an audio stream

I am using a subclass of AVQueuePlayer, and when I add a new AVPlayerItem with a streaming URL, the app freezes for about a second or two. By freezing, I mean that it doesn't respond to touches on the UI. Also, if I have a song playing already and then add another one to the queue, AVQueuePlayer automatically starts preloading the next song while it is still streaming the first one. This makes the app unresponsive to touches for two seconds, just as when adding the first song, although the music keeps playing. So AVQueuePlayer is doing something on the main thread that causes the apparent "freeze".
I am using insertItem:afterItem: to add my AVPlayerItem. I tested and made sure that this was the method that was causing the delay. Maybe it could be something that AVPlayerItem does when it gets activated by AVQueuePlayer at the moment of adding it to the queue.
I must point out that I am using the Dropbox API v1 beta to get the streaming URL, via this method call:
[[self restClient] loadStreamableURLForFile:metadata.path];
Then when I receive the stream URL I send it to AVQueuePlayer as follows:
[self.player insertItem:[AVPlayerItem playerItemWithURL:url] afterItem:nil];
So my question is: How do I avoid this?
Should I do the preloading of an audio stream on my own without the help of AVPlayer? If so, how do I do this?
Thanks.
Don't use playerItemWithURL:; it's synchronous.
When you receive the response with the URL, try this:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = @[@"playable"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    [self.player insertItem:[AVPlayerItem playerItemWithAsset:asset] afterItem:nil];
}];
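Worth adding (not part of the original answer): loadValuesAsynchronouslyForKeys: makes no guarantee about which queue the completion handler runs on, so if the block touches the UI, or calls something main-thread-only like replaceCurrentItemWithPlayerItem:, dispatch back to the main queue inside the handler first.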
Bump, since this is a highly rated question and similar questions online either have outdated answers or aren't great. The whole idea is pretty straightforward with AVKit and AVFoundation, which means no more depending on third-party libraries. The only issue is that it took some tinkering to put the pieces together.
AVFoundation's AVPlayer initialization with a URL is apparently not thread safe, or rather it's not meant to be. This means that no matter how you initialize it on a background thread, the player attributes are going to be loaded on the main queue, causing freezes in the UI, especially in UITableViews and UICollectionViews.

To solve this issue, Apple provides AVAsset, which takes a URL and assists in loading media attributes like tracks, playability, duration, etc., and can do so asynchronously, with the best part being that this loading process is cancellable (unlike other dispatch-queue background tasks, where ending a task may not be that straightforward). This means there is no need to worry about lingering zombie threads in the background as you scroll fast on a table view or collection view, ultimately piling up on the memory with a whole bunch of unused objects. This cancellable feature is great, and allows us to cancel any lingering AVAsset async load if it is in progress, but only during cell dequeue. The async loading process can be invoked by the loadValuesAsynchronously method and can be cancelled at will at any later time (if still in progress).
Don't forget to handle errors properly using the results of loadValuesAsynchronously. In Swift (3/4), here's how you would load a video asynchronously and handle situations where the async process fails (due to slow networks, etc.):
TL;DR
TO PLAY A VIDEO
let asset = AVAsset(url: URL(string: self.YOUR_URL_STRING)!) // force-unwrap assumes a valid URL string
let keys: [String] = ["playable"]
var player: AVPlayer!

asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: "playable", error: &error)
    switch status {
    case .loaded:
        DispatchQueue.main.async {
            let item = AVPlayerItem(asset: asset)
            self.player = AVPlayer(playerItem: item)
            let playerLayer = AVPlayerLayer(player: self.player)
            playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            playerLayer.frame = self.YOUR_VIDEOS_UIVIEW.bounds
            self.YOUR_VIDEOS_UIVIEW.layer.addSublayer(playerLayer)
            self.player.isMuted = true
            self.player.play()
        }
    case .failed:
        DispatchQueue.main.async {
            //do something, show alert, put a placeholder image etc.
        }
    case .cancelled:
        DispatchQueue.main.async {
            //do something, show alert, put a placeholder image etc.
        }
    default:
        break
    }
})
NOTE:
Based on what your app wants to achieve, you may still have to do some amount of tinkering to get smooth scrolling in a UITableView or UICollectionView. You may also need to implement some amount of KVO on the AVPlayerItem properties for it to work, and there are plenty of posts here on SO that discuss AVPlayerItem KVOs in detail.
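For reference, a minimal sketch of one such KVO using the block-based observe API (Swift 4+), watching the status of the item created in the snippet above; the property holding the observation is illustrative:
var statusObservation: NSKeyValueObservation? // keep a strong reference, e.g. as a property

statusObservation = item.observe(\.status, options: [.new]) { item, _ in
    switch item.status {
    case .readyToPlay:
        break // safe to start or resume playback here
    case .failed:
        break // inspect item.error, show a placeholder, etc.
    default:
        break
    }
}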
TO LOOP THROUGH ASSETS (video loops/GIFs)
To loop a video, you can use the same method above, introducing AVPlayerLooper. Here's a sample to loop a video (or perhaps a short video in GIF style). Note the use of the duration key, which is required for the video loop.
let asset = AVAsset(url: URL(string: self.YOUR_URL_STRING)!) // force-unwrap assumes a valid URL string
let keys: [String] = ["playable", "duration"]
var player: AVPlayer!
var playerLooper: AVPlayerLooper!

asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: "duration", error: &error)
    switch status {
    case .loaded:
        DispatchQueue.main.async {
            let playerItem = AVPlayerItem(asset: asset)
            self.player = AVQueuePlayer()
            let playerLayer = AVPlayerLayer(player: self.player)
            //define the time range for the loop using asset.duration
            let duration = playerItem.asset.duration
            let start = CMTime(seconds: duration.seconds * 0, preferredTimescale: duration.timescale)
            let end = CMTime(seconds: duration.seconds * 1, preferredTimescale: duration.timescale)
            let timeRange = CMTimeRange(start: start, end: end)
            self.playerLooper = AVPlayerLooper(player: self.player as! AVQueuePlayer, templateItem: playerItem, timeRange: timeRange)
            playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            playerLayer.frame = self.YOUR_VIDEOS_UIVIEW.bounds
            self.YOUR_VIDEOS_UIVIEW.layer.addSublayer(playerLayer)
            self.player.isMuted = true
            self.player.play()
        }
    case .failed:
        DispatchQueue.main.async {
            //do something, show alert, put a placeholder image etc.
        }
    case .cancelled:
        DispatchQueue.main.async {
            //do something, show alert, put a placeholder image etc.
        }
    default:
        break
    }
})
EDIT: Per the documentation, AVPlayerLooper requires the asset's duration property to be fully loaded in order to loop videos. Also, the timeRange parameter with a start and end time in the AVPlayerLooper initialization is optional if you want an infinite loop. I have also realized since I posted this answer that AVPlayerLooper is only about 70-80% accurate in looping videos, especially if your AVAsset needs to stream the video from a URL. To solve this issue, there is a totally different (yet simple) approach to looping a video:
//this will loop the video since this is a GIF-style clip
let interval = CMTime(value: 1, timescale: 2)
self.timeObserverToken = self.player?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { progressTime in
    if let totalDuration = self.player?.currentItem?.duration {
        if progressTime == totalDuration {
            self.player?.seek(to: kCMTimeZero)
            self.player?.play()
        }
    }
})
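A hedged aside on this approach: the periodic observer fires at fixed half-second intervals, so comparing progressTime for exact equality with the item's duration can be fragile; listening for the AVPlayerItemDidPlayToEndTime notification and seeking back to zero there, as the looping code earlier on this page does, is a common alternative.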
Gigisommo's answer for Swift 3 including the feedback from the comments:
let asset = AVAsset(url: url)
let keys: [String] = ["playable"]

asset.loadValuesAsynchronously(forKeys: keys) {
    DispatchQueue.main.async {
        let item = AVPlayerItem(asset: asset)
        self.playerCtrl.player = AVPlayer(playerItem: item)
    }
}
