WWDC 2012's Real-Time Media Effects and Processing during Playback session explains how to synchronize an AVPlayer with custom audio. Paraphrasing, you
1. start playback of the AVPlayer and the custom audio at the same time, and
2. play both at the same rate.
The first part is achieved by priming the AVPlayer with prerollAtRate:completionHandler:, so playback can be started with "minimal latency", and the second part by making the AVPlayer use the iOS audio clock as its master clock.
The code snippet assumes you have calculated the future host time at which you anticipate the audio hitting the speaker (taken literally, that phrase seems to demand supreme omniscience, so let's just read it as your desired audio start host time).
// Slave the player to the audio clock, then preroll so playback
// can later be started at a precise host time.
CMClockRef audioClock = NULL;
OSStatus err = CMAudioClockCreate(kCFAllocatorDefault, &audioClock);
if (err == noErr) {
    [myPlayer setMasterClock:audioClock];
    [myPlayer prerollAtRate:1.0 completionHandler:^(BOOL finished) {
        if (finished) {
            // Calculate the future host time here, then start playback
            // from newItemTime exactly at that host time.
            [myPlayer setRate:1.0 time:newItemTime atHostTime:hostTime];
        } else {
            // Preroll was interrupted or cancelled.
        }
    }];
}
It's a tiny amount of code, yet it raises so many questions. What happens if the preroll's currentTime and newItemTime don't agree? Don't video and audio play at the same rate of one second per second, so shouldn't their clocks be the same? Doesn't 60fps divide evenly into 48kHz? How can the code need to know only the desired start time and no other details of my audio code? Is that thanks to the one iOS Audio Clock? Is this API ingenious, or an awful non-orthogonal mish-mash that won't compose well with other AVFoundation features?
Despite my doubts, the code seems to work, but I want to seamlessly loop the video and custom audio. The question is how?
I can't preroll a playing AVPlayer, because preroll happens from currentTime (and the player wouldn't appreciate having its buffers changed while playing). Maybe an alternating pair of prerolled AVPlayers? AVPlayerLooper sounds promising. It's not actually an AVPlayer, but it wraps an AVQueuePlayer (which is). Assuming preroll works on the AVQueuePlayer, and I pay extra special attention to looping the custom audio, then something like the sketch below may work. Otherwise, I think the only remaining option is to drop the prerolling and shoehorn the video and custom audio into an AVComposition, processing the custom audio in an audio tap (MTAudioProcessingTap), and loop it with the help of an AVPlayerLooper.
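Here is a minimal Swift sketch of the AVPlayerLooper idea. The name videoURL and the half-second lead time are my assumptions, and whether preroll behaves the same while AVPlayerLooper is managing the queue is exactly the open question:

import AVFoundation

let queuePlayer = AVQueuePlayer()
let templateItem = AVPlayerItem(asset: AVAsset(url: videoURL)) // videoURL assumed
let looper = AVPlayerLooper(player: queuePlayer, templateItem: templateItem)

var audioClock: CMClock?
let status = CMAudioClockCreate(allocator: kCFAllocatorDefault, clockOut: &audioClock)
if status == noErr, let audioClock = audioClock {
    queuePlayer.masterClock = audioClock
    queuePlayer.preroll(atRate: 1.0) { finished in
        guard finished else { return } // preroll interrupted or cancelled
        // Start half a second from now; the custom audio must be
        // scheduled to start at this same host time.
        let hostTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                                 CMTime(seconds: 0.5, preferredTimescale: 1_000_000_000))
        queuePlayer.setRate(1.0, time: .invalid, atHostTime: hostTime)
    }
}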
Related
I need help using AVAudioPlayer, which I'm not very familiar with. I need to know how to fast-forward and/or rewind audio with AVAudioPlayer in Swift 5. There is a currentTime property on AVAudioPlayer that may be helpful in doing this, but I'm not sure how to use it.
You are correct: Set the AVAudioPlayer's currentTime. It measures the position (time) within the song, in seconds. (It calls this a TimeInterval, but that's just a Double signifying seconds or a fraction thereof.)
As the documentation remarks:
By setting this property you can seek to a specific point in a sound file or implement audio fast-forward and rewind functions.
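For instance, a pair of hypothetical helpers (the 15-second skip amount is arbitrary), assuming player is a prepared AVAudioPlayer:

import AVFoundation

func fastForward(_ player: AVAudioPlayer, by seconds: TimeInterval = 15) {
    // Clamp to the end of the file so we don't seek past it.
    player.currentTime = min(player.currentTime + seconds, player.duration)
}

func rewind(_ player: AVAudioPlayer, by seconds: TimeInterval = 15) {
    // Clamp to the start of the file.
    player.currentTime = max(player.currentTime - seconds, 0)
}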
I'm running into some strange issues with AVQueuePlayer. I'll first explain the purpose of my implementation, then go over the issues I am hitting.
Running on: iPhone 5s, iOS 10.1.1, LTE connection
Video: a progressively downloaded .mp4, 5 MB, 4-second duration.
The purpose of the implementation is to play a progressively downloaded video that loops seamlessly. The video won't contain any sounds.
I'm using an AVQueuePlayer (supporting iOS 9 and up) to loop videos. I set it up the way Apple recommends, which is to listen for the current player item to change and then move that item to the end of the queue (a sketch follows the link below).
https://developer.apple.com/library/content/samplecode/avloopplayer/Listings/Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift.html#//apple_ref/doc/uid/TP40014695-Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift-DontLinkElementID_11
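A minimal sketch of that recommendation (the name videoURL is an assumption; Apple's sample enqueues several copies of the same asset so the queue never drains):

import AVFoundation

let items = (0..<3).map { _ in AVPlayerItem(url: videoURL) } // videoURL assumed
let player = AVQueuePlayer(items: items)

// Keep a strong reference to the observation for as long as you loop.
let observation = player.observe(\.currentItem, options: [.old]) { player, change in
    // The item that just finished: rewind it and append it to the queue.
    if let finishedItem = change.oldValue ?? nil {
        finishedItem.seek(to: .zero, completionHandler: nil)
        player.insert(finishedItem, after: nil) // nil appends at the end
    }
}
player.play()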
I am hitting 2 issues.
Issue 1: My designer gave me a video that contains a video track and an audio track of equal duration. I am able to track the buffer progress by checking the current player item's loadedTimeRanges. However, when the video loops, it isn't seamless. So we tried a video without the audio track and hit Issue 2.
Issue 2: Testing the same video, but with only a video track, the video loops amazingly; it's seamless. However, when checking loadedTimeRanges to track the buffer progress, the duration remains 0 until the video has completely loaded, at which point it reports the total duration of the video.
Is Issue 2 a bug? I also find it strange that removing the audio track creates a much more seamless loop.
I've provided my code below that is used to check the seconds buffered. Note that it returns a duration of 0 if playerItem.loadedTimeRanges.first?.timeRangeValue doesn't exist. I can confirm that the value does exist and that the duration is properly returned when testing both issues.
public var secondsBuffered: Float64 {
    if let playerItem = self.player?.currentItem {
        if let loadedTimeRange = playerItem.loadedTimeRanges.first?.timeRangeValue {
            let duration: Float64 = CMTimeGetSeconds(loadedTimeRange.duration)
            return duration
        }
    }
    return 0
}
I am developing an application which requires applying audio effects to recorded video.
I am recording video using the GPUImage library, and I have that working successfully. Now I need to apply audio effects like Chipmunk, Gorilla, Large Room, etc.
I looked into Apple's documentation, and it says that AVAudioEngine can't apply an AVAudioUnitTimePitch to an input node (i.e. the microphone).
To work around this problem, I use the following mechanism:
1. Record the video and audio at the same time.
2. Play the video. While it plays, start an AVAudioEngine on the audio file and apply an AVAudioUnitTimePitch to it (a sketch follows this list):
[playerNode play]; // Start playing audio file with video preview
3. Merge the video and the new effected audio file.
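Step 2 might look something like this in Swift; audioFileURL and the pitch value are assumptions:

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 1200 // +1 octave, in cents ("chipmunk")

engine.attach(playerNode)
engine.attach(timePitch)

do {
    let file = try AVAudioFile(forReading: audioFileURL) // audioFileURL assumed
    // Route the file through the pitch effect to the output.
    engine.connect(playerNode, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    try engine.start()
    playerNode.play() // start the effected audio alongside the video preview
} catch {
    print("Audio engine setup failed: \(error)")
}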
Problems:
1. The user has to preview the full video in order for the audio effect to be merged. This is not a good solution.
2. If I set the volume of the playerNode to 0 (zero), it records a muted video.
Please suggest a better way to do this. Thanks in advance.
I am interested in recording media using an AVCaptureSession in iOS while playing media back using an AVPlayer (specifically, I am playing back audio and recording video, but I'm not sure it matters).
The problem is that when I play the resulting media back together later, they are out of sync. Is it possible to synchronize them, either by ensuring that playback and recording start simultaneously, or by discovering the offset between them? I probably need the sync to be on the order of 10 ms. It is unreasonable to assume that I can always capture the played audio (since the user may use headphones), so syncing by analyzing the original and recorded audio is not an option.
This question suggests that it's possible to end playback and recording simultaneously and determine the initial offset from the resulting lengths, but I'm unclear on how to get them to end simultaneously. I have two cases: 1) the audio playback runs out, and 2) the user hits the "stop recording" button.
This question suggests priming and then applying a fixed, but possibly device-dependent, delay, which is obviously a hack, but if it's good enough for audio it's worth considering for video.
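For what it's worth, the preroll-plus-host-time approach from the top of this page can at least pin down the playback start. A rough Swift sketch, assuming the capture timestamps are on the host clock (true when the capture session's masterClock is the host time clock; otherwise convert with CMSyncConvertTime), with mediaURL assumed:

import AVFoundation

let player = AVPlayer(url: mediaURL) // mediaURL assumed
player.automaticallyWaitsToMinimizeStalling = false // required for setRate(_:time:atHostTime:) on iOS 10+

// Choose a start time slightly in the future and start playback exactly then.
let startHostTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                              CMTime(seconds: 0.5, preferredTimescale: 1_000_000_000))
player.setRate(1.0, time: .zero, atHostTime: startHostTime)

// Then, in captureOutput(_:didOutput:from:), a captured frame's offset from
// the start of playback is its presentation timestamp minus that host time.
func playbackOffset(of sampleBuffer: CMSampleBuffer) -> CMTime {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    return CMTimeSubtract(pts, startHostTime)
}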
Is there another media layer I can use to perform the required synchronization?
Related: this question is unanswered.
If you are specifically using AVPlayer to play back audio, I would suggest using Audio Queue Services instead. It's seamless and fast, since it reads buffer by buffer, and play/pause is more responsive than AVPlayer's.
There is also the possibility that you are missing an initial [avPlayer prepareToPlay] call, which might be causing overhead while it syncs before playing the audio.
Hope this helps.
I have roughly researched the audio APIs for iOS. There are several layers of API for performing audio capture and playback.
My app needs a simple function like an audio amplifier (with a delay of around 0.2 seconds). I don't need to save the recording to a file. I am not sure which way is simpler to implement: Core Audio or AVFoundation?
How do I record audio on iPhone with AVAudioRecorder? I am not sure whether this link works for my case or not.
While playing a sound does not stop recording Avcapture: this link is about playing other audio while recording, which does not suit my case.
For buffered, near-simultaneous audio record and playback, you will need to use either the Audio Queue API or Audio Units such as RemoteIO. Audio Units allow lower latency.
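As an aside, here is a hedged sketch of the ~0.2-second amplifier using AVAudioEngine, which wraps the RemoteIO unit underneath. The gain and delay values are placeholders, and it assumes microphone permission and a .playAndRecord AVAudioSession are already set up:

import AVFoundation

let engine = AVAudioEngine()

let delay = AVAudioUnitDelay()
delay.delayTime = 0.2 // seconds between capture and playback
delay.feedback = 0 // one delayed copy, no echo repeats
delay.wetDryMix = 100 // output only the delayed signal

let eq = AVAudioUnitEQ(numberOfBands: 1)
eq.globalGain = 6 // dB of boost: the "amplifier" part

engine.attach(delay)
engine.attach(eq)

let format = engine.inputNode.inputFormat(forBus: 0)
engine.connect(engine.inputNode, to: delay, format: format)
engine.connect(delay, to: eq, format: format)
engine.connect(eq, to: engine.mainMixerNode, format: format)

do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}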