Reconciling CACurrentMediaTime() and deviceCurrentTime - ios

I am trying to synchronize several CABasicAnimations with an AVAudioPlayer. The issue is that CABasicAnimation uses CACurrentMediaTime() as its reference point when scheduling animations, while AVAudioPlayer uses deviceCurrentTime. The animations also use CFTimeInterval, while the audio uses NSTimeInterval (I'm not sure whether they're "toll-free bridged" like other CF and NS types). I'm finding that the reference points are different as well.
Is there a way to ensure that the sounds and animations use the same reference point?

I don't know the "official" answer, but they are both double precision floating point numbers that measure a number of seconds from some reference time.
From the docs, it sounds like deviceCurrentTime is linked to the current audio session:
The time value, in seconds, of the audio output device. (read-only)
@property(readonly) NSTimeInterval deviceCurrentTime
Discussion
The value of this property increases monotonically while an audio player is playing or paused.
If more than one audio player is connected to the audio output device, device time continues incrementing as long as at least one of the players is playing or paused.
If the audio output device has no connected audio players that are either playing or paused, device time reverts to 0.
You should be able to start an audio output session, call CACurrentMediaTime() and read your player's deviceCurrentTime in two sequential statements, and then calculate an offset constant to convert between them. That offset would be accurate to within a few nanoseconds.
The offset would only be valid while the audio output session is active. You'd have to recalculate it each time you remove all audio players from the audio session.
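A minimal sketch of that approach in Swift (the helper names are mine, and it assumes the player has already been prepared with prepareToPlay() so the audio output device is active):

import AVFoundation
import QuartzCore

// Sample both clocks back to back; the difference converts between timebases.
// deviceTime ≈ mediaTime + offset for as long as the audio device stays active.
func mediaToDeviceOffset(for player: AVAudioPlayer) -> TimeInterval {
    let mediaNow = CACurrentMediaTime()        // Core Animation clock
    let deviceNow = player.deviceCurrentTime   // audio output device clock
    return deviceNow - mediaNow
}

// Start an animation and a sound at the same instant, 0.5 s from now.
func scheduleSyncedStart(player: AVAudioPlayer, layer: CALayer, animation: CABasicAnimation) {
    let offset = mediaToDeviceOffset(for: player)
    let startMediaTime = CACurrentMediaTime() + 0.5

    animation.beginTime = startMediaTime                // animation timeline
    layer.add(animation, forKey: "synced")
    _ = player.play(atTime: startMediaTime + offset)    // same instant, audio timeline
}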

I think the official answer just changed, though currently under NDA.
See "What's New in Camera Capture", in particular the last few slides about the CMSync* functions.
https://developer.apple.com/videos/wwdc/2012/?id=520

Related

How do you fast forward or rewind audio (like a song) for a certain time interval with AV Audio Player in Swift 5?

I need help using AVAudioPlayer, a class I'm not very familiar with. I need to know how to fast-forward and/or rewind audio with AVAudioPlayer in Swift 5. There is a currentTime property on AVAudioPlayer that may be helpful for this, but I'm not sure how to use it.
You are correct: Set the AVAudioPlayer's currentTime. It measures the position (time) within the song, in seconds. (It calls this a TimeInterval, but that's just a Double signifying seconds or a fraction thereof.)
As the documentation remarks:
By setting this property you can seek to a specific point in a sound file or implement audio fast-forward and rewind functions.
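For example, a small sketch (the skip helper and the 15-second interval are just illustrative):

import AVFoundation

// Nudge playback forward or backward, clamping so we never seek
// before the start of the file or past its end.
func skip(_ player: AVAudioPlayer, by seconds: TimeInterval) {
    let target = player.currentTime + seconds
    player.currentTime = max(0, min(target, player.duration))
}

// skip(player, by: 15)     // fast-forward 15 seconds
// skip(player, by: -15)    // rewind 15 seconds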

Objective-C: Convert float to int64_t without precision loss

THE SCENARIO: I am working on an application that uses RTPlayer to play prerecorded video and audio from our server.
THE SUSPECTS: RTPlayer has two useful properties for tracking the media position in seconds: .initialPlaybackTime and .currentPosition. .initialPlaybackTime sets where in the media the player should start playing from, and .currentPosition tells you where you left off so you can resume at the same position.
THE ISSUE: .initialPlaybackTime is of type int64_t, and .currentPosition is of type float. When I "plug" the .currentPosition value into .initialPlaybackTime, there is always about 8-10 seconds ADDED to the player's position.
QUESTION: How can I convert the .currentPosition float value to an int64_t and keep the same value?
The "8-10 seconds being added to the player's position" may have something to do with the underlying HTTP Live Streaming (HLS) technology.
If the media you are playing is streamed it is likely that it conforms to this technology and, if so, will be split into several smaller chunks of media (my experience is this is usually about 15 seconds in length for video) at a variety of bitrates.
In that case, unless initialPlaybackTime coincides with the start time of one of those media segments, the player is probably snapping to a segment boundary: it jumps back to the beginning of the nearest segment (a common practice), or forward to the next segment if the requested time is near the end of the current one, to avoid loading a full segment's worth of media data without playing it.
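As for the conversion itself, rounding a position in seconds to an integer loses at most half a second, so it cannot by itself explain an 8-10 second jump. A sketch, assuming both properties are expressed in seconds (worth double-checking against RTPlayer's documentation):

// Float → int64_t conversion for a playback position in seconds.
let currentPosition: Float = 83.4                            // e.g. RTPlayer's .currentPosition
let initialPlaybackTime = Int64(currentPosition.rounded())   // 83; error is at most 0.5 s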

Playing a sequence of audio clips in the background after a delay

I would like to schedule a series of local audio tracks to play in sequence while my app is backgrounded or the device is locked, and after a delay. Furthermore, each track should only play a fixed duration sample. For example:
User presses a button in my app.
User locks device.
After x minutes, tracks A, B, and C each play for y seconds in sequence.
How might I accomplish this?
My current best hope is to schedule these in an AVQueuePlayer, since if I set up the queue correctly the background audio should then 'just work'. But I don't see any way to set a duration for each AVPlayerItem. I also don't know how to set an initial delay, though I would consider looping a silent audio clip if that is the only obstacle.
If you use AudioKit, something like this is easy to accomplish. Have a look at AKClipPlayer or AKPlayer; both classes support scheduling and start/end times.
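If you would rather not add a dependency, here is a sketch of the asker's AVQueuePlayer idea in plain AVFoundation (function and parameter names are mine): each AVPlayerItem's forwardPlaybackEndTime caps how long that clip plays before the queue advances, and the initial delay could come from a leading silent item, as the asker suggested, since timers are unreliable once the app is suspended.

import AVFoundation

// Build a queue in which every clip stops after clipDuration seconds.
// Background playback also requires the audio background mode and an
// active AVAudioSession with the .playback category.
func makeQueue(urls: [URL], clipDuration: Double) -> AVQueuePlayer {
    let items = urls.map { url -> AVPlayerItem in
        let item = AVPlayerItem(url: url)
        // End this item early; the queue then advances to the next one.
        item.forwardPlaybackEndTime = CMTime(seconds: clipDuration, preferredTimescale: 600)
        return item
    }
    return AVQueuePlayer(items: items)
}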

Get the total played duration of a video in MPMoviePlayer

I need to get the total time a video has been played using MPMoviePlayer. How do I handle the case where the user watches a 3-minute video up to 2:00, seeks back to 1:30, and then closes the video? The requirement is to know accurately what fraction of the video the user has viewed.
From the Apple docs on MPMoviePlayerController:
Movie Player Notifications
A movie player generates notifications to keep your app informed about the state of movie playback. In addition to being notified when playback finishes, your app can be notified in the following situations:
When the movie player begins playing, is paused, or begins seeking forward or backward
Using these notifications, you could set your own timers to know the total amount of time that a video has been playing. Specifically, you probably want the MPMoviePlayerPlaybackStateDidChangeNotification.
Knowing the total percentage of the video watched could be a little trickier, but still possible, I think. You would need to use the MPMediaPlayback protocol (in particular currentPlaybackTime) in conjunction with the PlaybackStateDidChangeNotification mentioned above.
One idea I had (though probably not the best or most efficient approach) would be to create an array of BOOL values, 1 for each second of the video. When a video plays, grab the currentPlaybackTime on the player and mark off each second as it is played. If the video state changes (pause, skip forward, etc), stop marking them off until it is resumed, then start at that new index based on the new currentPlaybackTime and continue marking. When they're finished, calculate the % of indexes that have been marked.
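A rough sketch of that bookkeeping (class and property names are mine; the marking call would be driven by a timer or by the playback notifications while the state is 'playing'):

import Foundation

// One flag per second of the video; distinct seconds watched → fraction viewed.
final class WatchTracker {
    private var secondsWatched: [Bool]

    init(videoDuration: TimeInterval) {
        secondsWatched = Array(repeating: false, count: Int(videoDuration.rounded(.up)))
    }

    // Call periodically with the player's currentPlaybackTime while it is playing.
    func mark(currentTime: TimeInterval) {
        let index = Int(currentTime)
        guard secondsWatched.indices.contains(index) else { return }
        secondsWatched[index] = true
    }

    var fractionWatched: Double {
        guard !secondsWatched.isEmpty else { return 0 }
        return Double(secondsWatched.filter { $0 }.count) / Double(secondsWatched.count)
    }
}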
MPMoviePlayerController
MPMediaPlayback Protocol
Let me know if this works for you!!

Playing an AVMutableComposition with AVPlayer audio gets out of sync

I have an AVMutableComposition with 2 audio tracks and one video track. I'm using the composition to string about 40 different video clips from .mov files, putting the video content of each clip in the video track of my composition and the audio in the audio track. The second audio track I use for music.
I also have a synchronized layer for titles graphics.
When I play this composition using an AVPlayer, the audio slowly gets out of sync. It takes about 4 minutes to become noticeable. If I only string together a handful of longer clips, the problem is not as apparent; it is when there are many shorter clips (~40 in my test) that it gets really bad.
Pausing and Playing doesn't re-sync the audio, however seeking does. In other words, if I let the video play to the end, towards the end the lip sync gets noticeably off even if I pause and play throughout, however, if I seek to a time towards the end the audio gets back in sync.
My hacky solution for now is to seek to the currentTime + 1 frame every minute or so. This creates an unpleasant jump in the video caused by a lag in the seek operation, so not a good solution.
Exporting with an ExportSession doesn't present this problem, audio remains in sync in the output movie.
I'm wondering if the new masterClock property in the AVPlayer is the answer to this, and if it is, how is it used?
I had the same issue and fixed it, among many other audio and video things, by specifying times with explicit timescales in the following manner:
CMTime(seconds: my_seconds, preferredTimescale: CMTimeScale(600))
Before, my timescale was CMTimeScale(NSEC_PER_SEC). That caused jitter when composing clips at different frame rates, plus the audio going out of sync that Eddy mentions here.
In spite of looking like a magic number, 600 is a common multiple of 24, 30, 60 and 120, which are the usual frame rates for different purposes. Using a common multiple avoids dragging rounding problems around when composing multiple clips.
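For illustration, a sketch of appending clips with 600-timescale cursor times (track setup is simplified, error handling is omitted, and the names are placeholders):

import AVFoundation

// Append the video track of each asset end to end, keeping the insertion
// cursor in the 600 timescale so rounding error does not accumulate
// across many short clips.
func appendVideo(from assets: [AVAsset], to composition: AVMutableComposition) throws {
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return }

    var cursor = CMTime(seconds: 0, preferredTimescale: 600)
    for asset in assets {
        guard let source = asset.tracks(withMediaType: .video).first else { continue }
        try videoTrack.insertTimeRange(
            CMTimeRange(start: .zero, duration: asset.duration),
            of: source,
            at: cursor)
        cursor = CMTimeAdd(cursor, CMTimeConvertScale(asset.duration, timescale: 600, method: .default))
    }
}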
