Playing interactive videos with finger gestures in iOS

I am working on a kids' ABC learning app that will be somewhat like this app:
Petting Zoo
The user can perform these gestures: Swipe UP, DOWN, LEFT, RIGHT, and TOUCH. Each gesture has a short animation clip (approximately 1-3 seconds each) linked to it, such as the character jumping on Swipe UP. There will also be a loopable IDLE movie that plays continuously when there is no input from the user.
I am trying to use videos in MP4 and M4V format for these gestures, but the problem is that the videos lag just before playing. They don't start instantly on a gesture; they take a brief moment to load and play.
I am looking for output like the video above, where the animations are responsive and never hang, even briefly.
My developer once achieved such smooth output with MP4 clips, but those clips had no audio embedded in them; when he used videos with embedded audio, they lagged again.
Could the audio be causing the lag here? Or is there anything else you experts would suggest?
Please help, guys. Your input will be very valuable to me.

You can use - (void)replaceCurrentItemWithPlayerItem:(AVPlayerItem *)item on AVPlayer.
This method lets you load a new item into an existing AVPlayer.
Here is an important reference for playing multiple videos using a single AVPlayer.
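A minimal sketch of that idea in Swift (where the method is spelled replaceCurrentItem(with:)): create one AVPlayerItem per gesture clip up front so the items are already buffered, then swap them into a single AVPlayer when a gesture fires. The clip names below are placeholders for your own assets.

import AVFoundation

// Preload one AVPlayerItem per gesture clip so swaps are near-instant.
// Clip names here are placeholders.
let player = AVPlayer()
let clipNames = ["idle", "jump_up", "slide_down", "turn_left", "turn_right", "touch"]
var preloadedItems: [String: AVPlayerItem] = [:]

for name in clipNames {
    if let url = Bundle.main.url(forResource: name, withExtension: "mp4") {
        preloadedItems[name] = AVPlayerItem(url: url)
    }
}

// Call this from your gesture recognizers, e.g. playClip(named: "jump_up") on swipe up.
func playClip(named name: String) {
    guard let item = preloadedItems[name] else { return }
    item.seek(to: .zero, completionHandler: nil)   // rewind in case the clip played before
    player.replaceCurrentItem(with: item)          // Swift spelling of replaceCurrentItemWithPlayerItem:
    player.play()
}

Whether the remaining lag comes from the embedded audio track is hard to say without profiling, but preloading the items (and possibly pre-rolling with preroll(atRate:completionHandler:)) is usually the first thing to try.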

Related

Best way to play silence using AVAudioPlayer on iOS

I found myself in a situation where I need to simulate audio playback to trick the OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks, with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls are working correctly. The only problem I am facing is that while the actual audio is stopped and a pause is being "played", the lock screen info center stops the timer, and it only resumes showing the correct time and overall state once another audio track starts playing.
Here is an example of my audio track, built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type: TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    ... // the list goes on
]
Then, in my custom player, once AVAudioPlayer finishes with the current item, I get the next one from the array and play either a .pause with a scheduled Timer or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then, when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question:
how do I trick the MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise that what I am trying to achieve may still not be clear, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a 1-second-long empty audio track that would play on loop for as long as the pause needs to play.
B. Programmatically creating an empty audio track of the appropriate length and playing it instead of using a Timer to keep track of the pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items. Not sure this is possible though.
C. Maybe there is a way to tell the MPNowPlayingInfoCenter that the audio keeps playing without using AVAudioPlayer, via some API I am not familiar with?
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
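To make that concrete, here is a minimal sketch of my own (not from the answer itself) of an AVAudioEngine with an attached AVAudioPlayerNode that schedules a file and then a silent PCM buffer standing in for a pause; the file name and the 5-second pause are placeholders.

import AVFoundation

// Engine + player node setup.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

do {
    let url = Bundle.main.url(forResource: "part-1", withExtension: "mp3")!
    let file = try AVAudioFile(forReading: url)
    try engine.start()

    // Schedule the audio file; schedule the next item from its completion handler.
    playerNode.scheduleFile(file, at: nil) {
        // e.g. schedule the next .audio file or the next silent buffer here
    }

    // A zero-filled PCM buffer is silence, so a "pause" is just another scheduled buffer.
    let format = playerNode.outputFormat(forBus: 0)
    let pauseFrames = AVAudioFrameCount(format.sampleRate * 5)   // 5-second pause
    if let silence = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: pauseFrames) {
        silence.frameLength = pauseFrames
        playerNode.scheduleBuffer(silence, completionHandler: nil)
    }

    playerNode.play()
} catch {
    print("Audio engine setup failed: \(error)")
}

Because the node keeps rendering (silent) audio during the pause, the session stays active and the Now Playing timer should keep advancing.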
Much of the Apple documentation on AVAudioEngine appears broken right this moment, but hopefully the links under Audio Engine Building Blocks will be available again shortly. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
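For completeness, a hedged sketch of the AVMutableComposition route as well (again my own illustration, with placeholder file names and pause lengths): append each audio file to a composition track and insert an empty time range wherever a pause belongs, then play the composition as a single item.

import AVFoundation

let composition = AVMutableComposition()
guard let track = composition.addMutableTrack(
    withMediaType: .audio,
    preferredTrackID: kCMPersistentTrackID_Invalid
) else { fatalError("Could not create a composition track") }

var cursor = CMTime.zero
let parts = ["part-1", "part-2"]          // placeholder file names
let pauses: [TimeInterval] = [5, 3]       // placeholder pause durations

for (index, name) in parts.enumerated() {
    let url = Bundle.main.url(forResource: name, withExtension: "mp3")!
    let asset = AVURLAsset(url: url)
    if let sourceTrack = asset.tracks(withMediaType: .audio).first {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try? track.insertTimeRange(range, of: sourceTrack, at: cursor)
        cursor = cursor + asset.duration
    }
    if index < pauses.count {
        // An empty time range plays back as silence of the given duration.
        let pause = CMTime(seconds: pauses[index], preferredTimescale: 600)
        track.insertEmptyTimeRange(CMTimeRange(start: cursor, duration: pause))
        cursor = cursor + pause
    }
}

// The composition behaves like one continuous asset for AVPlayer and Now Playing.
let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))
player.play()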

Swift 5 - How to use rate to show video in reverse mode

I have a UIView that contains an AVPlayer asset. I want to control the way the video is played.
Moving forward is great and runs smoothly, but when I try to play it backwards with rate = -1 the video is jumpy and choppy.
Is there a better practice for running video backwards inside a UIView?
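A negative rate only plays smoothly when the current item reports that it supports reverse playback, so it is worth checking that first. A hedged sketch of the check (the fallback branch is only one idea, not a guaranteed fix):

import AVFoundation

func playBackwards(_ player: AVPlayer) {
    guard let item = player.currentItem else { return }
    if item.canPlayReverse {
        player.rate = -1.0          // reverse at normal speed
    } else {
        // Fallback ideas: step frame-by-frame, or pre-encode a reversed copy of the asset.
        item.step(byCount: -1)
    }
}

Many delivery formats keyframe sparsely, which is typically why reverse playback looks choppy even when the API accepts rate = -1.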

Transitioning to background, AVPlayer Video has small audio gap

I have followed the many helpful previous questions to get my AVPlayer successfully streaming video when my app goes to the background. There are two methods described in Apple's QA1668, and they both work for my stream URLs.
The problem is that there is a noticeable audio gap during the transition that is identical for both methods. On my iPhone 6 in release mode I would say the gap is less than 0.5 seconds, which may not seem terrible, but if I'm playing something like a music video it is very distracting.
After more testing, it looks like this gap actually occurs when I remove the AVPlayerLayer (or, with the other method, when I disable the AVMediaCharacteristicVisual tracks), as I have determined it still happens if I hook those actions up to a button rather than the backgrounding state.
My guess is that it has something to do with the audio re-syncing to the new video state of the AVPlayer, but really I have no clue. Any help would be greatly appreciated!
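For context, a minimal sketch of the detach-the-layer technique from QA1668 that the question describes (class and property names here are illustrative): the AVPlayerLayer is disconnected from the player on backgrounding so audio continues, and reconnected on return to the foreground.

import AVFoundation
import UIKit

final class BackgroundablePlayerView: UIView {
    let player = AVPlayer()

    override class var layerClass: AnyClass { AVPlayerLayer.self }
    private var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    override init(frame: CGRect) {
        super.init(frame: frame)
        playerLayer.player = player
        let nc = NotificationCenter.default
        nc.addObserver(self, selector: #selector(didEnterBackground),
                       name: UIApplication.didEnterBackgroundNotification, object: nil)
        nc.addObserver(self, selector: #selector(willEnterForeground),
                       name: UIApplication.willEnterForegroundNotification, object: nil)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func didEnterBackground() { playerLayer.player = nil }     // audio keeps playing
    @objc private func willEnterForeground() { playerLayer.player = player } // video comes back
}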

Smooth crossfade/transition between two videos?

I have been successfully running the example Creating a Video Application: http://www.samsungdforum.com/Guide/tut00055/index.html
What I am trying to figure out is how I can smoothly transition from one video to another using a crossfade or any other transition. In this example you are forced to call stop on the player before loading another video, which causes a black flash in between.
I've tried going down the path of creating two Video instances, one behind the other, and using jQuery to fade one out and the other in, but I am having a lot of trouble; it appears jQuery fades don't even apply to the video element. Also, is there some kind of limitation on playing two videos at the same time? Is there a better way to go about this? Would it be better to look into doing this with WebGL instead?
Thanks!
If you use the Samsung API you can play only one video at a time. I can't find the documentation page at the moment.
You can try different approaches:
1. Use the HTML5 <video> tag; maybe the browser will be capable of playing two of them.
2. Use a screenshot of the second video and fade it in over the first video before stopping it and starting the next one.
3. Re-encode your videos into one continuous video in any video editing software. If you need to switch between videos, you can jump to the position where the next video starts.

How does Vine looping video playback work on iOS?

Vine loops videos without any pauses between loops. I've tried creating an AVPlayer that plays the video from the beginning whenever it ends, but this introduces a slight lag between every loop. I'm looking for suggestions on ways to avoid this lag.
I have considered creating much longer videos out of repeating short clips. Am I missing some obvious solution?
Thanks.
You can use AVQueuePlayer to queue up the videos (in this case the same one multiple times), and it will handle playing them in sequence.
For example: iPhone Smooth Transition from One Video To Another
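A minimal sketch of that idea, assuming a placeholder clip named "clip.mp4": enqueue several items built from the same asset and top the queue up each time an item finishes, so playback never has to reload.

import AVFoundation

let url = Bundle.main.url(forResource: "clip", withExtension: "mp4")!   // placeholder clip
let asset = AVURLAsset(url: url)

// Start with a few copies already queued so there is always a next item ready.
let queuePlayer = AVQueuePlayer(items: (0..<3).map { _ in AVPlayerItem(asset: asset) })
queuePlayer.play()

// Whenever an item finishes, append a fresh copy so the queue never runs dry.
let loopObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: nil,
    queue: .main
) { _ in
    queuePlayer.insert(AVPlayerItem(asset: asset), after: nil)
}
_ = loopObserver   // keep the token and remove it with removeObserver(_:) when done

On iOS 10 and later, AVPlayerLooper wraps an AVQueuePlayer and handles this re-queueing for you.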
