How to create a time lapse video like the Photos app? - iOS

I'm looking for a way to create long time lapse videos on an iPhone running iOS 9, and hoping to get some pointers on how to start. Ideally I would compress 1 hour of footage into 1 minute, so the scaling factor is 60. I take one frame out of 60 and stitch them together, right?
I have a project which uses AVFoundation to capture images via captureOutput:didOutputSampleBuffer:fromConnection:.
However, I'm not sure if there are better approaches to creating a time lapse over several hours.
Would it make sense to take individual photos and stitch them together (activating the camera every few seconds)?
Or should I just take frames out of the CMSampleBufferRef?
Are there other APIs I can use for capturing camera images?
I'm hoping to understand which approach would result in the highest quality and battery life.
I'm looking at this question which appears to have code for stitching images, but I'm not sure if I need anything else for my project.

One way to accomplish a timelapse is, instead of using AVCaptureVideoDataOutput to process every video frame, to use AVCapturePhotoOutput to grab still images.
A timer is then set to capture an image every second or so, and finally you stitch the frames together with AVAssetWriter to produce the video; a rough sketch of the writing side follows.
Check out Apple's StopNGo sample app.
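For illustration, here is a minimal Swift sketch of the stitching side, assuming frames arrive as CVPixelBuffers from a timer-gated capture callback; the TimeLapseWriter name and the 30 fps output rate are my own choices, not something from the sample app:

import AVFoundation

final class TimeLapseWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameCount: Int64 = 0
    private let fps: Int32 = 30

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        // On iOS 9 use the older AVVideoCodecH264 constant instead.
        let settings: [String: Any] = [AVVideoCodecKey: AVVideoCodecType.h264,
                                       AVVideoWidthKey: width,
                                       AVVideoHeightKey: height]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    // Call once per captured frame; each capture becomes one frame of 30 fps output,
    // which is what compresses an hour of wall-clock time into seconds of footage.
    func append(_ pixelBuffer: CVPixelBuffer) {
        guard input.isReadyForMoreMediaData else { return }
        let time = CMTime(value: frameCount, timescale: fps)
        if adaptor.append(pixelBuffer, withPresentationTime: time) {
            frameCount += 1
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}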

If you consider how a DSLR shoots a timelapse, you'll get the idea.
The camera basically takes one picture every n seconds.
Let's say you set the interval to 60 seconds. That results in one shot per minute. Leave the camera for 8 hours -> 480 minutes -> 480 pictures.
Now it's time to stitch these frames together. Let's say you assemble them at 10 fps, meaning 10 pictures per second. That results in 48 seconds of total footage. I wrote a short piece on this; if needed, I can provide the link.
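The arithmetic is simple enough to put in a tiny helper; the function name here is hypothetical, just to make the numbers concrete:

// Footage length in seconds for a given shooting session.
func timelapseDuration(shootingHours: Double, secondsPerShot: Double, playbackFPS: Double) -> Double {
    let frames = shootingHours * 3600 / secondsPerShot   // 8 h, 1 shot/min -> 480 frames
    return frames / playbackFPS                          // 480 frames at 10 fps -> 48 s
}

print(timelapseDuration(shootingHours: 8, secondsPerShot: 60, playbackFPS: 10)) // 48.0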

Related

iOS: Apply a slow-motion effect to a video while shooting

Is it possible to apply slow motion effect while recording video?
This means that the recording has not finished yet, the file has not been saved, but the user sees the recording process in slow motion.
I think it is important to understand what slow motion actually means. To "slow down motion" in a movie, you need to film more images per second than usual and then play the movie back at normal speed; that is what creates the slow-motion effect.
Example: videos are often shot at 30 frames per second (fps), so for one second of movie you're creating 30 single images. If you want a motion to appear half as fast, you need to shoot at 60 fps (60 images per second). If you play those 60 images back at the normal 30 fps, they fill 2 seconds of movie, producing the slow-motion effect.
As you can see, you cannot record and show a slow-motion effect at the same time. You'll need to save the recording first and then play it back slower than it was recorded.
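To make that concrete, here is a small sketch (assumptions: Swift, with the session setup and error handling elided) of locking the camera to 60 fps so the recording can later be played at 30 fps for a half-speed effect:

import AVFoundation

func configureForSlowMotion(device: AVCaptureDevice) throws {
    // Find the first format that supports at least 60 fps.
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges where range.maxFrameRate >= 60 {
            try device.lockForConfiguration()
            device.activeFormat = format
            // Pin both min and max frame duration to 1/60 s.
            device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 60)
            device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 60)
            device.unlockForConfiguration()
            return
        }
    }
}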

How to set delay in seconds to the camera capture on iOS?

I want to know if setting a delay in seconds to the camera feed is possible on iOS with Swift.
Let's say I add a 30 second delay to my camera. If I am pointing the camera at a person who is crouching down, and then that person stands up, my camera is not going to show that until those 30 seconds have passed. In other words, it will continue to show the person crouched until that time has passed.
How can I achieve this?
This sounds extremely difficult. The problem is that the camera delivers (most likely) 30 frames a second, so the buffers from 30 seconds ago need to be stored somewhere, and in raw form that is a TON of data, so you can't just keep them in memory.
You don't want to write them out as images either, because the compression is poor compared to video.
Perhaps the best way would be to set up your capture, then record the video in, say, 10-second segments, save each to a file, and display those video file segments on an AVPlayer or something. That way it's 'delayed' for the user; a rough sketch follows this answer.
Either way, from my understanding of how things work, the main challenge is what to do with those buffers while waiting to display them. You could potentially stream them somewhere, but then you need a whole backend to support that and to stream them back, which seems silly.
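A rough sketch of the segment idea, with hypothetical names and the session/preview plumbing left out; note that AVCaptureMovieFileOutput cannot overlap recordings, so there will be small gaps between segments:

import AVFoundation

final class DelayedFeed: NSObject, AVCaptureFileOutputRecordingDelegate {
    let player = AVQueuePlayer()            // backs the "delayed" preview layer
    private var playbackScheduled = false
    private let delay: TimeInterval = 30
    private let segmentLength: TimeInterval = 10

    func startNextSegment(with output: AVCaptureMovieFileOutput) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString + ".mov")
        output.startRecording(to: url, recordingDelegate: self)
        // Stop after 10 s; the delegate callback re-arms the next segment.
        DispatchQueue.main.asyncAfter(deadline: .now() + segmentLength) {
            output.stopRecording()
        }
    }

    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection], error: Error?) {
        player.insert(AVPlayerItem(url: outputFileURL), after: nil)
        if !playbackScheduled {
            playbackScheduled = true
            // The first segment finishes at t = 10 s; start playback at t = 30 s.
            DispatchQueue.main.asyncAfter(deadline: .now() + delay - segmentLength) {
                self.player.play()
            }
        }
        if let movieOutput = output as? AVCaptureMovieFileOutput {
            startNextSegment(with: movieOutput)
        }
    }
}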

Create a timelapse from a normal video in iOS

I have two solutions to this problem:
SOLUTION A
Convert the asset into an AVMutableComposition.
For every second, keep only one frame by removing the timing for all the other frames using the removeTimeRange(...) method (a sketch follows Solution B below).
SOLUTION B
Use AVAssetReader to extract all the individual frames as an array of CMSampleBuffers.
Write that [CMSampleBuffer] array back into a movie, skipping every 20 frames or so as per the requirement.
Convert the resulting video file into an AVMutableComposition and use scaleTimeRange(..) to reduce the overall timeRange of the video for the timelapse effect.
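For what it's worth, a minimal sketch of Solution A, assuming a 30 fps source; the function name is mine, and this reproduces the asker's approach rather than fixing its timing problems:

import AVFoundation

func timelapse(from asset: AVAsset) -> AVMutableComposition {
    let composition = AVMutableComposition()
    try? composition.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                     of: asset, at: .zero)
    let frameDuration = CMTime(value: 1, timescale: 30)  // one source frame
    var second = Int64(asset.duration.seconds)
    // Walk backwards so earlier time ranges are unaffected by each removal.
    while second > 0 {
        second -= 1
        // Keep the first frame of this second and drop the rest of it.
        let start = CMTimeAdd(CMTime(value: second, timescale: 1), frameDuration)
        let end = CMTime(value: second + 1, timescale: 1)
        composition.removeTimeRange(CMTimeRange(start: start, end: end))
    }
    return composition
}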
PROBLEMS
The first solution is not suitable for full-HD videos; the video freezes in multiple places and the seek bar shows inaccurate timing.
e.g. a 12-second timelapse might be reported as having a duration of only 5 seconds, so it keeps playing even after the seek bar has finished.
I mean, the timing of the video gets all messed up for some reason.
The second solution is incredibly slow. For a 10-minute HD video, memory usage grows without bound, since all the processing is done in memory.
I am searching for a technique that can produce a timelapse for a video right away, without a waiting period. Solution A kind of does that, but is unsuitable because of the timing problems and stuttering.
Any suggestion would be great. Thanks!
You might want to experiment with the built-in thumbnail generation functions to see if they are fast/efficient enough for your needs.
They have the benefit of being optimised to generate images efficiently from a video stream.
Simply displaying a 'slide show' like view of the thumbnails one after another may give you the effect you are looking for.
There is information on the key class, AVAssetImageGenerator, here, including how to use it to generate multiple images:
https://developer.apple.com/reference/avfoundation/avassetimagegenerator#//apple_ref/occ/instm/AVAssetImageGenerator/generateCGImagesAsynchronouslyForTimes%3acompletionHandler%3a
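A short sketch of that approach, with hypothetical names, requesting one thumbnail per second and receiving them asynchronously:

import AVFoundation
import CoreGraphics

func generateThumbnails(for asset: AVAsset, handler: @escaping (CGImage) -> Void) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Request one image at each whole second of the asset.
    let seconds = Int(asset.duration.seconds)
    let times = (0..<seconds).map { NSValue(time: CMTime(value: Int64($0), timescale: 1)) }
    generator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, result, _ in
        if result == .succeeded, let image = image {
            handler(image)  // display these in sequence for a slide-show style timelapse
        }
    }
}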

How to dynamically change the playback rate of video in iOS?

The perfect example of what I am trying to achieve can be seen in the Flow ● Slow and Fast Motion app.
One can change the playback rate of the video by dragging points on the curve up or down. The video can also be saved in this state.
I am looking for a way to dynamically speed up or slow down a video, so that the playback rate can be changed while the video is playing.
Video explanation
WHAT I'VE TRIED
The playback rate property of AVPlayer. But it only works with a few playback rate values (0.50, 0.67, 0.80, 1.0, 1.25, 1.50, and 2.0), and one cannot save the video.
The scaleTimeRange(..) method of AVMutableComposition. But it doesn't work when you want to ramp the video for gradually increasing/decreasing slow or fast motion.
Displaying video frames on screen using CAEAGLLayer and CADisplayLink. But my many attempts at achieving slow/fast motion this way have been unsuccessful.
All this has taken me months and I'm starting to doubt if I'll be able to accomplish this at all.
Thus any suggestion would be immensely valuable.
In iOS, the MPNowPlayingInfoCenter object contains a 'nowPlayingInfo' dictionary whose contents describe the item being played. It is advised that you start playback at the 'currentPlaybackRate' and then set the speed. See this thread on the developer forums.
You might possibly end up with something like this (but this is JavaScript), where the playback rate of the video has been sped up by a factor of 4:
document.querySelector('video').playbackRate = 4.0;
document.querySelector('video').play();

video {
  width: 400px;
  height: auto;
}

<video controls preload="true" autoplay>
  <source src="http://www.rachelgallen.com/nature.mp4" type="video/mp4">
</video>
So I'm not sure I fully understand the use case you're going for, but I think

func setRate(_ rate: Float, time itemTime: CMTime, atHostTime hostClockTime: CMTime)

[Apple Documentation Source]

is something that you're looking for. While this may not be exactly what you need, and I'm not sure where in the docs exactly what you're looking for lives, the above method alone lets you do the following to save videos at a variable rate:
Use the above method to play the video through (assuming it's not too long, otherwise this will be computationally impossible or timeout-worthy on some devices) at the desired rate for each second. Design the UI to adjust this per-second rate.
Under the hood you can actually play the video at that speed "frame by frame" and capture the frames you want (in the right quantity, which will give you the rate you desire), and voila: saving the right number of frames together (skipping/duplicating as needed to raise/lower the desired rate based on the "picker" UI), you've now accomplished what you desire.
To be clear, what I'm talking about here is that a video output at 60 FPS has 60 frames per second. You would literally "cut and paste" frames from the source video into the destination video based on whatever stepper values you receive from your user (using the screenshotted example in the question as my basis), and pick up that many frames. That is, if the user says seconds 2-10 of their 20-second video should be at 2x, keep only every other frame, i.e. 30 frames for each of those seconds (if filmed at 60 FPS). The output will, at 60 FPS, appear to run at 2x speed, since each second of original video now contributes only 30 frames, which is 0.5 seconds at 60 FPS. Similarly, any value can be factored in:
(desired consistent FPS) = (source video FPS) = (destination video FPS) (e.g. 60 or 90)
(rate) = (rate picked per time interval from the stepper/graph UI) (e.g. 1x / 2x / 0.25x)
(desired consistent FPS) / (rate) = (# of source frames kept per second of source video), which matches the 2x example above: 60 / 2 = 30
(destination video frames) = (source video duration) * (desired consistent FPS), modulated per custom time interval by (rate)
The exact mechanics of this might actually be built into AVPlayer, and I didn't find the details, but this alone should be a good start to get you going in that direction.
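To illustrate just the playback half of this, here is a small sketch of the setRate(_:time:atHostTime:) call; the ramp(player:to:) wrapper is my own name, not an AVFoundation API:

import AVFoundation

func ramp(player: AVPlayer, to rate: Float) {
    // Required for host-time synchronised rate changes to take effect immediately.
    player.automaticallyWaitsToMinimizeStalling = false
    let now = CMClockGetTime(CMClockGetHostTimeClock())
    // Passing .invalid for `time` keeps the current position; only the rate changes.
    player.setRate(rate, time: .invalid, atHostTime: now)
}

// e.g. ramp(player: player, to: 0.5) for slow motion, or 2.0 for fast motion.

Note that this changes live playback only; saving the re-timed video still needs the frame-by-frame (or composition-based) route described above.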

How to create a video from an animating UIView in iOS (Swift / ObjC)?

I've looked around and couldn't find any solution for this...
My goal is to create a video on iOS from a UIViewController that has some animating subviews. Some people create images frame by frame from this view controller for, e.g., 20 seconds and compose them into a video with Apple's AVFoundation.
I don't think this is an optimal solution, not only because of the performance but also because the end user who wants to render a video has to wait at least 20 seconds until every frame is captured (assuming the video only contains 1 frame per second).
Is there any other possible solution to achieve that – maybe in the background and much faster?
The short answer is no: you are not going to be able to create something that works faster than the lossy hardware H.264 video encoder available on iOS. There are other ways to capture and compress whole frames as lossless video frames, but it is not going to be faster, since you would still need to wait for all the I/O to finish.
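For reference, here is a condensed sketch of the frame-to-frame approach the question describes: render the view's layer into a CVPixelBuffer and hand it to an AVAssetWriter pixel-buffer adaptor (writer setup omitted, and no claim that this is faster, per the answer above):

import AVFoundation
import UIKit

func pixelBuffer(from view: UIView) -> CVPixelBuffer? {
    let size = view.bounds.size
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &buffer)
    guard let pixelBuffer = buffer else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: Int(size.width), height: Int(size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }
    // Flip the context so UIKit's top-left origin renders the right way up.
    context.translateBy(x: 0, y: size.height)
    context.scaleBy(x: 1, y: -1)
    view.layer.render(in: context)
    return pixelBuffer   // append via AVAssetWriterInputPixelBufferAdaptor as usual
}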
