How to implement a camera that takes GIFs in iOS?

I want to implement a feature that enables users to make GIFs directly from their camera.
In detail, I want to show users a camera view and a record button. When the button is tapped, the camera starts to record video. Behind the scenes, however, the camera is actually taking photos at a constant rate, say one shot per 0.5 seconds. When recording ends, we get an array of images and combine them into a GIF.
I think there might be two approaches:
1. Directly take images: use AVCaptureStillImageOutput's captureStillImageAsynchronouslyFromConnection:completionHandler: method. But it blocks the UI every time it is called.
2. Take a video and extract several images from it. I have checked video-recording libraries such as PBJVision and SCRecorder, and noticed that recording a video typically means writing data to an MP4 file locally. I cannot figure out how to extract images at specific time intervals from a video file. Also, is there a way to keep the video in memory?
Could anyone help?

Creating GIFs
Create and export an animated gif via iOS?
Convert Images to gif using ios
Extract Images from Video
Get a particular frame by time value using AVAssetReader
Similar here: Creating a Movie from Images
How do I export UIImage array as a movie?
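Addressing the frame-extraction part of the question directly, here is a minimal sketch using AVAssetImageGenerator, assuming the recorder has already written a local file (the function name and videoURL parameter are placeholders):

```swift
import AVFoundation
import UIKit

/// Extracts one frame every `interval` seconds from a local video file.
func extractFrames(from videoURL: URL, interval: Double = 0.5) -> [UIImage] {
    let asset = AVURLAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Request exact frames; without these the generator may snap to keyframes.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    var images: [UIImage] = []
    let duration = CMTimeGetSeconds(asset.duration)
    var current: Double = 0
    while current < duration {
        let time = CMTime(seconds: current, preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            images.append(UIImage(cgImage: cgImage))
        }
        current += interval
    }
    return images
}
```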

You can use a library called Regift by Matthew Palmer, which converts video to GIF.
Here it is: https://github.com/matthewpalmer/Regift
You can also check out the following answer here on SO:
https://stackoverflow.com/a/28150109/3288936
Hope this helps! :)
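If you would rather not pull in a dependency, the underlying technique is plain ImageIO: create a CGImageDestination of type GIF and append each frame with a per-frame delay. A minimal sketch (the function name is mine; Regift's own API differs, so check its README):

```swift
import ImageIO
import MobileCoreServices
import UIKit

/// Writes an array of UIImages to an animated GIF at `url`.
func writeGIF(images: [UIImage], frameDelay: Double, to url: URL) -> Bool {
    // File-level property: loop count 0 means loop forever.
    let fileProperties = [kCGImagePropertyGIFDictionary as String:
                              [kCGImagePropertyGIFLoopCount as String: 0]]
    // Frame-level property: seconds to display each frame.
    let frameProperties = [kCGImagePropertyGIFDictionary as String:
                               [kCGImagePropertyGIFDelayTime as String: frameDelay]]

    guard let destination = CGImageDestinationCreateWithURL(
        url as CFURL, kUTTypeGIF, images.count, nil) else { return false }
    CGImageDestinationSetProperties(destination, fileProperties as CFDictionary)

    for image in images {
        if let cgImage = image.cgImage {
            CGImageDestinationAddImage(destination, cgImage, frameProperties as CFDictionary)
        }
    }
    return CGImageDestinationFinalize(destination)
}
```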

Related

Detect if video is boomerang video

I'm working on a project and I'd like to know if it's possible to determine whether a video is a boomerang video or not. Boomerang videos are generally about 4 seconds long or slightly shorter.
What I've thought about doing so far is filtering the array I receive from the user's camera roll to only display videos that are about 4 seconds long, but is there a better way?
Any pointers or advice will be greatly appreciated.
This is not an exact answer, but rather one perspective on how to approach this.
From my understanding, Boomerang works by taking a super short, super fast burst of photos and stitching them together into a mini video that plays forward and backward, over and over. That means there is a chance for the first frame of the video to appear again later. So what I suggest is converting the frames of the video into an array of UIImages, then taking the first image of that array and finding out whether it is present in the rest of the array.
To turn the video into an array of images, you can refer to the "Update for Swift 4.2" part of this answer: https://stackoverflow.com/a/45153948/4637057
From that you will get frames, an array of UIImages. First save frames[0] as image1, then remove it from the array with frames.remove(at: 0). Then loop through the remaining array, treat each image as image2, and apply the logic from this answer to determine whether the first frame repeats: https://stackoverflow.com/a/6488838/4637057. A rough sketch follows below.
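A crude version of that check (the function name is mine; the byte-for-byte PNG comparison is only illustrative, and a real implementation should use the pixel-difference threshold from the linked answer):

```swift
import UIKit

/// Rough sketch: decide whether a clip "loops" by checking if the
/// first frame reappears later. `frames` is the array produced by
/// the frame-extraction answer linked above.
func looksLikeBoomerang(frames: [UIImage]) -> Bool {
    guard frames.count > 4, let firstData = frames.first?.pngData() else { return false }
    // Skip the frames right after the start, which are trivially similar.
    for image in frames.dropFirst(3) {
        // Exact data equality rarely survives video compression;
        // substitute a perceptual/pixel-difference comparison in practice.
        if image.pngData() == firstData {
            return true
        }
    }
    return false
}
```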
The only help I can give you here is to refer to the playbackStyle property of the PHAsset object, if you're using Photos.framework. More information can be found in the PhotoKit documentation.
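If you can require iOS 11 or later, that check is a one-liner; looping clips such as Boomerangs report the .videoLooping style:

```swift
import Photos

// iOS 11+: Photos reports looping clips (e.g. Boomerangs) directly.
func isLoopingVideo(_ asset: PHAsset) -> Bool {
    return asset.playbackStyle == .videoLooping
}
```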

How to add real time stamp on the streaming and recording video for every frame in Swift?

My app uses the VideoCore project for live streaming to a Wowza server and storing the video. It also uses AVCaptureMovieFileOutput to record the offline video.
I want to embed the capture time stamp at the top-left of the video, and it is not static: not just a fixed watermark, but a live display of the actual capture time for every frame.
For the streaming case, I have no idea for now. For the offline case, I tried using AVCaptureVideoDataOutput to get every frame and add a time-text overlay, but this causes the preview screen to freeze.
Any tips are helpful.
Thank you.
My platform is Xcode 7.3 + Swift 2.
I did something similar using transcoding on Wowza. The transcoder menu enables an image overlay, and this image can be refreshed every second (or less), so if you create an image with a timestamp every second, Wowza picks it up and puts it on the stream every second. You can define where to place the image, its size, and its transparency.
To create the image I use PHP, but you could use any other tool that can generate images.
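For illustration, generating such a timestamp image on-device looks roughly like this (written in current Swift rather than the asker's Swift 2; the function name and sizes are mine):

```swift
import UIKit

/// Renders a small timestamp image, analogous to the PHP-generated
/// overlay the answer feeds to Wowza. Regenerate it once a second
/// and hand it to whatever overlay mechanism you use.
func makeTimestampImage(size: CGSize = CGSize(width: 220, height: 40)) -> UIImage {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy-MM-dd HH:mm:ss"
    let text = formatter.string(from: Date())

    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        let attributes: [NSAttributedString.Key: Any] = [
            // Monospaced digits keep the text from jittering as it updates.
            .font: UIFont.monospacedDigitSystemFont(ofSize: 18, weight: .medium),
            .foregroundColor: UIColor.white
        ]
        text.draw(at: CGPoint(x: 8, y: 8), withAttributes: attributes)
    }
}
```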

Play Video in Background of UIViewcontroller-iOS

I want to play a video (.mp4 format) in the background continuously (on a loop). I tried using MediaPlayer and AVFoundation, but I didn't get the desired result.
I tried another method after converting the MP4 to a GIF; the GIF worked for me, but the quality was low. So please help me find the best way to solve my problem.
For what purpose is this? If you're trying to have a video background for a login screen or something of that sort, it's better to stick with lower-res GIFs. It will save you memory. In fact, you can get really high-quality GIFs from .mp4 files using Photoshop, but the resolution and frames will eat up the iPhone's memory.
You can create an AVPlayer object in the AppDelegate.
Then register an observer for its "movie finished" notification (AVPlayerItemDidPlayToEndTime). In that notification handler, you can restart the video.
Hope this helps.
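A minimal sketch of that setup, assuming a video.mp4 in the app bundle (the class name is mine):

```swift
import AVFoundation
import UIKit

/// View controller with a continuously looping video background.
final class VideoBackgroundController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4") else { return }
        let player = AVPlayer(url: url)
        player.isMuted = true
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        playerLayer.videoGravity = .resizeAspectFill
        // Insert below all other subviews so content sits on top.
        view.layer.insertSublayer(playerLayer, at: 0)

        // The "movie finished" notification the answer refers to:
        NotificationCenter.default.addObserver(
            forName: .AVPlayerItemDidPlayToEndTime,
            object: player.currentItem, queue: .main) { _ in
            player.seek(to: .zero)   // rewind and keep looping
            player.play()
        }
        player.play()
        self.player = player
    }
}
```

On iOS 10 and later you can also let AVPlayerLooper with AVQueuePlayer handle the rewinding for you.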

iOS: Draw on top of AV video, then save the drawing in the video file

I'm working on an iPad app that records and plays videos using AVFoundation classes. I have all of the code for basic record/playback in place, and now I would like to add a feature that allows the user to draw and make annotations on the video, something I believe will not be too difficult. The harder part, which I have not been able to find any examples of, will be combining the drawing and annotations into the video file itself. I suspect this is accomplished with AVComposition, but I have no idea exactly how. Your help would be greatly appreciated.
Mark
I do not think you can save a drawing directly into a video file in iOS out of the box. You could, however, save the drawing separately and synchronize it as an overlay in a transparent view. In other words, say the user circled something at 3 minutes 42 seconds into the video; when the video is played back, you overlay the saved drawing onto the video at the 3:42 mark. It's not exactly what you want, but I think it is as close as you can get right now.
EDIT: Actually, there might be a way after all. Take a look at this tutorial. I have not read the whole thing, but it seems to cover the overlay compositing you need (a sketch of the approach follows after the link).
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
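A rough sketch of the tutorial's technique: AVVideoCompositionCoreAnimationTool composites a CALayer (which would hold the annotations) over the video during export. Function and parameter names here are mine, and real code must also handle the track's preferredTransform:

```swift
import AVFoundation
import UIKit

/// Bakes `drawingLayer` (the user's annotations) into an exported copy
/// of `asset`, written to `outputURL`.
func export(asset: AVAsset, drawingLayer: CALayer, to outputURL: URL,
            completion: @escaping (Bool) -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else {
        completion(false); return
    }
    let size = track.naturalSize

    // Parent/video layer hierarchy required by the animation tool.
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    videoLayer.frame = parentLayer.frame
    drawingLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(drawingLayer)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else {
        completion(false); return
    }
    session.videoComposition = composition
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```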

Capture video from cam + custom view into single video file

I wonder if it's possible in iOS 4 or 5 to save into a single video file not just the stream from the camera, but the camera stream WITH custom view(s) overlaid. The custom view will contain a few labels with transparent backgrounds. Those labels will show additional info: the current time and GPS coordinates. And every video player should be able to play back that additional info.
I think you can use AVCaptureVideoDataOutput to process each frame and AVAssetWriter to record the processed frames. You can refer to this answer:
https://stackoverflow.com/a/4944594/379941
You can process each CVImageBufferRef and then use AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime: method to write it out; a skeleton follows below.
I also strongly suggest using OpenCV to process the frames. This is a nice tutorial: http://aptogo.co.uk/2011/09/opencv-framework-for-ios/. The OpenCV library is great.
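A skeleton of that capture → process → write pipeline, in modern Swift for illustration (the original question predates Swift); the class name is made up, and the overlay drawing itself is elided:

```swift
import AVFoundation

/// Receives frames from an AVCaptureVideoDataOutput, lets you draw the
/// overlay labels (time, GPS) into each pixel buffer, and appends the
/// result to an AVAssetWriter via a pixel buffer adaptor.
final class OverlayRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var sessionStarted = false

    init(outputURL: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        self.writer = writer
        self.input = input
        self.adaptor = adaptor
        super.init()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            writer.startSession(atSourceTime: time)
            sessionStarted = true
        }
        // ... draw the overlay labels into pixelBuffer here ...
        if input.isReadyForMoreMediaData {
            _ = adaptor.append(pixelBuffer, withPresentationTime: time)
        }
    }
}
```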
