It's my first time posting on this site, so I hope I didn't make a mistake...
I am currently developing a small app in Swift 2, and one of its features is to append a still-image video (a clip that shows a single image for a given duration) to another video. I already have the video and the image; the problem is the next step...
I have experience with Swift and UIKit, but none with the AVFoundation Kit.
Could someone point me to an approach, example, or guide for this task? Thanks in advance!
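For later readers, here is a minimal sketch of the appending step using AVMutableComposition, written in current Swift (the Swift 2 API names differ slightly). firstAsset and secondAsset are placeholder names for your loaded clips; the still-image clip itself would be produced beforehand, for example with AVAssetWriter fed the same pixel buffer for the desired duration.

```swift
import AVFoundation

// Sketch: append secondAsset after firstAsset on a single composition video track.
func appendClips(firstAsset: AVAsset, secondAsset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard
        let track = composition.addMutableTrack(withMediaType: .video,
                                                preferredTrackID: kCMPersistentTrackID_Invalid),
        let firstVideo = firstAsset.tracks(withMediaType: .video).first,
        let secondVideo = secondAsset.tracks(withMediaType: .video).first
    else { throw NSError(domain: "compose", code: -1) }

    // The first clip starts at time zero...
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: firstAsset.duration),
                              of: firstVideo, at: .zero)
    // ...and the second clip starts exactly where the first one ends.
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: secondAsset.duration),
                              of: secondVideo, at: firstAsset.duration)
    return composition
}
```

The resulting composition can then be written to disk with AVAssetExportSession.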
Hello, fellow developers!
I am working on a photo and video capture app that applies Core Image filters to frames live during shooting. Frames are rendered and displayed using MTKView/Metal.
Everything seems to work well, but I have run into an unpleasant problem: randomly, from time to time, a recording turns out to be "broken" and the native video player cannot open it. To avoid cluttering this post, I pulled the problematic pieces of code into a separate project and published it on GitHub: https://github.com/VKostin8311/LiveEffectCamera.git
I hope someone can help me. Thank you in advance!
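Not a diagnosis of that specific project, but one common cause of recordings the native player refuses to open is tearing the AVAssetWriter down before finishWriting has completed, because the MP4/MOV header is only written at the very end. A hedged sketch of the usual shutdown sequence (assetWriter and videoInput stand in for your own writer objects):

```swift
import AVFoundation

// Sketch: the standard AVAssetWriter shutdown order. The file header is written
// during finishWriting; releasing the writer early leaves an unplayable file.
func stopRecording(assetWriter: AVAssetWriter,
                   videoInput: AVAssetWriterInput,
                   completion: @escaping (Error?) -> Void) {
    videoInput.markAsFinished()        // stop accepting sample buffers first
    assetWriter.finishWriting {        // asynchronously finalizes the file
        completion(assetWriter.error)  // only now is the file safe to open
    }
}
```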
I am working on generating an .mp4 video from an array of images with transition effects. Is there any library or SDK available for this? Please help.
Please check the video at the URL below to see what I am looking for.
https://www.dropbox.com/s/242azi2totylmaa/Screen%20Recording.mov?dl=0
I generated this video from a demo I found on GitHub, but it only plays the images with the transition effect on screen; I want to generate an actual .mp4 video file.
If anybody knows the solution, please help.
Thank you in advance.
You can achieve what you want by using AVVideoCompositionCoreAnimationTool and AVAssetExportSession from the AVFoundation framework to create a video from images with animation.
Here is a nice tutorial; see the section "Roger Rabbit, Eat Your Heart Out — Add Animation to Videos".
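A hedged sketch of that approach: lay the animated layers into a parent layer, attach it with AVVideoCompositionCoreAnimationTool, and export with AVAssetExportSession. baseAsset and overlayLayer are placeholders; note that the animation tool composites over an existing video track, so a pure slideshow needs a base clip (for example a blank video of the target duration) underneath.

```swift
import AVFoundation
import QuartzCore

// Sketch: burn an animated CALayer tree into a video and export it as .mp4.
func exportWithOverlay(baseAsset: AVAsset, overlayLayer: CALayer,
                       size: CGSize, to url: URL) {
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    videoLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)    // the video frames render here
    parentLayer.addSublayer(overlayLayer)  // animated images/transitions on top

    let videoComposition = AVMutableVideoComposition(propertiesOf: baseAsset)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(asset: baseAsset,
                                             presetName: AVAssetExportPresetHighestQuality)
    else { return }
    session.videoComposition = videoComposition
    session.outputURL = url
    session.outputFileType = .mp4
    session.exportAsynchronously {
        print("export finished:", session.status.rawValue, session.error ?? "no error")
    }
}
```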
I'm working on an iPad app that records and plays videos using AVFoundation classes. I have all of the code for basic record/playback in place, and now I would like to add a feature that allows the user to draw and make annotations on the video, something I believe will not be too difficult. The harder part, which I have not been able to find any examples of, will be combining the drawing and annotations into the video file itself. I suspect this part is accomplished with AVComposition, but I have no idea exactly how. Your help would be greatly appreciated.
Mark
I do not think that you can actually save a drawing into a video file on iOS. You could, however, keep the drawing in a separate, transparent view and synchronize that overlay with the video. In other words, if the user circled something at 3 minutes 42 seconds into the video, then on playback you overlay the saved drawing onto the video at the 3:42 mark (see the sketch below). It's not exactly what you want, but I think it is as close as you can get right now.
EDIT: Actually, there might be a way after all. Take a look at this tutorial. I have not read the whole thing, but it seems to incorporate the overlay function you need.
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
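Here is what the synchronized-overlay idea might look like in code, as a sketch: an AVPlayer boundary time observer reveals the saved drawing view at the 3:42 mark from the example above (player and drawingView are assumed to be set up elsewhere).

```swift
import AVFoundation
import UIKit

// Sketch: show a transparent drawing view over the player exactly at 3:42.
var boundaryObserver: Any?  // keep a strong reference so the observer stays alive

func armOverlay(player: AVPlayer, drawingView: UIView) {
    drawingView.isHidden = true
    let mark = CMTime(seconds: 3 * 60 + 42, preferredTimescale: 600)
    boundaryObserver = player.addBoundaryTimeObserver(forTimes: [NSValue(time: mark)],
                                                      queue: .main) {
        drawingView.isHidden = false  // the drawing appears in sync with playback
    }
}
```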
I want to make a video from images and merge recorded sound with this video.
After a lot of R&D and sample code, I am still unable to merge the two; so far I am only able to create the video from images.
Can anyone explain the concept behind merging video and audio?
I am using Xcode 4.5 and iOS 6.
Thanks in advance.
You should first search for previously answered questions on Stack Overflow.
Here is an answer describing how to add audio to a movie.
And here is an answer about creating a movie from images.
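The concept itself is small: put the video track and the audio track into one AVMutableComposition in parallel, both starting at time zero, then export the composition. A sketch with placeholder assets (modern Swift shown; the iOS 6-era Objective-C calls are analogous):

```swift
import AVFoundation

// Sketch: merge a silent video asset and a recorded audio asset into one composition.
func merge(videoAsset: AVAsset, audioAsset: AVAsset) throws -> AVMutableComposition {
    let mix = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: videoAsset.duration)

    if let videoTrack = videoAsset.tracks(withMediaType: .video).first,
       let track = mix.addMutableTrack(withMediaType: .video,
                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
        try track.insertTimeRange(range, of: videoTrack, at: .zero)
    }
    if let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
       let track = mix.addMutableTrack(withMediaType: .audio,
                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
        // The audio runs in parallel with the video rather than after it.
        try track.insertTimeRange(range, of: audioTrack, at: .zero)
    }
    return mix
}
```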
I am a complete beginner in Objective-C and iOS programming. I spent a month finding out how to show a 3D model using OpenGL ES (version 1.1) on top of a live camera preview using AVFoundation. I am building a kind of augmented reality application on iPad: I process the input frames and show a 3D object overlaid on the camera preview in real time. That part went fine, because there are so many sites and tutorials about these things (thanks to this website as well).
Now I want to capture the whole screen (the model with the camera preview as the background) as an image and show it on the next screen. I found a really good demonstration here: http://cocoacoderblog.com/2011/03/30/screenshots-a-legal-way-to-get-screenshots/. He does everything I want to do, but, as I said, I am such a beginner that I don't understand the whole project without a detailed explanation. So I have been stuck for a while, because I don't know how to implement this.
Does anybody know a good tutorial or other resource on this topic, or have any suggestions about what I should learn in order to do this screen capture? That would help me a lot in moving forward.
Thank you in advance.
I'm currently attempting to solve this same problem to allow a user to take a screenshot of an Augmented Reality app. (We use Qualcomm's AR SDK plugged into Unity 3D to make our AR apps, which saved me from ever having to learn how to programmatically render OpenGL models)
For my solution I am first looking at implementing the second answer found here: How to take a screenshot programmatically
Barring that, I will have to re-engineer the "Combined Screenshots" method found in CocoaCoder's Screenshots app.
I'll check back in when I figure out which one works better.
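For reference, the plain-UIKit half of a screenshot is just rendering the layer tree into an image; live camera previews and OpenGL content need the separate capture paths covered in the links that follow. A minimal sketch:

```swift
import UIKit

// Sketch: render a view hierarchy into a UIImage (plain UIKit content only;
// camera previews and OpenGL/Metal layers are not captured this way).
func snapshot(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { context in
        view.layer.render(in: context.cgContext)
    }
}
```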
Here are three very helpful links for capturing screenshots:
OpenGL ES View Snapshot
How to capture video frames from the camera as images using AV Foundation
How do I take a screenshot of my app that contains both UIKit and Camera elements
Enjoy