I have an array of images that I am passing to this class:
https://github.com/jberlana/iOSKenBurns
This basically takes all of the images and builds a slideshow from them and then adds an automatic and random ken burns effect.
Everything works awesome; however, I am now trying to add the ability to export the slide show to a movie file. There will be just a simple button to accomplish this, but I have no idea where to start for turning this into a video.
I have found this: QTMovie Class Reference, which is EXACTLY what I'm looking for (compile an array of images into a movie file), but I need to retain the Ken Burns effect the class has added, and I don't know if this applies to iOS either.
Not really sure if this is even possible, or what to do with it. Any help would be great, at least a point in the right direction! Thanks in advance!
The QTMovie class does not function under iOS. You can take a look at an example Xcode project I created to show off this sort of movie composition logic in my library; the Xcode project is at AVRender. This example creates a lossless output file that basically contains all the images as frames of video. If you want to actually convert that to H.264 when finished, there is another example on the same page named AVDecodeEncode that shows how to do that and play the results with AVPlayer. You could also roll your own code to do all this using the AVAsset APIs under iOS, but be warned that those APIs are not simple to use.
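If you do go the roll-your-own route, here is a minimal sketch of pushing an array of UIImages through AVAssetWriter. This is my own illustration, not code from AVRender: the `writeMovie` name, the fixed 30 fps timing, and the one-frame-per-image simplification are all assumptions (a real Ken Burns pan would need many interpolated frames per image).

```swift
import AVFoundation
import UIKit

// Sketch: write an array of UIImages to an H.264 .mov, one frame per image.
// Error handling and pixel-format details are kept to a minimum.
func writeMovie(from images: [UIImage], size: CGSize, to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB
        ])
    writer.add(input)
    guard writer.startWriting() else { throw writer.error ?? CocoaError(.fileWriteUnknown) }
    writer.startSession(atSourceTime: .zero)

    let fps: Int32 = 30
    for (i, image) in images.enumerated() {
        while !input.isReadyForMoreMediaData {          // crude backpressure
            Thread.sleep(forTimeInterval: 0.01)
        }
        guard let pool = adaptor.pixelBufferPool, let cgImage = image.cgImage else { continue }
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
        guard let pixelBuffer = buffer else { continue }

        // Draw the image into the pixel buffer with Core Graphics.
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: Int(size.width), height: Int(size.height),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

        _ = adaptor.append(pixelBuffer,
                           withPresentationTime: CMTime(value: CMTimeValue(i), timescale: fps))
    }
    input.markAsFinished()
    writer.finishWriting { /* async: the file is ready only once this fires */ }
}
```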
On iOS, my requirement is to prevent the user from taking a manual screenshot of my application, or at least to blur the captured screenshot. How can this be done?
The only solution is to simulate the iOS controls you have in your View using DRM'ed videos.
For each widget you need to create a video subclass that renders the widget, and apply DRM to the video.
You can try to do it yourself, or use a commercial solution such as the following:
https://screenshieldkit.com
It is possible, but I don't recommend it because of the room for error. It was easier to do in the past; as of iOS 13 you will have to do it like this:
You will have to ask for the user's permission to read and edit their photo library. Then you run a listener that checks the number of photos in their library while they are using your app; if that number changes, they have just taken a screenshot (unless you allow other things in your app, like tap-and-hold to save an image). When this happens, read said photo, apply a blur, delete the photo from their library, and save the blurred copy.
Warning: there are times when a user may get a photo while using your app that is not a screenshot (e.g. they received an AirDrop), and you would then be tampering with their photos, which is very bad. To prevent this you may need to use key-value pixel encoding on your screen at all times: for example, make the first 3 pixels of the screen 3 very specific RGB values, so that if a new photo is detected and its first 3 pixels have those exact RGB values, you know it is a screenshot of your app and not just another photo that happened to be saved while the user was using the app.
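For what it's worth, here is a minimal sketch of the library-watching part, assuming the app already holds photo-library permission. The `ScreenshotWatcher` name is mine, and the blur/delete steps are only outlined in comments:

```swift
import Photos

// Sketch: observe the photo library and treat a growing asset count as
// "a screenshot was just taken" (subject to all the caveats above).
final class ScreenshotWatcher: NSObject, PHPhotoLibraryChangeObserver {
    private var photos = PHAsset.fetchAssets(with: .image, options: nil)

    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit { PHPhotoLibrary.shared().unregisterChangeObserver(self) }

    // Note: called on a background queue.
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: photos) else { return }
        let updated = details.fetchResultAfterChanges
        let grew = updated.count > photos.count
        photos = updated
        guard grew, let newest = details.insertedObjects.last else { return }
        // Here you would load `newest`, verify your marker pixels, save a
        // blurred copy, then delete the original inside
        // PHPhotoLibrary.shared().performChanges { ... PHAssetChangeRequest.deleteAssets ... }
        _ = newest
    }
}
```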
There isn't any standard solution to your problem!
You can try some tricks. For example, if you force the user to keep a finger on the screen for the image to show, then I don't think they can take screenshots: as soon as the home+lock keys are pressed to actually take the screenshot, the screen behaves as if no fingers are touching it.
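A rough sketch of that trick (the `HoldToRevealView` name is mine): the photo is visible only while a touch is down, and because the capture cancels in-flight touches, `touchesCancelled` hides it again.

```swift
import UIKit

// Sketch of the "finger must stay on the screen" trick described above.
final class HoldToRevealView: UIView {
    private let imageView = UIImageView()

    init(image: UIImage, frame: CGRect) {
        super.init(frame: frame)
        isUserInteractionEnabled = true
        imageView.image = image
        imageView.frame = bounds
        imageView.contentMode = .scaleAspectFit
        imageView.alpha = 0                      // hidden until touched
        addSubview(imageView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        imageView.alpha = 1                      // reveal while a finger is down
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        imageView.alpha = 0
    }
    // Per the observation above, the capture behaves as if no finger is
    // touching, so the image is hidden at the moment of the screenshot.
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        imageView.alpha = 0
    }
}
```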
BUT what if the user takes a screenshot via AssistiveTouch?!
OR what do you want to do if the user records the screen and takes a screenshot from the video?
I think it's better to change your strategy: for example, notify the owner of the picture when someone else takes a screenshot of it (like Snapchat does)!
I am working on a sample application like Vine. My requirement is to create a 'ghost' filter for video, as in Vine.
The actual requirement is:
- Record a video on long press on the view
- On pausing the recording, show the last frame of the recorded video above my view. Please see the expected behaviour here
I have checked the PBJVision library and found this feature working there, but I need to implement the feature separately in my application.
While analysing the code, I found that this can be achieved using OpenGL ES. I have tried using a GLKView, but it just shows a dark shade instead of the image frame. Since I am new to this area, please help me.
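One GL-free way to get that last frame is AVAssetImageGenerator: grab the final frame of the recorded clip as a UIImage and show it in an ordinary UIImageView above the preview. A minimal sketch (the `lastFrame` helper is hypothetical, not part of PBJVision):

```swift
import AVFoundation
import UIKit

// Sketch: extract the final frame of a recorded clip as a UIImage.
func lastFrame(of url: URL, completion: @escaping (UIImage?) -> Void) {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true   // respect recording orientation
    // Ask for a time at (or just before) the very end of the clip.
    generator.requestedTimeToleranceAfter = .zero
    generator.requestedTimeToleranceBefore = .positiveInfinity
    let end = CMTimeSubtract(asset.duration, CMTime(value: 1, timescale: 30))
    DispatchQueue.global().async {
        let cgImage = try? generator.copyCGImage(at: end, actualTime: nil)
        DispatchQueue.main.async {
            completion(cgImage.map { UIImage(cgImage: $0) })
        }
    }
}
```

On pause you would call this with the just-recorded segment's URL and drop the result into a semi-transparent UIImageView over the camera view.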
I'm relatively new to iOS development and there is something I'm trying to achieve, but I'm out of luck finding good resources. I would like to embed subtitles into a video file for a project. The user records a video and, after recording, inputs some sentences; the text they input then gets embedded into the video as a subtitle, one word at a time. I think this is achieved through AVFoundation, but to my surprise there is relatively little information on AVFoundation beyond Apple's documentation (which is not of much use when you want to embed subtitles).
How might one go about doing this? How do I need to format the text input so that it becomes a subtitle track? How do I embed that subtitle track into the video file and export it so that the video always has the subtitles on it? I hope someone here knows about these things.
Thanks in advance.
If you are looking to create a QuickTime video with embedded subtitles (AKA soft subtitles), see the avsubtitleswriterOSX sample code from WWDC 2013.
If you prefer to burn subtitles into the video, you can use AVVideoCompositionCoreAnimationTool, as described in this tutorial.
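A minimal sketch of that burn-in route, assuming a single video track and one caption for the whole clip; the `burnSubtitle` name, styling, and static timing are placeholders (per-word timing would need keyframed CAAnimations with `AVCoreAnimationBeginTimeAtZero`-based begin times):

```swift
import AVFoundation
import UIKit

// Sketch: burn a caption into a video with Core Animation, then export.
func burnSubtitle(_ text: String, into asset: AVAsset, output: URL,
                  completion: @escaping () -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let size = track.naturalSize

    // Layer tree the animation tool expects: parent > (video layer, text layer).
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    videoLayer.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)

    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.fontSize = 36
    textLayer.alignmentMode = .center
    textLayer.foregroundColor = UIColor.white.cgColor
    textLayer.frame = CGRect(x: 0, y: 20, width: size.width, height: 50)
    parentLayer.addSublayer(textLayer)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let export = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    export.outputURL = output
    export.outputFileType = .mp4
    export.videoComposition = composition
    export.exportAsynchronously(completionHandler: completion)
}
```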
Sorry, I've asked a similar question before, but I've already spent 3 days struggling with a simple photo gallery in my app. I just need a gallery for an array of 1024x768 images, one that fits the pictures properly into the screen.
I've tried
ATPagingView - worked fine, but a 1024x768 image couldn't be resized properly for both orientations.
MWPhotoBrowser - couldn't get it to build because I'm using ARC; even after disabling ARC for the added files there was a compile error I couldn't get rid of.
Custom UIScrollView with pages, with a scroll view for each page and an image view inside it, but it didn't help (a sketch of this approach follows below).
Please help me somehow. Has somebody made galleries like that?
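For reference, here is a minimal sketch of that third approach with layout done in viewDidLayoutSubviews, so every rotation re-fits the aspect-fit pages to the new bounds (my own sketch, not taken from any of the libraries above):

```swift
import UIKit

// Sketch: a paging gallery where each page is an aspect-fit image view,
// so a 1024x768 image fits the screen in both orientations.
final class GalleryViewController: UIViewController {
    private let scrollView = UIScrollView()
    private var pages: [UIImageView] = []
    var images: [UIImage] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.isPagingEnabled = true
        scrollView.showsHorizontalScrollIndicator = false
        view.addSubview(scrollView)
        for image in images {
            let page = UIImageView(image: image)
            page.contentMode = .scaleAspectFit   // fits either orientation
            scrollView.addSubview(page)
            pages.append(page)
        }
    }

    // Re-run the layout on every size change (including rotation).
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        let size = view.bounds.size
        scrollView.frame = view.bounds
        scrollView.contentSize = CGSize(width: size.width * CGFloat(pages.count),
                                        height: size.height)
        for (i, page) in pages.enumerated() {
            page.frame = CGRect(x: CGFloat(i) * size.width, y: 0,
                                width: size.width, height: size.height)
        }
    }
}
```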
Give Nimbus a try. It is an open-source framework run by jverkoey, the ex-Three20 guy. It is not ARC'd yet, but you can turn ARC off for compilation and that works just fine; plus, they are moving to ARC soon and already have a git branch for it. It has a photo viewer class that I've not used yet, but I will need to, and that's only one of the reasons I'm using it. Pretty happy with it so far!
In the meantime I found the KTPhotoBrowser classes. They are nice and well-documented, and I implemented a very simple photo browsing gallery very quickly. Images are now resized properly and everything is okay. If you want a gallery, try this, for sure!
Are there any tutorials that show how to make an image display similar to the album art display in iTunes? Or anything similar? I followed code posted here, but I just cannot seem to get it working in the new Xcode. Opening his project works fine, but when I use it in my own, the UIImageView renders the images beyond its borders, making them appear over each other.
Any help would be appreciated.
I assume you're talking about CoverFlow?
I wrote a free, very easy to use CoverFlow library. It's modelled on the way that UITableView works, so if you can use that, you can use this. You can get it from here:
https://github.com/nicklockwood/iCarousel
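A minimal sketch of wiring it up as a cover flow, based on the UITableView-style data source the library advertises (check the repo's README for the exact current API; the 200-point item size is arbitrary):

```swift
import UIKit
// iCarousel added via CocoaPods or by dropping in the source files.

// Sketch: an iCarousel data source, reusing item views like table cells.
final class CoverFlowViewController: UIViewController, iCarouselDataSource {
    var covers: [UIImage] = []   // your album art

    override func viewDidLoad() {
        super.viewDidLoad()
        let carousel = iCarousel(frame: view.bounds)
        carousel.type = .coverFlow
        carousel.dataSource = self
        view.addSubview(carousel)
    }

    func numberOfItems(in carousel: iCarousel) -> Int { covers.count }

    func carousel(_ carousel: iCarousel, viewForItemAt index: Int,
                  reusing view: UIView?) -> UIView {
        let imageView = (view as? UIImageView)
            ?? UIImageView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
        imageView.image = covers[index]
        imageView.contentMode = .scaleAspectFill
        imageView.clipsToBounds = true   // keeps covers inside their bounds
        return imageView
    }
}
```

Clipping each item view to its bounds should also avoid the images-rendering-beyond-borders problem described in the question.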