Over the years, software has been introduced that gives a fake animation effect to a single photo so that the image appears to move, as in an animated GIF. I'm not talking about rotation, translation, or animated GIFs built from multiple photos, but rather mimicking video or Live Photos from a single photo by automagically perturbing its layers or pixels. After Effects, for example, lets you do this.
Does anyone know if something like this is possible with iOS libraries such as Core Animation?
It's available through UIView motion effects; check out UIInterpolatingMotionEffect and its UIInterpolatingMotionEffectType axis options.
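As a minimal sketch (in Swift, assuming iOS 7+), here is how a parallax-style motion effect can be attached to any view so it shifts slightly as the device tilts; the `amount` parameter and helper name are illustrative, not from any framework:

```swift
import UIKit

// Attach a parallax-style motion effect so the view's center shifts
// slightly as the device tilts along each axis.
func addParallax(to view: UIView, amount: CGFloat = 15) {
    let horizontal = UIInterpolatingMotionEffect(
        keyPath: "center.x", type: .tiltAlongHorizontalAxis)
    horizontal.minimumRelativeValue = -amount
    horizontal.maximumRelativeValue = amount

    let vertical = UIInterpolatingMotionEffect(
        keyPath: "center.y", type: .tiltAlongVerticalAxis)
    vertical.minimumRelativeValue = -amount
    vertical.maximumRelativeValue = amount

    // Group the two effects so they are applied together.
    let group = UIMotionEffectGroup()
    group.motionEffects = [horizontal, vertical]
    view.addMotionEffect(group)
}
```

Applied to a UIImageView that is slightly larger than its container, this gives the subtle "depth" movement the question describes, though it is device-tilt-driven rather than a self-playing animation.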
I'm fairly new to iOS development.
What I want to achieve is to put the stream from the camera in a UIView subclass (and size it with a frame).
So I don't need controls or the ability to capture images, just what the camera sees, on screen.
Furthermore, I want that view to be blurred. Is there a way (or a library) to put a Gaussian blur on that video stream?
Thank you!
You can use GPUImage (https://github.com/BradLarson/GPUImage); try the real-time effects it provides. That will solve your problem for sure.
To display the camera on screen without controls you will need to use AVFoundation. Take a look at Apple's SquareCam sample code.
As for the blur, a simpler solution might be creating a semi-transparent image with a blur effect and placing it above the camera view.
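The control-free preview can be sketched with AVFoundation like this (Swift; the `PreviewView` class name is mine, and error handling is minimal for brevity):

```swift
import AVFoundation
import UIKit

// Minimal sketch: show the live camera feed inside a plain UIView,
// with no controls and no capture pipeline.
final class PreviewView: UIView {
    private let session = AVCaptureSession()

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Wire the default video device into the session.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // A preview layer renders the feed; sizing follows this view's frame.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = bounds
        previewLayer.videoGravity = .resizeAspectFill
        layer.addSublayer(previewLayer)
        session.startRunning()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }
}
```

Since AVCaptureVideoPreviewLayer draws directly, it cannot be blurred in place; for a true live blur you would route frames through GPUImage instead, as suggested above.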
I'm working on an iPad app that records and plays videos using AVFoundation classes. I have all of the code for basic record/playback in place and now I would like to add a feature that allows the user to draw and make annotations on the video—something I believe will not be too difficult. The harder part, and something that I have not been able to find any examples of, will be to combine the drawing and annotations into the video file itself. I suspect this part is accomplished with AVComposition but have no idea exactly how. Your help would be greatly appreciated.
Mark
I do not think that you can actually save a drawing into a video file in iOS. You could however consider using a separate view to save the drawing and synchronize the overlay onto the video using a transparent view. In other words, the user circled something at time 3 mins 42 secs in the video. Then when the video is played back you overlay the saved drawing onto the video at the 3:42 mark. It's not what you want but I think it is as close as you can get right now.
EDIT: Actually there might be a way after all. Take a look at this tutorial. I have not read the whole thing but it seems to incorporate the overlay function you need.
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
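The core of that tutorial's approach is rendering a Core Animation layer over the video during export. A hedged sketch in Swift (the function name and parameters are mine; a real implementation needs proper error handling and frame-size handling):

```swift
import AVFoundation
import UIKit

// Sketch: burn an annotation layer into an exported video file using
// AVVideoCompositionCoreAnimationTool.
func exportVideo(asset: AVAsset, annotations: CALayer, to url: URL,
                 size: CGSize, completion: @escaping () -> Void) {
    // The video is rendered into videoLayer; annotations sit on top.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    annotations.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(annotations)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality)
    else { return }
    session.outputURL = url
    session.outputFileType = .mov
    session.videoComposition = composition
    session.exportAsynchronously(completionHandler: completion)
}
```

Time-coded annotations (the "circle at 3:42" case) can be expressed as CAAnimations on the annotation layer with `beginTime` offsets, so they appear and disappear at the right moments in the exported file.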
All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html) from Apple. It contains everything you need for high-quality image capture. I recently copy-pasted code from this project myself to solve the same task you describe. It works well :)
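The capture step itself boils down to a few calls on the still-image output. A sketch in Swift, assuming a session with an AVCaptureStillImageOutput is already configured (as in SquareCam); the function name is mine, and the APIs shown are the era-appropriate ones, deprecated in later iOS versions in favor of AVCapturePhotoOutput:

```swift
import AVFoundation
import UIKit

// Sketch: grab one still frame from a running capture session, with no
// UIImagePickerController and no shutter animation.
func captureStill(from stillOutput: AVCaptureStillImageOutput,
                  completion: @escaping (UIImage?) -> Void) {
    guard let connection = stillOutput.connection(with: .video) else {
        completion(nil)
        return
    }
    // Match the capture orientation to a landscape-only interface.
    connection.videoOrientation = .landscapeRight

    stillOutput.captureStillImageAsynchronously(from: connection) { buffer, _ in
        guard let buffer = buffer,
              let data = AVCaptureStillImageOutput
                  .jpegStillImageNSDataRepresentation(buffer)
        else {
            completion(nil)
            return
        }
        // Hand the image back for presentation/editing; nothing is
        // written to the camera roll.
        completion(UIImage(data: data as Data))
    }
}
```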
I have a three-second PNG sequence (a logo animation) that I'd like to display right after my iOS app launches. Since this is the only animated sequence in the app, I'd prefer not to use Cocos2D.
But with UIImageView's animationImages, the app runs out of memory on iPod Touch devices.
Is there a more memory-conscious/efficient way to show this animation? Perhaps a sprite-sheet class that doesn't involve Cocos2D? Or something else?
If this is an animated splash screen or similar, note that the HIG frowns on such behavior (outside of fullscreen games, at least).
If you're undeterred by such arguments (or making a game), you might consider saving your animation as an MPEG-4 video and using MPMoviePlayerController to present it. With a good compressor, it should be possible to get the size and memory usage down quite a lot and still have a good quality logo animation.
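A sketch of that approach in Swift, assuming the animation has been pre-encoded as a bundled "logo.mp4" (the file name and function are illustrative; MPMoviePlayerController was the era's API and was later replaced by AVPlayerViewController):

```swift
import MediaPlayer
import UIKit

// Sketch: play a short bundled video in place of a PNG sequence.
// The hardware decoder streams frames, so memory use stays low
// compared with holding every PNG in animationImages.
func playLogoAnimation(in parent: UIViewController) {
    guard let url = Bundle.main.url(forResource: "logo", withExtension: "mp4")
    else { return }
    let player = MPMoviePlayerController(contentURL: url)
    player.controlStyle = .none        // hide transport controls
    player.scalingMode = .aspectFill
    player.view.frame = parent.view.bounds
    parent.view.addSubview(player.view)
    player.play()
}
```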
I doubt you're going to find much improvement any other way -- a sprite sheet, for example, is still going to be doing the same kind of work as a sequence of PNGs. The problem is that for most animations, a lot of the pixels are untouched from frame to frame... if you're presenting it just as a series of images, you're wasting a lot of time and space on temporally duplicated pixels. This is why we have video codecs.
You could try manually loading/unloading the PNG images as needed. I don't know what your frame-rate requirements are. Also, consider a decent-quality JPEG or animated GIF. And you can always make the image smaller so it doesn't take up the whole screen. Just a few thoughts.
I want to use a UIImagePicker to display a camera preview. Over this preview I want to place an overlay view with controls.
Is it possible to apply any effects to the preview which will be displayed from camera? I particularly need to apply a blur effect to the camera preview.
So I want to have a blurred preview from the camera and an overlay view with controls. If I decide to capture a still image from the camera, I need it to be the original, without the blur effect. So the blur effect must be applied only to the preview.
Is this possible with such a configuration, or with AVFoundation used to access the camera preview, or in some other way—or is it impossible altogether?
With AVFoundation you could do almost everything you want, since you can obtain single frames from the camera and process them. But it could lead you to a dead end: applying a blur to an image in real time is a pretty intensive task, with laggy video as the likely result, and could cost you hours of coding. I would suggest the solution from James Webster's answer, or OpenGL shaders. Take a look at this awesome free library written by one of my favorite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you don't find the right filter, it will probably lead you to a correct implementation of what you want to do.
The right filter is a Gaussian blur, of course. I don't know if it is supported out of the box, but you could implement it yourself.
I almost forgot to say that in iOS 5 you have full access to Apple's Accelerate framework; you should look into that as well.
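GPUImage does in fact ship a Gaussian blur filter. A hedged sketch in Swift of the camera → blur → screen chain (names follow the GPUImage 1.x Objective-C API; the class name and radius value are mine). Because the blur lives only in the render chain, a still captured from the camera itself remains unfiltered, which is exactly the split the question asks for:

```swift
import GPUImage  // https://github.com/BradLarson/GPUImage
import UIKit

// Sketch: a blurred live preview via GPUImage. The filter applies only
// to what is rendered on screen, not to frames captured from the camera.
final class BlurredPreviewController: UIViewController {
    private var camera: GPUImageVideoCamera!
    private let blur = GPUImageGaussianBlurFilter()

    override func viewDidLoad() {
        super.viewDidLoad()
        camera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset640x480,
                                     cameraPosition: .back)
        camera.outputImageOrientation = .portrait
        blur.blurRadiusInPixels = 8.0

        let preview = GPUImageView(frame: view.bounds)
        view.addSubview(preview)

        // camera -> blur -> screen
        camera.addTarget(blur)
        blur.addTarget(preview)
        camera.startCapture()
    }
}
```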
From the reasonably limited amount of work I've done with UIImagePicker, I don't think it is possible to apply the blur to the image you see using programmatic filters.
What you might be able to do is use the overlay to simulate blur. You could do this, for example, by adding an overlay that contains an image of semi-transparent frosted glass.