Rendering CALayers with Core Animation - iOS

How is it possible to render a CALayer that has lots of animations to a .mov file?
I know how to render it with AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer, but for that I need a dummy video (I have a 1x1 px one). I'm wondering how this can be done without having an mp4/mov at all, i.e. simply rendering a CALayer to a video without any other resources.
Any ideas? Maybe even code examples?

This RenderCoreAnimationToVideo sample will help you. Check it out.
The example is for Mac OS, but it is easy to convert for iOS.
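If you want to avoid shipping a dummy movie altogether, one option (a sketch of my own, not taken from that sample; the size, frame rate, and URL handling here are placeholders) is to generate the blank source video in code with AVAssetWriter, and then run your existing AVVideoCompositionCoreAnimationTool export against it:

#import <AVFoundation/AVFoundation.h>

// Writes a short all-black movie that can serve as the source track for an
// AVAssetExportSession + AVVideoCompositionCoreAnimationTool pass.
- (void)writeBlankMovieToURL:(NSURL *)url size:(CGSize)size duration:(CMTime)duration
{
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:url
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    AVAssetWriterInput *input = [AVAssetWriterInput
        assetWriterInputWithMediaType:AVMediaTypeVideo
                       outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                         AVVideoWidthKey  : @(size.width),
                                         AVVideoHeightKey : @(size.height) }];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                   sourcePixelBufferAttributes:nil];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // One black pixel buffer, appended at the start and at the end of the
    // timeline so the track spans the whole animation duration.
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                        kCVPixelFormatType_32BGRA, NULL, &buffer);
    CVPixelBufferLockBaseAddress(buffer, 0);
    memset(CVPixelBufferGetBaseAddress(buffer), 0,
           CVPixelBufferGetBytesPerRow(buffer) * CVPixelBufferGetHeight(buffer));
    CVPixelBufferUnlockBaseAddress(buffer, 0);

    while (!input.readyForMoreMediaData) { /* spin; fine for a short offline write */ }
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
    while (!input.readyForMoreMediaData) { /* spin */ }
    [adaptor appendPixelBuffer:buffer withPresentationTime:duration];
    CVPixelBufferRelease(buffer);

    [input markAsFinished];
    [writer finishWritingWithCompletionHandler:^{
        // The blank movie at `url` is now ready to use as the video track for
        // your videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer: setup.
    }];
}

The second appended frame is only there to pin the track's duration; the export session then composites your animated CALayer over this blank track.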

Related

Drawing video frames (images) - iOS

I'm working on a project, and one of its features is playing video from an RTMP path. I'm trying to find the best way to draw the frames. Right now I'm using a UIImageView, which works, but it's not very elegant or efficient. OpenGL might do the trick, but I've never used it before. Do you have any ideas about what I should use? If you agree OpenGL is the way to go, can you give me a code snippet I could use for drawing the frames?
Thank you.
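For what it's worth, here is roughly what the two simplest approaches look like (a sketch; imageView, frameLayer, and decodedFrame are placeholder names for your image view, a plain CALayer, and the CGImageRef your RTMP decoder produces):

// What the question describes: swap the UIImageView's image each frame.
imageView.image = [UIImage imageWithCGImage:decodedFrame];

// A slightly leaner variant: hand the CGImage straight to a layer's contents,
// with implicit animations disabled so frames don't cross-fade.
[CATransaction begin];
[CATransaction setDisableActions:YES];
frameLayer.contents = (__bridge id)decodedFrame;
[CATransaction commit];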

Most efficient way for displaying procedurally generated video frames on iOS

I need to display a sequence of procedurally generated images as a video sequence, preferably with built-in controls (controls would be nice to have, but are not a requirement), and I'm just looking for a bit of guidance on which API to use. There seem to be a number of options, but I'm not sure which one is best suited to my needs: GPUImage, Core Video, Core Animation, OpenGL ES, or something else?
Targeting just iOS 6 and up would be no problem if that helps.
Update: I'd prefer something that lets me display the video frames directly rather than writing them to a temporary movie.
Check out the animationImages property of UIImageView. This may do what you are looking for. Basically, you store all of your images in that array and it handles the animation for you. UIImageView Reference
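A small sketch of that suggestion (generatedFrames is a placeholder NSArray of UIImage objects you have already built procedurally):

UIImageView *frameView = [[UIImageView alloc] initWithFrame:self.view.bounds];
frameView.animationImages = generatedFrames;                // one UIImage per frame
frameView.animationDuration = generatedFrames.count / 30.0; // ~30 fps playback
frameView.animationRepeatCount = 1;                         // play once; 0 loops forever
[self.view addSubview:frameView];
[frameView startAnimating];

Keep in mind that every frame is held in memory at once, so for long sequences this gets expensive quickly.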

Apply GPUImage filter to a UIView

I have a problem. I need to apply a filter like Pixelate or Blur to an entire UIView,
like the eBay iPad app does.
I thought about using GPUImage, but I don't know how to do it.
Is there a way to apply a filter to a GPUImageView directly, without going through a UIImage?
The main problem is that taking a screenshot of a large UIView on a 3rd-generation iPad is too expensive (2 seconds for the UIWindow grab). So the ideal solution would apply the filter directly to the views, just like the eBay app does, but... how?
Thanks to all!
To pull a view into GPUImage, you can use a GPUImageUIElement source, which takes a UIView or CALayer as input. There's an example of this in the FilterShowcase sample application.
This does rely on the -renderInContext: method of the underlying CALayer, which can be expensive for redrawing the view. However, if the view is static, you only need to run this update once and the resulting image will be cached on the GPU as a texture. Filter operations applied to it after that point will be very fast.
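A rough sketch of that pipeline (sourceView is the UIView you want to filter; the pixellate filter is just an example choice):

GPUImageUIElement *uiElementInput = [[GPUImageUIElement alloc] initWithView:sourceView];
GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:sourceView.bounds];
[self.view addSubview:filteredView];

[uiElementInput addTarget:pixellateFilter];
[pixellateFilter addTarget:filteredView];

// Capture the view (via -renderInContext:) and push it through the filter.
// For a static view this only needs to run once; afterwards the texture stays on the GPU.
[uiElementInput update];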
You might be able to achieve the look you are after by applying CIFilters to your view's layer.filters property. Check the docs for more info:
https://developer.apple.com/library/mac/#documentation/graphicsimaging/reference/CALayer_class/Introduction/Introduction.html
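For reference, the usage that answer is pointing at looks roughly like this; note the linked reference is the Mac one, and on iOS the filters property exists on CALayer but Core Image filters attached this way are generally ignored, so test it before relying on it:

CIFilter *pixellate = [CIFilter filterWithName:@"CIPixellate"];
[pixellate setDefaults];
[pixellate setValue:@10 forKey:@"inputScale"];
someView.layer.filters = @[ pixellate ];   // someView is a placeholder; works as documented on OS X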
Maybe this is something for you? I haven't tried it, but I read about it in a post once:
StackBlur
Oh sorry, I read your post again: this extension is about blurring a UIImage, and you said that was something you didn't want...
Well, I'll leave it here anyway in case people go googling for blurring an image.
Sorry :(

How to use OpenGL ES sharegroup to share a render buffer for screen mirroring on the iPad?

I'm trying to do screen mirroring on the iPad with OpenGL 1.1. I've got to the point of setting up the external window and view. I'm using OpenGL on the first screen, and I've read that I can setup a shared render buffer, but since I'm somewhat of an OpenGL beginner I'm having some trouble getting something up and running that can share a render buffer.
I've got as far as setting up two separate contexts and rendering different things to both, but of course I would like to share the render buffer for the sake of efficiency. The Apple documentation explains how to set up a sharegroup object and initialize a shared context, but I would also like to know how to set up and share a render buffer so that the external screen can just draw this render buffer to its framebuffer.
The eventual goal is to do the screen mirroring as efficiently as possible, so any advice on the matter would be most appreciated.
I think this topic in the cocos2d forums would be a good read for you (scroll down to the last posts).
Maybe you're not using cocos2d at all, but the information there is quite valuable, and there's some code too.
Good luck!
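For completeness, the sharegroup setup the question describes boils down to this (a minimal sketch; wiring each context's framebuffer to its screen is left out):

EAGLContext *mainContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

// The second context joins the first one's sharegroup, so textures, buffer
// objects, and renderbuffers created in either context are visible to both.
EAGLContext *externalContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1
                                                      sharegroup:mainContext.sharegroup];

// Each screen still gets its own framebuffer, but both can reference the same
// shared renderbuffer or texture when drawing.
[EAGLContext setCurrentContext:mainContext];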

Save an animated GIF from an ActionScripted animation

Is it possible to save a .fla file as an animated GIF from an ActionScripted animation? I know you can do it from a tweened animation quite easily, but I haven't been able to figure out a way to do it from a scripted one.
I am not 100% sure I understand your question.
If you want to write an ActionScript animation to a file from the Flash IDE, you can try QuickTime (.mov) by choosing:
File > Export Movie and choosing QuickTime as the output format. This will allow you to save the rendering of your ActionScript to a file. A handy video by Lee Brimlow is available here.
I don't think it's possible to save a GIF from the IDE for ActionScript, but if GIF is a must, you can try rendering with ActionScript into an array of BitmapData objects that you feed to a GIFEncoder.
Have a look at this fun and easy-to-use GIF Encoder by Thibault Imbert. I found it really easy to get started with.
Have fun!
In case you're still curious, check Converting SWF into GIF file
