Create an animated movie of a UIView - perhaps using snapshotViewAfterScreenUpdates - ios

In iOS 7 it is quite easy to create a screenshot using the new snapshotViewAfterScreenUpdates UIView method.
I would like to create a screen recording using this (perhaps a different method is better). The final outcome should be an animated gif - or something that can be exported and viewed in a media player.
Any suggested method to do this?
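One possible approach, sketched below: capture the view periodically with drawViewHierarchyInRect (the same iOS 7 snapshot family as snapshotViewAfterScreenUpdates, but one that can render into a bitmap) and feed the frames to an AVAssetWriter. This is a minimal sketch using current API names (AVVideoCodecType.h264 requires iOS 11); ViewRecorder is an illustrative helper, not an Apple API, and error handling and Retina scale are omitted.

import UIKit
import AVFoundation

/// Illustrative helper: records `view` at ~30 fps into a QuickTime movie.
final class ViewRecorder {
    private let view: UIView
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var timer: Timer?
    private var frameIndex: Int64 = 0
    private let fps: Int32 = 30

    init(view: UIView, outputURL: URL) throws {
        self.view = view
        writer = try AVAssetWriter(url: outputURL, fileType: .mov)
        let size = view.bounds.size
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)
        ])
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB),
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)
            ])
        writer.add(input)
    }

    func start() {
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        // Capture on the main thread; drawHierarchy(in:afterScreenUpdates:) requires it.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / Double(fps),
                                     repeats: true) { [weak self] _ in
            self?.captureFrame()
        }
    }

    func stop(completion: @escaping () -> Void) {
        timer?.invalidate()
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }

    private func captureFrame() {
        guard input.isReadyForMoreMediaData,
              let pool = adaptor.pixelBufferPool else { return }
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                width: CVPixelBufferGetWidth(buffer),
                                height: CVPixelBufferGetHeight(buffer),
                                bitsPerComponent: 8,
                                bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)!
        // Flip to UIKit's top-left origin, then draw the view hierarchy.
        context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(buffer)))
        context.scaleBy(x: 1, y: -1)
        UIGraphicsPushContext(context)
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
        UIGraphicsPopContext()
        CVPixelBufferUnlockBaseAddress(buffer, [])

        adaptor.append(buffer, withPresentationTime: CMTime(value: frameIndex, timescale: fps))
        frameIndex += 1
    }
}

To end up with an animated gif instead of a movie, the same captured frames could be appended to an ImageIO CGImageDestination of type kUTTypeGIF rather than to the asset writer.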

Related

What is PGHostedWindow in the window hierarchy on iPad, and how can I prevent its creation?

OK, here is the question. I have a tab bar controller, and in the tab at index 0 I have a table view with cells showing video via AVPlayerViewController. On iPhone, when I print the contents of UIApplication.shared.windows, it has only two windows: a UIWindow and a UITextEffectsWindow. But on iPad it has a UIWindow, a UITextEffectsWindow, and several PGHostedWindows (3-4, depending on the number of cells with video).
What are those PGHostedWindows? It seems to me that they are created along with the AVPlayer when its view is added to a cell's view hierarchy, which led me to the thought that it may be connected with the iPad's ability to show video in Picture in Picture mode. But even if I set AVPlayerViewController's allowsPictureInPicturePlayback to false, those windows are still created. And the worst part: even if I scroll the cells with video out of the visible area, or go to another tab, those PGHostedWindows are not deallocated.
So the question is: what are those PGHostedWindows, and how can I prevent their creation?
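For reference, the windows can be inspected with a snippet like the following (PGHostedWindow and UITextEffectsWindow are private system classes, so only their names are visible to the app):

import UIKit

for window in UIApplication.shared.windows {
    // On iPad this prints UIWindow, UITextEffectsWindow and the
    // system-managed PGHostedWindow instances backing the players.
    print(type(of: window), "hidden:", window.isHidden, "level:", window.windowLevel.rawValue)
}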
I was trying to debug an AVKit issue with restoring from Picture in Picture and also noticed and wondered about PGHostedWindow. As another commenter said, I don't see a reason for concern about its creation or lifetime. My understanding is that AVPlayer moves the AVPlayerLayer between windows when moving in and out of PiP, and this window is one that the system manages.
Of note, and what led me here: I noticed there is a longstanding issue where any subviews on the AVPlayerLayer get clipped by the video after the view is reinstalled following PiP. Apple's sample code has the same issue if the controls are extended. So maybe there are a few issues with PGHostedWindow, but we're stuck with them.

How to force SCNView to render a new frame?

Is there a way to force SCNView to render a new frame on demand if there is no animation inside the scene? If the scene is static, SCNView renders exactly once and then only after something changes.
Usually this makes sense, but I am working with the Vuforia augmented reality framework, which requires me to render a new frame every time it processes a new video frame from the camera. I worked around this issue by creating my own UIView with a CAEAGLLayer that renders the SceneKit content using an SCNRenderer. This works great, but I am curious whether there is a way to do this with SCNView so I can avoid touching OpenGL ES directly.
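For reference, the SCNRenderer workaround described above looks roughly like this (a sketch assuming the OpenGL ES pipeline the question uses, which Apple has since deprecated in favor of Metal; `scene` and the framebuffer setup are assumed to exist):

import SceneKit
import QuartzCore
import OpenGLES

let glContext = EAGLContext(api: .openGLES2)!
let renderer = SCNRenderer(context: glContext, options: nil)
renderer.scene = scene

// Call once per processed Vuforia camera frame, with the target
// framebuffer bound, on the thread that owns the GL context:
renderer.render(atTime: CACurrentMediaTime())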
Update
As of iOS 11.0 and macOS 10.13 the rendersContinuously property on SCNView is the preferred way to force the view to continuously render frames.
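For example, assuming sceneView is the SCNView in question:

sceneView.rendersContinuously = true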
Previous answer
You can set its playing property to YES.
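In Swift, where the property is imported as isPlaying, that is:

sceneView.isPlaying = true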
You can increment the sceneTime like so:
sceneView.sceneTime += 1
It will then render a single frame.
Since SCNView is a UIView, you can request an update the same way you would with any other UIView: [_sceneView setNeedsDisplay];. However, as pointed out, this shouldn't be used as the primary way to drive updates, as it is not synced with the display.

Display 3D-Object with COCOS3D on top of AR-View

I am working on an app which scans an image and shows you a 3D object or a video on top of the image target - a normal AR app. For the AR part I use the Vuforia SDK. The problem is that the Vuforia SDK doesn't support animated 3D objects, so for those I use cocos3d.
So I created a basic cocos3d app and included my Vuforia code for the AR. This works well, and the app displays normal 3D objects and videos. That was the background; now the problem.
The first view I have is my camera view, which scans the images. When I scan a specific target, I want to show an animated 3D object. For that, I display the cocos3d view on top of the AR view. The cocos3d view is transparent and is displayed on top of the AR view (tested with a simple button in the cocos3d view).
The problem is that I am not able to display an animated 3D object. I tested some options, but none of them worked because I don't really have an idea how to do that. My current code:
// Set up the shared director with an OpenGL view covering the window.
CCDirector *director = [CCDirector sharedDirector];
EAGLViewCC *glView = [EAGLViewCC viewWithFrame:[window bounds]
                                   pixelFormat:kEAGLColorFormatRGBA8
                                   depthFormat:GL_DEPTH_COMPONENT16_OES];
[director setOpenGLView:glView];
[window addSubview:director.openGLView];
After that I have a layer and add the test scene to my layer (standard, from the example). But then I don't know how to display it. I tried this:
[director pushScene:scene];
but had no luck. In the example, they use this code to show the object (viewController is of type "CC3DeviceCameraOverlayUIViewController"):
[viewController runSceneOnNode: mainLayer];
Why don't I use the viewController? Because I couldn't get its view transparent. So how do I get the 3D object displayed in my view? What do I have to do? Am I completely wrong here?

How to take a UIView screenshot faster?

I'm making a dictionary app which lets the user look up a word's definition by touching the word on screen; a magnifier appears on screen and follows the user's finger. I implemented it by taking a screenshot of the view and assigning the image to the magnifier image view, a UIImageView. However, the screenshot method [self.layer renderInContext:c]; costs too much time. Is there another way to do it? Maybe OpenGL would help?
After profiling my app with Instruments (Core Animation), it only reaches 9 fps while showing the magnifier, whereas the system default magnifier in a UITextView runs at 30 fps. I don't know why the system's is so fast.
You can use the exact same view inside the magnifier view and change its position so the touched words are visible.
There are new methods in iOS 7 that are highly optimized:
– snapshotViewAfterScreenUpdates:
– resizableSnapshotViewFromRect:afterScreenUpdates:withCapInsets:
– drawViewHierarchyInRect:afterScreenUpdates:
However they are not available in previous versions.
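For example, a snapshot helper built on drawViewHierarchyInRect might look like this (shown in Swift; the snapshotImage(of:) name is illustrative):

import UIKit

func snapshotImage(of view: UIView) -> UIImage? {
    // `true` assumes an opaque view, which avoids compositing an alpha channel.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, true, 0)
    defer { UIGraphicsEndImageContext() }
    // afterScreenUpdates: false returns the last rendered contents without
    // waiting for a new render pass, which is what keeps it fast enough
    // to call repeatedly while tracking a touch.
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    return UIGraphicsGetImageFromCurrentImageContext()
}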

AVCam Demo OverscanCompensation implementation

I'm currently working with the AVCam demo app to present a live camera feed over AirPlay or the Apple HDMI adapter, for import into an HD camera switcher.
The issue I'm having is with overscan compensation: removing the huge black border from the mirrored view.
The only documentation I have found says to set screen.overscanCompensation = 3; somewhere. I tried putting it in viewDidLoad, and it compiles, but it doesn't change anything on the external display.
I had success of sorts with the AirPlay demo (quellish) using UIImagePicker, but I would much prefer to implement this with AVFoundation.
Is there a better way to achieve what I'm looking for without having to implement separate view controllers?
All you need to do, upon setting up the external screen (via, say, if ([[UIScreen screens] count] > 1) externalScreen = (UIScreen *)[[UIScreen screens] objectAtIndex:1];), is set the overscanCompensation property of that UIScreen instance to UIScreenOverscanCompensationInsetApplicationFrame (= 2). It entirely gets rid of both the border (overscanning) and the image-quality-deteriorating scaling.
See http://www.iphonelife.com/blog/87/tv-display-output-why-does-your-picture-have-black-border-and-how-can-it-be-fixed for more info.
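In Swift the same idea looks roughly like this (later SDKs renamed the UIScreenOverscanCompensationInsetApplicationFrame constant, so treat the exact case name as version-dependent):

if UIScreen.screens.count > 1 {
    let external = UIScreen.screens[1]
    // Inset the application frame instead of scaling; raw value 2.
    external.overscanCompensation = .insetApplicationFrame
}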
