Video Preview Layer Running While in Split-View? - ios

I currently have an app that displays the front-facing camera feed in a video preview layer. By default in iOS 9, the preview layer is interrupted/paused when the app enters Split View and will not resume until Split View is dismissed. Given the nature of the app, keeping the camera preview running while multitasking is essential.
Is there any way to force the capture session to continue previewing while in split view?

Update: It seems Apple does not allow any sort of camera capture while the device has more than one application in the foreground. You can, however, invoke UIImagePickerController to take a photo while in Split View. Of course, this solution only lets you snap a single photo and nothing more. Hope this helps someone!
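For anyone hitting the same limitation, a minimal sketch of detecting the interruption and falling back to the system picker might look like the following (the capture session setup is assumed to exist elsewhere, and the picker fallback is just an illustration):

```swift
import AVFoundation
import UIKit

final class CameraViewController: UIViewController,
                                  UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    let session = AVCaptureSession()  // assumed to be configured and started elsewhere

    override func viewDidLoad() {
        super.viewDidLoad()
        // iOS 9+ reports why a session was interrupted, e.g. when another
        // foreground app (Split View / Slide Over) makes the camera unavailable.
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(sessionWasInterrupted(_:)),
                                               name: .AVCaptureSessionWasInterrupted,
                                               object: session)
    }

    @objc private func sessionWasInterrupted(_ note: Notification) {
        guard let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
              let reason = AVCaptureSession.InterruptionReason(rawValue: value),
              reason == .videoDeviceNotAvailableWithMultipleForegroundApps else { return }

        // The live preview cannot continue in Split View; offer the system
        // picker so the user can at least snap a single photo.
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        present(picker, animated: true)
    }
}
```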

Related

How to make a screen-recording guide video for an iOS device

I'm creating an iOS app that records the screen using ReplayKit.
I want to make a user-operation video to guide users on how to use my app to record the screen.
So I need to record a video of how to record the screen, and this approach is a bit awkward.
I tried several ways, such as Screen Mirroring via AirPlay and QuickTime, but the recording button in Control Center is disabled whilst Screen Mirroring is in use.
The effect I want is as follows:
https://support.apple.com/library/content/dam/edam/applecare/images/en_US/iOS/ios13-iphone-xs-control-center-screen-record-animation.gif
How was this video made?
I think you could recreate that video by recording your screen while performing every action you want to show, saving the video, and then editing it on top of an iPhone image asset.
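Not directly related to compositing the guide video itself, but since the in-app recording is done with ReplayKit, a bare-bones start/stop sketch (the view controller and button actions are just placeholders) could look like this:

```swift
import UIKit
import ReplayKit

final class RecorderViewController: UIViewController {
    private let recorder = RPScreenRecorder.shared()

    @IBAction func startTapped(_ sender: Any) {
        recorder.startRecording { error in
            if let error = error { print("Could not start recording: \(error)") }
        }
    }

    @IBAction func stopTapped(_ sender: Any) {
        recorder.stopRecording { previewController, error in
            // ReplayKit hands back its own preview/share UI when recording stops.
            if let preview = previewController {
                self.present(preview, animated: true)
            }
        }
    }
}
```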

Unity ReplayKit how to bypass native preview?

Using the Unity ReplayKit API from https://docs.unity3d.com/550/Documentation/ScriptReference/Apple.ReplayKit.ReplayKit.html
I can record, preview, and share screen recordings just fine; however, due to design and UX requirements, I need custom buttons to invoke sharing/saving of the recorded clip. I need to show the recorded clip right after capture (for example, as a video texture on a plane) in the background, with custom share-button overlays.
Is there a way to access the clip captured by ReplayKit and bypass iOS native preview screen?
With iOS 11, you can have direct access to the video. See https://developer.apple.com/videos/play/wwdc2017/606/
You can save it using AVAssetWriter, then do whatever you want with it after that.
This project doesn't completely work, but it will point you in the right direction: https://medium.com/@giridharvc7/replaykit-screen-recording-8ee9a61dd762
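For reference, a hedged sketch of the iOS 11 route (startCapture plus AVAssetWriter) in native Swift might look like the code below; the output path and video settings are assumptions, and bridging the resulting file back into Unity (e.g. via a native plugin) is left out:

```swift
import UIKit
import ReplayKit
import AVFoundation

final class ClipWriter {
    private var writer: AVAssetWriter?
    private var videoInput: AVAssetWriterInput?

    // Output location is an assumption; pick any writable path in your app.
    private let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("clip.mp4")

    func start() throws {
        try? FileManager.default.removeItem(at: outputURL)
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(UIScreen.main.bounds.width * UIScreen.main.scale),
            AVVideoHeightKey: Int(UIScreen.main.bounds.height * UIScreen.main.scale)
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        self.writer = writer
        self.videoInput = input

        // iOS 11+: receive raw sample buffers instead of letting ReplayKit
        // keep the clip and force its own preview screen.
        RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, type, error in
            guard error == nil, type == .video else { return }
            if writer.status == .unknown {
                writer.startWriting()
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            }
            if input.isReadyForMoreMediaData {
                input.append(sampleBuffer)
            }
        }, completionHandler: { error in
            if let error = error { print("Capture failed to start: \(error)") }
        })
    }

    func stop(completion: @escaping (URL) -> Void) {
        RPScreenRecorder.shared().stopCapture { _ in
            self.videoInput?.markAsFinished()
            self.writer?.finishWriting {
                // The finished file can now be presented with custom UI,
                // e.g. drawn as a video texture on a plane with share overlays.
                completion(self.outputURL)
            }
        }
    }
}
```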

How to implement an Apple-like camera app using AVCaptureSession?

In the iOS 9 Camera app, the controls overlay does not rotate, but when 'Record' is selected, the video is output with the correct orientation. Has anyone got any ideas how Apple implemented this?
You've got a few different options to achieve the same effect:
You can rotate the camera output to match UIDevice.currentDevice().orientation when the device itself is rotated, while keeping the UI the same.
You could store the device orientation at the time of capture and then rotate the video afterwards.
You could even set the capture connection of the capture session to match the device's orientation while keeping the preview connection the same. Check this out for a similar question.
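To illustrate that third option, a small sketch (assuming an AVCaptureMovieFileOutput on your session) that stamps only the recording connection with the device orientation could look like this:

```swift
import UIKit
import AVFoundation

// Leave the preview layer's connection alone (so the on-screen controls never
// rotate) and update only the movie output's connection just before recording.
func updateCaptureOrientation(for movieOutput: AVCaptureMovieFileOutput) {
    guard let connection = movieOutput.connection(with: .video),
          connection.isVideoOrientationSupported else { return }

    // Note the left/right swap: device orientation and video orientation
    // are mirrored relative to each other.
    switch UIDevice.current.orientation {
    case .landscapeLeft:      connection.videoOrientation = .landscapeRight
    case .landscapeRight:     connection.videoOrientation = .landscapeLeft
    case .portraitUpsideDown: connection.videoOrientation = .portraitUpsideDown
    default:                  connection.videoOrientation = .portrait
    }
}
```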
I implemented exactly what you're looking for using 2 different UIWindow instances, one for the video preview layer and one for the UI to be displayed on top of it. The UIWindow with the UI needs to have a transparent background in order for the preview layer to remain visible.
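A rough sketch of that two-window setup (PreviewViewController and ControlsViewController are hypothetical view controllers standing in for your own):

```swift
import UIKit

// One window hosts only the preview layer; a second, transparent window sits
// above it and holds the controls, so each can handle rotation independently.
func makeCameraWindows() -> (preview: UIWindow, overlay: UIWindow) {
    let previewWindow = UIWindow(frame: UIScreen.main.bounds)
    previewWindow.rootViewController = PreviewViewController()   // hypothetical: wraps the preview layer
    previewWindow.makeKeyAndVisible()

    let overlayWindow = UIWindow(frame: UIScreen.main.bounds)
    overlayWindow.rootViewController = ControlsViewController()  // hypothetical: record button, etc.
    overlayWindow.backgroundColor = .clear                       // keep the preview visible underneath
    overlayWindow.windowLevel = UIWindow.Level(rawValue: UIWindow.Level.normal.rawValue + 1)
    overlayWindow.isHidden = false

    return (previewWindow, overlayWindow)
}
```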

Camera feed from iPhone to Watch and image editing

Even with the release of watchOS 2, it seems developers still can't access the iPhone's camera for a live feed/stream. So I was wondering about different ways this could be done; basically, I want to see on my Apple Watch what is visible through my iPhone's camera.
Apple's built-in camera app (on the Watch) does this, and so does the garage door control app.
I read an approach to this problem in the post below and found it quite feasible, but I still have a few questions and doubts:
Camera live View in apple watch
At what frame rate should the images be captured? For example, for 10 frames/second, should a method that captures an image be fired every 0.1 seconds?
Aside from this, will the image be shared from iPhone to Watch using a shared App Group or MMWormhole? Or is MMWormhole just for "notifications" between the devices about what's going on?
Secondly, do I need to physically save each image on the device, transfer it, and delete it (from the iPhone and Apple Watch) after the next image comes in, or can I just work with an image object in memory?
Apart from that, I also want to show an overlay frame over the camera feed (like how image-editing apps on the iPhone show frames, etc.).
So, for this, should I merge/overlay my frame image onto the feed image directly on the iPhone side before sending the frame over to the Watch?
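On the sharing question: App Groups and MMWormhole were mainly used for passing small bits of data and notifications in the WatchKit 1 era; with watchOS 2 the direct route is WatchConnectivity, which can send small JPEG frames straight from memory, so nothing has to be written to disk and deleted per frame. A rough sketch, assuming a hypothetical captureCurrentFrame() helper that returns the latest, already-composited camera frame on the iPhone side:

```swift
import UIKit
import WatchConnectivity

final class FrameStreamer: NSObject, WCSessionDelegate {
    private var timer: Timer?

    func startStreaming() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()

        // ~10 frames per second: fire every 0.1 s and push a small JPEG.
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            self?.sendCurrentFrame()
        }
    }

    private func sendCurrentFrame() {
        guard let image = captureCurrentFrame(),
              let data = image.jpegData(compressionQuality: 0.4),
              WCSession.default.isReachable else { return }

        // Frames live only in memory; nothing is written to disk or deleted.
        WCSession.default.sendMessageData(data, replyHandler: nil) { error in
            print("Frame dropped: \(error)")
        }
    }

    // Hypothetical: pull the latest frame (with any overlay already drawn on
    // the phone side) from your AVCaptureVideoDataOutput pipeline.
    private func captureCurrentFrame() -> UIImage? {
        return nil
    }

    // MARK: - WCSessionDelegate (minimum required stubs)
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```

On the Watch side, the counterpart delegate method session(_:didReceiveMessageData:) would receive the data and turn it back into a UIImage for display.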

Swift - Slow Motion Camera in App

I've been learning Swift and iOS development and am interested in making an application that uses the camera on the back of the iPhone to shoot slow-motion video and then allows the user to use a slider to move through the video normally, but also to move frame by frame. Of course, this would be regulated by a button that transitions the user from "normal navigation" to frame-by-frame. Can anyone point me in the right direction? Are there any resources where I can read about using the "slo-mo" feature in a custom app? Thanks.
If you don't want the user to be able to save the slow-mo video, what you could do is take a video shot, then have the app play it back in slow-mo. You can do all this using the AVFoundation framework. Here are some links to help you out:
Taking Control of the iPhone Camera in iOS 8
Objective-C: How to do slow-motion video in iOS (yes, this is for Objective-C, but you can change the code to Swift or just copy and paste it into a new Objective-C file)
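Putting those together, a hedged sketch of the playback side with AVFoundation (the clip URL is whatever your capture code produced; the slider wiring is omitted):

```swift
import AVFoundation

final class SlowMoPlayerController {
    let player: AVPlayer

    init(videoURL: URL) {
        player = AVPlayer(url: videoURL)
    }

    // "Slow-mo" playback: a rate below 1.0 plays the same clip back slower.
    func playSlowMotion() {
        player.play()
        player.rate = 0.25
    }

    // Map a slider value (0...1) onto the timeline with precise seeking so the
    // displayed frame matches the slider position exactly.
    func scrub(to fraction: Double) {
        guard let duration = player.currentItem?.duration, duration.isNumeric else { return }
        let target = CMTimeMultiplyByFloat64(duration, multiplier: fraction)
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
    }

    // Frame-by-frame mode: pause and step one frame forward or backward.
    func step(forward: Bool) {
        player.pause()
        player.currentItem?.step(byCount: forward ? 1 : -1)
    }
}
```

For true 120/240 fps slow-mo capture you would additionally pick an AVCaptureDevice format that supports a high frame rate and set the device's minimum and maximum frame durations accordingly.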
