Swift - Slow Motion Camera in App - iOS

I've been learning Swift and iOS development and am interested in making an application that uses the rear camera on the iPhone to shoot slow-motion video, then lets the user drag a slider to move through the video normally but also step through it frame by frame. This would be controlled by a button that switches the user from "normal navigation" to frame-by-frame mode. Can anyone point me in the right direction? Are there any resources where I can read about using the "slo-mo" feature in a custom app? Thanks.

If you don't need the user to be able to save the slow-mo video itself, you could record a regular video and then have the app play it back in slow motion. You can do all of this with the AVFoundation framework. Here are some links to help you out:
Taking Control of the iPhone Camera in iOS 8
Objective-C: How to do slow-motion video in iOS (yes, this is for Objective-C, but you can change the code to Swift or just copy and paste it into a new Objective-C file)
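
The playback half of this can be sketched with AVFoundation's AVPlayer: a rate below 1.0 gives the slow-motion effect, seek(to:) can be driven by the slider, and AVPlayerItem's step(byCount:) handles the frame-by-frame mode. A rough Swift sketch, assuming the clip has already been recorded to a local file (the class name and the videoURL parameter are just placeholders):

```swift
import AVFoundation
import UIKit

final class SlowMoPlaybackController {
    let player: AVPlayer

    // `videoURL` is a placeholder for the recorded clip's local file URL.
    init(videoURL: URL) {
        player = AVPlayer(url: videoURL)
    }

    func attach(to view: UIView) {
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspect
        view.layer.addSublayer(layer)
    }

    // "Normal navigation": drive this from the slider (value in 0...1).
    func seek(toFraction fraction: Double) {
        guard let duration = player.currentItem?.duration, duration.isNumeric else { return }
        let target = CMTimeMultiplyByFloat64(duration, multiplier: fraction)
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
    }

    // Slow-motion playback: rates below 1.0 play slower than real time.
    func playSlowMotion(rate: Float = 0.25) {
        player.rate = rate
    }

    // Frame-by-frame mode: pause, then step one frame at a time.
    func step(forward: Bool) {
        player.pause()
        player.currentItem?.step(byCount: forward ? 1 : -1)
    }
}
```

Capturing true high-frame-rate footage (120/240 fps) is a separate step, done by selecting a suitable activeFormat on the AVCaptureDevice before recording.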

Related

Unity ReplayKit how to bypass native preview?

Using the Unity ReplayKit API from https://docs.unity3d.com/550/Documentation/ScriptReference/Apple.ReplayKit.ReplayKit.html
I can record, preview and share screen recordings just fine; however, due to design & UX requirements I need custom buttons to invoke sharing/saving of the recorded clip. I need to show the recorded clip right after the capture (for example, as a video texture on a plane) in the background, with custom share-button overlays.
Is there a way to access the clip captured by ReplayKit and bypass iOS native preview screen?
With iOS 11, you can have direct access to the video. See https://developer.apple.com/videos/play/wwdc2017/606/
You can save it using AVAssetWriter, then do whatever you want with it after that.
This project doesn't completely work, but it will point you in the right direction: https://medium.com/@giridharvc7/replaykit-screen-recording-8ee9a61dd762
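
For the iOS 11 startCapture route mentioned above, a rough Swift sketch of feeding ReplayKit's sample buffers into an AVAssetWriter might look like this (the class name, output settings, and outputURL are illustrative assumptions, and audio buffers are ignored):

```swift
import ReplayKit
import AVFoundation
import UIKit

final class ClipRecorder {
    private var writer: AVAssetWriter?
    private var videoInput: AVAssetWriterInput?
    private var sessionStarted = false

    // `outputURL` is a placeholder; point it at a writable temporary .mp4 path.
    func start(outputURL: URL) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let scale = UIScreen.main.scale
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(UIScreen.main.bounds.width * scale),
            AVVideoHeightKey: Int(UIScreen.main.bounds.height * scale)
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        self.writer = writer
        self.videoInput = input

        // iOS 11+: receive the screen's sample buffers directly, no native preview.
        RPScreenRecorder.shared().startCapture(handler: { [weak self] buffer, type, error in
            guard let self = self, error == nil, type == .video else { return }
            if !self.sessionStarted {
                writer.startWriting()
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(buffer))
                self.sessionStarted = true
            }
            if input.isReadyForMoreMediaData {
                input.append(buffer)
            }
        }, completionHandler: nil)
    }

    func stop(completion: @escaping (URL?) -> Void) {
        RPScreenRecorder.shared().stopCapture { [weak self] _ in
            self?.videoInput?.markAsFinished()
            self?.writer?.finishWriting {
                completion(self?.writer?.outputURL)
            }
        }
    }
}
```

Once finishWriting completes, the file at outputURL can be played back in your own UI (for example as a texture) instead of presenting the native RPPreviewViewController.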

Video Preview Layer Running While in Split-View?

I currently have an app that displays the front-facing camera feed in a video preview layer. By default in iOS 9, the preview layer is interrupted/paused when the app enters Split View and will not resume until Split View is dismissed. Given the nature of the app, keeping the camera preview layer running while multitasking is essential.
Is there any way to force the capture session to continue previewing while in split view?
Update: It seems Apple does not allow any camera use while the device has more than one application open. You can, however, invoke UIImagePickerController to take a photo while in Split View. Of course, this solution only lets you snap a single photo and nothing more. Hope this helps someone!
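
A minimal sketch of that UIImagePickerController workaround, assuming the presenting view controller already conforms to the picker's delegate protocols:

```swift
import UIKit

// Presents the system camera picker from a view controller.
// This is the single-photo workaround described above, not a live preview.
func presentCameraPicker(from viewController: UIViewController & UIImagePickerControllerDelegate & UINavigationControllerDelegate) {
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}
```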

Trying to create an Xcode Objective-C function that records a video capture of my UIView contents and saves to phone

I'm trying to create an Xcode Objective-C function that can be called from a button tap, that will record the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then allow the user to save the video to their iPhone camera roll. I also want to include the audio being played by the app at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes like the App Store video preview. I need my video captured within the app, to be used as an app feature.
Does anyone know if this can be done or know a website or tutorial where I can learn what's needed? Thanks
I know this post is two years old, but for anybody who comes along who might need to record their iOS app's screens and save them to the phone's camera roll or even a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frames per second aren't 60 so I don't know how well it would work if you were trying to record an action game, but it's still pretty awesome.
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen
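
One common approach (roughly what ASScreenRecorder does) is to render the view's layer into a CVPixelBuffer on every CADisplayLink tick and append it through an AVAssetWriterInputPixelBufferAdaptor. A rough Swift sketch under those assumptions; the class name is illustrative, and app-audio capture plus saving to the camera roll are omitted:

```swift
import UIKit
import AVFoundation

final class ViewRecorder {
    private let targetView: UIView
    private var writer: AVAssetWriter!
    private var input: AVAssetWriterInput!
    private var adaptor: AVAssetWriterInputPixelBufferAdaptor!
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    init(view: UIView) { self.targetView = view }

    // `outputURL` is a placeholder for a writable .mp4 path.
    func start(outputURL: URL) throws {
        let size = targetView.bounds.size
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)
            ])
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        startTime = CACurrentMediaTime()

        // Grab one frame per screen refresh.
        displayLink = CADisplayLink(target: self, selector: #selector(captureFrame))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func captureFrame() {
        guard input.isReadyForMoreMediaData, let pool = adaptor.pixelBufferPool else { return }
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        if let context = CGContext(
            data: CVPixelBufferGetBaseAddress(buffer),
            width: CVPixelBufferGetWidth(buffer),
            height: CVPixelBufferGetHeight(buffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            // Flip so UIKit's top-left origin lands the right way up in the buffer.
            context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(buffer)))
            context.scaleBy(x: 1, y: -1)
            targetView.layer.render(in: context)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        let elapsed = CACurrentMediaTime() - startTime
        adaptor.append(buffer, withPresentationTime: CMTime(seconds: elapsed, preferredTimescale: 600))
    }

    func stop(completion: @escaping () -> Void) {
        displayLink?.invalidate()
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Note that layer.render(in:) won't capture video layers or GPU-rendered content, which is one reason the linked projects are more involved than this sketch.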

iOS App: AVFoundation Video camera recording delegates

I am trying to develop an app that records video using the rear camera.
I want the video to be tweaked (adding an overlay image, green screen) before saving it to a file.
I am using AVFoundation and have researched everywhere, but I still can't put together the correct code to preview and record (even without the tweaking).
Some code I've found on the internet uses a BufferDelegate, while the rest uses a RecordingDelegate.
Do I need to use both to achieve what I want, or just one of them?
Could you please point me toward how I can achieve this?
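
For the untweaked preview-and-record case, the recording delegate on its own is enough: AVCaptureMovieFileOutput writes the file and calls AVCaptureFileOutputRecordingDelegate when it finishes. The sample buffer delegate (AVCaptureVideoDataOutputSampleBufferDelegate) only becomes necessary once you want per-frame access for the overlay / green-screen processing. A minimal Swift sketch of the recording-delegate path (class and method names here are illustrative):

```swift
import AVFoundation
import UIKit

// Rear-camera preview plus plain record-to-file using the recording delegate.
final class SimpleRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func configure(previewIn view: UIView) {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input),
              session.canAddOutput(movieOutput) else {
            session.commitConfiguration()
            return
        }
        session.addInput(input)
        session.addOutput(movieOutput)
        session.commitConfiguration()

        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        session.startRunning()
    }

    // `outputURL` is a placeholder for a writable temporary file.
    func record(to outputURL: URL) {
        movieOutput.startRecording(to: outputURL, recordingDelegate: self)
    }

    func stop() {
        movieOutput.stopRecording()
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // The finished clip is at outputFileURL; save it or hand it off for processing here.
    }
}
```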

Multiple Videos, Same Screen

Is it possible to play more than one video at the same time / on the same screen with iOS?
(E.g. Picture-in-Picture mode or split screen.)
A step-by-step guide with full source code on how to do this: http://iosguy.com/2012/01/11/multiple-video-playback-on-ios/
You can't do this using MPMoviePlayerController. The documentation states that clearly:
Although you can create multiple MPMoviePlayerController objects and present their views in your interface, only one movie player at a time can play its movie.
But I think you can do this with the lower-level AVFoundation framework, though I've never tried it myself. See this.
To update the answers to today's SDKs: with the SDK for iOS 5 (and probably back to iOS 4) you can use AVFoundation to play video in an AVPlayerLayer and add a number of such layers to a view. I've confirmed that three separate WQVGA H.264 video streams can play on the same view at the same time under both iOS 4.3.5 on an iPhone 3GS and iOS 5.0 on an iPhone 4S.
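
A small Swift sketch of that AVPlayerLayer approach, stacking one player layer per clip on a single view (the helper function and its parameters are illustrative):

```swift
import AVFoundation
import UIKit

// Lays out one AVPlayerLayer per clip; each layer has its own AVPlayer.
// `urls` is a placeholder array of local or streaming video URLs.
func showVideosStacked(urls: [URL], in view: UIView) -> [AVPlayer] {
    var players: [AVPlayer] = []
    let height = view.bounds.height / CGFloat(max(urls.count, 1))
    for (index, url) in urls.enumerated() {
        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = CGRect(x: 0, y: CGFloat(index) * height,
                             width: view.bounds.width, height: height)
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        player.play()
        players.append(player)
    }
    return players
}
```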
