How to make an AVFoundation camera preview stretch and rotate to landscape on a UIView (iOS)

I have a camera input from AVFoundation; how can I stretch and rotate it to fill the UIView in landscape?
I took the LiveStreamView class code from the documentation to associate the preview layer with a UIView.
I would love to understand how to do this, thanks.
My code:
import Foundation
import AVFoundation
import UIKit

class AVFoundationHandler {
    let captureSession = AVCaptureSession()

    init() {
        configure()
    }

    func configure() {
        let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video, position: .back)
        guard let videoDevice = videoDevice,
              let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
              captureSession.canAddInput(videoDeviceInput)
        else { return }
        captureSession.addInput(videoDeviceInput)
    }
}
class LiveStreamView: UIView {
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    /// Convenience wrapper to get layer as its statically known type.
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }
}

To begin with:
Get the device orientation with UIDevice.current.orientation.
Set the AVCaptureVideoPreviewLayer.connection?.videoOrientation property:
let deviceOrientation = UIDevice.current.orientation
let videoOrientation = AVCaptureVideoOrientation(rawValue: deviceOrientation.rawValue)
videoPreviewLayer.connection?.videoOrientation = videoOrientation ?? .portrait
Note that this must run on the main thread, or it will crash.
I also prefer to set videoPreviewLayer.videoGravity = .resizeAspectFill in the same place (the default is .resizeAspect).
The main complication is that although UIDeviceOrientation and AVCaptureVideoOrientation have some very similar values, they are not the same: UIDeviceOrientation has extra cases (faceUp, faceDown, unknown), and the two landscape cases are mirrored between the enums. I just fall back to portrait in the extra cases, but you should do whatever is appropriate for your app.
After that you need to watch for device orientation changes (by overriding viewWillTransition(to:with:) on the view controller) and update the same value again when the device is rotated, as in the sketch below.
This solves the preview problem. If you are processing the video, or capturing images from the camera, that's a whole other set of things you need to do, but that's a separate problem.
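Putting it together, a minimal sketch of a view controller that owns the LiveStreamView from the question (the helper name updatePreviewOrientation and the explicit orientation mapping are mine, not from the original answer):

import AVFoundation
import UIKit

extension AVCaptureVideoOrientation {
    // Explicit mapping instead of relying on matching raw values.
    // Note the landscape cases are mirrored between the two enums.
    init?(deviceOrientation: UIDeviceOrientation) {
        switch deviceOrientation {
        case .portrait: self = .portrait
        case .portraitUpsideDown: self = .portraitUpsideDown
        case .landscapeLeft: self = .landscapeRight
        case .landscapeRight: self = .landscapeLeft
        default: return nil // faceUp, faceDown, unknown
        }
    }
}

class CameraViewController: UIViewController {
    @IBOutlet var previewView: LiveStreamView!

    // Must be called on the main thread.
    func updatePreviewOrientation() {
        let videoOrientation = AVCaptureVideoOrientation(deviceOrientation: UIDevice.current.orientation)
        previewView.videoPreviewLayer.videoGravity = .resizeAspectFill
        previewView.videoPreviewLayer.connection?.videoOrientation = videoOrientation ?? .portrait
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        updatePreviewOrientation()
    }

    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        // Re-apply the orientation once the rotation animation has settled.
        coordinator.animate(alongsideTransition: nil) { _ in
            self.updatePreviewOrientation()
        }
    }
}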

Related

Looping an iOS live photo programmatically in SwiftUI

I'd like to be able to loop a live photo, for continuous playback.
So far, I'm trying to use the PHLivePhotoViewDelegate to accomplish this.
import Foundation
import SwiftUI
import PhotosUI
import iOSShared

struct LiveImageView: UIViewRepresentable {
    let view: PHLivePhotoView
    let model: LiveImageViewModel?
    let delegate = LiveImageLargeMediaDelegate()

    init(fileGroupUUID: UUID) {
        let view = PHLivePhotoView()

        // Without this, in landscape mode, I don't get proper scaling of the image.
        view.contentMode = .scaleAspectFit
        self.view = view

        // Using this to replay the live image repeatedly.
        view.delegate = delegate

        model = LiveImageViewModel(fileGroupUUID: fileGroupUUID)
        guard let model = model else {
            return
        }

        model.getLivePhoto(previewImage: nil) { livePhoto in
            view.livePhoto = livePhoto
        }
    }

    func makeUIView(context: Context) -> UIView {
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        guard let model = model else {
            return
        }
        guard !model.started else {
            return
        }
        model.started = true
        view.startPlayback(with: .full)
    }
}

class LiveImageLargeMediaDelegate: NSObject, PHLivePhotoViewDelegate {
    func livePhotoView(_ livePhotoView: PHLivePhotoView, didEndPlaybackWith playbackStyle: PHLivePhotoViewPlaybackStyle) {
        livePhotoView.stopPlayback()
        DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(200)) {
            livePhotoView.startPlayback(with: .full)
        }
    }
}
But I haven't had full success. The audio seems to play again, but not the video. The livePhotoView.stopPlayback call and the async delay are just additional changes I was trying; I've tried it without those too.
Note that I don't want the user to have to manually change the live photo (e.g., see NSPredicate to not include Loop and Bounce Live Photos).
Thoughts?
@ChrisPrince I tried your code and it works fine for me. I just added the delegate and started playback inside of it, and everything runs smoothly. I don't think there is any point in calling stopPlayback, because the delegate method itself tells you that playback has already ended.
func livePhotoView(_ livePhotoView: PHLivePhotoView, didEndPlaybackWith playbackStyle: PHLivePhotoViewPlaybackStyle) {
    livePhotoView.startPlayback(with: .full)
}
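For completeness, a minimal sketch of wiring that delegate up outside of SwiftUI (the factory function name is mine; it assumes you already have a PHLivePhoto and the LiveImageLargeMediaDelegate class from the question):

import PhotosUI
import UIKit

// Keep a strong reference to the delegate; PHLivePhotoView.delegate is weak.
let loopingDelegate = LiveImageLargeMediaDelegate()

func makeLoopingLivePhotoView(with livePhoto: PHLivePhoto) -> PHLivePhotoView {
    let photoView = PHLivePhotoView()
    photoView.contentMode = .scaleAspectFit
    photoView.delegate = loopingDelegate     // the delegate restarts playback whenever it ends
    photoView.livePhoto = livePhoto
    photoView.startPlayback(with: .full)     // kick off the first loop
    return photoView
}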

SwiftUI - Determining Current Device and Orientation

I am trying to detect when the device is on iPad and in Portrait.
Currently I use the UIDevice API in UIKit and use an environment object to watch changes. I use the solution found here - Determining Current Device and Orientation.
However, orientationInfo.orientation is initially always .portrait, even when the device is in landscape, until it is rotated to portrait and back to landscape.
So when doing the following to display the FABView:
struct HomeView: View {
    @EnvironmentObject var orientationInfo: OrientationInfo
    let isPhone = UIDevice.current.userInterfaceIdiom == .phone

    var body: some View {
        ZStack(alignment: .bottom) {
            #if os(iOS)
            if isPhone == false && orientationInfo.orientation == .portrait {
                FABView()
            }
            #endif
        }
    }
}
The view is shown when the iPad initially launches in landscape, but after rotating to portrait and back to landscape it is removed. Why is this happening, and how can I make sure the view isn't shown on first load?
Full Code
struct HomeTab: View {
    var body: some View {
        NavigationView {
            HomeView()
                .environmentObject(OrientationInfo())
        }
    }
}

struct HomeView: View {
    @EnvironmentObject var orientationInfo: OrientationInfo
    let isPhone = UIDevice.current.userInterfaceIdiom == .phone

    var body: some View {
        ZStack(alignment: .bottom) {
            #if os(iOS)
            if isPhone == false && orientationInfo.orientation == .portrait {
                FABView()
            }
            #endif
        }
    }
}

final class OrientationInfo: ObservableObject {
    enum Orientation {
        case portrait
        case landscape
    }

    @Published var orientation: Orientation

    private var _observer: NSObjectProtocol?

    init() {
        // fairly arbitrary starting value for 'flat' orientations
        if UIDevice.current.orientation.isLandscape {
            self.orientation = .landscape
        }
        else {
            self.orientation = .portrait
        }

        // unowned self because we unregister before self becomes invalid
        _observer = NotificationCenter.default.addObserver(forName: UIDevice.orientationDidChangeNotification, object: nil, queue: nil) { [unowned self] note in
            guard let device = note.object as? UIDevice else {
                return
            }
            if device.orientation.isPortrait {
                self.orientation = .portrait
            }
            else if device.orientation.isLandscape {
                self.orientation = .landscape
            }
        }
    }

    deinit {
        if let observer = _observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}
You can use UIDevice.orientationDidChangeNotification for detecting orientation changes but you shouldn't rely on it when the app starts.
UIDevice.current.orientation.isValidInterfaceOrientation will be false at the beginning and therefore both
UIDevice.current.orientation.isLandscape
and
UIDevice.current.orientation.isPortrait
will return false.
Instead you can use interfaceOrientation from the first window scene:
struct ContentView: View {
    @State private var isPortrait = false

    var body: some View {
        Text("isPortrait: \(String(isPortrait))")
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                guard let scene = UIApplication.shared.windows.first?.windowScene else { return }
                self.isPortrait = scene.interfaceOrientation.isPortrait
            }
    }
}
Also note that device orientation is not the same as interface orientation. When the device is held upside down, the device orientation is portraitUpsideDown, but the interface orientation can still be portrait, or even landscape, depending on what your app supports.
I think it's better to rely on the interface orientation in your case.
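Since the notification only fires on actual changes, you may also want to read the interface orientation once when the view first appears. A small sketch of that, assuming the same window-scene approach as above (the view name is mine):

import SwiftUI
import UIKit

struct OrientationAwareView: View {
    @State private var isPortrait = false

    var body: some View {
        Text("isPortrait: \(String(isPortrait))")
            .onAppear { updateOrientation() }   // get the correct value on first load
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                updateOrientation()
            }
    }

    private func updateOrientation() {
        guard let scene = UIApplication.shared.windows.first?.windowScene else { return }
        isPortrait = scene.interfaceOrientation.isPortrait
    }
}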
I would like to add another detail that can help in some cases:
You can inspect the orientation by raw value:
UIDevice.current.orientation.rawValue
portrait = 1
portraitUpsideDown = 2
landscapeLeft = 3 (top of the device points left)
landscapeRight = 4 (top of the device points right)
What is often forgotten is that there are also the flat orientations. This can be an issue when you test your app on a physical device:
faceUp = 5 (device lying on its back)
faceDown = 6 (device lying on its front/screen)
This matters when you use the UIDevice.orientationDidChangeNotification notification, because you will get a notification every time the user moves to or from a flat orientation, but you probably only expect to be notified when it goes from portrait to landscape and vice versa.
Solutions to this problem:
You can check .isValidInterfaceOrientation, which is true only for the portrait and landscape orientations (a sketch of this variant appears after the code below):
UIDevice.current.orientation.isValidInterfaceOrientation
Or you can update the value only when the rawValue is in 1...4:
struct ContentView: View {
    @State private var isPortrait = false

    var body: some View {
        Text("isPortrait: \(String(isPortrait))")
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                if UIDevice.current.orientation.rawValue <= 4 { // run only on portrait and landscape changes
                    guard let scene = UIApplication.shared.windows.first?.windowScene else { return }
                    self.isPortrait = scene.interfaceOrientation.isPortrait
                }
            }
    }
}
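And a sketch of the first solution, using isValidInterfaceOrientation instead of the rawValue check (same assumptions as the code above):

import SwiftUI
import UIKit

struct ContentView: View {
    @State private var isPortrait = false

    var body: some View {
        Text("isPortrait: \(String(isPortrait))")
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                // Ignore faceUp, faceDown and unknown so flat changes don't overwrite the value.
                guard UIDevice.current.orientation.isValidInterfaceOrientation else { return }
                guard let scene = UIApplication.shared.windows.first?.windowScene else { return }
                self.isPortrait = scene.interfaceOrientation.isPortrait
            }
    }
}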

Alternative to animation(forKey:) (now deprecated)?

I am working with iOS 11 (for ARKit), and while many people point to Apple's SceneKit sample app with a fox, I am having a problem with the extension that sample project uses in one of its files to add animations:
extension CAAnimation {
    class func animationWithSceneNamed(_ name: String) -> CAAnimation? {
        var animation: CAAnimation?
        if let scene = SCNScene(named: name) {
            scene.rootNode.enumerateChildNodes({ (child, stop) in
                if child.animationKeys.count > 0 {
                    animation = child.animation(forKey: child.animationKeys.first!)
                    stop.initialize(to: true)
                }
            })
        }
        return animation
    }
}
This extension seems very handy, but I am not sure how to migrate it now that animation(forKey:) is deprecated. Is this built into SceneKit by default now?
The documentation doesn't really show much info on why it was deprecated or where to go from here.
Thanks
TL;DR: examples of how to use new APIs can be found in Apple's sample game (search for SCNAnimationPlayer)
Even though animation(forKey:) and its sister methods that work with CAAnimation have been deprecated in iOS 11, you can continue using them – everything will work.
But if you want to use the new APIs and don't care about backwards compatibility (which you wouldn't need in the case of ARKit anyway, because it's only available since iOS 11), read on.
The newly introduced SCNAnimationPlayer provides a more convenient API compared to its predecessors. It is now easier to work with animations in real time.
This video from WWDC2017 would be a good starting point to learn about it.
As a quick summary: SCNAnimationPlayer allows you to change animation's speed on the fly. It provides a more intuitive interface for animation playback using methods such as play() and stop() compared to adding and removing CAAnimations.
You also can blend two animations together which, for example, can be used to make smooth transitions between them.
You can find examples of how to use all of this in the Fox 2 game by Apple.
Here's the extension you've posted adapted to use SCNAnimationPlayer (which you can find in the Character class in the Fox 2 sample project):
extension SCNAnimationPlayer {
    class func loadAnimation(fromSceneNamed sceneName: String) -> SCNAnimationPlayer {
        let scene = SCNScene(named: sceneName)!

        // find top level animation
        var animationPlayer: SCNAnimationPlayer! = nil
        scene.rootNode.enumerateChildNodes { (child, stop) in
            if !child.animationKeys.isEmpty {
                animationPlayer = child.animationPlayer(forKey: child.animationKeys[0])
                stop.pointee = true
            }
        }
        return animationPlayer
    }
}
You can use it as follows:
Load the animation and add it to the corresponding node
let jumpAnimation = SCNAnimationPlayer.loadAnimation(fromSceneNamed: "jump.scn")
jumpAnimation.stop() // stop it for now so that we can use it later when it's appropriate
model.addAnimationPlayer(jumpAnimation, forKey: "jump")
Use it!
model.animationPlayer(forKey: "jump")?.play()
Lësha Turkowski's answer without force unwraps.
extension SCNAnimationPlayer {
    class func loadAnimationPlayer(from sceneName: String) -> SCNAnimationPlayer? {
        var animationPlayer: SCNAnimationPlayer?
        if let scene = SCNScene(named: sceneName) {
            scene.rootNode.enumerateChildNodes { (child, stop) in
                if !child.animationKeys.isEmpty {
                    animationPlayer = child.animationPlayer(forKey: child.animationKeys[0])
                    stop.pointee = true
                }
            }
        }
        return animationPlayer
    }
}
Here's an example of SwiftUI and SceneKit
import SwiftUI
import SceneKit

struct ScenekitView: UIViewRepresentable {
    @Binding var isPlayingAnimation: Bool

    let scene = SCNScene(named: "art.scnassets/TestScene.scn")!

    func makeUIView(context: Context) -> SCNView {
        // create and add a camera to the scene
        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        scene.rootNode.addChildNode(cameraNode)

        let scnView = SCNView()
        return scnView
    }

    func updateUIView(_ scnView: SCNView, context: Context) {
        scnView.scene = scene

        // allows the user to manipulate the camera
        scnView.allowsCameraControl = true

        controlAnimation(isAnimating: isPlayingAnimation, nodeName: "TestNode", animationName: "TestAnimationName")
    }

    func controlAnimation(isAnimating: Bool, nodeName: String, animationName: String) {
        guard let node = scene.rootNode.childNode(withName: nodeName, recursively: true) else { return }
        guard let animationPlayer: SCNAnimationPlayer = node.animationPlayer(forKey: animationName) else { return }
        if isAnimating {
            print("Play Animation")
            animationPlayer.play()
        } else {
            print("Stop Animation")
            animationPlayer.stop()
        }
    }
}

struct DogAnimation_Previews: PreviewProvider {
    static var previews: some View {
        ScenekitView(isPlayingAnimation: .constant(true))
    }
}
A 2023 example.
I load typical animations like this:
func simpleLoadAnim(filename: String) -> SCNAnimationPlayer {
    let s = SCNScene(named: filename)!
    let n = s.rootNode.childNodes.filter({ !$0.animationKeys.isEmpty }).first!
    return n.animationPlayer(forKey: n.animationKeys.first!)!
}
So,
laugh = simpleLoadAnim(filename: "animeLaugh") // animeLaugh.dae
giggle = simpleLoadAnim(filename: "animeGiggle")
You then, step one, have to add them to the character:
sally.addAnimationPlayer(laugh, forKey: "laugh")
sally.addAnimationPlayer(giggle, forKey: "giggle")
Very typically you would have only one playing at a time, so set the blend weights (step two):
laugh.blendFactor = 1
giggle.blendFactor = 0
To play or stop an SCNAnimationPlayer, it's just (step three):
laugh.play()
giggle.stop()
Almost certainly, you will have hundreds of lines of your own code to blend between animations (a blend might take only a short time, 0.1 seconds, or as long as a second or so). To do so, you would use SCNAction.customAction, as in the sketch below.
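For illustration, a rough sketch of such a blend using the sally/laugh/giggle setup above (the linear curve, the duration, and the crossfade function itself are my own choices, not a standard API):

import SceneKit

// Crossfade between two SCNAnimationPlayers that have both been added to `node`.
func crossfade(on node: SCNNode, from: SCNAnimationPlayer, to: SCNAnimationPlayer, duration: TimeInterval = 0.3) {
    to.play()
    let fade = SCNAction.customAction(duration: duration) { _, elapsed in
        let t = elapsed / CGFloat(duration)   // ramps 0 -> 1 over the action's duration
        from.blendFactor = 1 - t
        to.blendFactor = t
    }
    node.runAction(fade) {
        from.stop()
    }
}

// e.g. crossfade(on: sally, from: laugh, to: giggle)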
If you prefer, you can access the animations on the character (sally) using the keys, but really the whole point is that you can just start, stop, and otherwise drive the SCNAnimationPlayer directly.
You will also have a fair amount of code to set up each SCNAnimationPlayer the way you like (speed, looping, mirroring, and so on).
You will need this very important answer to get Collada files (they must be separate character/animation files) working properly: https://stackoverflow.com/a/75093081/294884
Once you have the various SCNAnimationPlayer animations working properly, it is quite easy to use, run, and blend them.
The essential sequence is:
each animation must be in its own .dae file
load each animation file into an SCNAnimationPlayer
"add" all the animations to the character in question
program the blends
then simply play() or stop() the actual SCNAnimationPlayer items (don't bother using the keys on the character; it's a bit pointless)

How to get a display of camera without UI elements?

How do I remove all UI elements from the camera? I need to get a minimalistic camera display, as in the second screenshot.
One way to do it is to use UIImagePickerController, which is probably the easiest way to take a photo with the camera. That class has a showsCameraControls property that you can set to NO if you don't want the usual set of controls. If you do that, though, you'll have to set the cameraOverlayView property to a view that you supply. Normally, that view would contain your own set of camera controls, but you could instead pass in an empty view. You'll want to set up the view so that it responds to the user's gestures, so that they can still take the photo (with a tap, perhaps) or dismiss the camera without taking a photo (you could use a swipe for that).
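A minimal sketch of that UIImagePickerController approach (presented once from viewDidAppear; the gesture handling and overlay contents are left out):

import UIKit

class BareCameraViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    private var presentedPicker = false

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard !presentedPicker, UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        presentedPicker = true

        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.showsCameraControls = false                     // hide the standard camera UI
        picker.cameraOverlayView = UIView(frame: view.bounds)  // supply your own (here empty) overlay for gestures
        picker.delegate = self
        present(picker, animated: false)
    }
}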
Another approach is to build your own preview with AVFoundation and an AVCaptureVideoPreviewLayer:
class Scanner: NSObject {

    // MARK: - Private Properties
    private var captureSession: AVCaptureSession?
    private(set) var videoLayer: AVCaptureVideoPreviewLayer?

    // MARK: - Initialization
    override init() {
        super.init()
        captureSession = AVCaptureSession()

        if let captureSession = captureSession,
           let device = AVCaptureDevice.default(for: .video) {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                captureSession.addInput(input)

                let output = AVCaptureMetadataOutput()
                captureSession.addOutput(output)
            } catch {
                print(error)
                abort()
            }

            videoLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            videoLayer?.videoGravity = .resizeAspectFill
        }
    }

    func startSession() {
        captureSession?.startRunning()
    }
}
In your UIViewController you set it up like this:
let scanner = Scanner()
if let videoLayer = scanner.videoLayer {
    videoLayer.frame = self.view.bounds
    self.view.layer.addSublayer(videoLayer)
    scanner.startSession()
}

AVPlayer Swift: How do I hide controls and disable landscape view?

Since this is my first post, just a few words about me: usually I design stuff (UI primarily), but I really wanted to take a leap into programming to understand you guys better, so I decided to build a small app to get going.
I've been trying to figure this out for hours now – this is my first app project ever, so I apologise for my newbie questions.
All I want to do is hide the controls of AVPlayer and disable landscape view, but I just can't figure out where to put showsPlaybackControls = false.
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
    private var firstAppear = true

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        if firstAppear {
            do {
                try playVideo()
                firstAppear = false
            } catch AppError.InvalidResource(let name, let type) {
                debugPrint("Could not find resource \(name).\(type)")
            } catch {
                debugPrint("Generic error")
            }
        }
    }

    private func playVideo() throws {
        guard let path = NSBundle.mainBundle().pathForResource("video", ofType: "mp4") else {
            throw AppError.InvalidResource("video", "m4v")
        }
        let player = AVPlayer(URL: NSURL(fileURLWithPath: path))
        let playerController = AVPlayerViewController()
        playerController.player = player
        self.presentViewController(playerController, animated: false) {
            player.play()
        }
    }
}

enum AppError: ErrorType {
    case InvalidResource(String, String)
}
showsPlaybackControls is a property of AVPlayerViewController, so you can set it after you create it:
playerController.showsPlaybackControls = false
If you want to disable landscape and allow only portrait, you can subclass AVPlayerViewController, override supportedInterfaceOrientations to return only portrait, and use that class instead of AVPlayerViewController.
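A sketch of that subclass, using modern Swift naming (but see the update below about Apple's guidance on subclassing):

import AVKit
import UIKit

class PortraitPlayerViewController: AVPlayerViewController {
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .portrait   // no landscape for this screen
    }
}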
UPDATE
As mentioned in the comments, if you go to the documentation for AVPlayerViewController you'll find a warning that says:
Do not subclass AVPlayerViewController. Overriding this class’s methods is unsupported and results in undefined behavior.
They probably don't want you to override playback-related methods that could interfere with the behavior, and I would say that overriding only supportedInterfaceOrientations is safe.
However, if you want an alternative solution, you can create your own container view controller that overrides supportedInterfaceOrientations to return portrait only, and place the AVPlayerViewController inside that view controller, as sketched below.
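A sketch of that container approach, again with modern Swift naming and with the controls hidden as well (the class and property names are mine):

import AVKit
import UIKit

class PlayerContainerViewController: UIViewController {
    let playerController = AVPlayerViewController()

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .portrait   // disable landscape for this screen
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        playerController.showsPlaybackControls = false

        // Embed the player view controller as a child.
        addChild(playerController)
        playerController.view.frame = view.bounds
        view.addSubview(playerController.view)
        playerController.didMove(toParent: self)
    }
}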
