How do I remove all UI elements from the camera? I need to get a minimalistic camera display like the one in the second screenshot.
One way to do it is to use UIImagePickerController, which is probably the easiest way to take a photo with the camera. That class has a showsCameraControls property that you can set to NO if you don't want the usual set of controls. If you do that, though, you'll have to set the cameraOverlayView property to a view that you supply. Normally that view would contain your own set of camera controls, but you could instead pass in an empty view. You'll want to set up the view so that it responds to the user's gestures, so they can still take the photo (with a tap, perhaps) or dismiss the camera without taking one (a swipe would work for that).
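For instance, a minimal sketch in current Swift; the gesture wiring and the overlay styling here are placeholder assumptions, not a definitive implementation:

import UIKit

class MinimalCameraViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentBareCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }

        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        picker.showsCameraControls = false   // hide the default controls

        // Supply an empty overlay that just forwards gestures.
        let overlay = UIView(frame: picker.view.bounds)
        overlay.backgroundColor = .clear
        overlay.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(takePhoto)))
        overlay.addGestureRecognizer(UISwipeGestureRecognizer(target: self, action: #selector(dismissCamera)))
        picker.cameraOverlayView = overlay

        present(picker, animated: true)
    }

    @objc private func takePhoto() {
        (presentedViewController as? UIImagePickerController)?.takePicture()
    }

    @objc private func dismissCamera() {
        dismiss(animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // Handle the captured image here.
        dismiss(animated: true)
    }
}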
Another option is to drop UIImagePickerController entirely and build a bare preview with AVFoundation:

import AVFoundation

class Scanner: NSObject {

    // MARK: - Private Properties
    private var captureSession: AVCaptureSession?
    private(set) var videoLayer: AVCaptureVideoPreviewLayer?

    // MARK: - Initialization
    override init() {
        super.init()
        let captureSession = AVCaptureSession()
        self.captureSession = captureSession

        // Wire the default camera into the session, along with a metadata output.
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        do {
            let input = try AVCaptureDeviceInput(device: device)
            captureSession.addInput(input)
            let output = AVCaptureMetadataOutput()
            captureSession.addOutput(output)
        } catch {
            print(error)
            return
        }

        // A bare preview layer: no controls, just the camera feed.
        let layer = AVCaptureVideoPreviewLayer(session: captureSession)
        layer.videoGravity = .resizeAspectFill
        videoLayer = layer
    }

    func startSession() {
        // startRunning() blocks, so consider calling this from a background queue.
        captureSession?.startRunning()
    }
}
In your UIViewController you set it up like this:
let scanner = Scanner()
if let videoLayer = scanner.videoLayer {
    videoLayer.frame = self.view.bounds
    self.view.layer.addSublayer(videoLayer)
    scanner.startSession()
}
I have a camera input from AVFoundation; how can I stretch and rotate it to fill the UIView?
I took the LiveStreamView class code from the documentation to associate the preview layer with a UIView.
I would love to understand how to do it, thanks.
It looks like this:
My code:
import Foundation
import AVFoundation
import UIKit

class AVFoundationHandler {
    let captureSession = AVCaptureSession()

    init() {
        configure()
    }

    func configure() {
        // Guard the device lookup rather than force-unwrapping it.
        guard let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                        for: .video, position: .back),
              let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
              captureSession.canAddInput(videoDeviceInput)
        else { return }
        captureSession.addInput(videoDeviceInput)
    }
}
class LiveStreamView: UIView {
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    /// Convenience wrapper to get layer as its statically known type.
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }
}
To begin with:
Get the device orientation from UIDevice.current.orientation.
Set the AVCaptureVideoPreviewLayer.connection?.videoOrientation property:
let deviceOrientation = UIDevice.current.orientation
let videoOrientation = AVCaptureVideoOrientation(rawValue: deviceOrientation.rawValue)
videoPreviewLayer.connection?.videoOrientation = videoOrientation ?? .portrait
Note that this must run on the main thread, or it will crash.
I also prefer to set videoPreviewLayer.videoGravity = .resizeAspectFill in the same place (the default is .resizeAspect).
The main complication is that although UIDeviceOrientation and AVCaptureVideoOrientation have some very similar values, they are not the same, and UIDeviceOrientation has more values (faceUp, faceDown, unknown). I just used portrait as the default in those cases, but you should do whatever is appropriate for your app.
After that you need to watch for device orientation changes (by overriding viewWillTransition(to:with:) of the UIViewController, not the UIView) and update the same value again if the device was rotated; see the sketch below.
This solves the view problem. If you are processing the video or capturing images from the camera, that is a whole other set of things you need to do, but it's a separate problem.
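Putting it together, a minimal sketch of the rotation handling, assuming the LiveStreamView above is the view controller's root view (the session setup and wiring are omitted):

import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    // Assumes this controller's root view is the LiveStreamView shown above.
    var liveStreamView: LiveStreamView { return view as! LiveStreamView }

    private func updateVideoOrientation() {
        // UIDeviceOrientation has extra cases (faceUp, faceDown, unknown) with no
        // AVCaptureVideoOrientation counterpart, so fall back to portrait.
        let deviceOrientation = UIDevice.current.orientation
        let videoOrientation = AVCaptureVideoOrientation(rawValue: deviceOrientation.rawValue)
        liveStreamView.videoPreviewLayer.connection?.videoOrientation = videoOrientation ?? .portrait
        liveStreamView.videoPreviewLayer.videoGravity = .resizeAspectFill
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        updateVideoOrientation()
    }

    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        // Update after the rotation completes so the new orientation is in effect.
        coordinator.animate(alongsideTransition: nil) { _ in
            self.updateVideoOrientation()
        }
    }
}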
I'd like to be able to loop a Live Photo, for continuous playback.
So far, I'm trying to use the PHLivePhotoViewDelegate to accomplish this.
import Foundation
import SwiftUI
import PhotosUI
import iOSShared

struct LiveImageView: UIViewRepresentable {
    let view: PHLivePhotoView
    let model: LiveImageViewModel?
    let delegate = LiveImageLargeMediaDelegate()

    init(fileGroupUUID: UUID) {
        let view = PHLivePhotoView()
        // Without this, in landscape mode, I don't get proper scaling of the image.
        view.contentMode = .scaleAspectFit
        self.view = view

        // Using this to replay the Live Photo repeatedly.
        view.delegate = delegate

        model = LiveImageViewModel(fileGroupUUID: fileGroupUUID)
        guard let model = model else {
            return
        }

        model.getLivePhoto(previewImage: nil) { livePhoto in
            view.livePhoto = livePhoto
        }
    }

    func makeUIView(context: Context) -> UIView {
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        guard let model = model else {
            return
        }
        guard !model.started else {
            return
        }
        model.started = true
        view.startPlayback(with: .full)
    }
}

class LiveImageLargeMediaDelegate: NSObject, PHLivePhotoViewDelegate {
    func livePhotoView(_ livePhotoView: PHLivePhotoView, didEndPlaybackWith playbackStyle: PHLivePhotoViewPlaybackStyle) {
        livePhotoView.stopPlayback()
        DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(200)) {
            livePhotoView.startPlayback(with: .full)
        }
    }
}
But without full success: it seems the audio does play again, but not the video. The livePhotoView.stopPlayback call and the async delay are just additional changes I was trying; I've tried it without those too.
Note that I don't want the user to have to manually change the live photo (e.g., see NSPredicate to not include Loop and Bounce Live Photos).
Thoughts?
ChrisPrince, I tried your code and it works fine for me; I just set the delegate and start playback inside it, and everything runs smoothly. I think there is no point in calling stopPlayback, because the delegate method itself says that playback has already ended.
func livePhotoView(_ livePhotoView: PHLivePhotoView, didEndPlaybackWith playbackStyle: PHLivePhotoViewPlaybackStyle) {
    livePhotoView.startPlayback(with: .full)
}
I am using the Twilio Video QuickStart for Swift. I was able to make a video call using the example, but I couldn't make it work when I bundled all of it into one class and used it in my app's view controller.
Here is what I did:
I've put all the Twilio code in one class, VideoCall, as shown below, so that I could use Twilio Video in whichever controller I need by just calling VideoCall(myViewController: self), as shown below.
While the code below creates both views (videoView & subVideoView) and renders local video in subVideoView as intended, it comes up with an error when an incoming call is received:
TwilioVideo:[Platform]:Fatal runtime capture error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
However, I don't receive this error when I put all the variables and functions into the mainViewController class instead of a separate VideoCall class, but that means I would be duplicating the code if I need to use Twilio Video in other view controllers.
More info from the Twilio docs on the TVIVideoView class, which say:
A TVIVideoView draws video frames from a TVIVideoTrack.
TVIVideoView should only be used on the application's main thread.
Subclassing TVIVideoView is not supported.
UIViewContentModeScaleToFill, UIViewContentModeScaleAspectFill and
UIViewContentModeScaleAspectFit are the only supported content modes.
Please advise how I could get around this issue.
class mainViewController: UIViewController {
    // Declare the variable in the view controller
    var videoCall: VideoCall!

    // And create an instance
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        videoCall = VideoCall(myViewController: self)
    }
}
class VideoCall: NSObject {
    // Declared variables for Twilio Video
    var subVideoView: TVIVideoView!
    var videoView: TVIVideoView!
    var viewController: UIViewController
    var accessToken = "TWILIO_ACCESS_TOKEN"

    // Configure remote URL to fetch token from
    var tokenUrl = "https://testapp.com/getTwilioVideoAccessToken"

    // Video SDK components
    var room: TVIRoom?
    var camera: TVICameraCapturer?
    var localVideoTrack: TVILocalVideoTrack?
    var localAudioTrack: TVILocalAudioTrack?
    var participant: TVIParticipant?
    var remoteView: TVIVideoView?

    // CallKit components
    let callKitProvider: CXProvider
    let callKitCallController: CXCallController
    var callKitCompletionHandler: ((Bool) -> Swift.Void?)? = nil
    var callConfiguration: CXProviderConfiguration

    init(myViewController: UIViewController) {
        viewController = myViewController

        // Initialize the stored property (a local `let` here would shadow it
        // and leave the property uninitialized).
        callConfiguration = CXProviderConfiguration(localizedName: "Video Call")
        callConfiguration.maximumCallGroups = 1
        callConfiguration.maximumCallsPerCallGroup = 1
        callConfiguration.supportsVideo = true
        print("configuration: \(callConfiguration)")
        if let callKitIcon = UIImage(named: "iconMask80") {
            callConfiguration.iconTemplateImageData = UIImagePNGRepresentation(callKitIcon)
        }

        callKitProvider = CXProvider(configuration: callConfiguration)
        callKitCallController = CXCallController()

        super.init()

        subVideoView = TVIVideoView(frame: CGRect.zero, delegate: self)
        videoView = TVIVideoView(frame: CGRect.zero, delegate: self)
        callKitProvider.setDelegate(self, queue: nil)
        setup()
    }

    deinit {
        // CallKit has an odd API contract where the developer must call invalidate or the CXProvider is leaked.
        callKitProvider.invalidate()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func setup() {
        // The following is done here:
        // a. videoView and subVideoView are made subviews of the window, so that they are overlaid on top of the viewController
        // b. the size and position of videoView and subVideoView are set
    }
}
extension VideoCall {
    /*
     Has all the CallKit methods and Twilio methods, same as the example app but with these changes:
     a. the remoteView in the example app is replaced with videoView and created on initialization, unlike the Twilio Video example where it is created when connecting to the room and cleared after disconnecting.
     b. previewView is replaced with subVideoView.
     */
}
Since this is my first post, just a few words about me: usually I design stuff (UI primarily), but I really wanted to take a leap into programming to understand you guys better. So I decided to build a small app to get going.
I've been trying to figure this out for hours now; this is my first app project ever, so I apologize for my newbieness.
All I want to do is hide the controls of AVPlayer and disable landscape view, but I just can't figure out where to put showsPlaybackControls = false.
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
    private var firstAppear = true

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        if firstAppear {
            do {
                try playVideo()
                firstAppear = false
            } catch AppError.InvalidResource(let name, let type) {
                debugPrint("Could not find resource \(name).\(type)")
            } catch {
                debugPrint("Generic error")
            }
        }
    }

    private func playVideo() throws {
        guard let path = NSBundle.mainBundle().pathForResource("video", ofType: "mp4") else {
            throw AppError.InvalidResource("video", "mp4")
        }
        let player = AVPlayer(URL: NSURL(fileURLWithPath: path))
        let playerController = AVPlayerViewController()
        playerController.player = player
        self.presentViewController(playerController, animated: false) {
            player.play()
        }
    }
}

enum AppError: ErrorType {
    case InvalidResource(String, String)
}
showsPlaybackControls is a property of AVPlayerViewController, so you can set it after you create it:
playerController.showsPlaybackControls = false
If you want to allow only landscape, you can subclass AVPlayerViewController, override supportedInterfaceOrientations to return only landscape, and then use that subclass instead of AVPlayerViewController.
UPDATE
As mentioned in the comments, if you go to the documentation for AVPlayerViewController you'll find a warning that says:
Do not subclass AVPlayerViewController. Overriding this class’s methods is unsupported and results in undefined behavior.
They probably don't want you to override playback-related methods where you could interfere with the behavior, and I would say that overriding only supportedInterfaceOrientations is safe.
However, if you want an alternative solution, you can create your own view controller class that overrides supportedInterfaceOrientations to return landscape only, and place the AVPlayerViewController inside that view controller as a child; a sketch follows.
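A minimal sketch of that container approach, in current Swift syntax (the snippets above use older Swift); the bundled "video.mp4" resource is illustrative:

import UIKit
import AVKit

// A container that pins the orientation while hosting an untouched AVPlayerViewController.
class LandscapePlayerContainerViewController: UIViewController {

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .landscape   // restrict this screen to landscape
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        let playerController = AVPlayerViewController()
        playerController.showsPlaybackControls = false

        // Hypothetical bundled resource; replace with your own video.
        if let url = Bundle.main.url(forResource: "video", withExtension: "mp4") {
            playerController.player = AVPlayer(url: url)
        }

        // Standard child view controller containment.
        addChild(playerController)
        playerController.view.frame = view.bounds
        playerController.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(playerController.view)
        playerController.didMove(toParent: self)

        playerController.player?.play()
    }
}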
I'm working on a camera/photo based application.
Application launch (side menu): MainView is a normal VC in a navigation controller with a camera button.
A tap on it pushes a new VC.
This one is also a normal VC, but inside there are these kinds of things:
let captureSession: AVCaptureSession = AVCaptureSession()
var currentCaptureDevice: AVCaptureDevice?
var deviceInput: AVCaptureDeviceInput?
var stillCameraOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?
So it's a custom camera view controller.
This way I can provide a better user experience.
Then I can take a photo, either from the library or by tapping the shot button (after toggling some features on/off: timer, HD, flash...).
When it takes the picture, I use the prepareForSegue method to send the captured image to the next VC:
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == Constants.Segues.CameraToFiltersSegue {
        let filterVC = segue.destinationViewController as! MYCLASSVC
        filterVC.originalImage = pictureImage
    }
}
And then it pushes the new VC and...
I get this message: BSXPCMessage received error for message: Connection interrupted
The current VC contains:
an imageView (UIImageView): the previously taken picture
a small collection view with small (thumbnail-like) filtered versions of the original picture (CIFilter)
Nothing more in this view. But I still get this message.
What does this message mean?
Should I be doing something in a different way?
This is how I set my filters :
//At the top of the class
//EAGLRenderingAPI.OpenGLES2 but maybe EAGLRenderingAPI.OpenGLES3??
static let contextFast = CIContext(EAGLContext: EAGLContext(API: EAGLRenderingAPI.OpenGLES2), options: [kCIContextWorkingColorSpace: NSNull(), kCIContextUseSoftwareRenderer: false])
//In my static method (one of the parameter is "originalImage")
var ciimage: CIImage = CIImage(image: originalImage)
var imageFilter = CIFilter(name: "CIPhotoEffectInstant")
filter.setValue(ciimage, forKey: kCIInputImageKey)
let cgimg = contextFast.createCGImage(imageFilter.outputImage, fromRect: imageFilter.outputImage.extent())
return UIImage(CGImage: cgimg, scale: 1.0, orientation: image.imageOrientation)
This code runs fast!
I don't know where the message comes from, so if you have any idea or anything more to help, please share.
If I return nil before the next line, I never get the message:
let cgimg = contextFast.createCGImage(imageFilter.outputImage, fromRect: imageFilter.outputImage.extent())
And if I use a normal context with the software renderer option set to true:
let context = CIContext(options: [kCIContextUseSoftwareRenderer: true])
the message disappears.
I tried to put this option into my fast context too, but nothing changed.
So it comes from the context being used. With a normal context, though, rendering is about 5x slower (maybe more). With the fast context I never get a stall in my collectionView; I can't even see the data reloading.
With a normal context, it's visible.
In the stock camera/photos apps, Apple has the same thing: a picture and a collection view with filtered thumbnails of the original picture. How do they do that? You can't even see the items loading.
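One common approach (a sketch in current Swift, not necessarily what Apple does): downscale the source image once, then render each filtered thumbnail off the main thread with a single shared CIContext and update the cell on the main thread when it's done. The helper names here are illustrative:

import UIKit
import CoreImage

// Shared GPU-backed context; creating one per image is expensive.
let sharedContext = CIContext(options: [.useSoftwareRenderer: false])

// Render one filtered thumbnail from an already-downscaled CIImage.
func filteredThumbnail(from thumbnail: CIImage, filterName: String) -> UIImage? {
    guard let filter = CIFilter(name: filterName) else { return nil }
    filter.setValue(thumbnail, forKey: kCIInputImageKey)
    guard let output = filter.outputImage,
          let cgImage = sharedContext.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

// In the collection view data source: render in the background, update on the main thread.
func configure(_ imageView: UIImageView, thumbnail: CIImage, filterName: String) {
    DispatchQueue.global(qos: .userInitiated).async {
        let result = filteredThumbnail(from: thumbnail, filterName: filterName)
        DispatchQueue.main.async {
            imageView.image = result
        }
    }
}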