I have set up my custom camera and already coded the video preview. There is a button on screen that I want to use to start capturing video when it is pressed, but I don't know how to go about it. Everything so far is set up and working fine.
In the start-recording button action, I just need the code necessary to capture the video and save it. Thank you.
class CameraViewController: UIViewController, AVAudioRecorderDelegate {
@IBOutlet var recordOutlet: UIButton!
@IBOutlet var recordLabel: UILabel!
@IBOutlet var cameraView: UIView!
var tempImage: UIImageView?
var captureSession: AVCaptureSession?
var stillImageOutput: AVCapturePhotoOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?
var currentCaptureDevice: AVCaptureDevice?
var usingFrontCamera = false
/* This is the function I want to use to start
recording a video */
@IBAction func recordingButton(_ sender: Any) {
}
It seems as though Apple prefers developers to use the default camera UI for capturing video. If you are OK with that, I found a tutorial online with code to help: https://www.raywenderlich.com/94404/play-record-merge-videos-ios-swift.
You can scroll down to the "recording video" section and it will walk you through it with code.
Here's some of what it says: "
import MobileCoreServices
You’ll also need to adopt the same protocols as PlayVideoViewController, by adding the following to the end of the file:
extension RecordVideoViewController: UIImagePickerControllerDelegate {
}
extension RecordVideoViewController: UINavigationControllerDelegate {
}
Add the following code to RecordVideoViewController:
func startCameraFromViewController(viewController: UIViewController, withDelegate delegate: protocol<UIImagePickerControllerDelegate, UINavigationControllerDelegate>) -> Bool {
if UIImagePickerController.isSourceTypeAvailable(.Camera) == false {
return false
}
var cameraController = UIImagePickerController()
cameraController.sourceType = .Camera
cameraController.mediaTypes = [kUTTypeMovie as NSString as String]
cameraController.allowsEditing = false
cameraController.delegate = delegate
presentViewController(cameraController, animated: true, completion: nil)
return true
}
This method follows the same logic as in PlayVideoViewController, but it accesses the .Camera source type instead so it can record video.
Now add the following to record(_:):
startCameraFromViewController(self, withDelegate: self)
You are again in familiar territory. The code simply calls startCameraFromViewController(_:withDelegate:) when you tap the “Record Video” button.
Build and run to see what you’ve got so far.
Go to the Record screen and press the “Record Video” button. Instead of the Photo Gallery, the camera UI opens. Start recording a video by tapping the red record button at the bottom of the screen, and tap it again when you’re done recording."
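(The tutorial's snippet is Swift 2. If you're on a newer Xcode, a rough modern-Swift equivalent of the same idea is sketched below; I've updated the names from memory, so treat it as a starting point rather than drop-in code.)
import UIKit
import MobileCoreServices

func startCameraFromViewController(_ viewController: UIViewController,
                                   withDelegate delegate: UIImagePickerControllerDelegate & UINavigationControllerDelegate) -> Bool {
    // No camera on this device (e.g. the simulator): bail out.
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else {
        return false
    }
    let cameraController = UIImagePickerController()
    cameraController.sourceType = .camera
    cameraController.mediaTypes = [kUTTypeMovie as String] // record movies, not photos
    cameraController.allowsEditing = false
    cameraController.delegate = delegate
    viewController.present(cameraController, animated: true, completion: nil)
    return true
}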
Cheers,
Theo
Here is working code. You need to deal correctly with optional values and error handling in a real project, but you can use the following code as an example:
//
// ViewController.swift
// CustomCamera
//
// Created by Taras Chernyshenko on 6/27/17.
// Copyright © 2017 Taras Chernyshenko. All rights reserved.
//
import UIKit
import AVFoundation
import AssetsLibrary
class CameraViewController: UIViewController,
AVCaptureAudioDataOutputSampleBufferDelegate,
AVCaptureVideoDataOutputSampleBufferDelegate {
@IBOutlet var recordOutlet: UIButton!
@IBOutlet var recordLabel: UILabel!
@IBOutlet var cameraView: UIView!
var tempImage: UIImageView?
private var session: AVCaptureSession = AVCaptureSession()
private var deviceInput: AVCaptureDeviceInput?
private var previewLayer: AVCaptureVideoPreviewLayer?
private var videoOutput: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
private var audioOutput: AVCaptureAudioDataOutput = AVCaptureAudioDataOutput()
private var videoDevice: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
private var audioConnection: AVCaptureConnection?
private var videoConnection: AVCaptureConnection?
private var assetWriter: AVAssetWriter?
private var audioInput: AVAssetWriterInput?
private var videoInput: AVAssetWriterInput?
private var fileManager: FileManager = FileManager()
private var recordingURL: URL?
private var isCameraRecording: Bool = false
private var isRecordingSessionStarted: Bool = false
private var recordingQueue = DispatchQueue(label: "recording.queue")
var captureSession: AVCaptureSession?
var stillImageOutput: AVCapturePhotoOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?
var currentCaptureDevice: AVCaptureDevice?
var usingFrontCamera = false
/* This is the function I want to use to start
recording a video */
@IBAction func recordingButton(_ sender: Any) {
if self.isCameraRecording {
self.stopRecording()
} else {
self.startRecording()
}
self.isCameraRecording = !self.isCameraRecording
}
override func viewDidLoad() {
super.viewDidLoad()
self.setup()
}
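// Builds the capture session, asset writer, writer inputs, and preview layer in one place.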
private func setup() {
self.session.sessionPreset = AVCaptureSessionPresetHigh
self.recordingURL = URL(fileURLWithPath: "\(NSTemporaryDirectory() as String)/file.mov")
if self.fileManager.isDeletableFile(atPath: self.recordingURL!.path) {
_ = try? self.fileManager.removeItem(atPath: self.recordingURL!.path)
}
self.assetWriter = try? AVAssetWriter(outputURL: self.recordingURL!,
fileType: AVFileTypeQuickTimeMovie)
let audioSettings = [
AVFormatIDKey : kAudioFormatAppleIMA4,
AVNumberOfChannelsKey : 1,
AVSampleRateKey : 16000.0
] as [String : Any]
let videoSettings = [
AVVideoCodecKey : AVVideoCodecH264,
AVVideoWidthKey : UIScreen.main.bounds.size.width,
AVVideoHeightKey : UIScreen.main.bounds.size.height
] as [String : Any]
self.videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
outputSettings: videoSettings)
self.audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio,
outputSettings: audioSettings)
self.videoInput?.expectsMediaDataInRealTime = true
self.audioInput?.expectsMediaDataInRealTime = true
if self.assetWriter!.canAdd(self.videoInput!) {
self.assetWriter?.add(self.videoInput!)
}
if self.assetWriter!.canAdd(self.audioInput!) {
self.assetWriter?.add(self.audioInput!)
}
self.deviceInput = try? AVCaptureDeviceInput(device: self.videoDevice)
if let deviceInput = self.deviceInput, self.session.canAddInput(deviceInput) {
self.session.addInput(deviceInput)
}
self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
self.previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
let rootLayer = self.view.layer
rootLayer.masksToBounds = true
self.previewLayer?.frame = CGRect(x: 0, y: 0, width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
rootLayer.insertSublayer(self.previewLayer!, at: 0)
self.session.startRunning()
DispatchQueue.main.async {
self.session.beginConfiguration()
if self.session.canAddOutput(self.videoOutput) {
self.session.addOutput(self.videoOutput)
}
self.videoConnection = self.videoOutput.connection(withMediaType: AVMediaTypeVideo)
if self.videoConnection?.isVideoStabilizationSupported == true {
self.videoConnection?.preferredVideoStabilizationMode = .auto
}
self.session.commitConfiguration()
let audioDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
if let audioIn = try? AVCaptureDeviceInput(device: audioDevice), self.session.canAddInput(audioIn) {
self.session.addInput(audioIn)
}
if self.session.canAddOutput(self.audioOutput) {
self.session.addOutput(self.audioOutput)
}
self.audioConnection = self.audioOutput.connection(withMediaType: AVMediaTypeAudio)
}
}
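// Note: attaching the sample-buffer delegates in startRecording() is what
// starts frames flowing into captureOutput(_:didOutputSampleBuffer:from:) below.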
private func startRecording() {
if self.assetWriter?.startWriting() != true {
print("error: \(self.assetWriter?.error.debugDescription ?? "")")
}
self.videoOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
self.audioOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
}
private func stopRecording() {
self.videoOutput.setSampleBufferDelegate(nil, queue: nil)
self.audioOutput.setSampleBufferDelegate(nil, queue: nil)
self.assetWriter?.finishWriting {
print("saved")
}
}
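// Delegate callback for both audio and video buffers; the format description
// tells us which asset-writer input should receive each one.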
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer
sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
if !self.isRecordingSessionStarted {
let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
self.assetWriter?.startSession(atSourceTime: presentationTime)
self.isRecordingSessionStarted = true
}
let description = CMSampleBufferGetFormatDescription(sampleBuffer)!
if CMFormatDescriptionGetMediaType(description) == kCMMediaType_Audio {
if self.audioInput!.isReadyForMoreMediaData {
print("appendSampleBuffer audio");
self.audioInput?.append(sampleBuffer)
}
} else {
if self.videoInput!.isReadyForMoreMediaData {
print("appendSampleBuffer video");
if !self.videoInput!.append(sampleBuffer) {
print("Error writing video buffer");
}
}
}
}
}
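Note that this code targets the Swift 3 era of AVFoundation. On recent SDKs several of these identifiers have been renamed; a few equivalents from memory (verify against your SDK):
// AVCaptureSessionPresetHigh            -> AVCaptureSession.Preset.high
// AVMediaTypeVideo / AVMediaTypeAudio   -> AVMediaType.video / AVMediaType.audio
// AVFileTypeQuickTimeMovie              -> AVFileType.mov
// AVVideoCodecH264                      -> AVVideoCodecType.h264
// AVCaptureDevice.defaultDevice(withMediaType:) -> AVCaptureDevice.default(for:)
let session = AVCaptureSession()
session.sessionPreset = .high
let videoDevice = AVCaptureDevice.default(for: .video)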
Related
I am implementing functionality to record video in my iOS application.
Also, I am using ReplayKit to record the full screen instead of the camera's default capture.
There is a requirement to customize:
(1) Resolution
(2) FPS (frames)
(3) Bit Rate
To implement the above, I am currently working on (1) Resolution and (2) FPS.
For this I have set the resolution and FPS as in the code below.
class PreviewView: UIView {
private var captureSession: AVCaptureSession?
private var shakeCountDown: Timer?
let videoFileOutput = AVCaptureMovieFileOutput()
var recordingDelegate:AVCaptureFileOutputRecordingDelegate!
var recorded = 0
var secondsToReachGoal = 30
var videoDevice: AVCaptureDevice?
var onRecord: ((Int, Int)->())?
var onReset: (() -> ())?
var onComplete: (() -> ())?
//MARK:- Screen Recording Variables
let recorder = RPScreenRecorder.shared()
var isRecording = false
init() {
super.init(frame: .zero)
var allowedAccess = false
let blocker = DispatchGroup()
blocker.enter()
AVCaptureDevice.requestAccess(for: .video) { flag in
allowedAccess = flag
blocker.leave()
}
blocker.wait()
recorder.isMicrophoneEnabled = true
if !allowedAccess {
print("!!! NO ACCESS TO CAMERA")
return
}
// setup session
let session = AVCaptureSession()
session.beginConfiguration()
videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
for: .video, position: .back)
guard videoDevice != nil, let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice!), session.canAddInput(videoDeviceInput) else {
print("!!! NO CAMERA DETECTED")
return
}
session.addInput(videoDeviceInput)
session.commitConfiguration()
self.captureSession = session
//MARK: Test Cases
//Setup the resolution
captureSession?.sessionPreset = AVCaptureSession.Preset.inputPriority
// Setup the frame
videoDevice?.set(frameRate: 20) // 1 to 30 FPS
}
override class var layerClass: AnyClass {
AVCaptureVideoPreviewLayer.self
}
required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
var videoPreviewLayer: AVCaptureVideoPreviewLayer {
return layer as! AVCaptureVideoPreviewLayer
}
override func didMoveToSuperview() {
super.didMoveToSuperview()
recordingDelegate = self
} }
And to set the frame rate (FPS), I have created an extension as below:
extension AVCaptureDevice {
func set(frameRate: Double) {
var isFPSSupported = false
do {
let supportedFrameRange = activeFormat.videoSupportedFrameRateRanges
for range in supportedFrameRange {
if (range.maxFrameRate >= Double(frameRate) && range.minFrameRate <= Double(frameRate)) {
isFPSSupported = true
break
}
}
if isFPSSupported {
try lockForConfiguration()
activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(frameRate))
activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(frameRate))
unlockForConfiguration()
}
} catch {
print("lockForConfiguration error: \(error.localizedDescription)")
}
}
}
I have checked the scenarios for presetting the session, such as:
(1) AVCaptureSession.Preset.inputPriority
(2) AVCaptureSession.Preset.hd1280x720
etc...
But I got results on the saved video like the screenshots below:
As the screenshot shows, the FPS is 59.88, which is not what we set in code (I set 20 FPS).
The second question is: how can we set the resolution?
Because in all the session-preset scenarios it always uses a resolution of
828 x 1792.
How can we achieve this?
Any help would be appreciated.
Thanks in advance.
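A note on the 59.88 FPS reading: RPScreenRecorder captures the screen, not the camera session, so as far as I know it records at roughly the display's refresh rate (~60 FPS) regardless of what the AVCaptureDevice is configured to; that would explain why the 20 FPS setting never shows up in the saved file. For the camera session itself, the .inputPriority preset defers to the device's activeFormat, so resolution is chosen by picking a format rather than a preset. A minimal sketch of that (setFormat is a hypothetical helper name, untested, not an AVFoundation API):
import AVFoundation
import CoreMedia

// Pick the first device format that matches the requested dimensions and
// supports the requested frame rate, then lock it in.
func setFormat(width: Int32, height: Int32, fps: Double, on device: AVCaptureDevice) {
    for format in device.formats {
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        let supportsFPS = format.videoSupportedFrameRateRanges.contains {
            $0.minFrameRate <= fps && fps <= $0.maxFrameRate
        }
        guard dims.width == width, dims.height == height, supportsFPS else { continue }
        do {
            try device.lockForConfiguration()
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fps))
            device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fps))
            device.unlockForConfiguration()
        } catch {
            print("lockForConfiguration error: \(error.localizedDescription)")
        }
        return
    }
    print("No format matching \(width)x\(height) at \(fps) FPS")
}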
I am currently creating a feed with a table view that plays videos from the internet and then saves them in the cache, and this is working fine. The problem is that after I scroll up and down a couple of times (to test that it works in other cells), the UIView where I am adding the videos turns black and doesn't play the video (video preview). The weird part is that it works great on the simulator, where this never happens, but it does happen on my phone. I appreciate your help.
This is the code for my table view cell. I check if the post has a video; if so, I cache it from the URL and assign it to the AVPlayer.
import UIKit
import AVFoundation
import Alamofire
import Firebase
import AVKit
class PostCell: UITableViewCell {
@IBOutlet weak var postImageView: UIImageView!
@IBOutlet weak var numberOfLikes: UIButton!
@IBOutlet weak var usernameLbl: UILabel!
@IBOutlet weak var likesBtn: UIButton!
@IBOutlet weak var commentsBubbleBtn: UIButton!
@IBOutlet weak var addCommentBtn: UIButton!
@IBOutlet weak var dotsBtn: UIButton!
@IBOutlet weak var profilePic: UIImageView!
@IBOutlet weak var postCaptionLbl: UILabel!
@IBOutlet weak var timeAgoLbl: UILabel!
@IBOutlet weak var videoView: UIView!
@IBOutlet weak var soundBtn: UIButton!
var userUIDtoSend = ""
var audio = true
var like = Bool()
var postID = ""
var player: AVPlayer!
var playerLayer: AVPlayerLayer?
var playerCache = [String: AVPlayer]()
var videoIsPlaying = false
var url : URL?
func updateUI(postImageView: String, caption: String, type: String, profilePic: String, username: String, postUID: String) {
postCaptionLbl.text = caption
timeAgoLbl.text = "1h ago"
self.postID = postUID
soundBtn.layer.borderWidth = 2.0
soundBtn.layer.cornerRadius = 4.0
soundBtn.layer.borderColor = soundBtn.tintColor.cgColor
soundBtn.layer.masksToBounds = true
self.profilePic.downloadImage(from: profilePic)
self.profilePic.sd_setImage(with: URL(string: profilePic))
self.usernameLbl.text = username
let tapLikes = UITapGestureRecognizer()
tapLikes.addTarget(self, action: #selector(likeBtnWasPressed))
likesBtn.addGestureRecognizer(tapLikes)
likesBtn.isUserInteractionEnabled = true
checkLikes()
if type == "video" {
// print("Video is PLAYING")
self.videoIsPlaying = true
self.soundBtn.isHidden = false
let tapRec = UITapGestureRecognizer()
tapRec.addTarget(self, action: #selector(self.audioToggle))
self.soundBtn.addGestureRecognizer(tapRec)
self.soundBtn.isUserInteractionEnabled = true
self.videoView.isHidden = false
self.postImageView.isHidden = true
CacheManager.shared.getFileWith(stringUrl: postImageView) { result in
switch result {
case .success(let url2):
//self.url = URL(fileURLWithPath: "\(url2)")
self.url = URL(string: "\(url2)")
self.player = AVPlayer(url: self.url!)
//
//self.player?.automaticallyWaitsToMinimizeStalling = true
self.playerLayer = AVPlayerLayer(player: self.player)
self.playerLayer?.frame = self.videoView.bounds
// //self.playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
self.player?.play()
self.videoView.layer.addSublayer(self.playerLayer!)
self.player!.isMuted = self.audio
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.player!.currentItem)
print("\n\n\n\n")
print("error in the player:", self.player.error ?? "no error")
// print("Post tile: ", self.postCaptionLbl.text ?? "")
print("ready for display: " ,self.playerLayer!.isReadyForDisplay)
case .failure(let error):
// handle errror
print("this is the fucking error from video: ", error)
self.url = URL(string: postImageView)!
self.player = AVPlayer(url: self.url!)
self.playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
self.player?.automaticallyWaitsToMinimizeStalling = false
self.player?.isMuted = self.audio
self.playerLayer = AVPlayerLayer(player: self.player)
self.playerLayer?.frame = self.videoView.bounds
self.videoView.layer.addSublayer(self.playerLayer!)
self.player?.play()
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.player?.currentItem)
print("this is player cache", self.playerCache)
}
}
} else {
self.videoIsPlaying = false
NotificationCenter.default.removeObserver(self)
// print("DISPLAYING IMAGE")
self.soundBtn.isHidden = true
self.videoView.isHidden = true
self.postImageView.isHidden = false
player?.pause()
self.postImageView.sd_setShowActivityIndicatorView(true)
self.postImageView.sd_setIndicatorStyle(.gray)
self.postImageView.sd_setImage(with: URL(string: postImageView))
}
}
func pauseVideo() {
if self.player != nil {
self.player!.seek(to: kCMTimeZero)
self.player!.pause()
}
}
override func prepareForReuse() {
super.prepareForReuse()
playerLayer?.removeFromSuperlayer()
player?.pause()
}
@objc fileprivate func playerItemDidReachEnd(_ notification: Notification) {
if self.player != nil {
self.player!.seek(to: kCMTimeZero)
self.player!.play()
}
}
@objc func audioToggle() {
print("tapped")
if self.player?.isMuted == true {
self.player?.isMuted = false
self.soundBtn.setTitle("Mute", for: .normal)
} else {
self.player?.isMuted = true
self.soundBtn.setTitle("Sound", for: .normal)
}
}
}
I tried it in a collection view in another view controller and this happens as well. It seems something is wrong with the AVPlayer when it runs on an actual device.
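One thing that commonly differs between the simulator and a device is layout timing: if the AVPlayerLayer's frame is set before the cell has its final size, the layer can end up zero-sized or misplaced, which shows as a black view. A minimal sketch of keeping it in sync, assuming the playerLayer and videoView declared in the cell above:
// Inside PostCell: re-pin the player layer to the video view's
// current bounds on every layout pass.
override func layoutSubviews() {
    super.layoutSubviews()
    playerLayer?.frame = videoView.bounds
}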
So, since I have multiple views and some zoom buttons that need to change the camera preview output, I think it would be correct to use a singleton for session initialization, but I have no idea how to do that and can't find any good information. Could someone help me, please?
UPDATE
Okay, so I somehow managed to write it. I don't know if it's okay; here is the code:
protocol Singleton: class {
static var sharedInstance: Self { get }
}
final class AVFSessionSingleton: Singleton {
static let sharedInstance = AVFSessionSingleton()
private init() {
session = newVideoCaptureSession()!
}
var session: AVCaptureSession!
var imageOutput : AVCaptureStillImageOutput?
//FUNCTION
func newVideoCaptureSession () -> AVCaptureSession? {
func initCaptureDevice() -> AVCaptureDevice? {
var captureDevice: AVCaptureDevice?
let devices: NSArray = AVCaptureDevice.devices() as NSArray
for device: Any in devices {
if (device as AnyObject).position == AVCaptureDevicePosition.back {
captureDevice = device as? AVCaptureDevice
}
}
print("device inited")
return captureDevice
}
func initOutput() {
self.imageOutput = AVCaptureStillImageOutput()
}
func initInputDevice(captureDevice : AVCaptureDevice) -> AVCaptureInput? {
var deviceInput : AVCaptureInput?
do {
deviceInput = try AVCaptureDeviceInput(device: captureDevice)
}
catch _ {
deviceInput = nil
}
return deviceInput
}
func initSession(deviceInput: AVCaptureInput) {
self.session = AVCaptureSession()
self.session?.sessionPreset = AVCaptureSessionPresetPhoto
self.session?.addInput(deviceInput)
self.session?.addOutput(self.imageOutput!)
}
// Wire the helpers together so the session is actually configured
// before it is returned.
guard let captureDevice = initCaptureDevice(),
let deviceInput = initInputDevice(captureDevice: captureDevice) else {
return nil
}
initOutput()
initSession(deviceInput: deviceInput)
return session
}
}
So now I want to call it in a way that lets me manage the layouts for a preview. Any suggestions, please?
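For reference, hooking a shared session up to a preview layer typically looks something like the following; a minimal sketch in the same Swift 3 style as the singleton above:
// In any view controller that needs a live preview of the shared session.
let previewLayer = AVCaptureVideoPreviewLayer(session: AVFSessionSingleton.sharedInstance.session)
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
previewLayer.frame = view.bounds
view.layer.insertSublayer(previewLayer, at: 0)
AVFSessionSingleton.sharedInstance.session.startRunning()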
I am making an app about books.
In the app,
I want the app to auto-fill book info by getting the ISBN (barcode).
views
There are 2 classes:
one is 'UploadMain', the other is 'ScanView'.
I can get the ISBN by scanning,
but I have a problem passing data from ScanView to UploadMain.
In ScanView I have used optional binding like below:
if let UploadVC = self.storyboard?.instantiateViewControllerWithIdentifier("UploadMain") as? UploadMain {
UploadVC.ISBNstring = self.detectionString!
}
Code for the UploadMain class:
override func viewDidLoad(){
super.viewDidLoad()
ISBN.delegate = self
}
override func viewWillAppear(animated: Bool){
ISBN.text = ISBNstring
}
I don't know what the problem with my code is.
Full code of UploadMain:
import UIKit
import Foundation
class UploadMain: UIViewController,UITextFieldDelegate {
var ISBNstring: String = ""
var TitleString: String = ""
var AuthorString: String = ""
var PubString: String = ""
var PriceSting: String = ""
@IBOutlet weak var ISBN: UITextField!
@IBOutlet weak var bookTitle: UITextField!
@IBOutlet weak var bookAuthor: UITextField!
@IBOutlet weak var bookPub: UITextField!
@IBOutlet weak var bookPrice: UITextField!
override func viewDidLoad(){
super.viewDidLoad()
ISBN.delegate = self
}
override func viewWillAppear(animated: Bool){
ISBN.text = ISBNstring
}
@IBAction func Upload(sender: AnyObject) {
dismissViewControllerAnimated(true, completion: nil)
}
}
ScanView class
import UIKit
import AVFoundation
import Foundation
class ScanView : UIViewController, AVCaptureMetadataOutputObjectsDelegate {
let session : AVCaptureSession = AVCaptureSession()
var previewLayer : AVCaptureVideoPreviewLayer!
var detectionString : String!
let apiKey : String = "---------dddddd"
override func viewDidLoad() {
super.viewDidLoad()
// For the sake of discussion this is the camera
let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
// Create a nilable NSError to hand off to the next method.
// Make sure to use the "var" keyword and not "let"
var error : NSError? = nil
var input: AVCaptureDeviceInput = AVCaptureDeviceInput()
do {
input = try AVCaptureDeviceInput(device: device) as AVCaptureDeviceInput
} catch let myJSONError {
print(myJSONError)
}
// If our input is not nil then add it to the session, otherwise we're kind of done!
if input != AVCaptureDeviceInput() {
session.addInput(input)
}
else {
// This is fine for a demo, do something real with this in your app. :)
print(error)
}
let output = AVCaptureMetadataOutput()
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
session.addOutput(output)
output.metadataObjectTypes = output.availableMetadataObjectTypes
previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = self.view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
self.view.layer.addSublayer(previewLayer)
// Start the scanner. You'll have to end it yourself later.
session.startRunning()
}
// This is called when we find a known barcode type with the camera.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
var highlightViewRect = CGRectZero
var barCodeObject : AVMetadataObject!
let barCodeTypes = [AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code]
// The scanner is capable of capturing multiple 2-dimensional barcodes in one scan.
for metadata in metadataObjects {
for barcodeType in barCodeTypes {
if metadata.type == barcodeType {
barCodeObject = self.previewLayer.transformedMetadataObjectForMetadataObject(metadata as! AVMetadataMachineReadableCodeObject)
highlightViewRect = barCodeObject.bounds
detectionString = (metadata as! AVMetadataMachineReadableCodeObject).stringValue
self.session.stopRunning()
self.alert(detectionString)
// Call the Daum Book API
let apiURI = NSURL(string: "https://apis.daum.net/search/book?apikey=\(apiKey)&q=\(detectionString)&searchType=isbn&output=json")
let apidata : NSData? = NSData(contentsOfURL: apiURI!)
NSLog("API Result = %#", NSString(data: apidata!, encoding: NSUTF8StringEncoding)!)
if let UploadVC = self.storyboard?.instantiateViewControllerWithIdentifier("UploadMain") as? UploadMain {
UploadVC.ISBNstring = self.detectionString!
}
break
}
}
}
print(detectionString)
self.navigationController?.popViewControllerAnimated(true)
}
func alert(Code: String){
let actionSheet:UIAlertController = UIAlertController(title: "Barcode", message: "\(Code)", preferredStyle: UIAlertControllerStyle.Alert)
// for alert add .Alert instead of .Action Sheet
// start copy
let firstAlertAction:UIAlertAction = UIAlertAction(title: "OK", style: UIAlertActionStyle.Default, handler:
{
(alertAction:UIAlertAction!) in
// action when pressed
self.session.startRunning()
})
actionSheet.addAction(firstAlertAction)
}
}
In ScanView you are creating a new instance of UploadMain that is not in the window hierarchy, so the data never reaches the UploadMain instance that is actually on screen. To solve your problem, you need to create a protocol and pass a delegate for that protocol to ScanView. So create one protocol like this:
protocol IsbnDelegate {
func passData(isbnStr: String)
}
Now adopt this protocol in UploadMain and implement its passData method in UploadMain like below:
class UploadMain: UIViewController,UITextFieldDelegate,IsbnDelegate {
//your code
//Add this method
func passData(isbnStr: String) {
self.ISBN.text = isbnStr
}
//Also override prepareForSegue like this
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
let destVC = segue.destinationViewController as! ScanView
destVC.delegate = self
}
}
After that, create a delegate property in ScanView and change your ScanView code like this:
import UIKit
import AVFoundation
import Foundation
class ScanView : UIViewController, AVCaptureMetadataOutputObjectsDelegate {
let session : AVCaptureSession = AVCaptureSession()
var previewLayer : AVCaptureVideoPreviewLayer!
var detectionString : String!
let apiKey : String = "---------dddddd"
var delegate: IsbnDelegate?
override func viewDidLoad() {
super.viewDidLoad()
// For the sake of discussion this is the camera
let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
// Create a nilable NSError to hand off to the next method.
// Make sure to use the "var" keyword and not "let"
var error : NSError? = nil
var input: AVCaptureDeviceInput = AVCaptureDeviceInput()
do {
input = try AVCaptureDeviceInput(device: device) as AVCaptureDeviceInput
} catch let myJSONError {
print(myJSONError)
}
// If our input is not nil then add it to the session, otherwise we're kind of done!
if input != AVCaptureDeviceInput() {
session.addInput(input)
}
else {
// This is fine for a demo, do something real with this in your app. :)
print(error)
}
let output = AVCaptureMetadataOutput()
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
session.addOutput(output)
output.metadataObjectTypes = output.availableMetadataObjectTypes
previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = self.view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
self.view.layer.addSublayer(previewLayer)
// Start the scanner. You'll have to end it yourself later.
session.startRunning()
}
// This is called when we find a known barcode type with the camera.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
var highlightViewRect = CGRectZero
var barCodeObject : AVMetadataObject!
let barCodeTypes = [AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code]
// The scanner is capable of capturing multiple 2-dimensional barcodes in one scan.
for metadata in metadataObjects {
for barcodeType in barCodeTypes {
if metadata.type == barcodeType {
barCodeObject = self.previewLayer.transformedMetadataObjectForMetadataObject(metadata as! AVMetadataMachineReadableCodeObject)
highlightViewRect = barCodeObject.bounds
detectionString = (metadata as! AVMetadataMachineReadableCodeObject).stringValue
self.session.stopRunning()
self.alert(detectionString)
// Call the Daum Book API
let apiURI = NSURL(string: "https://apis.daum.net/search/book?apikey=\(apiKey)&q=\(detectionString)&searchType=isbn&output=json")
let apidata : NSData? = NSData(contentsOfURL: apiURI!)
NSLog("API Result = %#", NSString(data: apidata!, encoding: NSUTF8StringEncoding)!)
//Here We are passing the data of your ScanView to UploadMain
self.delegate?.passData(self.detectionString!)
break
}
}
}
print(detectionString)
self.navigationController?.popViewControllerAnimated(true)
}
func alert(Code: String){
let actionSheet:UIAlertController = UIAlertController(title: "Barcode", message: "\(Code)", preferredStyle: UIAlertControllerStyle.Alert)
// for alert add .Alert instead of .Action Sheet
// start copy
let firstAlertAction:UIAlertAction = UIAlertAction(title: "OK", style: UIAlertActionStyle.Default, handler:
{
(alertAction:UIAlertAction!) in
// action when pressed
self.session.startRunning()
})
actionSheet.addAction(firstAlertAction)
}
}
For more detail about protocols, follow these links:
1) Apple Documentation
2) Tutorial 1
3) Tutorial 2
Hope this will help you.
To me it appears you are trying to pass data back from the view controller.
When you call
if let UploadVC = self.storyboard?.instantiateViewControllerWithIdentifier("UploadMain") as? UploadMain {
UploadVC.ISBNstring = self.detectionString!
}
You are creating a new instance of that view controller, and it is successfully receiving the data (but you could never use it).
What you actually want to do is send data back to the existing instance.
I could write it out for you, but there are actually some great tutorials:
text
video
I have been struggling to change a piece of code for recording audio from Swift 1.2 to Swift 2. With the help of people here, I made some changes and eventually got rid of all the compiler errors. But now, after I run the code, log into Twitter, and then click on the Record button in the simulator, it crashes and gives me a runtime error. Please see the picture here:
Also, I'm not sure if this is important or not, but when I removed "AVFormatIDKey : NSNumber(int: Int32(kAudioFormatAppleLossless))" from the code and commented out "self.audioRecorder.meteringEnabled = true" and "self.audioRecorder.prepareToRecord()", it no longer crashed; but obviously that's not how the code is supposed to run in the end...
Here is the full version of code. Any thoughts on this?
Thanks a lot for your help
import UIKit
import AVFoundation
class RecordViewController: UIViewController {
required init?(coder aDecoder: NSCoder) {
var baseString : String = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0] as String
self.audioURL = "sound.m4a"
var pathComponents = [baseString, self.audioURL]
var audioNSURL = NSURL.fileURLWithPathComponents(pathComponents)
var session = AVAudioSession.sharedInstance()
do {
try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
} catch (_) {
}
// session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
// var recordSettings: [String : AnyObject] = Dictionary()
// recordSettings[AVFormatIDKey] = kAudioFormatMPEG4AAC
// recordSettings[AVFormatIDKey] = NSNumber(unsignedInt: kAudioFormatMPEG4AAC)
// recordSettings[AVSampleRateKey] = 44100.0
// recordSettings[AVNumberOfChannelsKey] = 2
let recordSettings = [
AVSampleRateKey : NSNumber(float: Float(44100.0)),
AVFormatIDKey : NSNumber(int: Int32(kAudioFormatAppleLossless)),
AVNumberOfChannelsKey : NSNumber(int: 1),
AVEncoderAudioQualityKey : NSNumber(int: Int32(AVAudioQuality.Medium.rawValue)),
AVEncoderBitRateKey : NSNumber(int: Int32(320000))
]
// self.audioRecorder = AVAudioRecorder(URL: audioNSURL, settings: recordSettings, error: nil)
self.audioRecorder = AVAudioRecorder()
print("aaaaa")
do {
self.audioRecorder = try AVAudioRecorder(URL: audioNSURL!, settings: recordSettings)
} catch (_) {
}
self.audioRecorder.meteringEnabled = true
self.audioRecorder.prepareToRecord()
super.init(coder: aDecoder)
}
// required init(coder aDecoder: NSCoder!) {
// super.init(coder : aDecoder)
// self.audioRecorder = AVAudioRecorder()
// }
@IBOutlet weak var recordButton: UIButton!
@IBOutlet weak var playButton: UIButton!
@IBOutlet weak var saveButton: UIBarButtonItem!
var audioRecorder : AVAudioRecorder
var audioURL = ""
override func viewDidLoad() {
super.viewDidLoad()
self.playButton.enabled = false
self.saveButton.enabled = false
// Do any additional setup after loading the view.
}
@IBAction func cancelTapped(sender: AnyObject) {
self.dismissViewControllerAnimated(true, completion: nil)
}
@IBAction func saveTapped(sender: AnyObject) {
}
@IBAction func recordTapped(sender: AnyObject) {
self.playButton.enabled = true
}
@IBAction func playTapped(sender: AnyObject) {
}
}
I see one force-unwrapped optional. Replace your error handling with the following:
if let url = audioNSURL {
if let recorder = try? AVAudioRecorder(URL: url, settings: recordSettings) {
self.audioRecorder = recorder
}
}
Thank you both, Eric and Lukas. I did some further trimming of my whole code and I guess I finally made it work. Now it compiles without errors and doesn't crash at runtime. I'm just posting the final version of my working code so that anyone else facing similar issues can use it.
Thanks again, and best to all the awesome folks who contribute here on Stack Overflow.
import UIKit
import AVFoundation
class RecordViewController: UIViewController {
var audioRecorder : AVAudioRecorder?
@IBOutlet weak var recordButton: UIButton!
@IBOutlet weak var playButton: UIButton!
@IBOutlet weak var saveButton: UIBarButtonItem!
override func viewDidLoad() {
super.viewDidLoad()
self.playButton.enabled = false
self.saveButton.enabled = false
}
func setRecorder() {
do {
let baseString : String = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true).first!
let pathComponents = [baseString, "music.m4a"]
let audioURL = NSURL.fileURLWithPathComponents(pathComponents)
let session = AVAudioSession.sharedInstance()
try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
try session.overrideOutputAudioPort(AVAudioSessionPortOverride.Speaker)
try session.setActive(true)
var recordSettings = [String : AnyObject]()
recordSettings[AVFormatIDKey] = Int(kAudioFormatMPEG4AAC)
recordSettings[AVSampleRateKey] = 44100.0
recordSettings[AVNumberOfChannelsKey] = 2
self.audioRecorder = try AVAudioRecorder(URL: audioURL!, settings: recordSettings)
self.audioRecorder!.meteringEnabled = true
self.audioRecorder!.prepareToRecord()
} catch (_) {
}
}
@IBAction func cancelTapped(sender: AnyObject) {
self.dismissViewControllerAnimated(true, completion: nil)
}
@IBAction func saveTapped(sender: AnyObject) {
}
@IBAction func recordTapped(sender: AnyObject) {
self.playButton.enabled = true
}
@IBAction func playTapped(sender: AnyObject) {
}
}
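For anyone adapting this: recordTapped above only enables the play button. Actually starting and stopping a take would look roughly like the sketch below (same Swift 2 style; it assumes setRecorder() was called first, e.g. at the end of viewDidLoad):
@IBAction func recordTapped(sender: AnyObject) {
    guard let recorder = self.audioRecorder else { return }
    if recorder.recording {
        // Second tap: stop the take so it can be played back or saved.
        recorder.stop()
    } else {
        // First tap: begin recording to the URL configured in setRecorder().
        recorder.record()
    }
    self.playButton.enabled = true
}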