How to advertise an iBeacon using an iPad 5 in Swift 4 - ios

So I found a good tutorial on using iBeacons and created a receiver and a broadcaster, but when I ran the code I couldn't see any beacon being broadcast. Does anyone have a solution that would solve my issue?
import UIKit
import CoreLocation
import CoreBluetooth
import Foundation
/// ibeacon class
class iBeaconConfiguration
{
// You can use uuidgen in terminal to generate new one.
static let uuid = UUID(uuidString: "7FA08BC7-A55F-45FC-85C0-0BF26F899530")!
private init() {}
}
In case you're wondering, the class above is just for the UUID.
class BroadcastViewController: UIViewController {
fileprivate var broadcasting: Bool = false
fileprivate var beacon: CLBeaconRegion?
fileprivate var peripheralManager: CBPeripheralManager?
@IBOutlet var statusLabel: UILabel!
@IBOutlet var triggerButton: UIButton!
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
self.setNeedsStatusBarAppearanceUpdate()
}
override func viewDidLoad() {
super.viewDidLoad()
statusLabel = UILabel()
triggerButton = UIButton(frame: CGRect(x: 130, y: 10, width: 100, height: 50))
triggerButton.backgroundColor = UIColor.blue
triggerButton.addTarget(self, action: #selector(broadcastBeacon(sender:)), for: .touchUpInside)
self.view.addSubview(triggerButton)
let button = UIButton(frame: CGRect(x: 10, y: 10, width: 50, height: 50))
button.backgroundColor = UIColor.red
button.addTarget(self, action: #selector(dismiss1), for: .touchUpInside)
self.view.addSubview(button)
self.view.backgroundColor = UIColor.white
let UUID: UUID = iBeaconConfiguration.uuid
let major: CLBeaconMajorValue = CLBeaconMajorValue(arc4random() % 100 + 1)
let minor: CLBeaconMinorValue = CLBeaconMinorValue(arc4random() % 2 + 1)
self.beacon = CLBeaconRegion(proximityUUID: UUID, major: major, minor: minor, identifier: "tw.darktt.beaconDemo")
self.peripheralManager = CBPeripheralManager(delegate: self, queue: nil)
}
deinit
{
self.beacon = nil
self.peripheralManager = nil
}
override func didReceiveMemoryWarning()
{
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
@IBAction func dismiss1() {
//self.dismiss(animated: true, completion: nil)
present(ReceiverViewController(), animated: true, completion: nil)
}
}
// MARK: - Status Bar -
extension BroadcastViewController {
override var preferredStatusBarStyle: UIStatusBarStyle
{
if self.broadcasting {
return .lightContent
}
return .default
}
override var preferredStatusBarUpdateAnimation: UIStatusBarAnimation
{
return .fade
}
override var prefersStatusBarHidden: Bool
{
return false
}
}
//MARK: - Actions -
extension BroadcastViewController {
@IBAction fileprivate func broadcastBeacon(sender: UIButton) -> Void
{
let state: CBManagerState = self.peripheralManager!.state
if (state == .poweredOff && !self.broadcasting) {
let OKAction: UIAlertAction = UIAlertAction(title: "OK", style: .cancel, handler: nil)
let alert: UIAlertController = UIAlertController(title: "Bluetooth OFF", message: "Please power on your Bluetooth!", preferredStyle: .alert)
alert.addAction(OKAction)
self.present(alert, animated: true, completion: nil)
return
}
let titleFromStatus: (Void) -> String = {
let title: String = (self.broadcasting) ? "Start" : "Stop"
return title + " Broadcast"
}
let buttonTitleColor: UIColor = (self.broadcasting) ? UIColor.blue : UIColor.white
sender.setTitle("nil", for: .normal)
sender.setTitleColor(buttonTitleColor, for: .normal)
let labelTextFromStatus: (Void) -> String = {
let text: String = (self.broadcasting) ? "Not Broadcast" : "Broadcasting..."
return text
}
self.statusLabel.text = "broadcast started"
let animations: () -> Void = {
let backgroundColor: UIColor = (self.broadcasting) ? UIColor.white : UIColor.blue
self.view.backgroundColor = backgroundColor
self.broadcasting = !self.broadcasting
self.setNeedsStatusBarAppearanceUpdate()
}
let completion: (Bool) -> Void = {
finish in
self.advertising(start: self.broadcasting)
}
UIView.animate(withDuration: 0.25, animations: animations, completion: completion)
}
// MARK: - Broadcast Beacon
func advertising(start: Bool) -> Void
{
if self.peripheralManager == nil {
return
}
if (!start) {
self.peripheralManager!.stopAdvertising()
return
}
let state: CBManagerState = self.peripheralManager!.state
if (state == .poweredOn) {
let UUID:UUID = (self.beacon?.proximityUUID)!
let serviceUUIDs: Array<CBUUID> = [CBUUID(nsuuid: UUID)]
// Why NSMutableDictionary can not convert to Dictionary<String, Any> 😂
var peripheralData: Dictionary<String, Any> = self.beacon!.peripheralData(withMeasuredPower: 1) as NSDictionary as! Dictionary<String, Any>
peripheralData[CBAdvertisementDataLocalNameKey] = "iBeacon Demo"
peripheralData[CBAdvertisementDataServiceUUIDsKey] = serviceUUIDs
self.peripheralManager!.startAdvertising(peripheralData)
}
}
}
// MARK: - CBPeripheralManager Delegate -
extension BroadcastViewController: CBPeripheralManagerDelegate {
func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager)
{
let state: CBManagerState = peripheralManager!.state
if state == .poweredOff {
self.statusLabel.text = "Bluetooth Off"
if self.broadcasting {
self.broadcastBeacon(sender: self.triggerButton)
}
}
if state == .unsupported {
self.statusLabel.text = "Unsupported Beacon"
}
if state == .poweredOn {
self.statusLabel.text = "Not Broadcast"
}
}
}
In addition to all that, I read the documentation Apple has on its website, but I didn't get very far with that. I also went through probably 10 different GitHub projects and tried them, but they're all written in a much earlier version of Swift.

You cannot add anything extra to the iBeacon advertisement and have it be recognized as such. These two lines are a problem and must be removed:
peripheralData[CBAdvertisementDataLocalNameKey] = "iBeacon Demo"
peripheralData[CBAdvertisementDataServiceUUIDsKey] = serviceUUIDs
The first line attempts to add a broadcast name to the advertisement. You can't do this, as there is no extra room in the packet for this kind of information.
The second line attempts to add a GATT service UUID to the advertisement. This does not make sense. An iBeacon advertisement is a Bluetooth LE manufacturer advertisement, which does not advertise GATT services. Adding this will either make the advertisement invalid or cause the advertising API to fail.
There may be other issues with the code shown. I'd suggest simplifying the code to start transmission when the app begins, so you can eliminate any possible UI bugs. And I would use an Android detector like Locate so you can see the advertisement even if you have the ProximityUUID wrong. Android, unlike iOS, can see any beacon advertisement.
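For reference, a minimal broadcast-only sketch along those lines (using the UUID from the question; the class name and region identifier here are placeholders) could look like the following. Note that only the dictionary returned by peripheralData(withMeasuredPower:) is passed to startAdvertising, with nothing added:
import UIKit
import CoreLocation
import CoreBluetooth
class MinimalBroadcastViewController: UIViewController, CBPeripheralManagerDelegate {
    private var beaconRegion: CLBeaconRegion?
    private var peripheralManager: CBPeripheralManager?
    override func viewDidLoad() {
        super.viewDidLoad()
        let uuid = UUID(uuidString: "7FA08BC7-A55F-45FC-85C0-0BF26F899530")!
        // The identifier is only used locally; it is not part of the advertisement.
        beaconRegion = CLBeaconRegion(proximityUUID: uuid, major: 1, minor: 1, identifier: "demo.region")
        peripheralManager = CBPeripheralManager(delegate: self, queue: nil)
    }
    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        guard peripheral.state == .poweredOn, let region = beaconRegion else { return }
        // Pass the peripheral data through unmodified: no local name, no service UUIDs.
        let advertisement = region.peripheralData(withMeasuredPower: nil) as? [String: Any]
        peripheral.startAdvertising(advertisement)
    }
}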

Assuming you have an iBeacon scanning app working (otherwise, it'll be pretty tough to know if you get your device broadcasting correctly)...
I suggest starting simple... get the "start broadcasting" part working, then add in the extras.
Create a new project (or add a new VC to your current project). Add "Start" and "Stop" buttons, which you'll connect to the @IBAction funcs below.
Make this your new VC's class:
//
// BroadcasterViewController.swift
//
// Created by Don Mag on 3/16/18.
//
import UIKit
import CoreLocation
import CoreBluetooth
class BroadcasterViewController: UIViewController, CBPeripheralManagerDelegate {
var myBeacon: CLBeaconRegion!
var beaconData: NSDictionary!
var peripheralMgr: CBPeripheralManager!
let myRegionID = "myRegionID"
@IBAction func startBeacon(_ sender: Any) {
// don't do anything if myBeacon already exists
guard myBeacon == nil else { return }
let beaconUUID = "7FA08BC7-A55F-45FC-85C0-0BF26F899530"
let beaconMajor: CLBeaconMajorValue = 123
let beaconMinor: CLBeaconMinorValue = 456
let uuid = UUID(uuidString: beaconUUID)!
myBeacon = CLBeaconRegion(proximityUUID: uuid, major: beaconMajor, minor: beaconMinor, identifier: "MyIdentifier")
beaconData = myBeacon.peripheralData(withMeasuredPower: nil)
peripheralMgr = CBPeripheralManager(delegate: self, queue: nil, options: nil)
}
@IBAction func stopBeacon(_ sender: Any) {
// don't do anything if there's no peripheral manager
guard peripheralMgr != nil else { return }
peripheralMgr.stopAdvertising()
peripheralMgr = nil
beaconData = nil
myBeacon = nil
}
func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
if peripheral.state == .poweredOn {
peripheralMgr.startAdvertising(beaconData as? [String: Any])
} else if peripheral.state == .poweredOff {
peripheralMgr.stopAdvertising()
}
}
}
Run the app on your "broadcaster" device, and tap the Start button. Then run your "scanner" app on another device, and see if you find your new "Device as iBeacon".
Note: as I'm sure you're aware, your scanner must be scanning for the same beaconUUID that you use on your broadcaster. In this example code, I used the same UUID string that you posted in your question.
This worked fine for me - so hopefully it will do the same for you. If you get this part working, then you can implement the rest of your app -- the UI, status labels, any other options -- and add some error handling (Bluetooth not enabled or not given permissions, etc).
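If you still need a quick scanner on the iOS side, a bare-bones ranging sketch (assuming location permission has been requested and NSLocationWhenInUseUsageDescription is in your Info.plist; the class name is a placeholder) might look like this:
import UIKit
import CoreLocation
class ScannerViewController: UIViewController, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "7FA08BC7-A55F-45FC-85C0-0BF26F899530")!,
        identifier: "MyIdentifier")
    override func viewDidLoad() {
        super.viewDidLoad()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(in: region)
    }
    func locationManager(_ manager: CLLocationManager, didRangeBeacons beacons: [CLBeacon], in region: CLBeaconRegion) {
        // Each ranged beacon reports the major/minor you broadcast plus an estimated proximity.
        for beacon in beacons {
            print("major \(beacon.major), minor \(beacon.minor), proximity \(beacon.proximity.rawValue)")
        }
    }
}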


Record and play Video based on TensorFlow example Swift

UPDATE: I realised I had forgotten to ask for recording permission. That has now been fixed. However, when I press the Record button I get the error "Cannot create file". So when I start the recording, something is fishy with the path, maybe?
I am working on an app where I want to have my own neural network with the functionality to start recording a video. Thereafter I want to play the video and use information from the neural network.
I have a working version on Android; now I am trying to make something similar for iPhone. As a start, I have used an ImageClassifierExample from TensorFlowLite. The first task is to add a Record button which starts recording a video, and then a Play button which plays the video.
I have implemented the two features, but when I try to play the video, it just keeps loading. Either the recording is not working, or the video player is not working (or both). I have checked that the paths are the same.
I am not so familiar with iOS development so some help would be nice.
This is the base I am starting from.
Here is my slightly adapted ViewController:
// Copyright 2019 The TensorFlow Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
import AVFoundation
import AVKit
import UIKit
class ViewController: UIViewController {
// MARK: Storyboards Connections
@IBOutlet weak var previewView: PreviewView!
@IBOutlet weak var cameraUnavailableLabel: UILabel!
@IBOutlet weak var resumeButton: UIButton!
@IBOutlet weak var bottomSheetView: CurvedView!
@IBOutlet weak var bottomSheetViewBottomSpace: NSLayoutConstraint!
@IBOutlet weak var bottomSheetStateImageView: UIImageView!
// MARK: Constants
private let animationDuration = 0.5
private let collapseTransitionThreshold: CGFloat = -40.0
private let expandThransitionThreshold: CGFloat = 40.0
private let delayBetweenInferencesMs: Double = 1000
// MARK: Instance Variables
// Holds the results at any time
private var result: Result?
private var initialBottomSpace: CGFloat = 0.0
private var previousInferenceTimeMs: TimeInterval = Date.distantPast.timeIntervalSince1970 * 1000
// MARK: Controllers that manage functionality
// Handles all the camera related functionality
private lazy var cameraCapture = CameraFeedManager(previewView: previewView)
private var isRecording = false // <<<----- Mine
private let captureSession: AVCaptureSession = AVCaptureSession()
// Handles all data preprocessing and makes calls to run inference through the `Interpreter`.
private var modelDataHandler: ModelDataHandler? =
ModelDataHandler(modelFileInfo: MobileNet.modelInfo, labelsFileInfo: MobileNet.labelsInfo)
@IBAction func startRecording(_ sender: Any) { // <<<----- Mine
print("Recording pressed")
if (!isRecording) {
cameraCapture.startRecording()
} else {
cameraCapture.stopRecording()
}
isRecording = !isRecording
}
// Handles the presenting of results on the screen
private var inferenceViewController: InferenceViewController?
// MARK: View Handling Methods
override func viewDidLoad() {
super.viewDidLoad()
guard modelDataHandler != nil else {
fatalError("Model set up failed")
}
#if targetEnvironment(simulator)
previewView.shouldUseClipboardImage = true
NotificationCenter.default.addObserver(self,
selector: #selector(classifyPasteboardImage),
name: UIApplication.didBecomeActiveNotification,
object: nil)
#endif
cameraCapture.delegate = self
addPanGesture()
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
changeBottomViewState()
#if !targetEnvironment(simulator)
cameraCapture.checkCameraConfigurationAndStartSession()
#endif
}
#if !targetEnvironment(simulator)
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
cameraCapture.stopSession()
}
#endif
override var preferredStatusBarStyle: UIStatusBarStyle {
return .lightContent
}
func presentUnableToResumeSessionAlert() {
let alert = UIAlertController(
title: "Unable to Resume Session",
message: "There was an error while attempting to resume session.",
preferredStyle: .alert
)
alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
self.present(alert, animated: true)
}
// MARK: Storyboard Segue Handlers
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
super.prepare(for: segue, sender: sender)
if segue.identifier == "EMBED" {
guard let tempModelDataHandler = modelDataHandler else {
return
}
inferenceViewController = segue.destination as? InferenceViewController
inferenceViewController?.wantedInputHeight = tempModelDataHandler.inputHeight
inferenceViewController?.wantedInputWidth = tempModelDataHandler.inputWidth
inferenceViewController?.maxResults = tempModelDataHandler.resultCount
inferenceViewController?.threadCountLimit = tempModelDataHandler.threadCountLimit
inferenceViewController?.delegate = self
}
}
@objc func classifyPasteboardImage() {
guard let image = UIPasteboard.general.images?.first else {
return
}
guard let buffer = CVImageBuffer.buffer(from: image) else {
return
}
previewView.image = image
DispatchQueue.global().async {
self.didOutput(pixelBuffer: buffer)
}
}
deinit {
NotificationCenter.default.removeObserver(self)
}
}
// MARK: InferenceViewControllerDelegate Methods
extension ViewController: InferenceViewControllerDelegate {
func didChangeThreadCount(to count: Int) {
if modelDataHandler?.threadCount == count { return }
modelDataHandler = ModelDataHandler(
modelFileInfo: MobileNet.modelInfo,
labelsFileInfo: MobileNet.labelsInfo,
threadCount: count
)
}
}
// MARK: CameraFeedManagerDelegate Methods
extension ViewController: CameraFeedManagerDelegate {
func didOutput(pixelBuffer: CVPixelBuffer) {
let currentTimeMs = Date().timeIntervalSince1970 * 1000
guard (currentTimeMs - previousInferenceTimeMs) >= delayBetweenInferencesMs else { return }
previousInferenceTimeMs = currentTimeMs
// Pass the pixel buffer to TensorFlow Lite to perform inference.
result = modelDataHandler?.runModel(onFrame: pixelBuffer)
// Display results by handing off to the InferenceViewController.
DispatchQueue.main.async {
let resolution = CGSize(width: CVPixelBufferGetWidth(pixelBuffer), height: CVPixelBufferGetHeight(pixelBuffer))
self.inferenceViewController?.inferenceResult = self.result
self.inferenceViewController?.resolution = resolution
self.inferenceViewController?.tableView.reloadData()
}
}
// MARK: Session Handling Alerts
func sessionWasInterrupted(canResumeManually resumeManually: Bool) {
// Updates the UI when session is interrupted.
if resumeManually {
self.resumeButton.isHidden = false
} else {
self.cameraUnavailableLabel.isHidden = false
}
}
func sessionInterruptionEnded() {
// Updates UI once session interruption has ended.
if !self.cameraUnavailableLabel.isHidden {
self.cameraUnavailableLabel.isHidden = true
}
if !self.resumeButton.isHidden {
self.resumeButton.isHidden = false
}
}
func sessionRunTimeErrorOccured() {
// Handles session run time error by updating the UI and providing a button if session can be manually resumed.
self.resumeButton.isHidden = false
previewView.shouldUseClipboardImage = true
}
func presentCameraPermissionsDeniedAlert() {
let alertController = UIAlertController(title: "Camera Permissions Denied", message: "Camera permissions have been denied for this app. You can change this by going to Settings", preferredStyle: .alert)
let cancelAction = UIAlertAction(title: "Cancel", style: .cancel, handler: nil)
let settingsAction = UIAlertAction(title: "Settings", style: .default) { (action) in
UIApplication.shared.open(URL(string: UIApplication.openSettingsURLString)!, options: [:], completionHandler: nil)
}
alertController.addAction(cancelAction)
alertController.addAction(settingsAction)
present(alertController, animated: true, completion: nil)
previewView.shouldUseClipboardImage = true
}
func presentVideoConfigurationErrorAlert() {
let alert = UIAlertController(title: "Camera Configuration Failed", message: "There was an error while configuring camera.", preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
self.present(alert, animated: true)
previewView.shouldUseClipboardImage = true
}
}
// MARK: Bottom Sheet Interaction Methods
extension ViewController {
// MARK: Bottom Sheet Interaction Methods
/**
This method adds a pan gesture to make the bottom sheet interactive.
*/
private func addPanGesture() {
let panGesture = UIPanGestureRecognizer(target: self, action: #selector(ViewController.didPan(panGesture:)))
bottomSheetView.addGestureRecognizer(panGesture)
}
/** Change whether bottom sheet should be in expanded or collapsed state.
*/
private func changeBottomViewState() {
guard let inferenceVC = inferenceViewController else {
return
}
if bottomSheetViewBottomSpace.constant == inferenceVC.collapsedHeight - bottomSheetView.bounds.size.height {
bottomSheetViewBottomSpace.constant = 0.0
}
else {
bottomSheetViewBottomSpace.constant = inferenceVC.collapsedHeight - bottomSheetView.bounds.size.height
}
setImageBasedOnBottomViewState()
}
/**
Set image of the bottom sheet icon based on whether it is expanded or collapsed
*/
private func setImageBasedOnBottomViewState() {
if bottomSheetViewBottomSpace.constant == 0.0 {
bottomSheetStateImageView.image = UIImage(named: "down_icon")
}
else {
bottomSheetStateImageView.image = UIImage(named: "up_icon")
}
}
/**
This method responds to the user panning on the bottom sheet.
*/
@objc func didPan(panGesture: UIPanGestureRecognizer) {
// Opens or closes the bottom sheet based on the user's interaction with the bottom sheet.
let translation = panGesture.translation(in: view)
switch panGesture.state {
case .began:
initialBottomSpace = bottomSheetViewBottomSpace.constant
translateBottomSheet(withVerticalTranslation: translation.y)
case .changed:
translateBottomSheet(withVerticalTranslation: translation.y)
case .cancelled:
setBottomSheetLayout(withBottomSpace: initialBottomSpace)
case .ended:
translateBottomSheetAtEndOfPan(withVerticalTranslation: translation.y)
setImageBasedOnBottomViewState()
initialBottomSpace = 0.0
default:
break
}
}
/**
This method sets bottom sheet translation while pan gesture state is continuously changing.
*/
private func translateBottomSheet(withVerticalTranslation verticalTranslation: CGFloat) {
let bottomSpace = initialBottomSpace - verticalTranslation
guard bottomSpace <= 0.0 && bottomSpace >= inferenceViewController!.collapsedHeight - bottomSheetView.bounds.size.height else {
return
}
setBottomSheetLayout(withBottomSpace: bottomSpace)
}
/**
This method changes bottom sheet state to either fully expanded or closed at the end of pan.
*/
private func translateBottomSheetAtEndOfPan(withVerticalTranslation verticalTranslation: CGFloat) {
// Changes bottom sheet state to either fully open or closed at the end of pan.
let bottomSpace = bottomSpaceAtEndOfPan(withVerticalTranslation: verticalTranslation)
setBottomSheetLayout(withBottomSpace: bottomSpace)
}
/**
Return the final state of the bottom sheet view (whether fully collapsed or expanded) that is to be retained.
*/
private func bottomSpaceAtEndOfPan(withVerticalTranslation verticalTranslation: CGFloat) -> CGFloat {
// Calculates whether to fully expand or collapse bottom sheet when pan gesture ends.
var bottomSpace = initialBottomSpace - verticalTranslation
var height: CGFloat = 0.0
if initialBottomSpace == 0.0 {
height = bottomSheetView.bounds.size.height
}
else {
height = inferenceViewController!.collapsedHeight
}
let currentHeight = bottomSheetView.bounds.size.height + bottomSpace
if currentHeight - height <= collapseTransitionThreshold {
bottomSpace = inferenceViewController!.collapsedHeight - bottomSheetView.bounds.size.height
}
else if currentHeight - height >= expandThransitionThreshold {
bottomSpace = 0.0
}
else {
bottomSpace = initialBottomSpace
}
return bottomSpace
}
/**
This method layouts the change of the bottom space of bottom sheet with respect to the view managed by this controller.
*/
func setBottomSheetLayout(withBottomSpace bottomSpace: CGFloat) {
view.setNeedsLayout()
bottomSheetViewBottomSpace.constant = bottomSpace
view.setNeedsLayout()
}
}
CameraFeedManager:
// Copyright 2019 The TensorFlow Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
import UIKit
import AVFoundation
// MARK: CameraFeedManagerDelegate Declaration
protocol CameraFeedManagerDelegate: AnyObject {
/**
This method delivers the pixel buffer of the current frame seen by the device's camera.
*/
func didOutput(pixelBuffer: CVPixelBuffer)
/**
This method indicates that the camera permissions have been denied.
*/
func presentCameraPermissionsDeniedAlert()
/**
This method indicates that there was an error in video configuration.
*/
func presentVideoConfigurationErrorAlert()
/**
This method indicates that a session runtime error occurred.
*/
func sessionRunTimeErrorOccured()
/**
This method indicates that the session was interrupted.
*/
func sessionWasInterrupted(canResumeManually resumeManually: Bool)
/**
This method indicates that the session interruption has ended.
*/
func sessionInterruptionEnded()
}
/**
This enum holds the state of the camera initialization.
*/
enum CameraConfiguration {
case success
case failed
case permissionDenied
}
/**
This class manages all camera related functionality
*/
class CameraFeedManager: NSObject, AVCaptureFileOutputRecordingDelegate {
func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) { // << --- Mine
print("Video recorded to: " + outputFileURL.absoluteString)
}
// MARK: Camera Related Instance Variables
private let session: AVCaptureSession = AVCaptureSession()
private let previewView: PreviewView
private let sessionQueue = DispatchQueue(label: "sessionQueue")
private var cameraConfiguration: CameraConfiguration = .failed
private lazy var videoDataOutput = AVCaptureVideoDataOutput()
private var movieDataOutput = AVCaptureMovieFileOutput() // << --- Mine
private var isSessionRunning = false
// MARK: CameraFeedManagerDelegate
weak var delegate: CameraFeedManagerDelegate?
// MARK: Initializer
init(previewView: PreviewView) {
self.previewView = previewView
super.init()
// Initializes the session
session.sessionPreset = .high
self.previewView.session = session
self.previewView.previewLayer.connection?.videoOrientation = .portrait
self.previewView.previewLayer.videoGravity = .resizeAspectFill
self.attemptToConfigureSession()
}
// MARK: Session Start and End methods
/**
This method starts an AVCaptureSession based on whether the camera configuration was successful.
*/
func checkCameraConfigurationAndStartSession() {
sessionQueue.async {
switch self.cameraConfiguration {
case .success:
self.addObservers()
self.startSession()
case .failed:
DispatchQueue.main.async {
self.delegate?.presentVideoConfigurationErrorAlert()
}
case .permissionDenied:
DispatchQueue.main.async {
self.delegate?.presentCameraPermissionsDeniedAlert()
}
}
}
}
/**
This method stops a running an AVCaptureSession.
*/
func stopSession() {
self.removeObservers()
sessionQueue.async {
if self.session.isRunning {
self.session.stopRunning()
self.isSessionRunning = self.session.isRunning
}
}
}
/**
This method resumes an interrupted AVCaptureSession.
*/
func resumeInterruptedSession(withCompletion completion: @escaping (Bool) -> ()) {
sessionQueue.async {
self.startSession()
DispatchQueue.main.async {
completion(self.isSessionRunning)
}
}
}
/**
This method starts the AVCaptureSession
**/
private func startSession() {
self.session.startRunning()
self.isSessionRunning = self.session.isRunning
}
// MARK: Session Configuration Methods.
/**
This method requests for camera permissions and handles the configuration of the session and stores the result of configuration.
*/
private func attemptToConfigureSession() {
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
self.cameraConfiguration = .success
case .notDetermined:
self.sessionQueue.suspend()
self.requestCameraAccess(completion: { (granted) in
self.sessionQueue.resume()
})
case .denied:
self.cameraConfiguration = .permissionDenied
default:
break
}
self.sessionQueue.async {
self.configureSession()
}
}
/**
This method requests for camera permissions.
*/
private func requestCameraAccess(completion: @escaping (Bool) -> ()) {
AVCaptureDevice.requestAccess(for: .video) { (granted) in
if !granted {
self.cameraConfiguration = .permissionDenied
}
else {
self.cameraConfiguration = .success
}
completion(granted)
}
}
/**
This method handles all the steps to configure an AVCaptureSession.
*/
private func configureSession() {
guard cameraConfiguration == .success else {
return
}
session.beginConfiguration()
// Tries to add an AVCaptureDeviceInput.
guard addVideoDeviceInput() == true else {
self.session.commitConfiguration()
self.cameraConfiguration = .failed
return
}
// Tries to add an AVCaptureVideoDataOutput.
guard addVideoDataOutput() else {
self.session.commitConfiguration()
self.cameraConfiguration = .failed
return
}
session.commitConfiguration()
self.cameraConfiguration = .success
}
func startRecording() { // << --- Mine
self.session.addOutput(movieDataOutput)
guard let homeDirectory = FileManager.default.urls(for: .desktopDirectory, in: .userDomainMask).first else { return }
let url = URL(fileURLWithPath: homeDirectory.absoluteString + "/mymovie.mov")
movieDataOutput.startRecording(to: url , recordingDelegate: self)
}
func stopRecording() { // <<< -- Mine
self.movieDataOutput.stopRecording()
self.session.removeOutput(movieDataOutput)
}
/**
This method tries to add an AVCaptureDeviceInput to the current AVCaptureSession.
*/
private func addVideoDeviceInput() -> Bool {
/**Tries to get the default back camera.
*/
guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
return false
}
do {
let videoDeviceInput = try AVCaptureDeviceInput(device: camera)
if session.canAddInput(videoDeviceInput) {
session.addInput(videoDeviceInput)
return true
}
else {
return false
}
}
catch {
fatalError("Cannot create video device input")
}
}
/**
This method tries to add an AVCaptureVideoDataOutput to the current AVCaptureSession.
*/
private func addVideoDataOutput() -> Bool {
let sampleBufferQueue = DispatchQueue(label: "sampleBufferQueue")
videoDataOutput.setSampleBufferDelegate(self, queue: sampleBufferQueue)
videoDataOutput.alwaysDiscardsLateVideoFrames = true
videoDataOutput.videoSettings = [ String(kCVPixelBufferPixelFormatTypeKey) : kCMPixelFormat_32BGRA]
if session.canAddOutput(videoDataOutput) {
session.addOutput(videoDataOutput)
videoDataOutput.connection(with: .video)?.videoOrientation = .portrait
return true
}
return false
}
// MARK: Notification Observer Handling
private func addObservers() {
NotificationCenter.default.addObserver(self, selector: #selector(CameraFeedManager.sessionRuntimeErrorOccured(notification:)), name: NSNotification.Name.AVCaptureSessionRuntimeError, object: session)
NotificationCenter.default.addObserver(self, selector: #selector(CameraFeedManager.sessionWasInterrupted(notification:)), name: NSNotification.Name.AVCaptureSessionWasInterrupted, object: session)
NotificationCenter.default.addObserver(self, selector: #selector(CameraFeedManager.sessionInterruptionEnded), name: NSNotification.Name.AVCaptureSessionInterruptionEnded, object: session)
}
private func removeObservers() {
NotificationCenter.default.removeObserver(self, name: NSNotification.Name.AVCaptureSessionRuntimeError, object: session)
NotificationCenter.default.removeObserver(self, name: NSNotification.Name.AVCaptureSessionWasInterrupted, object: session)
NotificationCenter.default.removeObserver(self, name: NSNotification.Name.AVCaptureSessionInterruptionEnded, object: session)
}
// MARK: Notification Observers
@objc func sessionWasInterrupted(notification: Notification) {
if let userInfoValue = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as AnyObject?,
let reasonIntegerValue = userInfoValue.integerValue,
let reason = AVCaptureSession.InterruptionReason(rawValue: reasonIntegerValue) {
print("Capture session was interrupted with reason \(reason)")
var canResumeManually = false
if reason == .videoDeviceInUseByAnotherClient {
canResumeManually = true
} else if reason == .videoDeviceNotAvailableWithMultipleForegroundApps {
canResumeManually = false
}
self.delegate?.sessionWasInterrupted(canResumeManually: canResumeManually)
}
}
@objc func sessionInterruptionEnded(notification: Notification) {
self.delegate?.sessionInterruptionEnded()
}
@objc func sessionRuntimeErrorOccured(notification: Notification) {
guard let error = notification.userInfo?[AVCaptureSessionErrorKey] as? AVError else {
return
}
print("Capture session runtime error: \(error)")
if error.code == .mediaServicesWereReset {
sessionQueue.async {
if self.isSessionRunning {
self.startSession()
} else {
DispatchQueue.main.async {
self.delegate?.sessionRunTimeErrorOccured()
}
}
}
} else {
self.delegate?.sessionRunTimeErrorOccured()
}
}
}
/**
AVCaptureVideoDataOutputSampleBufferDelegate
*/
extension CameraFeedManager: AVCaptureVideoDataOutputSampleBufferDelegate {
/** This method delegates the CVPixelBuffer of the frame seen by the camera currently.
*/
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
// Converts the CMSampleBuffer to a CVPixelBuffer.
let pixelBuffer: CVPixelBuffer? = CMSampleBufferGetImageBuffer(sampleBuffer)
guard let imagePixelBuffer = pixelBuffer else {
return
}
// Delegates the pixel buffer to the ViewController.
delegate?.didOutput(pixelBuffer: imagePixelBuffer)
}
}
PlayerController:
import Foundation
import UIKit
import AVFoundation
import AVKit
class PlayerController : UIViewController {
override func viewDidLoad() {
super.viewDidLoad()
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
guard let homeDirectory = FileManager.default.urls(for: .desktopDirectory, in: .userDomainMask).first else { return }
let url = URL(fileURLWithPath: homeDirectory.absoluteString + "/mymovie.mov")
print(url.absoluteString)
let player = AVPlayer(url: url) // video path coming from above function
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true) {
playerViewController.player!.play()
}
}
}
The solution was to create the path using:
private func documentDirectory() -> String {
let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory,
.userDomainMask,
true)
return documentDirectory[0]
}
private func append(toPath path: String,
withPathComponent pathComponent: String) -> String? {
if var pathURL = URL(string: path) {
pathURL.appendPathComponent(pathComponent)
return pathURL.absoluteString
}
return nil
}
and
guard let path = append(toPath: documentDirectory(), withPathComponent: "movie_test.mov") else {return}
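For completeness, an equivalent and slightly more direct way to build the same Documents-directory URL (a sketch; "movie_test.mov" is just the file name from the snippet above) would be:
import Foundation
// Both the recorder and the player derive the URL the same way, so they always agree.
func recordingURL() -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return documents.appendingPathComponent("movie_test.mov")
}
// CameraFeedManager.startRecording() could then record to that URL:
//     session.addOutput(movieDataOutput)
//     movieDataOutput.startRecording(to: recordingURL(), recordingDelegate: self)
// and PlayerController.viewDidAppear(_:) could play back the same file:
//     let player = AVPlayer(url: recordingURL())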

Adding Background image to Firebase SignIn Screen in Xcode 9.2

I recently bought the source code for a social media app that uses Firebase for the sign up/log in page, but the log in page has no background image and the sign up buttons are at the bottom, leaving the rest of the screen blank.
I'm a newbie when it comes to coding in Xcode, so I hope you can help me with adding a background image.
Currently I have 2 files that control the Auth screen (Authclient.swift & WelcomeViewController.swift).
I've been going through the files and it looks like "WelcomeViewController.swift" controls the sign in screen... This is the code I have in that file:
import UIKit
import SwiftHEXColors
import Firebase
import FirebaseAuth
import FirebaseAuthUI
import FirebaseGoogleAuthUI
import FirebaseFacebookAuthUI
import FirebaseTwitterAuthUI
import FirebasePhoneAuthUI
class WelcomeViewController: UIViewController, FUIAuthDelegate {
@IBOutlet weak var progressView:UIView? // view shown while data is loading
@IBOutlet weak var welcomeView:UIView? // view when data is loaded. like sign-in or intro
var client:AuthClient?
override func viewDidLoad() {
self.welcomeView?.isHidden = true
self.progressView?.isHidden = false
let config = RemoteConfig.remoteConfig()
#if DEBUG
config.configSettings = RemoteConfigSettings(developerModeEnabled: true)
#endif
config.fetch(withExpirationDuration: 100) { (status, error) -> Void in
if status == .success {
print("Config fetched!")
config.activateFetched()
} else {
print("Config not fetched")
print("Error: \(error?.localizedDescription ?? "No error available.")")
}
self.defineTheme(config)
self.welcomeView?.isHidden = false
self.progressView?.isHidden = true
// if user authorized, go to main page
if (Auth.auth().currentUser) != nil {
self.performSegue(withIdentifier: "auth.mute", sender: nil)
} else {
self.buttonPressed(self)
}
}
}
// FIRAuthUIDelegate
func authUI(_ authUI: FUIAuth, didSignInWith user: User?, error: Error?) {
if let errorHandler = error as NSError? {
self.showError(errorHandler.localizedDescription)
// print user-info. find more here: https://firebase.google.com/docs/auth/ios/errors
print(errorHandler.userInfo)
} else {
if let currentUser = user {
// update displayname and photo
let name = currentUser.displayName ?? kDefaultUsername
let photo = currentUser.photoURL?.absoluteString ?? kDefaultProfilePhoto
client?.saveUser(userId: currentUser.uid,
name: name,
photo: photo,
override: false)
//user?.sendEmailVerification(completion: nil)
}
self.performSegue(withIdentifier: "auth", sender: nil)
}
}
// Helpers
func showError(_ error:String) {
print("Error: \(error)")
let alert = UIAlertController(title: kAlertErrorTitle, message: error, preferredStyle: .alert)
alert.addAction(UIAlertAction(title: kAlertErrorDefaultButton, style: .default) { (action) in })
self.present(alert, animated: true) {}
}
func defineTheme(_ config:RemoteConfig) {
var primary = UIColor.white
var secondary = UIColor.blue
if let string = config[kPrimaryColor].stringValue, !string.isEmpty {
primary = UIColor(hexString: string)!
}
if let string = config[kSecondaryColor].stringValue, !string.isEmpty {
secondary = UIColor(hexString: string)!
}
UINavigationBar.appearance().barTintColor = primary
UINavigationBar.appearance().tintColor = secondary
UIBarButtonItem.appearance().setTitleTextAttributes(
[NSAttributedStringKey.foregroundColor:secondary], for: UIControlState.normal)
UITabBar.appearance().barTintColor = primary
UITabBar.appearance().tintColor = secondary
UIButton.appearance().tintColor = secondary
}
// Actions
@IBAction func buttonPressed(_ sender: AnyObject) {
let authUI = FUIAuth.defaultAuthUI()
authUI?.delegate = self
/*
* Uncomment these lines to add Google and Facebook authorization. But first
* enable it in the Firebase Console. You can find more information here:
* https://firebase.google.com/docs/auth/ios/google-signin
* https://firebase.google.com/docs/auth/ios/facebook-login
*/
let providers: [FUIAuthProvider] = [
// FUIGoogleAuth(),
// FUIFacebookAuth(),
// FUITwitterAuth(),
FUIPhoneAuth(authUI:authUI!),
]
authUI?.providers = providers
/*
kEulaUrl needs to be set in Config.swift file. required for publishing
*/
authUI?.tosurl = URL(string:kEulaUrl)
if (Auth.auth().currentUser) != nil {
self.performSegue(withIdentifier: "auth.mute", sender: nil)
} else {
let authViewController = authUI!.authViewController()
self.present(authViewController, animated: true) {
// ..
}
}
}
}
Can anyone point me in the right direction for adding a background image to this screen? I already have my 3 images in Assets.xcassets, named bgLogin.imageset.
Thanks
This is what you want to do.
Create an extension of their baseViewController
extension FUIAuthBaseViewController {
Inside of that extension, override their viewWillAppear() and set the image there
open override func viewWillAppear(_ animated: Bool) {
self.navigationItem.leftBarButtonItem = nil
self.view.backgroundColor = .white
// if view is base view add logo as subview
let vc = self.navigationController?.viewControllers.first
if vc == self.navigationController?.visibleViewController {
makeLogoImage()
} else {
// hide the image in subsequent views by covering it with a white background
vc?.view.backgroundColor = .white
}
}
/**
Create imageView and display it at the top of the screen.
*/
func makeLogoImage() {
let imageView = UIImageView(image: UIImage(named: "angel.png"))
let width = view.frame.width
let height = view.frame.height
imageView.frame = CGRect(x: width / 4, y: height / 8 , width: width / 2, height: width / 2)
imageView.contentMode = .scaleAspectFill
self.view.addSubview(imageView)
self.view.sendSubview(toBack: imageView)
}
}
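Since you mentioned a bgLogin image set, a full-screen background variant of the same idea (a sketch, assuming the asset is named bgLogin) would be to add this helper next to makeLogoImage() in the same extension and call it from viewWillAppear(_:):
func makeBackgroundImage() {
    let imageView = UIImageView(image: UIImage(named: "bgLogin"))
    imageView.frame = view.bounds
    imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    imageView.contentMode = .scaleAspectFill
    view.addSubview(imageView)
    // Keep the image behind the Firebase UI controls, same as the logo above.
    view.sendSubview(toBack: imageView)
}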

Swift background mode for BLE iOS9

I want to improve the MPCRevisited project, which is a chat app that uses the Multipeer Connectivity framework. I'm using BLE to connect one device to another (iPad and iPod) and to send and receive data. However, when I press the home button to put one device into the background, after 5 seconds I can't send or receive data anymore.
I've already checked everything in Background Modes, but it's still not working at all.
import UIKit
import MultipeerConnectivity
class ParkBenchTimer {
let startTime:CFAbsoluteTime
var endTime:CFAbsoluteTime?
init() {
startTime = CFAbsoluteTimeGetCurrent()
}
func stop() -> CFAbsoluteTime {
endTime = CFAbsoluteTimeGetCurrent()
return duration!
}
var duration:CFAbsoluteTime? {
if let endTime = endTime {
return endTime - startTime
} else {
return nil
}
}
}
class ChatViewController: UIViewController, UITextFieldDelegate, UITableViewDelegate, UITableViewDataSource {
@IBOutlet weak var chatTextField: UITextField!
@IBOutlet weak var chatTableView: UITableView!
var messagesArray: [[String : String]] = []
let mpcManager = MPCManager.sharedInstance
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
self.chatTableView.delegate = self
self.chatTableView.dataSource = self
self.chatTableView.estimatedRowHeight = 60.0
self.chatTableView.rowHeight = UITableViewAutomaticDimension
self.chatTextField.delegate = self
self.mpcManager.messageRecievedDelegate = self
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
/*
// MARK: - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
// Get the new view controller using segue.destinationViewController.
// Pass the selected object to the new view controller.
}
*/
// MARK: IBAction method implementation
@IBAction func endChat(sender: AnyObject) {
let messageDictionary: [String: String] = ["message": "_end_chat_"]
if self.mpcManager.sendData(dictionaryWithData: messageDictionary, toPeer: self.mpcManager.session.connectedPeers[0] as MCPeerID){
self.dismissViewControllerAnimated(true, completion: { () -> Void in
self.mpcManager.session.disconnect()
})
}
}
// MARK: UITableView related method implementation
func numberOfSectionsInTableView(tableView: UITableView) -> Int {
return 1
}
func tableView(tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
return self.messagesArray.count
}
func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
guard let cell = tableView.dequeueReusableCellWithIdentifier("idCell") else {
assert(true)
return UITableViewCell()
}
guard let currentMessage = self.messagesArray[safe: indexPath.row] else {
print(" ")
assert(true)
return UITableViewCell()
}
if let sender = currentMessage["sender"] {
var senderLabelText: String
var senderColor: UIColor
if sender == "self" {
senderLabelText = "I said:"
senderColor = UIColor.purpleColor()
} else {
senderLabelText = sender + " said:"
senderColor = UIColor.orangeColor()
}
cell.detailTextLabel?.text = senderLabelText
cell.detailTextLabel?.textColor = senderColor
}
if let message = currentMessage["message"] {
cell.textLabel?.text = message
}
return cell
}
// MARK: UITextFieldDelegate method implementation
func textFieldShouldReturn(textField: UITextField) -> Bool {
textField.resignFirstResponder()
guard let textFieldText = textField.text else {
assert(true)
return false
}
let messageDictionary: [String: String] = ["message": textFieldText]
guard let connectedPeer = self.mpcManager.session.connectedPeers[safe: 0] else {
print(" ")
assert(true)
return false
}
if self.mpcManager.sendData(dictionaryWithData: messageDictionary, toPeer: connectedPeer) {
let dictionary = ["sender": "self", "message": textFieldText]
self.messagesArray.append(dictionary)
self.updateTableview()
} else {
print("Could not send data")
}
textField.text = ""
return true
}
// MARK: Custom method implementation
func updateTableview(){
chatTableView.reloadData()
if self.chatTableView.contentSize.height > self.chatTableView.frame.size.height {
let indexPathToScrollTo = NSIndexPath(forRow: messagesArray.count - 1, inSection: 0)
self.chatTableView.scrollToRowAtIndexPath(indexPathToScrollTo, atScrollPosition: .Bottom, animated: true)
}
}
}
extension ChatViewController : MPCManagerRecievedMessageDelegate {
func managerRecievedData(data:NSData ,fromPeer:MCPeerID) {
// Convert the data (NSData) into a Dictionary object.
let dataDictionary = NSKeyedUnarchiver.unarchiveObjectWithData(data) as! [String : String]
// Check if there's an entry with the "message" key.
if let message = dataDictionary["message"] {
// Make sure that the message is other than "_end_chat_".
if message != "_end_chat_"{
// Create a new dictionary and set the sender and the received message to it.
let messageDictionary: [String: String] = ["sender": fromPeer.displayName, "message": message]
// Add this dictionary to the messagesArray array.
messagesArray.append(messageDictionary)
// Reload the tableview data and scroll to the bottom using the main thread.
self.updateTableview()
} else {
}
}
}
func managerDidRecievedMessage(message: String, fromPeer: MCPeerID) {
// Create a new dictionary and set the sender and the received message to it.
//let messageDictionary: [String: String] = ["sender": fromPeer.displayName, "message": message]
// Add this dictionary to the messagesArray array.
//messagesArray.append(messageDictionary)
// Reload the tableview data and scroll to the bottom using the main thread.
//self.updateTableview()
}
func managerDidEndChat(fromPeer:MCPeerID) {
// In this case an "_end_chat_" message was received.
// Show an alert view to the user.
let alert = UIAlertController(title: "", message: "\(fromPeer.displayName) ended this chat.", preferredStyle: UIAlertControllerStyle.Alert)
let doneAction: UIAlertAction = UIAlertAction(title: "Okay", style: UIAlertActionStyle.Default) { (alertAction) -> Void in
self.mpcManager.session.disconnect()
self.dismissViewControllerAnimated(true, completion: nil)
}
alert.addAction(doneAction)
self.presentViewController(alert, animated: true, completion: nil)
}
}
This is my code.
Please help me if someone knows this problem. What I want is for one device to keep sending messages while the other device goes back and forth between background and foreground.
Thank you.
Looking at some other StackOverflow posts (here and here), it seems the Multipeer Connectivity framework is not built to function in the background, and its functionality will disappear after a couple of minutes.
Bluetooth will function in the background, with the capabilities that you checked, but you will have to create your own messaging platform; even though Multipeer relies partially on Bluetooth, the capabilities are separate entities.
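To give a rough idea of what "create your own messaging platform" over CoreBluetooth means, here is a minimal peripheral-side sketch: one writable characteristic that peers write chat messages to. The service and characteristic UUIDs are placeholders, and this is Swift 3/4 syntax rather than the Swift 2 of your project; it also assumes the bluetooth-peripheral background mode is enabled.
import CoreBluetooth
class ChatPeripheral: NSObject, CBPeripheralManagerDelegate {
    private let serviceUUID = CBUUID(string: "0000FF10-0000-1000-8000-00805F9B34FB")
    private let messageUUID = CBUUID(string: "0000FF11-0000-1000-8000-00805F9B34FB")
    private var manager: CBPeripheralManager!
    override init() {
        super.init()
        manager = CBPeripheralManager(delegate: self, queue: nil)
    }
    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        guard peripheral.state == .poweredOn else { return }
        let characteristic = CBMutableCharacteristic(type: messageUUID,
                                                     properties: [.write, .notify],
                                                     value: nil,
                                                     permissions: [.writeable])
        let service = CBMutableService(type: serviceUUID, primary: true)
        service.characteristics = [characteristic]
        peripheral.add(service)
        // Advertising service UUIDs keeps the peripheral discoverable while backgrounded;
        // the local name is dropped from the advertisement in the background.
        peripheral.startAdvertising([CBAdvertisementDataServiceUUIDsKey: [serviceUUID]])
    }
    func peripheralManager(_ peripheral: CBPeripheralManager, didReceiveWrite requests: [CBATTRequest]) {
        for request in requests {
            if let data = request.value, let text = String(data: data, encoding: .utf8) {
                print("Received message: \(text)")
            }
            peripheral.respond(to: request, withResult: .success)
        }
    }
}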

EXC_BAD_ACCESS crash in Xcode

I'm getting this crash in my app: link. I did a lot of searching about it but didn't find how to solve it.
I also enabled Zombie Objects and the Xcode Analyzer, but without success.
Here is my code:
import UIKit
class CameraViewController: UIViewController{
#IBOutlet weak var playerContainer: UIView!
#IBOutlet weak var imageView: UIImageView!
#IBOutlet weak var loading: UIActivityIndicatorView!
var device : GTLUserendpointGeneralCamera!
var myfoxCam : CloseliCameraDevice!
var retryToPlayCount = 0
override func viewDidLoad() {
super.viewDidLoad()
NSNotificationCenter.defaultCenter().addObserver(self,selector: "playerStatusChanged:",name: CameraPlayerStatusChangedNotification,object: nil)
NSNotificationCenter.defaultCenter().addObserver(self,selector: "playerStoppedDispose:",name: CameraPlayerStoppedNotification,object: nil)
}
override func viewDidAppear(animated: Bool) {
if(device.brand == "myfox"){
showLive()
}
}
override func viewWillDisappear(animated: Bool) {
MyFoxManager.sharedInstance.destroyPlayer()
}
deinit {
NSNotificationCenter.defaultCenter().removeObserver(self)
}
func showLive(){
loading.startAnimating()
MyFoxManager.sharedInstance.showLive(myfoxCam , image: self.imageView)
}
func playerStatusChanged(notification: NSNotification){
let waiting = notification.userInfo!["NSLocalizedDescription"] as! Bool
dispatch_async(dispatch_get_main_queue()) { [weak self] in
if let strongSelf = self {
if(waiting){
strongSelf.loading.startAnimating()
}else{
strongSelf.retryToPlayCount = 0
strongSelf.loading.stopAnimating()
}
}
}
}
func playerStoppedDispose(notification: NSNotification){
let code = notification.userInfo!["NSLocalizedFailureReason"] as! Int
dispatch_async(dispatch_get_main_queue()) { [weak self] in
if let strongSelf = self {
if(code & strongSelf.retryToPlayCount < 4){
strongSelf.retryToPlayCount = strongSelf.retryToPlayCount + 1
strongSelf.performSelector("showLive", withObject: nil, afterDelay: 2.0)
}
}
}
}
}
And
import UIKit
import Foundation
import CoreBluetooth
class MyFoxManager: NSObject {
static let sharedInstance = MyFoxManager()
var closeSDK : CloseliSDK!
var login = false
private override init() {
closeSDK = CloseliSDK(productKey: MyFoxAppId, withPassword: MyFoxAppSecret, serverType: "us")
}
func getCamera(device: GTLUserendpointGeneralCamera) -> CloseliCameraDevice? {
if(!login){
do {
try closeSDK.loginWithToken(device.appUrl, withAccount: device.url)
login = true
}catch{
return nil
}
}
do {
let myfoxCameras = try closeSDK.getCameraListError()
let myfoxCamera = myfoxCameras.filter{$0.deviceUUID == device.idDevice}
return myfoxCamera[0] as? CloseliCameraDevice
}catch {
return nil
}
}
func showLive(myfoxCam: CloseliCameraDevice, image: UIImageView){
closeSDK.preparetoLivePreview(myfoxCam , withUI: image)
}
func destroyPlayer(){
closeSDK.destoryPlayer()
}
}
Notification definition from third-party library :
/**
Notification for live preview status changing.
NOTICE: The notification may not be sent by CloseliSDK, so please fill parameter with sender to nil when adding observer
@param userInfo in NSNotification is a NSDictionary, value for NSLocalizedDescriptionKey is a NSNumber, YES means live preview is buffering, NO means live preview begin.
BOOL bWaiting = [[[notification userInfo] valueForKey:NSLocalizedDescriptionKey] boolValue];
*/
extern NSString *const CameraPlayerStatusChangedNotification;
/**
Notification for player stopped.
NOTICE: The notification may not be sent by CloseliSDK, so please fill parameter with sender to nil when adding observer
@param userInfo in NSNotification is a NSDictionary, value for NSLocalizedFailureReasonErrorKey is a NSNumber, represent the reason why it stopped, normally 0.
long stopReason = [[[notification userInfo] valueForKey:NSLocalizedFailureReasonErrorKey] longValue];
stopReason: 0x3261-means there is another client playing in p2p mode.
*/
extern NSString *const CameraPlayerStoppedNotification;
Any idea?
Thanks in advance

Add initial note

I am looking at adding an initial note to the notes page within my app. This is so that when people go to the notes section there will be some detail on how to use it, rather than just a big empty screen. I have no idea where to implement this, though. Could you please help? Below is the page where it talks about the dictionaries.
import UIKit
import MessageUI
class DetailViewController: UIViewController, MFMailComposeViewControllerDelegate, UITextViewDelegate {
@IBOutlet weak var tView: UITextView!
@IBAction func BarButton(sender: UIBarButtonItem) {
let textToShare = ""
if let myWebsite = NSURL(string: "")
{
let objectsToShare = [textToShare, myWebsite]
let activityVC = UIActivityViewController(activityItems: objectsToShare, applicationActivities: nil)
self.presentViewController(activityVC, animated: true, completion: nil)
}
OpenMail()
}
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view, typically from a nib.
tView.text = (allNotes[currentNoteIndex] as Note).note
tView.becomeFirstResponder()
// Set controller as swipe gesture recogniser, to allow keyboard dismissal for text box
var swipe: UISwipeGestureRecognizer = UISwipeGestureRecognizer(target: self, action: "dismissKeyboard")
swipe.direction = UISwipeGestureRecognizerDirection.Down
self.view.addGestureRecognizer(swipe)
self.tView.delegate = self
}
override func viewWillDisappear(animated: Bool) {
super.viewWillDisappear(animated)
if tView.text == "" {
allNotes.removeAtIndex(currentNoteIndex)
}
else {
(allNotes[currentNoteIndex] as Note).note = tView.text
}
Note.saveNotes()
noteTable?.reloadData()
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
func configuredMailComposeViewController() -> MFMailComposeViewController {
// Open mail controller on screen and prepare with preset values.
let mailComposerVC = MFMailComposeViewController()
var MessageText: String!
MessageText = tView.text
mailComposerVC.mailComposeDelegate = self
mailComposerVC.setToRecipients([""])
mailComposerVC.setSubject("")
mailComposerVC.setMessageBody(MessageText, isHTML: false)
return mailComposerVC
}
func showSendMailErrorAlert() {
// Alert user to email error
let sendMailErrorAlert = UIAlertView(title: "Could Not Send Email", message: "Your device could not send e-mail. Please check e-mail configuration and try again.", delegate: self, cancelButtonTitle: "OK")
sendMailErrorAlert.show()
}
// MARK: MFMailComposeViewControllerDelegate Method
func mailComposeController(controller: MFMailComposeViewController!, didFinishWithResult result: MFMailComposeResult, error: NSError!) {
controller.dismissViewControllerAnimated(true, completion: nil)
}
func OpenMail() {
//Function to open mail composer on screen
let mailComposeViewController = configuredMailComposeViewController()
if MFMailComposeViewController.canSendMail() {
self.presentViewController(mailComposeViewController, animated: true, completion: nil)
} else {
self.showSendMailErrorAlert()
}
}
func dismissKeyboard() {
// Dismiss keyboard for textfield
self.tView.resignFirstResponder()
}
}
note.swift
import UIKit
var allNotes:[Note] = []
var currentNoteIndex:NSInteger = -1
var noteTable:UITableView?
let KAllNotes:String = "notes"
class Note: NSObject {
var date:String
var note:String
override init() {
date = NSDate().description
note = ""
}
func dictionary() -> NSDictionary {
return ["note":note, "date":date]
}
class func saveNotes() {
var aDictionaries:[NSDictionary] = []
for (var i:NSInteger = 0; i < allNotes.count; i++) {
aDictionaries.append(allNotes[i].dictionary())
}
NSUserDefaults.standardUserDefaults().setObject(aDictionaries, forKey: KAllNotes)
// aDictionaries.writeToFile(filePath(), atomically: true)
}
class func loadnotes() {
allNotes.removeAll(keepCapacity: true)
var defaults:NSUserDefaults = NSUserDefaults.standardUserDefaults()
var savedData:[NSDictionary]? = defaults.objectForKey(KAllNotes) as? [NSDictionary]
// var savedData:NSArray? = NSArray(contentsOfFile: filePath())
if let data:[NSDictionary] = savedData {
for (var i:NSInteger = 0; i < data.count; i++) {
var n:Note = Note()
n.setValuesForKeysWithDictionary(data[i] as [NSObject : AnyObject])
allNotes.append(n)
}
}
}
class func filePath() -> String {
var d:[String]? = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.AllDomainsMask, true) as? [String]
if let directories:[String] = d {
var docsDirectory:String = directories[0]
var path:String = docsDirectory.stringByAppendingPathComponent("\(KAllNotes).notes")
return path;
}
return ""
}
}
Thanks in advance
Sam
Add an NSUserDefaults boolean that stores whether or not the initial note should be shown, e.g. whether the app has been launched for the first time. Then load an initial note accordingly. When a note is added or the initial note is deleted, change the boolean accordingly so the initial note doesn't show up next time.
You could also initialize your database with an initial note. Not clear from your code how the notes are saved, but this approach would probably rely on the NSUserDefault approach above, except it could be done in the AppDelegate or something.
example:
let InitialSetupComplete = "InitialSetupComplete" // Note: I would define this at the top of a file
let defaults = NSUserDefaults.standardUserDefaults()
if !defaults.boolForKey(InitialSetupComplete) {
// Show initial note
}
// Later on when the note is deleted, or modified (or immediately after initial note loaded into the database, see below)
defaults.setBool(true, forKey: InitialSetupComplete)
It would be easier/cleaner just to initialize your database with the initial note in the app delegate (e.g. call it within applicationDidFinishLaunching), so your view controller doesn't have to figure this out. Similar code, except you would call setBool right away after the initial note has been saved to the database. I don't know anything about your database from the question, so I can't really provide a more detailed example than this. Hope this helps.
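A sketch of that app delegate approach, reusing your Note class and the InitialSetupComplete flag from above (Swift 2 syntax to match your code; the welcome text is just an example):
func application(application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
    let defaults = NSUserDefaults.standardUserDefaults()
    if !defaults.boolForKey(InitialSetupComplete) {
        Note.loadnotes()
        let welcome = Note()
        welcome.note = "Welcome! Tap + to add a note, and swipe down to dismiss the keyboard."
        allNotes.append(welcome)
        Note.saveNotes()
        // Mark setup as done right after the initial note is saved.
        defaults.setBool(true, forKey: InitialSetupComplete)
    }
    return true
}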
