Problem:
I am currently facing a problem with developing an iOS Mobile Application in Swift that utilizes:
BTLE: Connecting to a peripheral device and sending/receiving data to/from it.
Networking: If the peripheral is connected to a network (wireless and/or ethernet), then the communication over BTLE "could" instead happen over the network.
Model-View-ViewModel architecture
RxSwift
About the App:
It starts with a Bluetooth Setup view, which walks the user through the process of pairing with the peripheral device (disjoint from the TabBarController).
After successfully pairing with the device, the iOS App requests all configuration from the device, which is sent as JSON.
This JSON contains the different Model information (programming) that the App displays to the user for manipulation, and it needs to be stored in an array, somehow in a Singleton manner, so that a view model can request any index for display to the user (see the rough sketch after this list).
After all the data is received, the Bluetooth View dismisses and the TabBarViews are presented.
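Roughly, the kind of shared store I have in mind is something like this (just a sketch, with made-up names, to show what I mean by a view model requesting any index):

import RxSwift

// Hypothetical model parsed from the device's configuration JSON.
struct ProgramModel {
    let name: String
}

// Sketch of a shared store: a single place holding the parsed models,
// which any view model can observe as a whole or by index.
final class ConfigurationStore {
    static let shared = ConfigurationStore()
    private init() {}

    // Starts empty; the Bluetooth layer pushes the parsed array here.
    let models = BehaviorSubject<[ProgramModel]>(value: [])

    // Emits the model at `index` whenever the array changes and the index is valid.
    func model(at index: Int) -> Observable<ProgramModel> {
        return models
            .filter { $0.indices.contains(index) }
            .map { $0[index] }
    }
}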
Current Examples:
A good example to relate this App to would be the Apple Watch and its companion iOS App that allows you to configure everything. I have to implement roughly the same concept.
Another good example is the app from this blog post, where they are doing something similar to what I am trying to achieve. The difference I am running into, though, is their dependency-injection setup for MVVM (as well as in other similar examples): I've used a storyboard, whereas they instantiate their view controllers programmatically in the AppDelegate.
And my problem...
How can I pass the data (efficiently) from the BluetoothView to the TabBarView without NSNotifications or prepareForSegue? Keep in mind that I intend to use the RxSwift library for asynchronous event handling and event/data streams, and I am trying to keep this App as stateless as possible.
Are the Servers in this blog post a good practice for retrieving view-models and/or updating them?
I find that, when using RxSwift, the "view-model" ends up being a single pure function that takes observable parameters from the input UI parameters and returns observables that are then bound to the output UI elements.
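For instance, a tiny sketch of that shape (the names are made up, not from your app) might be:

import RxSwift

// A "view model" as a pure function: UI input observables in, UI output observables out.
func pairingViewModel(deviceName: Observable<String>, isConnected: Observable<Bool>)
    -> (statusText: Observable<String>, isContinueEnabled: Observable<Bool>) {
    let statusText = Observable
        .combineLatest(deviceName, isConnected) { name, connected in
            connected ? "Connected to \(name)" : "Searching for \(name)..."
        }
    return (statusText: statusText, isContinueEnabled: isConnected)
}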
Something that really helped me wrap my head around Rx was the tutorial videos for cycle.js.
As for your specific conundrum...
What you are doing doesn't have to be "forward" movement. Look at it this way... The TabBarView needs some data, and it doesn't care where that data comes from. So give the TabBarView access to a function that returns an observable which contains the necessary data. That function will present the Bluetooth View, make the connection, get the necessary data, then dismiss the Bluetooth View and call onNext with the required data.
Looking at this gist might help get across what I'm talking about. Granted, the gist uses PromiseKit instead of RxSwift, but the same principle can be used (instead of fulfill, you would want to call onNext and then onCompleted). In the gist, the view controller that needs the data simply calls a function and subscribes to the result (in this case, the result contains a UIImage). It is the function's job to determine what image sources are available, ask the user which source they want to retrieve the image from, and present the appropriate view controller to get the image.
The current contents of the gist are below:
//
// UIViewController+GetImage.swift
//
// Created by Daniel Tartaglia on 4/25/16.
// Copyright © 2016 MIT License
//
import UIKit
import PromiseKit
enum ImagePickerError: ErrorType {
case UserCanceled
}
extension UIViewController {
func getImage(focusView view: UIView) -> Promise<UIImage> {
let proxy = ImagePickerProxy()
let cameraAction: UIAlertAction? = !UIImagePickerController.isSourceTypeAvailable(.Camera) ? nil : UIAlertAction(title: "Camera", style: .Default) { _ in
let controller = UIImagePickerController()
controller.delegate = proxy
controller.allowsEditing = true
controller.sourceType = .Camera
self.presentViewController(controller, animated: true, completion: nil)
}
let photobinAction: UIAlertAction? = !UIImagePickerController.isSourceTypeAvailable(.PhotoLibrary) ? nil : UIAlertAction(title: "Photos", style: .Default) { _ in
let controller = UIImagePickerController()
controller.delegate = proxy
controller.allowsEditing = false
controller.sourceType = .PhotoLibrary
self.presentViewController(controller, animated: true, completion: nil)
}
let cancelAction = UIAlertAction(title: "Cancel", style: .Cancel, handler: nil)
let alert = UIAlertController(title: nil, message: nil, preferredStyle: .ActionSheet)
if let cameraAction = cameraAction {
alert.addAction(cameraAction)
}
if let photobinAction = photobinAction {
alert.addAction(photobinAction)
}
alert.addAction(cancelAction)
let popoverPresentationController = alert.popoverPresentationController
popoverPresentationController?.sourceView = view
popoverPresentationController?.sourceRect = view.bounds
presentViewController(alert, animated: true, completion: nil)
let promise = proxy.promise
return promise.always {
self.dismissViewControllerAnimated(true, completion: nil)
proxy.retainCycle = nil
}
}
}
private final class ImagePickerProxy: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
let (promise, fulfill, reject) = Promise<UIImage>.pendingPromise()
var retainCycle: ImagePickerProxy?
required override init() {
super.init()
retainCycle = self
}
@objc func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
let image = (info[UIImagePickerControllerEditedImage] as? UIImage) ?? (info[UIImagePickerControllerOriginalImage] as! UIImage)
fulfill(image)
}
@objc func imagePickerControllerDidCancel(picker: UIImagePickerController) {
reject(ImagePickerError.UserCanceled)
}
}
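For comparison, here is a rough RxSwift sketch of the same idea applied to the Bluetooth case in the question. Everything here (the setup controller, its callback, the configuration type) is a hypothetical stand-in, not a definitive implementation:

import UIKit
import RxSwift

// Hypothetical configuration payload decoded from the peripheral's JSON.
struct DeviceConfiguration {
    let models: [String]
}

// Hypothetical setup screen that performs the BLE pairing and calls back with the parsed result.
final class BluetoothSetupViewController: UIViewController {
    var onConfigurationReceived: ((DeviceConfiguration) -> Void)?
}

extension UIViewController {
    // Presents the setup screen, emits the configuration once it arrives,
    // and dismisses the screen when the subscription is disposed.
    func fetchConfiguration() -> Observable<DeviceConfiguration> {
        return Observable.create { [weak self] observer in
            let setup = BluetoothSetupViewController()
            setup.onConfigurationReceived = { configuration in
                observer.onNext(configuration)
                observer.onCompleted()
            }
            self?.present(setup, animated: true, completion: nil)
            return Disposables.create {
                setup.dismiss(animated: true, completion: nil)
            }
        }
    }
}

// The tab bar side only knows that it gets an Observable; it does not care where the data came from:
// viewController.fetchConfiguration().subscribe(onNext: { configuration in /* bind to the tabs */ })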
Related
I'm a newbie in iOS development, so some things I show and ask here may be silly; please don't be angry :) I need to add support for picking files from local storage in my app. This feature will be used for picking a file, encoding it to Base64, and then sending it to a remote server. Right now I have some problems adding this functionality to my app. I found this tutorial and did everything that was mentioned there:
added the import - import MobileCoreServices
added the delegate conformance - UIDocumentPickerDelegate
added this code for showing the picker:
let documentPicker = UIDocumentPickerViewController(documentTypes: [String(kUTTypeText),String(kUTTypeContent),String(kUTTypeItem),String(kUTTypeData)], in: .import)
documentPicker.delegate = self
self.present(documentPicker, animated: true)
and also added a handler for the selected file:
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
print(urls)
}
In general the file chooser appears on the simulator screen, but I see a warning in Xcode:
'init(documentTypes:in:)' was deprecated in iOS 14.0
I visited the official documentation and there also found similar info about the deprecation of this method. So, how can I solve my file-choosing problem in a way that is fully compatible with the latest iOS version? And another question: how can I then encode the selected file? Right now I am able to choose a file and print its location, but I need to get its data, such as the name and content, for encoding. Maybe someone has faced similar problems and knows a solution? I need to add it to an ordinary view controller, so when I tried to add this inheritance:
UIDocumentPickerViewController
I saw this error message:
Multiple inheritance from classes 'UIViewController' and 'UIDocumentPickerViewController'
I would be grateful for any info: tutorials or advice :)
I decided to post my own solution to my problem. As I am new to iOS development, my answer may contain some logical problems :) First I added a dialogue for choosing the file type after the Attach button is pressed:
@IBAction func attachFile(_ sender: UIBarButtonItem) {
let attachSheet = UIAlertController(title: nil, message: "File attaching", preferredStyle: .actionSheet)
attachSheet.addAction(UIAlertAction(title: "File", style: .default,handler: { (action) in
let supportedTypes: [UTType] = [UTType.png,UTType.jpeg]
let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: supportedTypes)
documentPicker.delegate = self
documentPicker.allowsMultipleSelection = false
documentPicker.shouldShowFileExtensions = true
self.present(documentPicker, animated: true, completion: nil)
}))
attachSheet.addAction(UIAlertAction(title: "Photo/Video", style: .default,handler: { (action) in
self.chooseImage()
}))
attachSheet.addAction(UIAlertAction(title: "Cancel", style: .cancel))
self.present(attachSheet, animated: true, completion: nil)
}
Then, when the user chooses File, they are taken to the regular document browser, where I handle their selection:
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
var selectedFileData = [String:String]()
let file = urls[0]
do{
let fileData = try Data.init(contentsOf: file.absoluteURL)
selectedFileData["filename"] = file.lastPathComponent
selectedFileData["data"] = fileData.base64EncodedString(options: .lineLength64Characters)
}catch{
print("contents could not be loaded")
}
}
As you can see in the code above, I formed a special dictionary for storing the data before sending it to the server. Here you can also see the encoding to Base64.
When the user presses the Photo/Video item in the alert dialogue, they are taken to the gallery to select a picture:
func chooseImage() {
imagePicker.allowsEditing = false
imagePicker.sourceType = .photoLibrary
present(imagePicker, animated: true, completion: nil)
}
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
var selectedImageData = [String:String]()
guard let fileUrl = info[UIImagePickerController.InfoKey.imageURL] as? URL else { return }
print(fileUrl.lastPathComponent)
if let pickedImage = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
selectedImageData["filename"] = fileUrl.lastPathComponent
selectedImageData["data"] = pickedImage.pngData()?.base64EncodedString(options: .lineLength64Characters)
}
dismiss(animated: true, completion: nil)
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
dismiss(animated: true, completion: nil)
}
With this method, the file content is encoded to a Base64 string.
P.S. I'm also very grateful to @MaticOblak because he showed me the starting point for my research and final solution. His solution is also good, but I managed to solve my problem in a way that is more convenient for my project :)
As soon as you have the file URL you can use it to retrieve the data it contains. Once you have the data you can convert it to Base64 and send it to the server. You gave no information about how you will send it to the server, but the rest may look something like this:
func sendFileWithURL(_ url: URL, completion: @escaping ((_ error: Error?) -> Void)) {
func finish(_ error: Error?) {
DispatchQueue.main.async {
completion(error)
}
}
DispatchQueue(label: "DownloadingFileData." + UUID().uuidString).async {
do {
let data: Data = try Data(contentsOf: url)
let base64String = data.base64EncodedString()
// TODO: send string to server and call the completion
finish(nil)
} catch {
finish(error)
}
}
}
and you would use it as
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
urls.forEach { sendFileWithURL($0) { <#Your code here#> } }
}
To break it down:
To get the file data you can use Data(contentsOf: url). This method even works on remote files, so you could, for instance, use the URL of an image anywhere on the internet that you have access to. It is important to know that this method blocks the calling thread, which is usually not what you want.
To avoid blocking the current thread, we create a new queue using DispatchQueue(label: "DownloadingFileData." + UUID().uuidString). The name of the queue is not very important, but it can be useful when debugging.
When the data is received, we convert it to a Base64 string using data.base64EncodedString(), and this string can then be sent to the server. You just need to fill in the TODO: part.
Retrieving your file data can fail: maybe an access restriction, the file no longer being there, or no internet connection... This is handled by throwing. If the statement with try fails for any reason, the catch block executes and you receive an error.
Since all of this is done on a background thread, it usually makes sense to go back to the main thread. This is what the finish function does. If you do not require that, you can simply remove it and have:
func sendFileWithURL(_ url: URL, completion: @escaping ((_ error: Error?) -> Void)) {
DispatchQueue(label: "DownloadingFileData." + UUID().uuidString).async {
do {
let data: Data = try Data(contentsOf: url)
let base64String = data.base64EncodedString()
// TODO: send string to server and call the completion
completion(nil)
} catch {
completion(error)
}
}
}
There are other things to consider with this approach. For instance, if the user selects multiple files, each of them will open its own queue and start the process. That means that at some point many or all of them may be loaded in memory, which may take too much memory and crash your application. It is for you to decide whether this approach is fine for you or whether you wish to serialize the process. Serialization should be very simple with queues. All you need is a single one:
private lazy var fileProcessingQueue: DispatchQueue = DispatchQueue(label: "DownloadingFileData.main")
func sendFileWithURL(_ url: URL, completion: @escaping ((_ error: Error?) -> Void)) {
func finish(_ error: Error?) {
DispatchQueue.main.async {
completion(error)
}
}
fileProcessingQueue.async {
do {
let data: Data = try Data(contentsOf: url)
let base64String = data.base64EncodedString()
// TODO: send string to server and call the completion
finish(nil)
} catch {
finish(error)
}
}
}
Now one operation will finish before another one starts. But that may only apply to getting the file data and converting it to a Base64 string. If uploading is then done on another thread (which it usually is), you may still have multiple ongoing requests, each holding all of the data it needs to upload.
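If you also want the uploads themselves serialized, one possible sketch (assuming a completion-based upload function that you would supply yourself) is to block the same serial queue until each upload reports back:

import Foundation

private let fileProcessingQueue = DispatchQueue(label: "DownloadingFileData.main")

// Hypothetical upload function; replace the body with your actual networking call.
func upload(_ base64String: String, completion: @escaping (Error?) -> Void) {
    completion(nil)
}

func sendFileSerially(_ url: URL, completion: @escaping (Error?) -> Void) {
    fileProcessingQueue.async {
        do {
            let data = try Data(contentsOf: url)
            let base64String = data.base64EncodedString()
            // Block this background queue until the upload finishes,
            // so only one file is in memory and in flight at a time.
            let semaphore = DispatchSemaphore(value: 0)
            var uploadError: Error?
            upload(base64String) { error in
                uploadError = error
                semaphore.signal()
            }
            semaphore.wait()
            DispatchQueue.main.async { completion(uploadError) }
        } catch {
            DispatchQueue.main.async { completion(error) }
        }
    }
}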
I recently got this error with the UIImagePickerController in Xcode Version 12.0.1
[Camera] Failed to read exposureBiasesByMode dictionary: Error Domain=NSCocoaErrorDomain Code=4864 "*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL" UserInfo={NSDebugDescription=*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL}
Has anyone else seen this error? How do you fix it?
If you configure your image picker with imagePicker.allowsEditing = true,
you have to fetch the image using:
if let pickedImage = info[UIImagePickerController.InfoKey.editedImage] as? UIImage {
capturedImage = pickedImage
}
If you instead use imagePicker.allowsEditing = false, use this to pick image:
if let pickedImage = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
capturedImage = pickedImage
}
If you don't follow this combination, you may get this error.
In my case, I got this bug from trying to use the image data and sync with Files. Adding this key to the Info.plist made all the difference and made the error go away:
<key>LSSupportsOpeningDocumentsInPlace</key> <true/>
I experienced the same issue. I imported AVKit instead of AVFoundation and tried to present the video in the native recorder view. That gave me an exception telling me to add NSMicrophoneUsageDescription to the Info.plist file, and after this, I was able to display the live video in a custom view.
So I believe the issue is with iOS 14 being very picky about permissions, and probably something goes wrong with showing the correct exception when the video is not presented in the native view.
Anyway, this worked for me:
import AVKit
import MobileCoreServices
@IBOutlet weak var videoViewContainer: UIView!
private let imagePickerController = UIImagePickerController()
override func viewDidLoad() {
super.viewDidLoad()
initCameraView()
}
func initCameraView() {
// Device setup
imagePickerController.delegate = self
imagePickerController.sourceType = .camera
imagePickerController.mediaTypes = [kUTTypeMovie as String]
imagePickerController.cameraCaptureMode = .video
imagePickerController.cameraDevice = .rear
// UI setup
addChild(imagePickerController)
videoViewContainer.addSubview(imagePickerController.view)
imagePickerController.view.frame = videoViewContainer.bounds
imagePickerController.allowsEditing = false
imagePickerController.showsCameraControls = false
imagePickerController.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
}
And then add the description for NSMicrophoneUsageDescription in the Info.plist file :-)
Hope it will work for you as well!
I managed to solve the problem. In fact, it is not directly related to react-native-image-crop-picker. The problem was that I was using react-native-actionsheet to give the user the option to open the camera or the gallery. When I opened the react-native-actionsheet and pressed one of the options, the camera was superimposed on the react-native-actionsheet (modal), which generated a conflict, because apparently in iOS it is not possible for one modal to overlap another.
So, to solve the problem, I defined a timeout so that it is possible to close the modal before opening the camera.
I got this error when I tried to copy from a URL I couldn't copy, which was coming from the mediaURL of the UIImagePickerControllerDelegate.
Basically, what I did was to use UISaveVideoAtPathToSavedPhotosAlbum
Like in this example ⤵️
if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(url.absoluteString) {
UISaveVideoAtPathToSavedPhotosAlbum(url.absoluteString, self, #selector(self.didSaveVideo), nil)
} else {
return /* do something*/
}
@objc private func didSaveVideo(videoPath: String, error: NSError, contextInfo: Any) {}
I found the same error with Xcode 12 & iOS 14 when imagePicker's source type is camera.
But the app works fine: I could take a picture using the camera and put it in my collection view cell. So it is maybe something in Xcode 12, I guess.
@objc func addPerson() {
let picker = UIImagePickerController()
if UIImagePickerController.isSourceTypeAvailable(.camera) {
picker.sourceType = .camera
} else {
fatalError("Camera is not available, please use real device.")
}
picker.allowsEditing = true
picker.delegate = self
present(picker, animated: true)
}
I faced the same error with Xcode 12 & iOS 14.
But in my case, I used an action sheet to choose between the camera and the photo library before that. So I changed the code to open the camera only after that action sheet is closed, and it works well.
Hope this will be helpful for your issue.
enum MediaOptions: Int {
case Photos
case Camera
}
func selectImage(mediaType: MediaOptions) {
self.mediaOption = mediaType
let iPicker = UIImagePickerController()
iPicker.delegate = self
iPicker.allowsEditing = false
if mediaType == .Camera {
if UIImagePickerController.isSourceTypeAvailable(.camera) {
iPicker.sourceType = .camera
iPicker.allowsEditing = true
}
} else {
iPicker.sourceType = .photoLibrary
}
self.present(iPicker, animated: true, completion: nil)
self.imagePicker = iPicker
}
func choosePhoto() {
let actionSheet = UIAlertController(title: "Choose", message: "", preferredStyle: .actionSheet)
if UIImagePickerController.isSourceTypeAvailable(.camera) {
actionSheet.addAction(UIAlertAction(title: "Camera", style: .default, handler: { (action) -> Void in
actionSheet.dismiss(animated: true) {
self.selectImage(mediaType: .Camera) // Just moved here - inside the dismiss callback
}
}))
}
if UIImagePickerController.isSourceTypeAvailable(.photoLibrary) {
actionSheet.addAction(UIAlertAction(title: "Photo Library", style: .default, handler: { (action) -> Void in
self.selectImage(mediaType: .Photos)
}))
}
actionSheet.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
self.present(actionSheet, animated: true, completion: nil)
}
In my case, I was missing an Info.plist key for NSCameraUsageDescription.
You should enter the purpose of using the camera as the description.
It fixed the crash for me.
Plus, if you don't give the purpose, your app is likely to be rejected.
If, like me, you get this second message:
[access] This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
Then you have to add this to your info.plist dictionary:
<key>NSCameraUsageDescription</key>
<string>so you can choose a photo or take a picture for object detection</string>
It solved the problem for me
I'm currently trying to build a proof-of-concept iOS app to check if we are able to implement some sort of indoor positioning capability without deploying beacons or any other hardware.
What we have
There is a database containing all registered access points in our building including their X- and Y-coordinates. The coordinates are mapped to a custom-built grid that spans the whole building.
The app will be released using our Enterprise distribution, so there are no constraints concerning any Apple Store requirements. The app will be running exclusively on devices that automatically connect to the proper WiFi using a certificate.
What we'd like to build
In order to improve the usability of the app, we'd like to show the user his current position. Using Apple's native CLLocation services is not accurate enough because we are operating inside a building. The basic idea is to fetch all nearby access points including their BSSID and signal strength and calculate a more or less accurate position using both the signal strength and the location database for our access points (see above).
What i've tried so far
Using SystemConfiguration.CaptiveNetwork to get the BSSID
import SystemConfiguration.CaptiveNetwork
func getCurrentBSSID() -> String {
guard let currentInterfaces = CNCopySupportedInterfaces() as? [String] else { return "" }
for interface in currentInterfaces {
print("Looking up BSSID info for \(interface)") // en0
let SSIDDict = CNCopyCurrentNetworkInfo(interface as CFString) as! [String : AnyObject]
return SSIDDict[kCNNetworkInfoKeyBSSID as String] as! String
}
return ""
}
This solution works (after setting the proper entitlements), but I'm only able to read the BSSID of the CURRENTLY CONNECTED access point.
Using UIStatusBarDataNetworkItemView to read signal strength
private func wifiStrength() -> Int? {
let app = UIApplication.shared
var rssi: Int?
guard let statusBar = app.value(forKey: "statusBar") as? UIView, let foregroundView = statusBar.value(forKey: "foregroundView") as? UIView else {
return rssi
}
for view in foregroundView.subviews {
if let statusBarDataNetworkItemView = NSClassFromString("UIStatusBarDataNetworkItemView"), view .isKind(of: statusBarDataNetworkItemView) {
if let val = view.value(forKey: "wifiStrengthRaw") as? Int {
rssi = val
break
}
}
}
return rssi
}
This one is kind of obvious: it only reads the signal strength for the connected WiFi network, not for a specific access point.
QUESTION
Is there any way to read a list of available access points (not WiFi networks) including their BSSID and signal strength? We cannot jailbreak the devices since they are under device management.
Maybe there is some way to do it using MobileWiFi.framework (see this link), but I couldn't wrap my head around doing it in Swift (I'm kind of a beginner when it comes to iOS development).
I am afraid it is not possible to implement this on a non-jailbroken device.
I found some code for this, but it was outdated. I don't think you will be running it on iOS 3/4 devices.
NEHotspotHelper works only when the Settings -> Wi-Fi page is active. You can get the signal strength there, but I'm unsure how well it will work.
MobileWiFi.framework requires an entitlement, which can't be obtained without a jailbreak.
Useful links:
Technical Q&A QA1942
iBeacons or QR codes (AR) are probably the only options.
Many resources say that, when using Apple's "official" frameworks, you can only get the SSID of the network your iPhone is currently connected to. Here are some workarounds:
You can use the NEHotspotConfigurationManager class, but first you must enable the Hotspot Configuration entitlement (property list key) in Xcode.
You can also use the NEHotspotHelper class (although it requires Apple's permission). For this you need to apply for the Network Extension entitlement, modify your Provisioning Profile, and perform a few additional steps. Look at this SO post for further details.
Here's a code snippet showing how to use NEHotspotConfigurationManager:
import NetworkExtension
class ViewController: UIViewController {
let SSID = ""
@IBAction func connectAction(_ sender: Any) {
let hotspotConfig = NEHotspotConfiguration(ssid: SSID, passphrase: "", isWEP: false)
NEHotspotConfigurationManager.shared.apply(hotspotConfig) {[unowned self] (error) in
if let error = error {
self.showError(error: error)
} else {
self.showSuccess()
}
}
}
@IBAction func disconnectAction(_ sender: Any) {
NEHotspotConfigurationManager.shared.removeConfiguration(forSSID: SSID)
}
private func showError(error: Error) {
let alert = UIAlertController(title: "Error", message: error.localizedDescription, preferredStyle: .alert)
let action = UIAlertAction(title: "Darn", style: .default, handler: nil)
alert.addAction(action)
present(alert, animated: true, completion: nil)
}
private func showSuccess() {
let alert = UIAlertController(title: "", message: "Connected", preferredStyle: .alert)
let action = UIAlertAction(title: "Cool", style: .default, handler: nil)
alert.addAction(action)
present(alert, animated: true, completion: nil)
}
}
Here's a code snippet showing how to use NEHotspotHelper:
import NetworkExtension
import SystemConfiguration.CaptiveNetwork
func getSSID() -> String {
    if #available(iOS 11.0, *) {
        // NEHotspotHelper requires the com.apple.developer.networking.HotspotHelper entitlement.
        let networkInterfaces = NEHotspotHelper.supportedNetworkInterfaces() ?? []
        for case let hotspotNetwork as NEHotspotNetwork in networkInterfaces {
            print(hotspotNetwork.signalStrength)
            return "SSID:\(hotspotNetwork.ssid), BSSID:\(hotspotNetwork.bssid)"
        }
        return ""
    } else {
        guard let interfaces = CNCopySupportedInterfaces() as? [String],
            let interfaceName = interfaces.first,
            let info = CNCopyCurrentNetworkInfo(interfaceName as CFString) as? [String: AnyObject],
            let ssid = info[kCNNetworkInfoKeySSID as String] as? String else {
                return ""
        }
        return ssid
    }
}
You can use a transportable Differential GPS reference station inside your building, improve accuracy to about 1-3 cm, and then rely on the mobile phone's built-in GPS.
I have an app that can download many publications from a server at once. For each publication that already exists in the app, I want to prompt the user if he wants to overwrite the existing version.
Is there any clean way to present UIAlertControllers so that when the user has answered one, the app presents the next one?
Here is the output: though two alert operations are added in subsequent statements, the second alert is shown only after the user interacts with the alert on screen, i.e. only after tapping OK or Cancel.
If this is what you want, then as I mentioned in my comment, you can make use of an asynchronous Operation and an OperationQueue with a maximum concurrent operation count of 1.
Here is the code.
First, declare your own asynchronous Operation:
struct AlertObject {
var title : String! = nil
var message : String! = nil
var successAction : ((Any?) -> ())! = nil
var cancelAction : ((Any?) -> ())! = nil
init(with title : String, message : String, successAction : @escaping ((Any?) -> ()), cancelAction : @escaping ((Any?) -> ())) {
self.title = title
self.message = message
self.successAction = successAction
self.cancelAction = cancelAction
}
}
class MyAsyncOperation : Operation {
var alertToShow : AlertObject! = nil
var finishedStatus : Bool = false
override init() {
super.init()
}
override var isFinished: Bool {
get {
return self.finishedStatus
}
set {
self.willChangeValue(forKey: "isFinished")
self.finishedStatus = newValue
self.didChangeValue(forKey: "isFinished")
}
}
// isAsynchronous is read-only; returning true tells the queue that this
// operation manages its own lifetime instead of finishing when start() returns.
override var isAsynchronous: Bool {
return true
}
required convenience init(with alertObject : AlertObject) {
self.init()
self.alertToShow = alertObject
}
override func start() {
if self.isCancelled {
self.isFinished = true
return
}
DispatchQueue.main.async {
let alertController = UIAlertController(title: self.alertToShow.title, message: self.alertToShow.message, preferredStyle: .alert)
alertController.addAction(UIAlertAction(title: "OK", style: .default, handler: { (action) in
self.alertToShow.successAction(nil) //pass data if you have any
self.operationCompleted()
}))
alertController.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: { (action) in
self.alertToShow.cancelAction(nil) //pass data if you have any
self.operationCompleted()
}))
UIApplication.shared.keyWindow?.rootViewController?.present(alertController, animated: true, completion: nil)
}
}
func operationCompleted() {
self.isFinished = true
}
}
Though the code looks very complicated, in essence it's very simple. All you are doing is overriding the isFinished and isAsynchronous properties of Operation.
If you know how operation queues work with Operations, it should be very clear why I am overriding these properties. In case you don't know: OperationQueue uses KVO on the isFinished property of an Operation to start the execution of the next dependent operation in the queue.
When the OperationQueue has a maximum concurrent operation count of 1, the isFinished flag of an Operation decides when the next operation will be executed :)
Because the user might act on the alert at some later point, making the operation asynchronous (by default Operations are synchronous) and overriding the isFinished property is important.
AlertObject is a convenience object to hold the alert's metadata. You can modify it to match your needs :)
That's it. Now whichever viewController wants to show an alert can simply use MyAsyncOperation; just make sure you have only one instance of the queue :)
This is how I use it
let operationQueue = OperationQueue() //make sure all VCs use the same operation Queue instance :)
operationQueue.maxConcurrentOperationCount = 1
let alertObject = AlertObject(with: "First Alert", message: "Success", successAction: { (anything) in
debugPrint("Success action tapped")
}) { (anything) in
debugPrint("Cancel action tapped")
}
let secondAlertObject = AlertObject(with: "Second Alert", message: "Success", successAction: { (anything) in
debugPrint("Success action tapped")
}) { (anything) in
debugPrint("Cancel action tapped")
}
let alertOperation = MyAsyncOperation(with: alertObject)
let secondAlertOperation = MyAsyncOperation(with: secondAlertObject)
operationQueue.addOperation(alertOperation)
operationQueue.addOperation(secondAlertOperation)
As you can see, I have two alert operations added in subsequent statements. Even then, an alert will be shown only after the user dismisses the currently displayed alert :)
Hope this helps
Although the answer with the queue is very good, you can achieve the same as easily as:
var messages: [String] = ["first", "second"]
func showAllerts() {
guard let message = messages.first else { return }
messages = messages.filter({$0 != message})
let alert = UIAlertController(title: "title", message: message, preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .default, handler: { [weak self] (action) in
// do something
self?.showAllerts()
}))
alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: { [weak self] (action) in
self?.showAllerts()
}))
present(alert, animated: true, completion: nil)
}
(replace array of messages with whatever you want)
I would recommend creating a Queue data structure (https://github.com/raywenderlich/swift-algorithm-club/tree/master/Queue).
Alert objects are queued in the order that they are initialized. When the user selects an action on one of the alerts, dequeue the next pending alert and present it, as in the sketch below.
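A minimal sketch of that idea (an array-backed queue with made-up names) could look like this:

import UIKit

// A tiny array-backed queue of pending alerts; presents the next one
// only after the current one has been dismissed.
final class AlertQueue {
    private var pending: [UIAlertController] = []
    private var isShowing = false
    private weak var presenter: UIViewController?

    init(presenter: UIViewController) {
        self.presenter = presenter
    }

    func enqueue(title: String, message: String) {
        let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default) { [weak self] _ in
            self?.isShowing = false
            self?.showNextIfNeeded()
        })
        pending.append(alert)
        showNextIfNeeded()
    }

    private func showNextIfNeeded() {
        guard !isShowing, !pending.isEmpty else { return }
        isShowing = true
        presenter?.present(pending.removeFirst(), animated: true)
    }
}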
I had the same problem in my app and tried several solutions, but all of them were messy. Then I thought of a very simple and effective way: use a delay to retry presentation until the alert can be shown. This approach is much cleaner in that you don't need coordinated code in multiple places and you don't have to hack your action handlers.
Depending on your use case, you might care that this approach doesn't necessarily preserve the order of the alerts, in which case you can easily adapt it to store the alerts in an array to preserve order, showing and removing only the first one in the array each time.
This code overrides the standard presentation method of UIViewController; use it in the subclass that is presenting the alerts. If needed, it could also be adapted into an app-level method that descends from the rootViewController to find the topmost presented VC and presents from there, etc.
- (void)presentViewController:(UIViewController *)viewControllerToPresent animated:(BOOL)flag completion:(void (^)(void))completion {
// cannot present if already presenting.
if (self.presentedViewController) {
// cannot present now, try again in 100ms.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
// make sure we ourselve are still presented and able to present.
if (self.presentingViewController && !self.isBeingDismissed) {
// retry on self
[self presentViewController:viewControllerToPresent animated:flag completion:completion];
}
});
} else {
// call super to really do it
[super presentViewController:viewControllerToPresent animated:flag completion:completion];
}
}
A few years ago I wrote a kind of presentation service that processes a queue of items to present. When there is no presented view at the current moment, it takes the next one from the array of items to present.
Maybe it will help someone:
https://github.com/Flawion/KOControls/blob/master/Sources/KOPresentationQueuesService.swift
Usage is very simple:
let itemIdInQueue = present(viewControllerToPresent, inQueueWithIndex: messageQueueIndex)
Essentially I'm trying to create a function that sets up multiple notifications, and while it does this, I want it to display a loading/progress bar in an alert view and update it as the notifications are set up.
The problem is that the UI is not updating when I'm adding the progress view to the alert view as a subview, until the processing is complete.
Usually I fix this kind of problem with 'DispatchQueue.main.async'. But this doesn't seem to make a difference.
Below is the code. In the 'PresentLoadingView' function, the processing in the 'SetUpNotifications' function (passed in as the 'viewPresented' closure) finishes before the UI actually shows that the 'progressView' has been added to the 'loadingView'.
I'm pretty sure this is a threading issue, as it doesn't seem to reach the main UI thread even when using 'DispatchQueue.main.async', possibly something to do with it having an @escaping parameter? Or just the fact that it is a closure within a closure?
var centre: UNUserNotificationCenter?
func AttemptToSetUpNotifications() {
centre = UNUserNotificationCenter.current()
centre!.requestAuthorization(options: [.alert, .badge], completionHandler: SetUpNotificationsChecks)
}
func SetUpNotificationsChecks(granted: Bool, error: Error?) {
if error != nil {
return
}
if !granted {
return
}
PresentLoadingView(viewPresented: SetUpNotifications)
}
var loadingView: UIAlertController?
func PresentLoadingView(viewPresented: @escaping ((UIProgressView)-> Void)) {
loadingView = UIAlertController(title: "Setup Notifications", message: "Setting up notifications, please wait...", preferredStyle: .alert)
self.present(loadingView!, animated: true, completion: {
// Add your progressbar after alert is shown
let margin:CGFloat = 8.0
let rect = CGRect(x:margin, y:72.0, width:self.loadingView!.view.frame.width - margin * 2.0 , height:2.0)
let progressView = UIProgressView(frame: rect)
progressView.progress = 0.0
progressView.tintColor = Common.CommonTintColor
self.loadingView!.view.addSubview(progressView)
// DispatchQueue.main.async {
// self.loadingView!.view.addSubview(progressView)
// }
viewPresented(progressView)
})
}
Any help would be greatly appreciated; sorry if the problem is my lack of concurrency knowledge, and if you need any more info, please ask.
Thanks.