Making the iPhone vibrate (iOS)

How can the iPhone be set to vibrate once?
For example, when a player loses a life or the game is over, the iPhone should vibrate.

From "iPhone Tutorial: Better way to check capabilities of iOS devices":
There are two seemingly similar functions that take a parameter kSystemSoundID_Vibrate:
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
Both of the functions vibrate the iPhone. But, when you use the first
function on devices that don’t support vibration, it plays a beep
sound. The second function, on the other hand, does nothing on
unsupported devices. So if you are going to vibrate the device
continuously, as an alert, common sense says, use function 2.
First, add AudioToolbox.framework to your target in Build Phases.
Then, import this header file:
#import <AudioToolbox/AudioServices.h>

Swift 2.0+
AudioToolbox now exposes kSystemSoundID_Vibrate as a SystemSoundID, so the code is:
import AudioToolbox.AudioServices
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate)
with no extra cast step required.
(Props to @Dov)
Original Answer (Swift 1.x)
Here's how you do it in Swift (in case you run into the same trouble I did):
Link against AudioToolbox.framework (Go to your project, select your target, build phases, Link Binary with Libraries, add the library there)
Once that is completed:
import AudioToolbox.AudioServices
// Use either of these
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
AudioServicesPlayAlertSound(SystemSoundID(kSystemSoundID_Vibrate))
The cheesy thing is that SystemSoundID is basically a typealias (a fancy Swift typedef) for UInt32, while kSystemSoundID_Vibrate is a regular Int. The compiler gives you an error for trying to cast from Int to UInt32, but the error reads "Cannot convert to SystemSoundID", which is confusing. Why Apple didn't just make it a Swift enum is beyond me.
@aponomarenko's answer goes into the details; mine is just for the Swifters out there.

A simple way to do so is with Audio Services:
#import <AudioToolbox/AudioToolbox.h>
...
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

I had great trouble with this on devices that had vibration turned off in some manner, but we needed it to work regardless, because it is critical to our application. And since it is just an integer passed to a documented method call, it will pass App Store validation. So I tried some sounds outside of the well-documented ones listed here: TUNER88/iOSSystemSoundsLibrary
I then stumbled upon 1352, which works regardless of the silent switch or the settings on the device (Settings → Sounds → Vibrate on Ring / Vibrate on Silent).
- (void)vibratePhone {
    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"]) {
        AudioServicesPlaySystemSound(1352); // works ALWAYS as of this post
    } else {
        // Not an iPhone, so there is no vibration motor;
        // play the less annoying tick noise or one of your own
        AudioServicesPlayAlertSound(1105);
    }
}

Important Note: Future Deprecation
As of iOS 9.0, the API documentation for:
AudioServicesPlaySystemSound(inSystemSoundID: SystemSoundID)
AudioServicesPlayAlertSound(inSystemSoundID: SystemSoundID)
includes the following note:
This function will be deprecated in a future release.
Use AudioServicesPlayAlertSoundWithCompletion or
AudioServicesPlaySystemSoundWithCompletion instead.
The right way forward is to use either of these two:
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate, nil)
or
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate) {
    // Your callback code for when the vibration is done.
    // (An iPod touch may not vibrate, but this callback is always called.)
}
Remember to import AudioToolbox (these functions live there).
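Put together, a minimal sketch using the system-sound variant (import shown for completeness):
import AudioToolbox

AudioServicesPlaySystemSoundWithCompletion(kSystemSoundID_Vibrate) {
    // Runs once the vibration (or its silent no-op on unsupported devices) has finished.
    print("vibration finished")
}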

For an iPhone 7/7 Plus or newer, use these three Haptic feedback APIs.
Available APIs
For notifications:
let generator = UINotificationFeedbackGenerator()
generator.notificationOccurred(.error)
Available styles are .error, .success, and .warning. Each has its own distinctive feel.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to communicate successes, failures, and warnings.
For simple vibrations:
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.impactOccurred()
Available styles are .heavy, .medium, and .light. These are simple vibrations with varying degrees of "hardness".
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to simulate physical impacts
For when the user selects an item:
let generator = UISelectionFeedbackGenerator()
generator.selectionChanged()
This is the least noticeable of all the haptics, and so is the most suitable for when haptics should not be taking over the app experience.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to indicate a change in selection.
Notes
There are a couple of things worth remembering when using these APIs.
Note A
You do not actually create the haptic. You request the system generate a haptic. The system will decide based on the below:
If haptics are possible on the device (whether it has a Taptic Engine in this case)
Whether the app is recording audio (haptics are not generated during recording, to prevent unwanted interference)
Whether haptics are enabled in system Settings.
Therefore, the system will silently ignore your request for a haptic if it is not possible. If this is due to an unsupported device, you could try this:
func haptic() {
    // Ask (via a private key, so use with caution) whether the device can generate haptics.
    // If feedbackSupportLevel is nil, default to 0.
    let feedbackSupportLevel = UIDevice.current.value(forKey: "_feedbackSupportLevel") as? Int ?? 0
    switch feedbackSupportLevel {
    case 2:
        // 2 means the device has a Taptic Engine.
        // Put Taptic Engine code here, using the APIs explained above.
        break
    case 1:
        // 1 means no Taptic Engine, but AudioToolbox is supported.
        // AudioToolbox code from the myriad of other answers!
        break
    default: // 0
        // No haptic support.
        // Do something else, like a beeping noise or an LED flash, instead of haptics.
        break
    }
}
Substitute the comments in the switch-case statements, and this haptic generation code will be portable to other iOS devices. It will generate the highest level of haptic possible.
Note B
Because generating haptics is a hardware-level task, there may be latency between when you call the haptic-generation code and when it actually happens. For this reason, the Taptic Engine APIs all have a prepare() method to put the engine in a state of readiness. Using your Game Over example: you may know that the game is about to end because the user has very low HP or a dangerous monster is near them.
If you don't generate a haptic within a few seconds, the Taptic Engine will go back into an idle state (to save battery life)
In this case, preparing the Taptic Engine would create a higher-quality, more responsive experience.
For example, let's say your app uses a pan gesture recogniser to change the portion of the world visible. You want a haptic to generate when the user 'looks' round 360 degrees. Here is how you could use prepare():
@IBAction func userChangedViewablePortionOfWorld(_ gesture: UIPanGestureRecognizer!) {
    haptic = UIImpactFeedbackGenerator(style: .heavy) // assumes a `haptic` property, so prepare() carries across gesture states
    switch gesture.state {
    case .began:
        // The user started dragging the screen.
        haptic.prepare()
    case .changed:
        // The user is trying to 'look' in another direction.
        // Code to change the viewable portion of the virtual world goes here.
        if virtualWorldViewpointDegreeMiddle == 360.0 {
            haptic.impactOccurred()
        }
    default:
        break
    }
}

And if you're using the Xamarin (MonoTouch) framework, simply call:
SystemSound.Vibrate.PlayAlertSound();

In my travels I have found that if you try either of the following while you are recording audio, the device will not vibrate even if it is enabled.
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
My method was called at a specific time while measuring the device's movements. I had to stop the recording and then restart it after the vibration had occurred.
It looked like this.
- (void)vibrate {
    [recorder stop];
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    [recorder record];
}
recorder is an AVAudioRecorder instance.
Hope this helps others that have had the same problem before.
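For reference, a minimal Swift sketch of the same workaround, assuming a `recorder` property holding an AVAudioRecorder:
import AudioToolbox
import AVFoundation

func vibrate() {
    // Vibration is suppressed while recording, so stop the recorder first.
    recorder.stop()  // `recorder` is an assumed AVAudioRecorder property
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
    recorder.record()  // resume recording once the vibration has been triggered
}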

In iOS 10 and on newer iPhones, you can also use the haptic API. This haptic feedback is softer than the AudioToolbox vibration.
For your GAME OVER scenario, a heavy UI impact feedback should be suitable.
UIImpactFeedbackGenerator(style: .heavy).impactOccurred()
You could also use the other haptic feedback styles.
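A small sketch of how this might look in practice, reusing one generator and calling prepare() shortly before the game ends (the function names are illustrative):
import UIKit

let gameOverHaptic = UIImpactFeedbackGenerator(style: .heavy)

func playerIsAboutToLose() {
    // Wake the Taptic Engine a moment early to minimise latency.
    gameOverHaptic.prepare()
}

func gameOver() {
    gameOverHaptic.impactOccurred()  // the heavy thud for the GAME OVER moment
}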

In Swift:
import AudioToolbox
...
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))

In my case I was using an AVCaptureSession. AudioToolbox was in the project's build phases and it was imported, but it still didn't work. To make it work, I stopped the session before the vibration and resumed it afterwards:
#import <AudioToolbox/AudioToolbox.h>
...
@property (nonatomic) AVCaptureSession *session;
...
- (void)vibratePhone {
    [self.session stopRunning];
    NSLog(@"vibratePhone %@", @"here");
    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"]) {
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    } else {
        AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
    }
    [self.session startRunning];
}

You can use:
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate); for iPhones and a few newer iPod touches.
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate); for iPads.
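A hedged sketch combining the advice above into one device check (mirroring the earlier answers' use of the model string):
import AudioToolbox
import UIKit

func vibrateOrBeep() {
    if UIDevice.current.model == "iPhone" {
        // iPhones (and a few newer iPods) have a vibration motor.
        AudioServicesPlayAlertSound(kSystemSoundID_Vibrate)
    } else {
        // iPads have no motor; the system-sound variant silently no-ops there.
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
    }
}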

Related

Programmatically trigger the action that a headphone pause button would do

I am trying to find a way to pause any playing media on the device, so I was thinking of triggering the same logic that fires when a user presses the headphones' "middle button".
I managed to prevent music from resuming (after I pause it within my app, which basically starts an AVAudioSession for recording) by NOT setting the AVAudioSession's active property to false and leaving it hanging, but I am pretty sure that's a bad way to do it. If I deactivate the session, the music resumes. The other option I am thinking of is playing some kind of silent loop that would "imitate" the silence I need. But if what I am seeking is doable, it would be the best approach; as I understood from this question, it cannot be done using the normal means.
func stopAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if audioSession.secondaryAudioShouldBeSilencedHint {
            print("someone is playing....")
        }
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
        isSessionActive = false
    } catch let error as NSError {
        print("Unable to deactivate audio session: \(error.localizedDescription)")
        print("retrying.......")
    }
}
In this code snippet, as the function name implies, I set active to false. I tried to find other options, but I could not find another way of stopping my recording session while preventing the other app, which was already playing, from resuming.
Can someone guide me to which library I should look into? For example, can I tap into the hardware part and trigger it, or can I find out which library listens to this button-press event and handles the pause/play functionality?
A friend of mine who is more experienced in iOS development suggested the following workaround, and it worked. I am posting it here as it might help someone trying to achieve similar behaviour.
In order to stop/pause what is currently being played on a user's device, you need to add a music player to your app. Then, at the point where you need to pause/stop the current media, you just initiate the player, play, and then pause/stop it. Simple :)
Like so:
let musicPlayer = MPMusicPlayerApplicationController.applicationQueuePlayer

func stopMedia() {
    MPMediaLibrary.requestAuthorization({ (newPermissionStatus: MPMediaLibraryAuthorizationStatus) in
        self.musicPlayer.setQueue(with: .songs())
        self.musicPlayer.play()
        print("Stopping music player")
        self.musicPlayer.pause()
        print("Stopped music player")
    })
}
The part with MPMediaLibrary.requestAuthorization is needed to avoid an authorisation error when accessing the user's media library.
And of course you will need to add the Privacy - Media Library Usage Description key to your Info.plist file.

CallKit loudspeaker bug / how did WhatsApp fix it?

I have an app with CallKit functionality. When I press the loudspeaker button, it flashes and animates to the OFF state (sometimes the speaker is set to LOUD but the icon is still OFF). When I tap it multiple times, it can be clearly seen that this functionality is not behaving correctly.
However, WhatsApp starts with the loudspeaker turned OFF, and after 3+ seconds it activates it and it works. Has anyone encountered anything similar and can give me a solution?
YouTube video link demonstrating my problem
There is a workaround proposed by an Apple engineer that should fix CallKit not activating the audio session correctly:
a workaround would be to configure your app's audio session (call configureAudioSession()) earlier in your app's lifecycle, before the -provider:performAnswerCallAction: method is invoked. For instance, you could call configureAudioSession() immediately before calling -[CXProvider reportNewIncomingCallWithUUID:update:completion:] in order to ensure that the audio session is fully configured prior to informing CallKit about the incoming call.
From: https://forums.developer.apple.com/thread/64544#189703
If this doesn't help, you probably should post an example project which reproduces your behaviour for us to be able to analyse it further.
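A minimal sketch of the ordering described in that workaround, assuming a `configureAudioSession()` helper and a `provider: CXProvider` property (both hypothetical):
import CallKit

func reportIncomingCall(uuid: UUID, handle: String) {
    // Configure the audio session *before* CallKit learns about the call,
    // per the workaround quoted above.
    configureAudioSession()  // hypothetical helper that sets up the AVAudioSession

    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: handle)
    provider.reportNewIncomingCall(with: uuid, update: update) { error in
        if let error = error {
            print("reportNewIncomingCall failed: \(error)")
        }
    }
}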
The above answer is correct: "VoiceChat" mode ruins everything.
Swift 4 example for WebRTC.
After the connection is established, call the following:
let rtcAudioSession = RTCAudioSession.sharedInstance()
rtcAudioSession.lockForConfiguration()
do {
    try rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue,
                                    with: AVAudioSession.CategoryOptions.mixWithOthers)
    try rtcAudioSession.setMode(AVAudioSession.Mode.default.rawValue)
    try rtcAudioSession.overrideOutputAudioPort(.none)
    try rtcAudioSession.setActive(true)
} catch let error {
    debugPrint("Couldn't force audio to speaker: \(error)")
}
rtcAudioSession.unlockForConfiguration()
You can use AVAudioSession.sharedInstance() as well, instead of the RTC session.
Referred from: Abnormal behavior of speaker button on system provided call screen
The same issue was experienced in previous versions as well, so this is not a new issue in CallKit.
It has to be resolved by iOS; we don't have any control over it.
Please go through these Apple developer forum threads:
CallKit/detect speaker set
and
[CALLKIT] audio session not activating?
Maybe you can set the mode to AVAudioSessionModeDefault.
When I use CallKit + WebRTC:
I configure AVAudioSessionModeDefault mode, alloc a CXProvider, and call reportNewIncomingCallWithUUID.
After ICE connects, WebRTC changes the mode to AVAudioSessionModeVoiceChat, and the speaker issue happens.
Later, when I set the mode back to AVAudioSessionModeDefault, the speaker works well.
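A minimal sketch of switching the mode back after the connection settles, using the shared AVAudioSession:
import AVFoundation

func restoreDefaultMode() {
    let session = AVAudioSession.sharedInstance()
    do {
        // WebRTC flips the mode to voice chat after ICE connects;
        // resetting it to .default makes the speaker route behave again.
        try session.setMode(.default)
    } catch {
        print("Failed to reset audio session mode: \(error)")
    }
}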
I've fixed the issue with the following steps.
In CXAnswerCallAction, use the code below to set the audio session configuration:
RTCDispatcher.dispatchAsync(on: RTCDispatcherQueueType.typeAudioSession) {
    let audioSession = RTCAudioSession.sharedInstance()
    audioSession.lockForConfiguration()
    let configuration = RTCAudioSessionConfiguration.webRTC()
    configuration.categoryOptions = [AVAudioSessionCategoryOptions.allowBluetoothA2DP,
                                     AVAudioSessionCategoryOptions.duckOthers,
                                     AVAudioSessionCategoryOptions.allowBluetooth]
    try? audioSession.setConfiguration(configuration)
    audioSession.unlockForConfiguration()
}
After the call is connected, I reset the AudioSession category to default:
func configureAudioSession() {
    let session = RTCAudioSession.sharedInstance()
    session.lockForConfiguration()
    do {
        try session.setCategory(AVAudioSession.Category.playAndRecord.rawValue, with: .allowBluetooth)
        try session.setMode(AVAudioSession.Mode.default.rawValue)
        try session.setPreferredSampleRate(44100.0)
        try session.setPreferredIOBufferDuration(0.005)
    } catch let error {
        debugPrint("Error changing AVAudioSession category: \(error)")
    }
    session.unlockForConfiguration()
}
Thanks to SO user @Алексей Смольский for the help.

Timing accuracy with swift using GCD dispatch_after

I'm trying to create a metronome for iOS in Swift. I'm using a GCD dispatch queue to time an AVAudioPlayer. The variable machineDelay is used to time the player, but it's running slower than the time I'm asking of it.
For example, if I ask for a delay of 1 sec, it plays at 1.2 sec; 0.749 sec plays at about 0.92 sec, and 0.5 sec plays at about 0.652 sec. I could try to compensate for this discrepancy, but I feel like there's something I'm missing here.
If there's a better way to do this altogether, please give suggestions. This is my first personal project so I welcome ideas.
Here are the various functions that should apply to this question:
func milliseconds(beats: Int) -> Double {
    // Note: despite the name, this returns seconds per beat, not milliseconds.
    let ms = (60 / Double(beats))
    return ms
}

func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
    if self.playState == false {
        return
    }
    playerPlay(playerTick, delay: NSTimeInterval(milliseconds(bpm)))
}

func playerPlay(player: AVAudioPlayer, delay: NSTimeInterval) {
    let machineDelay: Int64 = Int64((delay - player.duration) * Double(NSEC_PER_SEC))
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, machineDelay), dispatch_get_main_queue(), { () -> Void in
        player.play()
    })
}
I have never really done anything with sound on iOS but I can tell you why you are getting those inconsistent timings.
What happens when you use dispatch_after() is that a timer is set somewhere in the OS, and at some point soon after it expires, your block is put on the queue. "At some point soon after" is going to be short, but depending on what the OS is doing, it will almost certainly not be close to zero.
The main queue is serviced by the main thread using the run loop. This means your task to play the sound is competing for use of the CPU with all the UI functionality. This means that the chance of it playing the sound immediately is quite low.
Finally, the completion handler will fire at some short time after the sound finishes playing but not necessarily straight away.
All of these little delays add up to the latency you are seeing. Unfortunately, depending on what the device is doing, that latency can vary. This is never going to work for something that needs precise timings.
There are, I think, a couple of ways to achieve what you want. However, audio programming is beyond my area of expertise. You probably want to start by looking at Core Audio. My five minutes of research suggests either Audio Queue Services or OpenAL, but those five minutes are literally everything I know about sound on iOS.
dispatch_after is not intended for sample-accurate callbacks.
If you are writing audio applications, there is no way to escape: you need to implement some Core Audio code in one way or another.
Core Audio will "pull" specific counts of samples from you. Do the math (figuratively ;)

Xcode 7 UI Testing: how to dismiss a series of system alerts in code

I am writing UI test cases using the new Xcode 7 UI Testing feature. At some point in my app, I ask the user for permission to access the camera and to send push notifications. So two iOS popups will show up: the "MyApp Would Like to Access the Camera" popup and the "MyApp Would Like to Send You Notifications" popup. I'd like my test to dismiss both popups.
UI recording generated the following code for me:
[app.alerts[@"cameraAccessTitle"].collectionViews.buttons[@"OK"] tap];
However, [app.alerts[@"cameraAccessTitle"] exists] resolves to false, and the code above generates an error: Assertion Failure: UI Testing Failure - Failure getting refresh snapshot Error Domain=XCTestManagerErrorDomain Code=13 "Error copying attributes -25202".
So what's the best way of dismissing a stack of system alerts in test? The system popups interrupt my app flow and fail my normal UI test cases immediately. In fact, any recommendations regarding how I can bypass the system alerts so I can resume testing the usual flow are appreciated.
This question might be related to this SO post which also doesn't have an answer: Xcode7 | Xcode UI Tests | How to handle location service alert?
Thanks in advance.
Xcode 7.1
Xcode 7.1 has finally fixed the issue with system alerts. There are, however, two small gotchas.
First, you need to set up a "UI Interruption Handler" before presenting the alert. This is our way of telling the framework how to handle an alert when it appears.
Second, after presenting the alert you must interact with the interface. Simply tapping the app works just fine, but some interaction is required.
addUIInterruptionMonitorWithDescription("Location Dialog") { (alert) -> Bool in
alert.buttons["Allow"].tap()
return true
}
app.buttons["Request Location"].tap()
app.tap() // need to interact with the app for the handler to fire
The "Location Dialog" is just a string to help the developer identify which handler was accessed, it is not specific to the type of alert.
I believe that returning true from the handler marks it as "complete", which means it won't be called again. For your situation I would try returning false so the second alert will trigger the handler again.
Xcode 7.0
The following will dismiss a single "system alert" in Xcode 7 Beta 6:
let app = XCUIApplication()
app.launch()
// trigger location permission dialog
app.alerts.element.collectionViews.buttons["Allow"].tap()
Beta 6 introduced a slew of fixes for UI Testing and I believe this was one of them.
Also note that I am calling -element directly on -alerts. Calling -element on an XCUIElementQuery forces the framework to choose the "one and only" matching element on the screen. This works great for alerts where you can only have one visible at a time. However, if you try this for a label and have two labels the framework will raise an exception.
Objective-C
- (void)registerHandlerforDescription:(NSString *)description {
    [self addUIInterruptionMonitorWithDescription:description handler:^BOOL(XCUIElement * _Nonnull interruptingElement) {
        XCUIElement *element = interruptingElement;
        XCUIElement *allow = element.buttons[@"Allow"];
        XCUIElement *ok = element.buttons[@"OK"];
        if ([ok exists]) {
            [ok tap];
            return YES;
        }
        if ([allow exists]) {
            [allow tap];
            return YES;
        }
        return NO;
    }];
}

- (void)setUp {
    [super setUp];
    self.continueAfterFailure = NO;
    self.app = [[XCUIApplication alloc] init];
    [self.app launch];
    [self registerHandlerforDescription:@"“MyApp” would like to make data available to nearby Bluetooth devices even when you're not using app."];
    [self registerHandlerforDescription:@"“MyApp” Would Like to Access Your Photos"];
    [self registerHandlerforDescription:@"“MyApp” Would Like to Access the Camera"];
}
Swift
addUIInterruptionMonitorWithDescription("Description") { (alert) -> Bool in
alert.buttons["Allow"].tap()
alert.buttons["OK"].tap()
return true
}
Gosh.
It always taps "Don't Allow", even though I deliberately tell it to tap "Allow".
At least
if app.alerts.element.collectionViews.buttons["Allow"].exists {
    app.tap()
}
allows me to move on and do other tests.
For those who are looking for specific descriptions for specific system dialogs (like I did): there are none :) The string is just for the tester's tracking purposes. Related Apple documentation: https://developer.apple.com/documentation/xctest/xctestcase/1496273-adduiinterruptionmonitor
Update: Xcode 9.2
The method is sometimes triggered, sometimes not. The best workaround for me, when I know there will be a system alert, is to add:
sleep(2)
app.tap()
and the system alert is gone.
God! I hate how XCTest has the worst time dealing with alerts. I have an app where I get two alerts: the first one wants me to select "Allow" to enable location services for app permissions; then on a splash page the user has to press a UIButton called "Turn on location"; and finally there is a notification SMS alert where the user has to select "OK". The problem we were having was not just being unable to interact with the system alerts, but also a race condition where their appearance on screen was untimely. It seems that if you use alert.element.buttons["whateverText"].tap, the logic of XCTest is to keep pressing until the test runs out of time: basically, keep pressing anything on the screen until all the system alerts are cleared from view.
This is a hack but this is what worked for me.
func testGetPastTheStupidAlerts() {
    let app = XCUIApplication()
    app.launch()

    if app.alerts.element.collectionViews.buttons["Allow"].exists {
        app.tap()
    }

    app.buttons["TURN ON MY LOCATION"].tap()
}
The string "Allow" is completely ignored and the logic to app.tap() is called evreytime an alert is in view and finally the button I wanted to reach ["Turn On Location"] is accessible and the test pass
~Totally confused, thanks Apple.
The only thing I found that reliably fixed this was to set up two separate tests to handle the alerts. In the first test, I call app.tap() and do nothing else. In the second test, I call app.tap() again and then do the real work.
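That arrangement might look like this (a sketch; test names are illustrative and chosen so the alert-clearing test runs first):
import XCTest

final class AlertDismissalTests: XCTestCase {
    func test1_dismissSystemAlerts() {
        // First test: launch and tap, giving the interruption handler a chance to fire.
        let app = XCUIApplication()
        app.launch()
        app.tap()
    }

    func test2_actualFlow() {
        // Second test: by now the alerts have been dismissed, so do the real work.
        let app = XCUIApplication()
        app.launch()
        app.tap()
        // ... real test steps ...
    }
}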
On Xcode 9.1, alerts are only handled if the test device has iOS 11; this doesn't work on older iOS versions, e.g. 10.3. Reference: https://forums.developer.apple.com/thread/86989
To handle alerts, use this:
// Use this before the alerts appear. I am doing it before app.launch()
let allowButtonPredicate = NSPredicate(format: "label == 'Always Allow' || label == 'Allow'")

// 1st alert
_ = addUIInterruptionMonitor(withDescription: "Allow to access your location?") { (alert) -> Bool in
    let alwaysAllowButton = alert.buttons.matching(allowButtonPredicate).element.firstMatch
    if alwaysAllowButton.exists {
        alwaysAllowButton.tap()
        return true
    }
    return false
}

// Copy-paste if there is more than one alert to handle in the app
@Joe Masilotti's answer is correct, and thanks for that, it helped me a lot :)
I would just like to point out one thing: the UIInterruptionMonitor catches all system alerts presented in series TOGETHER, so the action you apply in the completion handler gets applied to every alert ("Don't Allow" or "OK"). If you want to handle the alert actions differently, you have to check, inside the completion handler, which alert is currently presented, e.g. by checking its static text; the action will then be applied only to that alert.
Here's a small code snippet that applies the "Don't Allow" action to the second alert in a series of three, and the "OK" action to the remaining two:
addUIInterruptionMonitor(withDescription: "Access to sound recording") { (alert) -> Bool in
if alert.staticTexts["MyApp would like to use your microphone for recording your sound."].exists {
alert.buttons["Don’t Allow"].tap()
} else {
alert.buttons["OK"].tap()
}
return true
}
app.tap()
This is an old question, but there is now another way to handle these alerts.
The system alert isn't accessible from the app context of the app you launched, but you can access another app's context anyway. Look at this simple example:
func testLoginHappyPath() {
    let app = XCUIApplication()
    app.textFields["Username"].typeText("Billy")
    app.secureTextFields["Password"].typeText("hunter2")
    app.buttons["Log In"].tap()
}
In a vacuum, with a simulator already launched and permissions already granted or denied, this will work. But if we put it in a CI pipeline where it gets a brand-new simulator, all of a sudden it won't be able to find that Username field, because a notification alert is popping up.
So now there are three choices for how to handle that:
Implicitly
There's already a default system-alert interruption handler. So in theory, simply trying to typeText into that first field should check for an interrupting event and handle it in the affirmative.
If everything works as designed, you won't have to write any code, but you'll see the interruption logged and handled in the test log, and your test will take a couple of seconds more.
Explicitly via an interruption monitor
I won't rewrite the previous work on this, but this is where you explicitly set up an interruption monitor to handle the specific alert being popped up, or whatever alerts you expect to happen.
This is useful if the built-in handler doesn't do what you want - or doesn't work at all.
Explicitly via the XCUITest framework
In Xcode 9.0 and above, you can switch between app contexts fluidly by simply defining multiple XCUIApplication() instances. Then you can locate the field you need via the familiar methods. Doing this explicitly looks like the following:
func testLoginHappyPath() {
    let app = XCUIApplication()
    let springboardApp = XCUIApplication(bundleIdentifier: "com.apple.springboard")
    if springboardApp.alerts["\"FunHappyApp\" would like permission to own your soul."].exists {
        springboardApp.alerts.buttons["Allow"].tap()
    }
    app.textFields["Username"].typeText("Billy")
    app.secureTextFields["Password"].typeText("hunter2")
    app.buttons["Log In"].tap()
}
It sounds like the approaches to implementing camera access and notifications are threaded, as you say, but not explicitly managed, leaving to chance when and how the dialogues are displayed.
I suspect one is triggered by the other, and that when it is programmatically clicked it wipes out the other one as well (which Apple would probably never allow).
Think about it: you're asking for a user's permission, then making the decision on their behalf? Why? Maybe because you can't get your code to work.
How to fix it: trace where these two components trigger the pop-up dialogues (where are they being called?), rewrite to trigger just one, and send an NSNotification when the first dialogue has completed, to trigger and display the remaining one.
I would seriously discourage programmatically clicking dialogue buttons meant for the user.

Set an initial focal distance on iOS

I'm working on an iOS-app where one of the features is scanning QR-codes. For this I'm using the excellent library, ZBar. The scanning works fine and is generally really quick. However when you use smaller QR-codes it takes a bit longer to scan, mostly due to the fact that the autofocus needs some time to adjust. I was experimenting and noticed that the focus could be locked using the following code:
AVCaptureDevice *cameraDevice = readerView.device;
if ([cameraDevice lockForConfiguration:nil]) {
    [cameraDevice setFocusMode:AVCaptureFocusModeLocked];
    [cameraDevice unlockForConfiguration];
}
When this code is used after a successful scan, the coming scans are really quick. That made me wonder, could I somehow lock the focus before even scanning one code? The app will only scan rather small QR-codes so there will never be a need for focusing on something far away. Sure, I could implement something like tap to focus, but preferably I would like to avoid that extra step.
Is there a way to achieve this? Or are there maybe another way of speeding things up when dealing with smaller QR-codes?
// Alexander
In iOS 7 this is now possible!
Apple has added the autoFocusRangeRestriction property to the AVCaptureDevice class. This property is of the enum type AVCaptureAutoFocusRangeRestriction, which has three different values:
AVCaptureAutoFocusRangeRestrictionNone - Default, no restrictions
AVCaptureAutoFocusRangeRestrictionNear - The subject that matters is close to the camera
AVCaptureAutoFocusRangeRestrictionFar - The subject that matters is far from the camera
To check if the restriction is available, we should first check that the property autoFocusRangeRestrictionSupported is true. And since it's only supported in iOS 7 and onwards, we should also use respondsToSelector so we don't get an exception on earlier iOS versions.
So the resulting code should look something like this:
AVCaptureDevice *cameraDevice = zbarReaderView.device;
if ([cameraDevice respondsToSelector:@selector(isAutoFocusRangeRestrictionSupported)] && cameraDevice.autoFocusRangeRestrictionSupported) {
    // If we are on an iOS version that supports AutoFocusRangeRestriction, and the device supports it,
    // set the focus range to "near"
    if ([cameraDevice lockForConfiguration:nil]) {
        cameraDevice.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
        [cameraDevice unlockForConfiguration];
    }
}
This seems to somewhat speed up the scanning of small QR-codes according to my initial tests :)
Update - iOS 8
With iOS 8, Apple has given us lots of new camera APIs to play with. One of these new methods is this one:
- (void)setFocusModeLockedWithLensPosition:(float)lensPosition completionHandler:(void (^)(CMTime syncTime))handler
This method locks focus by moving the lens to a position between 0.0 and 1.0. I played around with it, locking the lens at close values. However, in general it caused more problems than it solved: you had to keep the QR codes/barcodes at a very specific distance, which could cause issues with codes of different sizes.
But I think I have found a pretty good alternative to locking focus altogether. When the user presses the scan button, I lock the lens to a close distance, and when scanning is finished I switch the camera back to autofocus. This keeps the benefits of autofocus while forcing the camera to begin at a close distance where a QR code/barcode is likely to be found. This in combination with:
cameraDevice.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
And:
cameraDevice.focusPointOfInterest = CGPointMake(0.5,0.5);
Results in a pretty snappy scanner.
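A hedged Swift sketch of that lock-then-refocus pattern, assuming an AVCaptureDevice named `camera` (the original is Objective-C, and the lens position value is illustrative):
import AVFoundation
import CoreGraphics

func beginScan(with camera: AVCaptureDevice) throws {
    try camera.lockForConfiguration()
    defer { camera.unlockForConfiguration() }
    // Start with the lens locked close, where a barcode is likely to be.
    camera.setFocusModeLocked(lensPosition: 0.1, completionHandler: nil)
}

func handBackToAutofocus(with camera: AVCaptureDevice) throws {
    try camera.lockForConfiguration()
    defer { camera.unlockForConfiguration() }
    if camera.isAutoFocusRangeRestrictionSupported {
        camera.autoFocusRangeRestriction = .near  // keep the autofocus hunting nearby
    }
    if camera.isFocusPointOfInterestSupported {
        camera.focusPointOfInterest = CGPoint(x: 0.5, y: 0.5)
    }
    if camera.isFocusModeSupported(.continuousAutoFocus) {
        camera.focusMode = .continuousAutoFocus
    }
}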
I also built a custom scanner with the APIs introduced in iOS 7 instead of using ZBar, mostly because the ZBar libs are quite outdated: just as the iPhone 5's ARMv7s forced a recompile, I would now have to recompile them again for ARM64.
// Alexander
iOS 8 recently added this configuration! It is almost like they read Stack Overflow.
/*!
 @method setFocusModeLockedWithLensPosition:completionHandler:
 @abstract
    Sets focusMode to AVCaptureFocusModeLocked and locks lensPosition at an explicit value.

 @param lensPosition
    The lens position, as described in the documentation for the lensPosition property. A value of AVCaptureLensPositionCurrent can be used
    to indicate that the caller does not wish to specify a value for lensPosition.

 @param handler
    A block to be called when lensPosition has been set to the value specified and focusMode is set to AVCaptureFocusModeLocked. If
    setFocusModeLockedWithLensPosition:completionHandler: is called multiple times, the completion handlers will be called in FIFO order.
    The block receives a timestamp which matches that of the first buffer to which all settings have been applied. Note that the timestamp
    is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers
    delivered via an AVCaptureVideoDataOutput. The client may pass nil for the handler parameter if knowledge of the operation's completion
    is not required.

 @discussion
    This is the only way of setting lensPosition.
    This method throws an NSRangeException if lensPosition is set to an unsupported level.
    This method throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:.
*/
- (void)setFocusModeLockedWithLensPosition:(float)lensPosition completionHandler:(void (^)(CMTime syncTime))handler NS_AVAILABLE_IOS(8_0);
EDIT: this is a method of AVCaptureDevice.
