Similar to this question, but looking for an answer that will work in the context of an XNA game.
How can I determine whether the device is in a landscape or portrait orientation? The answer given in the general question relies upon functionality built into PhoneApplicationPage. AFAIK, you wouldn't normally be using that class within the context of an XNA game on Windows Phone 7.
From Nick Gravelyn: http://forums.xna.com/forums/p/49684/298915.aspx#298915
Accelerometer isn't in the XNA Framework anymore. You can access it through these steps:
Add a reference to Microsoft.Devices.Sensors.dll
Add 'using Microsoft.Devices.Sensors;' to your using statements.
Hook up an event and start reading the accelerometer:
Try this:
try
{
    AccelerometerSensor.Default.ReadingChanged += Default_ReadingChanged;
    AccelerometerSensor.Default.Start();
}
catch (AccelerometerStartFailedException)
{
}
Add the event handler itself:
Like this:
void Default_ReadingChanged(object sender, AccelerometerReadingAsyncEventArgs e)
{
}
And you're good to go. Keep in mind, though, that the accelerometer doesn't work in the emulator, so there's no way to really test this without a device. You do need that try/catch, because Start will throw an exception in the emulator, which doesn't support the accelerometer.
This has changed, it seems. In order to hook into the ReadingChanged event you have to create an Accelerometer instance and then start it. The required code can be found on MSDN: Retrieving Accelerometer Input (Windows Phone).
It looks something like this:
#if WINDOWS_PHONE
Accelerometer accelerometer;

try
{
    accelerometer = new Accelerometer();
    accelerometer.ReadingChanged += new EventHandler<AccelerometerReadingEventArgs>(a_ReadingChanged);
    accelerometer.Start();
}
catch (AccelerometerFailedException e)
{
}
...
}

void a_ReadingChanged(object sender, AccelerometerReadingEventArgs e)
{
    // this event is not raised by the Windows Phone 7 emulator
    throw new NotImplementedException();
}
#endif
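The MSDN sample leaves the handler as a stub. If you want to infer orientation from the raw reading yourself, a rough sketch (my own, not from the article; the "which axis does gravity dominate" test is illustrative only) could look like this:

// assume this field is read by your game's Update() loop
volatile bool isPortrait;

void a_ReadingChanged(object sender, AccelerometerReadingEventArgs e)
{
    // Gravity shows up mostly on the Y axis when the phone is held upright (portrait)
    // and mostly on the X axis when it is held on its side (landscape).
    this.isPortrait = Math.Abs(e.Y) > Math.Abs(e.X);

    // Note: this event is raised on a background thread, so don't touch game state directly here.
}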
Here's a post from Shawn Hargreaves' Blog
http://blogs.msdn.com/b/shawnhar/archive/2010/07/12/orientation-and-rotation-on-windows-phone.aspx
If you want to automatically switch between both landscape and portrait orientations as the phone is rotated:
graphics.SupportedOrientations = DisplayOrientation.Portrait |
                                 DisplayOrientation.LandscapeLeft |
                                 DisplayOrientation.LandscapeRight;
Switching between LandscapeLeft and LandscapeRight can be handled automatically with no special help from the game, and is therefore enabled by default. But switching between landscape and portrait alters the backbuffer dimensions (short-and-wide vs. tall-and-thin), which will most likely require you to adjust your screen layout. Not all games will be able to handle this (and some designs only make sense one way up), so dynamic switching between landscape and portrait is only enabled for games that explicitly opt-in by setting SupportedOrientations.
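To actually answer the original question once orientation switching is enabled: the GameWindow exposes the current orientation directly. A minimal sketch (assuming the standard XNA Game template, inside your Game subclass):

// Query the current orientation whenever you need it, e.g. in Update():
bool isPortrait = (Window.CurrentOrientation == DisplayOrientation.Portrait);

// Or react when it changes, which is where you would rebuild your screen layout
// for the new backbuffer dimensions:
Window.OrientationChanged += (sender, args) =>
{
    bool nowPortrait = (Window.CurrentOrientation == DisplayOrientation.Portrait);
    // adjust HUD / viewport layout here
};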
I am trying to create a Xamarin.Forms app that will run on both iOS and Android. Eventually I need instances of the app to communicate with each other via Bluetooth, but I'm stuck on getting the iOS side to do anything with Bluetooth. I originally tried to work with Plugin.BluetoothLE and Plugin.BLE, but after a week and a half I was not able to get advertising or scanning to work on either OS with either plugin, so I decided to try implementing simple Bluetooth interaction using the .NET wrappers of the platform APIs, which at least are well documented. I did get scanning to work fine on the Android side. With iOS, though, what I have right now builds just fine, and runs on my iPad without errors, but the DiscoveredPeripheral handler is never called, even though the iPad is just a few inches from the Android tablet and presumably should be able to see the same devices. I have verified this by setting a breakpoint in that method, which is never reached; and when I open the Bluetooth Settings on the iPad to make it discoverable the app version on the Android tablet can see it, so I don't think it's an iPad hardware issue.
It seems obvious that there is simply some part of the process I don't know to do, but it's not obvious (to me) where else to look to find out what it is. Here is the code for the class that interacts with the CBCentralManager (as far as I understand from what I've read, this should include everything necessary to return a list of peripherals):
using MyBluetoothApp.Shared; // for the interfaces and constants
using CoreBluetooth;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Xamarin.Forms;

[assembly: Dependency(typeof(MyBluetoothApp.iOS.PeripheralScanner))]
namespace MyBluetoothApp.iOS
{
    public class PeripheralScanner : IPeripheralScanner
    {
        private readonly CBCentralManager manager;
        private List<IPeripheral> foundPeripherals;

        public PeripheralScanner()
        {
            this.foundPeripherals = new List<IPeripheral>();
            this.manager = new CBCentralManager();
            this.manager.DiscoveredPeripheral += this.DiscoveredPeripheral;
            this.manager.UpdatedState += this.UpdatedState;
        }

        public async Task<List<IPeripheral>> ScanForService(string serviceUuid)
        {
            return await this.ScanForService(serviceUuid, BluetoothConstants.DEFAULT_SCAN_TIMEOUT);
        }

        public async Task<List<IPeripheral>> ScanForService(string serviceUuid, int duration)
        {
            CBUUID uuid = CBUUID.FromString(serviceUuid);
            //this.manager.ScanForPeripherals(uuid);
            this.manager.ScanForPeripherals((CBUUID)null); // For now I'd be happy to see ANY peripherals
            await Task.Delay(duration);
            this.manager.StopScan();
            return this.foundPeripherals;
        }

        private void DiscoveredPeripheral(object sender, CBDiscoveredPeripheralEventArgs args)
        {
            this.foundPeripherals.Add(new CPeripheral(args.Peripheral));
        }

        private void UpdatedState(object sender, EventArgs args)
        {
            CBCentralManagerState state = ((CBCentralManager)sender).State;
            if (CBCentralManagerState.PoweredOn != state)
            {
                throw new Exception(state.ToString());
            }
        }
    }
}
Can anyone point me in the direction of understanding what I'm missing?
EDIT: O...K, I've discovered quite by accident that if I do this in the shared code:
IPeripheralScanner scanner = DependencyService.Get<IPeripheralScanner>();
List<IPeripheral> foundPeripherals = await scanner.ScanForService(BluetoothConstants.VITL_SERVICE_UUID);
twice in a row, it works the second time. I feel both more hopeful and much more confused.
The underlying problem was that in the first instantiation of PeripheralScanner, ScanForService was being called before State was updated. I tried many ways of waiting for that event to be raised so I could be sure the state was PoweredOn, but nothing seemed to work; polling loops simply never reached the desired state, but if I threw an Exception in the UpdatedState handler it was thrown within milliseconds of launch and the state at that time was always PoweredOn. (Breakpoints in that handler caused the debugging to freeze with the output Resolved pending breakpoint, which not even the VS team seems to be able to explain).
Reading some of the Apple developer blogs I found that this situation is most often avoided by having the desired action occur within the UpdatedState handler. It finally soaked into my thick head that I was never seeing any effects from that handler running because the event was being raised and handled on a different thread. I really need to pass the service UUID to the scanning logic, and to interact with a generic List that I can return from ScanForService, so just moving it all to the handler didn't seem like a promising direction. So I created a singleton for flagging the state:
internal sealed class ManagerState // .NET makes singletons easy - Lazy<T> FTW
{
    private static readonly Lazy<ManagerState> lazy = new Lazy<ManagerState>(() => new ManagerState());

    internal static ManagerState Instance { get { return ManagerState.lazy.Value; } }

    internal bool IsPoweredOn { get; set; }

    private ManagerState()
    {
        this.IsPoweredOn = false;
    }
}
and update it in the handler:
private void updatedState(object sender, EventArgs args)
{
    ManagerState.Instance.IsPoweredOn = CBCentralManagerState.PoweredOn == ((CBCentralManager)sender).State;
}
then poll that at the beginning of ScanForService (in a separate thread each time because, again, I will not see the updates in my base thread):
while (false == await Task.Run(() => ManagerState.Instance.IsPoweredOn)) { }
I'm not at all sure this is the best solution, but it does work, at least in my case. I guess I could move the logic to the handler and create a fancier singleton class for moving all the state back and forth, but that doesn't feel as good to me.
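One alternative that avoids the polling loop (a sketch of my own against the PeripheralScanner class above, not something I have tested) is to let the UpdatedState handler complete a TaskCompletionSource and await that task at the start of the scan:

// Additions/changes to the PeripheralScanner class shown earlier.
private readonly TaskCompletionSource<bool> poweredOn = new TaskCompletionSource<bool>();

private void UpdatedState(object sender, EventArgs args)
{
    if (CBCentralManagerState.PoweredOn == ((CBCentralManager)sender).State)
    {
        // Completes the task at most once; safe to call from the callback thread.
        this.poweredOn.TrySetResult(true);
    }
}

public async Task<List<IPeripheral>> ScanForService(string serviceUuid, int duration)
{
    // Await the PoweredOn signal instead of polling a flag across threads.
    await this.poweredOn.Task;

    this.manager.ScanForPeripherals((CBUUID)null);
    await Task.Delay(duration);
    this.manager.StopScan();
    return this.foundPeripherals;
}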
I have a really weird issue with cocos2d-x v3: the first 15 touches or so are not registered on my iOS devices (tried an iPad 2 and an iPad Air). As soon as a touch is finally registered, everything works fine (i.e. all touches after that trigger the onTouch functions).
The touch events work perfectly fine in the simulator.
Also, the same code works perfectly fine in my Windows and Android builds.
Has anyone had this happen, or maybe know what could be causing it?
I'm using the listener, and I debugged up to the spot where touchesBegan forwards the input events to the listener, but even there the events don't come in until after the 15th tap or so.
It's really weird... And I figured I'd give it a shot here, as someone might have encountered this as well, before I start stripping code to as clean as possible, and then try to work my way back from there...
Kind regards,
Michaël
EDIT: As requested, here is some code. The desired behaviour is that it works on iOS devices like it should: the first touch triggers onTouchBegan.
I didn't add it before as I didn't think it would matter, since the code works fine for Android.
But I appreciate that you'd like to see it, just in case I might have missed something.
GameLayer is a cocos2d::Layer.
void GameLayer::onEnter()
{
    cocos2d::CCLayer::onEnter();

    // Register Touch Event
    auto pEventDispatcher = cocos2d::Director::getInstance()->getEventDispatcher();
    if (pEventDispatcher)
    {
        // Touch listener
        auto pTouchListener = cocos2d::EventListenerTouchOneByOne::create();
        if (pTouchListener)
        {
            pTouchListener->setSwallowTouches( true );
            pTouchListener->onTouchBegan = CC_CALLBACK_2( GameLayer::onTouchBegan, this );
            pTouchListener->onTouchMoved = CC_CALLBACK_2( GameLayer::onTouchMoved, this );
            pTouchListener->onTouchEnded = CC_CALLBACK_2( GameLayer::onTouchEnded, this );
            pTouchListener->onTouchCancelled = CC_CALLBACK_2( GameLayer::onTouchCancelled, this );

            pEventDispatcher->addEventListenerWithSceneGraphPriority( pTouchListener, this );
        }
    }
}

bool GameLayer::onTouchBegan( cocos2d::Touch* pTouch, cocos2d::Event* /*pEvent*/ )
{
    // Breakpoint here triggers fine on first touch for Android/Windows/iOS Simulator,
    // but not on iOS device (iPad/iPhone)
    bool breakHere = true;

    <<snip actual code>>
}
EDIT:
The problem was an std::ofstream trying to open() on the iOS device (most likely in a folder it didn't have access to).
I have lots of layers in my game and I don't do it like you do. The way your code fetches the EventDispatcher locally and creates the touch listener in so many separate steps seems odd to me; I've never seen it done that way.
I do:
auto listener = cocos2d::EventListenerTouchOneByOne::create();
listener->setSwallowTouches(true);

listener->onTouchBegan = [&](cocos2d::Touch* touch, cocos2d::Event* event)
{
    return true;
};

listener->onTouchEnded = [=](cocos2d::Touch* touch, cocos2d::Event* event)
{
    // ... do something
};

cocos2d::Director::getInstance()->getEventDispatcher()->addEventListenerWithFixedPriority(listener, 31);
I got it fixed.
The problem was seemingly totally unrelated: I was trying to open an std::ofstream file (my log file), most likely in a folder the app didn't have (any, or at least write) access to. That file is neither required nor wanted on the iOS device.
Once I added iOS to the exclusion list (just like Android and some other targets), everything started to work perfectly.
I do not know what goes wrong exactly, or why it does start working after a few touch inputs, but I'm guessing it was waiting for or retrying something in the background.
I found the issue while debugging another one :)
Hopefully this helps anyone else who might stumble onto the same or a related issue.
Kind regards,
Michaël
So I had this idea to test the implementation of my screen tracking (with Google Analytics) on my app using UI automation.
The original idea was to build a UI script to go through the screens while checking if the tracking events are being sent accordingly. I need this as sometimes I'm not able to compose everything out of view controllers or the events are not forwarded in the expected order. Regardless of that, I should test this aspect of my app as well and I thought that UI automation was the answer.
I have implemented a script to go through the screens using the UI Automation instrument and this is working correctly. I even went so far as to use tuneup.js to make the code more streamlined and easier to follow.
I was expecting to have something like (in general terms, the syntax is only a simplification):
Being on screen X
Tap button A
Expect screen Y and tracking event for the screen Y
However, as far as I was able to check, testing the screen tracking is not possible with UI Automation.
Or am I missing something?
I thought of creating an invisible view that stays on top of all the view hierarchy and changing its name every time a new screen is loaded to allow me to test it with UI automation but the idea sounded a little over the top...
What do you people suggest? Look for another UI automation tool? Do it with unit testing instead?
Thanks in advance for any help
You could use a UIAlertView and inspect those alerts. Instead of sending the analytics events, you can pop up an alert and check for it in UI Automation.
Analytics abstraction frameworks like AnalyticsKit provide an easy way to change the analytics provider. And AnalyticsKit even has an example for that (take a look at the AnalyticsKitDebugProvider class). So the changes to your production code are minimal.
You could use a build configuration where you set a build variable to control the initialization of your analytics:
id<AnalyticsKitProvider> provider;

#ifdef USE_UI_AUTOMATION_ANALYTICS
    provider = [[TestAutomationProvider alloc] init];
#else
    provider = [[RealProvider alloc] initWithApiKey:API_KEY];
#endif

[AnalyticsKit initializeLoggers:@[provider]];
In UIAutomation you can test for the alert coming up. You can utilize assertions.js out of the tuneup.js package to write a function like this
function checkForAlert()
{
    var alert = null;
    retry( function() {
        log("wait until alert appears");
        alert = UIATarget.localTarget().frontMostApp().alert();
        assertNotNull(alert, "No alert found");
        assertTrue("The name you can choose for the alert" == alert.name());
    }, 5, 1.0);
    return alert;
}
This combines waiting for the alert and testing whether it finally appears. If the alert does not appear, the test will fail.
In your test you use this in the following way:
var analyticAlert = checkForAlert() // if alert appears it will be in the var, otherwise the test fails at this point.
analyticAlert.buttons()["OK"].tap(); // dismiss the alert
To make this work you also need to set an onAlert handler; otherwise UIAutomation would try to dismiss your alert immediately. This has to be done before your test code. Alert handling is explained in the UIAutomation docs.
function MyOnAlertHandler(alert)
{
    if ("The name you choose" == alert.name()) // filter all alerts created by the analytics provider
    {
        return true; // handle the alert in your test
    }
    return false; // automatically dismiss all other alerts
}

UIATarget.onAlert = MyOnAlertHandler; // set the alert handler
I'm working on an iOS-app where one of the features is scanning QR-codes. For this I'm using the excellent library, ZBar. The scanning works fine and is generally really quick. However when you use smaller QR-codes it takes a bit longer to scan, mostly due to the fact that the autofocus needs some time to adjust. I was experimenting and noticed that the focus could be locked using the following code:
AVCaptureDevice *cameraDevice = readerView.device;

if ([cameraDevice lockForConfiguration:nil]) {
    [cameraDevice setFocusMode:AVCaptureFocusModeLocked];
    [cameraDevice unlockForConfiguration];
}
When this code is used after a successful scan, the coming scans are really quick. That made me wonder, could I somehow lock the focus before even scanning one code? The app will only scan rather small QR-codes so there will never be a need for focusing on something far away. Sure, I could implement something like tap to focus, but preferably I would like to avoid that extra step.
Is there a way to achieve this? Or are there maybe another way of speeding things up when dealing with smaller QR-codes?
// Alexander
In iOS7 this is now possible!
Apple has added the property autoFocusRangeRestriction to the AVCaptureDevice class. This property is of the enum AVCaptureAutoFocusRangeRestriction which has three different values:
AVCaptureAutoFocusRangeRestrictionNone - Default, no restrictions
AVCaptureAutoFocusRangeRestrictionNear - The subject that matters is close to the camera
AVCaptureAutoFocusRangeRestrictionFar - The subject that matters is far from the camera
To check if this is available we should first check if the property autoFocusRangeRestrictionSupported is true. And since it's only supported in iOS 7 and onwards, we should also use respondsToSelector so we don't get an exception on earlier iOS versions.
So the resulting code should look something like this:
AVCaptureDevice *cameraDevice = zbarReaderView.device;

if ([cameraDevice respondsToSelector:@selector(isAutoFocusRangeRestrictionSupported)] && cameraDevice.autoFocusRangeRestrictionSupported) {
    // If we are on an iOS version that supports AutoFocusRangeRestriction and the device supports it,
    // set the focus range to "near"
    if ([cameraDevice lockForConfiguration:nil]) {
        cameraDevice.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
        [cameraDevice unlockForConfiguration];
    }
}
This seems to somewhat speed up the scanning of small QR-codes according to my initial tests :)
Update - iOS8
With iOS 8, Apple has given us lots of new camera APIs to play with. One of these new methods is this one:
- (void)setFocusModeLockedWithLensPosition:(float)lensPosition completionHandler:(void (^)(CMTime syncTime))handler
This method locks focus by moving the lens to a position between 0.0 and 1.0. I played around with the method, locking the lens at close values. However, in general it caused more problems than it solved. You had to keep the QR-codes/barcodes at a very specific distance, which could cause issues when you had codes of different sizes.
But I think I have found a pretty good alternative to locking focus altogether. When the user presses the scan button, I lock the lens to a close distance, and when scanning is finished I switch the camera back to auto focus. This gives us the benefits of keeping auto focus on, but forces the camera to begin at a close distance where a QR-code/barcode is likely to be found. This in combination with:
cameraDevice.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
And:
cameraDevice.focusPointOfInterest = CGPointMake(0.5,0.5);
Results in a pretty snappy scanner.
I also built a custom scanner with the APIs introduced in iOS 7, instead of using ZBar, mostly because the ZBar libs are quite outdated and, just as when the iPhone 5 introduced ARMv7s, I would now have to recompile them again for ARM64.
// Alexander
iOS 8 recently added this configuration! It is almost like they read Stack Overflow.
/*!
 @method setFocusModeLockedWithLensPosition:completionHandler:
 @abstract
    Sets focusMode to AVCaptureFocusModeLocked and locks lensPosition at an explicit value.

 @param lensPosition
    The lens position, as described in the documentation for the lensPosition property. A value of AVCaptureLensPositionCurrent can be used
    to indicate that the caller does not wish to specify a value for lensPosition.

 @param handler
    A block to be called when lensPosition has been set to the value specified and focusMode is set to AVCaptureFocusModeLocked. If
    setFocusModeLockedWithLensPosition:completionHandler: is called multiple times, the completion handlers will be called in FIFO order.
    The block receives a timestamp which matches that of the first buffer to which all settings have been applied. Note that the timestamp
    is synchronized to the device clock, and thus must be converted to the master clock prior to comparison with the timestamps of buffers
    delivered via an AVCaptureVideoDataOutput. The client may pass nil for the handler parameter if knowledge of the operation's completion
    is not required.

 @discussion
    This is the only way of setting lensPosition.
    This method throws an NSRangeException if lensPosition is set to an unsupported level.
    This method throws an NSGenericException if called without first obtaining exclusive access to the receiver using lockForConfiguration:.
*/
- (void)setFocusModeLockedWithLensPosition:(float)lensPosition completionHandler:(void (^)(CMTime syncTime))handler NS_AVAILABLE_IOS(8_0);
EDIT: this is a method of AVCaptureDevice
How can the iPhone be set to vibrate once?
For example, when a player loses a life or the game is over, the iPhone should vibrate.
From "iPhone Tutorial: Better way to check capabilities of iOS devices":
There are two seemingly similar functions that take a parameter kSystemSoundID_Vibrate:
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
Both of the functions vibrate the iPhone. But, when you use the first function on devices that don't support vibration, it plays a beep sound. The second function, on the other hand, does nothing on unsupported devices. So if you are going to vibrate the device continuously, as an alert, common sense says, use function 2.
First, add the AudioToolbox framework (AudioToolbox.framework) to your target in Build Phases.
Then, import this header file:
#import <AudioToolbox/AudioServices.h>
Swift 2.0+
AudioToolbox now presents the kSystemSoundID_Vibrate as a SystemSoundID type, so the code is:
import AudioToolbox.AudioServices
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate)
Instead of having to go through the extra cast step.
(Props to @Dov)
Original Answer (Swift 1.x)
And here's how you do it in Swift (in case you ran into the same trouble as I did):
Link against AudioToolbox.framework (Go to your project, select your target, build phases, Link Binary with Libraries, add the library there)
Once that is completed:
import AudioToolbox.AudioServices
// Use either of these
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
AudioServicesPlayAlertSound(SystemSoundID(kSystemSoundID_Vibrate))
The cheesy thing is that SystemSoundID is basically a typealias (fancy Swift typedef) for UInt32, and kSystemSoundID_Vibrate is a regular Int. The compiler gives you an error for trying to cast from Int to UInt32, but the error reads as "Cannot convert to SystemSoundID", which is confusing. Why Apple didn't just make it a Swift enum is beyond me.
@aponomarenko's answer goes into the details; mine is just for the Swifters out there.
A simple way to do so is with Audio Services:
#import <AudioToolbox/AudioToolbox.h>
...
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
I had great trouble with this for devices that had vibration turned off in some manner, but we needed it to work regardless, because it is critical to our application's functioning, and since it is just an integer passed to a documented method call, it will pass validation. So I tried some sounds outside of the well-documented ones listed here: TUNER88/iOSSystemSoundsLibrary
I then stumbled upon sound ID 1352, which works regardless of the silent switch or the settings on the device (Settings -> Vibrate on Ring / Vibrate on Silent).
- (void)vibratePhone
{
    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"])
    {
        AudioServicesPlaySystemSound (1352); // works ALWAYS as of this post
    }
    else
    {
        // Not an iPhone, so doesn't have vibrate
        // play the less annoying tick noise or one of your own
        AudioServicesPlayAlertSound (1105);
    }
}
Important Note: Alert of Future Deprecation.
As of iOS 9.0, the API functions description for:
AudioServicesPlaySystemSound(inSystemSoundID: SystemSoundID)
AudioServicesPlayAlertSound(inSystemSoundID: SystemSoundID)
includes the following note:
This function will be deprecated in a future release.
Use AudioServicesPlayAlertSoundWithCompletion or
AudioServicesPlaySystemSoundWithCompletion instead.
The right way to go will be to use either of these two:
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate, nil)
or
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate) {
    // your callback code for when the vibration is done (it may not vibrate on an iPod, but this callback will always be called)
}
remember to import AudioToolbox
For an iPhone 7/7 Plus or newer, use these three Haptic feedback APIs.
Available APIs
For notifications:
let generator = UINotificationFeedbackGenerator()
generator.notificationOccurred(.error)
Available styles are .error, .success, and .warning. Each has its own distinctive feel.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to communicate successes, failures, and warnings.
For simple vibrations:
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.impactOccurred()
Available styles are .heavy, .medium, and .light. These are simple vibrations with varying degrees of "hardness".
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to simulate physical impacts
For when the user selects an item:
let generator = UISelectionFeedbackGenerator()
generator.selectionChanged()
This is the least noticeable of all the haptics, and so is the most suitable for when haptics should not be taking over the app experience.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to indicate a change in selection.
Notes
There are a couple of things worth remembering when using these APIs.
Note A
You do not actually create the haptic. You request that the system generate one. The system will decide, based on the following:
If haptics are possible on the device (whether it has a Taptic Engine, in this case)
Whether the app may record audio (haptics are not generated during recording, to prevent unwanted interference)
Whether haptics are enabled in system Settings.
Therefore, the system will silently ignore your request for a haptic if it is not possible. If this is due to an unsupported device, you could try this:
func haptic() {
    // Get whether the device can generate haptics or not
    // If feedbackSupportLevel is nil, will assign 0
    let feedbackSupportLevel = UIDevice.current.value(forKey: "_feedbackSupportLevel") as? Int ?? 0

    switch feedbackSupportLevel {
    case 2:
        // 2 means the device has a Taptic Engine
        // Put Taptic Engine code here, using the APIs explained above
        break
    case 1:
        // 1 means no Taptic Engine, but the device supports AudioToolbox
        // AudioToolbox code from the myriad of other answers!
        break
    default: // 0
        // No haptic support
        // Do something else, like a beeping noise or LED flash instead of haptics
        break
    }
}
Substitute the comments in the switch-case statements, and this haptic generation code will be portable to other iOS devices. It will generate the highest level of haptic possible.
Note B
Due to the fact that generating haptics is a hardware-level task, there may be latency between when you call the haptic-generation code and when it actually happens. For this reason, the Taptic Engine APIs all have a prepare() method, to put the engine in a state of readiness. Using your Game Over example: you may know that the game is about to end because the user has very low HP, or a dangerous monster is near them.
If you don't generate a haptic within a few seconds, the Taptic Engine will go back into an idle state (to save battery life).
In this case, preparing the Taptic Engine would create a higher-quality, more responsive experience.
For example, let's say your app uses a pan gesture recogniser to change the portion of the world that is visible. You want a haptic to be generated when the user 'looks' round 360 degrees. Here is how you could use prepare():
// Keep the generator in a property so prepare() and impactOccurred() hit the same instance
// across gesture callbacks.
var haptic = UIImpactFeedbackGenerator(style: .heavy)

@IBAction func userChangedViewablePortionOfWorld(_ gesture: UIPanGestureRecognizer!) {
    switch gesture.state {
    case .began:
        // The user started dragging the screen.
        haptic.prepare()
    case .changed:
        // The user is trying to 'look' in another direction
        // Code to change the viewable portion of the virtual world
        if virtualWorldViewpointDegreeMiddle == 360.0 {
            haptic.impactOccurred()
        }
    default:
        break
    }
}
And if you're using the Xamarin (MonoTouch) framework, simply call
SystemSound.Vibrate.PlayAlertSound()
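For completeness, a minimal Xamarin.iOS sketch of that call in context (assuming the AudioToolbox binding that ships with Xamarin.iOS):

using AudioToolbox;

public static class Haptics
{
    public static void VibrateOnce()
    {
        // Equivalent of AudioServicesPlaySystemSound(kSystemSoundID_Vibrate):
        // vibrates on iPhones, does nothing on devices without a vibration motor.
        SystemSound.Vibrate.PlaySystemSound();

        // The alert variant may beep instead on devices that cannot vibrate:
        // SystemSound.Vibrate.PlayAlertSound();
    }
}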
In my travels I have found that if you try either of the following while you are recording audio, the device will not vibrate even if it is enabled.
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
My method was called at a specific time in the measurement of the device's movements. I had to stop the recording and then restart it after the vibration had occurred.
It looked like this:
-(void)vibrate {
    [recorder stop];
    AudioServicesPlaySystemSound (kSystemSoundID_Vibrate);
    [recorder start];
}
recorder is an AVRecorder instance.
Hope this helps others that have had the same problem before.
In iOS 10, and on newer iPhones, you can also use haptic API. This haptic feedback is softer than the AudioToolbox API.
For your GAME OVER scenario, a heavy UI impact feedback should be suitable.
UIImpactFeedbackGenerator(style: .heavy).impactOccurred()
You could use the other haptic feedback styles.
In Swift:
import AudioToolbox
...
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
In my case I was using the AVCaptureSession.
AudioToolbox was in the project's build phases and it was imported, but it still didn't work. In order to make it work, I stopped the session before the vibration and resumed it after that.
#import <AudioToolbox/AudioToolbox.h>
...
@property (nonatomic) AVCaptureSession *session;
...
- (void)vibratePhone
{
    [self.session stopRunning];
    NSLog(@"vibratePhone %@", @"here");

    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"])
    {
        AudioServicesPlaySystemSound (kSystemSoundID_Vibrate);
    }
    else
    {
        AudioServicesPlayAlertSound (kSystemSoundID_Vibrate);
    }

    [self.session startRunning];
}
You can use
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
for iPhones and a few newer iPods.
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
for iPads.