Is there any way to keep touches alive in iOS?

I'm testing the touch detection capabilities of the iPad (specifically an iPad 3 at the moment). I'm printing out a log of detected touches to see what is going on, using a bottle with three foam (touch-friendly) pads on its bottom.
The touches are detected fine and the log picks up even slight movements, which is great. The issue is that if the bottle isn't moved at all for a while, all three touches are forcibly cancelled and remain undetected until the bottle is lifted and placed back down.
So it seems there's a timeout on these touches... I haven't found any specific information on this. Is there any way to revive or keep touches alive without having to physically remove and restore the touch points?
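For reference, a minimal sketch of the kind of touch logging described above. TouchLogView is a hypothetical full-screen view with multi-touch enabled; the forced cancellation shows up in touchesCancelled:

    import UIKit

    // Hypothetical view that logs every touch phase, including the
    // forced cancellation described in the question.
    class TouchLogView: UIView {
        override init(frame: CGRect) {
            super.init(frame: frame)
            isMultipleTouchEnabled = true   // needed to see all three pads at once
        }
        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isMultipleTouchEnabled = true
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            log("began", touches)
        }
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            log("moved", touches)
        }
        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            log("ended", touches)
        }
        override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
            log("cancelled", touches)       // this is what fires after the timeout
        }

        private func log(_ phase: String, _ touches: Set<UITouch>) {
            for t in touches {
                print(phase, t.location(in: self))
            }
        }
    }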

A capacitive touch sensor is a very sensitive instrument, and a lot of filtering is done before the OS reports touches to you. This is also why the iPad cannot detect touches with a contact area smaller than that of a cheap stylus - a rubber dome about 7 mm in diameter. The touchscreen controller detects smaller touches just fine, but filters them out to avoid spurious signals. Note that the touchscreen of a Samsung Galaxy will detect touches from a dull pencil - the screen is (for practical purposes) the same as the iPad's, but the controller's cutoff is lower.
I would expect that the self-calibrating logic inside the touch controller will calibrate your touches away. Basically, it will set the signal produced by your foam pads as the new normal. On the application side there is nothing that you can do to reverse this.
Disclaimer: I have no inside knowledge of what goes on inside the touchscreen controller. My answer is based on much reading and solid reasoning, nothing more.

Related

Detect palm touching/releasing the iphone screen

Implementing a sort of 'distress call' button which should work as following:
User starts application and covers a screen with a palm of a hand
Some time passes, user may introduce additional touches during that time or remove some of the existing (but not all of them), location/shape of touches may change
When user releases a hand (i.e. removes last touch) a distress signal is emitted by the app
Basically, the app should register two events: (1) the screen is touched, (2) all touches are released.
I'm trying to use the touchesBegan/touchesEnded methods, and those work for small-area touches (fingertips), but when touching the screen with a full palm or even just the palm edge, touchesCancelled gets triggered immediately while the hand is still on the screen. Obviously, no further events are emitted when the hand is released afterwards.
I tried subclassing UIWindow and UIApplication and overriding sendEvent in both, but got no additional information - large-area touches trigger touch-begin and immediately touch-cancel, and releasing the hand afterwards emits nothing. In some cases large-area touches fire no events at all, not even touchesBegan. Basically, iOS doesn't let me handle a very basic scenario: detecting just the fact of a screen touch/release.
Is there any way to query the screen touch state directly and not work with responder chain? Or suppress the cancellation event from firing? Or maybe I'm missing something?
Unfortunately, as of right now, no solution exists.
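For reference, the sendEvent override described in the question looks roughly like this (a sketch only; as noted, large-area touches still arrive already cancelled, or not at all):

    import UIKit

    // Sketch of the UIWindow subclass approach from the question.
    // This observes every touch event before dispatch, but it cannot
    // stop the system from cancelling large-area touches.
    class EventLoggingWindow: UIWindow {
        override func sendEvent(_ event: UIEvent) {
            if let touches = event.allTouches {
                for touch in touches {
                    print("phase:", touch.phase.rawValue,
                          "at", touch.location(in: self))
                }
            }
            super.sendEvent(event)   // always forward to the normal responder chain
        }
    }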

Custom gestures from raw touch events on iOS

How do I receive raw touch events on iOS so I can build my own gestures with my own state machine? I'm looking for the analogue of Android's MotionEvent for multi-touch gestures. I need to be able to draw in the background view, but I don't need any OS components inside it.
iOS provides a gesture recognizer for defining custom gestures, and it delivers UITouch objects to the application. However, the former is incapable of implementing the complexity of my system, particularly at the efficiency I need, and the latter is preprocessed, e.g. to determine the tapCount. My system must decide for itself whether a touch is a tap or not.
Additionally, I need to ascertain for myself whether multiple simultaneously-touching fingers form a gesture or not. I can't have iOS interpreting four-finger gestures input (exclusively) on top of my viewing area, though I could live with iOS interpreting five-finger gestures. It's okay for iOS to preempt control from my view if any finger of the gesture starts outside of the view.
The application is a gesturing alternative to the keyboard, so it would ideally be able to operate in the keyboard area, in case this affects the answer.
Is this possible on iOS? What's the approach? I'd like to write a simple program outputting touch events to prove it can be done. Thanks for your help!
UPDATE: I was largely able to answer the question by playing with the Multitouch Visualizer app. Each tap is registered as it occurs, and gestures of 4 fingers or fewer appear to be received without iOS interfering. iOS takes over for some 5 finger gestures, but only if they satisfy certain (unknown) initial characteristics. I don't know how this is done, but I imagine via UITouch events.
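For anyone trying the same: a minimal sketch of receiving raw multi-touch events in a UIView subclass and feeding them to your own state machine. GestureStateMachine is a hypothetical placeholder, not a UIKit type:

    import UIKit

    // Hypothetical consumer of raw touches.
    protocol GestureStateMachine {
        func process(_ touches: Set<UITouch>, phase: UITouch.Phase, in view: UIView)
    }

    // View that forwards raw touch events instead of using UIGestureRecognizer.
    class RawTouchView: UIView {
        var stateMachine: GestureStateMachine?

        override init(frame: CGRect) {
            super.init(frame: frame)
            isMultipleTouchEnabled = true   // deliver all simultaneous fingers
        }
        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isMultipleTouchEnabled = true
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine?.process(touches, phase: .began, in: self)
        }
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine?.process(touches, phase: .moved, in: self)
        }
        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine?.process(touches, phase: .ended, in: self)
        }
        override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine?.process(touches, phase: .cancelled, in: self)
        }
    }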

UITouch Force on touchesBegan

3D Touch seems really cool, and I wanted to see how it would work in a musical context. I've been reading a bit about 3D Touch, and it seems the force property almost always reads as 0 when a touch begins (in the touchesBegan method).
How do I get a usable force when a touch begins? Even if I press with maximum "force", I still get that initial 0. I thought about approximating a value over a short time window, but that would require moving the touch. I just want a usable force from the very first touch.
Thanks!
Unfortunately this is impossible.
Every touch starts as a contact with no force at all, as soon as the user's finger makes contact (or even near-contact) with the screen. The force only starts to build as the user's finger is compressed against the screen.
The only way for touchesBegan: to report any force at all would be for it to delay artificially for some number of milliseconds before reporting the touch, to allow the force to build up. That would destroy the interactivity of all iOS applications, and Apple put a huge amount of work into delivering a touch the instant it is detected.
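The practical alternative is to ignore the initial zero and sample the force as it builds. A sketch, assuming 3D Touch hardware (as I understand it, touchesMoved also fires when the force alone changes, so the finger need not move):

    import UIKit

    // Sketch: read the force as it ramps up after contact instead of in
    // touchesBegan, where it is still essentially zero.
    class ForceSamplingView: UIView {
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first,
                  traitCollection.forceTouchCapability == .available else { return }
            // 1.0 is the force of an average touch; normalize to 0...1.
            let normalized = touch.force / touch.maximumPossibleForce
            print("force:", touch.force, "normalized:", normalized)
        }
    }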

Suitable sensors in iPhone to pick the user interaction

I'm working on an app - one of its main features is to detect whether the user is using the phone: interacting with it in any way, or even just touching it!
What are the suitable sensors in the iPhone that can help me detect these things?
And how can I make use of them to build this feature?
Thanks
The iPhone has accelerometers, gyros, and GPS. With these you can monitor the motion of the phone: sudden shocks (like when the phone is picked up or put down), orientation, and overall movement. If outside, you can also use the GPS to track motion and position (latitude, longitude, course, speed, altitude).
When the user interacts with the app, you've also got touch events and multi-touch events (like using two fingers to zoom in, zoom out, or rotate). Most of the gestures are already coded and defined by Apple, so you don't need to figure out the user's intent - just respond to the event.
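A minimal Core Motion sketch of the motion part. The 0.3 g threshold is an assumption to tune empirically, and the manager must be kept alive (e.g. as a property):

    import Foundation
    import CoreMotion

    // Sketch: detect that the phone is being handled by watching device motion.
    let motionManager = CMMotionManager()

    func startHandlingDetection() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0   // 50 Hz
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // userAcceleration has gravity removed; a spike suggests a pick-up/put-down.
            let a = motion.userAcceleration
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            if magnitude > 0.3 {   // hypothetical threshold, tune to taste
                print("phone is being moved/handled")
            }
        }
    }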
Numerous sensor-monitoring apps exist, e.g.:
http://wavefrontlabs.com/Wavefront_Labs/Sensor_Data.html
Tutorials on how to do some of this stuff:
https://www.youtube.com/watch?v=Hml2jB_Qpds
https://www.youtube.com/watch?v=Xk5cJlhePCI
https://www.youtube.com/watch?v=qY4xCMTejH8
https://www.youtube.com/watch?v=7GHc8ySyWcY
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizer_basics/GestureRecognizer_basics.html

touchesMoved called at irregular intervals

I'm making a game for iOS where you mostly drag big objects across the screen. When I run the game on an actual iPad/iPhone for a while (continuously dragging the object in circles across the screen) every 5 minutes or so the dragged object goes all stuttery for about 10-30 seconds. Then, it goes back to moving silky-smooth.
Visually, it looks like the game's frame rate has dropped to 15 fps for a while, when in actual fact it's running at a rock-solid 60 fps the whole time. I noticed that the only thing that doesn't move smoothly is the dragged object; the rest of the game runs perfectly smoothly.
This led me to believe that the stuttering is related to the touch input in iOS. So I started looking at touchesMoved, and saw that it's normally called every 16 milliseconds (so touch input runs at 60 fps). So far so good.
Then I noticed that when the object starts stuttering, touchesMoved starts being called at weird time intervals, fluctuating wildly between 8 milliseconds and 50 milliseconds.
So while the touchscreen is in this weird state, sometimes touchesMoved will get called just 8 milliseconds after the previous call, and sometimes it won't get called until 50 ms after the previous call. Of course, this makes the dragged object look all choppy because its position is updated at irregular intervals.
Do you have any idea what could be causing touchesMoved to stop being called at regular intervals, as it normally does?
Bonus:
-Whenever I tilt the device to force a screen orientation change, roughly 70% of the time the touchscreen enters the aforementioned state where touchesMoved is called irregularly. Then after 10-20 seconds it goes back to normal and everything looks smooth again.
-I've tried this on two iPads and two iPhones, with iOS 6 and 7, and the issue appears in all of these devices.
-An OpenGLES view is used to display the graphics. It syncs to the display refresh rate using CADisplayLink.
-The Xcode project I'm using to test this was generated by the Unity3D game development tool, but I've found several non-Unity games where the same issue appears, so this appears to be a system-wide problem. Note that I'm measuring the timings in Objective-C using CFAbsoluteTimeGetCurrent, completely outside Unity.
This is not a bug in Unity.
Something inside the OS is getting into a bad state and the touch-drag messages stop flowing smoothly. Sometimes you'll get multiple updates in a frame and sometimes you'll get none.
The issue does not happen on the iPhone 4 or below, or if the game runs at a 30Hz frame rate.
I experienced this bug while using an in-house engine I'd written at a previous company. It first manifested itself after upgrading the UI system of a Scrabble-style game, where you drag the tiles around the screen. This was utterly bizarre, and I was never able to pin down the exact reproduction steps, but they seem to be timing-related, somehow.
It can also be seen on Angry Birds (unless they've fixed it by now), and a variety of other games on the market, basically anything with touch-drag scrolling or movement. For Angry Birds, just go into a level and start scrolling sideways. Most of the time it'll be silky smooth, but maybe 1 in 10 times, it'll be clunky. Re-start the app and try again.
A workaround is to drop the input update frequency to 30Hz for a few frames. This jolts the internals of the OS somehow and gives it time to straighten itself out, so when you crank it back up to 60Hz, it runs smoothly again.
Do that whenever you detect that the bad state has been entered.
I've also run into this. I can verify that it's a bug in Unity 4.3.x.
My 4.2.x builds process touches on device at 60Hz with TouchPhase.Moved on every frame.
My 4.3.x builds get 20-40Hz with TouchPhase.Stationary being emitted on dropped frames.
TestFlight history saved my sanity here.
Don't forget to file a bug with UT.
It's a real disaster. Sometimes it lags and sometimes it's really smooth. It lags even when GPU and CPU utilization are below 10%. It's not a Unity bug - I'm using cocos2d v3.
If someone found a cure, please post it!
I've been running into this as well. My current hypothesis, which I still have to verify, is that asking for a given frame rate (e.g. 60 fps) while actually achieving less (e.g. 45 fps) causes a timing/race condition between Unity requesting input from iOS at a higher rate than it is actually running at. However, if you set it to 30 fps (at least in my Unity 5.2 tests with iOS 9.1), you get a solid 30 Hz of input. When I disabled a chunk of my game so it ran at a very solid 60 fps, I consistently got 60 Hz of input from the touchscreen. This is what I have for now, but I still have to prove it in a simple project, which I haven't had time to do yet. Hope this helps someone else.
I found a potential solution to this problem here: https://forums.developer.apple.com/thread/62315 (posting here because it took me a lot of research to stumble across that link whereas this StackOverflow question was the first Google result).
To follow up on this, I got a response to my bug report to Apple. This is the response:
"As long as you don't cause any display updates the screen stays in low power and therefore 30hz mode, which in turn also keeps the event input stream down at 30 hz. A workaround is to actually cause a display update on each received move, even if it is just a one pixel move of a view if input is needed while no explicit screen update will be triggered."
In my application, using a GLKView, I set its preferredFramesPerSecond to 60. Occasionally, my input rate would randomly drop to 30 Hz. The response from Apple doesn't fully explain why this happens, but apparently the expected way to handle it is to call display() directly from touchesMoved() while dragging.
I've subclassed GLKView and set preferredFramesPerSecond to 60. On touchesBegan(), I set isPaused=true and start calling display() within touchesMoved(). On touchesEnded(), I set isPaused=false. Doing this, I'm no longer having any issues - the app is more performant than it's ever been.
Apple's example TouchCanvas.xcodeproj does all its drawing from within touchesMoved() as well, so I guess this is the intended way to handle this.
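A sketch of that workaround in Swift. Note that preferredFramesPerSecond and isPaused actually live on GLKViewController rather than GLKView; the method names below are the real UIResponder signatures:

    import GLKit

    // Sketch of the quoted workaround: pause the display link while dragging
    // and drive redraws from the touch stream instead.
    class DragRenderingController: GLKViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            preferredFramesPerSecond = 60
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            isPaused = true                       // stop display-link driven redraws
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)
            (view as? GLKView)?.display()         // one redraw per received move event
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesEnded(touches, with: event)
            isPaused = false                      // resume normal rendering
        }

        override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesCancelled(touches, with: event)
            isPaused = false                      // don't stay paused on cancellation
        }
    }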
As far as I can tell, your best bet to achieve a smooth look may be to interpolate between touch events, rather than immediately mapping your objects to your touch position.
tl;dr: It seems that merely having a CADisplayLink drive any OpenGL context or Metal device at 60 fps can cause this.
I was repro'ing this on my iPhone 7 Plus w/ iOS 10.2.1.
I made a small, simple sample app with a CADisplayLink with preferredFramesPerSecond = 60 (the setup is sketched after the list below) and tried the following rendering approaches:
GLKView w/ display()
CAEAGLLayer, used as prescribed by Apple at WWDC (opaque, sole layer, fullscreen, nothing drawn above it)
MTLDevice
In each case, the render method would only clear the screen, not try drawing anything else.
In each case, I saw the input rate problem.
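For context, the display-link setup used in these repros looks roughly like this (iOS 10+, where CADisplayLink gained preferredFramesPerSecond; the render body is a clear-only stub):

    import UIKit

    // Minimal 60 fps display-link driver, as used in the repro app.
    final class RenderDriver {
        private var displayLink: CADisplayLink?

        func start() {
            let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
            link.preferredFramesPerSecond = 60
            link.add(to: .main, forMode: .common)  // .common keeps it firing during touch tracking
            displayLink = link
        }

        func stop() {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func tick(_ link: CADisplayLink) {
            // Clear-only render pass here (GLKView.display(), a Metal clear, etc.).
        }
    }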
The following "tricks" also seemed to do nothing, when called inside touchesMoved:
Calling glkView.setNeedsDisplay() (with enableSetNeedsDisplay set to true)
Moving some other view
Calling glkView.display() (this actually seems to raise the input rate to about 40 events per second, but it doesn't look any better as far as I can tell, and seems wrong to do, so I wouldn't recommend it)
I gave up after running all these tests, and interpolate my object between the touch positions instead. That's what I'd recommend, too.
But I figured I'd include my research here in case somebody else takes a stab at it and finds a better solution.
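Since two answers here recommend interpolation, a sketch of the idea: keep the latest touch position as a target and ease the rendered object toward it once per frame, which hides the irregular event spacing. The smoothing factor is an assumption to tune:

    import UIKit

    // Sketch: decouple the object from raw touch timing by easing toward
    // the most recent touch position once per display-link frame.
    final class DragSmoother {
        var target = CGPoint.zero       // updated from touchesMoved
        var position = CGPoint.zero     // what you actually render

        // Call once per frame. Higher smoothing = snappier, lower = smoother;
        // 0.35 is a hypothetical starting point, not a tested value.
        func step(smoothing: CGFloat = 0.35) {
            position.x += (target.x - position.x) * smoothing
            position.y += (target.y - position.y) * smoothing
        }
    }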
