I'm using the new UIPreviewInteraction API and want to know the location where the user lifts up his finger.
Basically, the flow I want to create is:
1. User 3D-touches on something.
2. Bubbles appear around their finger.
3. User slides their finger on top of one of the bubbles.
4. User lifts their finger and the app registers the selection.
The UIPreviewInteraction API doesn't have a reference to the UITouch that initiated the interaction.
Is there another way to get it?
Alright, turns out I was going about it the wrong way.
I was trying to receive touch updates in the previewInteraction:didUpdatePreviewTransition:ended: method, but there you only receive updates while the 'peek' is in progress. Once it ends, you no longer receive any updates.
However, if you implement the previewInteraction:didUpdateCommitTransition:ended: method, you'll keep receiving touch location updates there using this code:
CGPoint touchPoint = [previewInteraction locationInCoordinateSpace:previewInteractionView];
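In context, the delegate callback looks roughly like this (a sketch for the bubble flow described above; previewInteractionView is the view the interaction was initialized with, and selectBubbleAtPoint:/highlightBubbleAtPoint: are hypothetical helpers):

- (void)previewInteraction:(UIPreviewInteraction *)previewInteraction
 didUpdateCommitTransition:(CGFloat)transitionProgress
                     ended:(BOOL)ended {
    // Still delivered after the 'peek' phase, unlike the preview transition callback.
    CGPoint touchPoint = [previewInteraction locationInCoordinateSpace:previewInteractionView];
    if (ended) {
        // The finger was lifted: register whichever bubble is under touchPoint.
        [self selectBubbleAtPoint:touchPoint];
    } else {
        // Finger still down: keep highlighting the bubble under it.
        [self highlightBubbleAtPoint:touchPoint];
    }
}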
You can find sample code from the Apple WWDC video here
Implementing a sort of 'distress call' button which should work as follows:
User starts the application and covers the screen with the palm of a hand.
Some time passes; the user may introduce additional touches during that time or remove some of the existing ones (but not all of them), and the location/shape of the touches may change.
When the user releases the hand (i.e. removes the last touch), a distress signal is emitted by the app.
Basically, the app should register two events: (1) the screen is touched, (2) all touches are released.
I'm trying to use the touchesBegan/touchesEnded methods, and they work for small-area touches (fingertips), but when the screen is touched with a full palm, or even just the edge of a palm, touchesCancelled gets triggered immediately while the hand is still on the screen. Obviously, no further events are emitted when the hand is released afterwards.
I tried subclassing UIWindow and UIApplication and overriding sendEvent in both, but got no additional information: large-area touches trigger a touch begin and immediately a touch cancel, and releasing the hand afterwards emits nothing. In some cases large-area touches fire no events at all, not even touchesBegan. Basically, iOS doesn't let me handle a very basic scenario: detecting just the fact of a screen touch/release.
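For reference, the sendEvent override amounts to little more than logging (a sketch of what I tried, in a UIWindow subclass installed as the app's window):

- (void)sendEvent:(UIEvent *)event {
    // Large-area touches show up as began immediately followed by cancelled,
    // and lifting the hand afterwards produces no event at all.
    for (UITouch *touch in event.allTouches) {
        switch (touch.phase) {
            case UITouchPhaseBegan:     NSLog(@"began");     break;
            case UITouchPhaseMoved:     NSLog(@"moved");     break;
            case UITouchPhaseEnded:     NSLog(@"ended");     break;
            case UITouchPhaseCancelled: NSLog(@"cancelled"); break;
            default:                                         break;
        }
    }
    [super sendEvent:event];
}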
Is there any way to query the screen's touch state directly rather than going through the responder chain? Or to prevent the cancellation event from firing? Or maybe I'm missing something?
Unfortunately, as of right now, no solution exists.
I wonder if I can detect the continuously changing pressure of user input on an iPhone 6s.
With a simple touch it's obvious: from UIResponder you can hook into touchesBegan and touchesMoved, so you have the coordinates every time the user touches the screen, and as they move their finger you get the current position on every update.
However, I'm not sure how the new force property works. If you read force in touchesBegan, wouldn't you just get whatever minimal pressure value was detected at the start of the touch?
Is there a way to get updates on the force value as it changes, just like with touchesMoved? Maybe there is a method like "forceChanged"?
From the iOS 9.1 Release Notes:
On 3D Touch capable devices, touch pressure changes cause touchesMoved: to be called.
Apps should be prepared to receive touch move events with no change in the x/y coordinates.
This means that a touchesMoved: callback can indicate a change in force, a change in location, or both!
Hopefully this information will one day make it into the UITouch documentation.
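Reading the force inside touchesMoved: then looks roughly like this (a sketch for a UIView or UIViewController subclass; the logging is only illustrative):

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    for (UITouch *touch in touches) {
        // Guard against devices without 3D Touch, where force stays at 0.
        if (self.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
            CGFloat normalizedForce = touch.force / touch.maximumPossibleForce; // 0.0 ... 1.0
            NSLog(@"force: %.2f", normalizedForce);
        }
    }
}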
Is there a way to fake touch events by simulating a tap on the screen at a certain coordinate? The problem I'm trying to solve is rendering a webpage to an OpenGL texture on a quad. Touches inside the quad need to be mapped to the view the webpage is rendered in. So if I could give that view a coordinate that the user touched, and the view could react to it as if it were a normal touch, that would be great.
So far I've found this, but the author claims your app will get rejected for using the undocumented API calls: http://www.cocoawithlove.com/2008/10/synthesizing-touch-event-on-iphone.html
Actually you can do it: there is a class for managing all the events on the device (but it was created only for testing purposes). Your app would indeed be rejected or pulled, just as Camera+ was in the beginning, so you should try a different approach. If you want to try it anyway, I think it's best to create your own UIEvent subclass and use [[UIApplication sharedApplication] sendEvent:yourEvent]; if the touches in the event are contained within the view's bounds, it will pass the hit test and the events will be delivered to the view just as if the touches were user-generated.
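If you'd rather stay within public API, one workaround that fits this specific webpage-on-a-quad setup (only a sketch; webView and quadPointToWebViewPoint: stand in for your own rendering/mapping code) is to forward the mapped coordinate to the web view as a JavaScript click instead of synthesizing a UIEvent:

- (void)forwardQuadTouchAtPoint:(CGPoint)quadPoint {
    // Map the quad-local point into the web view's page coordinates
    // (quadPointToWebViewPoint: is a hypothetical helper for your texture mapping).
    CGPoint p = [self quadPointToWebViewPoint:quadPoint];
    NSString *js = [NSString stringWithFormat:
        @"var el = document.elementFromPoint(%f, %f); if (el) { el.click(); }",
        p.x, p.y];
    [self.webView stringByEvaluatingJavaScriptFromString:js];
}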
I am using swipes to navigate through the pages of my jQuery Mobile / PhoneGap application. Do you know if it is possible to make the page transition start after a swipe distance of (for example) 50 px? In other words, it should start before the finger leaves the screen.
That would improve the user experience, as users wouldn't have that little wait between lifting their finger and the actual page transition.
Thanks for your time!
It should be possible using the 'touchstart' and 'touchmove' events. Record the position of the touch when 'touchstart' fires, then check the displacement whenever 'touchmove' fires. If the displacement exceeds 50 px, call your page-switching function. 'touchend' fires when the user lifts their finger, so you may need to compensate for that if any special actions happen at that point.
A good place to start is Padilicious's swipe library (http://padilicious.com/code/touchevents/). This can easily be modified to support a swipe-distance setup.
Let me know if you need any more information.
I'm overriding the touches methods in a UIView for a piano app.
If I touch the iPhone or iPad with one finger I get, as expected, the touchesBegan callback. And if I touch with a second finger I get that event in the touchesMoved callback. This is all fine, BUT I get the second (and third, etc.) callback ONLY if the first finger moves while I touch with the second one.
For a piano app this is a problem since I need to be able to touch really quickly.
Does anybody know a workaround for this? Is there an alternative to using the touchesBegan/Moved/Ended methods?
Did you enable multitouch (setMultipleTouchEnabled:YES) for that UIView?
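If not, that's very likely the problem: with multipleTouchEnabled set to NO, a view only receives the first touch of a multi-touch sequence. A minimal sketch of a multitouch key view (startNoteAtPoint: and stopNoteAtPoint: are hypothetical helpers that trigger your piano keys):

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        // Without this, a second finger never generates its own touchesBegan.
        self.multipleTouchEnabled = YES;
    }
    return self;
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self startNoteAtPoint:[touch locationInView:self]];
    }
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self stopNoteAtPoint:[touch locationInView:self]];
    }
}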