Send UIEvent from iOS device UIView to external screen UIView - ios

I'm working on an app using AirPlay and I need a way to have touches on the main screen act as touches on the UI on my external screen in order to be compatible with a large number of previously existing custom UI elements. Rebuilding the UI elements would be orders of magnitude more difficult than finding a way of translating the touches from one view to another.
The external screen will feature a sort of mouse pointer to represent the interaction, as the user will need a point of reference on the screen for their actions. This may create User Interface guidelines hurdles, but I'll cross that bridge when I get to it. Hopefully I can find a way to make this item sufficiently non-mouse-like. The user will interact with the device as a sort of trackpad.
I'm using a window on the device screen with "sendEvent" subclassed to catch the touch events. I'm attempting to manually walk the view hierarchy for the views that need to receive input. Finding the view I want to talk to is not difficult. For UIControl-based classes I can call "sendActionsForControlEvents" to send the appropriate messages for the control. This may need some massaging, but for now that's not the main issue.
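Roughly what I have so far, as a sketch (TrackpadWindow, externalRootView and pointerLocation are placeholder names of my own, not real APIs):

```swift
import UIKit

class TrackpadWindow: UIWindow {
    /// Root view of the hierarchy shown on the external (AirPlay) screen; assumed to be set elsewhere.
    weak var externalRootView: UIView?
    /// Current pointer position on the external screen, in externalRootView's coordinate space.
    var pointerLocation: CGPoint = .zero

    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)   // keep normal delivery on the device screen
        guard event.type == .touches,
              let touch = event.allTouches?.first,
              let root = externalRootView else { return }

        // Treat the device as a trackpad: move the virtual pointer by the touch's delta.
        if touch.phase == .moved {
            pointerLocation.x += touch.location(in: self).x - touch.previousLocation(in: self).x
            pointerLocation.y += touch.location(in: self).y - touch.previousLocation(in: self).y
        }

        // Finding the view under the pointer is the easy part...
        guard let target = root.hitTest(pointerLocation, with: nil) else { return }

        if let control = target as? UIControl, touch.phase == .ended {
            // ...and for UIControl-based classes I can fire the appropriate actions.
            control.sendActions(for: .touchUpInside)
        }
        // But for plain views I would need to call touchesBegan/Moved/Ended on `target`,
        // and that requires a UIEvent I have no way to construct.
    }
}
```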
For the UI events for touchesBegan, touchesMoved, etc., I don't have a decent way of faking the UIEvent information. I can't call these functions unless I have a UIEvent, and I don't seem to have any way of creating a UIEvent object. The UIEvent from sendEvent does not have a position that matches the pointer position on the secondary screen (at the very least), so simply passing it on will not give me what I want.
Is there any legitimate way of synthesizing this information?

Related

Order of UIGestureRecognizer touchesBegan(_:with:) and point(inside:with:)

The Question
When the user taps on a view, which of these two functions is called first: touchesBegan(_:with:) or point(inside:with:)?
The Context
I want to subclass a PKCanvasView (which inherits from UIScrollView) to allow interaction to pass through the view (i.e. allow interaction with the view below and disable interaction with the PKCanvasView) when the point of the touch is outside the UIBezierPath of any stroke on the canvas.
This is easy enough by overriding point(inside:with:). My issue lies in the fact that I only want to allow interaction with the view below if the touch's UITouch.TouchType is not an Apple Pencil touch (.pencil), so that the user can draw with the Apple Pencil and interact with the view below using their finger.
The only way I think I can get this information is by also overriding touchesBegan(_:with:). Here I can access the event and its touch type. I would then somehow pass this along to be read inside point(inside:with:).
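Roughly what I have in mind (the stroke hit-test here is simplified to a renderBounds check and ignores the canvas's content offset):

```swift
import PencilKit
import UIKit

class PassThroughCanvasView: PKCanvasView {
    private var lastTouchType: UITouch.TouchType?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Is this recorded before point(inside:with:) is consulted?
        lastTouchType = touches.first?.type
        super.touchesBegan(touches, with: event)
    }

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Pencil touches always belong to the canvas so the user can keep drawing.
        if lastTouchType == .pencil { return true }
        // Finger touches only belong to the canvas when they land on an existing stroke;
        // otherwise return false so the interaction falls through to the view below.
        return drawing.strokes.contains { $0.renderBounds.contains(point) }
    }
}
```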
However, that all relies on extracting the UITouch.TouchType information before I check if the touch point is overlapping with any PKStroke paths.
So: Is touchesBegan(_:with:) called before point(inside:with:)?

3dtouch to present(peek without pop) UIView like contacts app

I'm trying to implement the 3D Touch feature that presents a summary of information (like Peek), but I don't want it to pop. I just want to preview the information like the Contacts app does with contacts:
It only presents a UIView and doesn't deal with the two levels of force (peek and pop).
How can I do something like this?
Ps.: I don't want to deal with long press gesture.
Introduction
Hello
I know this is probably a bit too late, but in case someone else stumbles upon it: I certainly believe it is possible and I don't think it's a "native behavior for contacts", although it would not be as simple as the UIKit API for peek/pop views. You would need to:
Steps
Subclass UIGestureRecognizer (it may perhaps also work with UITapGestureRecognizer), register UITouches, and use their force property (see the sketch after these steps).
Set up a UIViewController with a transparent but blurred background around the edges (together with a modalPresentationStyle of .overCurrentContext, if I recall correctly), with your desired content in the middle (much like the peek view). Then add a UIPanGestureRecognizer to the center view for dismissal/sliding up the buttons.
Then create a custom animation transition for that UIViewController, triggered once the force property of the registered UITouches from the subclassed UIGestureRecognizer gets high enough, and reversed once the force drops back low enough.
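A minimal sketch of the first step, assuming a simple force threshold (the class name and threshold value are illustrative):

```swift
import UIKit
import UIKit.UIGestureRecognizerSubclass   // required to set `state` from a subclass

final class ForcePressGestureRecognizer: UIGestureRecognizer {
    /// Fraction of maximumPossibleForce, updated continuously so the target can
    /// drive (and reverse) the presentation transition from it.
    private(set) var forceFraction: CGFloat = 0
    /// How hard the press must be before the gesture begins.
    var beginThreshold: CGFloat = 0.5

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let touch = touches.first, touch.maximumPossibleForce > 0 else { return }
        forceFraction = touch.force / touch.maximumPossibleForce

        switch state {
        case .possible where forceFraction >= beginThreshold:
            state = .began      // strong enough: start the transition
        case .began, .changed:
            state = .changed    // keep reporting so the transition can follow the force
        default:
            break
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        state = (state == .possible) ? .failed : .ended
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .cancelled
    }
}
```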
Concluding notes
I believe this is a bit of a tedious task and there might be a simpler way, for example using a third-party library for long-press gestures (one that registers the size of the touch), but it would not give the same feel.

Presentation overlay for iOS

When I demo my touch apps to remote teams, the people on the other end don't know where I am touching. To remedy this, I have been working on an event-intercepting view/window that can display touches over applications. No matter how many variations on nextResponder I call, I am unable to react to the touch and pass it along to the controllers underneath. Specifically, scroll views don't react, nor do buttons.
Is there a way to take an event, get its position, then pass it along to what ever component would have been responding to it initially (the controller underneath)?
Update:
I am making some progress with a UIView. The new view is always returning NO to pointInside. This works great for when the touch starts, but it doesn't track moves or releases. Is there a strategy for adding gesture recognizers to the touch in order to track its event lifecycle?
Joe
You could try creating your own subclass of UIApplication that overrides sendEvent:. Your implementation should call [super sendEvent:event] as well as process the event as needed.
Update your main.m and pass the name of your custom UIApplication class as the third parameter to the call to UIApplicationMain.
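A sketch of those two steps, assuming a Swift project (so a main.swift rather than main.m; TouchDisplayApplication is just an illustrative name):

```swift
import UIKit

class TouchDisplayApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches {
                // Display or log the touch here before normal delivery continues.
                print("touch at \(touch.location(in: nil))")
            }
        }
        super.sendEvent(event)   // process the event as usual
    }
}

// main.swift -- remove @main/@UIApplicationMain from the app delegate and pass the
// custom application class as the third parameter:
// UIApplicationMain(CommandLine.argc, CommandLine.unsafeArgv,
//                   NSStringFromClass(TouchDisplayApplication.self),
//                   NSStringFromClass(AppDelegate.self))
```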
After some more due diligence, I found my oversight. In the layer that was on top and displaying the touches, user interaction needed to be set to false. Once I set that to false, I was able to use that layer for display while catching events on the layers below. The project still isn't done but I am one step closer.
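A rough sketch of that change (TouchIndicatorView and showTouches(at:) are placeholder names; how the markers get drawn is up to you):

```swift
import UIKit

final class TouchIndicatorView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear
        isUserInteractionEnabled = false   // display only; views underneath keep receiving the events
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    /// Fed the current touch locations (e.g. from the sendEvent: override above).
    func showTouches(at points: [CGPoint]) {
        // Move small circular marker layers to each point, fade them out on release, etc.
    }
}
```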
Take care,
Joe

UITouch & UIEvents: fighting the framework?

Imagine a view with, say, 4 subviews, next to each other but non overlapping.
Let's call them view#1 ... view#4
All five views (the parent and its four subviews) are my own UIView subclasses (yes, I've read Event Handling as well as the iOS Event Guide, and this SO question and this one, not answered yet).
When the user touches one of them, UIKit hit-tests it and delivers subsequent events to that view: view#1.
Even when the finger goes outside view#1, over say view#3.
Even if this "drag" is now over view#3, view#1 still receives touchesMoved, but view#3 receives nothing.
I want view#3 to start replying to the touches. Maybe with a "touchesEntered" of my own, together with possibly a "touchesExited" on view#1.
How would I go about this?
I can see two approaches.
1. side-step the problem and do all the touch handling in the parent view whenever I detect a touchesMoved outside of view#1's bounds, or
2. transfer to the parent view, telling it to "redispatch". Not very clear how such redispatching would work, though.
For solution #2, what I'm getting confused about is not the forwarding per se, but how to find the UIView I want to forward to. I can obviously loop through the parent's subviews until I find one whose bounds/frame contain the touch, but I am wondering if I am missing something that Apple has already provided but that I cannot relate to this problem.
Any idea?
I have done this, but I used CALayers instead of sub-UIViews. That way, there are no worries about the subviews catching/redispatching events to the parent UIView. You might not be able to do that, but it does simplify things. My solution tended to use CGRectContainsPoint() a lot.
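For illustration, a rough sketch of that setup (the layer names and layout are assumptions):

```swift
import UIKit

class LayerTrackingView: UIView {
    /// Stand-ins for what would otherwise be subviews; assumed to be added as sublayers
    /// and laid out elsewhere.
    let tileLayers: [CALayer] = (0..<4).map { _ in CALayer() }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        // Layers never compete for touches, so there is nothing to redispatch:
        // just ask each layer's frame whether it contains the point
        // (frame.contains(_:) is the Swift spelling of CGRectContainsPoint).
        for tile in tileLayers where tile.frame.contains(point) {
            tile.borderWidth = 2   // react for whichever tile the finger is currently over
        }
    }
}
```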
You may want to read Event Handling again, as it comes pretty close to answering your question:
A touch object...is associated with its hit-test view for its lifetime, even if the touch represented by the object subsequently moves outside the view.
Given that, if you want to accomplish your goal of having different views react to the user's finger crossing over them, and if you want to do it within the touch-handling mechanism provided by UIView, you should go with your first approach: have the parent view handle the touch. The parent can use -hitTest:withEvent: or -pointInside:withEvent: as it's tracking a touch to determine if the touch is in one of the subviews, and if so can send an appropriate message.
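For example, a sketch of that first approach. The TouchCrossingTracking protocol and its touchEntered/touchExited methods are made-up names for the "touchesEntered"/"touchesExited" idea, and it assumes the parent actually receives the touches (e.g. because the subviews have isUserInteractionEnabled set to false):

```swift
import UIKit

protocol TouchCrossingTracking: AnyObject {
    func touchEntered(at point: CGPoint)
    func touchExited()
}

class TrackingContainerView: UIView {
    private weak var currentTarget: (UIView & TouchCrossingTracking)?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Ask each subview's point(inside:with:) to find the one under the finger right now.
        let target = subviews
            .compactMap { $0 as? (UIView & TouchCrossingTracking) }
            .first { $0.point(inside: touch.location(in: $0), with: event) }

        if target !== currentTarget {
            currentTarget?.touchExited()   // "touchesExited" on the view we just left
            if let target = target {
                target.touchEntered(at: touch.location(in: target))   // "touchesEntered" on the new one
            }
            currentTarget = target
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        currentTarget?.touchExited()
        currentTarget = nil
    }
}
```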

How do I implement multitouch on iOS

I'd like to implement multitouch, and I was hoping to get some sanity checks from the brilliant folks here. :)
From what I can tell, my strategy to detect and track multitouch is going to be to use the touchesBegan, touchesMoved, and touchesEnded methods and use the allTouches method of the event parameter to get visibility on all relevant touches at any particular time.
I was thinking I'd essentially use previousLocationInView as a way of linking the touches that come in with new events to the currently active touches. For example, if there is a touchesBegan for a touch at x,y = 10,14, then I can use the previous location of a touch in the next message to know which finger that new touch is tied to, as a way of tracking one finger's continuous motion, etc. Does this make sense? If it does, is there a better way to do it? I cannot hold onto UITouch or UIEvent pointers as a way of identifying touches with previous touches, so I cannot go that route. All I can think to do is tie them together via their previousLocationInView value (and to know which touches are 'new').
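Roughly what I have in mind (the finger-id bookkeeping here is just a placeholder of mine):

```swift
import UIKit

class MultitouchView: UIView {
    /// Each active finger is remembered by the last point it was seen at.
    private var activeFingers: [(lastPoint: CGPoint, id: UUID)] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            // A touchesBegan touch is a brand-new finger.
            activeFingers.append((touch.location(in: self), UUID()))
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let previous = touch.previousLocation(in: self)
            // Link this event to the finger whose last known point matches previousLocation.
            if let index = activeFingers.firstIndex(where: { $0.lastPoint == previous }) {
                activeFingers[index].lastPoint = touch.location(in: self)
            }
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let previous = touch.previousLocation(in: self)
            let current = touch.location(in: self)
            activeFingers.removeAll { $0.lastPoint == previous || $0.lastPoint == current }
        }
    }
}
```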
You might want to take a look at gesture recognizers. From Apple's docs,
You could implement the touch-event handling code to recognize and handle these gestures, but that code would be complex, possibly buggy, and take some time to write. Alternatively, you could simplify the interpretation and handling of common gestures by using one of the gesture recognizer classes introduced in iOS 3.2. To use a gesture recognizer, you instantiate it, attach it to the view receiving touches, configure it, and assign it an action selector and a target object. When the gesture recognizer recognizes its gesture, it sends an action message to the target, allowing the target to respond to the gesture.
See the article on Gesture Recognizers and specifically the section titled "Creating Custom Gesture Recognizers." You will need an Apple Developer Center account to access this.
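For example, the basic workflow described in that passage, using one of the built-in recognizers:

```swift
import UIKit

class PinchableViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Instantiate the recognizer, attach it to the view receiving touches,
        // and give it a target and action.
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc private func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
        // The multitouch bookkeeping (which finger is which) is handled for you;
        // the recognizer just reports the resulting scale.
        view.transform = CGAffineTransform(scaleX: recognizer.scale, y: recognizer.scale)
    }
}
```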
