I had some code that used touchesBegan:withEvent:, touchesMoved:withEvent: and touchesEnded:withEvent: to handle some of my on-screen manipulations. Since everything happened on a separate screen, I simply overrode those methods in the view controller. Now I would like to port this functionality to a separate UIView. Since I can't override those methods there, I tried the UIGestureRecognizer family of classes, but I couldn't find a combination that fires exactly when I need it (on the press - touchesBegan, while moving - touchesMoved, and when the interaction stops - touchesEnded). Reading through SO I found one option - create a custom view and override the needed methods - but I would like to avoid that, since the described functionality shouldn't always be present.
Is there a way, using UIGestureRecognizer, to trigger three functions at the start, the move, and the end of a touch, or is creating a custom UIView the only way to achieve what I want?
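For context, a sketch of the kind of handling I'm trying to port out of the view controller (the manipulation itself is omitted; the comments just mark what happens in each phase):

    // In the view controller (the version I'd like to move to a plain UIView):
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint point = [[touches anyObject] locationInView:self.view];
        // start the manipulation at `point`
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint point = [[touches anyObject] locationInView:self.view];
        // continue the manipulation as the finger moves to `point`
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        // finish the manipulation
    }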
Related
When I demo my touch apps to remote teams, the people on the other end don't know where I am touching. To remedy this, I have been working on an event-intercepting view/window that can display touches over applications. No matter how many variations on nextResponder I call, I am unable to react to the touch and pass it along to the controllers underneath. Specifically, scroll views don't react, nor do buttons.
Is there a way to take an event, get its position, and then pass it along to whatever component would have responded to it initially (the controller underneath)?
Update:
I am making some progress with a UIView. The new view always returns NO from pointInside:withEvent:. This works great for when the touch starts, but it doesn't track moves or releases. Is there a strategy for adding gesture recognizers to the touch in order to track its event lifecycle?
Joe
You could try creating your own subclass of UIApplication that overrides sendEvent:. Your implementation should call [super sendEvent:event] as well as process the event as needed.
Update your main.m and pass the name of your custom UIApplication class as the third parameter to the call to UIApplicationMain.
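A minimal sketch of that setup (the class names TouchDisplayApplication and AppDelegate are placeholders, not something the answer prescribes):

    // TouchDisplayApplication.m
    #import <UIKit/UIKit.h>

    @interface TouchDisplayApplication : UIApplication
    @end

    @implementation TouchDisplayApplication

    - (void)sendEvent:(UIEvent *)event {
        if (event.type == UIEventTypeTouches) {
            // Inspect event.allTouches here and update the touch-display overlay.
        }
        [super sendEvent:event];  // keep normal event delivery intact
    }

    @end

    // main.m
    int main(int argc, char *argv[]) {
        @autoreleasepool {
            return UIApplicationMain(argc, argv,
                                     NSStringFromClass([TouchDisplayApplication class]),
                                     NSStringFromClass([AppDelegate class]));
        }
    }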
After some more due diligence, I found my oversight. In the layer that was on top and displaying the touches, user interaction needed to be set to false. Once I set that to false, I was able to use that layer for display while catching events on the layers below. The project still isn't done but I am one step closer.
Take care,
Joe
I have a simple lunar lander game.
I compute positions and everything by integration - e.g. each turn I take the vectors, combine them, and then apply the resulting vector to my lander.
Here comes the question, I have a button that I want to use for thrust.
How do I check whether it is pressed during the update method? I guess I will have some BOOL flag that gets set to YES when the button is pressed, but when do I set it to NO?
Some practical implementation would be great.
I use cocos2d-iphone and iOS.
Well, the pseudocode goes as follows:
We shall not use buttons (i.e. CCMenuItem), since they provide callbacks only on touch-up events; we want touch down, touch enter/exit, and touch ended.
In your CCScene that you are displaying, either add a new child that is a subclass of CCLayer or even use one of the CCLayers already present in the CCScene.
In the init of your CCLayer subclass, set isTouchEnabled to YES.
Implement the usual methods:
- (void)ccTouchesBegan:...
- (void)ccTouchesMoved:...
- (void)ccTouchesEnded:...
- (void)ccTouchesCancelled:...
Finally, do your magic in these methods.
Get the touch location
Check using CGRectContainsPoint whether the touch is within the thrust area.
and so on, and so forth...
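A sketch of what such a layer could look like (thrustRect, thrustOn, the helper and the scheduling are my own illustrative choices; the selector names follow the older cocos2d-iphone API the answer refers to):

    #import "cocos2d.h"

    @interface ControlsLayer : CCLayer {
        CGRect thrustRect;   // screen area that acts as the thrust "button"
        BOOL   thrustOn;     // read by the update loop every frame
    }
    @end

    @implementation ControlsLayer

    - (id)init {
        if ((self = [super init])) {
            self.isTouchEnabled = YES;
            thrustRect = CGRectMake(0, 0, 100, 100);  // placeholder thrust area
            [self scheduleUpdate];                    // schedules -update: each frame
        }
        return self;
    }

    // Convert a UITouch into cocos2d (GL) coordinates.
    - (CGPoint)glLocationOfTouch:(UITouch *)touch {
        return [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
    }

    - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        if (CGRectContainsPoint(thrustRect, [self glLocationOfTouch:[touches anyObject]])) {
            thrustOn = YES;
        }
    }

    - (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        // Keep thrusting only while the finger stays inside the thrust area.
        thrustOn = CGRectContainsPoint(thrustRect, [self glLocationOfTouch:[touches anyObject]]);
    }

    - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        thrustOn = NO;  // finger lifted
    }

    - (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
        thrustOn = NO;
    }

    - (void)update:(ccTime)delta {
        if (thrustOn) {
            // add the thrust vector to the lander's velocity here
        }
    }

    @end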
I know what I want to do, but I'm stumped as to how to do it: I want to implement something like the iOS multitasking gestures. That is, I want to "steal" touches from any view inside my view hierarchy if the number of touches is greater than, say, two. Of course, the gestures are not meant to control multitasking, it's just the transparent touch-stealing I'm after.
Since this is a fairly complex app (which makes extensive use of view controller containment), I want this to be transparent to the views it happens to (i.e. I want to be able to display arbitrary views and hierarchies, including UIScrollViews, MKMapViews, UIWebViews, etc., without having to change their implementation to play nice with my gestures).
Just adding a gestureRecognizer to the common superview doesn't work, as subviews that are interaction enabled eat all the touches that fall on them.
Adding a visually transparent, UI-enabled view as a sibling (but in front) of the main view hierarchy also doesn't work, since that view now eats all the touches. I've experimented with reimplementing touchesBegan: etc. in the touchView, but forwarding the touches to nextResponder doesn't work, because that is the common superview, which in effect funnels the touches right around the views that are supposed to receive them when the touchView gives them up.
I am sure I'm not the only one looking for a solution for this, and I'm sure there are smarter people than me that have this already figured out. I even suspect it might not actually be very hard, and just maybe my brain won't see the forest for the trees today. I'm thankful for any helpful answers anyway :)
I would suggest you try method swizzling, reimplementing touchesBegan:withEvent: on UIView. I think the best way is to store the number of touches in a static shared variable (so that each view can increment/decrement it). It's just a very simple idea, take it with a grain of salt.
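A very rough sketch of that idea (the category, selector and counter names are mine, not part of the answer; touchesEnded/cancelled would need the same treatment to decrement the counter):

    #import <UIKit/UIKit.h>
    #import <objc/runtime.h>

    static NSInteger gActiveTouchCount = 0;  // shared across all views

    @implementation UIView (TouchCounting)

    + (void)load {
        // Swap UIView's touchesBegan:withEvent: with our replacement.
        Method original = class_getInstanceMethod(self, @selector(touchesBegan:withEvent:));
        Method swizzled = class_getInstanceMethod(self, @selector(tc_touchesBegan:withEvent:));
        method_exchangeImplementations(original, swizzled);
    }

    - (void)tc_touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        gActiveTouchCount += touches.count;
        // After the exchange, this call invokes the original implementation.
        [self tc_touchesBegan:touches withEvent:event];
    }

    @end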
Hope this helps.
Ciao! :)
A possible, but potentially dangerous (if you aren't careful), approach is to subclass your application's UIWindow and redefine the sendEvent: method.
Since this method is called for every touch event the app receives, you can inspect the event and then decide to call [super sendEvent:] (if the touch is not filtered), not call it at all (if the touch is filtered), or defer the call if you are still recognizing the touch.
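Something along these lines (the class name and the "more than two fingers" condition are just placeholders for whatever filtering you need):

    #import <UIKit/UIKit.h>

    @interface InterceptingWindow : UIWindow
    @end

    @implementation InterceptingWindow

    - (void)sendEvent:(UIEvent *)event {
        if (event.type == UIEventTypeTouches && event.allTouches.count > 2) {
            // "Steal" the event: hand the touches to your own handling code
            // and do not forward them to the normal hit-test views.
            return;
        }
        [super sendEvent:event];  // everything else is delivered as usual
    }

    @end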
Another possibility is to play with the hitTest:withEvent: method, but this would require your stealing view to be placed appropriately in the view hierarchy, and I think it doesn't fit well when you have many view controllers. I believe the previous solution is more general-purpose.
Actually, adding a gesture recognizer on the common superview is the right way to do this. But it sounds like you may need to set either delaysTouchesBegan or cancelsTouchesInView (or both) to ensure that the gesture recognizer handles everything before letting it through to the child views.
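A minimal sketch of that setup (the three-touch threshold, the method and the action selector are illustrative, not from the answer):

    - (void)installStealingRecognizerOnView:(UIView *)commonSuperview {
        UIPanGestureRecognizer *stealer =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleStolenTouches:)];
        stealer.minimumNumberOfTouches = 3;   // only engage for three or more fingers
        stealer.cancelsTouchesInView = YES;   // cancel touches already sent to subviews
        stealer.delaysTouchesBegan = YES;     // hold touches back until the recognizer fails
        [commonSuperview addGestureRecognizer:stealer];
    }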
Imagine a view with, say, 4 subviews, next to each other but non-overlapping.
Let's call them view#1 ... view#4
All 5 views (the parent and its 4 subviews) are my own UIView subclasses (yes, I've read Event Handling as well as the iOS Event Guide, this SO question, and this one, not yet answered).
When the user touches one of them, UIKit hit-tests it and delivers subsequent events to that view: view#1.
Even when the finger goes outside view#1, over, say, view#3. Even though this "drag" is now over view#3, view#1 still receives touchesMoved, while view#3 receives nothing.
I want view#3 to start responding to the touches, maybe with a "touchesEntered" method of my own, together with possibly a "touchesExited" on view#1.
How would I go about this?
I can see two approaches:
1. side-step the problem and do all the touch handling in the parent view, or
2. whenever I detect a touchesMoved outside of view#1's bounds, transfer to the parent view, telling it to "redispatch". It's not very clear how such redispatching would work, though.
For solution #2, what confuses me is not the forwarding per se, but how to find the UIView I want to forward to. I can obviously loop through the parent's subviews until I find one whose bounds/frame contain the touch, but I am wondering if I am missing something that Apple has already provided and that I cannot relate to this problem.
Any idea?
I have done this, but I used CALayers instead of sub-UIViews. That way, there are no worries about the subviews catching/redispatching events to the parent UIView. You might not be able to do that, but it does simplify things. My solution tended to use CGRectContainsPoint() a lot.
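A sketch of how that can look inside the parent view's touch handling (sublayer setup omitted; this assumes the sublayers' frames are expressed in the parent layer's coordinate space, which matches the view's own coordinates):

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint p = [[touches anyObject] locationInView:self];
        for (CALayer *tile in self.layer.sublayers) {
            if (CGRectContainsPoint(tile.frame, p)) {
                // The finger is currently over this sublayer; react here.
            }
        }
    }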
You may want to read Event Handling again, as it comes pretty close to answering your question:
A touch object...is associated with its hit-test view for its lifetime, even if the touch represented by the object subsequently moves outside the view.
Given that, if you want to accomplish your goal of having different views react as the user's finger crosses over them, and you want to do it within the touch-handling mechanism provided by UIView, you should go with your first approach: have the parent view handle the touch. While tracking a touch, the parent can use -hitTest:withEvent: or -pointInside:withEvent: to determine whether the touch is in one of the subviews, and if so, send an appropriate message.
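A sketch of that first approach, assuming the parent UIView subclass is the hit-test view (for example because the subviews have userInteractionEnabled set to NO), keeps a _currentChild ivar, and the subviews adopt a hypothetical protocol carrying the custom "entered"/"exited" messages:

    @protocol TouchCrossing <NSObject>   // hypothetical, not UIKit API
    - (void)touchesEntered:(UITouch *)touch;
    - (void)touchesExited:(UITouch *)touch;
    @end

    // In the parent view subclass (with a UIView<TouchCrossing> *_currentChild ivar):
    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];

        // Find the subview currently under the finger.
        UIView<TouchCrossing> *hit = nil;
        for (UIView<TouchCrossing> *child in self.subviews) {
            if ([child pointInside:[touch locationInView:child] withEvent:event]) {
                hit = child;
                break;
            }
        }

        // Notify the old and new subviews when the finger crosses a boundary.
        if (hit != _currentChild) {
            [_currentChild touchesExited:touch];
            [hit touchesEntered:touch];
            _currentChild = hit;
        }
    }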
I'm trying to handle touch events with touchesBegan in an overlay to a parent UIView, but also allow the touch input to pass through to the sibling UIViews underneath. I expected there would be some straightforward way to process a touch event and then say "now send it to the next responder as if this one didn't exist", but all I can find is the nextResponder method, which appears to give back the parent of my overlay view. That parent then doesn't really pass the event on to the next sibling of the overlay view, so I'm stuck, uncertain how to do what seems like a simple task that is usually accomplished with a touch callback returning True or False to tell the framework whether to keep processing down the widget hierarchy.
Am I missing something obvious?
Late answer, but I think you would be better off overriding hitTest:withEvent: instead of touchesBegan. It seems to me that touchesBegan is a fairly "high-level" method that is there just to do one simple thing, so at that level you cannot alter whether the event is propagated further. The right place to do that is hitTest:withEvent:.
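A sketch of what that can look like in the overlay (OverlayView and interactiveRegion are illustrative names): the overlay only claims touches it actually wants and returns nil for everything else, so hit-testing falls through to the sibling views underneath.

    @interface OverlayView : UIView
    @property (nonatomic) CGRect interactiveRegion;  // the area the overlay owns
    @end

    @implementation OverlayView

    - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
        if (CGRectContainsPoint(self.interactiveRegion, point)) {
            return [super hitTest:point withEvent:event];  // overlay takes this touch
        }
        return nil;  // decline: the superview keeps hit-testing the views below
    }

    @end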
Also have a look at this S.O. answer for more details about this point.
I understand the desired behavior you're looking for, Joey - I haven't found anything in the API that supports this automatic messaging-up-the-chain behavior with sibling views.
What I originally wrote below addresses just informing a parent UIView about a touch. That still applies, but I believe you need to take it a step further: have the parent UIView use the hit-testing technique that Sergio described on each of its subviews that are siblings of the overlay, and have the parent UIView manually invoke a "do something" method on each subview that passes the hit test. Each of those sibling views can return a BOOL value indicating whether to stop informing other siblings or continue down the chain.
If you find yourself using this pattern a lot, consider adding a category method on UIView that encapsulates the hit testing and asking views to perform a selector.
My Original Answer
With a little bit of manual work, you can wire this together yourself. I've had to do this, and it worked for me, because I had an oft-repeated use case (an overlay view on a button), where it made sense to create some custom classes. If your situation is similar, one of these techniques will suffice.
Option 1:
If the overlay doesn't need to do anything but look pretty, have it opt out of touch handling completely with userInteractionEnabled = NO. This way, the touch event goes to its parent UIView (the one it is an overlay to).
Option 2:
Have the overlay absorb the touch event (as it would by default), and then invoke a method on the parent UIView indicating that a touch or a certain gesture was recognized, and here's what it is. This way, the UIView behind the overlay still gets to act on the touch recognition, even though someone else intercepted it.
Option 2 is more of a fit for simple UIControlEvent types, like UIControlEventTouchDown and UIControlEventTouchUpInside. In my case (a custom UIButton subclass with a custom overlay view on top of it), I wire the touch-down and touch-up events on the button to two separate methods. These fire if a touch-down or touch-up-inside event occurs on the button itself, but they are also hooks I can invoke from the overlay view if I need to simulate that a button press occurred.
Depending on your needs, you could have a known protocol between the overlay and its parent UIView, or just have the overlay test the UIView informally, with a respondsToSelector: check before invoking performSelector: on it with the custom method you want called - the one that would have fired automatically if the UIView weren't covered by an overlay.
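A sketch of that informal variant from inside the overlay's touch handling (overlayDidReceiveTouchDown: is a hypothetical selector the covered view could implement, not UIKit API):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        [super touchesBegan:touches withEvent:event];  // overlay still absorbs the touch

        UIView *covered = self.superview;  // the parent UIView this view overlays
        SEL hook = @selector(overlayDidReceiveTouchDown:);
        if ([covered respondsToSelector:hook]) {
            // Informal forwarding: let the covered view act as if it got the touch.
            [covered performSelector:hook withObject:[touches anyObject]];
        }
    }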