I am new to iOS development. I have a custom-drawn view composed of multiple subviews covering the target area on screen. Specifically, this is a board game like chess where I use a view for each square. The squares are created as subviews of a UIView, and there is one UIViewController for all of this. From what I read, I have to implement touchesBegan, touchesEnded, etc. in my UIView to handle touches, but none of these methods are getting called, even though I added them to the base view and all the subviews. So:
How do I simulate these touch events in the iOS Simulator? A mouse click is not triggering touchesBegan or touchesEnded on any view.
Ideally I would like to handle these in the UIViewController because I want to run the touch through some logic. Is that possible? How do I achieve it?
Please refer to THIS.
It is a tutorial in the Apple sample code and it describes how to handle touches very nicely.
Just run the sample code and go through the description; you will get a clear idea of how touches work in iOS.
Turns out that when I add the view programmatically rather than through the storyboard, the userInteractionEnabled property is set to NO by default. After setting it to YES, touchesBegan and touchesEnded get called in the view.
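For anyone else hitting this, a minimal sketch of adding the squares programmatically (the 8x8 grid, the boardView parameter, and squareSize are assumptions for illustration, not from the question):

- (void)buildBoardInView:(UIView *)boardView squareSize:(CGFloat)squareSize
{
    for (NSInteger row = 0; row < 8; row++) {
        for (NSInteger col = 0; col < 8; col++) {
            CGRect frame = CGRectMake(col * squareSize, row * squareSize, squareSize, squareSize);
            UIView *square = [[UIView alloc] initWithFrame:frame];
            // Make sure the square can receive touches; some UIView subclasses
            // (UIImageView, UILabel) have userInteractionEnabled set to NO by default.
            square.userInteractionEnabled = YES;
            square.backgroundColor = ((row + col) % 2 == 0) ? [UIColor whiteColor] : [UIColor brownColor];
            [boardView addSubview:square];
        }
    }
}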
Check this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (![touch.view isKindOfClass:[yourView class]]) {
        // The touch landed outside one of your square views; handle that case here.
    }
}
Hope this will help.
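To address the view-controller part of the original question: touches that the squares don't consume travel up the responder chain to the view controller, so you can override touchesBegan: there and map the touch location to a square. A minimal sketch, assuming an 8x8 grid that fills the controller's root view (the grid math is an assumption, not from the question):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    // Convert the touch location into board coordinates.
    CGFloat squareSize = self.view.bounds.size.width / 8.0;
    NSInteger col = (NSInteger)(point.x / squareSize);
    NSInteger row = (NSInteger)(point.y / squareSize);

    if (row >= 0 && row < 8 && col >= 0 && col < 8) {
        NSLog(@"Touched square at row %ld, column %ld", (long)row, (long)col);
        // Run the touch through your game logic here.
    }
}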
Related
I am trying to set up a tutorial type class that presents an overlay view and requires actions from the user before continuing. I currently have a hierarchy set up as follows.
UIWindow
|---- UIViewController
| |---- UIViewA (View performing tutorial action on)
| |---- UIViewB
|
|---- UIViewT (tutorial overlay)
|---- CGRect (defined by UIViewA)
During the tutorial, views will get dragged around, new views will be created, etc., which is why I added the tutorial overlay view to the UIWindow. This way I don't have to mess with the view hierarchy within the view controller, as suggested in many places on SO. The purpose of the overlay is to block all actions except for the required action expected by the tutorial.
Currently the tutorial overlay view is implemented as follows
@interface ZSOverlayView : UIView <UIGestureRecognizerDelegate>
@property (nonatomic, assign) CGRect activeRegion;
@end

@implementation ZSOverlayView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return !CGRectContainsPoint(_activeRegion, point);
}

@end
Where activeRegion is the CGRect defined by UIViewA. This successfully blocks all unwanted events and gestures from making it through the overlay, outside of the activeRegion. In this specific case UIViewB does not get the event or gestures.
The problem is that for UIViewA I only want a single gesture to make it through, not all of them. For example, if UIViewA has a double tap, a pan, and a custom gesture, I may want only the double tap to be active, or only the custom gesture, or perhaps both. The tutorial doesn't know what gestures the view has, so it needs a generic way of passing along the needed ones and blocking the ones that aren't needed. Currently none of the gestures are blocked. Even with flags in place (which I currently have) that determine which gestures should be able to make it through, I am still running into problems with how to block specific ones and let others through.
I'm unsure how to proceed because the tutorial overlay is not the delegate of any of the gesture recognizers, nor do I want it to be because by taking over as the delegate the tutorial might remove special conditions specified by the existing delegates.
Any ideas how to proceed to get the functionality I'm looking for?
I don't really like the solution, but the best answer was given by Lyndsey Scott in the comments.
If I'm understanding correctly, you could set the UIGestureRecognizerDelegates then just use conditionals in the delegate methods to specify what to do when the gesture view is the tutorial window vs when the gesture view is the main window.
I would have preferred not to rely on this method since I was trying to have my tutorial library do all of the work but since there hasn't been an answer in a while, I just wanted to throw it out there that this worked.
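For completeness, a sketch of what that conditional delegate method could look like (tutorialActive and allowedGestureClass are assumed properties on whatever object already serves as the recognizers' delegate; they are not from the original code):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    // While the tutorial is running, only let the one gesture the tutorial expects through.
    if (self.tutorialActive) {
        return [gestureRecognizer isKindOfClass:self.allowedGestureClass];
    }
    // Outside the tutorial, keep the normal behavior.
    return YES;
}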
Have you tried to just use -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event ?
If you want to block touch events for something, you can do it here, for example:
Prevent touch events for a view
Call a helper method to determine which view(s) can be touched
etc.
Edit: Better late than never. I actually ran into needing to do this (again...) and (yet again) found the answer I was referring to in the comments below =]
Anyway, using touchesBegan:, you could do this to obtain all the gesture recognizers that are receiving the touch (or touches) you are looking for:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // In this example, I'm only actually looking for (and caring about) a single touch.
    if ([touches count] == 1) {
        UITouch *touch = (UITouch *)[touches anyObject];
        // Here are the gesture recognizers that are receiving this touch.
        NSArray *gestureRecognizersForTouch = [touch.gestureRecognizers copy];
        // ... whatever else.
    }
}
At this point, you can either remove the recognizers, set a value to a property they all have access to in your object, submit a notification, etc. etc. etc.
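For example, continuing from the snippet above, a sketch of disabling every recognizer on the touch except one allowed kind (UITapGestureRecognizer is just a stand-in for whatever you want to let through):

// Inside the if block above, after copying the recognizers:
for (UIGestureRecognizer *recognizer in gestureRecognizersForTouch) {
    recognizer.enabled = [recognizer isKindOfClass:[UITapGestureRecognizer class]];
}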
So I'm writing an app in which I have a custom UIView which contains a UIScrollView which contains a series of UIImageViews. I want to make it so that when I touch one of the UIImageViews within the UIScrollView, an event happens (for now, let's just say I print an NSLog() or something).
I know that there exists this function:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
But I'm not entirely sure how to use it. Which UIView should implement this function? Where would I call it exactly? Or does it get called automatically when something is touched? How do I find the correct UIImageView?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event is already implemented by UIView (actually by its UIResponder superclass, where it does nothing by default). This method is there for you to override in subclasses to do whatever you want to do.
This method is indeed called automatically whenever touches begin within the view's boundary.
In your case using a UIButton with an image may be better.
Instead of using touchesBegan:, subclass UIImageView, add a tap gesture recognizer to this new class, and use this class instead of the image views you currently have in the scroll view.
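Whether you add the recognizer inside a UIImageView subclass or from the view controller, the setup looks roughly like this (a sketch; the method names and handleTap: selector are assumptions):

- (void)addTappableImage:(UIImage *)image toScrollView:(UIScrollView *)scrollView
{
    UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
    // UIImageView has user interaction disabled by default, so enable it for gestures.
    imageView.userInteractionEnabled = YES;

    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(handleTap:)];
    [imageView addGestureRecognizer:tap];
    [scrollView addSubview:imageView];
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    NSLog(@"Tapped image view: %@", recognizer.view);
}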
I have an Xcode iOS project I'm working on where I have falling particles coming from the top of the screen down to the bottom. Based on the accelerometer, they can fall either fast or slow.
They are all UIImageViews that are falling, and when I tap them, they should just stop moving. This works fine if they are moving slowly enough and I tap just a little bit below them. The problem is that when I tap right on top of them while they are moving fast, I can never seem to hit them.
What's the solution to this? Do I need to make a bigger UIImageView with a smaller UIImage centered in it? Or can I use the UIGestureRecognizer to look for taps in a larger radius?
Try using UITapGestureRecognizer
(or)
Try to catch the touch event using touchesBegan: method.
Try writing the touchesBegan: method as follows:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Check whether the touched view is one of the falling image views.
    if ([touch.view isKindOfClass:[UIImageView class]])
    {
        // Execute the code you need to stop the UIImageView from moving.
    }
}
Hope this helps. All the best with your app :-)
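If you also want a larger effective tap radius without resizing the image views, one option (a sketch, not from the answer above; fallingImageViews and the 30-point padding are assumptions) is to hit-test the touch against an inset rect around each particle:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    for (UIImageView *particle in self.fallingImageViews) {
        // Expand each particle's frame by 30 points on every side before testing the tap.
        CGRect hitRect = CGRectInset(particle.frame, -30.0, -30.0);
        if (CGRectContainsPoint(hitRect, point)) {
            // Stop this particle from moving.
            break;
        }
    }
}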
Background: I need to look at every UITouch that happens in an application and see if the UITouch is NOT inside a certain UIImageView.
WildcardGestureRecognizer looked very promising and I tried it. (Code is here):
How to intercept touches events on a MKMapView or UIWebView objects?
It worked great for a sandbox/quickly-created application. However that application didn't reflect the complexity that the actual target project has. The target project has a Table View Controller and more.
After adding the WildcardGestureRecognizer to the more involved iPad application, I see that none of the other controls work once a tap has happened.
Here's some code where I was playing with the idea. Again, the sandbox code does not yet have controls on it (such as a Table View Controller or even a UIButton) to see if they work after adding the gesture recognizer to the UIWindow.
Should I pick something other than the UIWindow to add the gesture recognizer to or am I going to run into the same problem regardless? Is there a better way?
sandbox code: https://github.com/finneycanhelp/GestureKata
You might want to try another approach: create a UIWindow subclass, use that in your XIB and override hitTest:withEvent:. It returns the view that has been "selected" to receive the touch events.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    if (v == myImageView) {
        // Do something. Maybe return nil to prevent the touch from getting
        // sent to the image view.
    }
    return v;
}
Overriding this method can also be helpful when debugging.
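If the main window comes from a XIB, you can simply change its class to the subclass there; if you create the window in code, a sketch of the app-delegate setup (ZSHitTestWindow and MyRootViewController are assumed names):

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Use the hit-test-aware window subclass instead of a plain UIWindow.
    self.window = [[ZSHitTestWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[MyRootViewController alloc] init];
    [self.window makeKeyAndVisible];
    return YES;
}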
I upgraded my iPad to 5.0 a couple days ago, and upgraded Xcode to 4.2 at the same time so I could continue to test my apps. Now I am having problems with touch code in several apps that worked with previous versions of Xcode.
I subclassed UIImageView to add some dragging features by overriding -(void)touchesBegan: and -(void)touchesMoved:. I did not override -(void)touchesEnded: in the subclass, but handled that in the view controller for the view that contains the image view.
I pulled the subclassed UIImageView into a new project for testing, and have narrowed down the issue to the fact that the parent UIView (the template created by Xcode) does not seem to be forwarding touch events to the view controller (also created by Xcode).
If I add this to my subclass:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [self.nextResponder touchesEnded:touches withEvent:event];
}
and this to my parent view's view controller:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ViewController");
}
when I let go of the image I am dragging around the screen, I get the "touches ended event in ImageToDrag", but not the log from the view controller.
However, if I intentionally skip over the view by doing this in the subclassed view:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [[self.nextResponder nextResponder] touchesEnded:touches withEvent:event];
}
then I get both log entries.
The only explanation I can come up with is that for some reason, UIView is consuming the touchesEnded event and not passing it to the view controller.
I have verified that exclusiveTouch is set to NO, and userInteractionEnabled is set to YES on the parent view and the subclassed UIImageView.
I have also tested compiling for iOS 5.0 and iOS 4.2, and deploying the test app to both an iOS 5 iPad and iOS 4.3.1 iPad.
The only way I have been able to get the touch event to the viewController is by skipping over the view and using the double nextResponder in the subclass. Although that method functions, it seems like a hack to me and I'm sure it will come back to bite me later.
Has anybody else seen this behavior? Is there any way for me to find out what the UIView is doing with my touch events?
Thanks,
Dan
I've been trying to track down a similar issue for the last few hours. Finally managed to solve it with the help of this post:
Actually it looks like I just managed to solve it, using the hint from https://devforums.apple.com/message/519690#519690
Earlier, I just forwarded the touchesEnded event to self.nextResponder. When I added touchesBegan, touchesMoved, and touchesCancelled handlers with implementations similar to touchesEnded, the event seems to bubble up to the root view controller.
So I guess on iOS 5, views discard touchesEnded events when they have not seen the relevant touchesBegan.
I didn't need to add touchesMoved etc.; I just forwarded touchesBegan as well, and then touchesEnded started working again!
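In code, the fix described above looks roughly like this inside the UIImageView subclass (a sketch based on the class name from the question; the dragging code itself is omitted):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... existing dragging setup ...
    // Forward the began event so the superview/controller sees a complete touch sequence.
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [self.nextResponder touchesEnded:touches withEvent:event];
}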
Some touch handling did change in iOS 5.0, especially if you re-link your application against the 5.0 SDK.
There's a section in the documentation of UIView's touch handling methods that says this:
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
So if you do one, you need to do them all. I know UIKit started taking steps to make sure this was the case in 5.0.
So I'd start there - override all the methods on your view and see what happens.