So I'm writing an app in which I have a custom UIView which contains a UIScrollView which contains a series of UIImageViews. I want to make it so that when I touch one of the UIImageViews within the UIScrollView, an event happens (for now, let's just say I print an NSLog() or something).
I know that there exists this function:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
But I'm not entirely sure how to use it. Which UIView should implement this function? Where would I call it exactly? Or does it get called automatically when something is touched? How do I find the correct UIImageView?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event is already implemented by UIView (actually by its UIResponder superclass, where the default implementation does nothing). The method is there for you to override in subclasses to do whatever you want.
The method is indeed called automatically whenever touches begin within the view's bounds.
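For instance, a minimal sketch of overriding it in a UIImageView subclass (the class name here is made up; note that UIImageView disables user interaction by default, so it has to be switched on):

@interface TappableImageView : UIImageView
@end

@implementation TappableImageView

- (instancetype)initWithImage:(UIImage *)image {
    self = [super initWithImage:image];
    if (self) {
        // UIImageView has userInteractionEnabled set to NO by default,
        // so touches would never reach this view without this line.
        self.userInteractionEnabled = YES;
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Image view touched: %@", self);
    [super touchesBegan:touches withEvent:event];
}

@end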
In your case, using a UIButton with an image may be better.
Instead of using touchesBegan:, subclass UIImageView, add a tap gesture recognizer to this new class, and use it in place of the UIImageViews you are currently putting in the scroll view.
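A rough sketch of that approach (TapImageView and handleTap: are just illustrative names):

@interface TapImageView : UIImageView
@end

@implementation TapImageView

- (instancetype)initWithImage:(UIImage *)image {
    self = [super initWithImage:image];
    if (self) {
        self.userInteractionEnabled = YES; // image views ignore touches by default
        UITapGestureRecognizer *tap =
            [[UITapGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleTap:)];
        [self addGestureRecognizer:tap];
    }
    return self;
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Tapped image view: %@", self);
}

@end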
Related
I am new to iOS development. I have a custom-drawn view which is composed of multiple subviews covering the target area on screen. Specifically, this is a chess-like board game where I use a view for each square. The squares are created as subviews on a UIView, and there is one UIViewController for all of this. From what I've read, I have to implement touchesBegan, touchesEnded, etc. in my UIView to handle touches, but none of these methods are getting called. I added them to the base view and to all the subviews. So:
How do I simulate these touch events in the iOS Simulator? A mouse click does not trigger the touchesBegan/touchesEnded calls on any view.
Ideally I would like to handle these in the UIViewController because I want to run the touch through some logic. Is that possible? How do I achieve it?
Please refer to THIS.
It is a tutorial in the Apple sample code that describes how to handle touches very nicely.
Just run the sample code and go through the description; you will get a clear idea of how touches work in iOS.
It turns out that when I add the view programmatically rather than through the storyboard, the userInteractionEnabled property is set to NO by default. After setting it to YES, the touch methods (touchesBegan and so on) get called on the view.
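For anyone hitting the same thing, a minimal sketch (SquareView, squareFrame and boardView are hypothetical names from my setup):

// Make sure user interaction is enabled on views created in code;
// some classes (e.g. UIImageView, UILabel) have it disabled by default.
SquareView *square = [[SquareView alloc] initWithFrame:squareFrame];
square.userInteractionEnabled = YES;
[boardView addSubview:square];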
Check this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (![touch.view isKindOfClass:[yourView class]])
    {
        // The touch did not land on an instance of yourView; handle that case here.
    }
}
Hope this will help.
I am trying to set up a tutorial type class that presents an overlay view and requires actions from the user before continuing. I currently have a hierarchy set up as follows.
UIWindow
|---- UIViewController
| |---- UIViewA (View performing tutorial action on)
| |---- UIViewB
|
|---- UIViewT (tutorial overlay)
|---- CGRect (defined by UIViewA)
During the tutorial, views will get dragged around, new views will be created, etc., which is why I added the tutorial overlay view to the UIWindow. This way I don't have to mess with the view hierarchy within the view controller, as suggested in many places on SO. The purpose of the overlay view is to block all actions except for the required action expected by the tutorial.
Currently the tutorial overlay view is implemented as follows
@interface ZSOverlayView : UIView <UIGestureRecognizerDelegate>
@property (nonatomic, assign) CGRect activeRegion;
@end

@implementation ZSOverlayView
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return !CGRectContainsPoint(_activeRegion, point);
}
@end
Where activeRegion is the CGRect defined by UIViewA. This successfully blocks all unwanted events and gestures from making it through the overlay outside of the activeRegion; in this specific case, UIViewB does not get the events or gestures.
The problem is that for UIViewA I only want specific gestures to make it through, not all of them. For example, if UIViewA has a double tap, a pan, and a custom gesture, I may want only the double tap to be active at a given moment, or only the custom gesture, or perhaps both. The tutorial doesn't know what gestures the view has, so it needs a generic way of passing along the needed ones and blocking the ones that aren't needed. Currently none of the gestures are blocked. Even with flags in place (which I currently have) that determine which gestures should be able to make it through, I am still running into problems with how to block specific ones and let others through.
I'm unsure how to proceed because the tutorial overlay is not the delegate of any of the gesture recognizers, nor do I want it to be because by taking over as the delegate the tutorial might remove special conditions specified by the existing delegates.
Any ideas how to proceed to get the functionality I'm looking for?
I don't really like the solution, but the best answer was given by Lyndsey Scott in the comments.
If I'm understanding correctly, you could set the UIGestureRecognizerDelegates then just use conditionals in the delegate methods to specify what to do when the gesture view is the tutorial window vs when the gesture view is the main window.
I would have preferred not to rely on this method, since I was trying to have my tutorial library do all of the work, but since there hasn't been an answer in a while, I just wanted to throw it out there that this worked.
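For reference, a rough sketch of that idea, assuming the tutorial object takes over as the recognizers' delegate, and where tutorialActive and allowedGestureClasses are properties made up for this example:

- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (!self.tutorialActive) {
        return YES; // normal behaviour when no tutorial is running
    }
    // Only let the gesture(s) the tutorial is waiting for begin.
    for (Class allowedClass in self.allowedGestureClasses) {
        if ([gestureRecognizer isKindOfClass:allowedClass]) {
            return YES;
        }
    }
    return NO; // block everything else while the tutorial is active
}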
Have you tried to just use -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event ?
If you want to block touch events for something, you can just do it here, e.g.:
Prevent touch events for a view
Call a helper method to determine which view(s) can be touched
etc.
Edit: Better late than never. I actually ran into needing to do this (again...) and (yet again) found the answer I was referring to in the comments below =]
Anyway, using touchesBegan, you could just do this to obtain all the gesture recognizers that are receiving the type of touch (or touches) you are looking for:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // In this example, I'm only actually looking for (and caring about) a single touch.
    if ([touches count] == 1) {
        UITouch *touch = (UITouch *)[touches anyObject];
        // So here they are now
        NSArray *gestureRecognizersForTouch = [touch.gestureRecognizers copy];
        // ... whatever else.
    }
}
At this point, you can remove the recognizers, set a value on a property they all have access to in your object, post a notification, and so on.
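For example, to take the first of those options (allowedGestureClass is a hypothetical property on the overlay/tutorial object):

// Disable every recognizer attached to this touch except the kind
// the tutorial wants to let through.
for (UIGestureRecognizer *recognizer in gestureRecognizersForTouch) {
    if (![recognizer isKindOfClass:self.allowedGestureClass]) {
        recognizer.enabled = NO;
    }
}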
What I want to implement is column-matching functionality. I have three buttons in the right column and three in the left column, with some info on them. I want to draw a path from a button on the right side to any of the buttons on the left side by dragging a finger.
I would use UIBezierPath to draw the path, and for that I definitely need a start point and an end point.
Now the issue is how to trigger the touchesBegan, touchesMoved and touchesEnded methods by tapping on the buttons, so that I can get the start and end points.
Another option I am thinking about is to cover all the buttons with an invisible UIView and somehow check whether the touched point on this overlay view lies within any of the button frames.
BTW, I could replace these buttons with simple UIImageViews as well; I added buttons just for the sake of getting touch points.
Any help is appreciated.
Create a subclass of UIButton and add this...
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    [self.nextResponder touchesBegan:touches withEvent:event];
}
This will get you your touches within the button.
Uncheck the User Interaction Enabled checkbox on the UIButton. You will then get the touchesBegan call for touches on that button.
I have a UIView subclass where I handle touches using touchesBegan, touchesMoved, touchesEnded.
I noticed that when I start a touch inside and then drag outside the UIView, touchesEnded is not triggered. Is there a way to have it called when I'm dragging outside the UIView's frame?
Thanks
Use touchesCancelled instead.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Treat a cancelled touch the same way you treat an ended one here.
}
Possibly it is calling - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event instead.
Please check the UIResponder Class Reference:
touchesCancelled:withEvent:
Sent to the receiver when a system event (such as a low-memory
warning) cancels a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
Parameters
touches
A set of UITouch instances that represent the touches for the ending phase of the event represented by event.
event
An object representing the event to which the touches belong.
Discussion
This method is invoked when the Cocoa Touch framework receives a
system interruption requiring cancellation of the touch event; for
this, it generates a UITouch object with a phase of
UITouchPhaseCancel. The interruption is something that might cause the
application to be no longer active or the view to be removed from the
window.
When an object receives a touchesCancelled:withEvent: message it
should clean up any state information that was established in its
touchesBegan:withEvent: implementation.
The default implementation of this method does nothing. However
immediate UIKit subclasses of UIResponder, particularly UIView,
forward the message up the responder chain. To forward the message to
the next responder, send the message to super (the superclass
implementation); do not send the message directly to the next
responder. For example,
[super touchesCancelled:touches withEvent:event];
If you override this method without calling super (a common use
pattern), you must also override the other methods for handling touch
events, if only as stub (empty) implementations.
Availability
Available in iOS 2.0 and later.
You should still be getting touchesMoved in your view controller even when you are outside of your view's frame. I would add an array to your custom UIView that keeps the touch objects assigned to that view, and implement the following logic:
When you receive touchesBegan, add the touch object to the array of the UIView whose frame contains the touch coordinates (you may have to iterate over all your subviews and match the coordinates against each frame to find the right one).
When you receive touchesMoved, remove the touch from the array of the UIView at the previous location and add it to the array of the UIView at the current location, if any.
When you receive touchesEnded or touchesCancelled, remove the touches from all the UIViews' arrays.
Whenever you add or remove a touch object from a custom UIView's array, you may call your delegate methods (implement the observer pattern), because at that point you know whether it is really a pressed or unpressed event. Keep in mind that one array may hold many touch objects, so before calling your delegate check whether you just added the first object or removed the last one; those are the only cases in which you should call the delegate.
I implemented such a solution in my last game, but with virtual buttons instead of UIViews.
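A condensed sketch of that bookkeeping (ButtonView, buttonViews, buttonViewAtPoint:, touchesOnView and the pressed/released delegate calls are all hypothetical names):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        ButtonView *hit = [self buttonViewAtPoint:[touch locationInView:self]];
        [hit.touchesOnView addObject:touch];
        if (hit.touchesOnView.count == 1) {
            [hit.delegate buttonViewPressed:hit]; // first touch on this view
        }
    }
}

// touchesMoved: follows the same pattern: remove the touch from the view
// containing previousLocationInView: and add it to the view containing the
// current location, firing pressed/released only on first add / last remove.

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        for (ButtonView *view in self.buttonViews) {
            if ([view.touchesOnView containsObject:touch]) {
                [view.touchesOnView removeObject:touch];
                if (view.touchesOnView.count == 0) {
                    [view.delegate buttonViewReleased:view]; // last touch left
                }
            }
        }
    }
}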
Should "clickable" areas in iOS be buttons or is it ok to just use a generic UIView, UIImage and so on?
Say i have a block of text with an icon, borders, shadows and so on. It looks like a bilboard. What would be the best way to implement that? Using a custom UIButton and just add subviews to it or creating just a generic UIView?
Any thoughts appreciated!
You can simply add UIGestureRecognizers to your UIView and handle them. You can find the documentation here and a tutorial here.
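A minimal sketch (billboardView and handleBillboardTap: are names made up here):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Attach a tap recognizer to the "billboard" view and handle it in the view controller.
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleBillboardTap:)];
    [self.billboardView addGestureRecognizer:tap];
}

- (void)handleBillboardTap:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Billboard tapped");
}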
Probably, for a view containing multiple subviews, you want to use a UIView subclass. While a UIButton would be OK for adding subviews, its state changes and enabling/disabling may do wonky things to the view as a whole (including the subviews). Using your own UIView subclass will ensure that what gets displayed doesn't get toyed around with by any state changes, giving you complete control. You can override
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
to intercept touches on your custom view. If you're going to do this, remember that the userInteractionEnabled property MUST be set to YES.
An additional note: you mentioned shadows as one of the elements in your question. If you're using CALayer to do this, definitely avoid using UIButton, as its set of layers for handling different states is quite complex.
If the target area is big enough, you could place a transparent UIButton (switch the button type to custom, but don't supply an image) over the top of the clickable view to intercept the taps.
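A quick sketch of that (clickableView and clickableAreaTapped: are hypothetical names):

UIButton *tapCatcher = [UIButton buttonWithType:UIButtonTypeCustom];
tapCatcher.frame = clickableView.frame;            // cover the target area
tapCatcher.backgroundColor = [UIColor clearColor]; // no image, no background
[tapCatcher addTarget:self
               action:@selector(clickableAreaTapped:)
     forControlEvents:UIControlEventTouchUpInside];
[clickableView.superview addSubview:tapCatcher];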