I want to handle touches on my disabled button
self.closeButton.enabled = NO;
self.closeButtonDisabledRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(closeButtonDisablePressed)];
[self.closeButton addGestureRecognizer:self.closeButtonDisabledRecognizer];
But it seems like it doesn't work. Any good solution?
Disabling a button tells the SDK to ignore touches on it entirely, which is the opposite of what you want. I suggest instead:
@property (assign, nonatomic) BOOL treatTheCloseButtonAsEnabled;
// replace the synthesized setter
- (void)setTreatTheCloseButtonAsEnabled:(BOOL)enabled {
    _treatTheCloseButtonAsEnabled = enabled;
    self.closeButton.alpha = enabled ? 1.0 : 0.5;
    // or some other visible indication of the "disabled" state
}

- (IBAction)pressedCloseButton:(id)sender {
    if (self.treatTheCloseButtonAsEnabled) {
        // logic for a regular press
    } else {
        // logic for a disabled press
    }
}
Maybe you can also handle the touch event in the controller. It would look like this (I assume that your close button is added to controller.view):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint touchPoint = [[touches anyObject] locationInView:self.view];
    touchPoint = [self.view convertPoint:touchPoint toView:_closeButton.superview];
    // determine whether the user touched your button
    if (CGRectContainsPoint(_closeButton.frame, touchPoint) == NO) {
        return;
    }
    // handle your touch on the button here
}
Related
I have to implement functionality where a CCButton can be dragged to the proper position, wherever the user needs to place it.
I have created a custom class for this, but the issue is that the click method of the button is not called when the user clicks on the button.
touchyButton.h
#import "cocos2d.h"
#import "cocos2d-ui.h"
@interface touchyButton : CCButton { ... }
@property (nonatomic, assign) BOOL touchMoved;
@end
touchyButton.m
#import "touchyButton.h"
@implementation touchyButton
- (void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"touchMoved...");
    self.touchMoved = YES;
    self.anchorPoint = ccp(0.5, 0.5);
    CGPoint touchLoc = [touch locationInNode:self.parent];
    //CGPoint inTouchLoc = [self convertToNodeSpace:self.anchorPoint];
    //CGPoint touchP = ccpAdd(touchLoc, inTouchLoc);
    //self.position = [self.parent convertToNodeSpace: touchP];
    self.position = touchLoc;
}

- (void)touchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    self.touchMoved = NO;
    NSLog(@"touchEnded...");
}

@end
As the code shows, we are just trying to move the button anywhere on the screen when the user drags it.
Here is how the button is created in the main code where it needs to be displayed:
touchyButton *btnRight = [touchyButton buttonWithTitle:@"" spriteFrame:[[CCSprite spriteWithImageNamed:@"arrR.png"] spriteFrame]];
[btnRight setBackgroundOpacity:0.5f forState:CCControlStateNormal];
[btnRight setAnchorPoint:ccp(1, 0.5)];
[btnRight setPosition:ccp(viewS.width - 10.f, viewS.height/2)];
[self addChild:btnRight];
[btnRight setTarget:self selector:@selector(performRightJump:)];
Now, when the user clicks the button, the button goes into the selected state, but performRightJump never fires. Can anyone suggest an alternative way to implement the button with dragging behaviour while the target-action still works? Any hint would be appreciated as well.
One more thing: with the current code I can only move the button's anchor point to the new touch point. Any idea how I can move the button in a more natural fashion? The current method causes a problem on the first tap of a drag: the button's anchor point jumps to the tapped point.
This code is for cocos2d-x, but it may be useful for you.
Create two new classes, classA and classB, and add this code:
classA.h

{
    // here implement
    CC_SYNTHESIZE_RETAIN(classB*, _classB, ClassB);
}

classA.cpp

bool classA::init()
{
    setClassB(classB::create());
    this->addChild(getClassB(), 0);
    // here create the button
    return true;
}

bool classA::onTouchBegan(Touch *touch, Event *event)
{
    Vec2 touchLoc = touch->getLocation();
    _classB->setPosition(touchLoc);
    return true;
}

void classA::onTouchMoved(Touch *touch, Event *event)
{
}

void classA::onTouchEnded(Touch *touch, Event *event)
{
}

// the same approach can be used in cocos2d
I have a transparent UIScrollView on top of another view.
The scroll view has content - text and images - used to display info.
The view behind it has some images that the user should be able to tap on, and the content over them is scrollable using the mentioned scroll view.
I want to be able to use the scroll view normally (no zoom though), but when the scroll view is not actually scrolling, to let tap events through to the view behind it.
Using a combination of the touch and scroll events I can determine when to let the taps through, but the view behind it still does not receive them.
I have tried using something like this for all touch events:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesBegan %@", (_isScrolling ? @"YES" : @"NO"));
    if (!_isScrolling)
    {
        NSLog(@"sending");
        [self.viewBehind touchesBegan:touches withEvent:event];
    }
}
But it does not work.
Also, in my case I cannot really apply the hitTest and pointInside solutions, given my use case.
First off, UIScrollViews only inherently recognize UIPanGestureRecognizers and UIPinchGestureRecognizers, so you need to add a UITapGestureRecognizer to the UIScrollView so it can recognize tapping gestures as well:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
// To prevent the pan gesture of the UIScrollView from swallowing up the
// touch event
tap.cancelsTouchesInView = NO;
[scrollView addGestureRecognizer:tap];
Then once you receive that tap gesture and the handleTap: action is triggered, you can use locationInView: to detect whether the tap gesture's position is in fact within the frame of one of the images below your scroll view, for example:
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    // First get the tap gesture recognizer's location in the entire
    // view's window
    CGPoint tapPoint = [recognizer locationInView:self.view];
    // Then see if it falls within one of your below images' frames
    for (UIImageView *image in relevantImages) {
        // If the image's coordinate system isn't already equivalent to
        // self.view, convert it so it has the same coordinate system
        // as the tap.
        CGRect imageFrameInSuperview = [image.superview convertRect:image.frame toView:self.view];
        // If the tap in fact lies inside the image bounds,
        // perform the appropriate action.
        if (CGRectContainsPoint(imageFrameInSuperview, tapPoint)) {
            // Perhaps call a method here to react to the image tap
            [self reactToImageTap:image];
            break;
        }
    }
}
This way, the above code only runs when a tap gesture is recognized, and your program only reacts to a tap on the scroll view if the tap location falls within an image; otherwise, you can just scroll your UIScrollView as usual.
Here I present my complete solution that:
Forwards touches directly to views instead of calling a control event.
User can specify which classes to forward.
User can specify which views to check if forward is needed.
Here is the interface:
/**
 * This subclass of UIScrollView allows views at a deeper Z index to react when touched, even if the scroll view instance is in front of them.
 **/
@interface MJForwardingTouchesScrollView : UIScrollView

/**
 * Set of Class objects. The scroll view will let events pass through if the initial tap is over a view of one of the specified classes.
 **/
@property (nonatomic, strong) NSSet <Class> *forwardsTouchesToClasses;

/**
 * Optional array of underlying views to test touches against. Default is nil.
 * @discussion By default the scroll view will attempt to forward to views located in the same self.superview.subviews array. Optionally, by providing specific views in this property, the scroll view subclass will also check among them.
 **/
@property (nonatomic, strong) NSArray <__kindof UIView*> *underlyingViews;

@end
And the implementation:
#import "MJForwardingTouchesScrollView.h"
#import "UIView+Additions.h"

@implementation MJForwardingTouchesScrollView

- (instancetype)initWithCoder:(NSCoder *)aDecoder
{
    self = [super initWithCoder:aDecoder];
    if (self != nil)
    {
        _forwardsTouchesToClasses = [NSSet setWithArray:@[UIControl.class]];
    }
    return self;
}

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self != nil)
    {
        _forwardsTouchesToClasses = [NSSet setWithArray:@[UIControl.class]];
    }
    return self;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL pointInside = [self mjz_mustCapturePoint:point withEvent:event];
    if (!pointInside)
        return NO;
    return [super pointInside:point withEvent:event];
}

#pragma mark Private Methods

- (BOOL)mjz_mustCapturePoint:(CGPoint)point withEvent:(UIEvent*)event
{
    if (![self mjz_mustCapturePoint:point withEvent:event view:self.superview])
        return NO;
    __block BOOL mustCapturePoint = YES;
    [_underlyingViews enumerateObjectsUsingBlock:^(__kindof UIView * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        if (![self mjz_mustCapturePoint:point withEvent:event view:obj])
        {
            mustCapturePoint = NO;
            *stop = YES;
        }
    }];
    return mustCapturePoint;
}

- (BOOL)mjz_mustCapturePoint:(CGPoint)point withEvent:(UIEvent *)event view:(UIView*)view
{
    CGPoint tapPoint = [self convertPoint:point toView:view];
    __block BOOL mustCapturePoint = YES;
    [view add_enumerateSubviewsPassingTest:^BOOL(UIView * _Nonnull testView) {
        BOOL forwardTouches = [self mjz_forwardTouchesToClass:testView.class];
        return forwardTouches;
    } objects:^(UIView * _Nonnull testView, BOOL * _Nullable stop) {
        CGRect imageFrameInSuperview = [testView.superview convertRect:testView.frame toView:view];
        if (CGRectContainsPoint(imageFrameInSuperview, tapPoint))
        {
            mustCapturePoint = NO;
            *stop = YES;
        }
    }];
    return mustCapturePoint;
}

- (BOOL)mjz_forwardTouchesToClass:(Class)class
{
    while ([class isSubclassOfClass:NSObject.class])
    {
        if ([_forwardsTouchesToClasses containsObject:class])
            return YES;
        class = [class superclass];
    }
    return NO;
}

@end
The only extra code used is inside the UIView+Additions.h category, which contains the following method:
- (void)add_enumerateSubviewsPassingTest:(BOOL (^_Nonnull)(UIView * _Nonnull view))testBlock
                                 objects:(void (^)(id _Nonnull obj, BOOL * _Nullable stop))block
{
    if (!block)
        return;
    NSMutableArray *array = [NSMutableArray array];
    [array addObject:self];
    while (array.count > 0)
    {
        UIView *view = [array firstObject];
        [array removeObjectAtIndex:0];
        if (view != self && testBlock(view))
        {
            BOOL stop = NO;
            block(view, &stop);
            if (stop)
                return;
        }
        [array addObjectsFromArray:view.subviews];
    }
}
Thanks
The problem is that your UIScrollView is consuming the event. To pass it through, you would have to disable user interaction on it, but then it wouldn't scroll. If you have the touch's location, however, you can calculate where it would fall on the underlying view using the convertPoint:toView: method, and call a method on that view by passing on the CGPoint. From there, you can calculate which image was tapped.
I am building an iOS app. I have a UIWebView that is added as a subview to self.view, then another view, mapView, added as a subview of the web view. The mapView is then sent to the back of the webView. The background of the webView is transparent so that one can see the map.
See the code:
[self.webView addSubview: self.mapView];
[self.webView sendSubviewToBack: self.mapView];
Well what I am trying to do is to pass the gestures of the webView to the mapView so that the user can drag the map.
I have marked the cancelsTouchesInView property to NO for both the webView and the mapView.
I have added a gesture recognizer for the webView. The selector does get called. But what am I supposed to do next?
self.webPanGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleWebPanGesture:)];
[self.webView addGestureRecognizer: self.webPanGesture];
I called the gestureRecognizerShouldBegin method in the webView selector, but it doesn't work.
- (void)handleWebPanGesture:(UIPanGestureRecognizer *)gesture
{
    NSLog(@"WebPanGesture recognizer called!");
    [self.mapView gestureRecognizerShouldBegin:gesture];
    [self panAction:gesture];
    self.mapPanGesture = gesture; // the mapPanGesture property is the gesture recognizer for the map
}
I also call this function
- (IBAction)panAction:(UIPanGestureRecognizer *)sender {
    NSLog(@"panAction called!");
    CGPoint move = [sender translationInView:self.webView];
    CGPoint newCenter = subViewCenter;
    newCenter.x += move.x;
    newCenter.y += move.y;
    self.myMapView.mapView.center = newCenter;
}
But it doesn't make the map draggable; it just moves the view.
self.mapPanGesture = gesture doesn't work either.
How can I forward the actions to the mapView so that the map gets dragged when the user drags on the webView?
I'm sure you should use overlays (MKOverlay) on the mapView to show content on the map, because this is a much easier way to achieve what you need.
Please read Adding Annotations to a Map.
Here I found a workaround, so do check out this link; it might be helpful.
In short, the webView doesn't handle the touchesBegan method, so you need to subclass it, and in touchesBegan you could pass the touches along as follows:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"%@", event);
    [super touchesBegan:touches withEvent:event];
    [_mapView touchesBegan:touches withEvent:event];
}
Or check out the method below:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    // If the hitView is THIS view, return the view that you want to receive the touch instead:
    if (hitView == self) {
        return otherView;
    }
    // Else return the hitView (as it could be one of this view's buttons):
    return hitView;
}
The hitTest above is with reference to this link.
Hope this much info is useful to you.
I have a clear UIView which has gesture recognizers attached to it.
This clear UIView covers the entire superview to allow the gestures to be invoked from anywhere on it.
Under this clear UIView sit different components such as tables, buttons, collection views, etc.
The clear UIView has no idea what is under it at any time.
What I want: if a view under the clear UIView can handle a touch event (or any type of gesture), the clear view should disregard that event, and the event should pass through to the underlying view that can handle it.
I tried
-(UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
but I don't know how to make sure the underlying view can handle it.
- (id)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    id hitView = [super hitTest:point withEvent:event];
    if (hitView == self)
    {
        return nil;
    }
    else
    {
        return hitView;
    }
}
Add this to your clear view.
If the hit lands on the clear view itself, it just returns nil.
You can override the pointInside:withEvent: method. This method returns a Boolean value indicating whether the receiver contains the specified point. So if we return NO, your upper clear view becomes transparent to touch events, and they are passed to the underlying views.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // The clear UIView passes touch events through when we return NO:
    return NO;
}
Use the code below for your case:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitTestView = [super hitTest:point withEvent:event];
    if (hitTestView != nil) {
        // check for gesture recognizers
        if ([hitTestView.gestureRecognizers count] > 0)
            return hitTestView;
        // if it is a subclass of UIControl, like UIButton etc.
        else if ([hitTestView isKindOfClass:[UIControl class]])
            return hitTestView;
        // if it can handle touches
        else if ([hitTestView respondsToSelector:@selector(touchesBegan:withEvent:)])
            return hitTestView;
        else
            return nil;
    }
    else {
        return self;
    }
}
In the above code, if the subview returned by the hit test can handle the touch in any way, we return that object to handle the touch. If there is no such hit-test view, we return the view itself.
I used some of these suggestions and ended up with the following solution:
I added the gesture recognizer to the bottom-most superview in the hierarchy (and not the top-most).
Then in that class I overrode:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    // if v is nil then the touch wasn't in this view or its subviews
    if (v == nil)
    {
        return nil;
    }
    // in any case, if the top view was hidden then return the default value
    if (self.myTopView.hidden)
    {
        return v;
    }
    // if the view isn't hidden but the touch returned a control, then we can pass the touch to the control
    if ([v isKindOfClass:[UIControl class]])
    {
        return v;
    }
    // decide what threshold counts as a touch
    CGFloat threshold = 40;
    // if the touch wasn't on a control but could initiate a gesture, then that view should get the touch
    if (v.gestureRecognizers)
    {
        threshold = 30;
        // return v;
    }
    // check if the threshold should be bigger
    if ([self someCondition])
    {
        threshold = 100;
    }
    // threshold according to its position - this is the dynamic part
    if (point.y > (self.myTopView.frame.origin.y - threshold))
    {
        return self.handleBarView;
    }
    return v;
}
I have a custom UIControl which, when tapped, goes into a confirm state and, when tapped again, performs the desired action.
I want to have this control go back into its initial state if the user interacts anywhere else on the screen. Is there a non-invasive way to achieve this?
Clarification: I consider code invasive if I can't contain it within this control. I'd like to give the dev of another app this code, which they could use to add the control to their app, without having to mess around with code anywhere else in the app. If this isn't possible, fine, but the question is how to accomplish this non-invasively.
You can think of a UITapGestureRecognizer placed on top of everything that triggers the dismiss only when the touch happens outside of your UIControl's bounds.
Something like
- (void)presentConfirm {
    // whatever
    self.dismissRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(dismiss:)];
    self.dismissRecognizer.delegate = self;
    [self.view addGestureRecognizer:self.dismissRecognizer];
}

- (void)dismiss:(UIGestureRecognizer *)gestureRecognizer {
    // do stuff
    [self.view removeGestureRecognizer:self.dismissRecognizer];
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint touchPoint = [touch locationInView:self.view];
    return !CGRectContainsPoint(self.control.frame, touchPoint);
}
Basically you're triggering the dismiss method only when the touch happens outside the UIControl frame (I assumed your control is referenced as self.control).
Also you're going to need a dismissRecognizer property declared as
@property (nonatomic, strong) UITapGestureRecognizer *dismissRecognizer;
and to prevent warnings you should also declare that your controller conforms to the UIGestureRecognizerDelegate protocol.
You could try to accomplish this with the touchesBegan method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (myUIControl.isInConfirmState) { // check if your control is in the confirmed state
        UITouch *touch = [touches anyObject];
        CGPoint touchLocation = [touch locationInView:self.view];
        if (!CGRectContainsPoint(myUIControl.frame, touchLocation)) {
            // set myUIControl back to its initial state
        }
    }
}
Hope this helps.