I need to implement functionality where a CCButton can be dragged to whatever position the user wants to place it.
I have created a custom class for this, but the issue is that the button's click method is not called when the user taps the button.
touchyButton.h
#import "cocos2d.h"
#import "cocos2d-ui.h"
@interface touchyButton : CCButton { ... }
@property (nonatomic, assign) BOOL touchMoved;
@end
touchyButton.m
#import "touchyButton.h"
@implementation touchyButton
- (void) touchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
NSLog(@"touchMoved...");
self.touchMoved = YES;
// Re-center the anchor so the button follows the finger around its middle.
self.anchorPoint = ccp(0.5, 0.5);
CGPoint touchLoc = [touch locationInNode:self.parent];
//CGPoint inTouchLoc = [self convertToNodeSpace:self.anchorPoint];
//CGPoint touchP = ccpAdd(touchLoc, inTouchLoc);
//self.position = [self.parent convertToNodeSpace: touchP];
// Move the button to the touch location, expressed in the parent's coordinate space.
self.position = touchLoc;
}
- (void) touchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
self.touchMoved = NO;
NSLog(@"touchEnded...");
}
@end
As the code shows, we are just trying to move the button anywhere on the screen when the user drags it.
Here is how the button is created in the main code where it needs to be displayed:
touchyButton *btnRight = [touchyButton buttonWithTitle: @"" spriteFrame:[[CCSprite spriteWithImageNamed: @"arrR.png"] spriteFrame]];
[btnRight setBackgroundOpacity:0.5f forState: CCControlStateNormal];
[btnRight setAnchorPoint: ccp(1, 0.5)];
[btnRight setPosition: ccp(viewS.width - 10.f, viewS.height/2)];
[self addChild: btnRight];
[btnRight setTarget:self selector: @selector(performRightJump:)];
Now, when the user taps the button, it goes into the selected state, but performRightJump never fires. Can anyone suggest an alternative way to implement a draggable button whose target-action still works? Any hint would be appreciated as well.
One more thing: with the current code I can only move the button's anchor point to the new touch point. Any idea how I can move the button in a natural fashion? With the current method, on the first tap of a drag the button's anchor point jumps to the tapped point.
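One direction I have been considering (untested, just a sketch): keep forwarding the touches to CCButton's own handlers via super, so the button can still complete its press-and-release cycle and fire its selector, and only reposition the node while the drag is in progress:

- (void) touchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    [super touchMoved:touch withEvent:event];   // let CCButton keep tracking its own state
    self.touchMoved = YES;
    self.position = [touch locationInNode:self.parent];
}

- (void) touchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    [super touchEnded:touch withEvent:event];   // without this, the target/selector may never fire
    self.touchMoved = NO;
}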
This code is cocos2d-x, but it may be useful for you.
Create two new classes, classA and classB,
and add this code:
classA.h
{
// composition: classA owns a classB instance
CC_SYNTHESIZE_RETAIN(classB*, _classB, ClassB);
}
classA.cpp
bool classA::init()
{
setClassB(classB::create()); // factory method name may differ in your classB
this->addChild(getClassB(), 0);
// here create the button
return true;
}
bool classA::onTouchBegan(Touch *touch, Event *event)
{
Vec2 touchLoc = touch->getLocation();
_classB->setPosition(touchLoc);
return true;
}
void classA::onTouchMoved(Touch *touch, Event *event)
{
}
void classA::onTouchEnded(Touch *touch, Event *event)
{
}
// the same approach can be used in cocos2d as well
Related
I have been researching this for the last 3 to 4 hours, but I haven't found any information. My issue is that I want to enable userInteraction on only part of a UIViewController.
Description:
I have a UIViewController to which I have added 30 table views. I store one value in the application; if that value is 1, I have to enable user interaction for tableview1 only, if it is 2 then for tableview2 only, and so on.
Please let me know if I am not being clear. Thank you for spending your valuable time. Thanks in advance.
A simple way to do it is to subclass UIView and override - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event.
Return NO for the part of the UIView (represented as ignoreRect in the sample) where you want the view and its subviews to ignore touches.
@interface InteractionView ()
@property (nonatomic) CGRect ignoreRect;
@end
@implementation InteractionView
- (void)awakeFromNib {
[super awakeFromNib];
self.ignoreRect = CGRectMake(0.0f, 0.0f, 300.0f, 300.0f);
}
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
if (CGRectContainsPoint(self.ignoreRect, point)) {
return NO;
}
return [super pointInside:point withEvent:event];
}
@end
If you need more fine-grained control over the expected behavior, for example returning a specific view for a specific zone, or returning the top view of a specific zone, you may use
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
if (CGRectContainsPoint(self.ignoreRect, point)) {
return nil; // Edit that part if you want to return a chosen view
}
return [super hitTest:point withEvent:event];
}
Another solution, without - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event, is to add a UIButton as a subview over the part of the UIView where you want to block interaction.
For example, if you want to block interaction on the bottom half of the view:
UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(0, self.view.frame.size.height*0.5f, self.view.frame.size.width, self.view.frame.size.height*0.5);
[self.view addSubview:button];
Since the button will receive the touch events, that half of the view is effectively closed to user interaction.
EDIT
IBOutletCollection(UITableView) NSArray *allTableViews; // collect references to all your table views in this array; set a tag on each table view in Interface Builder so you can reach it later
Then, when you want to enable/disable interaction on the related table view:
int tagToOpenInteraction = 1;//or whatever it is
for(UITableView *t in allTableViews)
{
if(t.tag == tagToOpenInteraction)
[t setUserInteractionEnabled:YES];
else
[t setUserInteractionEnabled:NO];
}
I have a transparent UIScrollView on top of another view.
The scroll view has content (text and images) used to display info.
The view behind it has some images that the user should be able to tap on,
and the content over them is scrollable using the mentioned scroll view.
I want to be able to use the scroll view normally (no zoom though), but when the scroll view is not actually scrolling, to let tap events through to the view behind it.
Using a combination of the touch and scroll events I can determine when to let the taps through,
but the view behind it still does not receive them.
I have tried using something like this for all touch events:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"touchesBegan %#", (_isScrolling ? #"YES" : #"NO"));
if(!_isScrolling)
{
NSLog(#"sending");
[self.viewBehind touchesBegan:touches withEvent:event];
}
}
but it does not work.
Also, in my case I cannot really apply the hitTest and pointInside solutions, given the use case that I have.
First off, UIScrollView only inherently recognizes its UIPanGestureRecognizer and UIPinchGestureRecognizer, so you need to add a UITapGestureRecognizer to the UIScrollView so it can recognize tap gestures as well:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
// To prevent the pan gesture of the UIScrollView from swallowing up the
// touch event
tap.cancelsTouchesInView = NO;
[scrollView addGestureRecognizer:tap];
Then once you receive that tap gesture and the handleTap: action is triggered, you can use locationInView: to detect whether the tap gesture's position is in fact within the frame of one of the images below your scroll view, for example:
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
// First get the tap gesture recognizers's location in the entire
// view's window
CGPoint tapPoint = [recognizer locationInView:self.view];
// Then see if it falls within one of your below images' frames
for (UIImageView* image in relevantImages) {
// If the image's coordinate system isn't already equivalent to
// self.view, convert it so it has the same coordinate system
// as the tap.
CGRect imageFrameInSuperview = [image.superview convertRect:image.frame toView:self.view];
// If the tap in fact lies inside the image bounds,
// perform the appropriate action.
if (CGRectContainsPoint(imageFrameInSuperview, tapPoint)) {
// Perhaps call a method here to react to the image tap
[self reactToImageTap:image];
break;
}
}
}
This way, the above code only runs when a tap gesture is recognized, and your program only reacts to a tap on the scroll view if the tap location falls within an image; otherwise, you can just scroll your UIScrollView as usual.
Here I present my complete solution that:
Forwards touches directly to views instead of calling a control event.
Lets you specify which classes to forward touches to.
Lets you specify which views to check when deciding whether to forward.
Here the interface:
/**
* This subclass of UIScrollView allows views at a deeper Z index to react when touched, even if the scroll view instance is in front of them.
**/
@interface MJForwardingTouchesScrollView : UIScrollView
/**
* Set of Class objects. The scroll view will let events pass through if the initial tap is over a view of one of the specified classes.
**/
@property (nonatomic, strong) NSSet <Class> *forwardsTouchesToClasses;
/**
* Optional array of underlying views to test for touch forwarding. Default is nil.
* @discussion By default the scroll view will attempt to forward to views located in the same self.superview.subviews array. However, by providing specific views in this property, the scroll view subclass will also check among them.
**/
@property (nonatomic, strong) NSArray <__kindof UIView*> *underlyingViews;
@end
And the Implementation:
#import "MJForwardingTouchesScrollView.h"
#import "UIView+Additions.h"
@implementation MJForwardingTouchesScrollView
- (instancetype)initWithCoder:(NSCoder *)aDecoder
{
self = [super initWithCoder:aDecoder];
if (self != nil)
{
_forwardsTouchesToClasses = [NSSet setWithArray:@[UIControl.class]];
}
return self;
}
- (instancetype)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self != nil)
{
_forwardsTouchesToClasses = [NSSet setWithArray:@[UIControl.class]];
}
return self;
}
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
BOOL pointInside = [self mjz_mustCapturePoint:point withEvent:event];
if (!pointInside)
return NO;
return [super pointInside:point withEvent:event];
}
#pragma mark Private Methods
- (BOOL)mjz_mustCapturePoint:(CGPoint)point withEvent:(UIEvent*)event
{
if (![self mjz_mustCapturePoint:point withEvent:event view:self.superview])
return NO;
__block BOOL mustCapturePoint = YES;
[_underlyingViews enumerateObjectsUsingBlock:^(__kindof UIView * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if (![self mjz_mustCapturePoint:point withEvent:event view:obj])
{
mustCapturePoint = NO;
*stop = YES;
}
}];
return mustCapturePoint;
}
- (BOOL)mjz_mustCapturePoint:(CGPoint)point withEvent:(UIEvent *)event view:(UIView*)view
{
CGPoint tapPoint = [self convertPoint:point toView:view];
__block BOOL mustCapturePoint = YES;
[view add_enumerateSubviewsPassingTest:^BOOL(UIView * _Nonnull testView) {
BOOL forwardTouches = [self mjz_forwardTouchesToClass:testView.class];
return forwardTouches;
} objects:^(UIView * _Nonnull testView, BOOL * _Nullable stop) {
CGRect imageFrameInSuperview = [testView.superview convertRect:testView.frame toView:view];
if (CGRectContainsPoint(imageFrameInSuperview, tapPoint))
{
mustCapturePoint = NO;
*stop = YES;
}
}];
return mustCapturePoint;
}
- (BOOL)mjz_forwardTouchesToClass:(Class)class
{
while ([class isSubclassOfClass:NSObject.class])
{
if ([_forwardsTouchesToClasses containsObject:class])
return YES;
class = [class superclass];
}
return NO;
}
@end
The only extra code used is inside the UIView+Additions.h category, which contains the following method:
- (void)add_enumerateSubviewsPassingTest:(BOOL (^_Nonnull)(UIView * _Nonnull view))testBlock
objects:(void (^)(id _Nonnull obj, BOOL * _Nullable stop))block
{
if (!block)
return;
NSMutableArray *array = [NSMutableArray array];
[array addObject:self];
while (array.count > 0)
{
UIView *view = [array firstObject];
[array removeObjectAtIndex:0];
if (view != self && testBlock(view))
{
BOOL stop = NO;
block(view, &stop);
if (stop)
return;
}
[array addObjectsFromArray:view.subviews];
}
}
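A short usage sketch (not part of the original answer; the property values and the buttonsContainerView name are just an illustration):

// Assumes the scroll view is created in code; it could equally be assigned
// the MJForwardingTouchesScrollView class in Interface Builder.
MJForwardingTouchesScrollView *scrollView = [[MJForwardingTouchesScrollView alloc] initWithFrame:self.view.bounds];

// Let taps pass through to any UIControl (buttons, etc.) sitting behind the scroll view.
scrollView.forwardsTouchesToClasses = [NSSet setWithArray:@[UIControl.class]];

// Optionally also test views that are not siblings of the scroll view.
scrollView.underlyingViews = @[self.buttonsContainerView]; // hypothetical view

[self.view addSubview:scrollView];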
Thanks
The problem is that your UIScrollView is consuming the event. To pass it through, you would have to disable user interaction on it, but then it wouldn't scroll. If you have the touch's location, however, you can calculate where it would fall on the underlying view using the convertPoint:toView: method, and call a method on that view, passing the CGPoint. From there, you can calculate which image was tapped.
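A minimal sketch of that idea (untested; underlyingView and imageViews are placeholder names): from whatever touch-handling code you already have, convert the touch location into the underlying view's coordinate space and test each image frame:

// Hypothetical helper, called from the touch handling you already receive.
- (void)forwardTapFromTouch:(UITouch *)touch {
    // Convert from this view's coordinate space into the underlying view's.
    CGPoint pointBehind = [self convertPoint:[touch locationInView:self]
                                      toView:self.underlyingView];     // placeholder property
    for (UIImageView *imageView in self.underlyingView.imageViews) {   // placeholder collection
        // Assuming the image views are direct subviews of underlyingView,
        // their frames share pointBehind's coordinate space.
        if (CGRectContainsPoint(imageView.frame, pointBehind)) {
            // React to the tapped image here.
            break;
        }
    }
}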
I want to know when a user has touched anywhere on the screen of my app.
I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, as I am also reloading a table automatically, so it gets triggered when that occurs.
I have also tried a gesture recognizer, with the following code. But this will only recognise touches on the view, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recogniser, or code for this, to every button and segmented control I have on the screen.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];
- (void)tapOnView:(UITapGestureRecognizer *)sender
{
//do something
}
I have also tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.
I was wondering if there is any way I could achieve this. I was hoping I might be able to recognise the type of event from within nextResponder, so I could then detect whether it came from a button, for example.
EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness to its original level once the app is touched. I need this feature on only one of my view controllers.
As mentioned by Ian MacDonald, using hitTest:withEvent: is a great solution for detecting user interaction on an app-wide scale, including when buttons, text fields, etc. are selected.
My solution was to subclass UIWindow and implement the hitTest method.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
// do your stuff here
// return nil if you want to prevent interaction with UI elements
return [super hitTest:point withEvent:event];
}
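To have the app actually use such a subclass (a sketch; TouchSensingWindow stands in for whatever you name the subclass, and it assumes the window is created in code), you can return it from the app delegate's window getter:

// AppDelegate.m
- (UIWindow *)window {
    if (!_window) {
        _window = [[TouchSensingWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    }
    return _window;
}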
You could attach your UITapGestureRecognizer to your [[UIApplication sharedApplication] keyWindow].
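For example (just a sketch; screenTouched: is a name I made up):

UITapGestureRecognizer *anyTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(screenTouched:)];
anyTap.cancelsTouchesInView = NO; // don't steal touches from buttons, table views, etc.
[[[UIApplication sharedApplication] keyWindow] addGestureRecognizer:anyTap];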
Alternatively, you could override hitTest: of your root UIView.
Is there a particular task you are hoping to accomplish? There may be a better way than assigning an "anywhere" gesture.
Edit: Use hitTest:.
@interface PassthroughView : UIView
@property (readonly) id target;
@property (readonly) SEL selector;
@end
@implementation PassthroughView
- (void)setTarget:(id)target selector:(SEL)selector {
_target = target;
_selector = selector;
}
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
[_target performSelector:_selector];
return nil;
}
@end
@implementation YourUIViewController {
PassthroughView *anytouchView;
}
- (void)viewDidLoad {
[super viewDidLoad];
// Add this at the end so it's above all other views.
anytouchView = [[PassthroughView alloc] initWithFrame:self.view.bounds];
[anytouchView setAutoresizingMask:UIViewAutoresizingFlexibleWidth|UIViewAutoresizingFlexibleHeight];
[anytouchView setTarget:self selector:@selector(undim)];
[anytouchView setHidden:YES];
[self.view addSubview:anytouchView];
}
- (void)undim {
[anytouchView setHidden:YES];
}
- (void)dim {
[anytouchView setHidden:NO];
}
@end
Your edit adds more clarity to your question.
The reason I am working on this is that my app needs to stay active
and the screen cannot be locked (so I have disabled screen locking).
To avoid excessive use of power, I need to dim the screen, but then
return the brightness back to the original level once the app is
touched.
Since you are controlling the screen brightness, you can, before dimming, add one transparent view controller on top of your root controller that does only one job: listen for a tap using a tap gesture recognizer. On tap, you can dismiss that view controller and restore the brightness to its previous level.
By doing so you don't have to worry about buttons being clicked, as they will be below the transparent view controller. Since it is a whole new view controller sitting on top of the stack, you don't have to modify your existing code either.
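A rough sketch of that approach (untested; overlayTapped: is a made-up selector):

// Present the transparent overlay controller, then dim.
UIViewController *overlay = [[UIViewController alloc] init];
overlay.view.backgroundColor = [UIColor clearColor];
overlay.modalPresentationStyle = UIModalPresentationOverFullScreen;

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(overlayTapped:)];
[overlay.view addGestureRecognizer:tap];

[self presentViewController:overlay animated:NO completion:^{
    [[UIScreen mainScreen] setBrightness:0.1f]; // dim
}];

// In overlayTapped:, restore brightness and dismiss:
// [[UIScreen mainScreen] setBrightness:1.0f];
// [self dismissViewControllerAnimated:NO completion:nil];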
OK, I have had a similar problem before.
As I remember, I subclassed UIWindow for full-screen touch detection and made it the first responder.
Then I overrode the touch handling there rather than in the subclasses.
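One common way to implement that idea (my sketch, not necessarily what I did back then) is to override sendEvent: in the UIWindow subclass, which sees every touch before it is dispatched:

@interface TouchDetectingWindow : UIWindow
@end

@implementation TouchDetectingWindow
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        // Every touch in the app passes through here;
        // restore brightness, post a notification, etc.
    }
    [super sendEvent:event]; // always forward, or the UI stops receiving touches
}
@end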
You can also use code to identify which control was touched:
#import <QuartzCore/QuartzCore.h>
- (void)viewDidLoad
{
[super viewDidLoad];
[self.view setMultipleTouchEnabled:YES];
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
// Enumerate over all the touches
[touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
// Get a single touch and it's location
UITouch *touch = obj;
CGPoint touchPoint = [touch locationInView:self.view];
...
}];
}
To disable locking of the screen, I used the code below:
[[UIApplication sharedApplication] setIdleTimerDisabled:YES];
I used the following calls to dim or restore the screen brightness:
[[UIScreen mainScreen] setBrightness:0.0f]; //and
[[UIScreen mainScreen] setBrightness:1.0f];
I want to handle touches on my disabled button
self.closeButton.enabled = NO;
self.closeButtonDisabledRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(closeButtonDisablePressed)];
[self.closeButton addGestureRecognizer:self.closeButtonDisabledRecognizer];
But it doesn't seem to work. Any good solution?
Disabling the button tells the SDK to ignore touches on it, which is more than you want. I suggest instead:
@property(assign, nonatomic) BOOL treatTheCloseButtonAsEnabled;
// replace the synthesized setter
- (void)setTreatTheCloseButtonAsEnabled:(BOOL)enabled {
_treatTheCloseButtonAsEnabled = enabled; // keep the stored value so the getter stays in sync
self.closeButton.alpha = (enabled)? 1.0 : 0.5;
// or some other visible indication of the "disabled" state
}
- (IBAction)pressedCloseButton:(id)sender {
if (self.treatTheCloseButtonAsEnabled) {
// logic for a regular press
} else {
// logic for a disabled press
}
}
Maybe you can also use the touch events in the controller. It would look like this (I assume your close button is added to controller.view):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
CGPoint touchPoint = [[touches anyObject] locationInView:self.view];
touchPoint = [self.view convertPoint:touchPoint toView:_closeButton.superview];
//to determine whether the user touched your button
if (CGRectContainsPoint(_closeButton.frame, touchPoint) == NO) {
return;
}
//handle your touch on button here
}
I have a CCNode that contains multiple CCSprite children.
I would like to receive touch events in my parent CCNode if any of the children have been touched.
This behaviour seems like it should be supported, I may be missing something.
My solution is to setUserInteractionEnabled = YES on all children and bubble the event up to the parent.
I do this by subclassing the CCSprite class and overriding this method:
- (void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
[super touchBegan:touch withEvent:event];
}
I am wondering if there is a more elegant, simpler, and more generic way of accomplishing the same behaviour?
You could override hitTestWithWorldPos: of your 'containing' node, either calling hitTestWithWorldPos: on specific children or iterating through all children as you see fit. Perhaps something like this:
-(BOOL) hitTestWithWorldPos:(CGPoint)pos
{
BOOL hit = NO;
hit = [super hitTestWithWorldPos:pos];
for(CCNode *child in self.children)
{
hit |= [child hitTestWithWorldPos:pos];
}
return hit;
}
edit: just to be clear, you would then only need to setUserInteractionEnabled on the container, and only process the touch using the touch events of the containing node.
edit2:
So, I thought about it a bit more, and here's a quick category you can add that enables a quick hit test on all children of a node, recursively.
CCNode+CCNode_RecursiveTouch.h
#import "CCNode.h"
@interface CCNode (CCNode_RecursiveTouch)

-(BOOL) hitTestWithWorldPos:(CGPoint)worldPos forNodeTree:(id)parentNode shouldIncludeParentNode:(BOOL)includeParent;

@end
CCNode+CCNode_RecursiveTouch.m
#import "CCNode+CCNode_RecursiveTouch.h"
@implementation CCNode (CCNode_RecursiveTouch)
-(BOOL) hitTestWithWorldPos:(CGPoint)worldPos forNodeTree:(id)parentNode shouldIncludeParentNode:(BOOL)includeParent
{
BOOL hit = NO;
if(includeParent) {hit |= [parentNode hitTestWithWorldPos:worldPos];}
for( CCNode *cnode in [parentNode children] )
{
hit |= [cnode hitTestWithWorldPos:worldPos];
if (cnode.children.count)
{
// on recurse, don't process this parent again
hit |= [self hitTestWithWorldPos:worldPos forNodeTree:cnode shouldIncludeParentNode:NO];
}
}
return hit;
}
@end
Usage would just be: in the containing class, override hitTestWithWorldPos: like this:
-(BOOL) hitTestWithWorldPos:(CGPoint)pos
{
BOOL hit = NO;
hit = [self hitTestWithWorldPos:pos forNodeTree:self shouldIncludeParentNode:NO];
return hit;
}
and of course, don't forget to include the category header.
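For completeness, a small sketch of how the containing node could tie this together (ContainerNode is just an illustrative name):

#import "CCNode+CCNode_RecursiveTouch.h"

@implementation ContainerNode

- (instancetype)init
{
    if ((self = [super init]))
    {
        // Only the container needs user interaction enabled;
        // its children are reached through the recursive hit test above.
        self.userInteractionEnabled = YES;
    }
    return self;
}

// plus the hitTestWithWorldPos: override shown above

- (void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    // Reached whenever the touch lands on the container or any of its children.
}

@end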
-(void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
//Do whatever you like...
//Bubble the event up to the next responder...
[[[CCDirector sharedDirector] responderManager] discardCurrentEvent];
}