Touch not working in mainscene - iOS

I am unable to figure out why. Could someone please tell me where the bug is?
Xcode 6.1.1
Cocos2d 3.1.0
I used a breakpoint to see if the touch method is getting called or not. It's never called when I test and touch on the device.
I used this line below in the main method:
self.userInteractionEnabled = YES;
Also, [super onEnter] is called in my onEnter method.

Make certain your object instance has a content size set in onEnter. Touch may be enabled, but with a (0,0) content size, touches are not dispatched. Also, you must have a touchBegan method in your code. The following lines are pretty much 'boilerplate' for my UI-bearing classes:
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
}

- (void)onEnter {
    [super onEnter];
    self.userInteractionEnabled = YES;
    // replace the following two lines by whatever works for your
    // specifics, but make certain that you have them and that
    // the object's geometry falls within the display screen.
    // setPositionInPointsW is in my own CCNode category, not in cocos2d
    [self setPositionInPointsW:self.viewPort.origin];
    self.contentSizeInPoints = self.viewPort.size;
}

Related

cocos2d 3.0 how to handle touch swallowing with priority

Since CCTargetedTouchDelegate was removed in cocos2d 3.0, I would like to know how I can handle touch swallowing.
I have a small sprite on top of a canvas node (a layer in 2.x) and need to give the sprite higher touch priority than the canvas. When a user touches within the sprite's bounding box, the touch should be swallowed; otherwise the canvas should respond to it.
In cocos2d 3.0, touches are handled in reverse z-order. This means that if your sprite is rendered on top of the canvas node, it should already receive the touch notification first, and it has a chance to swallow it.
However, to receive and swallow the touch in the sprite you should follow these steps (a minimal sketch follows below):
Create a separate class for your sprite and inherit it from CCSprite.
Set self.userInteractionEnabled to YES in the init method of this class.
Add an empty touchBegan: method.
This will swallow the touch, because without calling [super touchBegan:...] in the touchBegan: method, you won't pass it on to the underlying nodes.
The default implementation in CCSprite (and all the way up to CCNode) calls [super touchBegan:...]; this is why you need to create a subclass and override this behaviour.
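Here is a minimal sketch of such a subclass, following the steps above (the class name SwallowingSprite is illustrative, not from the original post):
// SwallowingSprite is a hypothetical name for the CCSprite subclass
@interface SwallowingSprite : CCSprite
@end

@implementation SwallowingSprite

- (id)init {
    if ((self = [super init])) {
        // enable touch handling for this node
        self.userInteractionEnabled = YES;
    }
    return self;
}

// Empty override: by not calling [super touchBegan:...], the touch is
// swallowed here and never propagates to the nodes underneath
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
}

@end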
In case you do need to pass touches to underlying nodes in some cases, you can write something like this:
-(void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (_passToUnderlyingNode == YES)
    {
        // passed to the canvas node
        [super touchBegan:touch withEvent:event];
    }
    else
    {
        // swallowed
    }
}

Objective C iPhone programming a dot or shape to follow your cursor

How would I create a program so that a dot starts in the center, and when I click the screen the dot follows where I clicked? Not as in teleports to it; I mean it changes its coordinates towards the click slightly on every click. I get how I could do it in theory, something like
if (mouseIsClicked) {
    [mouseX moveX];
    [mouseY moveY];
}
And make the class that mouseX and mouseY belong to have some methods to move closer to where the mouse is, but I just don't know the specifics to actually make it happen. Heck, I don't even know how to generate a dot in the first place! None of the guides are helping at all. I really want to learn this language, though. I've been sitting at my Mac messing around trying to get anything to work, but nothing's working anywhere near how I want it to.
Thanks for helping a total newbie like me.
If you are going to subclass UIView, you can use the touchesBegan/touchesMoved/touchesEnded methods to accomplish this. Something like:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    // slightly update the location of your object using p.x and p.y
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    // draw your object with the updated coordinates
}
You can create a dot and move it around based on taps all within your UIViewController subclass.
Make your dot by creating a UIView configured to draw the way you want - look into CALayer and setting dotview.layer.cornerRadius to make it round (alternatively, you can make a UIView subclass that overrides drawRect: to make the right Core Graphics calls to draw what you want). Set dotview.center to position it.
Create a UITapGestureRecognizer with an action method in your view controller that updates dotview.center as desired. If you want it animated, simply set the property within a view animation call and animation block like this:
[UIView animateWithDuration:0.3 animations:^{
    dotview.center = newposition;
}];
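Putting the pieces together, here is a minimal sketch of that setup, assuming a plain UIViewController; dotview is an instance variable and handleTap: is an illustrative name:
// In the view controller (dotview is an instance variable)
// #import <QuartzCore/QuartzCore.h> may be needed for .layer on older SDKs
- (void)viewDidLoad {
    [super viewDidLoad];
    dotview = [[UIView alloc] initWithFrame:CGRectMake(0.0, 0.0, 20.0, 20.0)];
    dotview.backgroundColor = [UIColor blackColor];
    dotview.layer.cornerRadius = 10.0; // half the width makes it a circle
    dotview.center = self.view.center; // start in the center
    [self.view addSubview:dotview];

    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

// The action method animates the dot toward the tapped point;
// to move it only slightly per tap, as the question asks, interpolate
// between dotview.center and the tap location instead
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint newposition = [recognizer locationInView:self.view];
    [UIView animateWithDuration:0.3 animations:^{
        dotview.center = newposition;
    }];
}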
You can download the sample code here that will show you general iOS gestures. In particular, it has a sample that shows how to drag and drop UIViews and how to swipe them around. The sample code includes both driven and fire-and-forget animations. Check it out, it's commented, and I'd be happy to answer any specific questions you have after reviewing the code.
Generating a simple circle
// Above the implementation
#import <QuartzCore/QuartzCore.h>

// In viewDidLoad or somewhere similar
UIView *circleView = [[UIView alloc] initWithFrame:CGRectMake(32.0, 32.0, 64.0, 64.0)];
[circleView setBackgroundColor:[UIColor redColor]];
[circleView.layer setCornerRadius:32.0];
[self.view addSubview:circleView];

Why is the top portion of my UISegmentedControl not tappable?

While I was playing on my phone, I noticed that my UISegmentedControl was not very responsive. It would take 2 or more tries to make my taps register. So I decided to run my app in Simulator to more precisely probe what was wrong. By clicking dozens of times with my mouse, I determined that the top 25% of the UISegmentedControl does not respond (the portion is highlighted in red with Photoshop in the screenshot below). I am not aware of any invisible UIView that could be blocking it. Do you know how to make the entire control tappable?
self.segmentedControl = [[UISegmentedControl alloc] initWithItems:[NSArray arrayWithObjects:@"Uno", @"Dos", nil]];
self.segmentedControl.selectedSegmentIndex = 0;
[self.segmentedControl addTarget:self action:@selector(segmentedControlChanged:) forControlEvents:UIControlEventValueChanged];
self.segmentedControl.height = 32.0;
self.segmentedControl.width = 310.0;
self.segmentedControl.segmentedControlStyle = UISegmentedControlStyleBar;
self.segmentedControl.tintColor = [UIColor colorWithWhite:0.9 alpha:1.0];
self.segmentedControl.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin;
UIView *toolbar = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.width, HEADER_HEIGHT)];
toolbar.autoresizingMask = UIViewAutoresizingFlexibleWidth;
CAGradientLayer *gradient = [CAGradientLayer layer];
gradient.frame = CGRectMake(toolbar.bounds.origin.x,
                            toolbar.bounds.origin.y,
                            // * 2 for enough slack when the iPad rotates
                            toolbar.bounds.size.width * 2,
                            toolbar.bounds.size.height);
gradient.colors = [NSArray arrayWithObjects:
                   (id)[[UIColor whiteColor] CGColor],
                   (id)[[UIColor colorWithWhite:0.8 alpha:1.0] CGColor],
                   nil];
[toolbar.layer insertSublayer:gradient atIndex:0];
toolbar.backgroundColor = [UIColor navigationBarShadowColor];
[toolbar addSubview:self.segmentedControl];
UIView *border = [[UIView alloc] initWithFrame:CGRectMake(0, HEADER_HEIGHT - 1, toolbar.width, 1)];
border.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
border.backgroundColor = [UIColor colorWithWhite:0.7 alpha:1.0];
border.autoresizingMask = UIViewAutoresizingFlexibleWidth;
[toolbar addSubview:border];
[self.segmentedControl centerInParent];
self.tableView.tableHeaderView = toolbar;
http://scs.veetle.com/soget/session-thumbnails/5363e222d2e10/86a8dd984fcaddee339dd881544ecac7/5363e222d2e10_86a8dd984fcaddee339dd881544ecac7_20140509171623_536d6fd78f503_68_896x672.jpg
As already written in other answers, UINavigationBar grabs touches made near the nav bar itself, but not because it has subviews extended over its edges: that is not the reason. If you log the whole view hierarchy, you will see that the UINavigationBar doesn't extend over its defined edges.
The reason why it receives the touches is a different one: in UIKit there are many "special cases", and this is one of them.
When you tap the screen, a process called "hit testing" starts. Starting from the first UIWindow, all views are asked to answer two questions: is the tapped point inside your bounds? Which of your subviews should receive the touch event?
These questions are answered by these two methods:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
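Conceptually, the default hitTest:withEvent: behaves roughly like the sketch below; this is a simplification for illustration, not the actual UIKit source:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // views that are hidden, non-interactive, nearly transparent,
    // or that don't contain the point are skipped entirely
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01 ||
        ![self pointInside:point withEvent:event]) {
        return nil;
    }
    // ask the subviews front-to-back; the deepest hit view wins
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit) {
            return hit;
        }
    }
    // no subview claimed the point, so this view is the hit view
    return self;
}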
OK, now we can continue.
After the tap, UIApplicationMain starts the hit testing process. The hit test starts from the main UIWindow (and is executed even on the status bar window and the alert view window, for example), and goes through all subviews.
This process is executed 3 times:
twice starting from UIWindow
once starting from _UIApplicationHandleEvent
If you tap on the Navigation Bar, you will see that hitTest on UIWindow returns the UINavigationBar (all three times).
If you tap on the area below the Navigation Bar, however, you will see something strange:
the first two hitTest calls return your UISegmentedControl
the last hitTest call returns UINavigationBar
Why is this?
If you swizzle or subclass UIView, overriding hitTest, you will see that the first two times the tapped point is correct. The third time, something changes the point, doing something like point - 15 (or a similar number).
After a lot of searching, I have found where this is happening:
UIWindow has a (private) method called
-(CGPoint)warpPoint:(CGPoint)point;
Debugging it, I saw that this method changes the tapped point if it is immediately below the status bar.
Debugging more, I saw that the stack calls that make this possible are only 3:
[UINavigationBar, _isChargeEnabled]
[UINavigationBar, isEnabled]
[UINavigationBar, _isAlphaHittableAndHasAlphaHittableAncestors]
So, in the end, this warpPoint method checks if the UINavigationBar is enabled and hittable; if so, it "warps" the point. The point is warped by a number of pixels between 0 and 15, and this "warp" increases as you get closer to the Navigation Bar.
Now that you know what happens behind the scenes, you have to know how to avoid it (if you want).
You can't simply override warpPoint: if the application must go on the App Store: it's a private method and your app will be rejected.
You have to find another approach (such as, as suggested, overriding sendEvent, but I'm not sure it will work).
Because this question is interesting, I will think about a legal solution tomorrow and update this answer (one good starting point could be subclassing UINavigationBar, overriding hitTest and pointInside, and returning nil/NO if, given the same event over multiple calls, the point changes. But I must test whether it works tomorrow).
EDIT
OK, I've tried many solutions, but it's not simple to find one that is legal and stable.
I've described the actual behavior of the system, which could vary across versions (hitTest called more or fewer than 3 times, warpPoint warping the point by about 15px, a value that can change, etc.).
The most stable is obviously the illegal override of warpPoint: in a UIWindow subclass:
-(CGPoint)warpPoint:(CGPoint)point
{
    return point;
}
However, I've found that a method like this (in a UIWindow subclass) is stable enough and does the trick:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // note: this method is not safe if you tap the screen twice at the same
    // x position with y positions 16px apart, because it moves the point
    if (self.lastPoint.x == point.x)
    {
        // the points are on the same vertical line
        if ((0 < (self.lastPoint.y - point.y)) && ((self.lastPoint.y - point.y) < 16))
        {
            // is there a difference of ~15px in the y position?
            // if so, the point has been changed
            point.y = self.lastPoint.y;
        }
    }
    self.lastPoint = point;
    return [super hitTest:point withEvent:event];
}
This method records the last point tapped, and if a subsequent tap is at the same x with a y that differs by at most 16px, it reuses the previous point.
I've tested a lot and it seems stable.
If you want, you can add more checks to enable this behavior only in particular controllers, or only on a defined portion of the window, etc.
If I find another solution, I'll update the post.
I believe the problem is because the buttons in the UINavigationBar have a larger than normal touch area. See this SO post. You can also find plenty of discussion on this with a 'UINavigationBar touch area' Google search.
As a possible solution, you could put the segmented control IN the navigation bar, but you would know better than I if that fits your use cases or not.
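If that fits, the change is small; a hedged sketch, assuming the view controller lives inside a UINavigationController:
// Instead of adding the segmented control to a custom toolbar view,
// make it the navigation bar's title view
self.navigationItem.titleView = self.segmentedControl;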
I've come up with an alternate solution that to me seems safer than LombaX's. It uses the fact that both events come in with the same timestamp to reject the subsequent event.
@interface RFNavigationBar ()
@property (nonatomic, assign) NSTimeInterval lastOutOfBoundsEventTimestamp;
@end

@implementation RFNavigationBar

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // [rfillion 2014-03-28]
    // UIApplication/UIWindow/UINavigationBar conspire against us. There's a band under the UINavigationBar for which the bar will return
    // subviews instead of nil (to make those tap targets larger, one would assume). We don't want that. To do this, it seems to end up
    // calling -hitTest twice. Once with a value out of bounds, which is easy to check for. But then it calls it again with an altered point
    // value that is actually within bounds. The UIEvent it passes to both seems to be the same. However, we can't just compare UIEvent pointers
    // because it looks like these get reused and you end up rejecting valid touches if you just keep around the last bad touch UIEvent. So
    // instead we keep around the timestamp of the last bad event, and try to avoid processing any events whose timestamp isn't larger.
    if (point.y > self.bounds.size.height)
    {
        self.lastOutOfBoundsEventTimestamp = event.timestamp;
        return nil;
    }
    if (event.timestamp <= self.lastOutOfBoundsEventTimestamp + 0.001)
    {
        return nil;
    }
    return [super hitTest:point withEvent:event];
}

@end
You might want to check which view is recording the touches. Try this method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    [touch locationInView:self.view];
    if ([touch.view isKindOfClass:[UISegmentedControl class]])
    {
        NSLog(@"This is the UISegmentedControl");
    }
    else if ([touch.view isKindOfClass:[UITabBar class]])
    {
        NSLog(@"This is the UITabBar");
    } else if (...other views...) {
        ...
    }
}
Once you figure that out, you may be able to narrow down your problem.
It looks as if you're using a category extension to set width/height on views, as well as to center them in their parent. Perhaps there is a hidden issue here - can you refactor to do your layout without this category?
I copied your code into a clean project and ran it in a UITableViewController's viewDidLoad method - it works fine and I have no dead spots like you report. I had to change your code slightly since I don't have the same category extension that you're using.
Also, if you're running this code in viewDidLoad, you should verify that your view has a defined size (you access your view.width). If you're creating your UITableViewController programmatically (vs from a nib/storyboard) then the frame may be CGRectZero. Mine was loaded from a nib so the frame was preset.
I'd also try temporarily removing your border view to see if it's the culprit.
I recommend that you avoid having touch-sensitive UI in such close proximity to the nav bar or toolbar. These areas are typically known as "slop factors" making it easier for users to perform touch events on buttons without the difficulty of performing precision touches. This is also the case for UIButtons for example.
But if you want to capture the touch event before the navigation bar or toolbar receives it, you can subclass UIWindow and override -(void)sendEvent:(UIEvent *)event.
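A minimal sketch of that subclass, assuming you install it as the app's window (the class name InterceptingWindow is illustrative):
@interface InterceptingWindow : UIWindow
@end

@implementation InterceptingWindow

- (void)sendEvent:(UIEvent *)event
{
    // inspect the touches here, before UIKit dispatches them to views
    for (UITouch *touch in [event allTouches]) {
        // decide whether to filter or handle this touch ...
    }
    // forward the event so normal dispatch still happens
    [super sendEvent:event];
}

@end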
An easy way to debug this is to try using DCIntrospect in your project. It's a very easy to use/implement library that makes finding out which views are where in the simulator a breeze.
Install the library and configure it.
Run the application in the simulator and navigate to the screen with the issue.
Press the spacebar on the keyboard (the computer keyboard, not the simulator's keyboard).
Click on the 25% area and see what gets highlighted.
If what's highlighted isn't the segmented control, that view could be what's covering up the touch event.
Create a category on UINavigationBar (add a new file and paste in the code below):
/******** file: UINavigationBar+BelowSpace.h *******/
#import <Foundation/Foundation.h>

@interface UINavigationBar (BelowSpace)
@end

/******** file: UINavigationBar+BelowSpace.m *******/
#import "UINavigationBar+BelowSpace.h"

@implementation UINavigationBar (BelowSpace)

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    int errorMargin = 5; // space left to decrease the click event area
    CGRect smallerFrame = CGRectMake(0, 0 - errorMargin, self.frame.size.width, self.frame.size.height);
    BOOL isTouchAllowed = (CGRectContainsPoint(smallerFrame, point) == 1);
    if (isTouchAllowed) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}

@end
Hope this helps ^ ^
Try this
self.navigationController!.navigationBar.userInteractionEnabled = false;

Disable multi-touch for a drawing view

I have a view where I'm drawing lines. When I draw a line with two fingers or more, there is some weird behaviour. That's why I want to disable multi-touch on this view.
I tried :
self.drawingView.multipleTouchEnabled = NO;
self.drawingView.exclusiveTouch = YES;
But there is no impact, and my touches methods are still called.
Ideally, when I try to draw with two fingers, nothing should happen. Is there a solution?
Thanks :)
In your touches methods (Began/Moved), check how many touches are on screen; if there is only one touch, handle it, otherwise pass it along. Example touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ((touches.count == 1) && ([event allTouches].count == 1)) {
        // handle single finger touch moves here
        ....
    } else {
        // if more than one touch, pass it along
        [super touchesMoved:touches withEvent:event];
    }
}

Cocos2d handling touch with multiple layers

I've been busy for a few days trying to figure out how to handle touch in my Cocos2d project. The situation is a bit different than normal. I have a few different game layers with items on them that I need to control with touch:
ControlLayer: holds the game controls (movement, action button). This layer is on top.
GameplayLayer: holds the game objects (CCSprites). This layer is directly beneath the ControlLayer.
My touches work fine in the ControlLayer: I can move my playable character around and make him jump and do other silly stuff. Yet I cannot grasp how to implement touches for some of my CCSprites.
The information I've gathered so far makes me think I need get all my touch input from the control layer. Then I somehow need to 'cascade' the touch information to the GameplayLayer so I can handle the input there. Another option would be for me to get the CGRect information from my sprites by somehow creating an array with pointers to the objects that should be touchable. I should be able to use that information in the ControlLayer to check for each item in that list if the item was touched.
What is the best option to do this, and how do I implement it? I'm kind of new to programming with Cocoa and Objective-C, so I'm not really sure what the best option is for this language, or how to access a sprite's CGRect information ([mySpriteName boundingBox]) in another class than the layer it is rendered in.
At the moment the only way I'm sure would work is to create duplicate CGRects for each CCSprite position so I can check them, but I know this is not the right way to do it.
What I have so far (to test) is this:
ControlLayer.m
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGRect rect = CGRectMake(0.0f, 0.0f, 100.0f, 100.0f);
    // Tried some stuff here to see if I could get a sprite by tag name,
    // so I could use its bounding box, but that didn't work

    // Check for a touch at a specific location
    if (CGRectContainsPoint([tree boundingBox], location)) {
        CCLOG(@"CGRect contains the location, touched!");
    }
    CCLOG(@"Layer touched at %@", NSStringFromCGPoint(location));
}
Thanks in advance for helping me!
The easiest and simplest way to solve your problem, IMO, is by using ccTouchBegan/Moved/Ended instead of ccTouchesBegan/Moved/Ended. Meaning, you handle a single touch at a particular moment, so you avoid getting confused over multiple touches; plus, the most important feature of ccTouchBegan is that a CCLayer can 'consume' the touch and stop it from propagating to the next layers. More explanation after the code samples below.
Here are steps to do it. Implement these sets of methods in all CCLayer subclasses that should handle touch events:
First, register with CCTouchDispatcher:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
Next, implement ccTouchBegan, example below is from a game I've created (some part omitted of course):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (scene.state != lvlPlaying) {
        // don't accept touch if not playing
        return NO;
    }
    CGPoint location = [self convertTouchToNodeSpace:touch];
    if (scene.mode == modePlaying && !firstTouch) {
        if (CGRectContainsPoint(snb_putt.sprite.boundingBox, location)) {
            touchOnPutt = touch.timestamp;
            // do stuff
            // return YES to consume the touch
            return YES;
        }
    }
    // default to not consume touch
    return NO;
}
And finally, implement ccTouchMoved and ccTouchEnded like their ccTouches* counterparts, except that they handle a single touch instead of a set of touches. The touch passed to these methods is restricted to the one consumed in ccTouchBegan, so there is no need to do validation in these two methods.
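For illustration, a minimal sketch of those two methods; the bodies are placeholders:
// called only for the touch that was consumed in ccTouchBegan,
// so no extra validation is needed here
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // move the dragged object to location ...
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // finish handling the consumed touch ...
}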
Basically this is how it works: a touch event is passed by CCScene to each of its CCLayers one by one based on z-ordering (i.e. starting from the top layer down to the bottom layer), until one of the layers consumes the touch. So if a layer at the top (e.g. the control layer) consumes the touch, the touch won't be propagated to the next layer (e.g. the object layer). This way each layer only has to worry about itself when deciding whether to consume the touch or not. If it decides that the touch cannot be used, then it just has to not consume it (return NO from ccTouchBegan) and the touch will automatically propagate down the layers.
Hope this helps.
