I have an Xcode iOS project I'm working on where falling particles move from the top of the screen down to the bottom. Based on the accelerometer, they can fall either fast or slow.
They are all UIImageViews, and when I tap one, it should simply stop moving. This works fine when they are moving slowly enough, and I seem to have to tap just a little bit below them. The problem is that when they are moving fast, I can never seem to hit them by tapping right on top of them.
What's the solution to this? Do I need to make a bigger UIImageView with a smaller UIImage centered in it? Or can I use the UIGestureRecognizer to look for taps in a larger radius?
Try using a UITapGestureRecognizer
(or)
try to catch the touch event using the touchesBegan:withEvent: method.
Try writing touchesBegan: as follows:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch.view isKindOfClass:[UIImageView class]])
    {
        // Execute the code you need to stop the UIImageView from moving.
    }
}
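One thing to check first: if the particles are animated with UIView animation blocks, hit-testing runs against the view's model frame, which jumps to the animation's end value as soon as the animation starts; that would explain why tapping slightly below a slow particle works. As for a larger tap radius without a larger image, you can override pointInside:withEvent: in a UIImageView subclass. A minimal sketch; the subclass name and the 30-point inset are my own choices:
@interface TappableParticleView : UIImageView
@end

@implementation TappableParticleView

// Accept touches up to 30 points outside the view's visible bounds.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    CGRect enlargedBounds = CGRectInset(self.bounds, -30.0, -30.0);
    return CGRectContainsPoint(enlargedBounds, point);
}

@end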
Hope this helps. All the best with your app :-)
Related
I ran into a pretty annoying issue here involving UIScreenEdgePanGestureRecognizer and UIButton interactions.
Here's the context: I am developing an iPad app. The view of my UIViewController contains a lot of UIButtons, both at its very top and bottom. I realized after a while that, when pressed at specific locations, the UIButtons weren't behaving as usual. Taking the top row as an example, when a button is pressed at a location really close to the edge of the screen, the event linked to that button takes around 0.5 seconds to be triggered. I can also see it visually, as the button takes a moment to become highlighted.
What the cause is: I did a bit of research, and from what I could read in this post there is apparently a "conflict between the possibility of a gesture directed at your app and a gesture directed at the system". Basically, as the user taps near the edges of the screen, the device waits to see whether the user will swipe down or up (using UIScreenEdgePanGestureRecognizer, for example), and that is most likely why I get this delay with my UIButtons.
Now: all the posts regarding this issue are a little bit outdated. I tried different workarounds and "not-that-convenient" solutions,
e.g. subclassing my UIButtons to allow the touch to "bypass" the UIScreenEdgePanGestureRecognizer with the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([super pointInside:point withEvent:event]) {
        return YES;
    }
    // Also accept points that land on any subview's frame.
    for (UIView *view in self.subviews) {
        if (CGRectContainsPoint(view.frame, point)) {
            return YES;
        }
    }
    return NO;
}
But in vain; it still doesn't fix the issue.
I am wondering if anyone has come up with an easy solution to this? Or maybe Apple shipped a fix?
Any suggestions / solutions / comments would be greatly appreciated folks!
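For what it's worth, on iOS 11 and later a view controller can ask the system to defer its own edge gestures, which removes this delay for the app's controls. A minimal sketch, assuming the buttons live in this view controller:
// iOS 11+: defer system edge gestures on all edges so touches near the
// screen edge reach the app's buttons without the recognition delay.
- (UIRectEdge)preferredScreenEdgesDeferringSystemGestures
{
    return UIRectEdgeAll;
}
If the preference changes at runtime, call setNeedsUpdateOfScreenEdgesDeferringSystemGestures to have the system re-query it.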
I have a UITableView, and I added the ability to move cells.
http://s017.radikal.ru/i402/1503/76/3442a9517cec.png
So, when the longPressGesture is recognized, I hide the cell, take a snapshot of it (highlighted in grey in the picture), and move the snapshot around while the longPressGesture's state changes. But the movement animation looks bad.
If I add a panGesture once the longPressGesture has begun, it doesn't work until I lift my finger and touch down again; only after that do the panGesture's state changes arrive and the movement becomes smooth.
I need the panGesture to start working while the longPressGesture is active, or some other way to track the touch position on screen.
The difficulty is that - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;, - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;, etc. don't work on a UITableView.
I read a lot, but found nothing; this is a custom case. I need, for example, to recognize a pinchGesture only after a longPressGesture has been recognized, without lifting the finger.
Anybody know how to solve this?
For UITableView, use the built-in methods for moving cells. You can find more information in the documentation.
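In case it helps, a minimal sketch of that built-in reordering, assuming a mutable array self.items backs the table and editing mode has been enabled with [tableView setEditing:YES animated:YES]:
// Allow every row to be moved.
- (BOOL)tableView:(UITableView *)tableView canMoveRowAtIndexPath:(NSIndexPath *)indexPath
{
    return YES;
}

// Keep the data source in sync with the on-screen reorder.
- (void)tableView:(UITableView *)tableView moveRowAtIndexPath:(NSIndexPath *)sourceIndexPath toIndexPath:(NSIndexPath *)destinationIndexPath
{
    id item = self.items[sourceIndexPath.row];
    [self.items removeObjectAtIndex:sourceIndexPath.row];
    [self.items insertObject:item atIndex:destinationIndexPath.row];
}
With these two data source methods in place, the table view draws the reorder controls and animates the move itself.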
I want to be able to detect the following in Cocos2d v3:
A touch is initiated and held, then a second touch occurs somewhere else on the screen. Think of holding with one finger, and tapping with a second.
I've tried to use - (void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event but this is only called the first time the second touch occurs, not subsequently.
To be clear, if I hold a touch on the screen and then tap somewhere else, the above method is called. But if I continue holding the first touch and then tap a second time, the above method is not called.
Also, touchBegan: is only called when the first touch occurs (i.e. the initial holding touch) and touchEnded: is only called when all touches are removed, including the initial holding touch.
I'd like to know:
1) How to recognise the above gesture in Cocos2d v3?
2) If 1) isn't possible, would there be a way to do it with my own gesture recogniser, and how would I implement my own gesture recogniser into Cocos2d v3?
Turns out Cocos2d v3 only responds to a single touch by default.
The solution:
self.multipleTouchEnabled = YES;
This means now every new touch will call:
-(void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
And every time a finger is lifted from the screen it will call:
-(void) touchEnded:(UITouch *)touch withEvent:(UIEvent *)event
This happens even if other touches are still in progress.
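Putting it together for the hold-and-tap gesture from the question, here is a minimal sketch in a CCNode subclass; the _holdTouch ivar and handleTapWhileHolding method are hypothetical names:
- (id)init
{
    if ((self = [super init])) {
        self.userInteractionEnabled = YES;
        self.multipleTouchEnabled = YES;
    }
    return self;
}

- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (_holdTouch == nil) {
        _holdTouch = touch;            // first finger down: the hold
    } else {
        [self handleTapWhileHolding];  // any later finger: the tap
    }
}

- (void)touchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (touch == _holdTouch) {
        _holdTouch = nil;              // the held finger was lifted
    }
}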
If you use void HelloWorld::ccTouchesBegan(CCSet *pTouches, CCEvent *pEvent) (the cocos2d-x API), you can get the touch count using pTouches->count().
I am new to iOS development. I have a custom-drawn view composed of multiple subviews covering the target area on screen. Specifically, this is a board game like chess, where I use a view for each square. The squares are created as subviews of a UIView, and there is one UIViewController for all of this. From what I read, I have to implement touchesBegan, touchesEnded, etc. in my UIView to handle touches, but none of these methods is getting called, even though I added them to the base view and all the subviews. So:
How do I simulate these touch events in the iOS Simulator? A mouse click does not trigger the touchesBegan/touchesEnded methods on any view.
Ideally I would like to handle these in the UIViewController, because I want to run the touch through some logic. Is that possible? How do I achieve it?
Please refer to THIS.
It is a tutorial from Apple's sample code that describes how to handle touches very nicely.
Just run the sample code and go through the description; you will get a clear idea of how touches work in iOS.
Turns out that when I add the view programmatically, and not through the storyboard, the userInteractionEnabled property is set to NO by default. After setting it to YES, touchesBegan/touchesEnded get called in the view.
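On the second question: a UIViewController sits in the responder chain directly above its view, so touches the subviews don't consume bubble up to it, and the logic can live there. A minimal sketch; iterating the subviews' frames is my own choice of lookup:
// In the UIViewController; fires for touches its subviews don't handle.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    for (UIView *square in self.view.subviews) {
        if (CGRectContainsPoint(square.frame, point)) {
            // Run the game logic for the tapped square here.
            break;
        }
    }
}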
Check this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (![touch.view isKindOfClass:[yourView class]])
    {
        // The touch landed outside your view; ignore it.
        return;
    }
    // Handle the touch on your view here.
}
Hope this will help.
So I'm developing an app where all my gestures are being recognized. My problem comes when I attempt to add UIImageViews wherever the finger touches the screen.
These views are supposed to follow the finger, which they do, but I believe they are swallowing the touches and preventing the gestures from being recognized. I have tried:
[_fingerHolder1 setUserInteractionEnabled:NO];
[_fingerHolder2 setUserInteractionEnabled:NO];
But it doesn't seem to change anything.
I am adding these to the view in the ccTouchesBegan/Moved/Ended methods, whereas the gestures are being recognized in their respective handlers.
I have looked at using UIPanGestureRecognizer, but I'm having some trouble recognizing the swipes as well as setting the coordinates of the finger-tracking UIImageViews while doing so. Should I experiment with this more, or is there a different solution?
Any help is appreciated!
The UIImageViews will receive and process the touches themselves, hence they will not be forwarded to the cocos2d OpenGL view (also a UIView).
To make this work you need to create a subclass of UIImageView, override each touches… method, and manually forward the event to cocos2d's view. Here's the example for touchesBegan:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}
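The remaining touches… methods follow the same pattern:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesCancelled:touches withEvent:event];
}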
Use this UIImageView subclass in place of the original ones you use currently.
That will make regular cocos2d touch events work, and it should also make UIGestureRecognizers behave as expected if you've added those to cocos2d's view.
If I understand what you need (please correct me if I'm wrong), you want to move some UIViews when a drag (pan) event is detected, but you also add UIImageViews when the user touches the screen, and this blocks the touches.
You should set UIImageView.userInteractionEnabled = YES (for UIImageView it is set to NO by default); basically, every view that should detect touches needs userInteractionEnabled = YES.
If you want some touches on certain subviews to be ignored, you should implement:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
from UIGestureRecognizerDelegate.
To let different types of gesture be recognized at the same time, implement:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
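For the first delegate method, a sketch that makes a recognizer ignore touches starting on the finger-tracker views (_fingerHolder1 and _fingerHolder2 from the question):
// Don't let this recognizer act on touches that begin on the
// finger-tracker image views; other touches pass through normally.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if (touch.view == _fingerHolder1 || touch.view == _fingerHolder2) {
        return NO;
    }
    return YES;
}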