I'm trying to get a particular event to work.
The user touches one button (A); this button pops up a UIView that contains another button (B).
If the user simply touches up inside button A, the popup should disappear and button B should never be tappable.
However, if the user touches down on button (A) and then drags their finger onto button (B), button (B) should become highlighted, but it should only fire if the user touches up inside it.
I have tried the most obvious candidates, TouchDragEnter and TouchDragInside, but they don't do what I expected: the touch has to begin inside the control (touch down inside) for them to fire. Since the touch-down happened on button (A), those events never fire on button (B).
Have I missed something, or should I just go ahead and create a subclass of my own for this particular behaviour?
To add some difficulty, the button (A) and button (B) are not on the same UIView.
If I were to go about this, I would ignore touchUpInside etc. altogether and just focus on the drag events. In the touchesMoved: method, keep checking where the touches are and whether they intersect the frame of the view you care about.
Something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    if (CGRectContainsPoint(imageView.frame, currentPoint)) {
        // Point lies inside the view's frame...
    }
}
I have just created a library called PopMenu that implements exactly the behaviour you mention. The idea is that it pops out buttons when you touch down on the menu button, keeps tracking the user's finger until the touch ends, and returns the button the user ended the touch on.
Related
Suppose I have a UIView named Cake. Cake has a gesture recognizer.
Now, suppose I have a UIButton named Bob.
I add Cake as a subview to Bob:
[Bob addSubview: Cake];
Now Bob, the UIButton, no longer responds to the touch-up-inside control event.
I want Cake to be able to handle the touch while Bob simultaneously handles the touch as well. Currently, Cake can handle the touch, but Bob lazily does nothing.
Things I have tried:
Setting cancelsTouchesInView of Cake's gesture recognizer to NO
Implementing the UIGestureRecognizerDelegate protocol for Cake's gesture recognizer and always returning YES from gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:
Subclassing UIGestureRecognizer and calling [self.view.nextResponder touchesSomething:touches withEvent:event]; in each of the touchesSomething (touchesBegan, touchesEnded, etc.) methods (I've also confirmed that the next responder IS IN FACT the UIButton that is supposed to handle the control events)
Not using a gesture recognizer and instead just using the touchesSomething methods in the UIView (Cake) + passing through the touchesSomething calls to all of super, self.superview, self.nextResponder and more.
Does anyone know a good way to make this work?
My suggestion is the following:
Make sure that you set userInteractionEnabled on the subview to NO! Now the button's action should be called correctly.
Now add the event parameter to the action:
- (IBAction)pressButton:(UIButton *)sender forEvent:(UIEvent *)event
Inside the action, check whether the press was inside the subview:
- (IBAction)pressButton:(UIButton *)sender forEvent:(UIEvent *)event
{
    // get location
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // check position
    if (CGRectContainsPoint(self.subview.frame, location)) {
        // call the selector the gesture recognizer was calling here
    }
}
What I'm trying to make, Cake, is a view that can be placed as a subview into any button without additional setup; it's a decorative view of sorts. Cake's gesture recognizer is there to play a small animation.
Then you're going about this all wrong. Take away Cake's gesture recognizer; you don't need it. You're trying to get Cake to respond to Bob being pressed. But that's easy; Bob's a button! The button already tells you everything that's happening (it's being highlighted, and so on). So all you need is a UIButton subclass that tells Cake when to do its animation.
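A minimal sketch of that idea (the class name CakeButton and the animatePressed: method on Cake are made up for illustration):

```objc
// Hypothetical UIButton subclass that forwards its highlight state to any
// CakeView it finds among its subviews, so the decoration needs no setup.
@interface CakeButton : UIButton
@end

@implementation CakeButton

- (void)setHighlighted:(BOOL)highlighted {
    [super setHighlighted:highlighted];
    for (UIView *sub in self.subviews) {
        if ([sub isKindOfClass:[CakeView class]]) {
            // Assumed method on CakeView that runs its small animation.
            [(CakeView *)sub animatePressed:highlighted];
        }
    }
}

@end
```

Overriding setHighlighted: works because UIButton sets that property itself as the touch goes down, drags out, and lifts, so Cake gets notified for free.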
I don't know whether this is the right place to ask this question.
I have an image that I am displaying in a UIImageView. This image view takes up the full screen. In the image there are Facebook, Twitter and email buttons (not actual buttons; they're designed into the image).
I want to make the Facebook, Twitter and email "buttons" tappable so they do something. Is this possible?
You are not doing it the right way.
Most of the time when you build an interface there are a lot of logical components, and each component has its own behaviour (animated/tappable/editable). You should use a separate element for each logical component.
In your case, I would set up a full-screen UIImageView just for the background, and add UILabels for the text on it (you can skip this if the text never changes, but I prefer it because it's more flexible in the future). On top of those, add UIButtons for the tappable elements, for example the Facebook and Twitter buttons.
Edit: if you just hate making separate images, you can always make invisible buttons (buttons without any text or image).
Implement the touchesEnded:withEvent: method and check if the touch location is within the rectangle of a desired button in the image.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(_facebookButtonRect, location))
    {
        // Do your stuff.
    }
}
You can also place invisible buttons on top of the image view.
For each button:
1) Add a transparent UIButton as a subview of your image view. Position it over the "button" in the image by setting its frame. Use alpha = 1 and backgroundColor = [UIColor clearColor] for the button, and set the button title to an empty string. All this gives you an invisible button that still responds to touches.
2) Handle the button tap event as usual.
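A sketch of those two steps (the frame values and the facebookTapped: selector are made-up examples; adjust them to your layout):

```objc
// Invisible button overlaid on the Facebook area of the background image.
UIButton *facebookButton = [UIButton buttonWithType:UIButtonTypeCustom];
facebookButton.frame = CGRectMake(20, 400, 120, 44); // measure this from your image
facebookButton.backgroundColor = [UIColor clearColor];
[facebookButton setTitle:@"" forState:UIControlStateNormal];
[facebookButton addTarget:self
                   action:@selector(facebookTapped:)
         forControlEvents:UIControlEventTouchUpInside];

// UIImageView has userInteractionEnabled = NO by default, so enable it,
// otherwise the button subview will never receive touches.
self.imageView.userInteractionEnabled = YES;
[self.imageView addSubview:facebookButton];
```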
I'm trying to call an action for the UIControlEventTouchDown event on a simple UIButton placed in the bottom-left corner of a UIViewController (which is pushed onto a UINavigationController).
I created a storyboard to push the view controller with the button.
And added actions for the button to trace the touch events.
- (IBAction)touchUpInside:(id)sender {
    NSLog(@"touchUpInside");
}

- (IBAction)touchDown:(id)sender {
    NSLog(@"touchDown");
}
And also added touchesBegan to trace if it is called.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touchesBegan");
}
Now, with this setup, I see strange behaviour. There are touch areas on the left (width = 13) and in the bottom-left (width = 50, height = 50) that respond differently to touches. If you touch inside those areas, -touchesBegan is not called on touch down, as it normally would be; it is called only after touch up.
I believe the left area is used by UINavigationController for the interactive pop of a pushed UIViewController. So, two questions here.
Which functionality is the bottom-left area responsible for?
How and where can I change this behaviour so the touch events are passed to the UIButton (for example, if I want the UIButton to respond to a long press in the "red" area)?
I had this same problem, and I fixed it by disabling the "swipe to go back" feature (technically the interactive pop gesture in UIKit) introduced in iOS 7.
Sample code to disable interactive pop gesture:
if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}
I believe this is due to the interactive pop gesture recognizer consuming/delaying touch events near the left edge of the screen (because a swipe to go back starts from the left edge) and thus causing the touch events to not be delivered to controls that are situated near the left edge of the view.
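Note that disabling the recognizer this way affects every controller pushed onto that navigation controller. One way to scope it to just this screen is to toggle it as the view appears and disappears (a sketch, assuming a plain UIViewController subclass):

```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Disable swipe-to-go-back only while this screen is frontmost.
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Restore the gesture for the rest of the navigation stack.
    self.navigationController.interactivePopGestureRecognizer.enabled = YES;
}
```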
I am trying to set up a button using UIControlEventTouchDragEnter as the way to trigger the button's method. Specifically, I have a button, and I want the button's method to be triggered if the user presses their finger outside of the button, and drags their finger into the bounds of the button.
According to Apple, the UIControlEventTouchDragEnter event is "an event where a finger is dragged into the bounds of the control."
However, I can't get the button to trigger. Here is my code:
- (IBAction)touchDragEnter:(UIButton *)sender {
    _sample.image = [UIImage imageNamed:@"alternate_pic.png"];
}
So, when the drag-enter event for this button is triggered, the method should change the current image of _sample to this alternate image. If I use touchUpInside instead, the image does change to the alternate one when I tap the button.
Does anyone know why this isn't working, or have work-arounds? Thanks!
touchDragEnter is only triggered when you initially touch down on the button, drag your finger outside the bounds of the button, and then drag it back into the bounds of the button. The touch has to start on the control itself.
You might want to use the touchesMoved: method in your view controller class instead, and detect when the button is entered based on the location of the touch:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    NSLog(@"%f - %f", touchLocation.x, touchLocation.y);
}
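To actually fire your action when the finger enters the button, you could extend that with a containment check. A sketch (the outlet names self.myButton and self.sample are assumptions; use your own):

```objc
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    // Get the location in the button's superview so it can be compared
    // against the button's frame (which is in superview coordinates).
    CGPoint location = [touch locationInView:self.myButton.superview];

    if (CGRectContainsPoint(self.myButton.frame, location)) {
        // The finger has been dragged into the button; do what the
        // touchDragEnter action was supposed to do.
        self.sample.image = [UIImage imageNamed:@"alternate_pic.png"];
    }
}
```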
My problem is as follows:
I'm displaying a message view with an attachment in a standard view controller. When the user presses and holds the attachment icon, the image is shown on screen; when the user lets go, the image disappears. This is to help detect screenshots while the user is viewing the image.
I use a long press gesture recognizer to detect the touch and then touchesEnded or touchesCancelled to detect the release of the touch.
My problem occurs when the user presses the screen with a second finger, as the release of the second touch is not reported. The code is below, the methods get called in this order:
First long press -> attachmentImageLongPressed called
Second long press -> attachmentImageLongPressed called
Release first finger -> touchesEnded called
Release second finger -> nothing called
-(void)attachmentImageLongPressed:(UILongPressGestureRecognizer *)sender{
if(!self.isAttachmentOpen){
[self setAttachmentOpen:YES];
// Show image...
}
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
[self setAttachmentOpen:NO];
// Remove image from view
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
if(self.isAttachmentOpen){
[self screenshotDetected];
}
}
The result is that the image view is left on screen with no way to dismiss it. Anybody have any suggestions?
I think touchesEnded should be called when you release the second finger. You can log all the touches in every callback to find out which method is actually being called.
However, it's possible that the long-press gesture delays the touch-ended events, so try setting delaysTouchesEnded to NO:
gestureLongPressed.delaysTouchesEnded = NO;