Forward UITouch event from superview to a UIControl subclass - ios

I'm writing a radial menu where, when you long press (UILongPressGestureRecognizer) on the screen, a menu of buttons pops out. I can then drag my finger (which is already touching the screen) over one of the buttons, which selects it, and when I let go, it performs an action specific to that button.
I currently have the radial menu as a UIControl subclass, and I'm trying to override beginTrackingWithTouch: and continueTrackingWithTouch:, but the long press that shows the menu (adds it to the superview) does not get transferred to a touch tracked by the UIControl.
Any ideas how I can "forward" this touch event from the UIControl's superview to it?
Thanks!

Not a direct answer, but you should really watch this year's WWDC session on scroll views, and then watch it again. It contains a fantastic amount of information, and almost certainly an answer to your question: session 235, Advanced Scrollviews and Touch Handling Techniques.

I would do this...
The long press handler:
-(IBAction)onLongPress:(UILongPressGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:self.view];

    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // create the radial view and add it to the view
        CGSize radialViewSize = CGSizeMake(80, 80);
        radialView = [[RadialView alloc] initWithFrame:CGRectMake(point.x - radialViewSize.width / 2,
                                                                  point.y - radialViewSize.height / 2,
                                                                  radialViewSize.width,
                                                                  radialViewSize.height)];
        [self.view addSubview:radialView];
        radialView.backgroundColor = [UIColor redColor];
    } else if (recognizer.state == UIGestureRecognizerStateEnded) {
        [radialView onTouchUp:[radialView convertPoint:point fromView:self.view]];
        [radialView removeFromSuperview];
        radialView = nil;
    }
}
In your radial view: (I suppose that the radial view keeps the buttons in an array)
-(void)onTouchUp:(CGPoint)point
{
    for (UIButton *button in buttons) {
        if ([button pointInside:[self convertPoint:point toView:button] withEvent:nil]) {
            // this button got clicked: send the button clicked event
            [button sendActionsForControlEvents:UIControlEventTouchUpInside];
        }
    }
}
I know it's not perfect, since the touch events don't get forwarded to the radial view (as you asked), but it lets you click the buttons. Hope it helps!
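If you also want the buttons to highlight while the finger slides around before it lifts, a rough sketch of the same idea is to handle the changed state in the long press handler as well. Here onTouchMove: is a hypothetical helper you would add to RadialView yourself, not an existing method:

else if (recognizer.state == UIGestureRecognizerStateChanged) {
    // forward the current finger position so the menu can highlight the button under it
    [radialView onTouchMove:[radialView convertPoint:point fromView:self.view]];
}

And in the radial view:

-(void)onTouchMove:(CGPoint)point
{
    for (UIButton *button in buttons) {
        // highlight only the button currently under the finger
        button.highlighted = [button pointInside:[self convertPoint:point toView:button] withEvent:nil];
    }
}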

I'm not sure if this is the same behavior you're looking for, but I recently had to overcome the exact same issue when developing my Concentric Radial Menu. The thing I discovered very quickly was that views added to the view hierarchy during the touch event do not get re-hit-tested and therefore seem unresponsive until the next event comes around.
The solution I used, which I can't say I love, was to implement a custom UIWindow subclass that intercepts - (void)sendEvent:(UIEvent *)event and forwards those events to the "active" radial menu.
That is, the menu registers with the window upon activation, then unregisters when being unloaded. If done atomically, this is actually a pretty safe technique, I just wish it were cleaner than it is.
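A rough sketch of that technique, where RadialMenuView, activeMenu and forwardEvent: are hypothetical names for the menu's side of the contract rather than an existing API, might look like this:

@interface ForwardingWindow : UIWindow
// the currently active radial menu, if any (set on activation, cleared on unload)
@property (nonatomic, weak) RadialMenuView *activeMenu;
@end

@implementation ForwardingWindow
- (void)sendEvent:(UIEvent *)event
{
    [super sendEvent:event];
    // also hand the raw event to the registered menu so it can track the touches itself
    [self.activeMenu forwardEvent:event];
}
@end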
Best of luck!

Related

TouchesBegan delay on left hand side of the display

On iPhones with 3D Touch enabled, there is a feature where pressing the left hand side of the screen with enough force lets you switch to another app. Because of this, when a non-moving touch happens on the left hand side of the screen, the touch event is delayed for a second or two until the iPhone verifies that the user is not trying to switch tasks and is interacting with the app.
This is a major problem when developing a game with SpriteKit, as these touches are delayed by a second every time a user taps/holds their finger on the left edge of the screen. I was able to work around this by registering a UILongPressGestureRecognizer in the main scene of the game, bypassing touchesBegan and implementing a custom touch-handling method (used as the selector for the gesture recognizer):
-(void)handleLongPressGesture:(UILongPressGestureRecognizer *)gesture {
    CGPoint location = [gesture locationInView:self.view];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateCancelled)
    {
        //
    }
}
-(void)didMoveToView:(SKView *)view {
    /* Setup your scene here */
    UILongPressGestureRecognizer *longPressGestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPressGesture:)];
    longPressGestureRecognizer.delaysTouchesBegan = NO;
    longPressGestureRecognizer.minimumPressDuration = 0;
    [view addGestureRecognizer:longPressGestureRecognizer];
    // continue
}
The problem with this is that I would have to implement a gesture recognizer for every touch (including simultaneous ones) that I expect the user to enter. This interferes with the touchesBegan methods of my SKSpriteNode, SKScene, etc. subclasses and kills a lot of functionality.
Is there any way to disable this delay? When registering the gesture recognizer, I was able to set its delaysTouchesBegan property to false. Can I do the same somehow for my SKScene?
To see this issue in action, you can run the default SpriteKit project, and tap (hold for a second or two) near the left hand side of the screen. You will see that there is a delay between when you touch the screen and when the SKShapeNodes are rendered (as opposed to touching anywhere else on the screen).
* Edit 1 *
For those trying to work around this for now, you can keep the gesture recognizer but set its cancelsTouchesInView to false. Use the gesture recognizer to do everything you need until touchesBegan kicks in (touchesBegan will receive the same touch event about a second after the gesture recognizer recognizes the touch). Once touchesBegan kicks in, you can disable everything happening in the gesture recognizer. This seems like a sloppy fix to me, but it works for now.
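A minimal sketch of that workaround, where gestureHandledByTouches is just an illustrative ivar and not part of any API:

// when creating the recognizer (e.g. in didMoveToView:)
longPressGestureRecognizer.cancelsTouchesInView = NO; // let touchesBegan: still arrive, about a second later

-(void)handleLongPressGesture:(UILongPressGestureRecognizer *)gesture {
    if (gestureHandledByTouches) return; // touchesBegan has taken over, stop double-handling
    // react to the touch immediately here
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    gestureHandledByTouches = YES; // the delayed system touch finally arrived
    // normal touch handling from here on
}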
Still trying to find a more-or-less formal solution.
I have experienced this as a user and it is really annoying. The only thing that worked for me was to disable 3D Touch. Otherwise the left side of the touchscreen is almost useless.

UITapGestureRecognizer on UIView gets fired after I tap on my MPVolumeSlider

I found some questions and answers here on stackoverflow for that problem, but none of the solutions there solved my problem.
My iOS App has the ability to play some music with a nice music player. I designed it with Xcode's Interface Builder and dragged out a UIView and changed its class to MPVolumeView. Everything works fine when I'm debugging my app on my iPhone 6.
Here is my problem: I also dragged out a UITapGestureRecognizer on my whole view which contains my controls like
play/pause, next/previous track (...)
and also my MPVolumeView. When I tap on that view, it should fade out and disappear. Then I added a UITapGestureRecognizer on my UIImageView which shows the artwork image of the song. When I tap this image view, it should fade in my view with all controls in it - that's working properly.
BUT: When I slide the knob of the volume slider just a little bit, or if I am just touching it, the view still disappears. It seems like my MPVolumeView is forwarding my touch or something like that. I tried setting userInteractionEnabled = false on my volume slider, but that didn't help. I also set the delegate of my gesture recognizer to self and added the
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    NSLog(@"tapped");
    if ([gestureRecognizer.view isMemberOfClass:[UIImageView class]]) {
        return true;
    }
    return false;
}
function to my code, which returns true or false, depending on which view I'm tapping. When I'm accessing the gestureRecognizer.view property, it doesn't recognize my MPVolumeView, just the UIView in the background.
Here are my two methods, which are called when the tap gesture recognizers fire:
- (IBAction)overlayViewTapped:(UITapGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateEnded) {
        [UIView animateWithDuration:0.3
                              delay:0.0
                            options:UIViewAnimationOptionAllowUserInteraction
                         animations:^{ self.blackOverlayView.alpha = 0.0; self.normalTimeLabel.alpha = 1.0; }
                         completion:nil];
    }
}

- (IBAction)imageViewTapped:(UITapGestureRecognizer *)sender {
    [UIView animateWithDuration:0.3
                          delay:0.0
                        options:UIViewAnimationOptionAllowUserInteraction
                     animations:^{ self.blackOverlayView.alpha = 1.0; self.normalTimeLabel.alpha = 0.0; }
                     completion:nil];
}
Please help me, I'm nearly going nuts with that ..
EDIT: My music player looks like this:
After I tap anywhere on the view (except the subviews), the view should fade out and hide everything, just show the artwork image of the song and the current elapsed time. This will look like this:
As I said - the problem is, if I just tap the volume slider or slide it just a little bit, my UITapGestureRecognizer fires and fades out my complete view. How can I prevent that?
It is behaving the way it is simply because you added the gesture recognizer to the entire UIView, which includes the volume slider and whatnot.
Instead of detecting the touch in the entire view, check to see if the touch is in the area you want it.
Create a CGRect property; I'll call it touchArea:
@property (nonatomic) CGRect touchArea;
Then specify the size of the touchArea (you can do this in viewDidLoad):
self.touchArea = CGRectMake(0.0, 240.0, 320.0, 240.0);
You will have to work out where you want this and how big it should be, and replace my example values with the real ones. A simple way of cheating this is to drop something like a UILabel into IB, position and size it as desired, then go to the size inspector pane and read off the x, y, width and height values.
Then, before you do your fade animation, check to see if the touch was in the touchArea:
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint touchPoint = [gestureRecognizer locationInView:self.view];
    if (CGRectContainsPoint(self.touchArea, touchPoint))
    {
        // do your animation here
    }
}
As a note, I would set a BOOL to check whether or not the view is faded in or out, so you can always check before animating.
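For example, a sketch using a hypothetical overlayHidden BOOL property together with the touchArea check:

- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint touchPoint = [gestureRecognizer locationInView:self.view];
    if (!CGRectContainsPoint(self.touchArea, touchPoint)) {
        return; // ignore taps outside the hit area (e.g. on the volume slider)
    }
    if (!self.overlayHidden) {
        // only fade out when the controls are currently visible
        self.overlayHidden = YES;
        [UIView animateWithDuration:0.3 animations:^{
            self.blackOverlayView.alpha = 0.0;
        }];
    }
}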

How to know if subviews intersect while moving with UIGestureRecognizer or some other way?

In my application I have many UIButtons dynamically added to the view, and I use the method below to drag them around the view.
// for drag action
[btnTarget addTarget:self action:@selector(wasDragged:withEvent:)
    forControlEvents:UIControlEventTouchDragInside];

- (void)wasDragged:(UIButton *)button withEvent:(UIEvent *)event
{
    // get the touch
    UITouch *touch = [[event touchesForView:button] anyObject];
    // get delta
    CGPoint previousLocation = [touch previousLocationInView:button];
    // frame of button changed here
}
I want to stop the dragging action if the dragged button intersects with any other. I know that I can use a for loop like the one below to check whether any UIButton is intersecting:
for (UIButton *btn in [[button superview] subviews])
{
    // check if the btn frame intersects with any others; if so, come out of the loop
}
I want to know if there is some other way, as the approach above will get slower if the subview count increases to a large amount.
Edit: the UIButtons are dynamically added to the UIView (but the total number of subviews won't exceed 120).
Try brute force first. You might be surprised at how well it does. (But remember that a loop through [[button superview] subviews] will contain the button itself, so it will always stop because the button intersects itself. Be sure to exclude the button).
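A brute-force check along those lines might look like this, where wouldIntersectAtFrame: is just an illustrative helper and newFrame is the frame the button is about to be moved to:

- (BOOL)button:(UIButton *)button wouldIntersectAtFrame:(CGRect)newFrame
{
    for (UIView *other in [[button superview] subviews]) {
        if (other == button) {
            continue; // a view always intersects itself, so skip the dragged button
        }
        if (CGRectIntersectsRect(newFrame, other.frame)) {
            return YES; // the candidate position overlaps another subview
        }
    }
    return NO;
}

In wasDragged:withEvent: you would compute the candidate frame from the touch delta and only apply it when this returns NO.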
Optimize after you have something working that is demonstrably slow with real data.
If that's really the case, there's a whole lot of algorithmic work done on this problem, which can be summarized as preprocessing the data into structures that allow cheaper initial tests to reject distant objects. This is a good SO answer on the topic, referring to this article.
I don't think that the number of subviews that can fill the screen (without intersections) is too large. So use the function:
bool CGRectIntersectsRect(CGRect rect1, CGRect rect2);
to detect whether the frame of the dragged button intersects with another subview.

Detect when finger dragging UIButton overlaps UIImageView

I am using UIPanGestureRecognizer to drag a UIButton around the screen. The idea is that the user can drag it over a folder to insert it in the folder (like iOS icons). This code I found works fine if I want to detect when the button overlaps with the image:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (CGRectIntersectsRect([imageViewA frame], [imageViewB frame])) {
        NSLog(@"Do something.");
    }
}
But since the button is big and there are more images one next to another, it may happen that the button overlaps with both of them. I therefore want to detect when the actual user finger holding the UIButton overlaps with the image to trigger the right action. Any ideas?
The UIPanGestureRecognizer will recognize the pan, and when it ends you can use locationInView: to find the finger's position in the image view's superview. You can then see if they are overlapping with CGRectContainsPoint(frame, point):
- (void)handlePanGesture:(UIPanGestureRecognizer *)recognizer {
    if ([recognizer state] == UIGestureRecognizerStateEnded) {
        CGPoint fingerPoint = [recognizer locationInView:someImageView.superview];
        if (CGRectContainsPoint(someImageView.frame, fingerPoint)) {
            NSLog(@"Do something");
        }
    }
}

Multiple Gesture Responders for a Single View

I have an image that I would like to set up to respond to several different gesture responders. So for example, if one part of the picture is touched I would like one selector to be called, and another selector for a different part of the picture.
I looked at the UIGestureRecognizer and UITapGestureRecognizer classes, but I couldn't find a way to specify the image zones to be associated with them. Is this at all possible in iOS? And if so what classes should I look into using?
The easiest solution is to lay invisible views over the image and put the gesture recognizers on them.
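That approach could look roughly like this, with placeholder frames, an imageView outlet, and a firstMethod action that you would replace with your own:

// one transparent view per tappable zone, laid over the image view
UIView *leftZone = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 100, 200)];
leftZone.backgroundColor = [UIColor clearColor];
[leftZone addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self
                                                                       action:@selector(firstMethod)]];
[imageView addSubview:leftZone];

// UIImageView has user interaction disabled by default, so enable it
imageView.userInteractionEnabled = YES;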
If that's not feasible you'll have to look at the locationInView in the gesture recognizer's tap handler and figure out what you want to do based on where the user tapped.
Use the locationInView: method to determine where your tap occurred and then conditionally invoke a method. You can do this by setting up some CGRects that correspond to your hit areas, then using the CGRectContainsPoint() function to determine whether the tap landed in one of them.
Your tap gesture recognizer action may look something like this:
- (void)tapGestureRecognized:(UIGestureRecognizer *)recognizer
{
    // Specify some CGRects that will be hit areas
    CGRect firstHitArea = CGRectMake(10.0f, 10.0f, 44.0f, 44.0f);
    CGRect secondHitArea = CGRectMake(64.0f, 10.0f, 44.0f, 44.0f);

    // Get the location of the touch in the view's coordinate space
    CGPoint touchLocation = [recognizer locationInView:recognizer.view];

    if (CGRectContainsPoint(firstHitArea, touchLocation))
    {
        [self firstMethod];
    }
    else if (CGRectContainsPoint(secondHitArea, touchLocation))
    {
        [self secondMethod];
    }
}
