I have a situation in which I would like to place a UIView (that is not my code) on top of another UIView and have events propagated to the bottommost view. To be more concrete, I would like to place a UIView on top of an MKMapView.
I have a UIView that contains both views (as subviews). I would like my UIView to forward the events it receives to both of its subviews.
.h:
@interface MyMapView : UIView
@end
.m:
#import "MyMapView.h"
#import <MapKit/MapKit.h>
#import "MapGestureRecognizer.h"
@implementation MyMapView
MKMapView *mkMapView;
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        mkMapView = [[MKMapView alloc] initWithFrame:frame];
        [self addSubview:mkMapView];
        // Create another view to place on top of the mkMapView.
        // This is not my view; I do not control its codebase. I would like to put it
        // on top of the mkMapView and still have the mkMapView receive touch events.
        UIView *otherView = [[UIView alloc] initWithFrame:frame];
        otherView.backgroundColor = [UIColor clearColor];
        [self addSubview:otherView];
        MapGestureRecognizer *mgr = [[MapGestureRecognizer alloc] init];
        mgr.touchesBeganCallback = ^(NSSet *touches, UIEvent *event) {
            UITouch *touch = [[event allTouches] anyObject];
            NSLog(@"touches began=%@", [NSString stringWithFormat:@"%li", (long)touch.view.tag]);
            [mkMapView touchesBegan:touches withEvent:event];
        };
        mgr.touchesCancelledCallback = ^(NSSet *touches, UIEvent *event) {
            UITouch *touch = [[event allTouches] anyObject];
            NSLog(@"touches cancelled=%@", [NSString stringWithFormat:@"%li", (long)touch.view.tag]);
            [mkMapView touchesCancelled:touches withEvent:event];
        };
        mgr.touchesEndedCallback = ^(NSSet *touches, UIEvent *event) {
            UITouch *touch = [[event allTouches] anyObject];
            NSLog(@"touches ended=%@", [NSString stringWithFormat:@"%li", (long)touch.view.tag]);
            [mkMapView touchesEnded:touches withEvent:event];
        };
        mgr.touchesMovedCallback = ^(NSSet *touches, UIEvent *event) {
            UITouch *touch = [[event allTouches] anyObject];
            NSLog(@"touches moved=%@", [NSString stringWithFormat:@"%li", (long)touch.view.tag]);
            [mkMapView touchesMoved:touches withEvent:event];
        };
        [self addGestureRecognizer:mgr];
    }
    return self;
}
@end
I implemented my own gesture recognizer to intercept these events, and I am trying to pass the intercepted events through to the mkMapView. I see the log messages; however, it does not seem to pass the events through, as I cannot make the MKMapView scroll/pan.
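(For reference, MapGestureRecognizer is a block-callback recognizer along these lines — a sketch, since the class itself isn't shown above; only the callback property names are taken from the code:)
// MapGestureRecognizer.h / .m (sketch)
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h> // required to override the touches* methods

typedef void (^TouchesEventBlock)(NSSet *touches, UIEvent *event);

@interface MapGestureRecognizer : UIGestureRecognizer
@property (nonatomic, copy) TouchesEventBlock touchesBeganCallback;
@property (nonatomic, copy) TouchesEventBlock touchesMovedCallback;
@property (nonatomic, copy) TouchesEventBlock touchesEndedCallback;
@property (nonatomic, copy) TouchesEventBlock touchesCancelledCallback;
@end

@implementation MapGestureRecognizer
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.touchesBeganCallback) self.touchesBeganCallback(touches, event);
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.touchesMovedCallback) self.touchesMovedCallback(touches, event);
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.touchesEndedCallback) self.touchesEndedCallback(touches, event);
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.touchesCancelledCallback) self.touchesCancelledCallback(touches, event);
}
@end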
A gesture recogniser works with the topmost view: if another view sits on top of the view with the attached recogniser, the recogniser does not fire. A quick and dirty way would be to assign the same set of recognisers to otherView and drop them when not needed. The action need not be performed on the same view that receives the gestures, so receiving the pan gestures in otherView and then panning the map in the background should work, as in the sketch below.
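A minimal sketch of that idea, assuming mkMapView and otherView are reachable from the handler; handlePan: is a hypothetical method name and the region math is illustrative, not tuned:
// During setup (e.g. in initWithFrame:), attach the recognizer to the overlay, not the map.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handlePan:)];
[otherView addGestureRecognizer:pan];

// Pans the map programmatically by the drag distance.
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGPoint translation = [pan translationInView:pan.view];
    // Convert the map's center to a point, shift it by the drag, convert back.
    CGPoint center = [mkMapView convertCoordinate:mkMapView.centerCoordinate
                                    toPointToView:mkMapView];
    center.x -= translation.x;
    center.y -= translation.y;
    CLLocationCoordinate2D newCenter = [mkMapView convertPoint:center
                                          toCoordinateFromView:mkMapView];
    [mkMapView setCenterCoordinate:newCenter animated:NO];
    // Reset so each callback reports an incremental translation.
    [pan setTranslation:CGPointZero inView:pan.view];
}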
Change your code to the one below:
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        mkMapView = [[MKMapView alloc] initWithFrame:frame];
        [self addSubview:mkMapView];
        UIView *otherView = [[UIView alloc] initWithFrame:frame];
        otherView.backgroundColor = [UIColor clearColor];
        [self addSubview:otherView];
    }
    return self;
}
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    return mkMapView;
}
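Note that returning mkMapView unconditionally redirects every touch, even ones that fall outside this view. A slightly safer variant (a sketch) bails out first:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Keep system behavior for points outside this view.
    if (![self pointInside:point withEvent:event]) return nil;
    // Everything inside goes straight to the map, so it keeps its
    // native pan/zoom handling while otherView stays visual-only.
    return mkMapView;
}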
Related
I'm trying to subclass UIView to build a transparent view. This view will sit on top of many other views, and its only task is to capture and record user touches (tap and pan). I have tried many different methods, explained in questions asked by other users, with no luck. This is what I have done so far in my implementation file:
#import "touchLayer.h"
#implementation touchLayer
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) [self commonInit];
return self;
}
- (id)initWithCoder:(NSCoder *)aDecoder
{
self = [super initWithCoder:aDecoder];
if (self) [self commonInit];
return self;
}
- (void)commonInit
{
self.userInteractionEnabled = YES;
self.alpha = 0.0;
}
- (id)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
id hitView = [super hitTest:point withEvent:event];
if (hitView == self)
{
UITouch *touch = [[event allTouches] anyObject];
if (touch.phase == UITouchPhaseBegan) NSLog(#"touchesBegan");
else if (touch.phase == UITouchPhaseMoved) NSLog(#"touchesMoved");
else if (touch.phase == UITouchPhaseEnded) NSLog(#"touchesEnded");
return nil;
}
else
return hitView;
}
#end
Now this code works just fine, and I see the touches in the lower layers, but I cannot differentiate between touchesBegan, touchesMoved, and touchesEnded. [[event allTouches] anyObject] returns nil. Do you have any idea how I can capture tap and pan on a UIView without blocking the touches? Thanks a lot.
After investigating, I couldn't find a way to detect touches from inside hitTest:withEvent: with a touch layer; hit-testing runs before the touches are attached to the event, which is presumably why [[event allTouches] anyObject] returns nil there. But since your question is about capturing and recording user touches, I have another approach for this issue.
My solution is:
Subclass UIWindow.
Replace the window of your app delegate with a new one created from your window class.
Override the sendEvent: method of UIWindow, and capture and record user touches in that method.
This is my subclass of UIWindow to detect touches. I tried it and it works.
@implementation DetectTouchWindow

- (void)sendEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    switch ([touch phase]) {
        case UITouchPhaseBegan:
            NSLog(@"Touch Began");
            break;
        case UITouchPhaseMoved:
            NSLog(@"Touch Move");
            break;
        case UITouchPhaseEnded:
            NSLog(@"Touch End");
            break;
        case UITouchPhaseCancelled:
            NSLog(@"Touch Cancelled");
            break;
        default:
            break;
    }
    [super sendEvent:event];
}

@end
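For the window-replacement step, the swap looks roughly like this — a sketch, assuming a code-created window in the app delegate; RootViewController is a placeholder name:
// AppDelegate.m (sketch)
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Use the touch-detecting window class instead of plain UIWindow.
    self.window = [[DetectTouchWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    self.window.rootViewController = [[RootViewController alloc] init]; // placeholder
    [self.window makeKeyAndVisible];
    return YES;
}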
For more detail, and to make this easier to try, I created a demo repo. You can take a look at this link: https://github.com/trungducc/stackoverflow/tree/recording-touch-events
Hope this helps ;)
I have a customized UIView:
@interface EColumn : UIView
I have many instances of this EColumn in its superview.
How can I detect when a finger holds and moves within the area of this UIView, and when it moves out?
I don't mean a tap gesture; I can detect a tap gesture by using this:
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(taped:)];
[self addGestureRecognizer:tapGesture];
@implementation EColumn

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *view = [self touchedViewWithTouches:touches andEvent:event];
    NSLog(@"%@", view);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *view = [self touchedViewWithTouches:touches andEvent:event];
    NSLog(@"%@", view);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *view = [self touchedViewWithTouches:touches andEvent:event];
    NSLog(@"%@", view);
}

- (UIView *)touchedViewWithTouches:(NSSet *)touches andEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    UIView *touchedView = nil;
    for (UIView *view in self.subviews)
    {
        if (CGRectContainsPoint(view.frame, touchLocation))
        {
            touchedView = view;
            break;
        }
    }
    return touchedView;
}

@end
You can detect a finger held down for a specific time with UILongPressGestureRecognizer. For this, you can also specify minimumPressDuration and numberOfTouchesRequired:
UILongPressGestureRecognizer *longPressRecognizer =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(longPressDetected:)];
longPressRecognizer.minimumPressDuration = 3;
longPressRecognizer.numberOfTouchesRequired = 1;
[self addGestureRecognizer:longPressRecognizer];
For detecting moves, you can use UIPanGestureRecognizer:
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(move:)];
[panRecognizer setMinimumNumberOfTouches:1];
[panRecognizer setMaximumNumberOfTouches:1];
[self addGestureRecognizer:panRecognizer];
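The handlers referenced above could look like this — a sketch; longPressDetected: and move: are just the selector names used when registering:
- (void)longPressDetected:(UILongPressGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        NSLog(@"finger held down in %@", recognizer.view);
    }
}

- (void)move:(UIPanGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:recognizer.view];
    // pointInside:withEvent: tells you whether the finger is still
    // within the column's bounds, i.e. when it has moved out.
    BOOL inside = [recognizer.view pointInside:location withEvent:nil];
    NSLog(@"moving, inside column: %d", inside);
}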
I have a custom view where I want to detect touch events, like when the user touches down, moves, and releases, or vice versa, as in this image.
What I tried:
A swipe gesture, but with it I only get the coordinate where the user releases the finger. For example, tap at X = 100 and release at X = 10, and I get only 10.
What I am looking for:
I want the whole run of coordinates. If the user taps at X = 100 and releases at X = 80, I am looking for every single coordinate change in between, like 100, 99, 98, 97, 96, 95, 94, ... 80, as the finger moves.
Please let me know if anyone has any idea about this, or if there is something I forgot to do.
Add this to your .h file:
@interface ViewController : UIViewController {
    CGFloat touchStartPoint;
    CGFloat touchOffsetPoint;
    CGFloat tempTouchOffsetPoint;
}
@end
And this to your .m file:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touchStartPoint = [touch locationInView:self.view].y;
    touchOffsetPoint = 0;
    tempTouchOffsetPoint = 0;
    // NSLog(@"touchStartPoint = %f", touchStartPoint);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touchOffsetPoint = [touch locationInView:self.view].y - touchStartPoint;
    if (touchOffsetPoint > tempTouchOffsetPoint) {
        NSLog(@"touch down");
    } else {
        NSLog(@"touch up");
    }
    tempTouchOffsetPoint = touchOffsetPoint;
}
Here is how I have done it:
UISwipeGestureRecognizer *swipeGesture = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGesture:)];
swipeGesture.direction = UISwipeGestureRecognizerDirectionUp | UISwipeGestureRecognizerDirectionDown;
[self.gestureAreaView addGestureRecognizer:swipeGesture];
[swipeGesture release];

- (void)handleSwipeGesture:(UISwipeGestureRecognizer *)sender
{
    // Gesture detected: swipe up/down, but the direction cannot be distinguished
}
UISwipeGestureRecognizer *swipeGesture = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGesture:)];
swipeGesture.direction = UISwipeGestureRecognizerDirectionUp;
[self.view addGestureRecognizer:swipeGesture];
[swipeGesture release];

UISwipeGestureRecognizer *swipeGesture2 = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGesture:)];
swipeGesture2.direction = UISwipeGestureRecognizerDirectionDown;
[self.view addGestureRecognizer:swipeGesture2];
[swipeGesture2 release];

- (void)handleSwipeGesture:(UISwipeGestureRecognizer *)sender
{
    // Gesture detected: swipe up/down, and the direction can be distinguished
    if (sender.direction == UISwipeGestureRecognizerDirectionUp)
    {
    }
    else if (sender.direction == UISwipeGestureRecognizerDirectionDown)
    {
    }
}
Hope this helps out
Try this method
// declared somewhere
@property (nonatomic, retain) NSMutableArray *touchPositions;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint loc = [touch locationInView:self];
    [self.touchPositions addObject:[NSValue valueWithCGPoint:loc]];
}
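Since the goal is every intermediate coordinate, the same recording can be repeated in touchesMoved:, and the array needs to exist before anything is added to it. A sketch — the lazy getter is one option; initializing in init works too:
// Make sure the array exists before recording into it.
- (NSMutableArray *)touchPositions {
    if (!_touchPositions) _touchPositions = [[NSMutableArray alloc] init];
    return _touchPositions;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [[event allTouches] anyObject];
    // Record each intermediate position as the finger moves.
    [self.touchPositions addObject:[NSValue valueWithCGPoint:[touch locationInView:self]]];
}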
I have just started working on a project and I already have a few problems and errors. I was wondering if you guys could help me figure out where the bug is.
Quick overview: I have two files (ViewController.m and Scroller.m). In the ViewController I have some code under viewDidLoad (which I'll show in a second); I also have a function (addImagesToView) that does what it says, and some touchesBegan/Moved/Ended overrides for interactivity.
In the Scroller I decided to override some of the -(void)touches... method implementations.
The problem I have is that the buttons (coming from the addImagesToView function) get stuck on highlighted. I performed some tests with NSLog to see which "touches" methods work and which don't. touchesMoved doesn't work properly: if the user drags downward, the log stops after about 3 or 4 lines of text. I think it interferes with the scroll somehow.
This is the content of ViewController.m:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"bgPNG.png"]];
    imageView.frame = CGRectMake(0, 0, 320, 480);
    [self.view addSubview:imageView];
    //UIColor *background = [[UIColor alloc] initWithPatternImage:imageView.image];
    //self.view.backgroundColor = background;
    CGRect fullScreen = [[UIScreen mainScreen] applicationFrame];
    scroll = [[Scroller alloc] initWithFrame:fullScreen];
    scroll.contentSize = CGSizeMake(320, 2730);
    scroll.delaysContentTouches = NO;
    //scroll.canCancelContentTouches = NO;
    scroll.scrollEnabled = YES;
    [self.view addSubview:scroll];
    buttons = [[NSMutableArray alloc] init];
    [self addImagesToView];
}

- (void)addImagesToView
{
    CGFloat yCoordinate = 35;
    for (int i = 1; i <= 18; i++) {
        UIImageView *image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[NSString stringWithFormat:@"picture%d.png", i]]
                                               highlightedImage:[UIImage imageNamed:[NSString stringWithFormat:@"picture%dHG.png", i]]];
        CGRect position = CGRectMake(105, yCoordinate, IMAGE_SIZE, IMAGE_SIZE);
        image.frame = position;
        [scroll addSubview:image];
        image.userInteractionEnabled = YES;
        image.tag = i;
        [buttons addObject:image];
        yCoordinate += 150;
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Get any touch
    UITouch *t = [touches anyObject];
    if ([t.view class] == [UIImageView class])
    {
        // Get the tapped image view
        UIImageView *tappedImageView = (UIImageView *)t.view;
        tappedImageView.highlighted = YES;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    if ([t.view class] == [UIImageView class])
    {
        // Get the tapped image view
        UIImageView *tappedImageView = (UIImageView *)t.view;
        tappedImageView.highlighted = NO;
    }
}
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
And this is Scroller.m:
#import "Scroller.h"

@implementation Scroller

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.dragging)
        [self.nextResponder touchesBegan:touches withEvent:event];
    else
        [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    if ([t.view class] == [UIImageView class])
    {
        // Get the tapped image view
        UIImageView *tappedImageView = (UIImageView *)t.view;
        tappedImageView.highlighted = NO;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.dragging)
        [self.nextResponder touchesEnded:touches withEvent:event];
    else
        [super touchesEnded:touches withEvent:event];
}

/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect
{
    // Drawing code
}
*/

@end
I have also tried to implement all the touches in the ViewController, but I don't know how to make it read from the scroll (please notice the difference between scroll and Scroller).
As you can tell, I've tried to order them like this: view -> backgroundImage -> scroller -> addImagesToView (function), so that the images that come out of addImagesToView are on top of the scroll. That's my desired hierarchy.
Thank you very much.
You don't have to write your own touch-handling routines for button presses. Instead of using UIImageViews, just create UIButtons and hook into their UIControlEventTouchUpInside event.
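A minimal sketch of that approach, assuming the same picture%d.png asset names and the scroll/IMAGE_SIZE values from the question; buttonTapped: is a hypothetical handler name:
- (void)addImagesToView
{
    CGFloat yCoordinate = 35;
    for (int i = 1; i <= 18; i++) {
        UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
        // The normal/highlighted image pair replaces the manual highlighting code.
        [button setImage:[UIImage imageNamed:[NSString stringWithFormat:@"picture%d.png", i]]
                forState:UIControlStateNormal];
        [button setImage:[UIImage imageNamed:[NSString stringWithFormat:@"picture%dHG.png", i]]
                forState:UIControlStateHighlighted];
        button.frame = CGRectMake(105, yCoordinate, IMAGE_SIZE, IMAGE_SIZE);
        button.tag = i;
        [button addTarget:self action:@selector(buttonTapped:)
         forControlEvents:UIControlEventTouchUpInside];
        [scroll addSubview:button];
        yCoordinate += 150;
    }
}

- (void)buttonTapped:(UIButton *)sender
{
    NSLog(@"tapped button %ld", (long)sender.tag);
}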
I think you'd be better off telling us what you're trying to accomplish, as you're probably barking up the wrong tree here. As Jim said, there are better ways to select images, and even if you want to handle touches yourself, UIGestureRecognizers are probably a better option. If you are allowing users to select images, I'd recommend you take a look at iCarousel on GitHub. It's an open-source carousel that works great for image selection. Also, if you can target iOS 6, there is a new collection view that could be right up your alley.
I have a UIWindow with a text field, a button, and a table.
I would like to be able to track all the touches on the screen and then forward them to the element being touched.
I have read about overriding sendEvent in the Apple documentation but I still do not understand:
How to use hitTest to retrieve the element being touched
How to forward touches
This is what I have so far.
- (void)sendEvent:(UIEvent *)event
{
    for (UITouch *touch in [event allTouches])
    {
        /* Get coordinates of touch */
        CGPoint point = [touch locationInView:self];
        /* Get subview being touched */
        /* something like this???
        UIView *receiver = [self hitTest:point withEvent:event];
        */
        /* Forward touch event to right view */
        /* how??? */
    }
    [super sendEvent:event];
}
Thank you.
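(For what it's worth, a minimal sketch of the observing approach: by the time sendEvent: runs, UIKit has already hit-tested each touch, so touch.view already holds the element being touched, and calling [super sendEvent:event] performs the actual delivery — no manual forwarding is needed just to observe.)
- (void)sendEvent:(UIEvent *)event
{
    for (UITouch *touch in [event allTouches])
    {
        // UIKit has already hit-tested this touch; touch.view is the target.
        UIView *receiver = touch.view;
        NSLog(@"touch at %@ -> %@",
              NSStringFromCGPoint([touch locationInView:self]),
              receiver);
    }
    // Calling super delivers the touches to their targets as usual.
    [super sendEvent:event];
}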
I am not sure this is the best solution, but I am following what has been posted here.
Basically I have a subclass of UIView covering the entire screen. This class contains a reference to ALL the elements that can be touched. (I wish there were a way to avoid that.)
This is the code in the header:
@interface SubclassUIView : UIView {
    UITextField *text;
    UITableView *table;
    UIButton *button;
    UIToolbar *toolbar;
}
@end
And this is the implementation:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint tableHit = [table convertPoint:point fromView:self];
    CGPoint buttonHit = [button convertPoint:point fromView:self];
    CGPoint toolbarHit = [toolbar convertPoint:point fromView:self];
    CGPoint messageHit = [text convertPoint:point fromView:self];
    if ([table pointInside:tableHit withEvent:event]) return table;
    else if ([button pointInside:buttonHit withEvent:event]) return button;
    else if ([toolbar pointInside:toolbarHit withEvent:event]) return toolbar;
    else if ([text pointInside:messageHit withEvent:event]) return text;
    return [super hitTest:point withEvent:event];
}
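On the "I wish there was a way to avoid that" point: a loop-based variant (a sketch) keeps one collection of touchable elements instead of one branch per element; touchableViews is a hypothetical array populated once when the view is set up:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // touchableViews would hold table, button, toolbar, text, etc.
    for (UIView *candidate in self.touchableViews) {
        CGPoint local = [candidate convertPoint:point fromView:self];
        if ([candidate pointInside:local withEvent:event]) return candidate;
    }
    return [super hitTest:point withEvent:event];
}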