How do I detect a touch on a UIImageView? - ios

I'm making a single view application where I want a UIImageView to move from one side of the screen to the other, and I want to detect a touch on this image. I can't use a button because I can't make a button move. To be more precise: when I tap the UIImageView, I want a label to appear and stay visible. I can't use a UIButton because the image will be moving while I tap it, and buttons can't move like that. I hope this is clearer.
This is the code so far:
@IBAction func Start1(sender: UIButton) {
    Person1.center = CGPointMake(160, 450)
    UIView.animateWithDuration(3, animations: {
        self.Person1.center = CGPointMake(160, 70)
    })
}
@IBOutlet var Person1: UIImageView!
Thank you

What I understand is that you want a swipe gesture on your screen.
Here is the code to add a swipe gesture:
UISwipeGestureRecognizer *swipeRightBlack = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(slideToRightWithGestureRecognizer:)];
swipeRightBlack.direction = UISwipeGestureRecognizerDirectionRight;
[self.viewBlack addGestureRecognizer:swipeRightBlack];
If you want to add a single-tap gesture too:
UITapGestureRecognizer *singleTapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTapGesture:)];
Here is a nice explanation of adding different gestures to your project.
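Since the original question is in Swift and wants a label to appear when the moving image is tapped, here is a minimal sketch of that specific case (Swift 4+; personImageView and nameLabel are assumed outlet names, not from the original code). Note that isUserInteractionEnabled must be set on the image view, and the animation needs the .allowUserInteraction option or taps during the animation are ignored.

// Minimal sketch; personImageView and nameLabel are assumed outlets.
override func viewDidLoad() {
    super.viewDidLoad()
    personImageView.isUserInteractionEnabled = true
    let tap = UITapGestureRecognizer(target: self, action: #selector(imageTapped(_:)))
    personImageView.addGestureRecognizer(tap)
}

@IBAction func start(_ sender: UIButton) {
    personImageView.center = CGPoint(x: 160, y: 450)
    // .allowUserInteraction keeps the view tappable while it animates.
    UIView.animate(withDuration: 3, delay: 0, options: [.allowUserInteraction], animations: {
        self.personImageView.center = CGPoint(x: 160, y: 70)
    }, completion: nil)
}

@objc func imageTapped(_ recognizer: UITapGestureRecognizer) {
    nameLabel.isHidden = false   // show the label and leave it visible
}

One caveat: during a UIView animation the view's frame jumps to its end value immediately, so taps register against the destination frame; hitting the image exactly where it appears mid-flight requires hit-testing the layer's presentation() layer.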

I'm not quite sure I understand what you mean, but if you want the picture to move automatically from one position to another, you can simply specify the two points and animate between them. If instead you want to drag the image as it moves, you need to add a UIPanGestureRecognizer to the UIImageView and update the image view's position from the pan's translation.
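A rough Swift sketch of that drag approach (imageView is an assumed outlet, not from the question):

// Sketch: drag an image view with a pan gesture; imageView is an assumed outlet.
override func viewDidLoad() {
    super.viewDidLoad()
    imageView.isUserInteractionEnabled = true
    imageView.addGestureRecognizer(
        UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
}

@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    guard let draggedView = recognizer.view else { return }
    let translation = recognizer.translation(in: draggedView.superview)
    draggedView.center = CGPoint(x: draggedView.center.x + translation.x,
                                 y: draggedView.center.y + translation.y)
    // Reset so each callback delivers an incremental delta.
    recognizer.setTranslation(.zero, in: draggedView.superview)
}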

Related

Multiple UIGestureRecognizers in Xcode/Swift

Right now, I have two different UILabels, each with their own long press and pan UIGestureRecognizers (set up through the storyboard). My final goal is to have each UILabel change color when long pressed, and, without lifting the finger to end the long press, to change the value of the UILabel itself when the user pans up and down or side to side.
Right now, each UILabel has its own pan gesture method and long press gesture method. Is there any way to have a single long press/pan method for both UILabels but also have the ability to do something for one label and something else for another?
Also, is there a better approach to doing this? Eventually, I would also like to implement visual feedback when changing the value of the labels, such as in the form of animations.
I am new to iOS programming and programming in general and detailed answers are greatly appreciated. Thanks.
You can have a single handler function; there's no need for separate gesture handlers for each label. Example:
// First, set tag values on your labels
label_one.tag = 1;
label_two.tag = 2;
// UILabel has user interaction disabled by default, so enable it.
label_one.userInteractionEnabled = YES;
label_two.userInteractionEnabled = YES;

UIPanGestureRecognizer *_panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                                        action:@selector(handlePanGesture:)];
_panGestureRecognizer.delegate = self;
[label_one addGestureRecognizer:_panGestureRecognizer];

UIPanGestureRecognizer *_panGestureRecognizer_two = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                                            action:@selector(handlePanGesture:)];
_panGestureRecognizer_two.delegate = self;
[label_two addGestureRecognizer:_panGestureRecognizer_two];

- (void)handlePanGesture:(UIPanGestureRecognizer *)sender {
    // The recognizer itself has no tag; read the tag of the view it is attached to.
    if (sender.view.tag == 1) {
        // label_one
    }
    else if (sender.view.tag == 2) {
        // label_two
    }
}
The same approach works for the other gestures as well.
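Since the question is tagged Swift, a rough Swift equivalent of the same idea (one handler, labels distinguished by the tag of the view the gesture is attached to; labelOne and labelTwo are assumed outlet names):

// Sketch: one pan handler for both labels, distinguished by the attached view's tag.
labelOne.tag = 1
labelTwo.tag = 2
labelOne.isUserInteractionEnabled = true   // UILabel has this off by default
labelTwo.isUserInteractionEnabled = true
labelOne.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
labelTwo.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))

@objc func handlePan(_ sender: UIPanGestureRecognizer) {
    guard let tag = sender.view?.tag else { return }
    if tag == 1 {
        // pan on labelOne
    } else if tag == 2 {
        // pan on labelTwo
    }
}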

How to synchronise UIView moving with swipe gesture of UIPageViewController

Ok, here is the thing:
I need to have few animations happening on every screen change in my PageViewController.
So, when a user swipes, an image should fly in from the top left corner, for example.
I can make that animation happen over time when the user swipes and changes the page, but what I need is for the animation to be synchronised with the swipe movement itself.
So if a user presses the screen and starts swiping, the animation should follow the user's finger and animate its translation with the finger movement.
How can I achieve that?
I guess I need some sort of swipe gesture listener, but I failed to find any solution online. I guess I'm not using the right keywords.
You can use a UIPanGestureRecognizer to get the number of pixels swiped:
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panRecognizer_Panned:)];
[self.myViewToSwipe addGestureRecognizer:panRecognizer];
...
- (void)panRecognizer_Panned:(UIPanGestureRecognizer *)recognizer {
    CGFloat pixelsMovedHoriz = [recognizer translationInView:self.vwRelativeTo].x;
}
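To actually keep a view in sync with the finger, one option (a sketch following the same pan-gesture approach, not the only way; floatingImageView is an assumed outlet) is to apply the translation each time the handler fires:

// Sketch: let floatingImageView follow the horizontal pan.
@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    let dx = recognizer.translation(in: recognizer.view).x
    floatingImageView.transform = CGAffineTransform(translationX: dx, y: 0)
    if recognizer.state == .ended || recognizer.state == .cancelled {
        // Animate to the final position (or snap back) once the swipe finishes.
        UIView.animate(withDuration: 0.25) {
            self.floatingImageView.transform = .identity
        }
    }
}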

UIImageView touch method

I have a UIImageView and I would like to be able to add two methods to it. One for double tapping the top half and the other for double tapping the bottom half of the ImageView.
Currently the code is just an outlet referencing the UIImageView:
@IBOutlet weak var postImage: UIImageView!
Could anyone advise me of how to make this happen or at least point me in the correct direction?
Enable user interaction on the image view:
self.postImage.userInteractionEnabled = true
Then attach a UITapGestureRecognizer and check the tap location to decide which half was tapped. Enjoy.
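Put together in Swift (a sketch; postImage is the outlet from the question, and the handler name is made up), that might look like:

// Sketch: double-tap detection split into the top and bottom halves of postImage.
override func viewDidLoad() {
    super.viewDidLoad()
    postImage.isUserInteractionEnabled = true
    let doubleTap = UITapGestureRecognizer(target: self, action: #selector(imageDoubleTapped(_:)))
    doubleTap.numberOfTapsRequired = 2
    postImage.addGestureRecognizer(doubleTap)
}

@objc func imageDoubleTapped(_ recognizer: UITapGestureRecognizer) {
    let location = recognizer.location(in: postImage)
    if location.y < postImage.bounds.midY {
        // double tap on the top half
    } else {
        // double tap on the bottom half
    }
}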
Use a UITapGestureRecognizer (Obj-C example; for Swift it's basically the same):
UITapGestureRecognizer *gestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapImageView:)];
[self.imageView addGestureRecognizer:gestureRecognizer];
gestureRecognizer.numberOfTapsRequired = 2;
Then you can check the tap position inside the image view:
- (void)tapImageView:(UITapGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:self.imageView];
    if ([self.imageView pointInside:location withEvent:nil]) {
        // tap is inside the view bounds
        if (location.y < CGRectGetHeight(self.imageView.bounds) / 2) {
            // tap in the upper half
        }
        else {
            // tap in the lower half
        }
    }
}
There are two approaches that come to mind.
Make the image view a public property of the cell class and add the gesture recognizer to it in the controller class itself.
Or, after creating the cell object, call a method on it with self as a parameter, and add the gesture recognizer in that method with the passed-in object as the target.

Modifying screen swiping functionality for use in a slider (iOS7)

I am interested in trying to modify the functionality that allows full-screen swiping from one view to another in order to create a "slider" that is the size of the entire page -- i.e. dragging/swiping/sliding anywhere on the page has an effect of some kind.
It doesn't need to be visible. For example, I might have a solid red screen that I can change the colour of by dragging to the right anywhere, having it gradually change to blue.
Is this possible?
Using a UIPanGestureRecognizer, you can track finger movement on the screen, then use translationInView: to find out how far the user's finger has moved, and change the color based on that value:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.view addGestureRecognizer:pan];
And handle it:
- (void)handlePan:(UIPanGestureRecognizer *)sender
{
    CGFloat xMovement = [sender translationInView:sender.view].x;
    // Do something with the movement
    // Then reset the translation
    [sender setTranslation:CGPointZero inView:sender.view];
}
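For the red-to-blue example from the question, one way (a sketch in Swift, not the only approach) is to accumulate the horizontal movement into a 0...1 fraction and blend between the two colors; fraction and handlePan are made-up names living in the view controller:

// Sketch: fade the background from red to blue as the finger drags to the right.
var fraction: CGFloat = 0   // 0 = fully red, 1 = fully blue

@objc func handlePan(_ sender: UIPanGestureRecognizer) {
    let dx = sender.translation(in: sender.view).x
    sender.setTranslation(.zero, in: sender.view)
    fraction = min(max(fraction + dx / view.bounds.width, 0), 1)
    view.backgroundColor = UIColor(red: 1 - fraction, green: 0, blue: fraction, alpha: 1)
}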

How can I detect the touch event of a UIImageView?

I have placed an image (UIImageView) on the navigation bar. Now I want to detect the touch event and want to handle the event. How can I do that?
In practical terms, don't do that.
Instead add a button with Custom style (no button graphics unless you specify images) over the UIImageView. Then attach whatever methods you want called to that.
You can use that technique for many cases where you really want some area of the screen to act as a button instead of messing with the Touch stuff.
A UIImageView is derived from a UIView which is derived from UIResponder so it's ready to handle touch events. You'll want to provide the touchesBegan, touchesMoved, and touchesEnded methods and they'll get called if the user taps the image. If all you want is a tap event, it's easier to just use a custom button with the image set as the button image. But if you want finer-grain control over taps, moves, etc. this is the way to go.
You'll also want to look at a few more things:
Override canBecomeFirstResponder and return YES to indicate that the view can become the focus of touch events (the default is NO).
Set the userInteractionEnabled property to YES. The default for UIViews is YES, but for UIImageViews is NO so you have to explicitly turn it on.
If you want to respond to multi-touch events (i.e. pinch, zoom, etc) you'll want to set multipleTouchEnabled to YES.
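As a sketch of that approach in Swift, a small UIImageView subclass overriding the touch methods might look like the following (the class name and asset name are placeholders; remember to enable user interaction when you create it):

// Sketch: a UIImageView subclass that handles raw touch events itself.
import UIKit

class TappableImageView: UIImageView {

    override var canBecomeFirstResponder: Bool { return true }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // React to the tap here (e.g. notify a delegate or post a notification).
        super.touchesBegan(touches, with: event)
    }
}

// Usage (user interaction is off by default for UIImageView, so turn it on):
// let imageView = TappableImageView(image: UIImage(named: "photo"))   // "photo" is a placeholder asset name
// imageView.isUserInteractionEnabled = true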
To add a touch event to a UIImageView, use the following in your .m file:
UITapGestureRecognizer *newTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(myTapMethod)];
[myImageView setUserInteractionEnabled:YES];
[myImageView addGestureRecognizer:newTap];

- (void)myTapMethod {
    // Treat image tap
}
You can also add a UIGestureRecognizer. It does not require you to add an additional element to your view hierarchy, but still provides you with all the nicely written code for handling touch events through a fairly simple interface:
UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleSwipe:)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[imgView_ addGestureRecognizer:swipeRight];
[swipeRight release];

UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleSwipe:)];
swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
[imgView_ addGestureRecognizer:swipeLeft];
[swipeLeft release];
I've been through different threads over the past few hours trying to find a solution to my problem, to no avail. I see that many developers share this problem, and I think people here know about this. I have multiple images inside a UIScrollView and am trying to get tap events on them.
I am not getting any events from the UIImageView, but I do get events from a similar UILabel with very similar parameters set on it. This is under iOS 5.1.
I have already done the following:
Set userInteractionEnabled to YES for both the UIImageView and its parent view.
Set multipleTouchEnabled to YES for the UIImageView.
Tried subclassing UIImageView, which didn't help either.
I'm attaching some code below in which I initialize both a UIImageView and a UILabel; the label works fine in terms of firing events. I have tried to leave out irrelevant code.
UIImageView *single_view = [[UIImageView alloc]initWithFrame:CGRectMake(200, 200, 100, 100)];
single_view.image = img;
single_view.layer.zPosition = 4;
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTapGestureCaptured:)];
[single_view addGestureRecognizer:singleTap];
[single_view setMultipleTouchEnabled:YES];
[single_view setUserInteractionEnabled:YES];
[self.myScrollView addSubview:single_view];
self.myScrollView.userInteractionEnabled = YES;
UILabel *testLabel = [[UILabel alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
testLabel.backgroundColor = [UIColor redColor];
[self.myScrollView addSubview:testLabel];
[testLabel addGestureRecognizer:singleTap];
[testLabel setMultipleTouchEnabled:YES];
[testLabel setUserInteractionEnabled:YES];
testLabel.layer.zPosition = 4;
And the method which handles the event:
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
    UIView *tappedView = [gesture.view hitTest:[gesture locationInView:gesture.view] withEvent:nil];
    NSLog(@"Touch event on view: %@", [tappedView class]);
}
As said, the label tap is received.
Instead of making a touchable UIImageView then placing it on the navbar, you should just create a UIBarButtonItem, which you make out of a UIImageView.
First make the image view:
UIImageView *yourImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"nameOfYourImage.png"]];
Then make the barbutton item out of your image view:
UIBarButtonItem *yourBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:yourImageView];
Then add the bar button item to your navigation bar:
self.navigationItem.rightBarButtonItem = yourBarButtonItem;
Remember that this code goes in the view controller that sits inside the navigation controller's view-controller array. So basically, this "touchable image-looking bar button item" will only appear in the navigation bar while this view controller is being shown. When you push another view controller, this navigation bar button item will disappear.
You might want to override the touchesBegan:withEvent: method of the UIView (or subclass) that contains your UIImageView subview.
Within this method, test if any of the UITouch touches fall inside the bounds of the UIImageView instance (let's say it is called imageView).
That is, does the CGPoint returned by [touch locationInView:imageView] fall within the CGRect returned by [imageView bounds]? Look into the function CGRectContainsPoint to run this test.
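A rough Swift sketch of that parent-view approach (here the override lives in a UIView subclass that owns an imageView property; both names are assumptions):

// Sketch: the containing view checks whether a touch started over its imageView subview.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let location = touch.location(in: imageView)
    if imageView.bounds.contains(location) {
        // The touch landed inside the image view.
    }
    super.touchesBegan(touches, with: event)
}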
First, you could place a UIButton and then either give the button a background image or place a UIImageView over the button.
Or:
You can add a tap gesture to the UIImageView so that you get the tap action when the UIImageView is tapped.
For those of you looking for a Swift 4 solution to this question, you can use the following to detect a touch event on a UIImageView.
let gestureRecognizer: UITapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(imageViewTapped))
imageView.addGestureRecognizer(gestureRecognizer)
imageView.isUserInteractionEnabled = true
You will then need to define your selector as follows:
@objc func imageViewTapped() {
// Image has been tapped
}
Add a gesture recognizer to that view. If you add an image inside that view, the gesture will be detected on the image as well. You could also try the touch-event delegate methods; in that case the touch should also be detected.
