Start UIGestureRecognizer programmatically - ios

I want to start the UIPanGestureRecognizer right after adding it to a screenshot. Because the screenshot is created in code when an item is highlighted, the user won't press the screen again. So how do I start the recognizer programmatically?
UIView *snapshot = [cell snapshotViewAfterScreenUpdates:NO];
//use the cell to map the snapshot frame to the window because this does a perfect job of accounting for table offset, etc. Other methods put the view a little to the side or way off
CGRect newFrame = snapshot.frame;
newFrame.origin = [cell convertPoint:newFrame.origin toView:self.view.window];
[snapshot setFrame:newFrame];
[HelperMethods shadowForView:cell color:[UIColor blackColor] offset:CGSizeMake(1, 1) opacity:.7 radius:snapshot.frame.size.width/4];
//[self.view addSubview:snapshot];
newFrame.origin.y -= 10; //move the frame a little to let the user know it can be moved
[UIView animateWithDuration:.2 animations:^{
    [snapshot setFrame:newFrame];
}];
//add a pan gesture recognizer so the snapshot can be dragged
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(userCellDragged:)];
[pan setMinimumNumberOfTouches:1];
[pan setMaximumNumberOfTouches:1];
[cell addGestureRecognizer:pan];

You can always call the action method yourself.
For example, suppose you've added this selector for your pan gesture recognizer:
- (void)userCellDragged:(UIPanGestureRecognizer *)sender;
You can call it from anywhere in the view controller by simply adding
[self userCellDragged:nil];
Remember to check the parameter inside the method, something like:
if (sender == nil) {
    // Triggered programmatically
} else {
    // proceed as normal
}

Related

Advice on Collectionview scenario

Need some advice on how to approach a scenario using a Collectionview. In short, the app has a CV displaying images where you can tap a cell with a thumbnail of an image and it will then display a fullscreen view of that image. I'm accomplishing this by instantiating a new UIView (not from storyboard) inside didSelectItemAtIndexPath. So the fullscreen view of the image from the cell is just a new UIView triggered from tapping the cell and I set the view's image to be the same as the cell's image...simple enough. The fullscreen view also has a button that relates to each image. Tapping the fullscreen image closes the image and goes back to the CV. All of this works perfectly.
However, I just realized that I would also like to be able to swipe through all the images while in fullscreen mode...basically very similar to how the iOS photo album works. I was able to write some code pretty quickly to do this by adding a swipe gesture to didSelectItemAtIndexPath and set the action selector to a method to handle the swipes, which worked. However, the result of this was really just changing the image for the original cell selected (tapped). So I'm not able to keep track of the selected cell while swiping through the images in fullscreen mode.
So I need advice on how to approach this. I know there has to be examples out there for something like this, but I was unable to find any. Does anyone have any advice on how I should implement this? Thanks!
Code from didSelectItemAtIndexPath...
self.fullScreenImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width-10, self.view.bounds.size.height-15)];
self.fullScreenImage.contentMode = UIViewContentModeScaleAspectFit;
UISwipeGestureRecognizer *rightSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleRightSwipe:)];
[rightSwipe setDirection:UISwipeGestureRecognizerDirectionRight];
[self.fullScreenImage addGestureRecognizer:rightSwipe];
UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleLeftSwipe:)];
[leftSwipe setDirection:UISwipeGestureRecognizerDirectionLeft];
[self.fullScreenImage addGestureRecognizer:leftSwipe];
if (!self.isFullScreen) {
self.fullScreenImage.transform = CGAffineTransformMakeScale(0.1, 0.1);
__weak BaseViewController *weakSelf = self;
[UIView animateWithDuration:0.5 delay:0 options:0 animations:^{
    NSLog(@"Starting animation!");
    weakSelf.view.backgroundColor = [UIColor blackColor];
    weakSelf.myCollectionView.backgroundColor = [UIColor blackColor];
    weakSelf.fullScreenImage.center = weakSelf.view.center;
    weakSelf.fullScreenImage.backgroundColor = [UIColor blackColor];
    weakSelf.fullScreenImage.image = [UIImage imageWithContentsOfFile:coffeeImageData.imageURL.path];
    weakSelf.fullScreenImage.transform = CGAffineTransformIdentity; // zoom in effect
    [weakSelf.view addSubview:weakSelf.fullScreenImage];
    [weakSelf.fullScreenImage addSubview:likeButton]; // add the button to the view
} completion:^(BOOL finished){
    if (finished) {
        NSLog(@"Animation finished!");
        weakSelf.isFullScreen = YES;
    }
}];
return;
}
Handling the swipe gesture from...
- (void)handleLeftSwipe:(UISwipeGestureRecognizer *)sender {
    // make sure indexForSwiping is not > than size of array
    if (self.indexForSwiping != [self.imageLoadManager.coffeeImageDataArray count] - 1) {
        self.indexForSwiping += 1;
        NSString *cacheKey = self.allCacheKeys[self.indexForSwiping];
        if (cacheKey) {
            [self.imageCache queryDiskCacheForKey:cacheKey done:^(UIImage *image, SDImageCacheType cacheType) {
                if (image) {
                    [UIView animateWithDuration:1.0 delay:0 options:UIViewAnimationOptionBeginFromCurrentState animations:^{
                        self.fullScreenImage.image = image;
                    } completion:^(BOOL finished) {
                        NSLog(@"swiping");
                    }];
                }
            }];
        }
    }
}
This framework already has such functionality, so you can look at its source code to understand how it works:
https://github.com/mwaterfall/MWPhotoBrowser
I would make the view part of the view controller rather than only part of your function. Then have the view controller manage the swipe, the image, and the index of the current image. When your collection view is tapped, store that index in your view controller; when a swipe is caught, you can increment it by one and update your view with the new image. Let me know if that wasn't clear enough and I can try to clarify.
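A minimal sketch of that approach, reusing the question's own `indexForSwiping` and `fullScreenImage` properties (the exact wiring is an assumption):

```objc
// In didSelectItemAtIndexPath:, record which item was tapped so the
// fullscreen state lives in the view controller, not in the cell:
self.indexForSwiping = indexPath.row;

// The swipe handlers then only touch view-controller state: step
// self.indexForSwiping (with bounds checks, as handleLeftSwipe: in the
// question already does) and assign the image at the new index to
// self.fullScreenImage.image. Because fullScreenImage belongs to the
// view controller, swiping no longer depends on the originally
// tapped cell.
```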

Long press gesture on UIImageView

I have a circular UIImageView. When I long-press the picture, it should zoom in and let me move it to another place; when released, it should shrink back and stay in the new place. But when I release my finger, the picture moves back to the starting place. Why? Here's the code:
UILongPressGestureRecognizer *recognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(move:)];
recognizer.minimumPressDuration = .5;
recognizer.delegate = self;
[circle addGestureRecognizer:recognizer];
- (void)move:(UILongPressGestureRecognizer *)sender
{
    UIView *image_ = sender.view;
    CGPoint point = [sender locationInView:image_.superview];
    if (sender.state == UIGestureRecognizerStateBegan) {
        [UIView animateWithDuration:0.3 animations:^{
            image_.transform = CGAffineTransformMakeScale(1.2f, 1.2f);
        } completion:nil];
    }
    if (sender.state == UIGestureRecognizerStateChanged) {
        image_.center = point;
    }
    if (sender.state == UIGestureRecognizerStateEnded) {
        image_.center = point;
        [UIView animateWithDuration:0.3 animations:^{
            image_.transform = CGAffineTransformMakeScale(1.f, 1.f);
        } completion:nil];
    }
}
The reason it moves back to the start is that you are using a UILongPressGestureRecognizer. This type of recognizer does not reliably track the user's input location; once the gesture ends, it is probably reporting the point where the gesture was first initiated.
I would also set the image's transform back to CGAffineTransformIdentity rather than scaling it back to 1.0f in both dimensions.
The solution is to use a UIPanGestureRecognizer and properly track the gesture's location in the view. This will work far better, and on top of that it is the right way to do it.
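A minimal sketch of that pan-based approach (the `circle` view follows the question; the handler name is illustrative):

```objc
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
circle.userInteractionEnabled = YES; // UIImageView disables user interaction by default
[circle addGestureRecognizer:pan];

- (void)handlePan:(UIPanGestureRecognizer *)sender
{
    UIView *view = sender.view;
    // translationInView: is cumulative, so apply it and then reset it to zero.
    CGPoint translation = [sender translationInView:view.superview];
    view.center = CGPointMake(view.center.x + translation.x,
                              view.center.y + translation.y);
    [sender setTranslation:CGPointZero inView:view.superview];
}
```

Because the pan itself moves the view's center on every change, there is nothing left to snap back when the gesture ends.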
Another option is to add the image view to a container UIView and attach the UILongPressGestureRecognizer to that container. A UIImageView has user interaction disabled by default, so it won't respond to the long press, but its container view will. That should give you the result you want.

iOS - UIScrollView from half to full screen

In a UIViewController, I have a UIScrollView that takes up half of the screen and contains a collection of UIViews. On some specific event, such as a swipe, I want the UIScrollView to animate to full screen. How do I achieve this behavior?
Try this...
// adding swipe gesture to your scroll view
UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
// Set swipe direction.
[swipeLeft setDirection:UISwipeGestureRecognizerDirectionLeft];
[scrollView addGestureRecognizer:swipeLeft];
// add the gesture action method
- (void)handleSwipe:(UISwipeGestureRecognizer *)swipe {
    // If you have only one swipe gesture, you don't need this direction check.
    if (swipe.direction == UISwipeGestureRecognizerDirectionLeft) {
        NSLog(@"Left Swipe");
        scrollView.contentSize = self.view.frame.size; // contentSize is a CGSize, not a CGRect
        scrollView.frame = self.view.frame;
    }
}
On your event, try setting:
scrollView.contentSize = CGSizeMake(847, 800); // change on basis of your requirement
For animating scroll view :
- (void)setContentOffset:(CGPoint)contentOffset animated:(BOOL)animated; // animate at constant velocity to new offset
Quoting from @RAJA's answer, with a slight improvement:
[UIView beginAnimations:@"whatever" context:nil];
scrollView.contentSize = CGSizeMake(self.view.frame.size.width, self.view.frame.size.height * <count of subviews>);
scrollView.frame = self.view.frame;
[UIView commitAnimations];
To extend the first answer: if you want to animate the process, you can do so with the following code:
- (void)handleSwipe:(UISwipeGestureRecognizer *)swipe
{
    if (swipe.direction == UISwipeGestureRecognizerDirectionLeft)
    {
        [UIView animateWithDuration:0.15f // set the duration for your animation here
                         animations:^{
                             scrollView.contentSize = self.view.frame.size; // contentSize is a CGSize, not a CGRect
                             scrollView.frame = self.view.frame;
                         }
                         completion:^(BOOL finished){
                             NSLog(@"completion block");
                         }];
    }
}

Trying to correctly move UIImageView using gestureRecognizer in iOS

I have a UIImageView that I am trying to move using a UIPanGestureRecognizer object. The UIImageView is positioned over top of a UITableView, and serves as a scrollbar. I want this UIImageView to be moved by the user up or down the UITableView, not sideways. To accomplish this, I have implemented UIGestureRecognizerDelegate, and I have the following methods:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
- (void)panGestureDetected:(UIPanGestureRecognizer *)recognizer {
    NSLog(@"Do I get called?");
    //_startLocation is a property of type CGPoint that I declare in the .h file
    _startLocation = [recognizer locationInView:_imageView];
    NSLog(@"The point is: %@", NSStringFromCGPoint(_startLocation));
    CGRect frame = [_imageView frame];
    frame.origin.y += frame.origin.y - _startLocation.y;
    [_imageView setFrame:frame];
    _imageView.center = _startLocation;
}
The gesture recognizer that calls panGestureDetected: is set up in viewDidLoad as follows:
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGestureDetected:)];
panGesture.maximumNumberOfTouches = 1;
panGesture.minimumNumberOfTouches = 1;
panGesture.delegate = self;
[_table addGestureRecognizer:panGesture];
Unfortunately, my UIImageView moves all over the place frantically on the screen when I try to move it. I want to see a smooth scrolling UIImageView go up/down while the user drags it. Can anyone see what I am doing wrong?
In your handler method, keep just a single line that updates the center and remove everything else. Since center is a struct-valued property, you can't assign to center.y directly; set the whole point:
_imageView.center = CGPointMake(_imageView.center.x, [recognizer locationInView:_imageView.superview].y);
You need the location in the superview, not in the image view itself, and you should change only the y value of the center.

Attach GestureRecogniser to multiple imageviews

Something strange I encountered today while attaching the same gesture recogniser to multiple image views: it got attached to only the last one. In other words, a recogniser can be attached to only one view!
I had to create multiple gesture recognisers to meet my requirements.
The following is what I have done. Am I doing this correctly? Is this the only way to attach recognisers to multiple image views?
Please note that I don't want to put all the image views inside a UITableView or UIView and attach the gesture recogniser only to that container. My images are scattered, and I have to detect which image is being dragged. Thanks.
[imgView1 setUserInteractionEnabled:YES];
[imgView1 setMultipleTouchEnabled:YES];
[imgView2 setUserInteractionEnabled:YES];
[imgView2 setMultipleTouchEnabled:YES];
[imgView3 setUserInteractionEnabled:YES];
[imgView3 setMultipleTouchEnabled:YES];
[imgView4 setUserInteractionEnabled:YES];
[imgView4 setMultipleTouchEnabled:YES];
[imgView5 setUserInteractionEnabled:YES];
[imgView5 setMultipleTouchEnabled:YES];
[imgView6 setUserInteractionEnabled:YES];
[imgView6 setMultipleTouchEnabled:YES];
//Attach gesture recognizer to each imagviews
gestureRecognizer1 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer1.delegate = self;
gestureRecognizer2 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer2.delegate = self;
gestureRecognizer3 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer3.delegate = self;
gestureRecognizer4 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer4.delegate = self;
gestureRecognizer5 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer5.delegate = self;
gestureRecognizer6 = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(gestureHandler:)];
gestureRecognizer6.delegate = self;
[imgView1 addGestureRecognizer:gestureRecognizer1];
[imgView2 addGestureRecognizer:gestureRecognizer2];
[imgView3 addGestureRecognizer:gestureRecognizer3];
[imgView4 addGestureRecognizer:gestureRecognizer4];
[imgView5 addGestureRecognizer:gestureRecognizer5];
[imgView6 addGestureRecognizer:gestureRecognizer6];
Yes, one view per gesture recognizer. So if you want only one recognizer, put it on the superview, e.g.:
UILongPressGestureRecognizer *gestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)];
[self.view addGestureRecognizer:gestureRecognizer];
And then, in your handler, you can:
- (void)handleLongPress:(UILongPressGestureRecognizer *)sender
{
    CGPoint location = [sender locationInView:self.view];
    if (sender.state == UIGestureRecognizerStateBegan)
    {
        for (UIView *view in self.view.subviews)
        {
            if ([view isKindOfClass:[UIImageView class]] && CGRectContainsPoint(view.frame, location))
            {
                UIImageView *image = (UIImageView *)view;
                // ok, now you know which image you received your long press for
                // do whatever you wanted on it at this point
                return;
            }
        }
    }
}
By the way, if you do that, you don't need to worry about enabling user interaction on the images, either.
Finally, you don't need to set the gesture recognizer's delegate unless you're going to conform to UIGestureRecognizerDelegate, which this code doesn't. Also note that I'm using a local variable for the recognizer because there's no reason to hang onto it.
Update:
While the above code works fine, perhaps even better would be a custom long press gesture recognizer that would fail if the long press didn't take place over an image (this way it's more likely to play well in case you have other gesture recognizers taking place in your view). So:
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface ImageLongPressGestureRecognizer : UILongPressGestureRecognizer
@property (nonatomic, weak) UIImageView *imageview;
@end

@implementation ImageLongPressGestureRecognizer

@synthesize imageview = _imageview;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.imageview = nil;
    [super touchesBegan:touches withEvent:event];
    CGPoint location = [self locationInView:self.view];
    for (UIView *view in self.view.subviews)
    {
        if ([view isKindOfClass:[UIImageView class]] && CGRectContainsPoint(view.frame, location))
        {
            self.imageview = (UIImageView *)view;
            return;
        }
    }
    self.state = UIGestureRecognizerStateFailed;
}

@end
then create your gesture recognizer accordingly, using this new subclass:
ImageLongPressGestureRecognizer *gestureRecognizer = [[ImageLongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)];
[self.view addGestureRecognizer:gestureRecognizer];
and then, as a nice little benefit of this subclassing, your main gesture recognizer is simplified, namely:
- (void)handleLongPress:(ImageLongPressGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateBegan)
    {
        // you can now do whatever you want with sender.imageview; e.g. this makes it blink for you:
        [UIView animateWithDuration:0.5
                         animations:^{
                             sender.imageview.alpha = 0.0;
                         } completion:^(BOOL finished){
                             [UIView animateWithDuration:0.5
                                              animations:^{
                                                  sender.imageview.alpha = 1.0;
                                              }
                                              completion:nil];
                         }];
    }
}
You can't attach a gesture recognizer to more than one view (as you discovered). One solution might be to subclass UIImageView and put the setup code in that class, so that each view creates its own recognizer.
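That subclass could look something like this (a sketch; the class name and handler are made up):

```objc
@interface DraggableImageView : UIImageView
@end

@implementation DraggableImageView

- (instancetype)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        // UIImageView has user interaction disabled by default.
        self.userInteractionEnabled = YES;
        UILongPressGestureRecognizer *recognizer =
            [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handleLongPress:)];
        [self addGestureRecognizer:recognizer];
    }
    return self;
}

- (void)handleLongPress:(UILongPressGestureRecognizer *)sender
{
    // Each instance handles its own long press here; sender.view is self.
}

@end
```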
I suggest you first make an array of the views and an array of the recognizers (mutable arrays, if needed) and then populate them; that lets you use loops and avoid code duplication.
As for multiple views with one recognizer: no, it's not possible, as answered here.
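For example, instead of six near-identical blocks, the question's setup could be collapsed into a loop (a sketch, assuming the image views already exist):

```objc
NSArray *imageViews = @[imgView1, imgView2, imgView3, imgView4, imgView5, imgView6];
for (UIImageView *imageView in imageViews) {
    imageView.userInteractionEnabled = YES;
    imageView.multipleTouchEnabled = YES;
    // Each view still needs its own recognizer instance.
    UILongPressGestureRecognizer *recognizer =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(gestureHandler:)];
    [imageView addGestureRecognizer:recognizer];
}
// Inside gestureHandler:, sender.view tells you which image view was pressed.
```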
