I have an ImageView with a PNG, and I want to do this: when someone touches this image view, its alpha changes to 0.0. Is that possible? (All without buttons.)
You can use a UITapGestureRecognizer added to the UIImageView via addGestureRecognizer.
Snippets:
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
singleTap.numberOfTapsRequired = 1;
singleTap.numberOfTouchesRequired = 1;
[iv addGestureRecognizer:singleTap];
[iv setUserInteractionEnabled:YES];
and
- (void)imageTapped:(UIGestureRecognizer *)gestureRecognizer {
[UIView animateWithDuration:0.5 animations:^(void){
imageView.alpha = 0.0f;
}];
}
Yes, it is possible. For example, you can do it with the following steps (a combined sketch follows the snippet below):
Set the image view's userInteractionEnabled property to YES, so it will receive touch events.
Add a UITapGestureRecognizer to it.
In the gesture handler, set the view's alpha to 0.0; you can do that with an animation as well:
[UIView animateWithDuration:0.5 animations:^(void){
imageView.alpha = 0.0f;
}];
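Putting the three steps together, a minimal sketch might look like this (it assumes the image view outlet is called imageView and everything lives in the view controller):

- (void)viewDidLoad {
    [super viewDidLoad];
    // step 1: let the image view receive touches
    self.imageView.userInteractionEnabled = YES;
    // step 2: attach the tap recognizer
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
    [self.imageView addGestureRecognizer:tap];
}

- (void)imageTapped:(UITapGestureRecognizer *)recognizer {
    // step 3: fade the image view out
    [UIView animateWithDuration:0.5 animations:^{
        self.imageView.alpha = 0.0f;
    }];
}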
There are already lots of questions like this. Searching with Google gave me the following:
touches event handler for UIImageView
UIImageView Touch Event
how can i detect the touch event of an UIImageView
The code in Swift
In my case I implemented the tap gesture for an image tap.
1 - Link the image to the ViewController by drag and drop:
@IBOutlet weak var imgCapa: UIImageView!
2 - Instantiate the UITapGestureRecognizer in the viewDidLoad method:
override func viewDidLoad() {
super.viewDidLoad()
//instantiate the UITapGestureRecognizer and set the method for the action, "imageTapped"
let tap = UITapGestureRecognizer(target: self, action: "imageTapped")
//define quantity of taps
tap.numberOfTapsRequired = 1
tap.numberOfTouchesRequired = 1
//set the image to the gesture
imgCapa.addGestureRecognizer(tap)
}
3 - Create the method that does what you want when the image is tapped:
func imageTapped(){
//write the specific code for what you want to do
//in my case I want to show another page via a specific segue
let sumario = self.storyboard?.instantiateViewControllerWithIdentifier("sumarioViewController") as SumarioViewController
self.performSegueWithIdentifier("segueSumarioViewController", sender: sumario)
}
In recent Xcode, this is pretty easy. Go into the storyboard, in the object library search for "gesture", drag the one you want onto the image view. You can then treat the gesture object in the view hierarchy as the thing being tapped, i.e. control-drag from there to your view controller to connect the event handler.
Once there you can set the alpha as you like, although if you're trying to essentially remove the image view, you should set imageView.hidden = true / imageView.hidden = YES instead, because that will stop it receiving events.
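For example, the connected action might look something like this (a minimal sketch; the outlet name imageView and the action name are assumptions):

- (IBAction)imageTapped:(UITapGestureRecognizer *)sender {
    [UIView animateWithDuration:0.5 animations:^{
        self.imageView.alpha = 0.0;
    }];
    // or, if you want to really remove it from event handling:
    // self.imageView.hidden = YES;
}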
I found some questions and answers here on Stack Overflow for this problem, but none of the solutions there solved my problem.
My iOS App has the ability to play some music with a nice music player. I designed it with Xcode's Interface Builder and dragged out a UIView and changed its class to MPVolumeView. Everything works fine when I'm debugging my app on my iPhone 6.
Here is my problem: I also dragged out a UITapGestureRecognizer onto my whole view, which contains my controls like
play/pause, next/previous track (...)
and also my MPVolumeView. When I tap on that view, it should fade out and disappear. Then I added a UITapGestureRecognizer to my UIImageView, which shows the artwork image of the song. When I tap this image view, it should fade my view with all its controls back in - that's working properly.
BUT: when I slide the knob of the volume slider just a little bit, or even just touch it, the view still disappears. It seems like my MPVolumeView is forwarding my touch or something like that. I tried setting userInteractionEnabled = false on my volume slider, but that didn't help. I also set the delegate of my gesture recognizer to self and added the
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
NSLog(#"tapped");
if([gestureRecognizer.view isMemberOfClass:[UIImageView class]]) {
return true;
}
return false;
}
method to my code, which returns true or false depending on which view I'm tapping. When I access the gestureRecognizer.view property, it doesn't report my MPVolumeView, just the UIView in the background.
Here are the two methods that are called when the tap gesture recognizers fire:
- (IBAction)overlayViewTapped:(UITapGestureRecognizer *)sender {
if(sender.state == UIGestureRecognizerStateEnded) {
[UIView animateWithDuration:0.3
delay:0.0
options:UIViewAnimationOptionAllowUserInteraction
animations:^{ self.blackOverlayView.alpha = 0.0; self.normalTimeLabel.alpha = 1.0; }
completion:nil];
}
}
- (IBAction)imageViewTapped:(UITapGestureRecognizer *)sender {
[UIView animateWithDuration:0.3
delay:0.0
options:UIViewAnimationOptionAllowUserInteraction
animations:^{ self.blackOverlayView.alpha = 1.0; self.normalTimeLabel.alpha = 0.0; }
completion:nil];
}
Please help me, I'm nearly going nuts with this.
EDIT: My music player looks like this:
After I tap anywhere on the view (except the subviews), the view should fade out and hide everything, showing just the artwork image of the song and the current elapsed time. It will look like this:
As I said, the problem is: if I just tap the volume slider or slide it only a little bit, my UITapGestureRecognizer fires and fades out my whole view. How can I prevent that?
It is behaving the way it is simply because you added the gesture recognizer to the entire UIView, which includes the volume slider and whatnot.
Instead of detecting the touch in the entire view, check to see if the touch is in the area you want it.
Create a CGRect property, I'll call it touchArea:
@property CGRect touchArea;
Then specify the size of the touchArea (you can do this in viewDidLoad):
self.touchArea = CGRectMake(0.0, 240.0, 320.0, 240.0);
You will have to work out where you want this and how big it should be, and replace my example values with the real ones. A simple way of cheating this is to drop something like a UILabel into IB, position and size it as you like, then go to the size inspector pane and read off the x, y, width and height values.
Then, before you do your fade animation, check to see if the touch was in the touchArea:
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
CGPoint touchPoint = [gestureRecognizer locationInView:self.view];
if (CGRectContainsPoint(self.touchArea, touchPoint))
{
//do your animation here.
}
}
As a note, I would set a BOOL to check whether or not the view is faded in or out, so you can always check before animating.
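A minimal sketch of that idea (the overlayHidden property name is an assumption):

@property (nonatomic) BOOL overlayHidden;

- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint touchPoint = [gestureRecognizer locationInView:self.view];
    // only fade out if the touch is in the touch area and we aren't already faded out
    if (CGRectContainsPoint(self.touchArea, touchPoint) && !self.overlayHidden)
    {
        self.overlayHidden = YES;
        // do your fade animation here
    }
}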
I want to show images in my view and add a tap gesture to do some stuff.
My code for creating the images looks like this:
for(int i = 0; i < 5; i++) {
UIImageView *imageToMove =
[[UIImageView alloc] initWithImage:[UIImage imageNamed:@"icon1.png"]];
imageToMove.frame = CGRectMake(((float)rand() / RAND_MAX) * 1024, ((float)rand() / RAND_MAX) * 768 , 95, 95);
[imageToMove setUserInteractionEnabled:YES];
[imageToMove addGestureRecognizer:singleTap];
[self.view addSubview:imageToMove];
}
and this simple method to get some feedback when an image is tapped:
- (void)tapDetected {
NSLog(#"single Tap on imageview");
}
My problem is that only one image (as far as I can tell, the last one added) is touchable. The other images "behind" it can't be accessed.
Is there a way to solve this?
It is not entirely clear what you would like to do when tapping on stacked images (I mean, the natural thing would be that the image you touch can move, and this should already be accomplished by your code); in any case, try:
singleTap.cancelsTouchesInView = NO;
This will make your gesture recognizer not cancel the taps it handles, so the view holding it will also receive the events. If this does not help, have a look at the UIGestureRecognizerDelegate protocol, and specifically at the method
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer*)otherGestureRecognizer
which should allow you to fine-tune how your gesture recognizers coexist.
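A minimal sketch of that delegate method, assuming your view controller is set as the recognizer's delegate:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // let this recognizer fire even when another recognizer also claims the touch
    return YES;
}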
Using: Objective-C
I have a tableView with rows. I want the user to be able to move a cell a little aside so that an additional action is shown.
What was done:
Currently I create the table in one xib file and the cell in a separate xib file.
The cell is very simple.
I want to move viewWIthLabel.
In the cell class file I use the following code for the animation:
- (void)awakeFromNib
{
[super awakeFromNib];
UISwipeGestureRecognizer *leftGestureRecognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeLeft:)];
leftGestureRecognizer.direction = UISwipeGestureRecognizerDirectionLeft;
UISwipeGestureRecognizer *rightGestureRecognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeRight:)];
rightGestureRecognizer.direction = UISwipeGestureRecognizerDirectionRight;
[self addGestureRecognizer:rightGestureRecognizer];
[self addGestureRecognizer:leftGestureRecognizer];
}
and actions:
- (IBAction)swipeLeft:(id)sender{
NSLog(#"swipeL");
[UIView animateWithDuration:0.5 animations:^{
self.viewWIthLabel.frame = CGRectMake(-100, 0, self.viewWIthLabel.frame.size.width, self.viewWIthLabel.frame.size.height);
} ];
}
- (IBAction)swipeRight:(id)sender{
NSLog(#"swipeR");
[UIView animateWithDuration:.5 animations:^{
self.viewWIthLabel.frame = CGRectMake(0, 0, self.viewWIthLabel.frame.size.width, self.viewWIthLabel.frame.size.height);
}];
}
So the idea is to swipe the cell and move it some distance with an animation to reveal a hidden button.
The result of this code is almost what I want:
But if you start to scroll the tableView, you can get duplicates of "swiped cells" at random positions (depending on scrolling speed):
Any idea why this happens?
You might have to clear/reset the swiped state of the cell in -tableView:cellForRowAtIndexPath:, since the cells get reused (I am assuming you are reusing them, as required for good table view performance). Probably something like this -
self.viewWIthLabel.frame = CGRectMake(0, 0, self.viewWIthLabel.frame.size.width, self.viewWIthLabel.frame.size.height);
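For example, a minimal sketch of that reset in the data source (the cell class name and reuse identifier are assumptions):

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    MyCell *cell = [tableView dequeueReusableCellWithIdentifier:@"MyCell" forIndexPath:indexPath];
    // reset the swiped state so a reused cell doesn't come back half-swiped
    cell.viewWIthLabel.frame = CGRectMake(0, 0,
                                          cell.viewWIthLabel.frame.size.width,
                                          cell.viewWIthLabel.frame.size.height);
    // ... configure the cell as usual ...
    return cell;
}

Alternatively, the same reset could go in the cell's prepareForReuse override.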
I have a main view in my program with a draggable view in it. This view can be dragged around with pan gestures. Currently, though, it uses a lot of code which I want to put in a subclass to reduce complexity. (I eventually want to increase functionality by allowing the user to expand the view with further pan gestures. This means there will be lots more code clogging up my view controller if I can't sort this out first.)
Is it possible to have the code for a gesture recogniser in a subclass and still interact with views in the parent class?
This is the current code I am using to enable the pan gesture in the parent class:
-(void)viewDidLoad {
...
UIView * draggableView = [[UIView alloc]initWithFrame:CGRectMake(highlightedSectionXCoordinateStart, highlightedSectionYCoordinateStart, highlightedSectionWidth, highlightedSectionHeight)];
draggableView.backgroundColor = [UIColor colorWithRed:121.0/255.0 green:227.0/255.0 blue:16.0/255.0 alpha:0.5];
draggableView.userInteractionEnabled = YES;
[graphView addSubview:draggableView];
UIPanGestureRecognizer * panner = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panWasRecognized:)];
[draggableView addGestureRecognizer:panner];
}
- (void)panWasRecognized:(UIPanGestureRecognizer *)panner {
UIView * draggedView = panner.view;
CGPoint offset = [panner translationInView:draggedView.superview];
CGPoint center = draggedView.center;
// We want to make it so the square won't go past the axis on the left
// If the centre plus the offset
CGFloat xValue = center.x + offset.x;
draggedView.center = CGPointMake(xValue, center.y);
// Reset translation to zero so on the next `panWasRecognized:` message, the
// translation will just be the additional movement of the touch since now.
[panner setTranslation:CGPointZero inView:draggedView.superview];
}
(Thanks to Rob Mayoff for getting me this far)
I have now added a subclass of the view but can't figure out how or where I need to create the gesture recogniser as the view is now being created in the subclass and added to the parent class.
I really want the target for the gesture recogniser to be in this subclass but when I try to code it nothing happens.
I have tried putting all the code in the subclass and adding the pan gesture to the view but then I get a bad access crash when I try to drag it.
I am currently trying to use
[graphView addSubview:[[BDraggableView alloc] getDraggableView]];
to add it as a subview, and then setting up the view (adding the pan gesture, etc.) in the getDraggableView method in the subclass.
There must be a more straightforward way of doing this that I haven't conceptualised yet - I am still pretty new to dealing with subclasses and am still learning how they all fit together.
Thanks for any help you can give
I think I might have figured this one out.
In the parent class I created the child class variable:
BDraggableView * draggableViewSubClass;
draggableViewSubClass = [[BDraggableView alloc] initWithView:graphView andRangeChart: [rangeSelector getRangeChart]];
This allowed me to initialise the child class with the view I wanted to have the draggable view on: graphView
Then in the child class I set up the pan gesture as I normally would, but added it to the view passed through:
- (id)initWithView:(UIView *) view andRangeChart: (ShinobiChart *)chart {
self = [super initWithNibName:nil bundle:nil];
if (self) {
// Custom initialization
parentView = view;
[self setUpViewsAndPans];
}
return self;
}
- (void)setUpViewsAndPans {
draggableView = [[UIView alloc]initWithFrame:CGRectMake(highlightedSectionXCoordinateStart, highlightedSectionYCoordinateStart, highlightedSectionWidth, highlightedSectionHeight)];
draggableView.backgroundColor = [UIColor colorWithRed:121.0/255.0 green:227.0/255.0 blue:16.0/255.0 alpha:0.5];
draggableView.userInteractionEnabled = YES;
// Add the newly made draggable view to our parent view
[parentView addSubview:draggableView];
[parentView bringSubviewToFront:draggableView];
// Add the pan gesture
UIPanGestureRecognizer * panner = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panWasRecognized:)];
[draggableView addGestureRecognizer:panner];
}
- (void)panWasRecognized:(UIPanGestureRecognizer *)panner {
UIView * draggedView = panner.view;
CGPoint offset = [panner translationInView:draggedView.superview];
CGPoint center = draggedView.center;
CGFloat xValue = center.x + offset.x;
draggedView.center = CGPointMake(xValue, center.y);
// Reset translation to zero so on the next `panWasRecognized:` message, the
// translation will just be the additional movement of the touch since now.
[panner setTranslation:CGPointZero inView:draggedView.superview];
}
It took me a while to straighten it in my head that we want to do all the setting up in our subclass and then add this view with its characteristics to the parent view.
Thanks for all the answers provided; they got me thinking along the right lines to solve it.
I think you want to subclass UIView and make your own DraggableView class. There, you can add swipe and pan gesture recognizers. This would go in the implementation of your UIView subclass:
- (id)initWithFrame:(CGRect)frame
{
if (self = [super initWithFrame:frame]) {
UIPanGestureRecognizer *gestRec = [[UIPanGestureRecognizer alloc] initWithTarget:self
action:@selector(detectMyMotion:)];
[self addGestureRecognizer:gestRec];
}
return self;
}
- (void)detectMyMotion:(UIGestureRecognizer *)gestRect
{
NSLog(#"Gesture Recognized");
// maybe even, if you wanted to alert your VC of a gesture...
[self.delegate alertOfGesture:gestRect];
// your VC would be alerted by delegation of this action.
}
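If you go the delegation route, a minimal sketch of that wiring might look like this (the protocol name and property are assumptions, not an established API):

@protocol DraggableViewDelegate <NSObject>
- (void)alertOfGesture:(UIGestureRecognizer *)gestureRecognizer;
@end

@interface DraggableView : UIView
// weak to avoid a retain cycle between the view and its controller
@property (nonatomic, weak) id<DraggableViewDelegate> delegate;
@end

Your view controller would then adopt DraggableViewDelegate, implement alertOfGesture:, and set itself as the view's delegate when it creates the DraggableView.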
I've been trying to figure this out for hours; I'm completely at a loss here. I'm trying to implement a UIPinchGestureRecognizer for some of the custom UIImageViews in my game, but it doesn't work. Everything I've researched says it should work, yet it doesn't. Pinch works fine if I add it to my view controller, or to a custom UIView, but not the UIImageViews. I've tried all the common fixes and tweaks, to no success. I have userInteractionEnabled and multipleTouchEnabled set to YES. I have the delegate and selectors set up properly. I have shouldRecognizeSimultaneouslyWithGestureRecognizer set to return YES.
The gesture recognizer is getting added to the UIImageView, I've been able to access its properties later in my update loop, but the NSLog in the selector never gets called for the UIImageView when I try to pinch. I've adjusted the z-position of the views to ensure they are on top but no dice.
My UIImageViews are stored in a NSMutableDictionary and are updated by looping through it during each update loop of the game. Could this have an effect on the UIPinchGestureRecognizer not getting called?... I can't think of anything else and posting the code probably won't help - because the same exact code works when it's used for the UIView or view controller.
I do have touch handling code in the view controller's touchesBegan and touchesMoved events... but I've turned that off and the problem still persists, and the pinch worked for other elements with it on anyway.
Any ideas what could prevent a gesture selector from firing on an UIImageView? The dictionary? Something to do with being constantly updated in the game loop? Any ideas would be welcome, this seems so simple to implement...
Edit: Here's the code for the UIImageView and what I'm doing with it... not sure if this helps.
Extended UIImageView class Paper.m (prp is a struct of properties used to initialize my custom variables):
NSString *tName = [NSString stringWithUTF8String: prp.imagePath];
UIImage *tImage = [UIImage imageNamed:[NSString stringWithFormat:@"%@.png",tName]];
self = [self initWithImage: tImage];
self.userInteractionEnabled = YES;
self.multipleTouchEnabled = YES;
self.center = CGPointMake(prp.spawnX, prp.spawnY);
if (prp.zPos != 0) { self.layer.zPosition = prp.zPos; }
// other initialization excised
Then I have a custom class called ObjManager that holds the NSMutableDictionary and initializes all UIImageView objects like so, where addObj is called in a loop to add each object:
- (ObjManager*) initWithBlank {
// create an array for our objects
self = [super init];
if (self) {
objects = [[NSMutableDictionary alloc] init];
spawnID = 100; // start of counter for dynamically spawned object IDs
}
return self;
}
- (void) addObj:(Paper *)paperPiece wasSpawned:(BOOL)spawned {
// add each paper piece, assign spawnID if dynamically spawned
NSNumber *newID;
if (spawned) { newID = [NSNumber numberWithInt:spawnID]; spawnID++; }
else { newID = [NSNumber numberWithInt:paperPiece.objID]; }
[objects setObject:paperPiece forKey:newID];
}
My view controller calls the initialization of the ObjManager (called _world in my VC). Then it loops through _world like so:
// Populate additional object managers and add all subviews
for (NSNumber *key in _world.objects) {
_eachPiece = [_world.objects objectForKey:key];
// Populate collision object manager
if (_eachPiece.collision) {
[_world_collisions addObj:_eachPiece wasSpawned:NO];
}
// only add pinch gesture if the object flag is set
if (_eachPiece.pinch) {
UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchPaper:)];
pinchGesture.delegate = self;
[_eachPiece addGestureRecognizer:pinchGesture];
NSLog(#"Added pinch recognizer scale: %#", pinchGesture.view.description);
}
// Add each object as a subview
[self.view addSubview:_eachPiece];
}
_eachPiece is an object in my view controller, declared in the .h file (as is _world):
@property (nonatomic, strong) ObjManager *world;
@property (nonatomic, strong) Paper *eachPiece;
Then I have an NSTimer object that updates all moveable Paper objects (the UIImageViews) in _world (ObjManager) every frame like so:
// loop through each piece and update
for (NSNumber *key in _world.objects) {
eachPiece = [_world.objects objectForKey:key];
// only update moveable pieces
if ((eachPiece.moveType == Move_Touch) || (eachPiece.moveType == Move_Auto)) {
CGPoint paperCenter;
paperCenter = eachPiece.center;
// a bunch of code to update paperCenter x & y for the object's new position based on velocity and user input
// determine image direction and transformation matrix
[_world updateDirection:eachPiece];
CGAffineTransform transformPiece = [_world imageTransform:eachPiece];
if (transformEnabled) {
eachPiece.transform = transformPiece;
}
// finally move it
[eachPiece setCenter:paperCenter];
}
}
And the pinch selector:
- (void)pinchPaper:(UIPinchGestureRecognizer *)recognizer {
NSLog(#"Pinch scale: %f", recognizer.scale);
recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
recognizer.scale = 1;
}
As far as I can tell, the pinch should work. If I take the same pinch gesture code and set it to add to the view controller, it works for the entire view. I also have a custom UIView class that acts as a border (simply a rectangle drawn around the view), and moving the pinch gesture code to that allows me to pinch the border only.
Alright, so apparently gesture recognizers don't fire on views whose position is being animated. So to make it work, I had to put the recognizer on the view controller, then perform a hit test and apply the pinch/zoom to the touched view if it's one I want to pinch/zoom. Info on that here:
http://iphonedevsdk.com/forum/iphone-sdk-tutorials/100982-caanimation-tutorial.html
For my particular case, I kept track of which animated views I wanted to pinch, in a variable/array at the View Controller level. Then I used this code in the selector (essentially from the link above, all credit to them):
- (void)pinchPaper:(UIPinchGestureRecognizer *)recognizer {
CALayer *pinchLayer;
id layerDelegate;
CGPoint touchPoint = [recognizer locationInView:self.view];
pinchLayer = [self.view.layer.presentationLayer hitTest: touchPoint];
layerDelegate = [pinchLayer delegate];
//_pinchView is the UIView I want to pinch
if (layerDelegate == _pinchView) {
_pinchView.transform = CGAffineTransformScale(_pinchView.transform, recognizer.scale, recognizer.scale);
recognizer.scale = 1;
}
}
The only tricky thing is that if you have other scale transforms (like the direction changes in mine) going on as part of the existing UIView animation, you have to account for them by using the current transform during each update loop.
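For example, a rough sketch of that idea, assuming _pinchView is one of the Paper pieces and each piece keeps an accumulated pinchScale property (the property name is an assumption):

// in the pinch handler: accumulate the scale instead of applying it directly
_pinchView.pinchScale *= recognizer.scale;
recognizer.scale = 1;

// in the update loop: combine the base transform with the accumulated pinch scale
CGAffineTransform transformPiece = [_world imageTransform:eachPiece];
eachPiece.transform = CGAffineTransformScale(transformPiece,
                                             eachPiece.pinchScale,
                                             eachPiece.pinchScale);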
For any gesture recognizer to work on image views, user interaction must be enabled on them.
So, it should be:
yourImageView.userInteractionEnabled = YES;
Or, if you are using storyboards, you can check that option in the storyboard's inspector pane too.
Hope it helps. :)