How to pan without touch action getting executed in Sprite Kit? - ios

In touchesBegan I have my logic for a single touch, and I'm trying to add the ability to change the camera position with a pan. For the pan I use touchesMoved. Everything is mostly okay, but once I pan, the action for the touch gets executed too.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        NSArray *sprites = [self nodesAtPoint:location];
        for (SKSpriteNode *sprite in sprites)
        {
            //*
            //* How to stop executing this block when panning?
            //*
        }
    }
}
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint positionInScene = [touch locationInNode:self];
    CGPoint previousPosition = [touch previousLocationInNode:self];
    CGPoint translation = CGPointMake((-1)*(positionInScene.x - previousPosition.x),
                                      (-1)*(positionInScene.y - previousPosition.y));
    CGPoint cameraPos = [self camera].position;
    // CGPointAdd is a custom helper that adds the two points component-wise.
    [self camera].position = CGPointAdd(cameraPos, translation);
}

Look at how to use the pan gesture that is built into iOS; with it you will have the option to allow it to also execute the touch event or not.
I will give the answer using a view controller, but you can use it elsewhere too (a Sprite Kit-specific sketch follows at the end of this answer).
Objective C:
Open up ViewController.h and add the following declaration:
#interface ViewController : UIViewController<UIGestureRecognizerDelegate>
...
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer;
Then implement it in ViewController.m as follows:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
http://www.raywenderlich.com/6567/uigesturerecognizer-tutorial-in-ios-5-pinches-pans-and-more
At this point you can wire it up in Interface Builder as in the tutorial above, or create it in code somewhere early on:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.view addGestureRecognizer:pan];
Swift:
class ViewController : UIViewController, UIGestureRecognizerDelegate
... then in your code
@IBAction func handlePan(recognizer: UIPanGestureRecognizer) {
    let translation = recognizer.translationInView(self.view)
    if let view = recognizer.view {
        view.center = CGPoint(x: view.center.x + translation.x,
                              y: view.center.y + translation.y)
    }
    recognizer.setTranslation(CGPointZero, inView: self.view)
}
http://www.raywenderlich.com/76020/using-uigesturerecognizer-with-swift-tutorial
At this point you can wire it up in Interface Builder as in the tutorial above, or create it in code somewhere early on, such as in init:
let pan = UIPanGestureRecognizer(target: self, action: "handlePan:")
self.view.addGestureRecognizer(pan);
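For the original Sprite Kit question, the same idea can be applied directly to the scene. A minimal sketch, assuming an SKScene subclass with a camera node assigned to self.camera as in the question (method names other than the question's are assumptions): the recognizer's cancelsTouchesInView property (YES by default) cancels the scene's touches once the pan is recognized, so if the per-node tap logic is moved from touchesBegan: to touchesEnded:, it never runs while panning.
// Sketch for an SKScene subclass; handleScenePan: is an assumed name.
- (void)didMoveToView:(SKView *)view
{
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleScenePan:)];
    pan.cancelsTouchesInView = YES; // the default; cancels the scene's touches once the pan begins
    [view addGestureRecognizer:pan];
}

- (void)handleScenePan:(UIPanGestureRecognizer *)recognizer
{
    // View coordinates have a flipped y-axis relative to the scene.
    CGPoint translation = [recognizer translationInView:recognizer.view];
    SKCameraNode *camera = self.camera;
    camera.position = CGPointMake(camera.position.x - translation.x,
                                  camera.position.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:recognizer.view];
}

// Move the single-touch logic here: if the touch turned into a pan,
// these touches are cancelled and this method is never called.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        for (SKNode *node in [self nodesAtPoint:location]) {
            // ... per-node tap action ...
        }
    }
}
Leaving cancelsTouchesInView at YES is what keeps the tap action from also firing during a pan; set it to NO if you do want both delivered.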

Related

Trying to move UIImageView, moving whole Screen View

I want my UIImageViews to be movable with touch. I'm trying to use this code, implemented in my ViewController:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint p0 = [touch previousLocationInView:self.view];
        CGPoint p1 = [touch locationInView:self.view];
        CGPoint center = self.view.center;
        center.x += p1.x - p0.x;
        center.y += p1.y - p0.y;
        self.view.center = center;
    }
}
When I try to drag a UIImageView, I'm dragging the whole screen instead, which is incorrect.
Need help!
Karol
You create a gesture recognizer and add it to a view like this:
UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(gestureRecognizerMethod:)];
[self.imageView addGestureRecognizer:panGestureRecognizer];
Then, in the gesture recognizer's action method, you can adjust the position of the image view:
- (void)gestureRecognizerMethod:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint touchLocation = [recognizer locationInView:self.view];
        self.imageView.center = touchLocation;
    }
}
Read this article.
If I read your code right, self.view actually IS the whole screen.
Maybe you mean self.yourImageView instead?
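For example, a minimal correction of the touchesMoved: from the question, assuming the outlet is named yourImageView (substitute your own property name): only the image view's center changes, not self.view's.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint p0 = [touch previousLocationInView:self.view];
        CGPoint p1 = [touch locationInView:self.view];

        // Move the image view, not self.view (which is the whole screen).
        CGPoint center = self.yourImageView.center;
        center.x += p1.x - p0.x;
        center.y += p1.y - p0.y;
        self.yourImageView.center = center;
    }
}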

TouchesMoved with UITouch

I'm trying to create a simple application where you can move a UIImageView by touching it and dragging it around.
My UIImageView is called imv.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self.imv) {
        CGPoint location = [touch locationInView:self.view];
        self.imv.center = location;
    }
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self.imv) {
        CGPoint location = [touch locationInView:self.view];
        self.imv.center = location;
    }
}
I've been trying to solve this the whole day and I don't know what is wrong. If I disable the if statement it works, otherwise it doesn't. What can I do?
Thanks for the answers
Unless you've subclassed UIImageView (unlikely), your view is receiving the touch events.
These days it's simpler & more usual to use a UIGestureRecognizer for this kind of thing, in this case a UIPanGestureRecognizer.
e.g.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragImageView:)];
[self.imv addGestureRecognizer:pan];

- (void)dragImageView:(UIPanGestureRecognizer *)dragImageView {
    if (UIGestureRecognizerStateBegan == dragImageView.state) {
        originalCenter = self.imv.center; // add a CGPoint originalCenter; ivar
    } else if (UIGestureRecognizerStateChanged == dragImageView.state) {
        CGPoint translate = [dragImageView translationInView:self.imv.superview];
        self.imv.center = CGPointMake(originalCenter.x + translate.x,
                                      originalCenter.y + translate.y);
    }
}
From a bit of experimenting, it seems that [touch view] is returning the main UIView and not your subview, hence the problem with the if statement not working (I added the UIImageView in a storyboard). EDIT: it's because UIImageViews don't handle touch events by default - see here. Adding a regular UIView in viewDidLoad seems to work as you would expect.
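If you do want [touch view] to return the image view itself, user interaction has to be switched on first, since UIImageView has userInteractionEnabled set to NO by default. A minimal sketch, assuming the imv outlet from the question:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // UIImageView defaults to userInteractionEnabled = NO, so it never
    // becomes the hit-tested view; enable it to receive touches directly.
    self.imv.userInteractionEnabled = YES;
}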
Anyway, this adapted version of your code works for me:
-(void)moveImageForTouches:(NSSet *)touches
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(self.imv.frame, location))
    {
        self.imv.center = location;
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self moveImageForTouches:touches];
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self moveImageForTouches:touches];
}

Is it possible to get the x and y coordinates of a touch?

Is it possible to get the x and y coordinates of a touch? If so, could someone please provide a very simple example where the coordinates are just logged to the console?
Using touchesBegan Event
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"Touch x : %f y : %f", touchPoint.x, touchPoint.y);
}
This method is triggered when a touch starts.
Using a Gesture Recognizer
Register your UITapGestureRecognizer in the viewDidLoad method:
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapGestureRecognizer:)];
    [self.view setUserInteractionEnabled:YES];
    [self.view addGestureRecognizer:tapGesture];
}
Setting up the tapGestureRecognizer function
// Tap gesture recognizer action method
- (void)tapGestureRecognizer:(UIGestureRecognizer *)recognizer {
    CGPoint tappedPoint = [recognizer locationInView:self.view];
    CGFloat xCoordinate = tappedPoint.x;
    CGFloat yCoordinate = tappedPoint.y;
    NSLog(@"Touch Using UITapGestureRecognizer x : %f y : %f", xCoordinate, yCoordinate);
}
Sample Project
First you need to add a gesture recognizer to the view you want.
UITapGestureRecognizer *myTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(myTapRecognizer:)];
[self.myView setUserInteractionEnabled:YES];
[self.myView addGestureRecognizer:myTap];
Then in the gesture recognizer method you make a call to locationInView:
- (void)myTapRecognizer:(UIGestureRecognizer *)recognizer
{
    CGPoint tappedPoint = [recognizer locationInView:self.myView];
    CGFloat xCoordinate = tappedPoint.x;
    CGFloat yCoordinate = tappedPoint.y;
}
You may want to take a look at Apple's UIGestureRecognizer Class Reference.
Here's a very basic example (place it inside your view controller):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    NSLog(@"%@", NSStringFromCGPoint(currentPoint));
}
This triggers every time the touch moves. You can also use touchesBegan:withEvent: which triggers when a touch starts, and touchesEnded:withEvent: which triggers when a touch ends (i.e. a finger is lifted).
You can also do this using a UIGestureRecognizer, which in many cases is more practical.
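For completeness, logging where the touch ends works the same way; a small sketch in the same style as the snippet above:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint endPoint = [touch locationInView:self.view];
    // Logs the point where the finger was lifted.
    NSLog(@"Ended at %@", NSStringFromCGPoint(endPoint));
}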

Dragging Multiple Images

I am trying to develop an analysing app that determines if you are "clever".
What this involves is taking a picture of yourself and dragging points onto your face, where the nose, mouth and eyes are. However, the code I have tried does not work:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == eye1)
    {
        eye1.center = location;
    }
    else if ([touch view] == eye2)
    {
        eye2.center = location;
    }
    else if ([touch view] == nose)
    {
        nose.center = location;
    }
    else if ([touch view] == chin)
    {
        chin.center = location;
    }
    else if ([touch view] == lip1)
    {
        lip1.center = location;
    }
    else if ([touch view] == lip2)
    {
        lip2.center = location;
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
What is happening? When I just have a single image it works, but that is not helpful for me. What can I do to make it work? The spots start at the bottom of the screen in a "Toolbar" and the user then drags them onto the face. I kinda want the finished result to look like this:
There are two basic approaches:
You can use the various touches methods (e.g. touchesBegan, touchesMoved, etc.) in your controller or the main view, or you can use a single gesture recognizer on the main view. In this situation, you'd use touchesBegan or, if using a gesture recognizer, the UIGestureRecognizerStateBegan state, determine the locationInView: of the superview, and then test whether the touch is over one of your views with CGRectContainsPoint, using the frame of each of the views as the first parameter and the location as the second.
Having identified the view in which the gesture began, you then move that view in touchesMoved or, if in a gesture recognizer, the UIGestureRecognizerStateChanged state, based upon the translation. A sketch of this first approach follows.
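A minimal sketch of this first, touches-based approach, assuming the same eye1/eye2/nose/chin/lip1/lip2 outlets as the question plus an ivar (UIView *draggedView) to remember which spot the touch started on:
// Assumes an ivar: UIView *draggedView;
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    draggedView = nil;
    for (UIView *spot in @[eye1, eye2, nose, chin, lip1, lip2]) {
        if (CGRectContainsPoint(spot.frame, location)) {
            draggedView = spot;   // remember which spot was hit
            break;
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!draggedView) return;
    UITouch *touch = [touches anyObject];
    CGPoint p0 = [touch previousLocationInView:self.view];
    CGPoint p1 = [touch locationInView:self.view];
    // Move the tracked spot by the finger's translation since the last event.
    draggedView.center = CGPointMake(draggedView.center.x + (p1.x - p0.x),
                                     draggedView.center.y + (p1.y - p0.y));
}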
Alternatively (and easier IMHO), you can create individual gesture recognizers that you attach to each of the subviews. This latter approach might look like the following. For example, you first add your gesture recognizers:
NSArray *views = @[eye1, eye2, lip1, lip2, chin, nose];
for (UIView *view in views)
{
    view.userInteractionEnabled = YES;
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [view addGestureRecognizer:pan];
}
Then you implement a handlePanGesture method:
- (void)handlePanGesture:(UIPanGestureRecognizer *)gesture
{
    CGPoint translation = [gesture translationInView:gesture.view];
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        gesture.view.transform = CGAffineTransformMakeTranslation(translation.x, translation.y);
        [gesture.view.superview bringSubviewToFront:gesture.view];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        gesture.view.transform = CGAffineTransformIdentity;
        gesture.view.center = CGPointMake(gesture.view.center.x + translation.x,
                                          gesture.view.center.y + translation.y);
    }
}

iOS: Drag Effect not working well

I have implemented a drag effect on an image, but during my tests I see that the image only moves on the click (mouse) event.
I cannot move my image across the screen by dragging with the mouse; but when I click on a spot on the screen, the image jumps to the place where I clicked.
I followed many tutorials on YouTube but I don't end up with the same behaviour.
This is my code:
ScreenView1.h
IBOutlet UIImageView *image;
ScreenView1.m
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    image.center = location;
    [self ifCollision];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
If you want to drag an image view, you will be so much happier using a UIPanGestureRecognizer. It makes this sort of thing trivial. Using touchesBegan is so iOS 4!
UIPanGestureRecognizer* p =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(dragging:)];
[imageView addGestureRecognizer:p];

// ...

- (void) dragging: (UIPanGestureRecognizer*) p {
    UIView* vv = p.view;
    if (p.state == UIGestureRecognizerStateBegan ||
        p.state == UIGestureRecognizerStateChanged) {
        CGPoint delta = [p translationInView: vv.superview];
        CGPoint c = vv.center;
        c.x += delta.x; c.y += delta.y;
        vv.center = c;
        [p setTranslation: CGPointZero inView: vv.superview];
    }
}
You're not doing the right thing in the touchesMoved:withEvent:, which is why the drag won't work. Here's a little code that works:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    // Disable implicit animations so the image tracks the finger immediately.
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    [image setCenter:location];
    [CATransaction commit];
}
For the others: I ended up implementing it this way:
- (IBAction)catchPanEvent:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
Thank you again, Matt!
