touchesMoved only moves a little - iOS

I am trying to allow the user to drag a label around the screen, but in the simulator it only moves a little each time I touch somewhere on the screen. It will jump to the location and then drag slightly, but then it will stop dragging and I have to touch a different location to get it to move again. Here is my code in my .m file.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    UITouch *drag = [[event allTouches] anyObject];
    firstInitial.center = [drag locationInView:self.view];
}
My ultimate goal is to be able to drag three different labels on the screen, but I'm just trying to tackle this problem first. I would greatly appreciate any help!
Thanks.

Try using a UIGestureRecognizer instead of -touchesMoved:withEvent:, and implement something similar to the following code.
//Inside viewDidLoad
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragonMoved:)];
panGesture.minimumNumberOfTouches = 1;
[self.view addGestureRecognizer:panGesture];
//**********
- (void)dragonMoved:(UIPanGestureRecognizer *)gesture{
    CGPoint touchLocation = [gesture locationInView:self.view];
    static UIView *currentDragObject;
    if(UIGestureRecognizerStateBegan == gesture.state){
        for(UIView *dragView in self.dragObjects){
            if(CGRectContainsPoint(dragView.frame, touchLocation)){
                currentDragObject = dragView;
                break;
            }
        }
    }else if(UIGestureRecognizerStateChanged == gesture.state){
        currentDragObject.center = touchLocation;
    }else if(UIGestureRecognizerStateEnded == gesture.state){
        currentDragObject = nil;
    }
}

Related

iOS multitouch events in different UIViews

I have two UIViews on the UIViewController: a left UIView (leftScreenView) and a right UIView (rightScreenView).
Each has its own subview: leftJoystickView (on leftScreenView) and rightJoystickView (on rightScreenView).
The problem I found is this:
While I am touching and moving on the leftScreenView, I touch the rightScreenView with another finger.
The touch events still all go to the leftScreenView, so I cannot differentiate between leftScreenView and rightScreenView events.
I need a touch-and-move on the leftScreenView and a touch-and-move on the rightScreenView to be handled as separate events at the same time.
How can I process multitouch so that the began and moved events of each finger are distinguished?
#pragma mark ----- touch action -----
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // NSLog(@"------touchesBegan-------");
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchViewPoint = [touch locationInView:touch.view];
    CGPoint touchLeftScreenPoint = [touch.view convertPoint:touchViewPoint toView:self.leftScreenView];
    CGPoint touchRightScreenPoint = [touch.view convertPoint:touchViewPoint toView:self.rightScreenView];
    NSLog(@"touch left:%d", [self.leftScreenView pointInside:touchLeftScreenPoint withEvent:event]);
    NSLog(@"touch right:%d", [self.rightScreenView pointInside:touchRightScreenPoint withEvent:event]);
    NSLog(@"touch.tapCount:%ld", (long)touch.tapCount);
    if( [self.leftScreenView pointInside:touchLeftScreenPoint withEvent:event] )
    {
        NSLog(@"began click left screen");
        self.leftStickLfConstraint.constant = touchLeftScreenPoint.x;
        self.leftStickTopStickConstraint.constant = touchLeftScreenPoint.y;
        [self.leftJoystickView touchesBegan:touches withEvent:event];
    }
    else if( [self.rightScreenView pointInside:touchRightScreenPoint withEvent:event] )
    {
        NSLog(@"began click right screen");
        self.rightStickLfConstraint.constant = touchRightScreenPoint.x;
        self.rightStickTopConstraint.constant = touchRightScreenPoint.y;
        [self.rightJoystickView touchesBegan:touches withEvent:event];
    }
    NSLog(@" ");
}
The move event is below:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchViewPoint = [touch locationInView:touch.view];
    NSLog(@"moved touch.tapCount:%ld", (long)touch.tapCount);
    NSLog(@"moved touches count:%ld", (long)[touches count]);
    CGPoint touchLeftScreenPoint = [touch.view convertPoint:touchViewPoint toView:self.leftScreenView];
    CGPoint touchRightScreenPoint = [touch.view convertPoint:touchViewPoint toView:self.rightScreenView];
    if( [self.leftScreenView pointInside:touchLeftScreenPoint withEvent:event] )
    {
        NSLog(@"touchesMoved click left screen");
        [self.leftJoystickView touchesMoved:touches withEvent:event];
    }
    else if( [self.rightScreenView pointInside:touchRightScreenPoint withEvent:event] )
    {
        NSLog(@"touchesMoved click right screen");
        [self.rightJoystickView touchesMoved:touches withEvent:event];
    }
}
When I keep moving on the leftScreenView and then touch the rightScreenView, I always get touch left:1 touch right:0.
Log:
2015-05-08 14:20:30.946 DroneG2[3606:942222] touchesMoved click left screen
2015-05-08 14:20:30.947 DroneG2[3606:942222] touches count:1
2015-05-08 14:20:30.962 DroneG2[3606:942222] moved touch.tapCount:1
2015-05-08 14:20:30.962 DroneG2[3606:942222] moved touches count:1
2015-05-08 14:20:30.962 DroneG2[3606:942222] touchesMoved click left screen
2015-05-08 14:20:30.963 DroneG2[3606:942222] touches count:1
2015-05-08 14:20:30.982 DroneG2[3606:942222] moved touch.tapCount:1
2015-05-08 14:20:30.982 DroneG2[3606:942222] moved touches count:1
2015-05-08 14:20:30.983 DroneG2[3606:942222] touchesMoved click left screen
2015-05-08 14:20:30.983 DroneG2[3606:942222] touches count:1
2015-05-08 14:20:30.984 DroneG2[3606:942222] touch left:1
2015-05-08 14:20:30.985 DroneG2[3606:942222] touch right:0
How can I process multitouch on different UIViews?
I have added the following in viewDidLoad:
self.leftScreenView.multipleTouchEnabled = NO;
self.rightScreenView.multipleTouchEnabled = NO;
// self.leftScreenView.exclusiveTouch = NO;
// self.rightScreenView.exclusiveTouch = NO;
self.view.multipleTouchEnabled = YES;
My storyboard screenshot:
Thank you very much.
Add a UIGestureRecognizer to each of your views inside -viewDidLoad. With a UIGestureRecognizer you are able to track the state of the gesture. As an example you can use the following.
//Add a gesture recognizer to each view
UIPanGestureRecognizer *panGesture =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
panGesture.minimumNumberOfTouches = 1;
[self.myView addGestureRecognizer:panGesture];
Now inside of -handlePan you can track the state and view that contains the gesture.
- (void)handlePan:(UIPanGestureRecognizer *)gesture{
    UIView *view = gesture.view; //View that contains gesture
    if(gesture.state == UIGestureRecognizerStateBegan){
    }else if(gesture.state == UIGestureRecognizerStateChanged){
    }else if(gesture.state == UIGestureRecognizerStateEnded){
    }
}
Edit:
To distinguish between the left and the right view you can add a tag to each view.
leftView.tag = 0;
rightView.tag = 1;
Then inside of -handlePan:
UIView *view = gesture.view; //View that contains gesture
if(view.tag == 0){
    //...
}
if(view.tag == 1){
    //...
}
Edit2:
You need to add the gesture to the left and right view, not the view of the view controller.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.leftScreenView.tag = 0;
    self.rightScreenView.tag = 1;

    UIPanGestureRecognizer *panGesture =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    panGesture.minimumNumberOfTouches = 1;
    [self.leftScreenView addGestureRecognizer:panGesture];

    UIPanGestureRecognizer *panGesture1 =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    panGesture1.minimumNumberOfTouches = 1;
    [self.rightScreenView addGestureRecognizer:panGesture1];
}
You can handle this situation using -touchesForView: on the UIEvent. -touchesForView: returns the set of UITouch objects belonging to a given view. Check whether the sets returned for leftSideView and rightSideView are non-empty, and if so handle the move event for that view. The following is a code snippet:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    // get the touch set of the left-side view
    NSSet *eventTouches1 = [event touchesForView:leftSideView];
    // get the touch set of the right-side view
    NSSet *eventTouches2 = [event touchesForView:rightSideView];
    // check that eventTouches1 is not empty
    if ([eventTouches1 count] > 0) {
        UITouch *touch1 = [[eventTouches1 allObjects] objectAtIndex:0];
        if ([touch1.view isEqual:leftSideView]) {
            // handle drag event for the left-side view
        }
    }
    // check that eventTouches2 is not empty
    if ([eventTouches2 count] > 0) {
        UITouch *touch2 = [[eventTouches2 allObjects] objectAtIndex:0];
        if ([touch2.view isEqual:rightSideView]) {
            // handle drag event for the right-side view
        }
    }
}
This can also be achieved with a UIGestureRecognizer.

Trying to move UIImageView, moving whole Screen View

I want my UIImageViews to be movable with touch. I'm trying to use this code, implemented in my ViewController:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint p0 = [touch previousLocationInView:self.view];
        CGPoint p1 = [touch locationInView:self.view];
        CGPoint center = self.view.center;
        center.x += p1.x - p0.x;
        center.y += p1.y - p0.y;
        self.view.center = center;
    }
}
When I try to drag a UIImageView, I'm dragging the whole screen, which is incorrect.
Need help!
Karol
You create a gesture recognizer and add it to a view like this:
UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(gestureRecognizerMethod:)];
[self.imageView addGestureRecognizer:panGestureRecognizer];
Then, inside the gesture recognizer's action method, you can adjust the position of the image view:
- (void)gestureRecognizerMethod:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint touchLocation = [recognizer locationInView:self.view];
        self.imageView.center = touchLocation;
    }
}
read this article.
If I read your code right, self.view actually IS the whole screen.
Maybe you meant self.yourImageView instead?
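If the goal is to drag the image view itself, a minimal sketch of the fix would move that view's center instead of self.view's (the name yourImageView is illustrative, standing in for the asker's outlet):

```objectivec
// Sketch: move the image view, not the root view.
// Assumes a `yourImageView` outlet; UIImageView also needs
// userInteractionEnabled = YES to receive touches directly.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint p0 = [touch previousLocationInView:self.view];
        CGPoint p1 = [touch locationInView:self.view];
        CGPoint center = self.yourImageView.center;
        center.x += p1.x - p0.x;
        center.y += p1.y - p0.y;
        self.yourImageView.center = center;
    }
}
```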

In SpriteKit, "select" all the sprites my finger touches while moving

So I'm trying to learn SpriteKit while building what I think is a simple puzzle game. I have a 5x5 grid of SKSpriteNodes of different colors. What I want is to be able to touch one, and move my finger horizontally or vertically and detect all the nodes that my finger is touching, like if I was "selecting" them.
I tried to do something like this, but it crashes the app:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKSpriteNode *node = (SKSpriteNode *)[self nodeAtPoint:location];
    NSLog(@"Dragged over: %@", node);
}
Is there something like a "touchEnter" / "touchLeave" kinda event that I'm missing? Sorry, I don't even know what I don't know.
UIPanGestureRecognizer is your friend:
-(void)didMoveToView:(SKView *)view {
    UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    recognizer.delegate = self;
    [self.view addGestureRecognizer:recognizer];
}

-(void)handlePanGesture:(UIPanGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:self.view];
    SKNode *node = [self nodeAtPoint:[self convertPointFromView:location]];
    if (node) {
        NSLog(@"Dragged over: %@", node);
    }
}

TouchesMoved with UITouch

I'm trying to create a simple application where you can move a UIImageView by touching it and dragging it around.
my UIImageView is called imv
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    UITouch *touch = [touches anyObject];
    if([touch view] == self.imv){
        CGPoint location = [touch locationInView:self.view];
        self.imv.center = location;
    }
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    UITouch *touch = [touches anyObject];
    if([touch view] == self.imv){
        CGPoint location = [touch locationInView:self.view];
        self.imv.center = location;
    }
}
I've been trying to solve this all day and I don't know what is wrong. If I disable the if statement it works; otherwise it doesn't. What can I do?
Thanks for the answers.
Unless you've subclassed UIImageView (unlikely), your view is receiving the touch events.
These days it's simpler & more usual to use a UIGestureRecognizer for this kind of thing, in this case a UIPanGestureRecognizer.
e.g.
self.imv.userInteractionEnabled = YES; // UIImageView has user interaction disabled by default
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragImageView:)];
[self.imv addGestureRecognizer:pan];

- (void)dragImageView:(UIPanGestureRecognizer *)pan {
    if(UIGestureRecognizerStateBegan == pan.state) {
        originalCenter = self.imv.center; // add a CGPoint originalCenter; member
    } else if(UIGestureRecognizerStateChanged == pan.state) {
        CGPoint translate = [pan translationInView:self.imv.superview];
        self.imv.center = CGPointMake(originalCenter.x + translate.x,
                                      originalCenter.y + translate.y);
    }
}
From a bit of experimenting, it seems that [touch view] returns the main UIView and not your subview, hence the problem with the if statement not working (I added the UIImageView in a storyboard). EDIT: it's because UIImageViews don't handle touch events by default - see here. Adding a regular UIView in viewDidLoad works as you would expect.
Anyway, this adapted version of your code works for me
-(void)moveImageForTouches:(NSSet *)touches
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    if(CGRectContainsPoint(self.imv.frame, location))
    {
        self.imv.center = location;
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self moveImageForTouches:touches];
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self moveImageForTouches:touches];
}

Dragging Multiple Images

I am trying to develop an analysing app that determines if you are "clever".
What this involves is taking a picture of yourself and dragging points onto your face where the nose, mouth, and eyes are. However, the code I have tried does not work:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == eye1)
    {
        eye1.center = location;
    }
    else if ([touch view] == eye2)
    {
        eye2.center = location;
    }
    else if ([touch view] == nose)
    {
        nose.center = location;
    }
    else if ([touch view] == chin)
    {
        chin.center = location;
    }
    else if ([touch view] == lip1)
    {
        lip1.center = location;
    }
    else if ([touch view] == lip2)
    {
        lip2.center = location;
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
What is happening? When I just have a single image it works, but that is not helpful for me. What can I do to make it work? The spots start at the bottom of the screen in a "Toolbar" and then the user drags them onto the face. I kinda want the finished result to look like:
There are two basic approaches:
You can use the various touches methods (e.g. touchesBegan, touchesMoved, etc.) in your controller or the main view, or you can use a single gesture recognizer on the main view. In this approach, in touchesBegan (or, with a gesture recognizer, in the UIGestureRecognizerStateBegan state), you determine the locationInView of the superview and then test whether the touch is over one of your views with CGRectContainsPoint, passing the frame of each view as the first parameter and the location as the second.
Having identified the view in which the gesture began, in touchesMoved (or, with a gesture recognizer, the UIGestureRecognizerStateChanged state) you move that view based upon the translation.
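A minimal sketch of this first approach (the names draggedView and self.spots are illustrative, not from the original code) might look like:

```objectivec
// Sketch of approach 1: hit-test in touchesBegan, move in touchesMoved.
// Assumes an ivar `UIView *draggedView;` and an array `self.spots`
// holding eye1, eye2, nose, chin, lip1, lip2.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    for (UIView *spot in self.spots) {
        if (CGRectContainsPoint(spot.frame, location)) {
            draggedView = spot;
            break;
        }
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p0 = [touch previousLocationInView:self.view];
    CGPoint p1 = [touch locationInView:self.view];
    draggedView.center = CGPointMake(draggedView.center.x + p1.x - p0.x,
                                     draggedView.center.y + p1.y - p0.y);
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    draggedView = nil;
}
```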
Alternatively (and easier, IMHO), you can create individual gesture recognizers attached to each of the subviews. This latter approach might look like the following. First, you add your gesture recognizers:
NSArray *views = @[eye1, eye2, lip1, lip2, chin, nose];
for (UIView *view in views)
{
    view.userInteractionEnabled = YES;
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [view addGestureRecognizer:pan];
}
Then you implement a handlePanGesture method:
- (void)handlePanGesture:(UIPanGestureRecognizer *)gesture
{
    CGPoint translation = [gesture translationInView:gesture.view];
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        gesture.view.transform = CGAffineTransformMakeTranslation(translation.x, translation.y);
        [gesture.view.superview bringSubviewToFront:gesture.view];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        gesture.view.transform = CGAffineTransformIdentity;
        gesture.view.center = CGPointMake(gesture.view.center.x + translation.x,
                                          gesture.view.center.y + translation.y);
    }
}
