UIButton exceeds its boundary limits when dragged very fast - iOS

I'm trying to implement drag controls with a boundary limit on a UIButton, and I wrote the following code for that:
- (void)onTouchDragInside:(UIButton *)btn withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:btn] anyObject];
    CGPoint prevPos = [touch previousLocationInView:btn];
    CGPoint pos = [touch locationInView:btn];
    float dX = pos.x - prevPos.x;
    if (btn.frame.origin.x >= buttonOffPosition && btn.frame.origin.x <= buttonOnPosition) {
        btn.center = CGPointMake(btn.center.x + dX, btn.center.y);
        NSLog(@"buttonOffPos: %f", buttonOffPosition);
        NSLog(@"btn.center.x+dX: %f", btn.center.x + dX);
        NSLog(@"buttonOnPos: %f", buttonOnPosition);
    }
}
This works almost properly, but when the button is dragged very fast it overshoots the limits buttonOffPosition and buttonOnPosition.
This is the problem I want to solve. Is there a good way to do it?
Your thoughts and help would be hugely appreciated.

If you want the code to keep your button's location between your two values, you need to check where the button will end up after this touch event is processed, not where it currently is. If it's currently within bounds and you then move it by, say, 1000 points, it will no longer be in bounds at the end, but you allow it to go there because you never checked the end point.
You can do this several ways. The simplest one that comes to my mind is:
- (void)onTouchDragInside:(UIButton *)btn withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:btn] anyObject];
    CGPoint prevPos = [touch previousLocationInView:btn];
    CGPoint pos = [touch locationInView:btn];
    float dX = pos.x - prevPos.x;
    //Get the new origin after this motion
    float newXOrigin = btn.frame.origin.x + dX;
    //Make sure it's within your two bounds
    newXOrigin = MIN(newXOrigin, buttonOnPosition);
    newXOrigin = MAX(newXOrigin, buttonOffPosition);
    //Now get the new dX value staying in bounds
    dX = newXOrigin - btn.frame.origin.x;
    btn.center = CGPointMake(btn.center.x + dX, btn.center.y);
}
This approach does introduce a problem: your finger may no longer be inside the button as you drag it from one end to the other, but I'll leave that for your next question.
EDIT:
Here is how I would make it more readable; this is only my opinion and has no bearing on how the code works. Beyond just modifying your code, I would cache a starting point for the button when the touch begins and drive all motion events from that. That way, if the button stops moving while your finger keeps going, the button stays with your finger as it comes back. With the current solution, as soon as your finger changes direction the button starts moving again even though your finger may by then be far off the button. Doing that would be a larger code change for you, so a sketch of it follows the code below.
- (void)onTouchDragInside:(UIButton *)btn withEvent:(UIEvent *)event {
    //This code can go awry if there is more than one finger on the screen, careful
    UITouch *touch = [[event touchesForView:btn] anyObject];
    CGPoint prevPos = [touch previousLocationInView:btn];
    CGPoint pos = [touch locationInView:btn];
    float dX = pos.x - prevPos.x;
    //Get the original position of the button
    CGRect buttonFrame = btn.frame;
    buttonFrame.origin.x += dX;
    //Make sure it's within your two bounds
    buttonFrame.origin.x = MIN(buttonFrame.origin.x, buttonOnPosition);
    buttonFrame.origin.x = MAX(buttonFrame.origin.x, buttonOffPosition);
    //Set the button's new frame if we need to
    if (buttonFrame.origin.x != btn.frame.origin.x)
        btn.frame = buttonFrame;
}
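A hedged sketch of that cached-start approach (touchStartPoint and buttonStartOrigin are hypothetical ivars, and onTouchDown: is assumed to be wired to UIControlEventTouchDown; none of this is from the original code):
//Sketch only: cache where the drag started, then position the button relative to that
- (void)onTouchDown:(UIButton *)btn withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:btn] anyObject];
    touchStartPoint = [touch locationInView:btn.superview];
    buttonStartOrigin = btn.frame.origin;
}

- (void)onTouchDragInside:(UIButton *)btn withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:btn] anyObject];
    CGPoint current = [touch locationInView:btn.superview];
    //Offset is measured from where the drag began, not from the previous event,
    //so the button catches back up with the finger after being pinned at a bound
    float newXOrigin = buttonStartOrigin.x + (current.x - touchStartPoint.x);
    newXOrigin = MIN(newXOrigin, buttonOnPosition);
    newXOrigin = MAX(newXOrigin, buttonOffPosition);
    CGRect frame = btn.frame;
    frame.origin.x = newXOrigin;
    btn.frame = frame;
}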

Related

Detect all touch locations in View with fast finger movement

I was wondering if there is a way to capture every single location of a UITouch.
I'm following my finger location in touchesMoved like this:
UITouch *touch = [touches anyObject];
CGPoint p = [touch locationInView:self];
I'm having problems detecting the location "p" if I move my finger fast. It just won't report every single X or Y position; it drops some.
Example:
I'm following my touch location with NSLog, and if I move my finger quickly down the iPad, NSLog shows me Y locations like: ... 281, 301, 322, 346, 375...
This is the internal behaviour of the framework. You can safely interpolate the intermediate points along the line between samples: if your start location is (x1, y1) and your end location is (x2, y2), you can compute every point on the segment between them.
When you change direction or path, you will get the new point in touchesMoved.
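If you do need explicit intermediate points, a minimal sketch (assuming this runs inside touchesMoved:withEvent: of a plain UIView subclass) is to interpolate along the segment between the previous and current locations:
UITouch *touch = [touches anyObject];
CGPoint prev = [touch previousLocationInView:self];
CGPoint curr = [touch locationInView:self];
//Roughly one interpolated point per pixel of movement
CGFloat distance = hypot(curr.x - prev.x, curr.y - prev.y);
NSUInteger steps = MAX((NSUInteger)1, (NSUInteger)distance);
for (NSUInteger i = 1; i <= steps; i++) {
    CGFloat t = (CGFloat)i / (CGFloat)steps;
    CGPoint intermediate = CGPointMake(prev.x + t * (curr.x - prev.x),
                                       prev.y + t * (curr.y - prev.y));
    //Use 'intermediate' exactly as you would a reported touch location
    NSLog(@"interpolated: %@", NSStringFromCGPoint(intermediate));
}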
You can use a gesture recognizer to get the location, using the code below:
UILongPressGestureRecognizer *gr = [[UILongPressGestureRecognizer alloc] init];
[gr addTarget:self action:@selector(userLongPressed:)];
[self.view addGestureRecognizer:gr];
Then, in the userLongPressed: method:
CGPoint point = [recognizer locationInView:self.view];
Logging point will give you all the locations.
Please let me know if you want any more information.

Issue regarding TouchesMoved using the method CGRectContainsPoint

My issue occurs when I drag the UIImageView across the screen; it's set to be draggable only along the x axis.
The code sort of works. The UIImageView moves and is limited to the x axis only, which is exactly what it should do.
BUT when you start dragging outside the frame of the UIImageView, it stops moving along with my finger.
This obviously has something to do with the CGRectContainsPoint check.
Bear in mind it's very necessary in my code, as I only want the UIImageView to move when the user has a finger on it.
If I didn't use CGRectContainsPoint, the image would move even when the user's finger wasn't touching it. Any workaround is much appreciated.
Here's my code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touches Moved is running");
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    //self.imageView stands in for the dragged image view outlet
    if (CGRectContainsPoint(self.imageView.frame, location) && self.imageView.center.x >= 40)
    {
        NSLog(@"Contains Point UIImageView Center!");
        CGPoint xLocation = CGPointMake(location.x, self.imageView.center.y);
        self.imageView.center = xLocation;
        //here it comes.. big block of code//
        if (location.x <= 40) {
            NSLog(@"Start Dragging Point");
            CGPoint newLocation = CGPointMake(40, 402);
            self.imageView.center = newLocation;
        }
        else if (location.x >= 273) {
            NSLog(@"End Dragging Point");
            CGPoint newLocation = CGPointMake(273, 402);
            self.imageView.center = newLocation;
        }
    }
}
Move the CGRectContainsPoint check from -touchesMoved:withEvent: to -touchesBegan:withEvent:.
Set an ivar there that points to the view only when the touch began inside the view. Use this ivar from -touchesMoved:withEvent: to move the view without bounds checking against the finger. You can then unset the ivar in both -touchesEnded:withEvent: and -touchesCancelled:withEvent:.
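A minimal sketch of that approach, assuming a hypothetical draggedView ivar, a self.imageView outlet, and the 40...273 clamp from the question:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    //Only start tracking if the touch began inside the image view
    if (CGRectContainsPoint(self.imageView.frame, location)) {
        draggedView = self.imageView;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!draggedView) return;   //the touch did not start on the image view
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    //Follow the finger along x only, clamped to the question's 40...273 range
    CGFloat x = MAX(40.0, MIN(273.0, location.x));
    draggedView.center = CGPointMake(x, draggedView.center.y);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    draggedView = nil;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    draggedView = nil;
}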

Looking for an alternative to touchesMoved to detect any touch event within a defined area?

I have a virtual keyboard in my app with 6 keys, and the whole thing is just an image implemented with UIImageView. I determined the exact x and y coordinates that correspond to the image of each 'key' and used the following code to respond to a user interacting with the keyboard:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 72 && point.x > 65)
    {
        NSLog(@"Key pressed");
    }
    //Repeat per key...
}
However, I have realized that this approach is not very smart, because changing orientation (portrait to landscape) or changing devices will throw off my x and y coordinates and cause problems.
So I am looking for an alternative to specifying absolute x and y values, and to using touchesMoved in general. Ideally it would be a button configured so that its method is called when it is tapped, or when the user drags a finger into the button's area (even very slowly; I used swipe detection before and it required too exaggerated a movement).
Is it possible to set up a button to call its method if it is tapped, or if a touch event started outside the button and then proceeded into it? If not, what are my alternatives?
Thanks SE!
You need to get the winSize property, which will fix the problem you are having with screen sizes.
CGSize size = [[CCDirector sharedDirector]winSize];
I believe you are using Cocos2D? If so, you can use this size property instead of hard-coding numbers. :)
To convert your point, use:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector]convertToGL:location];
To see if its within the bounds/box of the button you could try:
if (CGRectContainsPoint([self.myButton boundingBox], location))
{
//Execute method
}
This is all assuming you are using Cocos2D and a CCSprite for your button.
This should work on any screen size and portrait or landscape :)

iPhone - Move object along touch and drag path

I am developing a simple animation where a UIImageView moves along a UIBezierPath. Now I want to add user interaction to the moving UIImageView so that the user can grab it and drag it around the screen.
Change the origin of the image view's frame in touchesMoved:withEvent:.
Edit: Some code
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch.view isEqual:self.view] || touch.view == nil) {
        return;
    }
    lastLocation = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch.view isEqual:self.view]) {
        return;
    }
    CGPoint location = [touch locationInView:self.view];
    CGFloat xDisplacement = location.x - lastLocation.x;
    CGFloat yDisplacement = location.y - lastLocation.y;
    CGRect frame = touch.view.frame;
    frame.origin.x += xDisplacement;
    frame.origin.y += yDisplacement;
    touch.view.frame = frame;
    lastLocation = location;
}
You should also implement touchesEnded:withEvent: and touchesCancelled:withEvent:.
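For example, a possible sketch (what you do when the touch ends is up to you):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    //Nothing special is required here; snap or animate the view if you want to
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    //Treat a cancelled touch like an ended one so the view is not left mid-drag
    [self touchesEnded:touches withEvent:event];
}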
So you want the user to be able to touch an image in the middle of a keyframe animation along a curved path, and drag it to a different location? What do you want to happen to the animation at that point?
You have multiple challenges.
First is detecting the touch on the object while a keyframe animation is "in flight".
To do that, you want to use the parent view's layer's presentation layer's hitTest method.
A layer's presentation layer represents the state of the layer at any given instant, including animations.
Once you detect touches on your view, you will need to get the image's current location from the presentation layer, stop the animation, and take over with touchesMoved:-based dragging.
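A minimal sketch of that first step, assuming a hypothetical imageView property that is a direct subview of self.view and that self.view itself is not animating; for brevity this checks the presentation layer's frame with CGRectContainsPoint rather than calling hitTest:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    //The presentation layer reflects the view's on-screen geometry mid-animation
    CALayer *presentation = self.imageView.layer.presentationLayer;
    if (presentation && CGRectContainsPoint(presentation.frame, location)) {
        //Freeze the view where it currently appears, then stop the path animation
        self.imageView.center = presentation.position;
        [self.imageView.layer removeAllAnimations];
        //...take over with touchesMoved:-based dragging from here
    }
}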
I wrote a demo application that shows how to detect touches on an object that's being animated along a path. That would be a good starting point.
Take a look here:
Core Animation demo including detecting touches on a view while an animation is "in flight".
The easiest way would be to subclass UIImageView.
For simple dragging, take a look at the code here (code borrowed from user MHC):
UIView drag (image and text)
Since you want to drag along a Bezier path, you'll have to modify touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    //here you have location of user's finger
    CGPoint location = [aTouch locationInView:self.superview];
    [UIView beginAnimations:@"Dragging A DraggableView" context:nil];
    //commented code would simply move the view to that point
    //self.frame = CGRectMake(location.x-offset.x, location.y-offset.y, self.frame.size.width, self.frame.size.height);
    //you need some kind of a function
    CGPoint calculatedPosition = [self calculatePositionForPoint:location];
    self.frame = CGRectMake(calculatedPosition.x, calculatedPosition.y, self.frame.size.width, self.frame.size.height);
    [UIView commitAnimations];
}
What exactly you do in -(CGPoint)calculatePositionForPoint:(CGPoint)location is up to you. You could, for example, calculate the point on the Bezier path that is closest to location. For a simple test you can do:
- (CGPoint)calculatePositionForPoint:(CGPoint)location {
    return location;
}
Along the way you're going to have to decide what happens if the user wanders too far from your precalculated Bezier path.
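For instance, a hedged sketch that snaps to the nearest of a set of points sampled from the path in advance (pathPoints, an NSArray of NSValue-wrapped CGPoints, is an assumption; how you sample the UIBezierPath is up to you):
- (CGPoint)calculatePositionForPoint:(CGPoint)location {
    //pathPoints is a hypothetical, precomputed array of NSValue-wrapped CGPoints
    //sampled along the Bezier path
    CGPoint best = location;
    CGFloat bestDistance = CGFLOAT_MAX;
    for (NSValue *value in pathPoints) {
        CGPoint candidate = [value CGPointValue];
        CGFloat dx = candidate.x - location.x;
        CGFloat dy = candidate.y - location.y;
        CGFloat distance = dx * dx + dy * dy;   //squared distance is enough for comparison
        if (distance < bestDistance) {
            bestDistance = distance;
            best = candidate;
        }
    }
    return best;
}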

If I use ccDrawLine as part of a custom class, can it detect touches?

Basically, I've got a custom class that has a draw method that draws a line from point a to point b.
I'm subclassing CCSprite, so does the line then have a bounding box I can use to detect when someone touches the line?
As an example of what I'm trying to accomplish, I've cobbled together this code:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    for (Path *path in paths) {
        CGRect pathRect = CGRectMake(path.position.x, path.position.y, path.contentSize.width, path.contentSize.height);
        if (CGRectContainsPoint(pathRect, location)) {
            CCLOG(@"Line Touched");
        }
    }
}
paths is a mutable array of Path objects. I've put logs after each statement in the method, and it gets through everything but the for loop. For some reason, it seems like it never gets into the loop.
The answer is yes, you can. You just have to make sure you set the rectangle to the right size and place it at the correct origin point, which was where I was going wrong.
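For a line from pointA to pointB (hypothetical ivars of the Path class), one way to build such a rect is to take the min/max corners and pad them a little for touch tolerance:
//Sketch: a touchable rect for the line segment from pointA to pointB
- (CGRect)touchableRect {
    CGRect rect = CGRectMake(MIN(pointA.x, pointB.x),
                             MIN(pointA.y, pointB.y),
                             fabs(pointB.x - pointA.x),
                             fabs(pointB.y - pointA.y));
    //Outset so thin or axis-aligned lines are still easy to hit
    return CGRectInset(rect, -10.0f, -10.0f);
}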