Check when user slides over UIView - touchesMoved too laggy (iOS)

I know there are better ways to implement drawing tools using Core Graphics and what-not; I'm using this UIView method because I want to achieve a pixelated effect, and I have some other custom behavior in mind that would be difficult to do with just the built-in CG code.
I have a grid of 400 UIViews to create my pixelated canvas...
The user selects a paint color and begins to drag their finger over the canvas of 400 UIViews; as their finger hits a UIView, the color of that UIView (i.e. pixel) changes to the color the user has selected...
Right now I have the whole thing coded and working, but it is laggy (because I am using touchesMoved) and many points are skipped...
I am currently using touchesMoved and moving a UIView to the finger's location (I call this fingerLocationRect).
When I update a pixel's color (1 of the 400 miniature UIViews), I do so by checking whether fingerLocationRect intersects any one of my UIView rects. If it does, I update the color.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    fingerLocationRect.center = CGPointMake(currentPosition.x, currentPosition.y);

    int tagID = 1;
    while (tagID <= 400) {
        if (CGRectIntersectsRect(fingerLocationRect.frame, [drawingView viewWithTag:tagID].frame)) {
            [drawingView viewWithTag:tagID].backgroundColor = [UIColor colorWithRed:redVal/255.f green:greenVal/255.f blue:blueVal/255.f alpha:1.f];
        }
        tagID++;
    }
}
The problem is that touchesMoved: is not called very often... so if I drag my finger across the canvas, only about 5 of the pixels change instead of the 40 I actually touched.
I'm wondering if using CGRectIntersectsRect with touchesMoved is the best method to detect when the user slides over a UIView... Right now it is not updating nearly fast enough, and I am getting the result on the left instead of the result on the right. (I only achieve the effect on the right if I move my finger VERY slowly...)
All that said, is there a method for detecting when a user slides over a UIView other than using touchesMoved? If not, any ideas on how to grab all of the UIViews in between the touch locations returned by touchesMoved? I have them all tagged 1-400 (the first row is 1-10, the second row is 11-20, etc.)
P.S. Does a UIPanGestureRecognizer update more frequently than touchesMoved:?
To get an idea of what the 400 UIViews look like here they all are with arc4random based colors:

You are looping through 400 views every time touchesMoved: is called. You just need to do a little math to determine which view in the grid is being touched. Assuming that drawingView is the "canvas" view that contains all 400 "pixels" and the tags are ordered from the top-left to the bottom-right then:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:drawingView];
    NSInteger column = floorf(20 * point.x / drawingView.bounds.size.width);  // 20 is the number of columns in the canvas
    NSInteger row = floorf(20 * point.y / drawingView.bounds.size.height);    // 20 is the number of rows in the canvas
    [drawingView viewWithTag:(row * 20 + column + 1)].backgroundColor = [UIColor colorWithRed:redVal/255.f green:greenVal/255.f blue:blueVal/255.f alpha:1.f];  // tags are 1-based
}
This performance improvement will allow Cocoa to call touchesMoved: more frequently.

Related

How do I detect the coordinates of a button press on iOS?

So I'm making a minesweeper clone for iOS, and I have an array of UIButtons containing 135 buttons (the minesweeper board). It looks great and theoretically should work great. But I was having trouble detecting which button was being hit. I tried working around the problem by using this code:
UITouch *touched = [[event allTouches] anyObject];
CGPoint location = [touched locationInView:touched.view];
NSLog(@"x=%.2f y=%.2f", location.x, location.y);
int pointX = location.x;
int pointY = location.y;
My goal was to grab the coordinates of the touch and then use some basic math to figure out which button was being pressed. However, it doesn't work. At all. No button is pressed, no function runs, essentially nothing happens. I'm left with a minesweeper board that you can't interact with. Any ideas?
Assign a separate number to the tag of each button. Use the button's target, not the UITouch code. When you get a buttonPress, query the tag.
You could subclass the buttons and then handle what needs to happen when a touch occurs inside that subclass.
The UIButton * can be accessed by calling:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
on self (a UIView * I imagine). So I suppose you can set the button to the pushed state, and when touchesEnded: is called, set it back.

Restrict movement of UIButton along a UIBezierPath path

I have a circular UIBezierPath. I use the path to draw a circle on my view to create an outline of a 24 hr clock. I have a UIButton whose position depends on the current time. The button acts like an Hour hand. I want the users to be able to move the UIButton along the circular path. I call it "visit the future/past" feature. How do I restrict the buttons movement to the path I have?
Override touchesBegan: and touchesMoved: methods in your view
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([[event touchesForView:button] count]) {
        // User is trying to move the button; set a variable to indicate this.
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    /* Compare x and y coordinates with the center property of the button.
       If x or y is greater, set the center of the button to the next point
       on the circle, or the previous point if either is lesser. */
}
Note that you will have to save all the points of your circle in an array before attempting this, or calculate points on the circumference of the circle from its radius.
The easiest way is in touchesMoved, you can check to ignore touch which is not in your circle view by using:
CGPoint point = [touch locationInView:circleView];
if (![circleView pointInside:point withEvent:event]) {
    return;
}

What is the proper way of passing UITouch * across multiple subviews?

I have a custom view that has multiple subviews. They are all circles on the screen, sort of like three wheels of different radii on top of each other. I'm trying to make them receive a UITouch * event correctly to make them spin with the finger. Since the shapes are actually squares on the screen, when a bigger one spins and its touchable area enters the frame of a circle above, it becomes untouchable.
So, I created another subview on top of others that will calculate the distance of the touch point to the center and distribute the touch event accordingly. I can think of several ways of doing it, but I was wondering what would be the most elegant, and most correct way of handling a situation like this.
This is what I've done so far: My custom view has a delegate, and that delegate is assigned to my main viewController. I have three protocol methods in my custom view, for the three wheels respectively. I'm passing out the touch and event according to the point of UITouch, but I'm not sure how should I actually send this data to the views that are supposed to receive it. They are all custom UIControl objects, and they all handle touches via the -beginTrackingWithTouch:withEvent:. Since this is a private method, I cannot access this from my viewController. Should I make this method public and access this from the viewController, or is there a more correct way of handling this?
Edit: added the code:
This is how I distribute the touch in the custom UIView object. The calculations work fine.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Distribute the touches according to the touch location.
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    // Calculations for the circles.
    CGFloat xDistance = (point.x - BIGGEST_CIRCLE_RADIUS);
    CGFloat yDistance = (point.y - BIGGEST_CIRCLE_RADIUS);
    CGFloat distance = sqrtf((xDistance * xDistance) + (yDistance * yDistance));

    // Check to see if the point is in one of the circles, starting from the innermost circle.
    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        [self.delegate smallestCircleReceivedTouch:touch withEvent:event];
    } else if (distance < MIDDLE_CIRCLE_RADIUS) {
        [self.delegate middleCircleReceivedTouch:touch withEvent:event];
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        [self.delegate biggestCircleReceivedTouch:touch withEvent:event];
    } else {
        return;
    }
}
The delegate is the viewController and the circles are custom UIControls. They handle the touch like this:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchPoint = [touch locationInView:self];
    {....}
    return YES;
}
These work fine in themselves, but I'm not sure how should I connect the delegate method to the touch handling of each custom UIControl. Should I call their -beginTrackingWithTouch:withEvent: from the viewController, or should I make them implement the protocol of the customView? Or is there some other way to handle this properly?
Even though I did not try it, it should not be necessary to do your own calculation; -hitTest:withEvent: should work fine for you.

Looking for an alternative to touchesMoved to detect any touch event within a defined area?

I have a virtual keyboard in my app with 6 keys, and the whole thing is just an image implemented with UIImageView. I determined the exact x and y coordinates that correspond to the image of each 'key' and used the following code to respond to a user interacting with the keyboard:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 72 && point.x > 65) {
        NSLog(@"Key pressed");
    }
    // Repeat per key...
}
However, I have realized that this method is not very smart, because changing orientation (portrait to landscape) or changing devices will invalidate my x and y coordinates and therefore cause problems.
So, I am looking for an alternative to specifying the absolute x and y values, and using touchesMoved in general. Ideally, it would be a button with specific settings that would call its method if it was tapped, or if the user dragged their finger into the area of the button (even if very slowly - I used swipe detection before and it required too much of an exaggerated movement).
Is it possible to set up a button to call its method if tapped or if a touch event started outside of the button and then proceded into the button? If not, what are my alternatives?
Thanks SE!
You need to get the winSize property, which will fix the problem you are having with screen sizes.
CGSize size = [[CCDirector sharedDirector] winSize];
I do believe you are using Cocos2D? If so you can use this size property instead of hard coding numbers. :)
to convert your point, use
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector]convertToGL:location];
To see if it's within the bounding box of the button you could try:
if (CGRectContainsPoint([self.myButton boundingBox], location)) {
    // Execute method
}
This is all assuming you are using Cocos2D and a CCSprite for your button.
This should work on any screen size and portrait or landscape :)
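If you are not using Cocos2D, an alternative to hard-coded pixel coordinates is to store each key's hit area as fractions of the view's size, so the same data survives rotation and different screen sizes. A hedged C sketch of the idea (the FracRect type and the example values are invented for illustration):

```c
/* A key's hit area stored as fractions of the view size (0..1),
   so the same data works at any resolution or orientation. */
typedef struct { float x, y, w, h; } FracRect;

/* Return nonzero if the point (in view coordinates) falls inside
   the key's fractional rect, given the view's current size. */
static int keyHit(FracRect key, float px, float py,
                  float viewW, float viewH) {
    float fx = px / viewW, fy = py / viewH;
    return fx >= key.x && fx < key.x + key.w &&
           fy >= key.y && fy < key.y + key.h;
}
```

In UIKit the same effect falls out naturally if you express each key's frame relative to the parent view's bounds, or give each key its own view and let hit-testing do the work.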

Draw a rectangle which is draggable

If I draw a rectangle on a frame and then I want to drag this rectangle to different positions on this frame. How could I do this? Please add comments if my description is unclear. It might be a bit confusing.
You can create a UIView with any background image and move it around the window by setting its center: yourView.center = CGPointMake(x, y);
You can detect the point of touch using any/all of touchesBegan , touchesMoved or touchesEnded methods as follows :
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view]; // point of touch
}
These methods are to be declared in a UIViewController. As sample code, you can refer to my GitHub project, in which I drag an item from one UITableView to another, which I accomplished using UIViews.
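One detail worth handling when dragging: if you assign the touch point directly to the view's center, the view jumps so that its center snaps under the finger. Recording the offset between the touch and the center in touchesBegan, and applying it in touchesMoved, avoids that. The math, sketched in plain C (the types and names are illustrative):

```c
/* Offset between the initial touch point and the view's center,
   captured once when the drag begins. */
typedef struct { float x, y; } Point2;

static Point2 dragOffset(Point2 touchStart, Point2 viewCenter) {
    Point2 o = { viewCenter.x - touchStart.x, viewCenter.y - touchStart.y };
    return o;
}

/* New center for the view: current touch plus the saved offset,
   so the grab point stays under the finger. */
static Point2 draggedCenter(Point2 currentTouch, Point2 offset) {
    Point2 c = { currentTouch.x + offset.x, currentTouch.y + offset.y };
    return c;
}
```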