I have a custom view that has multiple subviews. They are all circles on the screen, sort of like three wheels of different radii stacked on top of each other. I'm trying to make them receive UITouch events correctly so that they spin with the finger. Since the shapes are actually squares on the screen, when a bigger one flips and its touchable area enters the frame of a circle above it, it becomes untouchable.
So I created another subview on top of the others that calculates the distance from the touch point to the center and distributes the touch event accordingly. I can think of several ways of doing this, but I was wondering what would be the most elegant and most correct way of handling a situation like this.
This is what I've done so far: my custom view has a delegate, and that delegate is assigned to my main viewController. I have three protocol methods in my custom view, one for each of the three wheels. I pass out the touch and event according to the location of the UITouch, but I'm not sure how I should actually send this data to the views that are supposed to receive it. They are all custom UIControl objects, and they all handle touches via -beginTrackingWithTouch:withEvent:. Since this is a private method, I cannot access it from my viewController. Should I make this method public and access it from the viewController, or is there a more correct way of handling this?
Edit: added the code:
This is how I distribute the touch in the custom UIView object. The calculations work fine.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Distribute the touches according to the touch location.
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    // Calculations for the circles.
    CGFloat xDistance = (point.x - BIGGEST_CIRCLE_RADIUS);
    CGFloat yDistance = (point.y - BIGGEST_CIRCLE_RADIUS);
    CGFloat distance = sqrtf((xDistance * xDistance) + (yDistance * yDistance));

    // Check to see if the point is in one of the circles, starting from the innermost circle.
    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        [self.delegate smallestCircleReceivedTouch:touch withEvent:event];
    } else if (distance < MIDDLE_CIRCLE_RADIUS) {
        [self.delegate middleCircleReceivedTouch:touch withEvent:event];
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        [self.delegate biggestCircleReceivedTouch:touch withEvent:event];
    } else {
        return;
    }
}
The delegate is the viewController and the circles are custom UIControls. They handle the touch like this:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchPoint = [touch locationInView:self];
    {....}
    return YES;
}
These work fine on their own, but I'm not sure how I should connect the delegate methods to the touch handling of each custom UIControl. Should I call their -beginTrackingWithTouch:withEvent: from the viewController, or should I make them implement the protocol of the custom view? Or is there some other way to handle this properly?
Even though I have not tried it, it should not be necessary to do your own calculation; -hitTest:withEvent: should work fine for you.
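As a minimal sketch of that idea, assuming the view that covers the wheels keeps references to the three circle controls (the property names below are hypothetical) and uses the same radius constants as the code above: overriding -hitTest:withEvent: lets UIKit deliver the touch straight to the right control, so its -beginTrackingWithTouch:withEvent: runs without any delegate plumbing.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (![self pointInside:point withEvent:event]) {
        return nil; // outside this view entirely
    }
    CGFloat xDistance = point.x - BIGGEST_CIRCLE_RADIUS;
    CGFloat yDistance = point.y - BIGGEST_CIRCLE_RADIUS;
    CGFloat distance = sqrtf((xDistance * xDistance) + (yDistance * yDistance));

    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        return self.smallestCircle;   // hypothetical property
    } else if (distance <= MIDDLE_CIRCLE_RADIUS) {
        return self.middleCircle;     // hypothetical property
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        return self.biggestCircle;    // hypothetical property
    }
    return nil; // outside all circles, let the touch fall through
}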
Related
I am trying to figure out how I can determine whether there is a UIView subview at the point I am touching or not. The background is a UIView itself that I am adding multiple other UIViews to ... So as I long-press and change the position while holding the touch, I'd like to know whether there's a UIView at that point or not.
I have been thinking about it and it's still not clear how to go about it, but I came across this, which makes me think of getting the indexes of the view hierarchy at a point and checking whether the count is larger than 1. But how could I do that for the point I'm touching?
Any hint or clue would be appreciated.
You have to store references to both views in two objects, myParentView and mySubView, and then just use this method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint locationPoint = [[touches anyObject] locationInView:myParentView];
    // Ask the parent view which of its descendants is under the touch.
    UIView *viewYouWishToObtain = [myParentView hitTest:locationPoint withEvent:event];
    if (mySubView == viewYouWishToObtain) {
        // That view is touched
    } else {
        // That view is not touched
    }
}
I'm working on an iOS app.
I have a UIView that is being animated. It keeps moving from left to right.
This view has several UIView children. They move within their parent when this is being animated.
When I try to capture user touches on the children (by using a UITapGestureRecognizer), it doesn't work as expected. It sort of detects some touches on the children, but not at the positions where they currently are.
I've been struggling with this for a while now. Any ideas about how to solve this? I really need to detect which children the user is touching.
Thanks!
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self];

    // Check every child subview
    for (UIView *tickerItem in self.subviews)
    {
        // Hit-test against the presentation layer, which reflects the in-flight animated position
        if ([tickerItem.layer.presentationLayer hitTest:touchLocation])
        {
            MKTickerItemView *v = (MKTickerItemView *)tickerItem;
            NSLog(@"Selected: %@", v.title);
            // This button was hit whilst moving - do something with it here
            break;
        }
    }
}
There is a UIViewAnimationOptions value...
UIViewAnimationOptionAllowUserInteraction
You can pass this in the options parameter of the [UIView animateWithDuration... call.
This will enable touches while the animation is running.
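For instance, a minimal sketch of passing that option (movingView, the duration, and the offset are just placeholders for your own animation):
[UIView animateWithDuration:2.0
                      delay:0.0
                    options:UIViewAnimationOptionAllowUserInteraction |
                            UIViewAnimationOptionRepeat |
                            UIViewAnimationOptionAutoreverse
                 animations:^{
                     // Placeholder animation: slide the parent view to the right and back.
                     movingView.center = CGPointMake(movingView.center.x + 200.0f,
                                                     movingView.center.y);
                 }
                 completion:nil];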
I am using touchesMoved with a coordinate system to detect and respond to user touches within certain areas of the screen. For example, if I have a virtual keyboard and the user swipes across the keys, it reads the coordinates and responds:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 90 && point.x > 20)
    {
        // do something
    }
}
...However, the problem is, if the user slowly drags across the keys, or the border between keys, the method is triggered several times in a row, playing the piano key sound in a stutter.
How can I prevent this stutter? I think setting a minimum delay of 0.25 seconds between successive triggerings of the same if statement would help. Also, this delay should apply only to that specific if statement -- I want the user to be able to drag across the keys quickly and trigger different keys' if statements as quickly as they want.
Does anyone know how to code something like this?
Try this:
BOOL _justPressed; // Declare this in your @interface
...
- (void)unsetJustPressed {
    _justPressed = NO;
}
Then, in your touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_justPressed) {
        // A key was just pressed, so do nothing.
        return;
    }
    else {
        _justPressed = YES;
        // Do stuff here
        [self performSelector:@selector(unsetJustPressed)
                   withObject:nil
                   afterDelay:0.25];
    }
}
This way, you set a variable _justPressed to YES every time touchesMoved:withEvent: is called (or within a specific conditional in there, depending on what you want to do), and you use performSelector:withObject:afterDelay: to set _justPressed back to NO after a certain time period. You can then check whether _justPressed is YES when touchesMoved: is called to ascertain whether it was called recently.
Remember, you don't have to return from the method like in the example above; you can simply use _justPressed to check whether you should play the sound, but still perform your other actions. The example is just to give you a basic idea of what to do.
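As a concrete sketch of applying that guard only to the one key from the question (the coordinate bounds are taken from the question; -playKeySound is a hypothetical placeholder for that key's action):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];

    // Only this key's region is debounced; other keys keep responding immediately.
    if (point.y < 333 && point.y > 166 && point.x < 90 && point.x > 20) {
        if (!_justPressed) {
            _justPressed = YES;
            [self playKeySound]; // hypothetical placeholder for this key's action
            [self performSelector:@selector(unsetJustPressed)
                       withObject:nil
                       afterDelay:0.25];
        }
    }
}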
I have a UITableView that has a UIImageView which traverses it one row at a time at the click of a button (up/down). What I would like to do now is allow the user to drag the UIImageView up or down the table ONLY (i.e. no sideways movement). If the majority of the UIImageView is over a particular cell, then when the user lets go of their finger, I want the UIImageView to link to that row. Here is an image of the UITableView, with the UIImageView:
The scroll bar is the UIImageView that needs to move up or down. I realize that I am supposed to implement the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // We only support single touches, so anyObject retrieves just that touch from touches.
    UITouch *touch = [touches anyObject];
    if ([touch view] != _imageView) {
        return;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] != _imageView) {
        return;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Here is where I guess I need to determine which row contains the majority of the scroll bar.
    // This would only measure the y coordinate, not the x, since it will only be moving up or down.
    return;
}
However, I am not sure HOW to achieve this functionality. I have tried to find similar examples online, and I have looked at the MoveMe example code from Apple, but I am still stuck. Please also note that my scroll bar is NOT the exact same size as a row in the table, but rather a bit longer, with the same height.
Thanks in advance to all who reply
Try adding a UIPanGestureRecognizer to the UIImageView. Start by getting the image view's current location, then use the translationInView: method to determine where to move the image view.
From Apple's documentation:
If you want to adjust a view's location to keep it under the user's
finger, request the translation in that view's superview's coordinate
system... Apply the translation value to the state of the view when the gesture is first recognized—do not concatenate the value each time the handler is called.
Here's the basic code to add the gesture recognizer:
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panView:)];
[imageView addGestureRecognizer:panGesture];
Then, do the math to determine where to move the view.
- (void)panView:(UIPanGestureRecognizer *)sender
{
    // Per the documentation above, request the translation in the view's superview.
    CGPoint translation = [sender translationInView:sender.view.superview];
    // Your code here - change the frame of the image view, and then animate
    // it to the closest cell when panning finishes
}
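As a rough sketch of what that handler might look like, assuming it lives in the view controller, that the image view only moves vertically, and that the table's rowHeight is known (self.tableView is a hypothetical property, and the table's content offset is ignored for simplicity):
- (void)panView:(UIPanGestureRecognizer *)sender
{
    UIView *draggedView = sender.view;
    CGPoint translation = [sender translationInView:draggedView.superview];

    // Apply the vertical translation only, then reset it so it isn't accumulated.
    draggedView.center = CGPointMake(draggedView.center.x,
                                     draggedView.center.y + translation.y);
    [sender setTranslation:CGPointZero inView:draggedView.superview];

    if (sender.state == UIGestureRecognizerStateEnded) {
        // Snap the view to the row whose vertical span contains its center.
        CGFloat rowHeight = self.tableView.rowHeight;               // hypothetical property
        NSInteger row = (NSInteger)floorf(draggedView.center.y / rowHeight);
        CGFloat snappedCenterY = (row + 0.5f) * rowHeight;
        [UIView animateWithDuration:0.2 animations:^{
            draggedView.center = CGPointMake(draggedView.center.x, snappedCenterY);
        }];
    }
}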
I know there are better ways to achieve drawing tools using CoreGraphics and what-not, I'm using this UIView method because I want to achieve a pixelated effect and I have some other custom stuff in mind that would be difficult to do with just the built in CG code.
I have a grid of 400 UIViews to create my pixelated canvas...
The user selects a paint color and begins to drag their finger over the canvas of 400 UIViews, as their finger hits a UIView the color of that UIView (aka pixel) will change to the color the user has selected...
Right now I have coded the whole thing and everything is working but it is laggy (because I am using touchesMoved) and many points are skipped...
I am currently using touchesMoved and am moving a UIView to the finger's location (I am calling this view fingerLocationRect).
When I update a pixel's color (1 of the 400 miniature UIViews), I do so by checking whether fingerLocationRect's frame intersects any one of my UIView frames. If it does, I update that view's color.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    fingerLocationRect.center = CGPointMake(currentPosition.x, currentPosition.y);

    int tagID = 1;
    while (tagID <= 400) {
        if (CGRectIntersectsRect(fingerLocationRect.frame, [drawingView viewWithTag:tagID].frame)) {
            [drawingView viewWithTag:tagID].backgroundColor = [UIColor colorWithRed:redVal/255.f green:greenVal/255.f blue:blueVal/255.f alpha:1.f];
        }
        tagID++;
    }
}
The problem is that touchesMoved is not updated very often... so if I drag my finger across the canvas only like 5 of the pixels change instead of the 40 I actually touched.
I'm wondering if using CGRectIntersectsRect with touchesMoved is the best method to detect when the user slides over a UIView... Right now it is not updating nearly fast enough and I am getting the result on the left instead of the result on the right. (I only achieve the effect on the right if I move my finger VERY slowly...)
All that said, is there a method for detecting when a user slides over a UIView other than using touchesMoved? If not, any ideas on how to grab all of the UIViews in between the touch locations returned by touchesMoved? I have them all tagged 1-400 (the first row is 1-10, the second row is 11-20, etc.)
P.S. Does a UIPanGestureRecognizer update more frequently than touchesMoved?
To get an idea of what the 400 UIViews look like, here they all are with arc4random-based colors:
You are looping through 400 views every time touchesMoved: is called. You just need to do a little math to determine which view in the grid is being touched. Assuming that drawingView is the "canvas" view that contains all 400 "pixels", and that the tags are ordered from the top-left to the bottom-right, then:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:drawingView];
    NSInteger column = floorf(20 * point.x / drawingView.bounds.size.width);  // 20 is the number of columns in the canvas
    NSInteger row = floorf(20 * point.y / drawingView.bounds.size.height);    // 20 is the number of rows in the canvas
    // Tags run from 1 to 400, so add 1 to the zero-based index.
    [drawingView viewWithTag:column + 20 * row + 1].backgroundColor = [UIColor colorWithRed:redVal/255.f green:greenVal/255.f blue:blueVal/255.f alpha:1.f];
}
This performance improvement will allow Cocoa to make more frequent calls to touchesMoved:.
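To address the follow-up question about the cells skipped between successive touchesMoved: points, one possible extension (not part of the answer above, just a sketch) is to remember the previous touch point and color every cell along the line to the current one. Here _previousPoint is a hypothetical ivar reset in touchesBegan:, and -colorCellAtPoint: is a hypothetical helper wrapping the row/column math shown above.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:drawingView];
    CGFloat cellWidth = drawingView.bounds.size.width / 20.0f;

    // Step from the previous point to the current one in roughly cell-sized increments.
    CGFloat distance = hypotf(point.x - _previousPoint.x, point.y - _previousPoint.y);
    NSInteger steps = MAX(1, (NSInteger)ceilf(distance / cellWidth));
    for (NSInteger i = 1; i <= steps; i++) {
        CGFloat t = (CGFloat)i / steps;
        CGPoint interpolated = CGPointMake(_previousPoint.x + t * (point.x - _previousPoint.x),
                                           _previousPoint.y + t * (point.y - _previousPoint.y));
        [self colorCellAtPoint:interpolated]; // hypothetical helper wrapping the row/column math above
    }
    _previousPoint = point;
}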