Keep From Pushing 2 View Controllers at Once

In my app setup, I have a navigation controller with 4 UIImageViews. One of them can be dragged around, while the other 3 are stationary at the top section of the view. Using the code below, the user drags the movable image view onto the image view for the screen he wants to go to. So to get to view 1, he drags the movable image view onto image view 1, and so on. The issue is that, given the width of the image views, it is possible for the selector view to touch two at one time, which pushes two view controllers and creates a nested view controller issue. Is there a way I can keep this from happening, short of moving the image views so far apart that it is impossible for more than one to be selected at a time?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // If the touch was in the placardView, move the placardView to its location
    if ([touch view] == clock) {
        CGPoint location = [touch locationInView:self.tabBarController.view];
        clock.center = location;
        BOOL isIntersecting = CGRectIntersectsRect(clock.frame, prayer.frame);
        BOOL isIntersecting2 = CGRectIntersectsRect(clock.frame, fasting.frame);
        BOOL isIntersecting3 = CGRectIntersectsRect(clock.frame, study.frame);
        if (isIntersecting) {
            [self schedulePrayer];
            NSLog(@"prayer");
        }
        if (isIntersecting2) {
            [self scheduleFasting];
            NSLog(@"fasting");
        }
        if (isIntersecting3) {
            [self scheduleStudying];
            NSLog(@"Studying");
        }
        return;
    }
}

Why don't you just use if ... else if ... else if?
if (isIntersecting) {
    [self schedulePrayer];
    NSLog(@"prayer");
}
else if (isIntersecting2) {
    [self scheduleFasting];
    NSLog(@"fasting");
}
else if (isIntersecting3) {
    [self scheduleStudying];
    NSLog(@"Studying");
}
Then, only one will be triggered at a time.

Create another BOOL, "isTouching", and make it an instance variable so it is visible across touch events. Then, inside your if(isIntersecting) blocks, set isTouching to YES, and add !isTouching as a condition, such that:
if ([touch view] == clock && !isTouching)
You also need to set isTouching back to NO when the UIImageView is not on any of the intersecting views, and you should be good to go :)
That should be enough hints for you to solve your problem, but if you'd like more clarification let me know.
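As a rough illustration, here is one way those hints might fit together. This is a sketch only, assuming isTouching is a BOOL instance variable on the same controller as the code above; it is just one reading of the hints, not a drop-in fix:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] != clock) return;

    clock.center = [touch locationInView:self.tabBarController.view];

    BOOL onPrayer  = CGRectIntersectsRect(clock.frame, prayer.frame);
    BOOL onFasting = CGRectIntersectsRect(clock.frame, fasting.frame);
    BOOL onStudy   = CGRectIntersectsRect(clock.frame, study.frame);

    if (!isTouching) {
        if (onPrayer) {
            isTouching = YES;          // only one push per pass over a target
            [self schedulePrayer];
        } else if (onFasting) {
            isTouching = YES;
            [self scheduleFasting];
        } else if (onStudy) {
            isTouching = YES;
            [self scheduleStudying];
        }
    } else if (!onPrayer && !onFasting && !onStudy) {
        isTouching = NO;               // clock is clear of every target again
    }
}

Because the intersection checks are chained with else if and gated by the flag, at most one schedule... method can fire per drag onto a target.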

Related

How can I find out if I have a UIView where I am long pressing as I move the touch point?

I am trying to figure out how to determine whether my touch point is over a UIView that I have added as a subview or not. The background is a UIView itself, to which I am adding multiple other UIViews ... So as I long press and change the position while holding the touch, I'd like to know whether there is a UIView at that point or not.
I have been thinking about it, and while it is still not clear how to go about it, I came across this, which makes me think of getting the indexes of the view hierarchy and checking whether there is more than 1. But how could I do that for the point where I'm touching?
Any hint or clue would be appreciated.
You have to store references to both views in two objects, myParentView and mySubView; then just use this method:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint locationPoint = [[touches anyObject] locationInView:myParentView];
    // Ask the parent view which of its descendants is under the touch.
    UIView *viewYouWishToObtain = [myParentView hitTest:locationPoint withEvent:event];
    if (mySubView == viewYouWishToObtain) {
        // That view is touched
    } else {
        // That view is not touched
    }
}

What is the proper way of passing UITouch * across multiple subviews?

I have a custom view that has multiple subviews. They are all circles on the screen, sort of like three wheels of different radii on top of each other. I'm trying to make them receive a UITouch * event correctly to make them spin with the finger. Since the shapes are actually squares on the screen, when a bigger one flips and its touchable area enters the frame of a circle above, it becomes untouchable.
So, I created another subview on top of the others that calculates the distance of the touch point to the center and distributes the touch event accordingly. I can think of several ways of doing it, but I was wondering what would be the most elegant, and most correct, way of handling a situation like this.
This is what I've done so far: My custom view has a delegate, and that delegate is assigned to my main viewController. I have three protocol methods in my custom view, for the three wheels respectively. I'm passing out the touch and event according to the point of UITouch, but I'm not sure how should I actually send this data to the views that are supposed to receive it. They are all custom UIControl objects, and they all handle touches via the -beginTrackingWithTouch:withEvent:. Since this is a private method, I cannot access this from my viewController. Should I make this method public and access this from the viewController, or is there a more correct way of handling this?
Edit: added the code:
This is how I distribute the touch in the custom UIView object. The calculations work fine.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Distribute the touches according to the touch location.
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Calculations for the circles.
    CGFloat xDistance = (point.x - BIGGEST_CIRCLE_RADIUS);
    CGFloat yDistance = (point.y - BIGGEST_CIRCLE_RADIUS);
    CGFloat distance = sqrtf((xDistance*xDistance) + (yDistance*yDistance));
    // Check to see if the point is in one of the circles, starting from the innermost circle.
    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        [self.delegate smallestCircleReceivedTouch:touch withEvent:event];
    } else if (distance < MIDDLE_CIRCLE_RADIUS) {
        [self.delegate middleCircleReceivedTouch:touch withEvent:event];
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        [self.delegate biggestCircleReceivedTouch:touch withEvent:event];
    } else {
        return;
    }
}
The delegate is the viewController and the circles are custom UIControls. They handle the touch like this:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchPoint = [touch locationInView:self];
    {....}
    return YES;
}
These work fine in themselves, but I'm not sure how I should connect the delegate methods to the touch handling of each custom UIControl. Should I call their -beginTrackingWithTouch:withEvent: from the viewController, or should I make them implement the protocol of the customView? Or is there some other way to handle this properly?
Even though I have not tried it, it should not be necessary to do your own calculation; -hitTest:withEvent: should work fine for you.
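For example, a sketch of overriding -hitTest:withEvent: in the container view so UIKit delivers the touch straight to the right circle control. Here smallestCircle, middleCircle, and biggestCircle are hypothetical outlets to the three UIControls, and the radius constants are the ones from the question:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Same distance-from-center calculation as in the question.
    CGFloat xDistance = point.x - BIGGEST_CIRCLE_RADIUS;
    CGFloat yDistance = point.y - BIGGEST_CIRCLE_RADIUS;
    CGFloat distance = sqrtf(xDistance * xDistance + yDistance * yDistance);

    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        return self.smallestCircle;
    } else if (distance <= MIDDLE_CIRCLE_RADIUS) {
        return self.middleCircle;
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        return self.biggestCircle;
    }
    return nil; // outside all three circles: let the touch fall through
}

Because the control itself is returned from hit-testing, UIKit sends -beginTrackingWithTouch:withEvent: to it directly, so no delegate forwarding (and no re-exposing of that method) is needed.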

Trying to drag a UIImageView up and down a UITableView

I have a UITableView that has a UIImageView which traverses it one row at a time at the click of a button (up/down). What I would like to do now is allow the user to drag the UIImageView up or down the table ONLY (i.e. no sideways movement). If majority of the UIImageView is over a particular cell, then when the user lets go of their finger, then I want the UIImageView to link to that row. Here is an image of the UITableView, with the UIImageView:
The scroll bar is the UIImageView that needs to move up or down. I realize that I am supposed to implement the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // We only support single touches, so anyObject retrieves just that touch from touches.
    UITouch *touch = [touches anyObject];
    if ([touch view] != _imageView) {
        return;
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == _imageView) {
        return;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Here is where I guess I need to determine which row contains the majority of the scroll bar.
    // This would only measure the y coordinate, not the x, since it will only be moving up or down.
    return;
}
However, I am not sure HOW to achieve this functionality. I have tried to find similar examples online, and I have looked at the MoveMe example code from Apple, but I am still stuck. Please also note that my scroll bar is NOT the exact same size as a row in the table; it is a bit longer, though with the same height.
Thanks in advance to all who reply
Try adding a UIPanGestureRecognizer to the UIImageView. Start by getting the image view's current location, then use the translationInView method to determine where to move the image view.
From Apple's documentation:
If you want to adjust a view's location to keep it under the user's
finger, request the translation in that view's superview's coordinate
system... Apply the translation value to the state of the view when the gesture is first recognized—do not concatenate the value each time the handler is called.
Here's the basic code to add the gesture recognizer:
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panView:)];
[imageView addGestureRecognizer:panGesture];
Then, do the math to determine where to move the view.
- (void)panView:(UIPanGestureRecognizer *)sender
{
    // Per the documentation quoted above, ask for the translation in the
    // dragged view's superview's coordinate system.
    CGPoint translation = [sender translationInView:sender.view.superview];
    // Your code here - change the frame of the image view, and then animate
    // it to the closest cell when panning finishes
}
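For completeness, a sketch of what that handler might look like, under stated assumptions: startingCenter is a hypothetical CGPoint instance variable, self.tableView is assumed to be the table the image view sits over, and the snap-to-row step is only one way to do it:

- (void)panView:(UIPanGestureRecognizer *)sender
{
    UIView *imageView = sender.view;
    CGPoint translation = [sender translationInView:imageView.superview];

    if (sender.state == UIGestureRecognizerStateBegan) {
        // Remember where the image view started (hypothetical ivar).
        startingCenter = imageView.center;
    } else if (sender.state == UIGestureRecognizerStateChanged) {
        // Apply the translation to the starting position (do not accumulate),
        // and only allow vertical movement.
        imageView.center = CGPointMake(startingCenter.x,
                                       startingCenter.y + translation.y);
    } else if (sender.state == UIGestureRecognizerStateEnded) {
        // Find the row under the image view's center and animate into place.
        CGPoint centerInTable = [imageView.superview convertPoint:imageView.center
                                                            toView:self.tableView];
        NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:centerInTable];
        if (indexPath) {
            CGRect rowRect = [self.tableView rectForRowAtIndexPath:indexPath];
            CGPoint target = [self.tableView convertPoint:CGPointMake(CGRectGetMidX(rowRect),
                                                                      CGRectGetMidY(rowRect))
                                                   toView:imageView.superview];
            [UIView animateWithDuration:0.2 animations:^{
                imageView.center = CGPointMake(startingCenter.x, target.y);
            }];
        }
    }
}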

Get the visible UIButton that was touched while animating when 2 or more views overlap?

I am programmatically generating several UIButtons and then animating them with a block animation. I am able to determine which button was touched by implementing the code in this answer (demonstrated below).
My issue now is that the images can overlap, so when there is more than 1 view at the given touch location, my code in touchesBegan pulls out the wrong button (i.e., gets the image underneath the visible button that I'm touching).
I wanted to use [touch view] to compare to the UIButtons on screen:
if (myButton==[touch view]) { ...
But that comparison always fails.
My touchesBegan:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    for (UIButton *brain in activeBrains) {
        // Works, but only when buttons do not overlap
        if ([brain.layer.presentationLayer hitTest:touchLocation]) {
            [self brainExplodes:brain];
            break;
        }
        /* Comparison always fails
        if (brain == [touch view]) {
            [self brainExplodes:brain];
            break;
        }
        */
    }
}
So my question is how can I determine which of the overlapping images is above the other(s)?
I made a few assumptions in my code here, but essentially you need to get a list of all the buttons that have been touched and then find the one 'on top'. The one on top should have the highest index among the buttons in the array of subviews.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    NSMutableArray *brainsTouched = [[NSMutableArray alloc] init];
    for (UIButton *brain in activeBrains) {
        // Collect every button whose presentation layer is under the touch.
        if ([brain.layer.presentationLayer hitTest:touchLocation]) {
            [brainsTouched addObject:brain];
        }
    }
    NSInteger currentIndex;
    NSInteger viewDepth = -1;
    UIButton *brainOnTop = nil;
    for (UIButton *brain in brainsTouched) {
        currentIndex = [self.view.subviews indexOfObject:brain];
        if (viewDepth < currentIndex) {
            brainOnTop = brain;
            viewDepth = currentIndex;
        }
    }
    if (brainOnTop) {
        [self brainExplodes:brainOnTop];
    }
}
Also, I typed this in the edit window so please excuse typos.
The UIView class contains a tag property that you can use to tag individual view objects with an integer value. You can use tags to uniquely identify views inside your view hierarchy and to perform searches for those views at runtime. (Tag-based searches are faster than iterating the view hierarchy yourself.) The default value for the tag property is 0.
To search for a tagged view, use the viewWithTag: method of UIView. This method performs a depth-first search of the receiver and its subviews. It does not search superviews or other parts of the view hierarchy. Thus, calling this method from the root view of a hierarchy searches all views in the hierarchy but calling it from a specific subview searches only a subset of views.
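If you go the tag route, a small illustration of the mechanism; the 100 offset is arbitrary, and reusing the activeBrains array from the question is just for the example:

// Assign a unique tag to each button when it is created (100 + i is arbitrary).
for (NSUInteger i = 0; i < activeBrains.count; i++) {
    UIButton *brain = activeBrains[i];
    brain.tag = 100 + i;
}

// Later, look a button up by its tag instead of walking the hierarchy yourself.
UIButton *thirdBrain = (UIButton *)[self.view viewWithTag:102];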
Thanks @Aaron for your help in coming to a good solution. I did refactor your answer for my situation, which gains an unnoticeable performance improvement (wee) but, more importantly I think, means less reading if I have to refactor in the future.
It's pretty obvious in retrospect, I suppose, but of course the activeBrains array reflects the order of the subviews (since each new brain is added to the array right after it's added to the superview). So by simply looping backwards through the array, the proper brain explodes.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    for (NSInteger i = activeBrains.count - 1; i >= 0; i--) {
        UIButton *brain = [activeBrains objectAtIndex:i];
        if ([brain.layer.presentationLayer hitTest:touchLocation]) {
            [self explodeBrain:brain];
            break;
        }
    }
}

UIView bringSubviewToFront: does *not* bring view to front

I am implementing a simple iOS solitaire game that allows the user to drag the cards around in the usual way. The cards are represented by the UIView subclass CardView. All the card views are siblings and are subviews of SolitaireView. The following snippet tries to "bring a card to the front" so that it is above all the other views as it is being dragged:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.view.tag == CARD_TAG) {
        CardView *cardView = (CardView *)touch.view;
        ...
        [self bringSubviewToFront:cardView];
        ...
    }
}
Unfortunately, the card's z-order remains unchanged during the drag. In the images below, I am dragging the King. Notice how it is correctly on top of the Nine in the left image, but is incorrectly under the Two (under the entire stack, actually) in the right image:
I also tried altering the layer.zPosition property, to no avail.
How can I bring the card view to the front during the drag? I am mystified.
Confirmed. bringSubviewToFront: causes layoutSubviews to be invoked. Since my version of layoutSubviews sets the z-order on all the views, this was undoing the z-order I was setting in the touchesBegan:withEvent: code above. Apple should mention this side effect in the bringSubviewToFront: documentation.
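One way to keep layout from fighting the drag, sketched under loud assumptions: draggedCard is a hypothetical property set in touchesBegan:withEvent: and cleared in touchesEnded:/touchesCancelled:, and the z-ordering loop stands in for whatever your layoutSubviews actually does:

- (void)layoutSubviews {
    [super layoutSubviews];
    NSUInteger z = 0;
    for (UIView *subview in self.subviews) {
        if (subview == self.draggedCard) {
            continue; // exempt the card being dragged from the normal ordering
        }
        subview.layer.zPosition = z++;
    }
    if (self.draggedCard) {
        self.draggedCard.layer.zPosition = z; // keep it above everything else
    }
}

The poster's own fix, below, sidesteps the problem by moving to CALayer z-positions entirely.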
Instead of using a UIView subclass, I created a CALayer subclass named CardLayer. I handle the touch in my KlondikeView subclass as listed below. topZPosition is an instance var that tracks the highest zPosition of all cards. Note that modifying the zPosition is usually animated -- I turn this off in the code below:
-(void)touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    CGPoint hitTestPoint = [self.layer convertPoint:touchPoint
                                            toLayer:self.layer.superlayer];
    CALayer *layer = [self.layer hitTest:hitTestPoint];
    if (layer == nil) return;
    if ([layer.name isEqual:@"card"]) {
        CardLayer *cardLayer = (CardLayer *)layer;
        Card *card = cardLayer.card;
        if ([self.solitaire isCardFaceUp:card]) {
            //...
            [CATransaction begin]; // disable animation of z change
            [CATransaction setValue:(id)kCFBooleanTrue
                             forKey:kCATransactionDisableActions];
            cardLayer.zPosition = topZPosition++; // bring to highest z
            // ... if card fan, bring whole fan to top
            [CATransaction commit];
            //...
        }
        // ...
    }
}
