Suppose, on my window I have 3 views: the main one in the background and 2 up front.
Each view up front contains some content that I'd like to be moved as part of the view upon touch. By "moved" I mean "rearrange positions relative to another view". Upon touch, I'd like to "pick up the view with all of its content and place it in the position currently occupied by another view".
Where would you get started on something like this?
Something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *touchedView = [[touches anyObject] view];
    // center is expressed in the superview's coordinate system, so take the location there
    CGPoint location = [[touches anyObject] locationInView:touchedView.superview];
    touchedView.center = location;
}
See also these methods in UIResponder.h:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
First you need to create a GestureRecognizer. Something along the lines of:
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTap:)];
doubleTap.numberOfTapsRequired = 2;
[self addGestureRecognizer:doubleTap];
and add it to whatever view you want (or all three). From the sounds of it your main background view makes the most sense. Then create the doubleTap method, which will make one of your views move to where the other is:
-(void)doubleTap:(id)sender {
    view1.frame = view2.frame;
    [self bringSubviewToFront:view1];
}
I would also make sure all your subview content has its autoresizingMask set according to how you want the subviews to behave.
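If the goal is an actual swap, where each view takes the other's place, a minimal variant of that method could look like this (a sketch assuming view1 and view2 are the two foreground views):

-(void)doubleTap:(id)sender {
    // Swap the two frames so each view takes the other's position.
    CGRect firstFrame = view1.frame;
    view1.frame = view2.frame;
    view2.frame = firstFrame;
    [self bringSubviewToFront:view1];
}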
Related
I use the following code to move a UIImageView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    paddle.center = CGPointMake(location.x, paddle.center.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
It works great, moving from left to right. But I can't figure out how to keep it from running over the sides of the screen: I want it to move, but not cross the right and left edges, staying, say, 10 pixels from the left and 10 from the right.
First of all, you are the one saying this:
paddle.center = CGPointMake(location.x, paddle.center.y);
So where paddle is, is entirely up to you. If you don't want it at a certain x value, don't set it to that x value.
Second, never never call touchesBegan from within touchesMoved.
Third, you'd be much happier and safer using a UIPanGestureRecognizer. Situations like this are exactly what it's for.
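For the first point, a minimal sketch of the clamp, assuming a 10-point margin on each side and that the handler lives in a view controller whose view contains the paddle:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];

    // Keep the paddle's edges at least 10 points away from either side.
    CGFloat halfWidth = paddle.bounds.size.width / 2.0;
    CGFloat minX = 10.0 + halfWidth;
    CGFloat maxX = self.view.bounds.size.width - 10.0 - halfWidth;
    CGFloat clampedX = MAX(minX, MIN(maxX, location.x));

    paddle.center = CGPointMake(clampedX, paddle.center.y);
}

A UIPanGestureRecognizer handler would apply exactly the same clamp to the x value it computes.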
I'm having a small issue. I'd like to receive all touches on the screen and, for each one, spawn a new video. The problem is that once a video is placed, it intercepts the touch points. I tried various values for locationInView: but without any luck so far. Am I looking in the right place?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    [self.canvas addMovie:player];
}
@end
Try setting the userInteractionEnabled property of each video screen (assuming it is held in some sort of UIView) to NO - that way, touch events will pass through it and continue to be received by your handler.
Yes, you're looking in the right place, and Chris has it right about user interaction. You should try:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pointOnScreen = [[touches anyObject] locationInView:self.view];
    C4Movie *player = [C4Movie movieNamed:@"inception.mov"];
    player.shouldAutoplay = YES;
    player.loops = YES;
    player.center = pointOnScreen;
    player.userInteractionEnabled = NO;
    [self.canvas addMovie:player];
}
However, you're going to run into an issue with adding videos. Unfortunately, iOS and the hardware only let you have 4 video pipelines running at one time, so you'll hit that limit pretty quickly.
If you want to add things to the screen as you touch and drag your finger, you could also put the above code inside the following method:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    //work your magic here
}
Is there a way to cancel touch events in certain regions of a view? I have a custom UIView, and I only want to process touch events if they are, say, 100 pixels from the edges of the screen.
As Justin said, add a custom UIView in Interface Builder (or programmatically) and add it to the view. Let's call that view touchArea. Then, in your ViewController.m file, implement the
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
methods (depending on what you're trying to do), and in these do:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
if (CGRectContainsPoint(touchArea.frame, location)) {
    //code to execute
}
Actually, I think even a CGRect stored as an instance variable describing the area on the view can work, but the above is how I achieved it.
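A sketch of that variant, assuming "100 pixels from the edges" means the touch must be at least 100 points in from each edge (the value from the question), and computing the rect inline instead of storing it:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    // The active region is the view's bounds inset 100 points from every edge.
    CGRect activeArea = CGRectInset(self.view.bounds, 100.0, 100.0);
    if (!CGRectContainsPoint(activeArea, location)) {
        return; // too close to an edge, ignore the touch
    }
    //code to execute
}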
Assuming you're using Interface Builder, simply take another UIView, resize it, and place it where you want touch events to occur, then make that view touch-active. Programmatically, I personally do not know how to do it.
I have a UIImageView that is added as a subview. It shows up when a button is pressed.
When someone taps outside of the UIImageView in any part of the application, I want the UIImageView to go away.
@interface SomeMasterViewController : UITableViewController <clip>
<clip>
@property (strong, nonatomic) UIImageView *someImageView;
There are some hints in Stack Overflow and Apple's documentation that sound like what I need.
Apple's: Gesture Recognizers
Apple's: UIView hitTest:withEvent:
Apple's: UITouch Class Reference
Stack Overflow: Listening to UITouch event along with UIGestureRecognizer
(not likely needed, but..) CGRectContainsPoint, as mentioned in the post titled: Comparing a UITouch location to UIImageView rectangle
However, I want to check my approach here. It's my understanding that the code needs to:
1. Register a UITapGestureRecognizer to get all touch events that can happen in the application.
2. Set the UITapGestureRecognizer's cancelsTouchesInView, delaysTouchesBegan, and delaysTouchesEnded to NO.
3. Compare those touch events with someImageView (how? Using UIView hitTest:withEvent:?)
Update: I am registering a UITapGestureRecognizer with the main UIWindow.
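For reference, that registration looks roughly like this (a sketch, run once after the window exists):

UIWindow *mainWindow = [[[UIApplication sharedApplication] delegate] window];
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
// Let touches continue to reach the views underneath the recognizer.
tap.cancelsTouchesInView = NO;
tap.delaysTouchesBegan = NO;
tap.delaysTouchesEnded = NO;
[mainWindow addGestureRecognizer:tap];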
Final Unsolved Part
I have a handleTap:(UITapGestureRecognizer *) that the UITapGestureRecognizer will call. How can I take the UITapGestureRecognizer that is given and see if the tap falls outside of the UIImageView? Recognizer's locationInView looks promising, but I do not get the results I expect. I expect to see a certain UIImageView when I click on it and not see the UIImageView when I click in another spot. I get the feeling that the locationInView method is being used wrong.
Here is my call to the locationInView method:
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state != UIGestureRecognizerStateEnded) {
        NSLog(@"handleTap NOT given UIGestureRecognizerStateEnded so nothing more to do");
        return;
    }

    UIWindow *mainWindow = [[[UIApplication sharedApplication] delegate] window];
    CGPoint point = [gestureRecognizer locationInView:mainWindow];
    NSLog(@"point x,y computed as the location in a given view is %f %f", point.x, point.y);

    UIView *touchedView = [mainWindow hitTest:point withEvent:nil];
    NSLog(@"touchedView = %@", touchedView);
}
I get the following output:
<clip>point x,y computed as the location in a given view is 0.000000 0.000000
<clip>touchedView = <UIWindow: 0x8c4e530; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <UIWindowLayer: 0x8c4c940>>
I think you can just say [event touchesForView:<image view>]. If that returns an empty set, dismiss the image view. Do this in the table view controller's touchesBegan:withEvent:, and be sure to call [super touchesBegan:touches withEvent:event] or your table view will completely stop working. You probably don't even need to implement touchesEnded:/Cancelled:... or touchesMoved:....
UITapGestureRecognizer definitely seems like overkill in this case.
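A minimal sketch of that approach in the table view controller, assuming the image view is held in the someImageView property shown above:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If the image view is on screen and none of the touches landed on it, dismiss it.
    if (self.someImageView.superview != nil &&
        [[event touchesForView:self.someImageView] count] == 0) {
        [self.someImageView removeFromSuperview];
    }
    // Always forward to super so the table view keeps working.
    [super touchesBegan:touches withEvent:event];
}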
You can use touch functions to do that:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
When the user touches the screen, your touchesBegan function is called first.
In touchesBegan:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self];
}
Now you have the point the user touched. Then you must check whether that point is inside your UIImageView or not.
But if you can give tags to your UIImageViews, that makes it pretty easy:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (yourImageView.tag == [touch view].tag) {
        [[self.view viewWithTag:yourImageView.tag] removeFromSuperview];
        [yourImageView release];
    }
}
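If you'd rather not rely on tags, a sketch of the point check mentioned above, dismissing the image view only when the tap lands outside it (assuming it is reachable as yourImageView):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    // Remove the image view only when the touch falls outside its frame.
    if (!CGRectContainsPoint(yourImageView.frame, pt)) {
        [yourImageView removeFromSuperview];
    }
}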
I want to handle, in the superview, a touch event when there are two touches that land on different subviews.
Can I do it by adding a UITapGestureRecognizer with numberOfTouchesRequired = 2 and target self to the subviews?
Or do I need to do something more complicated?
You could try something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = touches.anyObject;
    CGPoint referencePoint = [touch locationInView:self.view];
    if ([touch tapCount] == 2) {
        //Test touch coordinates to see if there's one in each view
    }
}
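For two simultaneous touches (rather than a double tap), the test might look roughly like this, assuming the two subviews are called subviewA and subviewB (hypothetical names) and are direct children of self.view:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSArray *allTouches = [[event allTouches] allObjects];
    if ([allTouches count] == 2) {
        CGPoint p1 = [allTouches[0] locationInView:self.view];
        CGPoint p2 = [allTouches[1] locationInView:self.view];
        // One touch must land in each subview, in either order.
        BOOL oneInEach =
            (CGRectContainsPoint(subviewA.frame, p1) && CGRectContainsPoint(subviewB.frame, p2)) ||
            (CGRectContainsPoint(subviewA.frame, p2) && CGRectContainsPoint(subviewB.frame, p1));
        if (oneInEach) {
            // Handle the two-finger touch that spans both subviews here.
        }
    }
}

Note that the superview only receives these touches if the subviews don't swallow them, for example if the subviews don't implement the touch methods themselves or have userInteractionEnabled set to NO.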