I want to add multiple UIImageView objects at runtime. I just read that this is possible by keeping them in an NSMutableArray.
But I also want to move all the UIImageViews. Is it possible to track which UIImageView I touched?
Any help will be appreciated.
Thanks
You may need to rephrase your question, but I'll see if I can help you out. To find out which view is being touched, you can override the touchesBegan:withEvent: method in your view controller.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject]; // gets the touch object
    // once you have the touch, touch.view is the view being touched
    [touch.view thisIsAMethod]; // placeholder: call whatever method you need on that view
}
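Tying this back to your question, here is a minimal sketch of identifying and moving the touched image view, assuming an imageViews NSMutableArray property holds the image views you added at runtime (the property name is illustrative):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    for (UIImageView *imageView in self.imageViews) {
        // note: each image view needs userInteractionEnabled = YES
        // (UIImageView defaults to NO), otherwise it is never the
        // hit-test view and touch.view will be its superview instead
        if (touch.view == imageView) {
            imageView.center = [touch locationInView:self.view]; // move it under the finger
            break;
        }
    }
}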
Also, if you want to move a lot of UIImageViews at once, you can make them all subviews of one UIView by calling
[oneBigUIView addSubview:oneUIImageView];
for every UIImageView. Then you can change the position of the UIView to move them all at once, since the coordinates of each UIImageView are in relation to their superview.
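A minimal sketch of that container approach (names like containerView and imageViews are illustrative):

UIView *containerView = [[UIView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:containerView];

for (UIImageView *imageView in self.imageViews) {
    [containerView addSubview:imageView]; // each frame is now relative to containerView
}

// moving the container moves every image view at once
containerView.center = CGPointMake(containerView.center.x + 50.0, containerView.center.y);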
I'm not able to capture touch events from a UIView that has a UIButton subview; the UIButton probably prevents the touchesBegan and touchesEnded events from reaching the outer view.
How can I handle those events? I tried setting the button's userInteractionEnabled to NO. The touch events then reached the outer view, but the UIButton no longer transitions to its highlighted state.
Is there a way to accomplish this without losing the natural behavior of the button?
(P.S. I do not want to use gesture recognizers etc. because I need the actual timestamp of the event.)
If I have understood your question, you can try this approach:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    if (CGRectContainsPoint(yourbutton.frame, [touch locationInView:self])) {
        // the touch is beginning on the button; record event.timestamp or do other work here
    }
}
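Another option, not part of the approach above but worth sketching: subclass UIButton and record the timestamp before forwarding to super, so the button keeps its natural highlight behavior (TimestampButton is an illustrative name):

@interface TimestampButton : UIButton
@end

@implementation TimestampButton

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // capture the actual event timestamp before UIButton consumes the touch
    NSTimeInterval timestamp = event.timestamp;
    NSLog(@"touch began at %f", timestamp);
    [super touchesBegan:touches withEvent:event]; // preserves highlighting etc.
}

@end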
I want something where the user taps on the screen and the keyboard and a text field show up. Then when they are done, it disappears and the text is saved. But I don't want the box to be around it; I just want the words the user typed to appear. I am somewhat of a noob to Xcode, and I have been trying to figure this out for many days now. If you guys have any input that would be great!
Thanks in advance
First you have to create a UITextField or a UILabel.
In Snapchat, the keyboard hides whenever you tap on the screen. Objective-C provides a method to detect that the screen was touched:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
With resignFirstResponder you hide the keyboard from your text field.
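A minimal sketch of that, assuming the view controller keeps a textField reference (the name is illustrative):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // dismiss the keyboard when the user taps anywhere on the screen
    [self.textField resignFirstResponder];
}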
You have to implement the -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event method. When it fires:
UITouch *touch = [touches anyObject];
CGPoint position = [touch locationInView:self.view];
Then add a UITextField at that point. On return of the UITextField, create a UILabel sized to the text, add it to the view at the same position where the UITextField was, and remove the UITextField from its superview.
Or, for the touch, you can also use a UITapGestureRecognizer.
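A minimal sketch of the text-field-to-label flow described above, assuming the view controller adopts UITextFieldDelegate (frame sizes are illustrative):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint position = [touch locationInView:self.view];

    // add a text field where the user tapped
    UITextField *textField = [[UITextField alloc] initWithFrame:CGRectMake(position.x, position.y, 200.0, 30.0)];
    textField.delegate = self;
    [self.view addSubview:textField];
    [textField becomeFirstResponder];
}

- (BOOL)textFieldShouldReturn:(UITextField *)textField {
    // replace the text field with a borderless label showing the typed text
    UILabel *label = [[UILabel alloc] initWithFrame:textField.frame];
    label.text = textField.text;
    [label sizeToFit];
    [self.view addSubview:label];

    [textField resignFirstResponder];
    [textField removeFromSuperview];
    return YES;
}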
I've read answers to many questions that dealt with enabling / disabling touch events, but nothing has worked for me, so I'm asking one of my own.
I have a UIImageView object (spot):
// in my view controller header file:
@property (nonatomic, strong) IBOutlet UIImageView *spot;
Then I have code relating to this object:
// in my view controller .m file:
@synthesize spot;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // handle when that spot is touched ...
}
And that works fine. For example, I can change the image displayed at the spot when the spot is clicked.
First I wanted to see how to disable touch events on the spot so I tried:
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];
And that works fine. At certain points, depending on what I want to do, I was able to disable all touch events.
Then I added a button to this view controller and I want the button to be clickable ALWAYS, to have a touch event ALWAYS enabled for the button.
So now my approach to disabling touch events won't work because it's too heavy-handed. It wipes out all touch events anywhere in that view.
I want to disable ONLY the touch event on that spot. I tried:
spot.userInteractionEnabled = NO;
But that didn't work. The spot was still clickable. I also tried:
[spot setUserInteractionEnabled:NO];
Also didn't work. I'm quite confused as to why those don't work. My question is:
How can I disable touch events on just this one spot, this one UIImageView object?
EDIT: To address the question asked below, in the Interface Builder, in my .xib I have linked the UIImageView object to the property set in my header file. That's its Referencing Outlet.
Why do you want to disable touch for your spot? Note that because your touchesBegan:withEvent: is implemented on the view controller, touches landing on the image view fall through to the controller's view and still reach your handler; in fact UIImageView has userInteractionEnabled set to NO by default, which is why flipping that flag changed nothing. You can simply skip handling when the touch lands on the spot:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    if (CGRectContainsPoint(spot.frame, touchLocation)) {
        return; // ignore touches on the spot
    }
    if (CGRectContainsPoint(button.frame, touchLocation)) {
        // do something
    }
}
I have 3 UIViews of the same size stacked on top of each other. The topmost is transparent and only used for detecting touches. The type of touch detected will determine which of the other two underlying views I want to receive the touch events. Once the topmost view is finished with the touch, I need to forward the touch events to the correct underlying view. How can I do that?
EDIT - I am adding my touch detection code. This is within MainViewController, whose view contains all 3 subviews.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        if (touch.view == self.touchOverlay) {
            CGPoint touchLocation = [touch locationInView:touch.view];
            // do a bunch of math to determine which view should get the touches
            if (viewAshouldGetTouches) {
                // forward to viewA
            }
            if (viewBshouldGetTouches) {
                // forward to viewB
            }
        }
    }
}
Make your two subviews setUserInteractionEnabled:NO and handle all touches in the parent. Then, depending on the touch type, send the correct view a programmatic touch event. That way you don't need the clear view on top: you coordinate touch events from the bottom up instead of going top->bottom->middle.
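A minimal sketch of that forwarding, in the parent view's subclass. The viewA/viewB properties and the split condition are illustrative, and forwarding responder messages by hand like this is a workaround, not an official touch-synthesis API:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];

    // decide which child should react (placeholder condition)
    BOOL viewAShouldGetTouches = (location.x < CGRectGetMidX(self.bounds));

    // forward the responder messages by hand; the children have
    // userInteractionEnabled = NO, so they never receive touches directly
    if (viewAShouldGetTouches) {
        [self.viewA touchesMoved:touches withEvent:event];
    } else {
        [self.viewB touchesMoved:touches withEvent:event];
    }
}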
You'll have to do this by creating a UIView subclass for your top view and overriding the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // the UIView will be "transparent" for touch events if we return NO
    return (point.y < MIDDLE_Y1 || point.y > MIDDLE_Y2);
}
I have a subclass of UIView on top of a UITableView. I am using the UITableView to display some data and, at the same time, I would like to overlay an animation that follows the finger (for instance, leaving a trail).
If I get it right, I need the touch events to be handled both by the UIView subclass and the UITableView. How can I do that?
Is it possible to have, e.g., touchesMoved triggered first on the UIView subclass and then on the UITableView?
Thank you so much for any help.
The way I solved this is not that clean, but it works. Please let me know if there's a better way to do it.
I have overridden hitTest: for my custom UIView so that it directs touches to the UITableView underneath. Then, in the UITableView, I handle the gestures through touchesBegan, touchesMoved, etc. There I also call touchesBegan on the UIView.
This way, touches are handled by both views.
The reason I am not doing it the other way around (having the UIView's touchesBegan call the UITableView's touchesBegan) is that the gesture recognizers on the UITableView would not work.
UIView subclass's hitTest:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // tView is the UITableView subclass instance
    CGPoint tViewHit = [tView convertPoint:point fromView:self];
    if ([tView pointInside:tViewHit withEvent:event]) {
        return tView;
    }
    return [super hitTest:point withEvent:event];
}
UITableView subclass's touchesBegan
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // ....
    // view is the UIView subclass instance
    [view touchesBegan:touches withEvent:event];
}
No, you can't do it implicitly. The Event Delivery chapter says:
The window object uses hit-testing and the responder chain to find the view to receive the touch event. In hit-testing, a window calls hitTest:withEvent: on the top-most view of the view hierarchy; this method proceeds by recursively calling pointInside:withEvent: on each view in the view hierarchy that returns YES, proceeding down the hierarchy until it finds the subview within whose bounds the touch took place. That view becomes the hit-test view.
So, once the window finds the touched view, hit-testing stops there; only one view can handle the touches at a given moment.
But if you need to handle the event for the UITableView, then handle it in the UIView! You can convert the touched point into the required coordinate space with the convertPoint:toView: / convertPoint:fromView: (and corresponding convertRect:) methods, add a subview to the UITableView and move it depending on the coordinate, and a lot of other things.
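A minimal sketch of that conversion, in the overlay UIView's subclass, assuming it keeps a tableView reference (the property name is illustrative):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // convert from the overlay's coordinate space into the table view's
    CGPoint tablePoint = [self convertPoint:point toView:self.tableView];
    NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:tablePoint];
    NSLog(@"touch over row %@", indexPath);
}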
UITableView relays unhandled touch events up the responder chain, so they can reach your UIView (Google "responder chain"; see also the UITableView documentation).
So you can handle your touch events in the UIView alone. In your UIView:
touchesBegan - do the initialization stuff
touchesMoved - draw the trail on the UIView (use timers or a delayed response to fade out old points so that it looks like a trail; see the sketch below)
touchesEnded - do the remaining stuff
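A minimal sketch of the touchesMoved step, in the UIView subclass, assuming the trail is drawn with small fading CALayer dots (sizes and durations are illustrative):

#import <QuartzCore/QuartzCore.h>

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    // drop a small dot layer at the finger's position
    CALayer *dot = [CALayer layer];
    dot.frame = CGRectMake(point.x - 4.0, point.y - 4.0, 8.0, 8.0);
    dot.cornerRadius = 4.0;
    dot.backgroundColor = [UIColor whiteColor].CGColor;
    [self.layer addSublayer:dot];

    // fade the dot out so old points disappear like a trail
    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = @1.0;
    fade.toValue = @0.0;
    fade.duration = 0.5;
    fade.removedOnCompletion = NO;
    fade.fillMode = kCAFillModeForwards;
    [dot addAnimation:fade forKey:@"fade"];

    // remove the layer once the fade has finished
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [dot removeFromSuperlayer];
    });
}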
Hope this helps.