I have a table view and, outside it, a separate view. I am trying to drag and drop elements from the table view into this drop view. The layout looks something like this.
There is a table view as shown in the figure. What I am trying to accomplish is: when the user long-presses a row in the table view, I need to drag and drop the selected item into the droppable area. On long press I create a snapshot of the row and add it as a subview; now I am trying to drag that subview into the drop area. I am not able to do this. Can anyone help me out with this issue?
var stateDropBoard = CGRectContainsPoint(ingBoardDropView.frame, touchPoint)
if stateDropBoard {
    print("DROPPED")
}
I am not able to get it working. Is there any way to accomplish this?
In your parent view, override the touchesEnded:withEvent: method. Inside it, run a test to check whether the user dropped the item on top of your green view.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self.view];
    CGRect fingerRect = CGRectMake(location.x - 5, location.y - 5, 10, 10);
    for (UIView *view in self.view.subviews) {
        CGRect subviewFrame = view.frame;
        if (CGRectIntersectsRect(fingerRect, subviewFrame)) {
            // this is your touched view
        }
    }
}
Here is the Swift version you asked for, with a little optimisation.
override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    let touch = touches.first!
    let location = touch.locationInView(self.view)
    let hitTest = self.targetView.hitTest(location, withEvent: event)
    if hitTest != nil {
        print("User finished touch inside your target view!")
    } else {
        print("User finished touch outside your target view")
    }
}
Note: Make sure you have the correct constraints in place for your target view.
I guess you can take it from here. Hope this helps.
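To tie the pieces together, here is a hedged Swift sketch of the whole long-press drag flow the question describes. The `tableView` and `dropView` outlets and the gesture wiring are assumptions, not code from the question:

```swift
import UIKit

class DragDropViewController: UIViewController {
    // Hypothetical outlets mirroring the question's layout.
    @IBOutlet var tableView: UITableView!
    @IBOutlet var dropView: UIView!

    private var snapshot: UIView?

    override func viewDidLoad() {
        super.viewDidLoad()
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handleLongPress(_:)))
        tableView.addGestureRecognizer(press)
    }

    @objc func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        let point = gesture.location(in: view)
        switch gesture.state {
        case .began:
            // Snapshot the pressed row and float it above everything else.
            if let indexPath = tableView.indexPathForRow(at: gesture.location(in: tableView)),
               let cell = tableView.cellForRow(at: indexPath),
               let snap = cell.snapshotView(afterScreenUpdates: false) {
                snap.center = point
                view.addSubview(snap)
                snapshot = snap
            }
        case .changed:
            snapshot?.center = point  // drag the snapshot with the finger
        case .ended, .cancelled:
            if dropView.frame.contains(point) {
                print("DROPPED")
            }
            snapshot?.removeFromSuperview()
            snapshot = nil
        default:
            break
        }
    }
}
```

This assumes `dropView` and the floating snapshot share the controller's root view as their coordinate system; if they do not, convert `point` with `view.convert(_:to:)` first.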
Related
I want to scroll the image inside my UIImageView when my finger is inside the UIImageView's area and moving. I'm trying to do this in Objective-C. I got the image moving, but it behaves strangely and doesn't work right. Can you please show me how to do this?
This is what I'm doing:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:self];
        for (id sublayer in view_shape.sublayers) {
            if (sublayer == imageView.layer) {
                if (CGRectContainsPoint(imageView.frame, touchLocation)) {
                    imageView.layer.contentsRect = CGRectMake(touchLocation.x / 1000,
                                                              touchLocation.y / 1000,
                                                              imageView.layer.contentsRect.size.width,
                                                              imageView.layer.contentsRect.size.height);
                }
            }
        }
    }
}
Why not use a scroll view and just add the image view to it?
You can't do this with just an instance of UIImageView. An image view just holds on to an image and draws it.
Either add it inside a scroll view if you want a simple scrolling/paging interface,
(or)
add it to a UIView, attach a UIPanGestureRecognizer to that view, and handle the actions you get. Based on the translation, you can set the frame of the UIImageView.
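For the first option, a minimal Swift sketch of the scroll-view approach (the image name `photo` is a placeholder):

```swift
import UIKit

class ImageScrollViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Hypothetical asset name; substitute your own image.
        let imageView = UIImageView(image: UIImage(named: "photo"))

        let scrollView = UIScrollView(frame: view.bounds)
        // The scrollable area is the full image; the scroll view
        // handles all finger tracking, bouncing, and deceleration.
        scrollView.contentSize = imageView.bounds.size
        scrollView.addSubview(imageView)
        view.addSubview(scrollView)
    }
}
```

The scroll view takes care of exactly the touch bookkeeping the question's `touchesMoved:` is trying to do by hand.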
In my app setup, I have a navigation controller with 4 ImageViews. 1 of them can be dragged around, while the other 3 are stationary at the top section of the view. Using the code below, I have it set up so that the user drags the one image view to the image view of where he wants to go. So to get to view 1, he drags the movable image view to image view 1, and so on. The issue is that with the width of the image views, it is possible for the selector view to touch two at one time, which creates a nesting view controller issue. Is there a way I can keep this from happening, short of moving the image views so far away that it is impossible for more than one to be selected at a time?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // If the touch was in the placardView, move the placardView to its location
    if ([touch view] == clock) {
        CGPoint location = [touch locationInView:self.tabBarController.view];
        clock.center = location;
        BOOL isIntersecting = CGRectIntersectsRect(clock.frame, prayer.frame);
        BOOL isIntersecting2 = CGRectIntersectsRect(clock.frame, fasting.frame);
        BOOL isIntersecting3 = CGRectIntersectsRect(clock.frame, study.frame);
        if (isIntersecting) {
            [self schedulePrayer];
            NSLog(@"prayer");
        }
        if (isIntersecting2) {
            [self scheduleFasting];
            NSLog(@"fasting");
        }
        if (isIntersecting3) {
            [self scheduleStudying];
            NSLog(@"Studying");
        }
        return;
    }
}
Why don't you just use if ... else if ... else if?
if (isIntersecting) {
    [self schedulePrayer];
    NSLog(@"prayer");
}
else if (isIntersecting2) {
    [self scheduleFasting];
    NSLog(@"fasting");
}
else if (isIntersecting3) {
    [self scheduleStudying];
    NSLog(@"Studying");
}
Then, only one will be triggered at a time.
Create another BOOL, isTouching, as an instance variable so it persists across touch events. Then inside your if (isIntersecting) block set isTouching to YES, and add !isTouching as a condition, such that:
if ([touch view] == clock && !isTouching)
You also need to set isTouching back to NO whenever the UIImageView is not on top of any of the intersecting views, and you should be good to go :)
That should be enough hints for you to solve your problem, but if you'd like more clarification let me know.
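As an illustration of those hints, here is a hedged Swift sketch of the flag-based guard. The view names mirror the question and are assumptions; the schedule calls are left as comments:

```swift
import UIKit

class DragViewController: UIViewController {
    private var isTouching = false  // remembers whether clock is already over a target

    // Hypothetical views mirroring the question.
    let clock = UIImageView()
    let prayer = UIImageView()
    let fasting = UIImageView()
    let study = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        clock.isUserInteractionEnabled = true  // UIImageView ignores touches by default
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.view == clock else { return }
        clock.center = touch.location(in: view)

        // else-if means at most one target can match per move,
        // and the flag means each target fires only on first contact.
        if clock.frame.intersects(prayer.frame) {
            if !isTouching { /* schedulePrayer() */ }
            isTouching = true
        } else if clock.frame.intersects(fasting.frame) {
            if !isTouching { /* scheduleFasting() */ }
            isTouching = true
        } else if clock.frame.intersects(study.frame) {
            if !isTouching { /* scheduleStudying() */ }
            isTouching = true
        } else {
            isTouching = false  // reset once the clock leaves all targets
        }
    }
}
```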
I have the following problem:
I have one UIImageView which I can drag by touch, and a toolbar which I want to stay near that image view. This is what I'm doing at the moment:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // motion here; self.tool is the toolbar view
    CGFloat a = self.tool.frame.size.width;
    CGFloat b = self.tool.frame.size.height;
    self.tool.frame = CGRectMake(self.frame.origin.x + self.frame.size.width/2 + 50,
                                 self.frame.origin.y + self.frame.size.height/2 + 50,
                                 a, b);
}
It works fine, but sometimes the toolbar moves off screen. Is there a simple way to detect when it is outside and move it to another point?
You can check it like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // motion here; self.tool is the toolbar view
    CGFloat a = self.tool.frame.size.width;
    CGFloat b = self.tool.frame.size.height;
    CGRect newFrame = CGRectMake(self.frame.origin.x + self.frame.size.width/2 + 50,
                                 self.frame.origin.y + self.frame.size.height/2 + 50,
                                 a, b);
    // only set the frame if it is still within the bounds of self.superview
    // (bounds, not frame: newFrame is expressed in the superview's coordinate system)
    if (CGRectContainsRect(self.superview.bounds, newFrame)) {
        self.tool.frame = newFrame;
    }
}
You should be using a UIGestureRecognizer, not touchesMoved:. And once you do, the gesture recognizer on the image view can move the toolbar view however it likes.
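A hedged Swift sketch of that gesture-recognizer approach, with the toolbar clamped so it cannot leave the screen (the view names and the 50-point offset are illustrative):

```swift
import UIKit

class DraggableImageViewController: UIViewController {
    // Hypothetical views mirroring the question.
    let imageView = UIImageView()
    let toolbar = UIToolbar()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.isUserInteractionEnabled = true
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        imageView.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        // Move the image view by the incremental translation, then reset it.
        let translation = gesture.translation(in: view)
        imageView.center.x += translation.x
        imageView.center.y += translation.y
        gesture.setTranslation(.zero, in: view)

        // Keep the toolbar near the image view, clamped to stay on screen.
        var frame = toolbar.frame
        frame.origin = CGPoint(x: imageView.frame.midX + 50, y: imageView.frame.midY + 50)
        frame.origin.x = min(max(0, frame.origin.x), view.bounds.width - frame.width)
        frame.origin.y = min(max(0, frame.origin.y), view.bounds.height - frame.height)
        toolbar.frame = frame
    }
}
```

The clamping with `min`/`max` is what replaces the "is it still inside the superview?" check: instead of refusing the move, it pins the toolbar to the nearest on-screen position.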
I'm using XCode 4.4 developing for iOS 5 on an iPad and am using the Storyboard layout when creating my custom button.
I have the touch event correctly working and logging but now I want to get the x/y coordinates of the tap on my custom button.
If possible, I'd like the coordinates to be relative to the custom button instead of relative to the entire iPad screen.
Here's my code in the .h file:
- (IBAction)getButtonClick:(id)sender;
and my code in the .m file:
- (IBAction)getButtonClick:(id)sender {
    NSLog(@"Image Clicked.");
}
Like I said, that correctly logs when I tap the image.
How can I get the coordinates of the tap?
I've tried a few different examples from the internet but they always freeze when it displays a bunch of numbers (maybe the coordinates) in the log box. I'm VERY new to iOS developing so please make it as simple as possible. Thanks!
To get the touch location you can use another variant of the button action method, myAction:forEvent: (if you create it from IB, note the "sender and event" option in the arguments field).
Then, in your action handler, you can get the touch location from the event parameter, for example:
- (IBAction)myAction:(UIButton *)sender forEvent:(UIEvent *)event {
    NSSet *touches = [event touchesForView:sender];
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:sender];
    NSLog(@"%@", NSStringFromCGPoint(touchPoint));
}
In Swift 3.0 the accepted answer works the same; only the syntax changes, as follows:
Swift 3.0:
@IBAction func buyTap(_ sender: Any, forEvent event: UIEvent) {
    let myButton = sender as! UIButton
    let touches = event.touches(for: myButton)
    let touch = touches?.first
    let touchPoint = touch?.location(in: myButton)
    print("touchPoint \(touchPoint)")
}
For overall coordinates (relative to the screen), you need to create a CGPoint containing the coordinates of your touch. To do that, you first need to get the touch itself. So start by getting the touch event, then make that point using the locationInView: method. Depending on when you want to log the touch (when the user touches down, or when they lift their finger), implement this code in touchesBegan or touchesEnded. Let's say you use touchesEnded, which passes an NSSet called "touches" containing all the touch events.
UITouch *tap = [touches anyObject];
CGPoint touchPoint = [tap locationInView:self.view];
"touchPoint" will now contain the point at which the user lifts their finger. To print out the coordinates, you just access the x and y properties of that point:
CGFloat pointX = touchPoint.x;
CGFloat pointY = touchPoint.y;
NSLog(@"Coordinates are: %f, %f", pointX, pointY);
That should output the coordinates of the touch. To have them referenced to the button you're using, you can manually subtract the button's origin from the point, or simply pass the button to locationInView: instead of self.view, which returns the point in the button's own coordinate system (as the accepted answer here does).
For more info on touches, there's a great set of tutorials here.
I have a simple (trivial) UIView parent/child hierarchy: one parent (UIView), one child (UIButton). The parent's bounds are smaller than its child's, so that a portion of the child extends beyond the bounding box of its parent.
Here's the problem: the portions of the child outside the bbox of the parent do not receive touches. Only tapping within the bbox of the parent allows the child button to receive touches.
Can someone please suggest a fix/workaround?
UPDATE
For those following this question, here is the solution I implemented as a result of @Bastian's most excellent answer:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL isInside = [super pointInside:point withEvent:event];
    // identify the button view subclass
    UIButton *b = (UIButton *)[self viewWithTag:3232];
    CGPoint inButtonSpace = [self convertPoint:point toView:b];
    BOOL isInsideButton = [b pointInside:inButtonSpace withEvent:nil];
    if (isInsideButton) {
        return isInsideButton;
    }
    return isInside;
}
The problem is the responder chain. When you touch the display, the event travels down from parents to children.
So when you touch the screen, the parent sees that the touch is outside its own bounds, and the children are not even asked.
The method that does this is hitTest:withEvent:. If you have your own UIView subclass, you can override it and return the button yourself.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
Per Apple's own documentation, the simplest and most reliable way I have found to do this is to override hitTest:withEvent: in the superview of your clipped view to look like the following:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Convert the point to the target view's coordinate system.
    // The target view isn't necessarily the immediate subview.
    CGPoint pointForTargetView = [self.targetView convertPoint:point fromView:self];
    if (CGRectContainsPoint(self.targetView.bounds, pointForTargetView)) {
        // The target view may have its own view hierarchy,
        // so call its hitTest method to return the right hit-test view.
        return [self.targetView hitTest:pointForTargetView withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}
Precondition:
You have a UIButton (named button1) inside a UIView (named container), and button1 is partially outside the container's bounds.
Problem:
The part of button1 outside the container does not respond to clicks.
Solution:
subclass your container from UIView:
class Container: UIView {
    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let closeButton = viewWithTag(10086) as! UIButton // <-- note the tag value
        if closeButton.pointInside(convertPoint(point, toView: closeButton), withEvent: event) {
            return true
        }
        return super.pointInside(point, withEvent: event)
    }
}
Don't forget to give your button1 a tag of 10086
@dugla, thank you for the question!
@Bastian and @laoyur, thanks to you for answers!
Swift 4
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    if yourChildView.point(inside: convert(point, to: yourChildView), with: event) {
        return true
    }
    return super.point(inside: point, with: event)
}
I had the same exact problem at hand. You only need to override:
-(BOOL) pointInside:(CGPoint)point withEvent:(UIEvent *)event
Here is working code for a custom UIView subclass created solely to be tappable for its out-of-bounds children.
@implementation ARQViewToRightmost

// We should not need to override this one !!!
/*
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return self;
    return nil;
}
*/

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return YES; // This answer makes hitTest continue recursively probing the view's subviews for a hit
    return NO;
}

// Check whether the point is within the bounds of the view. We've extended the bounds to the max right.
// Make sure you do not have any sibling UIViews to the right of this one, as they will never receive events.
- (BOOL)isInside:(CGPoint)point
{
    CGFloat maxWidth = CGFLOAT_MAX;
    CGRect rect = CGRectMake(0, 0, maxWidth, self.frame.size.height);
    if (CGRectContainsPoint(rect, point))
        return YES;
    return NO;
}

@end