Am I on the right path trying to drag an item from a UIScrollView and drop it over a second UIScrollView? Neither of the UIScrollViews will be altered in the end, but I'd like to have an image follow the touchesMoved position until it's released over the second UIScrollView.
I have extended the UIScrollView so I can see where a touch begins inside the UIScrollView.
extension UIScrollView {
    override open func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.next?.touchesBegan(touches, with: event)
        print("touchesBegan")
    }
}
And in my view controller, to get the touched object:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        beginningLocation = touch.location(in: self.view)
        let position = touch.location(in: itemScrollview)
        print("itemScrollview: \(position)")
        let whoHit = Int(position.x / 120)
        print("whoHit: \(whoHit) \(GameData.items[whoHit].name)")
    }
}
I'm trying to use touchesEnded to see if the user dragged up, but touchesEnded doesn't always get called. It works if the touch begins and ends over the same UIScrollView, but when the finger is dragged off, touchesEnded doesn't get called.
You cannot "directly" drag an object (view, image view, label, button, etc.) from one view to another (whether it's a UIView, a UIScrollView, or any other container).
Instead, you need to "move" the object you want to drag from the containing view to the super view (the "main" view), drag it around as a subview of the main view, and then check where it is released.
So, essentially, if you want to drag a UIImageView (see the sketch after this list):
on first touch, move the image view from its containing view to the "main" view
translate the coordinates, so they are now relative to the "main" view
track the touch / drag in main view coordinates, re-positioning the image view as you go
on release, see if the end touch is inside the "target" container view
if so, add the image view as a subview of the target view (translating the position coordinates)
if it's not inside the target frame, either add it back to its original container, or drop it into the main view, or vaporize it, or whatever you desire.
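A minimal sketch of those steps, using a UIPanGestureRecognizer attached to the image view rather than raw touches (a pan gesture keeps delivering events after the finger leaves the original scroll view). The names itemScrollview and targetScrollview are assumptions for the two scroll views, both assumed to be direct subviews of the controller's view; this is illustrative, not a drop-in implementation:
// Sketch: drag an image view between containers with a pan gesture.
// itemScrollview / targetScrollview are assumed outlet names.
@objc func handleDrag(_ pan: UIPanGestureRecognizer) {
    guard let imageView = pan.view else { return }

    switch pan.state {
    case .began:
        // 1. Re-parent into the main view, preserving the on-screen position.
        //    addSubview(_:) moves the view out of its old superview automatically.
        let frameInMainView = imageView.convert(imageView.bounds, to: view)
        view.addSubview(imageView)
        imageView.frame = frameInMainView

    case .changed:
        // 2. Track the drag in main-view coordinates.
        imageView.center = pan.location(in: view)

    case .ended, .cancelled:
        // 3. On release, check whether we are over the target container.
        if targetScrollview.frame.contains(pan.location(in: view)) {
            let centerInTarget = view.convert(imageView.center, to: targetScrollview)
            targetScrollview.addSubview(imageView)
            imageView.center = centerInTarget
        } else {
            // Not over the target: put it back, leave it in the main view, or remove it.
            imageView.removeFromSuperview()
        }

    default:
        break
    }
}
The gesture would be attached once, e.g. with imageView.isUserInteractionEnabled = true and imageView.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handleDrag(_:)))). Depending on the setup you may also need to resolve the conflict with the scroll view's own pan gesture, for example via the gesture delegate.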
Related
I have the following setup:
Blue background at zPosition = 0
Yellow button at zPosition = 0 with an action to print "Button tapped"
UIView with grey background and 0.8 alpha at zPosition = 1 with a UITapGestureRecognizer with an action to print "Grey view tapped"
When I tap the grey area, "Grey view tapped" is printed.
But if I tap at the location of the yellow button, "Button tapped" is printed.
I expected "Grey view tapped" to always be printed, because that view is in front of the button.
How can I prevent the button that is behind the view from being triggered?
I know that I can set the button to .isEnabled = false, but that is not a suitable solution, since the grey background is created in a parent view controller from which all my views inherit.
I set the grey view to .isUserInteractionEnabled = true (even though it's the default value), as suggested in this Stack Overflow answer.
I also looked at the Apple documentation, but the problem there is that I have a button and a gesture recognizer, not multiple gesture recognizers.
Any idea on how to do this properly?
@Daljeet led me to the solution. Using the Debug View Hierarchy, I realised that my button was on top of the grey view. My mistake was that I didn't realise the difference between UIView.layer.zPosition and the order of subviews in the hierarchy: a view can be drawn behind another view but nonetheless be in front of it in the View Hierarchy.
The solution was to use view.bringSubviewToFront(greyView)
Be aware that if you add the button after calling that last line of code, the button will again be placed on top of the View Hierarchy.
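As a small illustration of that difference (a sketch only, assuming greyView and the button are already subviews of the controller's view):
// layer.zPosition changes only the drawing order, not the hit-testing order.
// Hit-testing walks the subviews array back to front, so the grey view can be
// drawn on top of the button and still sit behind it for touch handling.
override func viewDidLoad() {
    super.viewDidLoad()

    greyView.layer.zPosition = 1          // drawn above the button...
    view.bringSubviewToFront(greyView)    // ...and now also last in the subview order,
                                          // so it receives the touches first.

    // Any button added *after* this call would again end up above the grey view.
}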
You can use this method to check if the touch was on the button and then trigger your action:
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let firstTouch = touches.first {
        // hitTest expects a point in self.view's coordinate space.
        let hitView = self.view.hitTest(firstTouch.location(in: self.view), with: event)
        if hitView === yourBtn {
            print("touch is inside")
        } else {
            print("touch is outside")
        }
    }
}
Issue: The viewWithGesture contains the viewUserSees, and is draggable within the blue containerView. However, the viewWithGesture is a subView of the containerView, so when the viewWithGesture is at an extreme (illustrated here - half in and half out of the containerView), only half of the viewWithGesture responds to touches, making it very hard to drag.
Note: I realize I should redo all the math that keeps it in the container and move it outside of the containerView, but I am very curious to learn how to do this the "worse" way.
I have researched this a bunch and tried to implement hitTest() and point(inside:), but so far I have only managed to make the app crash spectacularly.
Is there a good, relatively clean way to let the user grab from outside the containerView? (Swift 3 if possible)
EDIT: The green box is transparent and half of it is in the containerView and half is not.
In order for a view to receive a touch, the view and all its ancestors must return true from pointInside:withEvent:.
Normally, pointInside:withEvent: returns false if the point is outside the view's bounds. Since a touch in the green area is outside the container view's bounds, the container view returns false, so the touch won't hit the gesture view.
To fix this, you need to create a subclass for the container view and override its pointInside:withEvent:. In your override, return true if the point is in the container view's bounds or in the gesture view's bounds. Perhaps you can be lazy (especially if your container view doesn't have many subviews) and just return true if the point is in any subview's bounds.
class ContainerView: UIView {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        if super.point(inside: point, with: event) { return true }
        for subview in subviews {
            let subviewPoint = subview.convert(point, from: self)
            if subview.point(inside: subviewPoint, with: event) { return true }
        }
        return false
    }
}
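If the container view comes from a storyboard or xib, the only other change would be setting the container's custom class to this ContainerView subclass; the gesture view and its dragging math do not need to change.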
The problem, in short, relates to working with a pan gesture inside a scroll view.
I have a canvas (a UIView itself, but bigger in size) on which I am drawing some UIView objects with a pan gesture enabled over each of them (each of these little UIView objects is made from another UIView class).
The canvas can grow in height and width, which can change based on user input.
So to achieve that I have placed the canvas inside a UIScrollView. Now the canvas is increasing or decreasing smoothly.
Those tiny UIView objects on the canvas can be rotated also.
Now the problem.
If I am not changing the canvas size (static), i.e. if it's not inside the scroll view, then each UIView object on the canvas moves perfectly and everything works just fine with the following code.
If the canvas is inside the UIScrollView, then the canvas is scrollable, right? Now, inside the scroll view, when I pan the UIView objects on the canvas, those little UIView objects do not follow the finger; instead they move to a different point as the touch moves over the canvas.
N.B. - Somehow I figured out that I need to disable scrolling of the scroll view whenever one of the subviews receives a touch. For that I have used NSNotificationCenter to pass the signal to the parent viewController.
Here is the code.
These functions are defined inside the parent viewController class:
func canvusScrollDisable(){
    print("Scrolling Off")
    self.scrollViewForCanvus.scrollEnabled = false
}

func canvusScrollEnable(){
    print("Scrolling On")
    self.scrollViewForCanvus.scrollEnabled = true
}

override func viewDidLoad() {
    super.viewDidLoad()
    notificationUpdate.addObserver(self, selector: "canvusScrollEnable", name: "EnableScroll", object: nil)
    notificationUpdate.addObserver(self, selector: "canvusScrollDisable", name: "DisableScroll", object: nil)
}
This is the Subview class of the canvas
import UIKit

class ViewClassForUIView: UIView {

    let notification: NSNotificationCenter = NSNotificationCenter.defaultCenter()
    var lastLocation: CGPoint = CGPointMake(0, 0)
    var lastOrigin = CGPoint()
    var myFrame = CGRect()
    var location = CGPoint(x: 0, y: 0)
    var degreeOfThisView = CGFloat()

    override init(frame: CGRect){
        super.init(frame: frame)
        let panRecognizer = UIPanGestureRecognizer(target: self, action: "detectPan:")
        self.addGestureRecognizer(panRecognizer) // the recognizer must be attached to the view
        self.backgroundColor = addTableUpperViewBtnColor
        self.multipleTouchEnabled = false
        self.exclusiveTouch = true
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func detectPan(recognizer: UIPanGestureRecognizer){
        // Move the view by the pan translation, measured in the superview (the canvas).
        let translation = recognizer.translationInView(self.superview!)
        self.center = CGPointMake(lastLocation.x + translation.x, lastLocation.y + translation.y)
        switch(recognizer.state){
        case .Began:
            break
        case .Changed:
            break
        case .Ended:
            notification.postNotificationName("EnableScroll", object: nil)
        default: break
        }
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        // Ask the parent view controller to stop the scroll view from scrolling.
        notification.postNotificationName("DisableScroll", object: nil)
        self.superview?.bringSubviewToFront(self)
        lastLocation = self.center
        lastOrigin = self.frame.origin
        let radians: Double = atan2(Double(self.transform.b), Double(self.transform.a))
        self.degreeOfThisView = CGFloat(radians) * (CGFloat(180) / CGFloat(M_PI))
        if self.degreeOfThisView != 0.0 {
            self.transform = CGAffineTransformIdentity
            self.lastOrigin = self.frame.origin
            self.transform = CGAffineTransformMakeRotation(CGFloat(M_PI_4))
        }
        myFrame = self.frame
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        notification.postNotificationName("EnableScroll", object: nil)
    }
}
Now the scroll view disables scrolling perfectly whenever one of the UIView objects on the canvas (inside the scroll view) receives a touch, but sometimes those UIView objects do not properly follow the touch location over the canvas/screen.
I am using Swift 2.1 with Xcode 7, but can anyone tell me what I am missing, or show the solution in Objective-C/Swift?
Where do you set the lastLocation? I think it would be better for you to use locationInView and compute the translation by yourself. Then save the lastLocation on every event that triggers the method.
You might also want to handle the cancelled state, to turn scrolling back on.
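A minimal sketch of that suggestion, written in current Swift syntax rather than the question's Swift 2.1; here lastLocation stores the last touch location instead of the view's center, and the rotation handling from the question is omitted:
func detectPan(_ recognizer: UIPanGestureRecognizer) {
    // Work with absolute locations in the superview and compute the delta ourselves,
    // instead of relying on translation(in:).
    let location = recognizer.location(in: superview)

    switch recognizer.state {
    case .began:
        lastLocation = location
    case .changed:
        let dx = location.x - lastLocation.x
        let dy = location.y - lastLocation.y
        center = CGPoint(x: center.x + dx, y: center.y + dy)
        lastLocation = location            // re-save on every event, as suggested above
    case .ended, .cancelled, .failed:
        // Re-enable scrolling for the cancelled case as well as the normal end.
        NotificationCenter.default.post(name: Notification.Name("EnableScroll"), object: nil)
    default:
        break
    }
}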
All of this does seem a bit messy though. The notifications are maybe not the best idea in your case nor is putting the gesture recognizers on the subviews. I think you should have a view which handles all those small views; it should also have a gesture recognizer that can simultaneously interact with other recognizers. When the gesture is recognized it should check if any of the subviews are hit and decide if any of them should be moved. If it should be moved then use the delegate to report that the scrolling must be disabled. If not then cancel the recognizer (disable+enable does that). Also in most cases where you put something movable on the scrollview you usually want a long press gesture recognizer and not a pan gesture recognizer. Simply use that one and set some very small minimum press duration. Note that this gesture works exactly the same as the pan gesture but can have a small delay to be detected. It is very useful in these kind of situations.
Update (The architecture):
The hierarchy should be:
View controller -> Scrollview -> Canvas view -> Small views
The canvas view should contain the gesture recognizer that controls the small views. When the gesture begins, you should check whether any of the views is hit by its location, simply by iterating through the subviews and checking whether their frames contain the point. If one is hit, the canvas should start moving that small view and notify its delegate that it has begun moving it. If not, it should cancel the gesture recognizer.
As the canvas view has a custom delegate, it is the view controller that should implement its protocol and assign itself as the canvas view's delegate. When the canvas view reports that dragging has begun, the view controller should disable scrolling on the scroll view. When the canvas view reports that it has stopped moving the views, it should re-enable scrolling.
Create this type of view hierarchy
Create a custom protocol of the canvas view which includes "did begin dragging" and "did end dragging"
When the view controller becomes active assign self as a delegate to the canvas view. Implement the 2 methods to enable or disable the scrolling of the scroll view.
The canvas view should add a gesture recognizer to itself and should contain an array of all the small movable subviews. The recognizer should be able to interact with other recognizers simultaneously which is done through its delegate.
The canvas gesture recognizer's target should, on begin, check whether any of the small views is hit and save it in a property; it should also save the current location of the gesture. When the gesture changes, it should move the grabbed view based on the last and current gesture locations and re-save the current location to the property. When the gesture ends, it should clear the currently dragged view. On begin and end it should call the delegate to report the state change (a sketch of this follows the list).
Disable or enable scrolling in the view controller depending on what the canvas view reports to its delegate.
I think this should be all.
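A condensed sketch of that architecture, in current Swift syntax; the names CanvasViewDelegate, CanvasView, and movableViews are assumptions, not the answerer's actual code:
import UIKit

protocol CanvasViewDelegate: AnyObject {
    func canvasViewDidBeginDragging(_ canvas: CanvasView)
    func canvasViewDidEndDragging(_ canvas: CanvasView)
}

class CanvasView: UIView, UIGestureRecognizerDelegate {
    weak var delegate: CanvasViewDelegate?
    var movableViews: [UIView] = []          // the small views, owned by the canvas

    private var draggedView: UIView?
    private var lastLocation: CGPoint = .zero

    override init(frame: CGRect) {
        super.init(frame: frame)
        // A long-press with a tiny delay behaves like a pan but is easier to
        // distinguish from the scroll view's own pan gesture.
        let press = UILongPressGestureRecognizer(target: self, action: #selector(handleDrag(_:)))
        press.minimumPressDuration = 0.05
        press.delegate = self                // allow simultaneous recognition
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }

    @objc private func handleDrag(_ gesture: UILongPressGestureRecognizer) {
        let location = gesture.location(in: self)
        switch gesture.state {
        case .began:
            // Find a hit small view, or cancel the gesture if none is hit.
            guard let hit = movableViews.first(where: { $0.frame.contains(location) }) else {
                gesture.isEnabled = false    // disable + enable cancels the gesture
                gesture.isEnabled = true
                return
            }
            draggedView = hit
            lastLocation = location
            delegate?.canvasViewDidBeginDragging(self)   // view controller disables scrolling
        case .changed:
            guard let dragged = draggedView else { return }
            dragged.center.x += location.x - lastLocation.x
            dragged.center.y += location.y - lastLocation.y
            lastLocation = location
        case .ended, .cancelled, .failed:
            draggedView = nil
            delegate?.canvasViewDidEndDragging(self)     // view controller re-enables scrolling
        default:
            break
        }
    }
}
The view controller would then conform to CanvasViewDelegate and toggle scrollView.isScrollEnabled in the two callbacks.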
I'm creating a subclass of UIView to use in a project, and will handle touches on the main view. I'd like so that when the touch (on the main view) is dragged and contacts the special UIViews, they change their background color. Using UIView's default "touchesMoved" function only detects touches that originate on the specific view. I could set up the main view to check each instance of the custom UIView, but that would be contrary to encapsulation and lead to messy code.
Any ideas?
I'm not sure this is what you're after, but if I put the following code in my view controller, whose view has several RDView views in it, those views change color when I drag from the main view into an RDView.
override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    var touched = self.view.hitTest((touches.anyObject() as UITouch).locationInView(self.view), withEvent: event)
    if touched is RDView {
        touched?.backgroundColor = UIColor.redColor()
    }
}
I have a custom subclass of UIView, designed in IB that contains a few labels and a button.
There is an action for touchUpInside event on the button and the target is the custom view.
I am attaching this custom view to a self.tableView.tableHeaderView for a tableview in my UI.
The strange thing is that this custom view is not responding to touches. I can see it nicely, with all the labels and the button, inside the table view, which means the table view handles and shows it correctly; however, it is not responding to touches.
I checked the whole view hierarchy and all the views involved have userInteractionEnabled as YES.
If I drag some other controls into that custom view, for example a switch or a segmented control, they do not respond either. It is as if the controls in the custom view are not registering touches.
It doesn't make any sense. I am out of ideas. Can you help me get the touch event on the button to reach its parent view?
What is a "headerTableView"? Do you mean a UITableViewHeaderFooterView? Have you tried setting userInteractionEnabled on the root UITableViewHeaderFooterView?
This is a hack that will detect a button touch and trigger touchUpInside programmatically:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?)
{
    super.touchesBegan(touches, with: event)
    if let touch = touches.first, button.bounds.contains(touch.location(in: button)) {
        button.sendActions(for: UIControl.Event.touchUpInside)
    }
}
I don’t know what your real problem is because that button should work without this.