As described in the title, for some reason the position of every object that has been moved is reset to its original position after .setBackgroundImage is called on some UIButton.
The original positions of the UI elements are set via the storyboard. After that I change the position of some elements using a UIPanGestureRecognizer (for example, the menu slides up), as simply as this:
@IBAction func handlePan(recognizer: UIPanGestureRecognizer) {
    let translation = recognizer.translation(in: self.view)
    if let view = recognizer.view {
        // Move the dragged view by the amount the finger moved since the last callback.
        view.center = CGPoint(x: view.center.x + translation.x,
                              y: view.center.y + translation.y)
    }
    // Reset the translation so the next callback reports only the incremental movement.
    recognizer.setTranslation(CGPoint.zero, in: self.view)
}
Besides that, I have a button whose background picture changes when the user switches the theme; .setBackgroundImage is performed on this UIButton like this:
button.setBackgroundImage(image, for: UIControlState.normal)
After the image on the button is changed, the translated object (the menu that was slid up) is set back to the original position defined in the storyboard. I want this menu to stay open even after the image on some unrelated button is changed.
I have a sticker that is draggable. While I drag it, when my finger moves below the white board, a delete image view appears. How do I detect when my touch has reached that image view?
I am using a pan gesture for dragging, together with pinch and rotation gestures at the same time. But how will I know that I have touched my delete image view?
Here is the image: it has a StickerView in the back, which is draggable, and a delete image view fixed in front of the StickerView.
@objc func handlePanGesture(_ gestureRecognizer: UIPanGestureRecognizer) {
    switch gestureRecognizer.state {
    case .began:
        // panGestureInStickerViewClosure?(true)
        break
    case .changed:
        // panGestureInStickerViewClosure?(true)
        // Look up the delete image view and the sticker view among the superview's subviews.
        let deleteImageView = gestureRecognizer.view?.superview?.subviews.filter { $0.isKind(of: UIImageView.self) }[safe: 0]
        deleteImageView?.addGestureRecognizer(BlockGestureRecognizer())
        let stickerImageViewFrame = gestureRecognizer.view?.superview?.subviews.filter { $0.isKind(of: HBStickerView.self) }[safe: 0]
        let translation = gestureRecognizer.translation(in: self)
        print("translation=\(translation), deleteImageView?.frame=\(deleteImageView?.frame), stickerImageViewFrame?.frame=\(stickerImageViewFrame?.frame)")
    case .ended:
        break
    default:
        break
    }

    print(gestureRecognizer.numberOfTouches)
    if gestureRecognizer.numberOfTouches == 1 {
        panGestureInStickerViewClosure?(true)
    } else if gestureRecognizer.numberOfTouches > 1 {
        panGestureInStickerViewClosure?(false)
    }

    // Move the sticker by the incremental translation, then reset it.
    let translation = gestureRecognizer.translation(in: self.superview)
    self.center = CGPoint(x: self.center.x + translation.x, y: self.center.y + translation.y)
    gestureRecognizer.setTranslation(CGPoint.zero, in: self.superview)
    print(gestureRecognizer.location(in: self.superview))
    // print("self.frame.maxY=\(self.frame.maxY), self.superview?.frame.maxY=\(self.superview?.frame.maxY), self.center=\(self.center), gestureRecognizer.location(in: self)=\(gestureRecognizer.location(in: self))")
    // panGestureInStickerViewClosure?(true)
}
The code above is what I have for the pan gesture.
I am using this gesture recognizer for panning, but reading the translation location does not work for me either. If I could get the gesture location of my thumb (or any finger), I could do what I need.
All I want is to get a touch event on that delete image view, which sits in front of all the subviews, while I am dragging the StickerView below it.
Instagram is a good example: it has stickers that you can drag around, pinch, and rotate, and when it comes to deleting, you have to touch that delete button; the delete button scales up to its max size, the StickerView scales down, and finally it is removed from the superview.
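One possible approach (a minimal sketch, not from the original post): rather than attaching another gesture recognizer to the delete image view, convert the finger's location and the delete view's frame into one coordinate space inside the pan handler and test containment. The names canvasView and deleteImageView below are placeholders for the views the question describes:

func isOverDeleteView(_ gestureRecognizer: UIPanGestureRecognizer,
                      deleteImageView: UIImageView,
                      canvasView: UIView) -> Bool {
    // Work in one coordinate space so offsets between superviews don't matter.
    let fingerPoint = gestureRecognizer.location(in: canvasView)
    // A view's frame lives in its superview's coordinates, so convert from there.
    let deleteFrame = deleteImageView.superview?.convert(deleteImageView.frame,
                                                         to: canvasView) ?? .zero
    return deleteFrame.contains(fingerPoint)
}

Called from the .changed and .ended branches of handlePanGesture, this could highlight the delete view while hovering over it and remove the sticker when the drag ends on it.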
I have a movable action button, similar to Reddit's scroll-up button and the AssistiveTouch feature. Right now I'm having a hard time finding a way to either:
1. not go out of bounds in the first place, or
2. bounce back onto the screen, similar to the way AssistiveTouch does.
Here's what I have in my UIPanGestureRecognizer function:
@IBAction func dragSettingsButton(_ sender: UIPanGestureRecognizer) {
    if sender.state == .began || sender.state == .changed {
        if displayView.frame.contains(sender.location(in: self.view)) {
            let translation = sender.translation(in: displayView)
            // note: 'view' is optional and needs to be unwrapped
            sender.view!.center = CGPoint(x: sender.view!.center.x + translation.x,
                                          y: sender.view!.center.y + translation.y)
            sender.setTranslation(CGPoint.zero, in: displayView)
        } else {
            print("WOW BRO YOURE OUT OF BOUNDS")
        }
    }
}
Things to note: I have a view controller with three views: a view that acts as a custom navigation bar, a view that acts as a custom segmented view, and a display view that displays different views when the segmented view is switched.
There is also a tab bar at the bottom.
To bounce back:
When sender.state == .ended, use the same logic to determine whether it is out of bounds. If it is, calculate where it should be, then animate it to that location.
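A minimal sketch of that idea, assuming the button and displayView are both direct subviews of the view controller's view (so their frames share a coordinate space); the clamping math and animation constants are illustrative:

// On .ended, clamp the button's center back inside displayView and animate the move.
if sender.state == .ended, let button = sender.view {
    var target = button.center
    let bounds = displayView.frame
    let halfWidth = button.bounds.width / 2
    let halfHeight = button.bounds.height / 2
    // Clamp so the whole button stays inside the display view's frame.
    target.x = min(max(target.x, bounds.minX + halfWidth), bounds.maxX - halfWidth)
    target.y = min(max(target.y, bounds.minY + halfHeight), bounds.maxY - halfHeight)
    UIView.animate(withDuration: 0.3,
                   delay: 0,
                   usingSpringWithDamping: 0.6,
                   initialSpringVelocity: 0.5,
                   options: [.curveEaseInOut],
                   animations: { button.center = target },
                   completion: nil)
}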
I have a UIScrollView that contains n placeholder UIImageViews. Below that I have a stack view containing 10 images that the user will choose from and drag onto a specific placeholder image view above. I have the pan gesture implemented, and that works fine; however, I'm struggling with how to detect when the dragged image is within the bounds of a placeholder image, especially since the placeholders are inside a scroll view. I was able to get it to somewhat work by checking the center x and y of the dragged view against the relative values of the placeholders, but it did not work as desired. From what I can see, the x and y coordinates from dragging only apply to the actual parent UIView, so when an image is dragged into the scroll view the x values don't necessarily line up; some of those placeholder image x coordinates may be 2000+ if the list is long. Take a look at the code below and let me know your thoughts and suggestions. Thanks!
@IBAction func handlePan(recognizer: UIPanGestureRecognizer) {
    let translation = recognizer.translationInView(self.view)
    if recognizer.state == UIGestureRecognizerState.Began {
        originalViewLocation = recognizer.view!.center
    } else if recognizer.state == UIGestureRecognizerState.Ended {
        for placeHolderView in playerImages {
            if recognizer.view!.center.x < (placeHolderView.center.x + (placeHolderView.frame.width / 2)) &&
               recognizer.view!.center.x > placeHolderView.frame.origin.x {
                if recognizer.view!.center.y < (stackView.center.y + stackView.frame.height / 2) {
                    let chosenImage = recognizer.view! as! UIImageView
                    placeHolderView.image = chosenImage.image
                }
            } else {
                UIView.animateWithDuration(0.5, delay: 0.0, options: .CurveEaseInOut, animations: {
                    recognizer.view!.center = self.originalViewLocation
                }, completion: { finished in
                })
            }
        }
    }
    if let view = recognizer.view {
        view.center = CGPoint(x: view.center.x + translation.x,
                              y: view.center.y + translation.y)
    }
    recognizer.setTranslation(CGPointZero, inView: self.view)
    print("X: \(recognizer.view!.center.x)")
    print("Y: \(recognizer.view!.center.y)")
}
The playerImages array stores the UIImageViews in the scroll view.
Well, I did solve this on my own. It had to do with relative coordinate values vs. actual coordinate values.
I ended up needing a few things...
CGRectContainsPoint()
to check whether or not the frame of the destination placeholder view contains the center point of the dragged view,
and...
convertRect()
with
convertPoint()
I had to convert the coordinates of the dragging point into the superview's coordinate space, along with converting the frames of the placeholder views inside the scroll view into the same (superview) coordinates. Like so...
let convertedCenter = recognizer.view!.superview!.convertPoint(recognizer.view!.center, toView: self.view)
let convertedFrame = self.view.convertRect(IV.frame, fromView: IV.superview)
if CGRectContainsPoint(convertedFrame, convertedCenter) {
    // Replace placeholder image with dragged image.
}
Hopefully this will help someone else.
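For context, here is a sketch (keeping the question's Swift 2-era syntax) of how those converted values could replace the center.x / center.y comparison in the .Ended branch of the earlier handlePan; droppedOnPlaceholder is just an illustrative local flag:

// Run this in the .Ended branch, comparing everything in self.view coordinates
// instead of raw scroll-view coordinates. Other names follow the question's code.
let convertedCenter = recognizer.view!.superview!.convertPoint(recognizer.view!.center, toView: self.view)
var droppedOnPlaceholder = false
for IV in playerImages {
    let convertedFrame = self.view.convertRect(IV.frame, fromView: IV.superview)
    if CGRectContainsPoint(convertedFrame, convertedCenter) {
        // The dragged view's center landed inside this placeholder: hand over the image.
        IV.image = (recognizer.view as! UIImageView).image
        droppedOnPlaceholder = true
        break
    }
}
if !droppedOnPlaceholder {
    // No placeholder matched, so snap the dragged view back to its starting point.
    UIView.animateWithDuration(0.5, animations: {
        recognizer.view!.center = self.originalViewLocation
    })
}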
My code:
@IBAction func handlePan(gesture: UIPanGestureRecognizer) {
    let transition = gesture.translationInView(self.view)
    switch gesture.state {
    case .Changed:
        if let view = gesture.view {
            view.center = CGPoint(x: view.center.x + transition.x, y: view.center.y + transition.y)
        }
        gesture.setTranslation(CGPointZero, inView: self.view)
    default:
        break
    }
}
So I can drag a big button around the screen. Everything works fine until I comment out gesture.setTranslation(CGPointZero, inView: self.view).
I thought that line of code only told the app to remember the last position of the button on screen and move from there next time, but...
Then I ran the project again in the simulator, and when I clicked on that button and tried to move it a bit, the button just flew off in the same direction and disappeared off screen. Why?
As you pan around, the translation value (transition in your code) is measured from where the gesture first began.
Look at how you move the button: you keep adding a larger and larger translation to each updated center. What you need to do is add the latest translation to the center as it was when the pan gesture was first recognized.
So either reset the translation (like in your posted code), or save the original center of the button when the gesture's state is .Began and apply the translation to the original center value.
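A sketch of that second option, in the same Swift 2-era syntax as the question; originalCenter is an illustrative stored property on the view controller:

var originalCenter = CGPointZero

@IBAction func handlePan(gesture: UIPanGestureRecognizer) {
    let transition = gesture.translationInView(self.view)
    switch gesture.state {
    case .Began:
        // Remember where the button was when the drag started.
        if let view = gesture.view {
            originalCenter = view.center
        }
    case .Changed:
        // Apply the cumulative translation to the saved starting center,
        // so there is no need to reset the translation each time.
        if let view = gesture.view {
            view.center = CGPoint(x: originalCenter.x + transition.x,
                                  y: originalCenter.y + transition.y)
        }
    default:
        break
    }
}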
I have the following code that drags a UIView. All works fine visually.
func moveView(sender: UIPanGestureRecognizer) {
    let translate = sender.translationInView(self.view)
    if sender.state == UIGestureRecognizerState.Changed {
        sender.view!.center = CGPoint(x: sender.view!.center.x + translate.x,
                                      y: sender.view!.center.y + translate.y)
        sender.setTranslation(CGPointZero, inView: self.view)
    }
    if sender.state == UIGestureRecognizerState.Ended {
        let newX: CGFloat = sender.view!.center.x + translate.x
        let newY: CGFloat = sender.view!.center.y + translate.y
        sender.view!.center = CGPoint(x: newX, y: newY)
    }
}
However, after completing the drag, the view seems to lose its gesture connection, such that I can't drag it again or trigger any tap gesture associated with it, etc.
If I add an NSLog, I can see that tapping where the view used to be triggers the log, but tapping the view's actual current location does not.
I attach the gesture to the view thisView with the following in viewDidLoad:
let moveGesture = UIPanGestureRecognizer(target: self, action: Selector("moveView:"))
thisView.addGestureRecognizer(moveGesture)
What am I missing that would keep the gestures connected to the view's new location?
Thanks.
I think that when you call sender.setTranslation(CGPointZero, inView: self.view), you should set the translation relative to the superview, not to the view itself. Setting the translation relative to the view itself changes the position of the view's contents but not the view's area, which means that if you set the view's layer to mask to its bounds, you shouldn't see anything that lies outside the view's initial area.
So you should do sender.setTranslation(CGPointZero, inView: self.view.superview).
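Applied to the question's moveView, that one-line change would look roughly like this (a sketch of the suggestion only, keeping the question's Swift 2-era syntax; the .Ended branch is omitted for brevity):

func moveView(sender: UIPanGestureRecognizer) {
    let translate = sender.translationInView(self.view)
    if sender.state == UIGestureRecognizerState.Changed {
        sender.view!.center = CGPoint(x: sender.view!.center.x + translate.x,
                                      y: sender.view!.center.y + translate.y)
        // Reset the translation relative to the superview, as suggested above.
        sender.setTranslation(CGPointZero, inView: self.view.superview)
    }
}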