Make UIImageView Go Up Then Down Swift - ios

I need to make a UIImageView animate up when the screen is tapped then back down when it hits the top.
This is the code I have got at the moment:
override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    if (CGRectIntersectsRect(self.playerImagePhone.frame, self.floorImagePhone.frame)) {
        UIView.animateWithDuration(0.4) {
            self.playerImagePhone.center = CGPointMake(self.playerImagePhone.center.x, self.playerImagePhone.center.y - 50)
            println("player jumped")
        }
    }
}

override func touchesEnded(touches: NSSet!, withEvent event: UIEvent!) {
    UIView.animateWithDuration(0.4) {
        self.playerImagePhone.center = CGPointMake(self.playerImagePhone.center.x, self.playerImagePhone.center.y + 50)
        println("player jumped")
    }
}
This code works to make it go up and then back down, but the problem is that it allows the user to hold their finger on the screen and the image will stay up. How can I make it go down as soon as the image has finished going up?
Thanks

touchesBegan can be called often, so it probably isn't a good idea to perform your animations in there. You'll probably want to update a separate object that maintains the animation, and then, once the animation is done (once the duration is over), run another animation to bring the image back down.
As an alternative, you can also use animateWithDuration:animations:completion: and call the down animation from within the completion block.
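A minimal sketch of that completion-based version, reusing playerImagePhone and the 50-point offset from the question (pre-Swift 3 syntax to match the original code):
UIView.animateWithDuration(0.4, animations: {
    // move the player up
    self.playerImagePhone.center = CGPointMake(self.playerImagePhone.center.x, self.playerImagePhone.center.y - 50)
}, completion: { finished in
    // as soon as the up animation finishes, animate back down
    UIView.animateWithDuration(0.4) {
        self.playerImagePhone.center = CGPointMake(self.playerImagePhone.center.x, self.playerImagePhone.center.y + 50)
    }
})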

Related

Custom UIGestureRecognizer for a two finger gesture

I have a custom UIGestureRecognizer for a two-finger gesture that works perfectly, except that it is very picky about how simultaneously the fingers have to touch the iOS device for touchesBegan to be called with 2 touches. touchesBegan is often called with only one touch even though I am trying to use two fingers.
Is there any way to make recognition of the number of touches more forgiving with regard to how simultaneously you have to place your fingers on the touch screen?
I've noticed that a two-finger tap is recognized even when you place first one finger and then another much later, while still holding the first finger down.
Here is the code for my touchesBegan function:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
    if touches.count != 2 {
        state = .failed
        return
    }
    // Capture the first touch and store some information about it.
    if trackedTouch == nil {
        trackedTouch = touches.min { $0.location(in: self.view?.window).x < $1.location(in: self.view?.window).x }
        strokePhase = .topSwipeStarted
        topSwipeStartPoint = (trackedTouch?.location(in: view?.window))!
        // Ignore the other touch that had a larger x-value
        for touch in touches {
            if touch != trackedTouch {
                ignore(touch, for: event)
            }
        }
    }
}
For two-finger gestures, touchesBegan is most likely going to be called twice: once when you put the first finger on the screen, and once for the second one.
In the state you keep, you should keep track of both touches (or for that matter, all current touches), and only start the gesture once both touches have been received and the gesture's start condition has been met.
import UIKit
import UIKit.UIGestureRecognizerSubclass // needed to override the touches… methods of UIGestureRecognizer

public class TwoFingerGestureRecognizer: UIGestureRecognizer {
    private var trackedTouches: Set<UITouch> = []

    public override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        for touch in touches {
            if self.trackedTouches.count < 2 {
                self.trackedTouches.insert(touch)
            } else {
                self.ignore(touch, for: event)
            }
        }
        if self.trackedTouches.count == 2 {
            // put your current logic here
        }
    }

    public override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
    }

    public override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        self.trackedTouches.subtract(touches)
    }

    public override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        self.trackedTouches.subtract(touches)
    }

    public override func reset() {
        super.reset()
        self.trackedTouches = []
    }
}
Don't worry about touchesBegan(_:with:); instead use touchesEnded(_:with:) or touchesMoved(_:with:). If the end state does not contain both fingers, set the state to .failed; otherwise set it to .ended.
Tapping the screen with more than one finger at exactly the same instant is practically impossible, so touchesBegan will usually see a single touch, but during touchesMoved(_:with:) you will find two fingers. I'm not sure about touchesEnded(_:with:); that one probably won't work, since removing two fingers simultaneously is just as hard as applying two fingers simultaneously, but it's worth a try to see how it reacts.
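A minimal sketch of that approach in a UIGestureRecognizer subclass (my own illustration of the suggestion above, not code from the answer; the class name is made up):
import UIKit
import UIKit.UIGestureRecognizerSubclass // needed to set state in a subclass

class TwoFingerTapRecognizer: UIGestureRecognizer {
    private var sawTwoTouches = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        // Both fingers rarely land at exactly the same instant, so this often reports one touch.
        if event.touches(for: self)?.count == 2 { sawTwoTouches = true }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        // By the time the fingers move, both touches are usually registered.
        if event.touches(for: self)?.count == 2 { sawTwoTouches = true }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        // Recognize only if two fingers were seen at some point.
        state = sawTwoTouches ? .ended : .failed
    }

    override func reset() {
        super.reset()
        sawTwoTouches = false
    }
}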
I'd recommend making your code a little more forgiving. Although touchesBegan/Moved/Ended/Cancelled respond to events of "one or more touches" (as stated in the Apple docs), relying on the user's precision to touch the screen with 2 fingers simultaneously is not ideal. This is assuming you have multi-touch enabled, which it sounds like you do.
Try tracking the touches yourself and executing your logic when your collection of touches reaches 2. Obviously you'll also have to track when the number of touches grows or shrinks and handle things accordingly, but I'd guess you're already handling this if your gesture is meant to be for 2 fingers only (aside from the extra logic you'd have to add in touchesBegan), as sketched below.
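A rough sketch of that manual tracking in a plain UIView subclass (my own example of the idea; TwoFingerView and the placeholder comments are hypothetical):
import UIKit

class TwoFingerView: UIView {
    private var activeTouches = Set<UITouch>()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true // required to receive more than one touch
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.formUnion(touches)
        if activeTouches.count == 2 {
            // Both fingers are down: start the two-finger logic here.
        } else if activeTouches.count > 2 {
            // Too many fingers: cancel or ignore, depending on the gesture.
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouches.subtract(touches)
    }
}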
Not sure why the other answers suggesting touchesMoved(_:with:) did not solve your question, but maybe you need a GitHub example.
Double touch move example.
Is touchesMoved an option for achieving the desired outcome? You could also implement a counter before setting the state to .failed.
Don't forget to set isMultipleTouchEnabled = true
private var counter = 0

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
    if touches.count != 2 {
        print("we do NOT have two touches")
        if counter > 100 { // a higher threshold makes the recognizer more forgiving
            state = .failed
        }
        counter += 1
        return
    } else {
        print("we have two touches")
    }
}

UIButtons in specific zone of the screen make delay on Touch Down event

I'm creating a custom keyboard for iOS. I have 4 rows of keys; each key has two actions: Touch Down to highlight the button, and Touch Up Inside to unhighlight the button after 0.4 seconds.
But at the left edge of the screen there is a zone where the Touch Down event of any button is delayed for about a quarter of a second before the highlight shows.
See the image
So to see the highlighted version, I had to hold the button or swipe right from it. The buttons are the same, no difference at all. When I switch from letters to symbols, this left edge zone causes the same delay. I've tried moving all the keys to the right, about 20px, and they worked fine, without delay. Moving them back to the edge, the delay came back as well. Then I noticed that pressing a button on its right edge, about 1-2 pixels in, produced no delay at all. So it seems like the problem is in this left edge zone of the screen specifically.
By the way, I am running this app on my 5S; I've tried running it on my friend's 5C, and it has the same problem. But when I run it in the simulator, there is no such delay.
Use the new iOS 11 API to solve this problem definitively.
var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { get }
Documentation
I'm also creating a custom keyboard, and as far as I understand, this happens because preferredScreenEdgesDeferringSystemGestures does not work properly when overridden inside UIInputViewController, at least on iOS 13.
When you override this property in a regular view controller, it works as expected:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.left, .bottom, .right]
}
That's however not the case for UIInputViewController.
UPD: It appears gesture recognizers still get the .began state update without the delay. So, instead of following the rather messy solution below, you can add a custom gesture recognizer to handle touch events.
You can quickly test this by adding a UILongPressGestureRecognizer with minimumPressDuration = 0 to your control view.
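A quick sketch of that test (keyButton, a UIButton, and handlePress are placeholder names of mine):
// In the key view's setup:
let recognizer = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
recognizer.minimumPressDuration = 0 // fires .began immediately on touch down
keyButton.addGestureRecognizer(recognizer)

// Elsewhere in the same class:
@objc func handlePress(_ recognizer: UILongPressGestureRecognizer) {
    switch recognizer.state {
    case .began:
        keyButton.isHighlighted = true // touch down, without the edge delay
    case .ended, .cancelled, .failed:
        keyButton.isHighlighted = false // touch up
    default:
        break
    }
}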
Another solution:
My original workaround was calling touch down effects inside hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?, which is called even when the touches are delayed for the view.
You have to ignore the "real" touch down event when it fires about 0.4 s later or simultaneously with the touch up inside event. Also, it's probably better to apply this hack only when the tested point is inside the ~20 pt lateral margins.
So, for example, for a view whose width equals the screen width, the implementation might look like:
let edgeProtectedZoneWidth: CGFloat = 20

override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let result = super.hitTest(point, with: event)
    guard result == self else {
        return result
    }
    if point.x < edgeProtectedZoneWidth || point.x > bounds.width - edgeProtectedZoneWidth {
        if !alreadyTriggeredFocus {
            isHighlighted = true
        }
        triggerFocus()
    }
    return result
}

private var alreadyTriggeredFocus: Bool = false

@objc override func triggerFocus() {
    guard !alreadyTriggeredFocus else { return }
    super.triggerFocus()
    alreadyTriggeredFocus = true
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesCancelled(touches, with: event)
    alreadyTriggeredFocus = false
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    alreadyTriggeredFocus = false
}
...where triggerFocus() is the method you call on the touch down event. Alternatively, you may override touchesBegan(_:with:).

iOS UIPanGestureRecognizer: adjust sensitivity?

My question: Is there a way to adjust the "sensitivity" of UIPanGestureRecognizer so that it turns on 'sooner', i.e. after moving a fewer number of 'pixels'?
I have a simple app with a UIImageView, and pinch and pan gesture recognizers tied to this so that the user can zoom in and draw on the image by hand. Works fine.
However, I notice the stock UIPanGestureRecognizer doesn't return a value of UIGestureRecognizerState.Changed until the user's gesture has moved about 10 pixels.
Example: Here's a screenshot showing several lines that I've attempted to draw shorter & shorter, and there is a noticeable finite length below which no line gets drawn because the pan gesture recognizer never changes state.
IllustrationOfProgressivelyShorterLines.png
...i.e., to the right of the yellow line, I was still trying to draw, and my touches were being recognized as touchesMoved events, but the UIPanGestureRecognizer wasn't firing its own "Moved" event and thus nothing was getting drawn.
(Note/clarification: That image takes up the entirety of my iPad's screen, so my finger is physically moving more than an inch even in the cases where no state change occurs to the recognizer. It's just that we're 'zoomed in' in terms of the transformation generated by the pinch gesture recognizer, so a few 'pixels' of the image take up a significant amount of the screen.)
This is not what I want. Any ideas on how to fix it?
Maybe there's some 'internal' parameter of UIPanGestureRecognizer I could get at if I subclassed it, or some such? I thought I'd try to subclass the recognizer in a manner such as...
class BetterPanGestureRecognizer: UIPanGestureRecognizer {
    var initialTouchLocation: CGPoint!

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent) {
        super.touchesBegan(touches, withEvent: event)
        initialTouchLocation = touches.first!.locationInView(view)
        print("pan: touch begin detected")
        print(self.state.hashValue) // this lets me check the state
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent) {
        super.touchesMoved(touches, withEvent: event)
        print("pan: touch move detected")
        print(self.state.hashValue) // this remains at the "began" value until you get beyond about 10 pixels
        let some_criterion = (touches.first!.isEqual(something) && event.isEqual(somethingElse))
        if (some_criterion) {
            self.state = UIGestureRecognizerState.Changed
        }
    }
}
...but I'm not sure what to use for some_criterion, etc.
Any suggestions?
Other alternatives that could work, but that I'd rather not have to do:
1. I could simply attach my UIPanGestureRecognizer to some parent, non-zoomed view, and then use affine transforms and such to remap the points of the pan touches onto the respective parts of the image. So why am I not doing that? Because the code is written so that lots of other objects hang off the image view and they all get the same gesture recognizers, and everything works just great without my having to keep track of anything (e.g. affine transformations), and the problem only shows up if you're really, really zoomed in.
2. I could abandon UIPanGestureRecognizer and effectively just write my own using touchesBegan and touchesMoved (which is kind of what I'm doing); however, I like how UIPanGestureRecognizer differentiates itself from, say, pinch events, in a way that I don't have to worry about coding up myself.
3. I could just specify some maximum zoom beyond which the user can't go. This fails to implement what I'm going for, i.e. I want to allow for a fine-detail level of manipulation.
Thanks.
[Will choose your answer over mine (i.e., the following) if merited, so I won't 'accept' this answer just yet.]
Got it. The basic idea of the solution is to change the state whenever touches are moved, but use the delegate method regarding simultaneous gesture recognizers so as not to "lock" out any pinch (or rotation) gesture. This will allow for one- and/or multi-fingered panning, as you like, with no 'conflicts'.
This, then, is my code:
class BetterPanGestureRecognizer: UIPanGestureRecognizer, UIGestureRecognizerDelegate {
    var initialTouchLocation: CGPoint!

    override init(target: AnyObject?, action: Selector) {
        super.init(target: target, action: action)
        self.delegate = self
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent) {
        super.touchesBegan(touches, withEvent: event)
        initialTouchLocation = touches.first!.locationInView(view)
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent) {
        super.touchesMoved(touches, withEvent: event)
        if UIGestureRecognizerState.Possible == self.state {
            self.state = UIGestureRecognizerState.Changed
        }
    }

    func gestureRecognizer(_: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWithGestureRecognizer: UIGestureRecognizer) -> Bool {
        if !(shouldRecognizeSimultaneouslyWithGestureRecognizer is UIPanGestureRecognizer) {
            return true
        } else {
            return false
        }
    }
}
Generally, having that shouldRecognizeSimultaneouslyWithGestureRecognizer delegate method always return true is what many people may want. I make the delegate return false if the other recognizer is another pan recognizer, just because I noticed that without that logic (i.e., making the delegate return true no matter what), it was "passing through" pan gestures to underlying views and I didn't want that. You may just want to have it return true no matter what. Cheers.
Swift 5 + small improvement
I had a case where the accepted solution conflicted with basic taps on a toolbar that also had this better pan gesture, so I added a minimum horizontal offset parameter before changing the state to .changed.
import UIKit
import UIKit.UIGestureRecognizerSubclass // needed to set state in a subclass

class BetterPanGestureRecognizer: UIPanGestureRecognizer {
    private var initialTouchLocation: CGPoint?
    private let minHorizontalOffset: CGFloat = 5

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesBegan(touches, with: event)
        self.initialTouchLocation = touches.first?.location(in: self.view)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesMoved(touches, with: event)
        if self.state == .possible,
           abs((touches.first?.location(in: self.view).x ?? 0) - (self.initialTouchLocation?.x ?? 0)) >= self.minHorizontalOffset {
            self.state = .changed
        }
    }
}

Dismiss the Keyboard in Swift via hitTest

My code inside a UIView Class that contains a UITextField named titleInput:
override func hitTest(point: CGPoint, withEvent event: UIEvent?) -> UIView? {
    // Code to dismiss the keyboard when pressed outside the text field
    if !titleInput.pointInside(point, withEvent: event) {
        endEditing(true)
    }
    // Return the original hitTest result as usual
    return super.hitTest(point, withEvent: event)
}
I am experiencing a strange bug. When I first start the app, everything works as expected: touching outside the titleInput dismisses the keyboard. However, it breaks if I switch to a different app and then come back to this app. After coming back to the app, clicking on the keyboard also dismisses the keyboard. Makes it difficult to type :)
Any idea why this is happening, and why it ONLY starts to happen after switching away from the app and then coming back to it? Also, is there a better way to do this same thing?
I would recommend doing it this way instead of what you are trying. This will just dismiss the keyboard when the enter/return key is pressed. I realize it is in Obj-C; however, the method calls are almost identical.
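The linked example itself isn't reproduced here; a rough Swift equivalent of that return-key approach (modern syntax, assuming titleInput from the question and that its owning class adopts UITextFieldDelegate):
// Somewhere in setup: titleInput.delegate = self
func textFieldShouldReturn(_ textField: UITextField) -> Bool {
    textField.resignFirstResponder() // dismisses the keyboard when return is pressed
    return true
}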
I usually do just this
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    view.endEditing(true)
    super.touchesBegan(touches, withEvent: event)
}

Sprite kit touch and hold with swift

I'm trying to make my first game in Sprite Kit using Swift.
Everything works fine so far, but I can't figure out how to handle a touch-and-hold on the screen.
I'm trying to build up jump power while the player holds the touch, but I can't find an event for this.
Thanks ;)
You can try this:
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    // action when the user touches the screen
    // you can find out where they touched the screen like this
    let touchLocation = (touches.anyObject() as? UITouch)?.locationInNode(self) // in an SKScene; use locationInView(_:) for a plain view
}

override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    // your code here
}

override func touchesEnded(touches: NSSet, withEvent event: UIEvent) {
    // your code here
}
Those methods will help you handle touch events on your screen.
As I understand it, you need to handle 2 situations:
1 - the player taps on the screen/node - e.g. to jump
2 - the player taps and holds - e.g. to change the jump power
I think you need to handle both touchesBegan and touchesEnded.
In touchesBegan, start a timer, and after some delay (e.g. 0.5 secs) start playing a special animation (a jump power indicator that shows the current jump power).
In touchesEnded, stop the timer, stop the animation, and calculate the resulting jump power (based on the timer's value).
If you also need to handle the direction (angle) of the jump, you should also handle touchesMoved and calculate the current angle based on the touch position.
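A rough sketch of that idea in an SKScene, using the touch timestamp instead of a timer to measure the hold (maxJumpPower, the scaling factor, and the commented-out player node are my own placeholders; modern Swift syntax):
import SpriteKit

class GameScene: SKScene {
    private var touchDownTime: TimeInterval?
    private let maxJumpPower: CGFloat = 100

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Remember when the hold started; a power-indicator animation could start here.
        touchDownTime = touches.first?.timestamp
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let start = touchDownTime, let end = touches.first?.timestamp else { return }
        touchDownTime = nil
        // The longer the hold, the higher the jump power (capped at maxJumpPower).
        let holdDuration = CGFloat(end - start)
        let jumpPower = min(holdDuration * 100, maxJumpPower)
        // Apply the jump, e.g. as an impulse on the player node:
        // playerNode.physicsBody?.applyImpulse(CGVector(dx: 0, dy: jumpPower))
    }
}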

Resources