I have a wide LineChart with many entries. I want to let the user tap (or better, long-press/3D Touch) an entry to show a modal card where they can edit that data entry. I tried implementing chartValueSelected, but the problem is that it fires even when the user touches down to scroll (i.e. taps without releasing the finger), which is not how a button should behave. Is there any way to implement tap recognition for a LineChart entry?
It appears that overriding the tap gesture recognizer for the chart can work. This question has some answers from someone who was looking for a similar solution.
You can attach your own gesture recognizer to the LineChartView and use the getHighlightByTouchPoint(_:) method to get information about the selected point.
override func viewDidLoad() {
    // ...
    let longTapRecognizer = UILongPressGestureRecognizer(target: self, action: #selector(onLongTap))
    lineChartView.addGestureRecognizer(longTapRecognizer)
    // ...
}

@objc func onLongTap(recognizer: UILongPressGestureRecognizer) {
    if recognizer.state == .ended {
        // Translate the touch location into the nearest highlighted chart value.
        let highlight = lineChartView.getHighlightByTouchPoint(recognizer.location(in: lineChartView))
        print(String(describing: highlight))
    }
}
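For the original goal of showing an editable card, the handler above could be extended roughly like this. EntryEditViewController and its properties are made-up placeholders for your own editing screen, not part of Charts:

@objc func onLongTap(recognizer: UILongPressGestureRecognizer) {
    guard recognizer.state == .ended else { return }
    let point = recognizer.location(in: lineChartView)
    guard let highlight = lineChartView.getHighlightByTouchPoint(point) else { return }

    // highlight.x and highlight.dataSetIndex identify the tapped value;
    // map them back to your own data model however you store it.
    let editor = EntryEditViewController()            // hypothetical editor screen
    editor.xValue = highlight.x                       // hypothetical property
    editor.dataSetIndex = highlight.dataSetIndex      // hypothetical property
    present(editor, animated: true)
}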
In my app, I have an MKMapView with annotations.
When I tap an annotation, a small view with additional information appears.
Now I want to hide this window when I tap the map, but not when I tap an annotation.
How can I do that?
I think it's something like the code below, but I don't know how to tell whether an annotation view was tapped.
tap gesture:
let mapTap = UITapGestureRecognizer(target: self, action: #selector(mapDidTap(_:)))
map.addGestureRecognizer(mapTap)
handler:
@objc private func mapDidTap(_ sender: UITapGestureRecognizer) {
    if tapIsOnlyMap {   // <- this is the part I don't know how to determine
        hideSmallPopup()
    }
}
You can use the map view's selectedAnnotations property ([MKAnnotation]) to detect whether any annotation is currently selected. If it is empty, you can hide the window.
One question: how do you open this annotation view when an annotation is tapped? If you show the 'annotation window' from the annotation-selection callback, you may not even need to check selectedAnnotations to hide it.
Enjoy
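A rough sketch of that check inside the tap handler (map and hideSmallPopup are the asker's names):

@objc private func mapDidTap(_ sender: UITapGestureRecognizer) {
    // If no annotation is currently selected, the tap landed on the bare map.
    if map.selectedAnnotations.isEmpty {
        hideSmallPopup()
    }
}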
Edit: it seems a swipe gesture recognizer can only take one direction at a time now. If someone knows another way to handle multiple directions at once, I'd still appreciate the information!
Edit: I found a way to deal with multiple directions concisely in [this answer](https://stackoverflow.com/a/46104997/9645644). It uses an array literal with a forEach loop, which is much more convenient than adding gestures and drag actions separately from the storyboard.
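For reference, a rough sketch of that approach (the function and handler names here are just for illustration):

func setUpSwipes() {
    // One recognizer per direction; each handler call then carries
    // exactly one direction in recognizer.direction.
    let directions: [UISwipeGestureRecognizerDirection] = [.left, .right, .up, .down]
    directions.forEach { direction in
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(recognizer:)))
        swipe.direction = direction
        view.addGestureRecognizer(swipe)
    }
}

@objc func handleSwipe(recognizer: UISwipeGestureRecognizer) {
    if recognizer.direction == .left { print("left") }
    else if recognizer.direction == .right { print("right") }
}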
I'm trying to get Swift swipe gestures to work. Everything was fine until I tried to detect the direction of the swipe. Below is my code. I don't understand why this isn't working and would appreciate help!
In a view controller's viewDidLoad I set up and added the swipe gesture recognizer with direction [.left, .right]. After that I implemented the handler method, which needs to detect the direction of the swipe. There's nothing else in this view controller.
After it failed to work (no response when swiping), I added a few prints and got the output described below.
override func viewDidLoad() {
    super.viewDidLoad()
    let swipeGestureRecognizer = UISwipeGestureRecognizer(target: self, action: #selector(swipeHandler(recognizer:)))
    swipeGestureRecognizer.direction = [.left, .right]
    view.addGestureRecognizer(swipeGestureRecognizer)
}

@objc func swipeHandler(recognizer: UISwipeGestureRecognizer) {
    switch recognizer.state {
    case .ended:
        let direction = recognizer.direction
        print(direction)
        if direction == .left { print("left") }
        else if direction == .right { print("right") }
        else { print("none") }
    default: break
    }
}
No matter whether I swipe left or right, it always prints "none", and the direction print always gives "UISwipeGestureRecognizerDirection(rawValue: 3)".
The direction property tells the gesture recognizer when to trigger. For example, if direction == .right then the swipe will trigger only on a swipe to the right. (It does not tell you the direction that was detected.)
You need to detect one direction at a time. I would also suggest adding a separate method to handle each swipe direction. For example:
func setUpGestures() {
    // Gesture that defines a left swipe.
    let swipeLeft = UISwipeGestureRecognizer(target: self, action: #selector(Scene.swipeLeft))
    swipeLeft.direction = .left
    view?.addGestureRecognizer(swipeLeft)
    // Do the same for the rest of the directions.
}

@objc func swipeLeft() {
    // Do something when the user swipes left.
}
Hope it helps!
The code below adds a UIPanGestureRecognizer to the whole view on screen. When a user pans across the screen with one finger, the panning/swiping action is recognised and recognizePanGesture(_:) is triggered.
Unfortunately, my UIPanGestureRecognizer code is currently not accessibility compliant.
Questions:
How can I change the code below to ensure that it is completely accessible to users who are using VoiceOver in iOS?
What is the special gesture action a user typically uses when panning with VoiceOver active?
Code:
import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        panGestureAdd()
    }

    func panGestureAdd() {
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(ViewController.recognizePanGesture(_:)))
        panGesture.minimumNumberOfTouches = 1
        panGesture.maximumNumberOfTouches = 1
        self.view.addGestureRecognizer(panGesture)
    }

    @objc func recognizePanGesture(_ sender: UIPanGestureRecognizer) {
        print("UIPanGestureRecognizer active.")
    }
}
VoiceOver users can perform a pan by prefixing it with the "pass-through" gesture (a one-finger double-tap and hold, then continuing with the gesture). You may also want to offer an alternative way to reach the control. One approach might be to mark the element with the UIAccessibilityTraitAdjustable trait and implement accessibilityIncrement()/accessibilityDecrement().
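A rough sketch of that adjustable-element approach, assuming a custom view currently driven by the pan (the class name, value property, and label text are all made up for illustration):

import UIKit

class PanAdjustableView: UIView {

    var value = 0

    override init(frame: CGRect) {
        super.init(frame: frame)
        configureAccessibility()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        configureAccessibility()
    }

    private func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = "Slider"                        // illustrative label
        accessibilityTraits = UIAccessibilityTraitAdjustable
    }

    // VoiceOver maps swipe up to increment and swipe down to decrement,
    // so the pan behaviour stays reachable without the pass-through gesture.
    override func accessibilityIncrement() {
        value += 1
    }

    override func accessibilityDecrement() {
        value -= 1
    }
}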
I see that there are a ton of these questions, and I think I'm following the accepted Swift 3 methodology, but I'm still getting nothing. I can see that the UITapGestureRecognizer has been attached. Here's my code:
let tileClick = UITapGestureRecognizer(target: self, action: #selector(GameManagement.initiateTileClick(_:)))
newView.addGestureRecognizer(tileClick)
newView.isUserInteractionEnabled = true
func initiateTileClick(_ sender: UITapGestureRecognizer) {
    print("initiate tile click")
}
A few things to note:
1) The view that I'm attaching the gesture recognizer to has two views and a label within it, each of which covers the entire frame of the view. However, I tried attaching the recognizer to the label, which is the topmost child, and it still doesn't work.
2) Both the function that adds the recognizer and the function that is called on the tap are contained in an NSObject subclass. I have a variety of interconnected functions that I want to be able to call from multiple view controllers and would prefer to keep them in that separate NSObject file. The process worked when I had everything in a UIViewController file and stopped working when I moved the functions to the NSObject file.
3) I've tried changing GameManagement.initiateTileClick to self.initiateTileClick or just initiateTileClick, and none of those worked.
If you are putting your views (and the code that wires them up) inside an NSObject subclass, they lose the UIResponder behaviour that manages UI interactions, and I can't see how you are adding these views to the interface.
As you said, it was working inside the view controller because the view controller manages the view hierarchy and the responder chain.
The solution would be to write extensions on your view controllers to separate the code, or to find a better abstraction.
extension YourViewController {

    // Wrap the setup in a method (the name is just an example) so the
    // statements aren't floating at the top level of the extension.
    func setUpTileTap(on newView: UIView) {
        newView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(initiateTileClick(_:))))
        newView.isUserInteractionEnabled = true
    }

    @objc func initiateTileClick(_ sender: UITapGestureRecognizer) {
        print("initiate tile click")
    }
}
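Usage from the view controller might then look like this (setUpTileTap and newView are just the names from the sketch above):

override func viewDidLoad() {
    super.viewDidLoad()
    setUpTileTap(on: newView)
}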
To give you an idea of how the tap recogniser works:
First, add a Tap Gesture Recognizer object to your view controller by dragging it from the object library onto the scene in the storyboard.
Then control-drag the tap gesture object to your view and select delegate.
Then control-drag the recogniser into your Swift file, and the action will look like this:
@IBAction func tapGesture(_ sender: UITapGestureRecognizer) {
}
You must have seen that when you give input to a text field, the keyboard appears, but if you tap outside the text field, anywhere else in the view, the keyboard hides. That is exactly what the tap gesture recogniser does here.
Consider a text field that shows the keyboard when you tap into it; when you tap outside the text field, the keyboard should hide.
Adopt this delegate:
UITextFieldDelegate
Implement this:
@IBOutlet var exampleText: UITextField!

override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    exampleText.delegate = self
}

@IBAction func tapGesture(_ sender: UITapGestureRecognizer) {
    // Dismiss the keyboard when the user taps outside the text field.
    exampleText.endEditing(true)
}
Obviously, this function is an instance method:
func initiateTileClick(_ sender: UITapGestureRecognizer) {
    print("initiate tile click")
}
-
UITapGestureRecognizer(target: self, action:#selector(GameManagement.initiateTileClick(_:)))
but with GameManagement.initiateTileClick(_:) as the action and self as the target, it looks like you're asking one class to call another class's method. The target should be the object that actually implements the method; self can't respond to GameManagement.initiateTileClick(_:).
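A rough sketch of what this answer is pointing at: the GameManagement instance becomes the target (gameManager is a made-up name, and it must be kept alive, e.g. stored as a property, because gesture recognizers don't retain their target):

// The object that implements initiateTileClick(_:) must be the target.
let gameManager = GameManagement()   // keep a strong reference, e.g. as a property
let tileClick = UITapGestureRecognizer(target: gameManager,
                                       action: #selector(GameManagement.initiateTileClick(_:)))
newView.addGestureRecognizer(tileClick)
newView.isUserInteractionEnabled = true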
I'm getting a lot of delay in an app that needs to feel more instant.
I've got a simple app that toggles left and right. It's a seesaw, and when one end is up, the other is down. You're supposed to use two fingers and tap on the screen like you're just fidgeting with it. It's supposed to be an aid for ADHD.
I've got two large images for the left and right state. I've got a gesture recognizer on the image, and I check the coordinates of the tap to determine whether you tapped the side that should go down or the one that should go up. I'm also using AudioServicesPlayAlertSound to cause a small pop vibration on touch-down, in an effort to give a bit of feedback to the user.
In my tests, if I tap rapidly I seem to get a backlog of taps on the toggle. The vibrations happen way after the tap is over, so it feels useless. Sometimes the UI image also gets backlogged just switching between images.
override func viewDidLoad() {
    super.viewDidLoad()
    let imageView = Seesaw
    // Long-press recognizer with zero press duration, so the handler fires on touch-down.
    let tapGestureRecognizer = UILongPressGestureRecognizer(target: self, action: #selector(SeesawViewController.tapped))
    tapGestureRecognizer.minimumPressDuration = 0
    imageView?.addGestureRecognizer(tapGestureRecognizer)
    imageView?.isUserInteractionEnabled = true
}

@objc func tapped(touch: UITapGestureRecognizer) {
    if touch.state == .began {
        if vibrateOn {
            AudioServicesPlaySystemSound(1520)
        }
        let tapLocation = touch.location(in: Seesaw)
        if tapLocation.y > Seesaw.frame.height / 2 {
            print("Go Down")
            Seesaw.image = UIImage(named: "Down Seesaw")
            seesawUp = false
        } else if tapLocation.y < Seesaw.frame.height / 2 {
            print("Go Up")
            Seesaw.image = UIImage(named: "Up Seesaw")
            seesawUp = true
        }
    }
}
Another idea: would it be faster to implement this as a button? Are gesture recognizers just slow? Is the way I'm drawing the image states consuming the wrong kind of processing power?
Seems like you made a mistake in your code. You want to create a tap recognizer, but you created a UILongPressGestureRecognizer.
Please change the line from
let tapGestureRecognizer = UILongPressGestureRecognizer(target:self, action: #selector(SeesawViewController.tapped))
to
let tapGestureRecognizer = UITapGestureRecognizer(target:self, action: #selector(SeesawViewController.tapped))
Or you may add a transparent button and put your code in its .touchDown handler:
// onDown fires as soon as the user touches the button down (not on tap/release)
button.addTarget(self, action: #selector(onDown), for: .touchDown)
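A rough sketch of that alternative, reusing the asker's Seesaw outlet, vibrateOn and seesawUp flags; the two buttons and the goUp/goDown/addToggleButton names are made up:

override func viewDidLoad() {
    super.viewDidLoad()
    // Two transparent buttons, one per half of the image; .touchDown fires
    // the instant the finger lands, so there is no recognizer state to wait on.
    let topHalf = CGRect(x: 0, y: 0,
                         width: Seesaw.bounds.width, height: Seesaw.bounds.height / 2)
    let bottomHalf = CGRect(x: 0, y: Seesaw.bounds.height / 2,
                            width: Seesaw.bounds.width, height: Seesaw.bounds.height / 2)
    addToggleButton(frame: topHalf, action: #selector(goUp))
    addToggleButton(frame: bottomHalf, action: #selector(goDown))
    Seesaw.isUserInteractionEnabled = true
}

private func addToggleButton(frame: CGRect, action: Selector) {
    let button = UIButton(type: .custom)
    button.backgroundColor = .clear
    button.frame = frame
    button.addTarget(self, action: action, for: .touchDown)
    Seesaw.addSubview(button)
}

@objc func goUp() {
    if vibrateOn { AudioServicesPlaySystemSound(1520) }
    Seesaw.image = UIImage(named: "Up Seesaw")
    seesawUp = true
}

@objc func goDown() {
    if vibrateOn { AudioServicesPlaySystemSound(1520) }
    Seesaw.image = UIImage(named: "Down Seesaw")
    seesawUp = false
}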