When I do a single tap in iOS VoiceOver mode, it reads the tagged element, but I want to know the x and y of the tap point. Is there any API to get it?
You can't get this information from VoiceOver; the APIs don't support it. The closest you could get would be to grab the focused element and understand that the last touch point was somewhere within its rectangle. But even then, there would be no way to distinguish elements focused by a single touch from elements focused "focusNext" style by sequential navigation (the swipe-right and swipe-left gestures).
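If that approximation is enough, here's a minimal sketch of grabbing the focused element's on-screen rectangle by listening for focus-change notifications (the `FocusObserver` name is illustrative):

```swift
import UIKit

final class FocusObserver {
    private var token: NSObjectProtocol?

    func start() {
        // Fires whenever VoiceOver moves focus, whether by touch or by
        // swipe navigation; the notification does not say which one it was.
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.elementFocusedNotification,
            object: nil,
            queue: .main
        ) { note in
            guard let element = note.userInfo?[UIAccessibility.focusedElementUserInfoKey]
                    as? NSObject else { return }
            // accessibilityFrame is in screen coordinates; if focus came from
            // a touch, the tap landed somewhere inside this rect, but the
            // exact point is not exposed.
            print("Focused frame:", element.accessibilityFrame)
        }
    }

    deinit {
        if let token = token {
            NotificationCenter.default.removeObserver(token)
        }
    }
}
```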
Related
I'd like to know the best way to detect a user's tap on a label.
The new iOS 15 Maps app allows a tap on e.g. a city's name and then shows information about that city.
I am now wondering if something similar can be done with Mapbox.
I know that there is a mapView.visibleFeatures(in: myRect) function that can somehow help here: I can convert my finger location to a rect and then get all the features in it.
BUT... my city's label might be, say, 200 px wide, so I would need quite a large rect to hit the point of the label, and then I would also get all kinds of other labels that happen to fall inside it, maybe ones that aren't even visible but are in the dataset.
Is there no way to ask the map what the frontmost element was when I tapped? So that when I tap on the far end of the label, I still get that ONE feature?
I am still using Mapbox V6.3... the latest before their last major update.
But if it's not possible with that version, an answer about the latest V10.something would also be great.
For v10, this example demonstrates how to identify features near a tap. While the overall example serves a different purpose, its onMapClick function shows the method for finding a feature and then building an annotation.
https://docs.mapbox.com/ios/maps/examples/view-annotation-marker/
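For reference, hit-testing in v10 generally goes through queryRenderedFeatures, which tests the tap point against what is actually drawn (including a label's rendered extent) rather than against a hand-built rect. A minimal sketch, assuming a symbol layer id of "city-labels" (illustrative) and noting that the exact method name shifted between 10.x releases:

```swift
import UIKit
import MapboxMaps

func handleTap(_ gesture: UITapGestureRecognizer, mapView: MapView) {
    let point = gesture.location(in: mapView)
    // Restricting the query to the label layer avoids hits on unrelated layers.
    let options = RenderedQueryOptions(layerIds: ["city-labels"], filter: nil)
    // Depending on the 10.x release this may be spelled
    // queryRenderedFeatures(at:options:completion:) instead.
    mapView.mapboxMap.queryRenderedFeatures(with: point, options: options) { result in
        if case .success(let features) = result, let tapped = features.first {
            // Features come back in rendering order, so the first one is
            // the frontmost label under the finger.
            print(tapped.feature.properties ?? [:])
        }
    }
}
```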
I am making an application that works essentially like a simple Drag-and-Drop Playground, with the command blocks on the left and a droppable area on the right. I want to make it fully compatible with VoiceOver, and I'm running into trouble with some of the accessibility aspects since this is my first Swift application.
This is what the playground currently looks like: (App Screenshot)
My goal is to provide users with audio cues/feedback while they are dragging the elements, to help them figure out what part of the screen they are currently on. The ideal functionality would be exactly like what you experience when editing an iOS device's Home Screen (rearranging the app layout).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a row/column alert when you are dragging an app over an open area. I want a similar type of feedback that says "Droppable Area" when you are over the correct area (see scenario 1).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a sound when you tap on an area that has no app icon. (This also happens when you are not editing the layout and simply tap on an open area with no app.) I want that noise to be what you hear when you drag a command over an area that is not droppable (see scenario 2).
Any ideas on how this might be possible or good references to look at?
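One building block worth trying, as a sketch: post a VoiceOver announcement from the drop target's enter/exit callbacks. This assumes the droppable area uses UIKit's drag-and-drop API (UIDropInteraction); the class name and spoken strings are illustrative:

```swift
import UIKit

final class DropAreaDelegate: NSObject, UIDropInteractionDelegate {
    func dropInteraction(_ interaction: UIDropInteraction,
                         canHandle session: UIDropSession) -> Bool {
        session.canLoadObjects(ofClass: NSString.self)
    }

    // Scenario 1: announce when the dragged command first moves over the area.
    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidEnter session: UIDropSession) {
        UIAccessibility.post(notification: .announcement, argument: "Droppable area")
    }

    // Scenario 2: announce (or play a sound) when the drag leaves the area.
    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidExit session: UIDropSession) {
        UIAccessibility.post(notification: .announcement, argument: "Not a droppable area")
    }
}
```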
When you deep press on a phone number, for example, it shows a different style of peek view: (Screenshot)
I'd like to know if there's a simple way of replicating this or if I have to set up all those views myself. I couldn't find anything in the documentation.
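One thing worth checking before building it all yourself: for phone numbers specifically, the system data detectors on a non-editable UITextView hand you the link styling and the system preview/action UI for free. A minimal sketch (the text is illustrative):

```swift
import UIKit

let textView = UITextView()
textView.isEditable = false            // data detectors only work when not editable
textView.dataDetectorTypes = .phoneNumber
textView.text = "Call us at 555-0123"  // rendered as a tappable phone link
```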
I am working on a react-native app that needs to detect which line number in a TextInput (UITextField) has been swiped by a user.
A popular app that uses this type of interaction is Paper, which allows a user to swipe a given line in a text document to style it.
Even though my main use case is react-native, I'm curious to know what other developers familiar with RN or iOS think are the building blocks for such an interaction.
My current thoughts are:
Wrap my TextInput with a PanResponder element
Detect whatever I consider is a valid user gesture
Determine the x,y coordinate of the valid gesture
Somehow determine the sentence that was swiped, given the above coordinates (see the sketch after this list)
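For that last step, if you can drop down to the native side (e.g., in a small native module), TextKit can map a point to a character index and from there to the enclosing line. A minimal sketch, assuming a multiline input backed by a UITextView:

```swift
import UIKit

// Returns the text of the line under `point` (in the text view's coordinates),
// or nil if the point falls past the end of the text.
func line(at point: CGPoint, in textView: UITextView) -> String? {
    var location = point
    location.x -= textView.textContainerInset.left
    location.y -= textView.textContainerInset.top
    let layoutManager = textView.layoutManager
    let glyphIndex = layoutManager.glyphIndex(for: location, in: textView.textContainer)
    let charIndex = layoutManager.characterIndexForGlyph(at: glyphIndex)
    let nsText = textView.text as NSString
    guard charIndex < nsText.length else { return nil }
    // lineRange(for:) expands the zero-length range to the whole line.
    let lineRange = nsText.lineRange(for: NSRange(location: charIndex, length: 0))
    return nsText.substring(with: lineRange)
}
```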
I want to get the touch points in an iOS app under Delphi (preferably as an event).
Is there an interface for this that is exposed?
I found this:
Get touch points in UIScrollView through UITapGestureRecognizer.
But I don't know whether it's possible to convert this code.
I am trying to implement a number of on-screen sliders (audio faders) so that the user can slide a finger to move each one up and down. I plan to put a transparent rectangle control on top of the sliders so that I can get multiple touch points and move more than one simultaneously.
After going down this rabbit hole about as far as possible, I am sad to say that it's not possible to get multiple touch points in Delphi. This is very shortsighted and appears to be a trivial oversight by Embarcadero.
Essentially, the Delphi UIView implementation does not expose the "multipleTouchEnabled" property, so the touch events inside FMX.Platform.iOS.pas will never receive more than one touch.
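For reference, the native side is a single UIKit property that the FMX wrapper would need to set and expose; a minimal sketch:

```swift
import UIKit

let view = UIView()
// Without this flag, UIKit delivers at most one touch to the view, which is
// why the FMX handlers never see a second touch point.
view.isMultipleTouchEnabled = true   // "multipleTouchEnabled" in Objective-C
```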
I am posting this to save the next person the time that I put into finding this disappointing answer.