Google Maps SDK iOS change floor level programmatically - ios

I can't find anything in the documentation about how to change the floor level programmatically. For example:
I load a specific polygon for one room, and I want to switch the indoor map to that room's floor level (for example, zoom to Level 2 and then add my polygon).
Adding the polygon with info boxes is not the problem; I just want to know whether it's possible to programmatically select the right floor level.
I know there is the GMSIndoorDisplayDelegate, and I am able to set the active building. There should be an "activeLevel" property, but I am unable to assign any value to it.

I found the solution on how to force the floor change for a particular building.
Adopt the GMSIndoorDisplayDelegate protocol in your class.
Set the delegates:
mapView.delegate = self
mapView.indoorDisplay.delegate = self
Add the delegate methods:
func didChangeActiveBuilding(building: GMSIndoorBuilding!) {
    if let currentBuilding = building {
        // Levels are ordered with the top floor first (see the note below).
        let levels = currentBuilding.levels as! [GMSIndoorLevel]
        mapView.indoorDisplay.activeLevel = levels[2] // select the desired level
    }
}

func didChangeActiveLevel(level: GMSIndoorLevel!) {
    println("will be called after activeBuilding")
}
P.S. Just a friendly reminder: the topmost level is the first item in the array. It's backwards.
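If you'd rather not hard-code an index, a rough sketch like the following (assuming a more recent SDK whose Swift signatures take an optional building and expose shortName on GMSIndoorLevel) picks the level by its short name instead:

func didChangeActiveBuilding(_ building: GMSIndoorBuilding?) {
    // Assumption for this sketch: we want the floor whose short name is "2".
    guard let levels = building?.levels else { return }
    if let target = levels.first(where: { $0.shortName == "2" }) {
        mapView.indoorDisplay.activeLevel = target
    }
}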

Related

Detecting when a SKNode is tapped on Apple Watch

I'm writing an app for Apple Watch using SpriteKit, so I don't have access to functions like touchesBegan; I have to use a WKTapGestureRecognizer to detect taps. No big deal, but I have issues detecting taps on a node.
In my InterfaceController I have:
@IBAction func handleTap(tapGestureRecognizer: WKTapGestureRecognizer) {
    scene?.didTap(tapGesture: tapGestureRecognizer)
}
And in my Scene file I have
func didTap(tapGesture: WKTapGestureRecognizer) {
    let position = tapGesture.locationInObject()
    let hitNodes = self.nodes(at: position)
    if hitNodes.contains(labelNode) {
        labelNode.text = "tapped!"
    }
}
Problem is, the tap gesture recognizer gives me the absolute coordinates of the touch point (for example 11.0, 5.0), while my node is positioned relative to the center of the screen (so its position is -0.99, -11.29 even though it is at the center of the screen). Therefore the tap hits the node not when I actually tap it, but when I tap the top left of the screen. I searched everywhere and it looks like this is the way to do it, yet I can't find people having the same issues. The node has been added via the editor. What am I doing wrong?
So you have the right idea. You are getting this wrong because hitNodes is an array of SKNodes. Those are newly created. So when you use hitNodes.contains the addresses of the labelNode and the address of the newly created SKNode that is being compared would be completely different. Therefore it would never be tapped.
Here's what I would do. This would be in my Scene File. Your InterfaceController class is correct.
func didTap(tapGesture: WKTapGestureRecognizer) {
    let position = tapGesture.locationInObject()
    if labelNode.contains(position) {
        labelNode.text = "tapped!"
    }
}
OR another way would be this. I like this way because you only have one function, which would be in the WKInterfaceController, and you would need no functions in your Scene file.
@IBAction func tapOnScreenAct(_ sender: WKGestureRecognizer) {
    if scene.labelNode.contains(sender.locationInObject()) {
        scene.labelNode.text = "tapped!"
    }
}
Either way, both should work. Let me know if you have any more questions or clarifications.
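If the coordinate mismatch from the question still bites (the tap point is in the interface object's top-left-origin space while the scene is centered), you can also map the tap into scene coordinates before hit-testing. This is only a sketch: it assumes the scene fills the WKInterfaceSKScene without letterboxing, and the helper name is made up.

func scenePoint(for gesture: WKTapGestureRecognizer, in scene: SKScene) -> CGPoint {
    let bounds = gesture.objectBounds()       // interface object's bounds, top-left origin
    let location = gesture.locationInObject() // tap point in the same space
    // Normalize, flip the y axis, then shift by the scene's anchor point.
    let x = (location.x / bounds.width) * scene.size.width - scene.size.width * scene.anchorPoint.x
    let y = (1 - location.y / bounds.height) * scene.size.height - scene.size.height * scene.anchorPoint.y
    return CGPoint(x: x, y: y)
}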

Remove touch gestures (rotation/translation) for iOS AR object

Summary: Remove touch gestures on an AR object after adding it?
AR View Code (Just the relevant bit).
Where arObject is a model entity with a mesh, material and a collision shape.
func updateUIView(_ uiView: ARView, context: Context) {
    arObject = CreateCustomModelEntity()
    uiView.installGestures([.translation, .rotation], for: arObject)
}
The above code adds touch gestures to my arObject and allows it to be rotated and moved across the anchored plane.
However, I want to remove the touch gestures after adding it.
User Flow:
The user would tap a model, move it around, place it where they'd like, and touch the Confirm button. After the Confirm button is touched, the arObject can no longer be moved around.
Looking at the Apple docs, there's an installGestures, but no equivalent removeGestures. Is this even possible?
A few ideas:
1. Completely remove the anchor and recreate it (but then the placement of the object is lost, so this is bad).
2. Override the existing child AR object with a new one without the touch gestures installed. I believe this would retain the object placement, but creating AR objects twice isn't ideal unless there's no other solution.
3. Create a temporary AR object with gestures installed and then override it with a new arObject (without touch gestures) after placement has been confirmed. Similar to solution 2.
The installGestures method returns an array of EntityGestureRecognizers and also adds them to the arView.gestureRecognizers array. If you want to remove the gestures for a given entity, you need to find the gesture recognizers you want and remove them from that array.
The code below assigns the translation and rotation gesture recognizers separately to properties, which you then use to find the corresponding index in the arView.gestureRecognizers array.
func updateUIView(_ uiView: ARView, context: Context) {
    arObject = CreateCustomModelEntity()
    translationGestureRecognizer = uiView.installGestures([.translation], for: arObject).first
    rotationGestureRecognizer = uiView.installGestures([.rotation], for: arObject).first
}
func removeGesture() {
    let recognizerIndex = arView.gestureRecognizers?.firstIndex(of: translationGestureRecognizer!)
    arView.gestureRecognizers?.remove(at: recognizerIndex!)
}
I found that removing the gesture recognizers from the arView is also necessary when removing an object from the scene; otherwise the object isn't released from memory, which increases the app's memory footprint.
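A slightly more defensive variant (just a sketch, assuming both recognizers were stored in optional properties as above) removes whichever recognizers are present without force-unwrapping:

func removeGestures() {
    for recognizer in [translationGestureRecognizer, rotationGestureRecognizer] {
        // Skip anything that was never installed or was already removed.
        guard let recognizer = recognizer,
              let index = arView.gestureRecognizers?.firstIndex(of: recognizer) else { continue }
        arView.gestureRecognizers?.remove(at: index)
    }
}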

How do I allow users to click on iOS maps to show a callout in swift?

I am trying to build an application where I need to show a callout with details about a business (name, address, etc.) when the user taps a point of interest on the map.
I am able to show a callout when there is an annotation. But I want functionality similar to the Apple Maps application, where, even without an annotation, users can tap directly on a point of interest to see its details.
I have already set the following properties on my mapView:
mapView.userInteractionEnabled = true
mapView.showsPointsOfInterest = true
Any help is appreciated.
You can use MKMapViewDelegate and override the mouse event:
override func rightMouseDown(theEvent: NSEvent) {
    let eventLocation: NSPoint = theEvent.locationInWindow
    // do something
    NSNotificationCenter.defaultCenter().postNotification(notification)
}
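Note that the snippet above uses AppKit types (NSEvent, NSPoint), so it is really a macOS-flavoured example. On iOS, a comparable idea (sketched here with made-up names, not taken from the answer) is to attach a tap recognizer to the map view and convert the touch point into a map coordinate:

import MapKit
import UIKit

class MapTapHandler: NSObject {
    let mapView: MKMapView

    init(mapView: MKMapView) {
        self.mapView = mapView
        super.init()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        mapView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: mapView)
        let coordinate = mapView.convert(point, toCoordinateFrom: mapView)
        // Look up the point of interest near `coordinate` (e.g. with MKLocalSearch)
        // and present your own annotation or callout for it.
        print(coordinate)
    }
}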

MKMapView with annotations issue

I have an MKMapView with many annotations in the proper order, and I have a button to change the map mode:
- (IBAction)userDidPressTrackButton:(id)sender {
    if (self.mapView.userTrackingMode == MKUserTrackingModeFollow) {
        self.mapView.userTrackingMode = MKUserTrackingModeFollowWithHeading;
        return;
    }
    if (self.mapView.userTrackingMode == MKUserTrackingModeFollowWithHeading) {
        self.mapView.userTrackingMode = MKUserTrackingModeNone;
        return;
    }
    if (self.mapView.userTrackingMode == MKUserTrackingModeNone) {
        self.mapView.userTrackingMode = MKUserTrackingModeFollow;
    }
}
When the mode is MKUserTrackingModeFollowWithHeading, the annotations start arranging themselves in random order. It seems that in this mode the map view redraws itself every second and places its subviews (the annotations) in an unknown order.
How can I stop the order of the annotations from changing?
The array of annotations that you get back from mapView.annotations is not guaranteed to be in the same order that you added them in. If you are relying on them being in a particular order, you are probably writing your viewForAnnotation function wrong. It is given an annotation as a parameter, and you use that to determine what view to draw. Usually people make a custom class that implements the MKAnnotation protocol and has some additional properties in which they store the data needed to create the right view. When viewForAnnotation gets called, they check whether the provided annotation is one of their special class and, if so, extract the data, make the view, and return it. There is no need to know the order of the annotations array, and it wouldn't do you any good, because an MKMapView can and will ask for the annotations in any sequence it likes.
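A rough sketch of that pattern (PlaceAnnotation and MapController are illustrative names, not from the question):

import MapKit
import UIKit

// Hypothetical annotation type that carries the data needed to build its view.
final class PlaceAnnotation: NSObject, MKAnnotation {
    let coordinate: CLLocationCoordinate2D
    let title: String?
    let imageName: String

    init(coordinate: CLLocationCoordinate2D, title: String, imageName: String) {
        self.coordinate = coordinate
        self.title = title
        self.imageName = imageName
    }
}

final class MapController: UIViewController, MKMapViewDelegate {
    @IBOutlet var mapView: MKMapView!

    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        // Anything that isn't ours (e.g. the user location) gets the default view.
        guard let place = annotation as? PlaceAnnotation else { return nil }
        let identifier = "place"
        let view = mapView.dequeueReusableAnnotationView(withIdentifier: identifier)
            ?? MKAnnotationView(annotation: place, reuseIdentifier: identifier)
        view.annotation = place
        view.image = UIImage(named: place.imageName)
        view.canShowCallout = true
        return view
    }
}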
Side note: why are you writing your own function for toggling the tracking mode? You can just use the built-in control (MKUserTrackingBarButtonItem) and it will do it all for you.

How do I set the accessibility label for a particular segment of a UISegmentedControl?

We use KIF for our functional testing, and it uses the accessibility label of elements to determine where to send events. I'm currently trying to test the behaviour of a UISegmentedControl, but in order to do so I need to set different accessibility labels for the different segments of the control. How do I set the accessibility label for a particular segment?
As Vertex said,
obj-c
[[[self.segmentOutlet subviews] objectAtIndex:3] setAccessibilityLabel:@"GENERAL_SEGMENT"];
swift
self.segmentOutlet.subviews[3].accessibilityLabel = "GENERAL_SEGMENT"
some advice so you don't go crazy like I did:
To scroll in accessibility mode, swipe with three fingers.
The indexes of the segments are the reverse of what you would expect, i.e. the rightmost segment is index 0 and the leftmost is index n-1, where n is the number of segments in the UISegmentedControl.
I'm just getting started with KIF myself, so I haven't tested this, but it may be worth a try. I'm sure I'll have the same issue soon, so I'd be interested to hear if it works.
First, UIAccessibility Protocol Reference has a note under accessibilityLabel that says:
"If you supply UIImage objects to display in a UISegmentedControl, you can set this property on each image to ensure that the segments are properly accessible."
So, I'm wondering if you could set the accessibilityLabel on each NSString object as well and be able to use that to access each segment with KIF. As a start, you could try creating a couple of strings, setting their accessibility labels, and using [[UISegmentedControl alloc] initWithItems:myStringArray]; to populate it.
Please update us on your progress. I'd like to hear how this goes
Each segment of a UISegmentedControl is a UISegment instance, which is a subclass of UIImageView. You can access those instances through the subviews property of UISegmentedControl and try to add accessibility labels to them programmatically.
You can't rely on the index in the subviews array for the position. For customisation of the individual subviews I sort the subviews by their x position before setting any property, which also works for accessibilityLabel.
let sortedViews = self.subviews.sorted( by: { $0.frame.origin.x < $1.frame.origin.x } )
sortedViews[0].accessibilityLabel = "segment_full"
sortedViews[1].accessibilityLabel = "segment_not_full"
This is an old question, but just in case anyone else runs up against this: I found that the segments automatically have an accessibility label matching their text. So if two options, Option 1 and Option 2, were added, a call to
[tester tapViewWithAccessibilityLabel:@"Option 2"];
successfully selected the segment.
The solutions using an indexed subview don't work, since you cannot rely on a correct order, and they make it difficult to change the number of segments. Sorting by origin does not work either, since the frame (at least in current versions) always seems to be at x: 0.
My solution:
(segmentedControl.accessibilityElement(at: 0) as? UIView)?.accessibilityLabel = "Custom VoiceOver Label 1"
(segmentedControl.accessibilityElement(at: 1) as? UIView)?.accessibilityLabel = "Custom VoiceOver Label 2"
(segmentedControl.accessibilityElement(at: 2) as? UIView)?.accessibilityLabel = "Custom VoiceOver Label 3"
Seems to work for me and has the correct order. You also do not rely on an image. Not that pretty either but maybe more reliable than other solutions.
Further to Stuart's answer, I found it really useful when writing test cases to turn on 'Accessibility Inspector' on the Simulator (Settings -> General -> Accessibility -> Accessibility Inspector). You'd be surprised how many elements already have accessibility labels included, like in the standard iOS UI elements or even third party frameworks.
Note: Gestures will now be different - Tap to view accessibility information, double tap to select. Minimizing the Accessibility Inspector window (by tapping the X button) will return the gestures back to normal.
You guys want to see how Apple recommends it be done?
It's FUGLY.
This is from this example:
func configureCustomSegmentsSegmentedControl() {
    let imageToAccessibilityLabelMappings = [
        "checkmark_icon": NSLocalizedString("Done", comment: ""),
        "search_icon": NSLocalizedString("Search", comment: ""),
        "tools_icon": NSLocalizedString("Settings", comment: "")
    ]
    // Guarantee that the segments show up in the same order.
    var sortedSegmentImageNames = Array(imageToAccessibilityLabelMappings.keys)
    sortedSegmentImageNames.sort { lhs, rhs in
        return lhs.localizedStandardCompare(rhs) == ComparisonResult.orderedAscending
    }
    for (idx, segmentImageName) in sortedSegmentImageNames.enumerated() {
        let image = UIImage(named: segmentImageName)!
        image.accessibilityLabel = imageToAccessibilityLabelMappings[segmentImageName]
        customSegmentsSegmentedControl.setImage(image, forSegmentAt: idx)
    }
    customSegmentsSegmentedControl.selectedSegmentIndex = 0
    customSegmentsSegmentedControl.addTarget(self,
        action: #selector(SegmentedControlViewController.selectedSegmentDidChange(_:)),
        for: .valueChanged)
}
They apply the accessibility labels to images, and then attach the images. Not too different from the above answer.
Another option, if you're not willing to set accessibility labels, might be to calculate the position of each segment and use
[tester tapScreenAtPoint:segmentPosition];
to trigger the action.
If you look at the segmented control thru the accessibility inspector, you find that the segments are UISegment objects. Moreover, they turn out to be direct subviews of the UISegmentedControl. That fact suggests the following insanely crazy but perfectly safe Swift 4 code to set the accessibility labels of the segments of a UISegmentedControl:
let seg = // the UISegmentedControl
if let segclass = NSClassFromString("UISegment") {
    let segments = seg.subviews.filter { type(of: $0) == segclass }
        .sorted { $0.frame.minX < $1.frame.minX }
    let labels = ["Previous article", "Next article"] // or whatever
    for pair in zip(segments, labels) {
        pair.0.accessibilityLabel = pair.1
    }
}
As mentioned in the accepted answer, adding accessibilityLabel to the text should do the trick:
let title0 = "Button1" as NSString
title0.accessibilityLabel = "MyButtonIdentifier1"
segmentedControl.setTitle("\(title0)", forSegmentAt: 0)

let title1 = "Button2" as NSString
title1.accessibilityLabel = "MyButtonIdentifier2"
segmentedControl.setTitle("\(title1)", forSegmentAt: 1)
Xcode 12 / iOS 14.3 / Swift 5
This is an old post but I encountered the same problem trying to set accessibility hints for individual segments in a UISegmentedControl. I also had problems with some of the older solutions. The code that's currently working for my app borrows from replies such as those from matt and Ilker Baltaci and then mixes in my own hack using UIView.description.
First, some comments:
For my UISegmentedControl with 3 segments, the subview count is 3 in the viewDidLoad() and viewWillAppear() of the parent UIViewController, but the subview count is 7 in viewDidAppear().
In viewDidLoad() or viewWillAppear() the subview frames aren't set, so ordering the subviews didn't work for me. Apparently Benjamin B encountered the same problem with frame origins.
In viewDidAppear(), the 7 subviews include 4 views of type UIImageView and 3 views of type UISegment.
UISegment is a private type. Working directly with the private API might flag your app for rejection. (see comment below)
type(of:) didn't yield anything useful for the UISegment subviews
(HACK!) UIView.description can be used to check the type without accessing the private API.
Setting accessibility hints based on X order tightly couples UI segment titles and hints to their current positions. If user testing suggests a change in segment order, then changes must be made both in the UI and in the code to set accessibility hints. It's easy to miss that.
Using an enum to set segment titles is an alternative to relying on an X ordering set manually in the UI. If your enum is backed by String and adopts CaseIterable, it's straightforward to create titles from the enum cases and to determine the enum case from a segment title (see the sketch after the code below).
There's no guarantee the following will work in a future release of the framework, given the reliance on description.contains("UISegment") but it's working for me. Gotta move on.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    // get only the UISegment items; ignore UIImageView
    let segments = mySegmentedControl.subviews.compactMap(
        { $0.description.contains("UISegment") ? $0 : nil }
    )
    let sortedSegments = segments.sorted(
        by: { $0.frame.origin.x < $1.frame.origin.x }
    )
    for i in 0 ..< sortedSegments.count {
        let segment = sortedSegments[i]
        // set .accessibilityHint or .accessibilityLabel by index
        // or check for a segment title matching an enum case
        // ...
    }
}
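As a hedged illustration of the enum comment above (Tab and its cases are invented for this sketch; mySegmentedControl is the same control as in the code above):

enum Tab: String, CaseIterable {
    case done = "Done"
    case search = "Search"
    case settings = "Settings"
}

// Build the segments from the enum cases, in declaration order.
mySegmentedControl.removeAllSegments()
for (index, tab) in Tab.allCases.enumerated() {
    mySegmentedControl.insertSegment(withTitle: tab.rawValue, at: index, animated: false)
}

// Later, recover the case from the selected segment's title rather than
// trusting the order of the subviews.
if mySegmentedControl.selectedSegmentIndex != UISegmentedControl.noSegment,
   let title = mySegmentedControl.titleForSegment(at: mySegmentedControl.selectedSegmentIndex),
   let tab = Tab(rawValue: title) {
    print("Selected:", tab)
}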
On Private APIs and Rejection
I'm referring to the April 2016 comment from @dan in Test if object is an instance of class UISegment:
It's a private class. You can check it with [... isKindOfClass:NSClassFromString(@"UISegment")] but that may get your app rejected for using private api or stop working in the future if apple changes the internal class name or structure.
Also:
What exactly is a Private API, and why will Apple reject an iOS App if one is used?
"App rejected due to non-public api's": https://discussions.apple.com/thread/3838251
As Vortex said, the array runs right to left, with [0] being the rightmost segment. You can set every accessibility option by accessing the subviews. Since first and last return optionals, it is good to unwrap the subview first and then assign the accessibility traits you want. Swift 4 example for a simple two-option segmented control:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()

    guard let rightSegment = segmentControl.subviews.first,
          let leftSegment = segmentControl.subviews.last else { return }

    rightSegment.accessibilityLabel = "A label for the right segment."
    rightSegment.accessibilityHint = "A hint for the right segment."
    leftSegment.accessibilityLabel = "A label for the left segment."
    leftSegment.accessibilityHint = "A hint for the left segment."
}
