Google Places API for iOS does not set localized language

In my app I need the user to pick a place using Google Places. I call the Google place picker like so:
GMSPlacePicker * placePicker = [[GMSPlacePicker alloc] initWithConfig:config];
GMSPlacesClient * placesClient = [GMSPlacesClient sharedClient];
[placePicker pickPlaceWithCallback:^(GMSPlace *place, NSError *error) {...}];
My problem is that the presented picker view is in English instead of my localized language (Slovak). I have read that it is supposed to detect the localization by itself?

What I did was change the default language in my app's Info.plist:
Localization native development region = "your language prefix goes here"
This changed my app's localization and also changed the Google Place Picker to the preferred language.
My app also needs to be left-to-right, but my language is right-to-left, so I also forced the app to left-to-right
in my app delegate:
UIView.appearance().semanticContentAttribute = UISemanticContentAttribute.forceLeftToRight
Note that semanticContentAttribute is supported only on iOS 9 and above.
I hope this helps anyone struggling with this issue.
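For reference, the Info.plist key above has the raw name CFBundleDevelopmentRegion and would look like this in the plist source; the value "sk" for Slovak is an assumption matching this question's case:

```xml
<key>CFBundleDevelopmentRegion</key>
<string>sk</string>
```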

Related

Why does my NSUserActivity not show up in the Shortcuts app?

I've defined 3 different NSUserActivity objects in my iOS app to allow shortcuts to launch my app in different views. These worked just fine.
For example:
userActivityGo = [[NSUserActivity alloc] initWithActivityType:SIRI_MAIN_ACTIVITY_ID];
userActivityGo.title = @"Launch App";
userActivityGo.eligibleForPrediction = YES;
userActivityGo.eligibleForSearch = YES;
userActivityGo.userInfo = @{@"action" : @"go"};
userActivityGo.requiredUserInfoKeys = [NSSet setWithArray:userActivityGo.userInfo.allKeys];
self.userActivity = userActivityGo;
Recently (maybe due to recent iOS updates?) I've noticed these shortcuts are missing from the Shortcuts app.
When I enter the "Apps" section in Shortcuts, my app is not listed there, but I can still see my predefined shortcuts under "Siri Suggestions".
I did not implement intents since I don't need any special UI or input from the user, but as I've already said, it worked before.
Were there any related changes that could cause this?

iOS TTS Alex voice is treated like available when it's actually not

In my app I allow the user to change TTS voices, and the English Alex voice is one of the possible options. However, there are some cases where it's treated as available when it's actually not. The result is that the TTS utterance is pronounced with another voice.
To prepare the list of available voices I use the following code:
NSMutableArray *voices = [NSMutableArray new];
for (AVSpeechSynthesisVoice *voice in [AVSpeechSynthesisVoice speechVoices]) {
    [voices addObject:@{
        @"id": voice.identifier,
        @"name": voice.name,
        @"language": voice.language,
        @"quality": (voice.quality == AVSpeechSynthesisVoiceQualityEnhanced) ? @500 : @300
    }];
}
To start the playback I use the following code (simplified a bit):
AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:text];
utterance.voice = [AVSpeechSynthesisVoice voiceWithIdentifier:voice];
// speakUtterance: is an instance method, so a retained synthesizer instance is needed
[self.synthesizer speakUtterance:utterance];
Cases when AVSpeechSynthesisVoice returns Alex as available when it's not:
The easiest way to reproduce it is on the simulator. When I download Alex from the iOS settings, the download button disappears, but when I press the voice nothing happens. As a result it seems to be downloaded; however, it can't be deleted.
In some cases Alex is downloaded correctly and is actually available in the app, but when I try to delete it, it looks like it's not fully deleted. As a result it's treated as available in my app, but in the iOS settings it's shown as not downloaded.
If the iPhone storage is as close to full as possible and the Alex voice hasn't been used recently, it looks like it gets offloaded, so it's shown as available both in the iOS settings and in my app, but in fact the utterance is pronounced by another voice.
For all the cases above, Alex looks available in my app, but when I pass it to the utterance it's pronounced with some different voice. Note that this happens only with this voice; I haven't seen such a case for any other. Maybe this voice should be treated separately somehow?
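One defensive sketch (an assumption on my part, not a confirmed fix): since +voiceWithIdentifier: returns nil when the voice can't actually be resolved, you can validate the chosen voice just before speaking and fall back explicitly rather than letting the system substitute a voice silently. The `voiceIdentifier` variable is assumed to hold the identifier the user picked:

```objc
AVSpeechSynthesisVoice *chosenVoice = [AVSpeechSynthesisVoice voiceWithIdentifier:voiceIdentifier];
if (chosenVoice == nil) {
    // The requested voice is not really usable right now (e.g. offloaded);
    // fall back to a language-matched default and surface that to the user.
    chosenVoice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];
}
utterance.voice = chosenVoice;
```

This at least makes the substitution explicit in your own code, though it may not catch the simulator case where the voice is reported but half-downloaded.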

add stamp annotation using PSPDFKit iOS objective-c

I am using the PSPDFKit framework, and I am unable to add a stamp annotation. I have implemented the following:
[pdfController.annotationStateManager toggleState:PSPDFAnnotationStringStamp];
NSMutableArray<PSPDFStampAnnotation *> *defaultStamps = [NSMutableArray array];
for (NSString *stampSubject in @[@"Great!", @"Stamp", @"Like"]) {
    PSPDFStampAnnotation *stamp = [[PSPDFStampAnnotation alloc] initWithSubject:stampSubject];
    stamp.boundingBox = CGRectMake(0.f, 0.f, 200.f, 70.f);
    [defaultStamps addObject:stamp];
}
PSPDFStampAnnotation *imageStamp = [[PSPDFStampAnnotation alloc] init];
imageStamp.image = [UIImage imageNamed:@"abc.jpg"];
imageStamp.boundingBox = CGRectMake(0.f, 0.f, imageStamp.image.size.width/4.f, imageStamp.image.size.height/4.f);
[defaultStamps addObject:imageStamp];
[PSPDFStampViewController setDefaultStampAnnotations:defaultStamps];
but I have no output.
Peter here, Founder and CEO of PSPDFKit, the PDF SDK for iOS, Android, Web, macOS and (soon) Windows. The best way to reach our support is directly through our support page; support is part of your license subscription.
You're setting default stamps for the PSPDFStampViewController. Can you post a screenshot of how things look? You're changing the defaults here (APPROVED, REJECTED and so on), replacing them with your own images, which is valid and works.
Note that you only need to call this once and it needs to be called before you switch/toggle the stamp mode, so your current code will not work.
Please also make sure you use the latest version so we can rule out any old bugs or incompatibilities. As of writing this, it's Xcode 8.3 and PSPDFKit 6.6 (click for release blog post).
Stamps only show up if you have the annotation component licensed - if you ping us on support my team can check what components you have to make sure that's not the problem here.
If you're just trying to programmatically add annotations via our framework, check out this guide article instead.
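Following the note above that the defaults must be set before toggling the stamp mode, a reordered sketch of the question's code (same names, only the call order changed) would be:

```objc
// Configure the default stamps once, e.g. during setup...
[PSPDFStampViewController setDefaultStampAnnotations:defaultStamps];
// ...and only afterwards enter the stamp annotation mode.
[pdfController.annotationStateManager toggleState:PSPDFAnnotationStringStamp];
```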

VoiceOver accessibility label for Touch ID

I am trying to ensure that the iOS app that I am working on is accessible and am trying to implement VoiceOver to ensure this.
One strange thing that I cannot find any help for is when the Touch ID view is displayed (in my case for signing into the app). VoiceOver pronounces ID as a word and not I.D.
I have tried implementing the accessibility attributes to both NSString and the LAContext object but neither seem to change what is read out by VoiceOver. Code snippets below:
LAContext *context = [[LAContext alloc] init];
[context setIsAccessibilityElement:YES];
[context setAccessibilityLabel:@"TEST 2"];
NSError *error = nil;
NSString *label = @"Please authenticate your ID using the Touch ID";
[label setIsAccessibilityElement:YES];
[label setAccessibilityTraits:UIAccessibilityTraitStaticText];
[label setAccessibilityLabel:@"TEST"];
showingTouchID = TRUE;
if ([context canEvaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics error:&error]) {
[context evaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics
localizedReason:label
reply:^(BOOL success, NSError *error) {
......
The output from VoiceOver with or without context having the accessibility attributes is always the label text.
All help greatly appreciated :)
You should definitely not change the accessibility label just to make VoiceOver pronounce things correctly (i.e. do not try to "hack" the label pronunciation). The reason is that VoiceOver does not have speech output only; it also has braille output, where blind users expect to read things exactly letter-by-letter as they are written (i.e. to see exactly all the spaces, capital/small letters, etc.). If you did, e.g., write "I D" instead of "ID", then while VoiceOver would perhaps pronounce it correctly (in the specific version of iOS), blind users, after also reading "I D" on a braille display, might think that is how it is actually written, and would appear, let's say, non-professional when they then used this wrong spelling in written exchanges with other people.
The correct way to deal with this, albeit without giving you an immediate solution, is:
File a bug with Apple about pronunciation of the specific word with the specific voice in the specific language (e.g. "Expected pronunciation: [aj'di:]" vs. "Actual pronunciation: [id]").
File a bug with Apple to request the ability to customize pronunciation only (i.e. where you would leave the accessibility label intact and correct, but specify to the voice how it should pronounce a certain part of the text), and where this customization could be done for each language individually by the translator translating the strings (because wrong pronunciation is language-specific); also see the next point.
If you can reword, try a different word than the problematic one (which seems not applicable in the case of "Touch ID", which is a set term). But this is a hack too, as it solves only the English original and does nothing for translations, where the rewording might on the contrary complicate the pronunciation.
Sorry for the bad news.
Finally, here, on both iOS 8.4.1 and iOS 9.0.2, VoiceOver with the default US English iOS voice, at least on this webpage, pronounces "ID" in "Touch ID" as [ajdi:], not [id].
You can try this for a quick workaround: just put a space between I and D:
NSString *label = @"Please authenticate your ID using the Touch ID";
label.accessibilityLabel = @"Please authenticate your I D using the Touch I D";
Also please note that you can only set accessibility on UI elements; you cannot set it on general variables. It doesn't make sense to set an accessibility label on an LAContext or an NSString.
You need to set the accessibility label on the UILabel or whichever element you give the NSString to.
Starting with iOS 11, you can set the element's accessibilityAttributedLabel and use the UIAccessibilitySpeechAttributeIPANotation key (Swift: NSAttributedString.Key.accessibilitySpeechIPANotation) to specify the pronunciation for a range of the attributed string.
See "Speech Attributes for Attributed Strings" for other tools you can use to tweak how VoiceOver reads your text.
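A minimal sketch of that iOS 11+ approach; the label object `myLabel` and the exact IPA string are illustrative assumptions:

```objc
NSMutableAttributedString *speech =
    [[NSMutableAttributedString alloc] initWithString:@"Please authenticate using Touch ID"];
NSRange idRange = [speech.string rangeOfString:@"ID"];
// Override pronunciation for just the "ID" part; braille output still
// shows the unmodified text.
[speech addAttribute:UIAccessibilitySpeechAttributeIPANotation
               value:@"ˈaɪˈdiː"
               range:idRange];
myLabel.accessibilityAttributedLabel = speech;
```

This keeps the visible and braille text intact while only adjusting what the voice speaks, which avoids the "I D" spelling problem described above.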

How to get Localized NSError

I have done a lot of Google searching without success.
I would like to use the system localization of NSError (in my case, French).
I have tried many things, but the wording is always in English.
The configuration of my app:
In the plist :
CFBundleDevelopmentRegion = fr_FR
In the pbxproj:
developmentRegion = fr;
knownRegions = (
fr,
Base,
);
When I call the property "localizedDescription" I always get the English version, as in this link (NSError localizedDescription always returns english error message), but the solution doesn't work for me...
I can't find what I'm missing.
In this other link (NSURLConnection returns NSError with only english as language?) they copy the strings, but I don't think that's the best way; we should be able to access the file without copying it.
For information, when I use a UIBarButtonItem like Cancel, it is localized in French.
Thanks in advance for your help.
First, you should have a localization file with key and value strings.
Then, when you want to show the localized error, you can use the following macro:
NSLocalizedString(<#key#>, <#comment#>)
I have had the same issue; the solution was to set the Info.plist attribute Localization native development region to your region, France in this case.
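In plist source form, that key (raw name CFBundleDevelopmentRegion) is set like this; "fr" matches the French case described above:

```xml
<key>CFBundleDevelopmentRegion</key>
<string>fr</string>
```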
