Setting language across whole app - SwiftUI - iOS

I am looking to add multiple languages to my app, so I have created the different .strings files and added them to my localisations. I have a settings page in my app where I want users to be able to select the language the app is in, with the whole app changing to match.
VStack {
    Button("English") {
        // set app to English
    }
    Button("Français") {
        // set app to French
    }
    Button("Cymraeg") {
        // set app to Welsh
    }
    // ...
}
I've found the modifier .environment(\.locale, Locale(identifier: "en")), but I believe this modifier needs to be added to every view, and I was wondering if there is an easier way to do this? I also want the language to be saved to user defaults.
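One approach (a minimal sketch, assuming iOS 14+ for @AppStorage and the @main App lifecycle; the key name "appLanguage" is just an illustration): apply the locale once to the root view and persist the selection with @AppStorage, which is backed by UserDefaults.

import SwiftUI

@main
struct MyApp: App {
    // persisted in UserDefaults under the illustrative key "appLanguage"
    @AppStorage("appLanguage") private var appLanguage = "en"

    var body: some Scene {
        WindowGroup {
            ContentView()
                // every view in the hierarchy inherits this locale,
                // so the modifier only needs to be applied here
                .environment(\.locale, Locale(identifier: appLanguage))
        }
    }
}

struct ContentView: View {
    @AppStorage("appLanguage") private var appLanguage = "en"

    var body: some View {
        VStack {
            Button("English") { appLanguage = "en" }
            Button("Français") { appLanguage = "fr" }
            Button("Cymraeg") { appLanguage = "cy" }
        }
    }
}

Note that this only affects SwiftUI views that read the locale from the environment; strings resolved outside that path (for example system-provided UI) may still follow the device language.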

Related

One Time Password On iOS App - Suggestion not appearing above keyboard

I'm looking to have an OTP offered as a suggestion at the top of the keyboard for an OTP entry field in an iOS app.
The iOS version on the phone is 12.2.
The iOS SDK version of my app is 12.1.
Using Visual Studio (Windows) 2017 15.9.13.
Now I have done the following:
Created a new control: public class OTPEntry : Xamarin.Forms.Entry
Created a renderer for the control, in which I set Control.TextContentType = UITextContentType.OneTimeCode;
I then use this control on a ContentPage with the correct namespace, etc.
So when I am on the form with this control, I send a text to the phone with an OTP. On the phone, if I tap on the code it offers a "Copy Code" option, so it is recognised as an OTP.
However, for the life of me, when I tap in the control to bring up the keyboard, I do not see the code at the top of the keyboard as expected.
What could I possibly be missing?
It seems the steps to implement this are relatively straightforward, but I cannot seem to get it working.
Any ideas or pointers would be very greatly appreciated.
Code below...
CONTROL - IN Xamarin Forms Project
namespace XXXX
{
    public class OTPEntry : Xamarin.Forms.Entry
    {
        public OTPEntry()
        {
        }
    }
}
RENDERER - IN iOS Project
namespace XXXX.YYYY.ZZZZ
{
    public class OTPEntryRenderer : EntryRenderer
    {
        protected override void OnElementChanged(ElementChangedEventArgs<Entry> e)
        {
            base.OnElementChanged(e);
            if (e.NewElement != null)
            {
                Control.TextContentType = UITextContentType.OneTimeCode;
            }
            ...
            ...
        }
    }
}
USAGE - IN CONTENT PAGE IN Xamarin Forms Project
<XXXX:OTPEntry x:Name="txtToken" Keyboard="Numeric" Placeholder="Two Factor Code" HeightRequest="50" WidthRequest="300" TextColor="#2A295B" BackgroundColor="White" Margin="0"/>
First, OneTimeCode is only available from iOS 12.0, so I suggest adding the following check in the custom renderer:
if (UIDevice.CurrentDevice.CheckSystemVersion(12, 0))
{
    Control.TextContentType = UITextContentType.OneTimeCode;
}
What happens is this: when an OTP message arrives in the Messages inbox, iOS runs a simple text-matching algorithm to determine whether it is a valid OTP message and, if so, keeps track of it in memory. Then, when the user taps an OTP-AutoFill-enabled text field in an app, the iOS keyboard offers that OTP as a suggestion, so users can fill in the OTP without leaving the app or going back to the Messages app.
You also need to check that the format of the OTP message is correct. One way to verify this is to open Messages on the iPhone and tap on the code in the message: if a "Copy Code" option appears, the message is being recognised as a valid OTP.
And don't forget to enable AutoFill Passwords in Settings > Passwords & Accounts.
So, after verifying that the code seemed to be OK and had worked for others, I was beginning to think I was going crazy.
I then had a look through the phone settings and discovered "AutoFill Passwords", which was turned off.
Once I turned it on, this worked as expected.

Xcode: How to change the language for the app when the user selects the language?

I would like to make my application multilingual, so I have been looking into how to add other languages to an app in Xcode; however, I saw that the language changes based on the language of the phone.
Is there a way to set a language when a user selects it in the application? If so, is it also possible to remember the selected language for the future, so the user will not have to select it every time they start the application?
Thank you in advance.
Well, first you need Localizable.strings files containing the strings for each language.
Read about them here.
Second of all, you have multiple ways to detect which language the user selected when the app starts; the common one for this case is UserDefaults. Read about them here.
Therefore, you can implement the localization files and use the value saved in UserDefaults to decide which language to use.
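A minimal sketch of the UserDefaults part (plain UserDefaults, no third-party library; the key name "appLanguage" is just an illustration):

import Foundation

func saveAppLanguage(_ code: String) {
    // persist the user's choice across launches
    UserDefaults.standard.set(code, forKey: "appLanguage")
}

func savedAppLanguage() -> String {
    // fall back to the device language, then to English
    return UserDefaults.standard.string(forKey: "appLanguage")
        ?? Locale.current.languageCode
        ?? "en"
}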
Localization is simply the process of translating your app into multiple languages. Internationalization is the process of making your app able to adapt to different languages, regions, and cultures.
Refer to this link for an implementation walkthrough:
https://codeburst.io/localization-of-ios-app-in-swift-4-and-xcode-9-3c7c7d53ae11
You need to save the application language selected by the user, in UserDefaults for example. This example uses the third-party library SwiftyUserDefaults.
With this approach you add your .strings files named "Localizable_" plus the language code, instead of the regular localization naming. For example, your .strings file for Spanish should be named "Localizable_es" (but you can customize that in code too).
These are the steps:
Save the app language selected by the user:
func setupAppLanguage(lang: String) {
    Defaults[.appLanguage] = lang
}
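(These snippets assume a SwiftyUserDefaults key for the language has been defined somewhere, which the original answer doesn't show; a sketch of such a definition:)

import SwiftyUserDefaults

// hypothetical key backing Defaults[.appLanguage]
extension DefaultsKeys {
    static let appLanguage = DefaultsKey<String?>("appLanguage")
}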
Get the saved language:
static func getCurrentLang() -> String {
    if let savedLang = Defaults[.appLanguage] {
        return savedLang
    }
    // no saved choice: fall back to the device language, then to English
    return NSLocale.current.languageCode ?? "en"
}
Get the localized table name for the current language:
static func getLocalizedTableName() -> String {
    return "Localizable_\(Client.getCurrentLang())"
}
Method to localize:
// MARK: Localization Util
static func getLocalizedText(toLocalizeText: String) -> String {
    return NSLocalizedString(toLocalizeText, tableName: Client.getLocalizedTableName(), comment: "")
}
Then you can use the getLocalizedText method the same way as you would use NSLocalizedString, replacing it.
Example of use:
self.labelText.text = Client.getLocalizedText(toLocalizeText: "k_glossary")

Export audiofiles via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App; no answers have been posted on that topic yet.
Essentially what I'm trying to do is simple: after having recorded a voice memo on iOS, I select "Open With" and, from the popup that is shown, I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just knowing whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part, which will enable you to get your recording from Voice Memos to your app extension. The second part, getting the recording from the app extension to the containing app (your "main" app), can be done using app groups; please consult the blog post above for how to do this.
1. Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
2. Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (MainInterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
3. Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
4. Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
5. Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then proceed to launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file and, finding none, does nothing. We'll fix this in the next step.
6. Edit ActionViewController.swift to make the following changes:
6a. Add import statements for MobileCoreServices, AVFoundation, and AVKit near the top of the file:
import MobileCoreServices // for the kUTType... identifiers used below
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()

    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")

    var audioFound: Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
                //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
                // the audio format(s) we expect to receive and that we can handle
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                    options: nil, completionHandler: { (audioURL, error) in
                        NSOperationQueue.mainQueue().addOperationWithBlock {
                            if let audioURL = audioURL as? NSURL {
                                // in our sample code we just present and play the audio in our app extension
                                let theAVPlayer: AVPlayer = AVPlayer(URL: audioURL)
                                let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                                theAVPlayerViewController.player = theAVPlayer
                                self.presentViewController(theAVPlayerViewController, animated: true) {
                                    theAVPlayerViewController.player!.play()
                                }
                            }
                        }
                })
                audioFound = true
                break
            }
        }
        if (audioFound) {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
7. Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
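For reference, here is a sketch of such a predicate, adapted from the SUBQUERY examples in Apple's App Extension Programming Guide; "public.audio" is an assumed UTI and would need to be checked against the types Voice Memos actually provides (see the print() output in step 6c):

SUBQUERY (
    extensionItems,
    $extensionItem,
    SUBQUERY (
        $extensionItem.attachments,
        $attachment,
        ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "public.audio"
    ).@count == $extensionItem.attachments.@count
).@count == 1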
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
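A sketch of that copy, using the same Swift 2-era APIs as the rest of this answer (assuming audioURL has been read from the shared container as above; error handling is minimal):

if let audioURL = audioURL {
    let coordinator = NSFileCoordinator(filePresenter: nil)
    var coordinationError: NSError?
    coordinator.coordinateReadingItemAtURL(audioURL, options: [], error: &coordinationError) { readURL in
        // copy the shared file into our own Documents directory
        let documentsURL = NSFileManager.defaultManager()
            .URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0]
        let destinationURL = documentsURL.URLByAppendingPathComponent(
            readURL.lastPathComponent ?? "recording.m4a")
        do {
            try NSFileManager.defaultManager().copyItemAtURL(readURL, toURL: destinationURL)
        } catch {
            print("copy failed: \(error)")
        }
    }
}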
References:
Creating an App Extension
Sharing Data with Your Containing App

How to test right-to-left language in iOS XCTest unit tests?

Is there a way to switch an XCTest unit test into right-to-left mode, to test the Arabic version of the app where sentences are written from right to left of the screen? My app's code logic behaves differently based on language direction. I would like to verify this functionality in a unit test. What I need to do is switch the app into the right-to-left language mode from an XCTest unit test case.
One can run the app in right-to-left mode by changing the Scheme's Application Language setting to Right-to-left Pseudolanguage. Is there a way to do a similar thing in a unit test?
My imperfect solution
I ended up changing the semanticContentAttribute of the view under test to .ForceRightToLeft. It does what I need, but it does not feel like a very clean approach. Firstly, it only works on iOS 9 and newer. Secondly, it feels like I am tinkering with my app's views at a low level from the unit test. Instead, I would prefer to switch the whole app's language to right-to-left, if that is possible.
class MyTests: XCTestCase {
    func testRightToLeft() {
        if #available(iOS 9.0, *) {
            let view = UIView()
            view.semanticContentAttribute = .ForceRightToLeft
            // Test code involving the view
        }
    }
}
There's no easy way to do this right now with testing/UI testing besides passing in environment flags or setting the semanticContentAttribute as you are doing now. Filing a bug with Apple is highly recommended.
You can also change the device language & region in the scheme. This means you'll need separate schemes for the various LTR/RTL tests you want to run:
Xcode even provides pseudo-languages for extra-long string & RTL testing.
You can detect the writing direction via:
let writingDirection = UIApplication.sharedApplication().userInterfaceLayoutDirection
switch writingDirection {
case .LeftToRight:
    break // left-to-right-specific test code
case .RightToLeft:
    break // right-to-left-specific test code
default:
    break // what now? You are obviously using iOS 11's topToBottom direction…
}
To set different languages and locales on startup, this might be a proper solution.
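For example, a sketch of forcing right-to-left layout at launch from a UI test via launch arguments (these flags mirror what the "Right-to-Left Pseudolanguage" scheme option sets):

import XCTest

class RTLUITests: XCTestCase {
    func testRightToLeftLayout() {
        let app = XCUIApplication()
        // force RTL layout for this launch only
        app.launchArguments += ["-AppleTextDirection", "YES",
                                "-NSForceRightToLeftWritingDirection", "YES"]
        app.launch()
        // assertions about the RTL layout go here
    }
}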
What you are looking for is Automated UI Testing.
This example JavaScript code changes the device orientation, for example:
var target = UIATarget.localTarget();
var app = target.frontMostApp();
//set orientation to landscape left
target.setDeviceOrientation(UIA_DEVICE_ORIENTATION_LANDSCAPELEFT);
UIALogger.logMessage("Current orientation now " + app.interfaceOrientation());
//reset orientation to portrait
target.setDeviceOrientation(UIA_DEVICE_ORIENTATION_PORTRAIT);
UIALogger.logMessage("Current orientation now " + app.interfaceOrientation());
For testing whether your layout has changed to RTL or LTR, you could try to access specific UI elements and check their content against expected values. So here is another example, from the official docs, to check the contents of a TableViewCell:
The crux of testing is being able to verify that each test has been performed and that it has either passed or failed. This code example runs the test testName to determine whether a valid element recipe element whose name starts with “Tarte” exists in the recipe table view. First, a local variable is used to specify the cell criteria:
var cell = UIATarget.localTarget().frontMostApp().mainWindow()
    .tableViews()[0].cells().firstWithPredicate("name beginswith 'Tarte'");
Next, the script uses the isValid method to test whether a valid element matching those criteria exists in the recipe table view.
if (cell.isValid()) {
    UIALogger.logPass(testName);
} else {
    UIALogger.logFail(testName);
}
If a valid cell is found, the code logs a pass message for the testName test; if not, it logs a failure message.
Notice that this test specifies firstWithPredicate and "name beginswith 'Tarte'". These criteria yield a reference to the cell for “Tarte aux Fraises,” which works for the default data already in the Recipes sample app. If, however, a user adds a recipe for “Tarte aux Framboises,” this example may or may not give the desired results.
If you want to test a specific scheme:
Executing an Automation Instrument Script in Xcode
After you have created your customized Automation template, you can execute your test script from Xcode by following these steps:
1. Open your project in Xcode.
2. From the Scheme pop-up menu (in the workspace window toolbar), select Edit Scheme for a scheme with which you would like to use your script.
3. Select Profile from the left column of the scheme editing dialog.
4. Choose your application from the Executable pop-up menu.
5. Choose your customized Automation Instrument template from the Instrument pop-up menu.
6. Click OK to approve your changes and dismiss the scheme editor dialog.
7. Choose Product > Profile.
Instruments launches and executes your test script.

iOS accessibility - How to localize VoiceOver language

My app is in the process of becoming localized for a few languages and regions. How do I localize VoiceOver (the accessibility feature for the blind)? I want the language of VoiceOver and Voice Control to change based on the user's selected language.
In my Info.plist, the Localization native development region (CFBundleDevelopmentRegion) is already set to en_US as a failsafe if no matching localization is found.
So, to make it clear: if I localize all the strings in my app, will VoiceOver use my app's localizations?
In viewDidLoad:
changeNationButton.accessibilityLabel = NSLocalizedString("Change Nation", comment: "Accessibility Label: Button to change Nation")
changeNationButton.accessibilityHint = NSLocalizedString("Tap to Change Nation", comment: "Accessibility Hint: Button to change Nation")
Then localize it like a normal string in Localizable.strings:
"Change Nation" = "Cambia Nazione";
