Switching between test and real URLs using a conditional statement in Swift - iOS

I want to use local URLs for testing. How do I switch between the test and real URL at compile time?
I'm aware of Active Compilation Conditions, but I don't want to use those since they are tied to debug/release. I want to switch between the test and real URLs whenever I wish during development and testing. To do so I want a flag I can change before compilation.
Here is what I want to achieve, in pseudocode:
#define TEST=TRUE (or FALSE)
#if TEST
static let URL = "http://127.0.0.1/api/"
//... other code
#else
static let URL = "https:// domain.com/api/"
//... other code
#endif
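For reference, the closest Swift equivalents of this pseudocode are either a custom -D TEST compilation flag (distinct from the built-in DEBUG one) or a plain constant you flip before building. A minimal sketch of the constant-flag variant, with illustrative names and URLs:
enum APIConfig {
    // Flip this flag before compiling to switch environments.
    static let useTestURL = true

    static let baseURL = useTestURL
        ? "http://127.0.0.1/api/"
        : "https://domain.com/api/"
}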

Alright, then if you want to be able to change the URL during development without making a new build, you have different options, but a quick one is a hidden configuration popup or menu that you access from your first screen, before login or whatever comes first.
Where the hidden menu comes from: that is something for you to decide, but typically, somewhere in a starting view controller (onboarding or login, say) you add a gesture recognizer to show the hidden menu. For example, a double tap anywhere in that view controller's view could present a hidden configuration alert, only in a debug build and never in release, just before the login or any relevant API call:
#if DEBUG
let tap = UITapGestureRecognizer(target: self, action: #selector(presentHiddenConfigurationAlert))
tap.numberOfTapsRequired = 2
view.addGestureRecognizer(tap)
#endif
What the hidden menu looks like and how it behaves: in the same view controller (in my example a login VC, or whatever you wish), you'd have the selector method for the double tap. For example, this one shows an alert that lets you change the current URL (as always, only in a debug build):
#if DEBUG
@objc func presentHiddenConfigurationAlert() {
    let currentURL = UserDefaults.standard.string(forKey: "current_url") ?? "no URL set"
    let alert = UIAlertController(title: "Hidden Configuration", message: "You are using \(currentURL)", preferredStyle: .alert)
    // add a text field to let the developer type the URL they want to use,
    // or add actions as buttons to choose between a Test URL and a Dev URL,
    // and store the chosen value under "current_url" so it can be read across the app (and shown in the message above)
    self.present(alert, animated: true, completion: nil)
}
#endif
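A fuller sketch of what those comments describe, filling in the text field and the actions (the "current_url" key and both URLs are just examples):
@objc func presentHiddenConfigurationAlert() {
    let defaults = UserDefaults.standard
    let currentURL = defaults.string(forKey: "current_url") ?? "no URL set"
    let alert = UIAlertController(title: "Hidden Configuration",
                                  message: "You are using \(currentURL)",
                                  preferredStyle: .alert)
    // Let the developer type any URL...
    alert.addTextField { textField in
        textField.placeholder = "http://127.0.0.1/api/"
        textField.keyboardType = .URL
    }
    alert.addAction(UIAlertAction(title: "Use typed URL", style: .default) { [weak alert] _ in
        defaults.set(alert?.textFields?.first?.text, forKey: "current_url")
    })
    // ...or offer a fixed choice.
    alert.addAction(UIAlertAction(title: "Use local test URL", style: .default) { _ in
        defaults.set("http://127.0.0.1/api/", forKey: "current_url")
    })
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
    present(alert, animated: true, completion: nil)
}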
And overall the use case, in my example, is:
1) you install the app in a debuggable build
2) a login view controller is the first thing shown (for example)
3) you, or another developer using this build, double tap anywhere in that view controller
4) a hidden configuration alert is shown where you can change the current URL used across the app
5) you store this URL in user defaults, or wherever you like best
6) now, wherever you access the URL variable, debug builds should use the value from user defaults and release builds should use the release one:
#if DEBUG
var API_URL = getURLFromUserDefaults()
#else
var API_URL = "https://www.release-product-url.com"
#endif
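getURLFromUserDefaults() is not spelled out in the answer; a minimal sketch, assuming the "current_url" key used by the hidden menu above and a hard-coded fallback for a fresh install:
func getURLFromUserDefaults() -> String {
    // Fall back to the local test URL if the hidden menu has never been used.
    return UserDefaults.standard.string(forKey: "current_url") ?? "http://127.0.0.1/api/"
}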

Related

Can we release the same app with different functionality for 2 different countries?

I want to release my app in only 2 countries, and I want different functionality for each country.
For example.
ViewController1 functionality is different in Jamaica.
ViewController1 functionality is different in Kenya.
Different functionality means content is different, or input forms are different.
Is it possible? If yes, please point me to some documentation.
Thanks in advance
You should have a screen that allows the user to select their country; after that, store the selected country in your app (in UserDefaults, the Keychain, etc.).
Based on the selected country you can then switch the logic/layout to meet the requirement above.
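A minimal sketch of that idea (the "selected_country" key and the country values are illustrative, not from the answer):
// Save the user's choice once they pick a country on the selection screen.
UserDefaults.standard.set("kenya", forKey: "selected_country")

// Later, branch the logic/layout on the stored value, e.g. in ViewController1.
let country = UserDefaults.standard.string(forKey: "selected_country")
if country == "kenya" {
    // show the Kenya-specific content/forms
} else if country == "jamaica" {
    // show the Jamaica-specific content/forms
} else {
    // no selection yet: show the country picker first
}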
Some notes about the App Store:
1) Language should/must be selected by the user in the system Preferences, NOT inside the app.
Chances are Apple will refuse apps that don't follow the above logic.
2) You could test the current language/region in code (see below for language),
BUT I think Apple may refuse the app as you are using a different behaviour.
3) If you really need it, you can load a different controller using storyboards (I suggest using different storyboards AND loading them at runtime by storyboard name), as in:
func ViewControllerFromStoryboardWith(name: String) -> UIViewController {
    // we use an identifier equal to the filename for now.
    let storyboard = UIStoryboard(name: name, bundle: nil)
    let vc = storyboard.instantiateViewController(withIdentifier: name) as UIViewController
    return vc
}

// test language:
func currHWLanguage() -> String {
    let defs: UserDefaults = UserDefaults.standard
    let languages: NSArray = defs.object(forKey: "AppleLanguages") as! NSArray
    let current = languages[0] as! String
    // since iOS 9.0 we get "en-US" etc., so cut it to 2 characters:
    let result = (current as NSString).substring(to: 2)
    #if DEBUG
    // force to IT because of a bug in the simulator
    // return "IT"
    #endif
    return result.uppercased()
    // NSLog("%@", current)
}
This is a problem many applications are trying to solve. Basically, you have the following options:
Let the user choose. This is the safest option if one application contains two different configurations.
Try to detect the user's location. Language/locale is unsafe because many people will have an English (or other) locale set up. You shouldn't ask for GPS location just for this. The safest option is to make a server request and check the location using the IP address; a bit complicated, and it won't work if a VPN is used (e.g. antivirus apps create VPNs).
Create two different apps. In the end, this is the best option. Add a second application target to your project and release two separate apps with separate configurations.
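For the two-target option, a hedged sketch: give each target its own flag under Build Settings > Active Compilation Conditions (the flag names KENYA/JAMAICA are only examples) and branch at compile time, so each released app ships with only its own behaviour:
#if KENYA
let registrationStoryboardName = "RegistrationKenya"   // Kenya-specific content/forms
#else
let registrationStoryboardName = "RegistrationJamaica" // Jamaica build
#endif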

iOS UI Tests iMessage App/Extension

I'm currently using Fastlane Snapshot to automate taking screenshots for my application. It's all based on UI Tests.
I'm trying to add this same functionality to an iMessage App/Extension.
So currently I have a test that goes through the app, taps buttons, fills in text fields, takes the screenshots, etc.
After all that is done, I'd like it to close the application (press the home button), open iMessage, interact with my iMessage application, and take some screenshots there as well.
Is this possible? If so how can I achieve this? Automating screenshots for this one application has been amazing and I'd love to be able to do that for the iMessage App as well.
There is no UI test target for an iMessage app extension in Xcode currently, but you can still do it by launching Messages yourself and finding elements in the Messages app. First, you'll have to launch the Messages app and open a conversation:
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate()
messageApp.activate()
messageApp.cells.firstMatch.tap()
Then, you can access your iMessage app like so:
// Replace appIndex by the position of your app in the iMessage bottom bar
let appIndex = 2
messageApp.collectionViews.descendants(matching: .cell).element(boundBy: appIndex).tap()
When your iMessage app is opened in expanded mode, you can access the close button:
let closeButton = messageApp.buttons.element(boundBy: 1)
If you want to test your iMessage app when the user sends a message and then opens it, you can do it this way:
// Send your message after it is inserted in the Messages app text field
let sendButton = messageApp.buttons["sendButton"]
waitForElementToExists(sendButton)
sendButton.tap()
// Tap on the iMessage first bubble
let firstBubble = messageApp.collectionViews["TranscriptCollectionView"].cells.element(boundBy: 2)
waitForElementToExists(firstBubble)
firstBubble.tap()
private func waitForElementToExists(_ element: XCUIElement) {
    let exists = NSPredicate(format: "exists == 1")
    expectation(for: exists, evaluatedWith: element, handler: nil)
    waitForExpectations(timeout: 5, handler: nil)
}
With Xcode 9 you can easily switch to other applications such as Messages. The following code switches to Messages, interacts with elements within the app, and then switches back to your own app.
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate()
messageApp.activate()
messageApp.cells.staticTexts["Kate Bell"].tap()
XCUIApplication().activate()
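Pulling the two answers together, here is a hedged sketch of a complete test method for a Fastlane-style screenshot run; the bundle identifier, the element queries, and the app-strip index are assumptions that may need adjusting for your simulator and iOS version:
import XCTest

final class MessagesExtensionScreenshots: XCTestCase {
    func testMessagesExtensionScreenshots() {
        let app = XCUIApplication()
        app.launch()
        // ...drive your main app and take its screenshots here...

        // Switch to the Messages app and open the first conversation.
        let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
        messageApp.terminate()
        messageApp.activate()
        messageApp.cells.firstMatch.tap()

        // Open your iMessage app from the bottom app strip (the index is an assumption).
        let appIndex = 2
        messageApp.collectionViews.descendants(matching: .cell).element(boundBy: appIndex).tap()

        // With Fastlane's SnapshotHelper added to this target you could now call:
        // snapshot("01-iMessageExtension")

        // Switch back to your own app when done.
        app.activate()
    }
}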

Export audiofiles via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
1. Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
2. Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (MainInterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
4. Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
5. Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file and, finding none, does nothing. We'll fix this in the next step.
6. Edit ActionViewController.swift to make the following changes:
6a. Add import statements for AVFoundation and AVKit near the top of the file:
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()

    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")

    var audioFound: Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            // the audio format(s) we expect to receive and that we can handle
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
                //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                    options: nil, completionHandler: { (audioURL, error) in
                        NSOperationQueue.mainQueue().addOperationWithBlock {
                            if let audioURL = audioURL as? NSURL {
                                // in our sample code we just present and play the audio in our app extension
                                let theAVPlayer: AVPlayer = AVPlayer(URL: audioURL)
                                let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                                theAVPlayerViewController.player = theAVPlayer
                                self.presentViewController(theAVPlayerViewController, animated: true) {
                                    theAVPlayerViewController.player!.play()
                                }
                            }
                        }
                })
                audioFound = true
                break
            }
        }
        if (audioFound) {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
7. Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
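As a hedged example, one possible activation rule restricting the extension to the m4a audio that Voice Memos hands over (adapted from Apple's documented SUBQUERY pattern; untested here, so verify the type identifier against the print() output shown in step 6c):
SUBQUERY (
    extensionItems,
    $extensionItem,
    SUBQUERY (
        $extensionItem.attachments,
        $attachment,
        ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "com.apple.m4a-audio"
    ).@count == 1
).@count == 1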
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
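A minimal sketch of that last step (written in current Swift syntax, unlike the older snippets above; the key and app group name must match what the extension wrote, and error handling is omitted):
func importSharedRecording() {
    let defaults = UserDefaults(suiteName: "group.com.mycompany.myapp.sharedcontainer")
    guard let audioURL = defaults?.url(forKey: "SharedAudioURLKey") else { return }

    // Coordinate the read so we don't race with the extension writing the file.
    let coordinator = NSFileCoordinator(filePresenter: nil)
    var coordinatorError: NSError?
    coordinator.coordinate(readingItemAt: audioURL, options: [], error: &coordinatorError) { readURL in
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(readURL.lastPathComponent)
        try? FileManager.default.copyItem(at: readURL, to: destination)
    }
}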
References:
Creating an App Extension
Sharing Data with Your Containing App

iOS UI Test: how to get message of UIAlertController

My app has a login screen. If the user presses the login button without entering any text in either the username or password fields, the app will display a UIAlertController with an error message.
I am trying to model this logic in UI Tests, and want to assert that the UIAlertController is displaying the correct message. However, I can't find a way for the UI Test to access the message property of the alert. Here is the code generated by the test recorder:
func testLoginWithoutPassword() {
    let app = XCUIApplication()
    let emailTextField = app.textFields["email"]
    emailTextField.tap()
    emailTextField.typeText("xxx@gmail.com")
    app.buttons["Login"].tap()
    app.alerts["Error"].collectionViews.buttons["OK"].tap()
}
Is there any way I can extract the String value of the alert's message, so I can put an assertion on it?
You can't directly test the alert's message. You can, however, test if the alert contains your error message's copy (at all).
For example, say your alert's title is "You won!" and its message is "Final Score: 27 - 25".
To assert that the alert contains the "Final Score" message, use:
XCTAssert(app.alerts.element.staticTexts["Final Score: 27 - 25"].exists)
You can also test the title of the alert directly:
XCTAssertEqual(app.alerts.element.label, "You won!")
More examples available in my UI Testing Cheat Sheet and Examples post and sample app.
I think it is: alert.elements()[2].name()
Inside the onAlert callback function, add alert.logElementTree() to see the alert view's elements. The message might be nil; maybe only the title is shown.
Further to the answers above, which I struggled to get to work, there is another way.
Create a Bool within your UI test method that starts out false:
var alertPressed = false
Then add a UIInterruptionMonitor and set the Bool to true within its closure:
addUIInterruptionMonitor(withDescription: "System Dialog") { (alert) -> Bool in
    alert.buttons["Allow"].tap()
    alertPressed = true
    return true
}
Then interact with the app again, and assert that the Bool is true:
app.tap()
XCTAssert(alertPressed)
I hope this is helpful to someone.

How to get the default iOS browser name?

The Stack Overflow app detects the name of the default browser I have set via jailbreak (Chrome). How can I achieve the same thing in Objective-C and Swift?
(Just the name, not the ActivityView code)
Update: I went into Settings > Stack Exchange and found the relevant browser setting. It looks like the app defaults to Safari, but if Chrome is installed then links will be sent to that browser. Chrome is most likely detected by the canOpenURL method described in the answer below.
I suspect that the Stack Exchange app isn't checking for your default browser specifically. After all, since Apple doesn't provide an easy way to change your default browser, I doubt the SDK provides an API to detect the default browser.
Instead, the Stack Exchange app may use the canOpenURL(_:) method on UIApplication to test if a predetermined set of common browser apps are installed. For each browser that is installed, the action sheet gets a new button. That approach could resemble the following code snippet.
let safariURL = NSURL(string: "http://stackoverflow.com")!
let chromeURL = NSURL(string: "googlechrome://stackoverflow.com")!
let operaURL = NSURL(string: "opera-http://stackoverflow.com")!
let sharedApplication = UIApplication.sharedApplication() // convenience
if sharedApplication.canOpenURL(safariURL) {
    // add "Safari" button to action sheet
}
if sharedApplication.canOpenURL(chromeURL) {
    // add "Chrome" button to action sheet
}
if sharedApplication.canOpenURL(operaURL) {
    // add "Opera" button to action sheet
}
// display action sheet
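One caveat worth noting: since iOS 9, canOpenURL(_:) only returns true for URL schemes your app has declared under LSApplicationQueriesSchemes in its Info.plist, so each browser scheme you probe would need to be whitelisted there first, roughly like this (scheme names are examples):
// Info.plist (conceptually):
// LSApplicationQueriesSchemes = [ "googlechrome", "opera-http" ]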
