iOS UI Tests iMessage App/Extension

I'm currently using Fastlane Snapshot to automate taking screenshots for my application. It's all based on UI Tests.
I'm trying to add this same functionality to an iMessage App/Extension.
So currently I have a test that taps buttons, fills in text fields, takes the screenshots, and so on.
After all that is done I'd like it to close the application (press the Home button), open Messages, interact with my iMessage app, and take some screenshots there as well.
Is this possible? If so, how can I achieve it? Automating screenshots for this one application has been amazing, and I'd love to be able to do the same for the iMessage app.

Xcode currently has no UI-test support for iMessage app extensions, but you can get the same effect by launching Messages yourself and finding elements in the Messages app. First, launch Messages and open a conversation:
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate() // start from a clean state
messageApp.activate()
messageApp.cells.firstMatch.tap() // open the first conversation in the list
Then you can access your iMessage app like this:
// Replace appIndex with the position of your app in the iMessage app strip
let appIndex = 2
messageApp.collectionViews.descendants(matching: .cell).element(boundBy: appIndex).tap()
When your iMessage app is open in expanded mode, you can access the close button:
let closeButton = messageApp.buttons.element(boundBy: 1)
If you want to test your iMessage app after the user sends a message and then opens it, you can do it this way:
// Send your message after it is inserted in the Messages app text field
let sendButton = messageApp.buttons["sendButton"]
waitForElementToExist(sendButton)
sendButton.tap()
// Tap the first bubble in the conversation transcript
let firstBubble = messageApp.collectionViews["TranscriptCollectionView"].cells.element(boundBy: 2)
waitForElementToExist(firstBubble)
firstBubble.tap()

// Helper: wait up to 5 seconds for an element to exist
private func waitForElementToExist(_ element: XCUIElement) {
    let exists = NSPredicate(format: "exists == 1")
    expectation(for: exists, evaluatedWith: element, handler: nil)
    waitForExpectations(timeout: 5, handler: nil)
}
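As a side note, with Xcode 9 and later the expectation-based helper can usually be replaced by XCTest's built-in XCUIElement.waitForExistence(timeout:), for example:
// equivalent, shorter wait available since Xcode 9
let sendButton = messageApp.buttons["sendButton"]
XCTAssertTrue(sendButton.waitForExistence(timeout: 5))
sendButton.tap()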

With Xcode 9 you can easily switch to other applications such as Messages. The following code switches to Messages, interacts with elements within the app, and then switches back to your own app.
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate()
messageApp.activate()
messageApp.cells.staticTexts["Kate Bell"].tap()
XCUIApplication().activate() // switch back to your own app
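To tie this back to the fastlane question, here is a rough sketch of a full snapshot test that hops into Messages and back. It assumes fastlane's SnapshotHelper.swift is added to the UI test target (setupSnapshot and snapshot come from there) and that "Kate Bell" exists in the simulator's default contacts:
func testTakeiMessageScreenshots() {
    let app = XCUIApplication()
    setupSnapshot(app) // from fastlane's SnapshotHelper.swift
    app.launch()
    // ... tap buttons, fill in text fields, snapshot("01-MainApp") ...

    let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
    messageApp.activate()
    messageApp.cells.staticTexts["Kate Bell"].tap()
    // ... open your iMessage app in the app strip as shown above ...
    snapshot("02-iMessageApp")

    app.activate() // back to your own app
}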

Related

iOS 15 DualSense is opening App Library on PS button long press

On iOS 15, a long press on the PS button of the DualSense controller opens the App Library, and I don't receive a callback via the valueChangedHandler function.
This is how I handle all controller inputs:
func handleController(controller: GCController) {
    controller.extendedGamepad?.valueChangedHandler = { [weak self] (gamepad: GCExtendedGamepad, element: GCControllerElement) in
        guard let self = self else {
            return
        }
        // no feedback received when performing a long press on the PS button
    }
}
Can the App Library be suppressed somehow? Sony's PS Remote Play app somehow manages to suppress it, but I don't know how, nor can I find anything in Apple's official API documentation.
Edit: It seems this problem only occurs on iPads; on iPhones it doesn't exist. Is there some API or anything on iPads to suppress this behaviour? I assume the vast majority of users don't want to open the App Library in the middle of a game.
If someone ever faces the same problem: you can actually disable system gestures for the Home button.
In Swift, all you have to add is this line (controller is a GCController object):
controller.physicalInputProfile.buttons[GCInputButtonHome]?.preferredSystemGestureState = .disabled
In Objective-C it works like this:
controller.physicalInputProfile.buttons[GCInputButtonHome].preferredSystemGestureState = GCSystemGestureStateDisabled;
Thanks to the Apple employee who helped me here
https://developer.apple.com/forums/thread/711905
Edit: on tvOS this isn't working, as the PS button (menu button) of a controller always has to act as a home event
https://developer.apple.com/forums/thread/715012
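For reference, a minimal sketch of applying this override to every controller, current and future (assuming iOS 15+ on iPhone/iPad; per the note above it has no effect on tvOS):
import GameController

func disableHomeButtonGesture() {
    // apply to controllers that are already connected
    for controller in GCController.controllers() {
        controller.physicalInputProfile.buttons[GCInputButtonHome]?
            .preferredSystemGestureState = .disabled
    }
    // and to any controller that connects later
    NotificationCenter.default.addObserver(forName: .GCControllerDidConnect,
                                           object: nil, queue: .main) { note in
        guard let controller = note.object as? GCController else { return }
        controller.physicalInputProfile.buttons[GCInputButtonHome]?
            .preferredSystemGestureState = .disabled
    }
}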

ReplayKit Broadcast upload extension - Service not found

I'm working on an iOS Swift application that will allow the user to record the entire screen: any app, and even the home screen.
In order to do that, I added a Broadcast Upload Extension to my app.
First I used the RPSystemBroadcastPickerView class to add a record button to my view, which lets the user open the recording popup and select which app to broadcast the screen to. And it's working fine:
But I would like to avoid this step and directly open the popup when the app launches.
So I wrote the following code to do that:
RPBroadcastActivityViewController.load(withPreferredExtension: "ch.jroueche.RecordApp.TestScreen", handler: { broadcastAVC, error in
    guard error == nil else {
        print("Cannot load Broadcast Activity View Controller.")
        return
    }
    if let broadcastAVC = broadcastAVC {
        broadcastAVC.delegate = self
        self.present(broadcastAVC, animated: true, completion: {
            // broadcastactivityviewcontroller will perform the callback when the broadcast starts (or fails)
            print("I've START")
        })
    }
})
Unlike with the RPSystemBroadcastPickerView solution, I'm getting the following error:
The preferred broadcast service could not be found.
My issue is similar to the following post :
App not showing up as a broadcast service in RPBroadcastActivityViewController
I also added the extension and the preferred extension identifier is correct.
Why would this be possible using RPSystemBroadcastPickerView but not programmatically using the RPBroadcastActivityViewController class? That doesn't make sense to me.
Does someone have an idea of what the issue could be and how I could fix it? Or a workaround for doing this screen recording?
Thanks in advance
It appears that RPBroadcastActivityViewController shows ONLY Broadcast Setup UI Extensions, while RPSystemBroadcastPickerView shows ONLY Broadcast Upload Extensions. But I have no idea why, as both of them are documented as showing the list of available providers/services.
It would be very helpful if someone could bring more details on the topic.
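In the meantime, a workaround many apps use is to trigger the system picker programmatically from an RPSystemBroadcastPickerView (iOS 12+). This is only a sketch: it pokes the picker's undocumented internal button, which could break in a future iOS release, and it reuses the extension bundle ID from the question above:
import ReplayKit
import UIKit

func presentBroadcastPicker() {
    let picker = RPSystemBroadcastPickerView(frame: .zero)
    picker.preferredExtension = "ch.jroueche.RecordApp.TestScreen" // your upload extension
    picker.showsMicrophoneButton = false
    // forward a tap to the picker's internal UIButton (undocumented view hierarchy)
    for case let button as UIButton in picker.subviews {
        button.sendActions(for: .touchUpInside)
    }
}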

App not showing up as a broadcast service in RPBroadcastActivityViewController

I'm trying to use ReplayKit to start a broadcast session (recording the app screen works OK). It opens up a picker from which broadcasting apps can be picked; YouTube, Facebook and Periscope show up, but not my app. Code:
if #available(iOS 10.0, *) {
    RPBroadcastActivityViewController.load { broadcastAVC, error in
        guard error == nil else {
            print("Cannot load Broadcast Activity View Controller.")
            return
        }
        if let broadcastAVC = broadcastAVC {
            broadcastAVC.delegate = self
            self.viewController.present(broadcastAVC, animated: true, completion: {
                // broadcastactivityviewcontroller will perform the callback when the broadcast starts (or fails)
            })
        }
    }
}
Full source code here; the code is in the startBroadcast function.
I also tried loading with withPreferredExtension, and it says the preferred broadcast service was not found. When I go to Control Center to try starting a broadcast session I see no Start Broadcast button and no apps, just Start Recording. I'm on iOS 11, so I also tried a phone running iOS 12, with the same result in Control Center.
How can I get my app to show up in the picker? Thanks.
Had to add the Broadcast Upload Extension in Xcode: File -> New -> Target -> Broadcast Upload Extension. This extension was then displayed in the picker.
Another common issue is that the Deployment Target for your app (specifically your app, not just your Broadcast Extension) needs to be at least iOS 14.0. If it's any lower, it won't show up even if your device is on the correct version.
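For reference, the Broadcast Upload Extension target that Xcode generates centers on an RPBroadcastSampleHandler subclass. A trimmed sketch of the generated handler (the actual template also includes paused/resumed/finished callbacks):
import ReplayKit
import CoreMedia

class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // broadcast began: set up your encoder/uploader here
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            break // handle video frames
        case .audioApp, .audioMic:
            break // handle audio buffers
        @unknown default:
            break
        }
    }
}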

Export audiofiles via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (Maininterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file, and finding none does nothing. We'll fix this in the next step.
Edit ActionViewController.swift to make the following changes:
6a. Add import statements for MobileCoreServices, AVFoundation, and AVKit near the top of the file:
import MobileCoreServices // needed for the kUTType... constants used below

// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()

    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")

    var audioFound: Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
                //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
                // the audio format(s) we expect to receive and that we can handle
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                        options: nil, completionHandler: { (audioURL, error) in
                    NSOperationQueue.mainQueue().addOperationWithBlock {
                        if let audioURL = audioURL as? NSURL {
                            // in our sample code we just present and play the audio in our app extension
                            let theAVPlayer: AVPlayer = AVPlayer(URL: audioURL)
                            let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                            theAVPlayerViewController.player = theAVPlayer
                            self.presentViewController(theAVPlayerViewController, animated: true) {
                                theAVPlayerViewController.player!.play()
                            }
                        }
                    }
                })
                audioFound = true
                break
            }
        }
        if (audioFound) {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
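For example, a rule that activates the extension only for a single audio attachment could look like the following predicate string (a sketch based on Apple's App Extension Programming Guide; adjust the UTI to the exact audio types you accept):
SUBQUERY(extensionItems, $extensionItem,
    SUBQUERY($extensionItem.attachments, $attachment,
        ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "public.audio"
    ).@count == $extensionItem.attachments.@count
).@count == 1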
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
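In current Swift syntax (the answer above predates Swift 3), such a coordinated copy might look roughly like this; the Documents destination is just an example:
import Foundation

func importRecording(from audioURL: URL) {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent(audioURL.lastPathComponent)

    // coordinate the read so it doesn't race with the extension or other processes
    var coordinatorError: NSError?
    NSFileCoordinator().coordinate(readingItemAt: audioURL, options: [],
                                   error: &coordinatorError) { readURL in
        try? FileManager.default.copyItem(at: readURL, to: destination)
    }
    if let error = coordinatorError {
        print("Coordinated read failed: \(error)")
    }
}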
References:
Creating an App Extension
Sharing Data with Your Containing App

How to launch system apps in an iOS Xcode UI test case

I've got an app whose main purpose is to enter data into HealthKit. I'd like to write some Xcode UI tests to verify that it's writing this data successfully, but I'm having some difficulty verifying the data in the Health app.
When I initially recorded my test, it skipped my simulated Home button press, but it was recording as I swiped over to the first home screen and navigated into the Health app to show the data points.
I searched for how to press the Home button, and found this (which works):
XCUIDevice.shared.press(.home)
However, none of the other calls it recorded actually work for navigation outside of the app. The recorded code for swiping on the home screen obviously looks wrong, and also doesn't work when I replace tap() with a swipeRight() or swipeLeft():
app.childrenMatchingType(.Window).elementBoundByIndex(1).childrenMatchingType(.Other).elementBoundByIndex(1).childrenMatchingType(.Other).element.childrenMatchingType(.Other).element.childrenMatchingType(.Other).elementBoundByIndex(0).childrenMatchingType(.ScrollView).element.tap()
The next couple of lines, for launching an app on the home screen, don't even work for an app icon that's on the currently visible page:
let elementsQuery = app.scrollViews.otherElements
elementsQuery.icons["Health"].tap()
Is there any way to achieve what I'm trying to do, or will I need to wait to verify end-to-end testing until I add the ability to read from HealthKit to my app?
Xcode 9
Here's the solution using Xcode 9
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.activate()
You can find a list of bundle identifiers for the system apps in this post
Xcode 8
For Xcode 8 it's a little bit more complicated
In order to launch an application from the Springboard you need to import the following headers
https://github.com/facebook/WebDriverAgent/blob/master/PrivateHeaders/XCTest/XCUIElement.h
https://github.com/facebook/WebDriverAgent/blob/master/PrivateHeaders/XCTest/XCUIApplication.h
Then use the following (for example with Health)
Objective-C
@interface Springboard : NSObject
+ (void)launchHealth;
@end

@implementation Springboard
+ (void)launchHealth
{
    XCUIApplication *springboard = [[XCUIApplication alloc] initPrivateWithPath:nil bundleID:@"com.apple.springboard"];
    [springboard resolve];
    XCUIElement *icon = springboard.icons[@"Health"];
    if (icon.exists) {
        [icon tap];
        // To query elements in the Health app
        XCUIApplication *health = [[XCUIApplication alloc] initPrivateWithPath:nil bundleID:@"com.apple.Health"];
    }
}
@end
Swift
class Springboard {
    static let springboard = XCUIApplication(privateWithPath: nil, bundleID: "com.apple.springboard")

    class func launchHealth() {
        springboard.resolve()
        let icon = springboard.icons["Health"]
        if icon.exists {
            icon.tap()
            // To query elements in the Health app
            let health = XCUIApplication(privateWithPath: nil, bundleID: "com.apple.Health")
        }
    }
}
Swift 4
let app = XCUIApplication(bundleIdentifier: "com.apple.springboard")
Your UI tests are bound to your application: the moment you press the Home button and move outside your app's UI, you can no longer perform UI operations, because the app variable in your code points at your own application.
You may have code like:
let app = XCUIApplication()
So you should modify that XCUIApplication() line:
let app = XCUIApplication(privateWithPath: nil, bundleID: "com.apple.springboard")
Now you can move outside your application.
In my experience it's also good to have an independent UITestAgent app whose UI test cases start from the Springboard bundle ID: that way you can test any application with the help of that one app, instead of writing some test cases inside product XYZ's code base and then, for the next product ABC, writing tests inside ABC's code base.
