I'm trying to use ReplayKit to start a broadcast session (recording the app screen works fine). It opens a picker from which broadcasting apps can be chosen; YouTube, Facebook and Periscope show up, but not my app. Code:
if #available(iOS 10.0, *) {
    RPBroadcastActivityViewController.load { broadcastAVC, error in
        guard error == nil else {
            print("Cannot load Broadcast Activity View Controller.")
            return
        }
        if let broadcastAVC = broadcastAVC {
            broadcastAVC.delegate = self
            self.viewController.present(broadcastAVC, animated: true, completion: {
                // RPBroadcastActivityViewController will perform the callback when the broadcast starts (or fails)
            })
        }
    }
}
Full source code is here; the relevant code is in the startBroadcast function.
I also tried loading with withPreferredExtension, and it says the preferred broadcast service could not be found. When I go to Control Centre to try to start a broadcast session, I see no Start Broadcast button and no apps, just Start Recording. I'm on iOS 11, so I also tried a phone running iOS 12, with the same result in Control Centre.
How can I get my app to show up in the picker? Thanks.
I had to add the Broadcast Upload Extension in Xcode: File -> New -> Target -> Broadcast Upload Extension. This extension was then displayed in the picker.
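For reference, the Broadcast Upload Extension target that Xcode generates centres on an RPBroadcastSampleHandler subclass. A minimal sketch (the class and method names follow the Xcode template; everything inside the bodies is placeholder) looks roughly like this:
import ReplayKit

// SampleHandler.swift inside the Broadcast Upload Extension target
class SampleHandler: RPBroadcastSampleHandler {

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // The user started the broadcast from the picker; set up your uploader here.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Hand video frames to your streaming backend.
            break
        case .audioApp, .audioMic:
            // Handle app/mic audio if you need it.
            break
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        // Clean up when the user stops the broadcast.
    }
}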
Another common issue is that the Deployment Target for your app (specifically your app, not just your Broadcast Extension) needs to be at least iOS 14.0. If it's any lower, it won't show up even if your device is on the correct version.
On iOS 15, a long press on the PS button of the DualSense controller opens the App Library, and I don't receive a callback via the valueChangedHandler function.
This is how I handle all controller inputs:
func handleController(controller: GCController) {
    controller.extendedGamepad?.valueChangedHandler = { [weak self] (gamepad: GCExtendedGamepad, element: GCControllerElement) in
        guard let self = self else {
            return
        }
        // no feedback received when performing a long press on the PS button
    }
}
Can the App Library launch be suppressed somehow? Sony's PS Remote Play app somehow manages to suppress it, but I don't know how, nor can I find anything in Apple's official API documentation.
Edit: It seems this problem only occurs on iPads; on iPhones it doesn't exist. Is there some API on iPads to suppress this behaviour? I assume the vast majority of users don't want to open the App Library in the middle of a game.
If someone ever faces the same problem: you can actually disable system gestures for the Home button.
In Swift, all you have to add is this line (controller is a GCController object):
controller.physicalInputProfile.buttons[GCInputButtonHome]?.preferredSystemGestureState = .disabled
In Objective-C it would work like this:
controller.physicalInputProfile.buttons[GCInputButtonHome].preferredSystemGestureState = GCSystemGestureStateDisabled;
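In case it helps, here is a minimal sketch (my own arrangement, not from the linked thread) of applying this when a controller connects, using the standard GCControllerDidConnect notification:
import GameController

final class ControllerManager {

    func startObserving() {
        NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect,
            object: nil,
            queue: .main
        ) { notification in
            guard let controller = notification.object as? GCController else { return }
            // Keep the PS/Home button's long press from triggering the system App Library gesture.
            controller.physicalInputProfile.buttons[GCInputButtonHome]?
                .preferredSystemGestureState = .disabled
        }

        // Also cover controllers that were already connected at launch.
        for controller in GCController.controllers() {
            controller.physicalInputProfile.buttons[GCInputButtonHome]?
                .preferredSystemGestureState = .disabled
        }
    }
}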
Thanks to the Apple employee who helped me here
https://developer.apple.com/forums/thread/711905
Edit: on tvOS this isn't working, as the PS button (menu button) of a controller always has to act as a home event:
https://developer.apple.com/forums/thread/715012
I'm working on an iOS Swift application that will allow the user to record the entire screen: any app and even the home screen.
In order to do that, I added a Broadcast Upload Extension to my app.
First I used the RPSystemBroadcastPickerView class to add a record button to my view that allows the user to open the record popup and select which app they want to broadcast the screen to. And it's working fine.
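For reference, a minimal sketch of how that picker is typically set up (my own illustration; the preferredExtension value is the Broadcast Upload Extension's bundle identifier used later in this question):
import ReplayKit
import UIKit

func addBroadcastPickerButton(to view: UIView) {
    let pickerView = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    // Point the picker at the Broadcast Upload Extension so only this app's service is offered.
    pickerView.preferredExtension = "ch.jroueche.RecordApp.TestScreen"
    pickerView.showsMicrophoneButton = false
    view.addSubview(pickerView)
}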
But I would like to avoid this step and open the popup directly when the app launches.
So I wrote the following code to do that:
RPBroadcastActivityViewController.load(withPreferredExtension: "ch.jroueche.RecordApp.TestScreen", handler: { broadcastAVC, error in
    guard error == nil else {
        print("Cannot load Broadcast Activity View Controller.")
        return
    }
    if let broadcastAVC = broadcastAVC {
        broadcastAVC.delegate = self
        self.present(broadcastAVC, animated: true, completion: {
            // RPBroadcastActivityViewController will perform the callback when the broadcast starts (or fails)
            print("I've START")
        })
    }
})
Unlike with the RPSystemBroadcastPickerView solution, I'm getting the following error:
The preferred broadcast service could not be found.
My issue is similar to the following post:
App not showing up as a broadcast service in RPBroadcastActivityViewController
I also added the extension and the preferred extension identifier is correct.
Why would it be possible using RPSystemBroadcastPickerView but not programmatically using the RPBroadcastActivityViewController class? That doesn't make sense to me.
Does someone have an idea of what the issue could be and how I could fix it? Or a workaround in order to do this screen recording?
Thanks in advance
It appears that RPBroadcastActivityViewController shows ONLY Broadcast Setup UI Extensions, while RPSystemBroadcastPickerView shows ONLY Broadcast Upload Extensions. But I have no idea why, as both of them are documented to show a list of available providers/services.
It would be very helpful if someone could bring more details on the topic.
When I designed my App Clip launch experience, I had in mind that the App Clip can only be triggered via QR code, NFC or App Clip Code. That's why I linked the launch to a specific location with a specific Id.
My App went live last week, and when I scan an NFC tag the App launches as expected every time.
Now, if I tap the App Clip icon on the home screen, the App launches with the last URL scanned. I did some googling and found that the App Clip caches the last URL scanned and simulates a universal link launch when the icon is tapped!
This is not working for me! So I am looking for a way to check whether the App was launched via scan or tap. I tried to log the App launch, but it always runs in the same order, whether via scan (NFC) or icon tap:
AppDelegate.didFinishLaunchingWithOptions()
SceneDelegate.willConnectTo() // This is where I handle the universal link
How can I check whether the user launched the App via tap or scan, knowing that the App always simulates a universal link launch when the icon is tapped?
Or how can I look for the saved URL? I tried to fetch all UserDefaults and some Keychain data, but I found nothing!
I faced the same issue! And unfortunately there's no way to:
Check how the App was launched (icon tap or NFC/QR scan)
Retrieve the cached data from either UserDefaults or Keychain
Apple says clearly in their Human Interface Guidelines that if you want to support multiple businesses you should verify the user's location:
Consider multiple businesses. An App Clip may power many different businesses or a business that has multiple locations. In both scenarios, people may end up using the App Clip for more than one business or location at a time. The App Clip must handle this use case and update its user interface accordingly. For example, consider a way to switch between recent businesses or locations within your App Clip, and verify the user's location when they launch it.
So, your tags for a specific location should now be mapped to coordinates (latitude, longitude). Apple has introduced a location verification API just for App Clips that lets you do a one-time check to see whether the App Clip Code, NFC tag or QR code that the user scanned is where it says it is.
Enable Your App Clip to Verify the User's Location
To enable your App Clip to verify the user's location, modify your App Clip's Info.plist file:
Open your App Clip's Info.plist, add the NSAppClip key, and set its type to Dictionary.
Add an entry to the dictionary with NSAppClipRequestLocationConfirmation as the key, select Boolean as its type, and set its value to true.
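The resulting entry in the App Clip's Info.plist should look roughly like this (shown in the plist's XML source form):
<key>NSAppClip</key>
<dict>
    <key>NSAppClipRequestLocationConfirmation</key>
    <true/>
</dict>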
But using App Clip location verification is different:
Parse the information in the URL that launches the App Clip.
Send a request to your database to fetch the location information for this business.
Use activity.appClipActivationPayload to confirm that the location from step 2 is in the region where the user is right now.
The code below (copied from Apple) shows how to do it.
import UIKit
import AppClip
import CoreLocation

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Call the verifyUserLocation(_:) function in all applicable life-cycle callbacks.
    func verifyUserLocation(_ activity: NSUserActivity?) {
        // Guard against faulty data.
        guard activity != nil else { return }
        guard activity!.activityType == NSUserActivityTypeBrowsingWeb else { return }
        guard let payload = activity!.appClipActivationPayload else { return }
        guard let incomingURL = activity?.webpageURL else { return }

        // Create a CLRegion object.
        guard let region = location(from: incomingURL) else {
            // Respond to parsing errors here.
            return
        }

        // Verify that the invocation happened at the expected location.
        payload.confirmAcquired(in: region) { (inRegion, error) in
            guard let confirmationError = error as? APActivationPayloadError else {
                if inRegion {
                    // The location of the NFC tag matches the user's location.
                } else {
                    // The location of the NFC tag doesn't match the records;
                    // for example, if someone moved the NFC tag.
                }
                return
            }

            if confirmationError.code == .doesNotMatch {
                // The scanned URL wasn't registered for the App Clip.
            } else {
                // The user denied location access, or the source of the
                // App Clip's invocation wasn't an NFC tag or visual code.
            }
        }
    }

    func location(from url: URL) -> CLRegion? {
        // You should retrieve the coordinates from your database.
        let coordinates = CLLocationCoordinate2D(latitude: 37.334722,
                                                 longitude: -122.008889)
        return CLCircularRegion(center: coordinates,
                                radius: 100,
                                identifier: "Apple Park")
    }
}
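To connect this to the launch paths mentioned above, here is a minimal sketch of the two scene life-cycle callbacks that hand you the NSUserActivity (they would live in the same SceneDelegate; the bodies are my own illustration, not part of Apple's sample):
// Cold launch: the activity arrives with the connection options.
func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
           options connectionOptions: UIScene.ConnectionOptions) {
    verifyUserLocation(connectionOptions.userActivities.first)
}

// The App Clip is already running and gets invoked again.
func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
    verifyUserLocation(userActivity)
}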
And that's it, this is how you support multiple businesses with an App Clip.
I created a new app in Xcode and added the following code in the AppDelegate file
func updateCarWindow() {
    guard let screen = UIScreen.screens.first(where: { $0.traitCollection.userInterfaceIdiom == .carPlay }) else {
        // CarPlay is not connected
        self.carWindow = nil
        return
    }

    // CarPlay is connected
    let carWindow = UIWindow(frame: screen.bounds)
    carWindow.screen = screen
    carWindow.makeKeyAndVisible()
    carWindow.rootViewController = CarViewController(nibName: nil, bundle: nil)
    self.carWindow = carWindow
}
and called the function from the application(_:didFinishLaunchingWithOptions:) delegate method. The app is not showing on the CarPlay external display.
You don't have direct access to the CarPlay screen; CarPlay manages everything through the CPInterfaceController class, which can display so-called templates (such as CPListTemplate and a handful of others). Your ability to draw on the screen is pretty much limited to drawing maps in a CPMapContentWindow.
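For illustration, a minimal sketch of handing a template to CPInterfaceController, assuming an iOS 13+ scene-based setup with a CPTemplateApplicationSceneDelegate (this is not the asker's code, and the CarPlay entitlement still has to match your app category):
import CarPlay

// Registered as the CarPlay scene delegate in Info.plist.
class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {

    var interfaceController: CPInterfaceController?

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController

        // CarPlay only accepts templates; you can't hand it an arbitrary UIWindow/view controller.
        let item = CPListItem(text: "Hello CarPlay", detailText: "Rendered by the system")
        let section = CPListSection(items: [item])
        let listTemplate = CPListTemplate(title: "Demo", sections: [section])
        interfaceController.setRootTemplate(listTemplate, animated: true)
    }
}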
I recommend you read the Apple docs first starting here:
https://developer.apple.com/carplay/documentation/CarPlay-Navigation-App-Programming-Guide.pdf
Don't forget to set the correct app permissions and CarPlay entitlements, otherwise it simply won't work and it might not tell you why.
And a final word: the CarPlay framework is only supposed to work with navigation apps. Everything else would require a lot of workarounds, not to mention it would never pass app review.
Hope this helps
I'm currently using Fastlane Snapshot to automate taking screenshots for my application. It's all based on UI Tests.
I'm trying to add this same functionality to an iMessage App/Extension.
So currently I have a test that goes through the app, taps buttons, fills in text fields, takes the screenshots, etc.
After all that is done I'd like it to close the application (click the home button), open iMessage, interact with my iMessage application and take some screenshots there as well.
Is this possible? If so how can I achieve this? Automating screenshots for this one application has been amazing and I'd love to be able to do that for the iMessage App as well.
There is currently no UI testing support for iMessage app extensions in Xcode. But you can do it by launching Messages yourself and finding elements in the Messages app. First, you'll have to launch the Messages app and open a conversation:
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate()
messageApp.activate()
messageApp.cells.firstMatch.tap()
Then, you can access your iMessage app by doing so:
// Replace appIndex by the position of your app in the iMessage bottom bar
let appIndex = 2
messageApp.collectionViews.descendants(matching: .cell).element(boundBy: appIndex).tap()
When your iMessage app is opened in expanded mode, you can access the close button:
let closeButton = messageApp.buttons.element(boundBy: 1)
If you want to test your iMessage app when the user sends a message and then opens it, you can do it this way:
// Send your message after it is inserted in the Messages app text field
let sendButton = messageApp.buttons["sendButton"]
waitForElementToExists(sendButton)
sendButton.tap()
// Tap on the iMessage first bubble
let firstBubble = messageApp.collectionViews["TranscriptCollectionView"].cells.element(boundBy: 2)
waitForElementToExists(firstBubble)
firstBubble.tap()
private func waitForElementToExists(_ element: XCUIElement) {
    let exists = NSPredicate(format: "exists == 1")
    expectation(for: exists, evaluatedWith: element, handler: nil)
    waitForExpectations(timeout: 5, handler: nil)
}
With Xcode 9 you can easily switch to other applications like Messages. The following code switches to Messages, interacts with elements within the app and then switches back to your own app.
let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
messageApp.terminate()
messageApp.activate()
messageApp.cells.staticTexts["Kate Bell"].tap()
XCUIApplication().activate()
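Putting this together with fastlane, a rough sketch of a test method that opens the iMessage app drawer and records a screenshot could look like the following. It assumes fastlane's SnapshotHelper.swift is added to the UI test target (so setupSnapshot(_:) and snapshot(_:) are available), that the simulator has a seeded conversation, and that the app drawer index matches your setup; whether the captured image includes Messages depends on your SnapshotHelper version.
func testIMessageAppScreenshots() {
    // Launch the host app first so fastlane's snapshot setup is applied.
    let app = XCUIApplication()
    setupSnapshot(app)
    app.launch()

    // Switch to Messages and open an existing conversation.
    let messageApp = XCUIApplication(bundleIdentifier: "com.apple.MobileSMS")
    messageApp.terminate()
    messageApp.activate()
    messageApp.cells.firstMatch.tap()

    // Open our iMessage app from the app drawer (index is specific to the simulator's layout).
    let appIndex = 2
    messageApp.collectionViews.descendants(matching: .cell).element(boundBy: appIndex).tap()

    // Capture the screenshot for fastlane.
    snapshot("01-iMessageApp")
}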