Is there any way to customize "You have changed the icon for AppName" message after UIApplication.shared.setAlternateIconName("image") call or get rid of it at all?
Sadly you can't because it is provided by the system, according to Apple's official guidelines.
Note that your app icon can only be changed at the user's request and the system always provides the user with confirmation of such a change.
https://developer.apple.com/ios/human-interface-guidelines/icons-and-images/app-icon/
Well, there's a way to customize the text, but I'm not sure whether Apple would approve it. We need to find the label:
func findTheLabel(in view: UIView) {
    for item in view.subviews {
        if let label = item as? UILabel, label.text == "You have changed the icon for “myApp”." {
            label.text = "Hi, \n kind greetings from “myApp”."
            label.textAlignment = .justified
        }
        guard item.subviews.count > 0 else { continue }
        findTheLabel(in: item)
    }
}
And simply call it right after:
UIApplication.shared.setAlternateIconName(iconName) { (error) in
    print(error)
}
with a small delay:
Timer.scheduledTimer(withTimeInterval: 0.3, repeats: false) { _ in
    // topViewController() is a common helper extension on UIApplication, not a UIKit API
    if let view = UIApplication.topViewController()?.view {
        self.findTheLabel(in: view)
    }
}
Maybe someone would find it useful.
When using VoiceOver in iOS, calling UIAccessibility.post(notification:argument:) to announce a field error doesn't actually announce the error.
I have a submit button and, when focusing the button, VoiceOver reads the button title as you would expect. When pressing the button, VoiceOver reads the title again. When the submit button is pressed, I am doing some validation and, when there is a field error, I am trying to announce it by calling:
if UIAccessibility.isVoiceOverRunning {
    UIAccessibility.post(notification: .announcement, argument: "my field error")
}
Interestingly enough, if I stop on a breakpoint in the debugger the announcement happens. When I don't stop on a breakpoint, the announcement doesn't happen.
The notification is posted on the main thread and, if it behaves like NotificationCenter.default, I assume it is handled on the same thread it was posted on. I have tried to dispatch the call to the main queue, even though it is already on the main thread, and that doesn't seem to work either.
The only thing that I can think is that the notification is posted and observed before voice over is finished reading the submit button title and the announcement notification won't interrupt the current voice over.
I would really appreciate any help on this.
This is an admittedly hacky solution, but I was able to prevent the system announcement from pre-empting my own by dispatching to the main thread with a slight delay:
DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
    UIAccessibility.post(notification: .announcement, argument: "<Your text>")
}
Another workaround is to use .screenChanged instead and pass the error label, as:
UIAccessibility.post(notification: .screenChanged, argument: errorLabel)
Your problem may happen because the system needs to take over when the field error appears and, in that case, any custom VoiceOver notification is cancelled.🤯
I wrote an answer about problems with queueing multiple VoiceOver notifications that may help you understand your current situation.🤓
Your notification works with a breakpoint because you're delaying it and the system does its work during that time: there's no overlap between your notification and the system's work.
A simple solution may be to implement a short delay before sending your notification, but the right delay depends on the speech rate, which is why this is only a temporary workaround. 🙄
Your retry mechanism is smart and could be improved with a loop of a few retries in case of repeated system takeovers (see the sketch below). 👍
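For reference, a bounded-retry version of that idea might look roughly like this. This is only a sketch inside a view controller; the names maxAnnouncementRetries, announcementRetries, lastAnnouncement and announceFieldError are my own, not from the original answers:
// Sketch only: register the observer once (e.g. from viewDidLoad).
private let maxAnnouncementRetries = 3
private var announcementRetries = 0
private var lastAnnouncement = ""

func startObservingAnnouncements() {
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(announcementFinished(_:)),
                                           name: UIAccessibility.announcementDidFinishNotification,
                                           object: nil)
}

func announceFieldError(_ message: String) {
    announcementRetries = 0
    lastAnnouncement = message
    UIAccessibility.post(notification: .announcement, argument: message)
}

@objc private func announcementFinished(_ notification: Notification) {
    guard
        let userInfo = notification.userInfo,
        let announcement = userInfo[UIAccessibility.announcementStringValueUserInfoKey] as? String,
        let success = userInfo[UIAccessibility.announcementWasSuccessfulUserInfoKey] as? Bool,
        announcement == lastAnnouncement,
        !success,
        announcementRetries < maxAnnouncementRetries   // give up after a few system takeovers
    else { return }

    announcementRetries += 1
    UIAccessibility.post(notification: .announcement, argument: lastAnnouncement)
}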
I was facing the same issue, so I picked up @brandenesmith's idea of the notification queue and wrote a little helper class.
class AccessibilityAnnouncementQueue: NSObject {
    static let shared = AccessibilityAnnouncementQueue()

    private var queue: [String] = []

    private override init() {
        super.init()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(announcementFinished(_:)),
                                               name: UIAccessibility.announcementDidFinishNotification,
                                               object: nil)
    }

    func post(announcement: String) {
        guard UIAccessibility.isVoiceOverRunning else { return }
        queue.append(announcement)
        postNotification(announcement)
    }

    private func postNotification(_ message: String) {
        let attrMessage = NSAttributedString(string: message,
                                             attributes: [.accessibilitySpeechQueueAnnouncement: true])
        UIAccessibility.post(notification: .announcement, argument: attrMessage)
    }

    @objc private func announcementFinished(_ sender: Notification) {
        guard
            let userInfo = sender.userInfo,
            let firstQueueItem = queue.first,
            let announcement = userInfo[UIAccessibility.announcementStringValueUserInfoKey] as? String,
            let success = userInfo[UIAccessibility.announcementWasSuccessfulUserInfoKey] as? Bool,
            firstQueueItem == announcement
        else { return }

        if success {
            queue.removeFirst()
        } else {
            postNotification(firstQueueItem)
        }
    }
}
I am able to get this to work using a retry mechanism where I register as an observer of the UIAccessibility.announcementDidFinishNotification and then pull the announcement and success status out of the userInfo dictionary.
If the success status is false and the announcement is the same as the one I just sent, I post the notification again. This happens on repeat until the announcement was successful.
There are obviously multiple problems with this approach including having to de-register, what happens if another object manages to post the same announcement (this shouldn't ever happen in practice but in theory it could), having to keep track of the last announcement sent, etc.
The code would look like:
private var _errors: [String] = []
private var _lastAnnouncement: String = ""

init() {
    NotificationCenter.default.addObserver(
        self,
        selector: #selector(announcementFinished(_:)),
        name: UIAccessibility.announcementDidFinishNotification,
        object: nil
    )
}

func showErrors() {
    if !_errors.isEmpty {
        view.errorLabel.text = _errors.first!
        view.errorLabel.isHidden = false
        if UIAccessibility.isVoiceOverRunning {
            _lastAnnouncement = _errors.first!
            UIAccessibility.post(notification: .announcement, argument: _errors.first!)
        }
    } else {
        view.errorLabel.text = ""
        view.errorLabel.isHidden = true
    }
}

@objc func announcementFinished(_ sender: Notification) {
    guard let announcement = sender.userInfo?[UIAccessibility.announcementStringValueUserInfoKey] as? String else { return }
    guard let success = sender.userInfo?[UIAccessibility.announcementWasSuccessfulUserInfoKey] as? Bool else { return }

    if !success && announcement == _lastAnnouncement {
        _lastAnnouncement = _errors.first!
        UIAccessibility.post(notification: .announcement, argument: _errors.first!)
    }
}
The problem is that this retry mechanism will always be used, because the first call to UIAccessibility.post(notification: .announcement, argument: _errors.first!) always fails (unless I am stopped on a breakpoint). I still don't know why the first post always fails.
If somebody uses RxSwift, the following solution will probably be more suitable:
extension UIAccessibility {
    static func announce(_ message: String) -> Completable {
        guard !message.isEmpty else { return .empty() }
        return Completable.create { subscriber in
            let postAnnouncement = {
                DispatchQueue.main.async {
                    UIAccessibility.post(notification: .announcement, argument: message)
                }
            }
            postAnnouncement()
            let observable = NotificationCenter.default.rx.notification(UIAccessibility.announcementDidFinishNotification)
            return observable.subscribe(onNext: { notification in
                guard let userInfo = notification.userInfo,
                      let announcement = userInfo[UIAccessibility.announcementStringValueUserInfoKey] as? String,
                      announcement == message,
                      let success = userInfo[UIAccessibility.announcementWasSuccessfulUserInfoKey] as? Bool else { return }
                success ? subscriber(.completed) : postAnnouncement()
            })
        }
    }
}
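A call site for this extension could then look roughly like this (the error string and the disposeBag property are placeholders):
private let disposeBag = DisposeBag()

UIAccessibility.announce("my field error")
    .subscribe(onCompleted: {
        // the announcement was actually spoken
    })
    .disposed(by: disposeBag)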
Here is the code used in the app:
// setup recognizer
let usdlRecognizer = MBUsdlCombinedRecognizer()
usdlRecognizer.returnFullDocumentImage = true
usdlRecognizer.scanUncertain = false

// delegate
func documentVerificationOverlayViewControllerDidFinishScanning(_ documentVerificationOverlayViewController: MBDocumentVerificationOverlayViewController, state: MBRecognizerResultState) {
    if state == MBRecognizerResultState.valid {
        // first, pause scanning until we process all the results
        documentVerificationOverlayViewController.recognizerRunnerViewController?.pauseScanning()
        DispatchQueue.main.async(execute: { () -> Void in
            documentVerificationOverlayViewController.dismiss(animated: false, completion: {
            })
            self.parseResult(recognizer: self.recognizer)
            if let recognizer = self.recognizer as? MBUsdlCombinedRecognizer, recognizer.result.documentDataMatch == true {
                // *********** Its always coming here even with mismatched cards ****************
                print("both sides of cards matched.. success")
            } else {
                showAlert(title: "Error", message: "Data not matched", vc: self, okAction: {
                })
            }
        })
    }
}
It always goes to the success block of the if statement, even when we scan a different person's US driving license.
That's because the MBUsdlCombinedRecognizer only scans the face and document image on the front.
It doesn't read the fields on the front; all other fields are read from the PDF417 barcode on the back side, so it can't compare front- and back-side results.
We are working on support for the front side of the USDL, and in Q2 2019 we will have some exciting news! A small hint: autodetection.
Kind regards
I have two Siri shortcuts in my app.
I use NSUserActivity to donate these shortcuts. I've also created two NSUserActivityTypes in my Info.plist.
There are two view controllers which handle these shortcuts (one view controller per shortcut).
If I add one Siri shortcut from the first view controller and then go to the second view controller, the native Siri shortcut button (INUIAddVoiceShortcutButton) on the second view controller automatically picks up the first shortcut (created from the first view controller) and shows "Added to Siri" with the suggested phrase instead of showing the "Add to Siri" button. I double-checked that each NSUserActivity has a different identifier, but somehow it still picks the wrong shortcut.
View Controller 1:
let userActivity = NSUserActivity(activityType: "com.activity.type1")
userActivity.isEligibleForSearch = true
userActivity.isEligibleForPrediction = true
userActivity.title = shortcut.title
userActivity.suggestedInvocationPhrase = suggestedPhrase
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.contentDescription = description
userActivity.contentAttributeSet = attributes
let shortcut = INShortcut(userActivity: userActivity)
let siriButton = INUIAddVoiceShortcutButton(style: .whiteOutline)
siriButton.translatesAutoresizingMaskIntoConstraints = false
siriButton.shortcut = shortcut
self.view.addSubview(siriButton)
View Controller 2:
let userActivity2 = NSUserActivity(activityType: "com.activity.type2")
userActivity2.isEligibleForSearch = true
userActivity2.isEligibleForPrediction = true
userActivity2.title = shortcut.title
userActivity2.suggestedInvocationPhrase = suggestedPhrase
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.contentDescription = description
userActivity2.contentAttributeSet = attributes
let shortcut = INShortcut(userActivity: userActivity2)
let siriButton = INUIAddVoiceShortcutButton(style: .whiteOutline)
siriButton.translatesAutoresizingMaskIntoConstraints = false
siriButton.shortcut = shortcut
self.view.addSubview(siriButton)
A similar thing happens when I delete the App and reinstall without deleting the shortcuts from Phone's Settings App.
Seems like it's an iOS bug. I figured out a workaround for this problem. You have to create a new Siri button every time the user adds/edits the Siri shortcut. Before creating the Siri button, do the following things:
1- Get all the voice shortcuts from INVoiceShortcutCenter by calling the function below. Note that this happens asynchronously, so you need to do it some time before you need the data (e.g. in your AppDelegate). You'll also need to re-load this whenever the user adds a Siri shortcut (probably in the INUIAddVoiceShortcutViewControllerDelegate.addVoiceShortcutViewController(_:didFinishWith:error:) method; see the sketch after step 3).
INVoiceShortcutCenter.shared.getAllVoiceShortcuts { (voiceShortcutsFromCenter, error) in
    guard let voiceShortcutsFromCenter = voiceShortcutsFromCenter else {
        if let error = error as NSError? {
            os_log("Failed to fetch voice shortcuts with error: %@", log: OSLog.default, type: .error, error)
        }
        return
    }
    self.voiceShortcuts = voiceShortcutsFromCenter
}
2- In view controller 1, check whether the shortcut has already been added by iterating over all the voice shortcuts:
let voiceShorcut = voiceShortcuts.first { (voiceShortcut) -> Bool in
    if let activity = voiceShortcut.shortcut.userActivity, activity.activityType == "com.activity.type1" {
        return true
    }
    return false
}
3- If your voice shortcut is registered, pass the INShortcut to the Siri button; otherwise don't set it.
if voiceShorcut != nil {
    let shortcut = INShortcut(userActivity: userActivity1)
    siriButton.shortcut = shortcut
}
Do the same thing in Second View Controller.
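For the re-load mentioned in step 1, a minimal sketch of the add-shortcut delegate could look like this (MyViewController, the voiceShortcuts property and the configureSiriButton() helper are placeholders, not from the original answer):
extension MyViewController: INUIAddVoiceShortcutViewControllerDelegate {
    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        // Re-fetch the registered shortcuts, then rebuild the Siri button.
        INVoiceShortcutCenter.shared.getAllVoiceShortcuts { [weak self] (shortcuts, error) in
            DispatchQueue.main.async {
                self?.voiceShortcuts = shortcuts ?? []
                self?.configureSiriButton()   // hypothetical helper that recreates the button
            }
        }
        controller.dismiss(animated: true, completion: nil)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true, completion: nil)
    }
}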
It's an iOS 12.0 bug.
You can fix it by updating INUIAddVoiceShortcutButton.voiceShortcut with the correct value.
Use KVO to observe the "voiceShortcut" property and, when it changes, assign the correct value to it.
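A rough sketch of that observation, assuming (as this answer suggests) that the button exposes a KVO-observable "voiceShortcut" key — this is not documented API, so treat the key path as an assumption — could look like:
// Sketch only: "voiceShortcut" is the key path the answer above refers to; it is not documented API.
private var shortcutObservationContext = 0

func observeSiriButton(_ button: INUIAddVoiceShortcutButton) {
    button.addObserver(self,
                       forKeyPath: "voiceShortcut",
                       options: [.new],
                       context: &shortcutObservationContext)
    // remember to call removeObserver(_:forKeyPath:) in deinit
}

override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    guard context == &shortcutObservationContext, keyPath == "voiceShortcut" else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    // Re-assign the value you know is correct here (e.g. the INVoiceShortcut you
    // fetched from INVoiceShortcutCenter), if the button picked up the wrong one.
}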
I've moved to an intents setup now, and I find that even with just one intent set up and working, the INUIAddVoiceShortcutButton is not able to track my shortcut. Once a phrase is recorded, it shows "Added to Siri" with the phrase.
But every time the app relaunches, the "Add to Siri" button shows up instead of the "Added to Siri" button with the recorded phrase.
I have tried going by Bilal's suggestion and, although I can see INVoiceShortcutCenter showing my shortcut as present, it doesn't load it into the Siri button.
My code looks like this for the button itself.
private func addSiriButton() {
    let addShortcutButton = INUIAddVoiceShortcutButton(style: .blackOutline)
    addShortcutButton.delegate = self
    addShortcutButton.shortcut = INShortcut(intent: engine.intent)
    addShortcutButton.translatesAutoresizingMaskIntoConstraints = false

    siriButtonSubView.addSubview(addShortcutButton)
    siriButtonSubView.centerXAnchor.constraint(equalTo: addShortcutButton.centerXAnchor).isActive = true
    siriButtonSubView.centerYAnchor.constraint(equalTo: addShortcutButton.centerYAnchor).isActive = true
}
I have all the protocols implemented and I had a close look at the Soup Chef sample app, but I just can't figure out what drives this inaccuracy.
Funnily enough, even the British Airways app developers seem to have given up on that, as their button has exactly the same faulty behaviour.
Update: I've built another test project with a minimal implementation of the intent, and the Add to Siri / Added to Siri flow works perfectly. I'm guessing at this point that there is something in my own app's codebase that is causing this unwanted behaviour.
Update 2: Just wanted to let everyone know I have fixed the issue. Using intents works fine, but there is definitely a little sensitivity in the intents definition file itself. All I had to do was create a new intent, which was then generated, and that worked. Seems my initial intent was somehow corrupt, but there were no errors. After creating another intent and re-assigning the intent-handling function to it, everything worked as intended. (pun intended)
I encountered this error when I had an existing intent and working configuration, but added a new parameter. However, in my Intent configuration, I had not added the new parameter name to a supported combination under the Shortcuts app section.
For example, if I had two properties myId and myName, and specified them as such:
let intent = MyIntent()
intent.myId = 1234
intent.myName = "banana"
Then I would need a supported combination of myId, myName in my intents definition file. In my particular case, I had forgotten myName so the INUIAddVoiceShortcutButton was attempting to do a lookup using myId, myName but didn't know how.
I just fixed this issue myself by changing my implementation (originally based on the SoupChef app) to this code sample provided by Apple (https://developer.apple.com/documentation/sirikit/inuiaddvoiceshortcutbutton):
EDIT: I added code that shows how I create and pass in the shortcutObject (INShortcut) for both UserActivity and custom Intent shortcuts.
The Shortcut class is an enum that contains a computed property called intent that returns an instantiation of the custom intent.
private func addShortcutButton(shortcut: Shortcut, parentViewController: UIViewController, shortcutViewControllerDelegate: INUIAddVoiceShortcutViewControllerDelegate) {
    guard let view = parentViewController.view else { return }

    if let intent = shortcut.intent {
        shortcutObject = INShortcut(intent: intent)
    } else if let userActivity = view.userActivity {
        shortcutObject = INShortcut(userActivity: userActivity)
    }

    self.shortcutViewControllerDelegate = shortcutViewControllerDelegate
    addSiriButton(to: shortcutButtonContainer)
}

func addSiriButton(to view: UIView) {
    let button = INUIAddVoiceShortcutButton(style: .whiteOutline)
    button.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(button)
    view.centerXAnchor.constraint(equalTo: button.centerXAnchor).isActive = true
    view.centerYAnchor.constraint(equalTo: button.centerYAnchor).isActive = true
    button.addTarget(self, action: #selector(addToSiri(_:)), for: .touchUpInside)
}

// Present the Add Shortcut view controller after the
// user taps the "Add to Siri" button.
@objc
func addToSiri(_ sender: Any) {
    guard let shortcutObject = shortcutObject else { return }
    let viewController = INUIAddVoiceShortcutViewController(shortcut: shortcutObject)
    viewController.modalPresentationStyle = .formSheet
    viewController.delegate = shortcutViewControllerDelegate
    parentViewController?.present(viewController, animated: true, completion: nil)
}
So we can't use the default Siri button; you have to use a custom UIButton.
The VoiceShortcutsManager class fetches all the voice shortcuts, and then we can search that list to check whether a matching one exists: if it does, we should offer editing; if not, we should offer adding.
public class VoiceShortcutsManager {
    private var voiceShortcuts: [INVoiceShortcut] = []

    public init() {
        updateVoiceShortcuts(completion: nil)
    }

    public func voiceShortcut(for order: DeviceIntent, powerState: State) -> INVoiceShortcut? {
        for element in voiceShortcuts {
            guard let intent = element.shortcut.intent as? ToggleStateIntent else {
                continue
            }
            let deviceIntent = DeviceIntent(identifier: intent.device?.identifier, display: intent.device?.displayString ?? "")
            if order == deviceIntent && powerState == intent.state {
                return element
            }
        }
        return nil
    }

    public func updateVoiceShortcuts(completion: (() -> Void)?) {
        INVoiceShortcutCenter.shared.getAllVoiceShortcuts { (voiceShortcutsFromCenter, error) in
            guard let voiceShortcutsFromCenter = voiceShortcutsFromCenter else {
                if let error = error {
                    print("Failed to fetch voice shortcuts with error: \(error.localizedDescription)")
                }
                return
            }
            self.voiceShortcuts = voiceShortcutsFromCenter
            if let completion = completion {
                completion()
            }
        }
    }
}
And then implement it in your view controller:
class SiriAddViewController: ViewController {
    let voiceShortcutManager = VoiceShortcutsManager.init()

    override func viewDidLoad() {
        super.viewDidLoad()
        contentView.btnTest.addTarget(self, action: #selector(self.testBtn), for: .touchUpInside)
    }

    ...

    @objc func testBtn() {
        let deviceIntent = DeviceIntent(identifier: smartPlug.deviceID, display: smartPlug.alias)
        // if the action already has a shortcut, update the shortcut; otherwise create one
        if let shortcut = voiceShortcutManager.voiceShortcut(for: deviceIntent, powerState: .off) {
            let editVoiceShortcutViewController = INUIEditVoiceShortcutViewController(voiceShortcut: shortcut)
            editVoiceShortcutViewController.delegate = self
            present(editVoiceShortcutViewController, animated: true, completion: nil)
        } else if let shortcut = INShortcut(intent: intentTurnOff) {
            let addVoiceShortcutVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
            addVoiceShortcutVC.delegate = self
            present(addVoiceShortcutVC, animated: true, completion: nil)
        }
    }
}
@available(iOS 12.0, *)
extension SiriAddViewController: INUIAddVoiceShortcutButtonDelegate {
    func present(_ addVoiceShortcutViewController: INUIAddVoiceShortcutViewController, for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        addVoiceShortcutViewController.delegate = self
        addVoiceShortcutViewController.modalPresentationStyle = .formSheet
        present(addVoiceShortcutViewController, animated: true, completion: nil)
    }

    func present(_ editVoiceShortcutViewController: INUIEditVoiceShortcutViewController, for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        editVoiceShortcutViewController.delegate = self
        editVoiceShortcutViewController.modalPresentationStyle = .formSheet
        present(editVoiceShortcutViewController, animated: true, completion: nil)
    }
}

@available(iOS 12.0, *)
extension SiriAddViewController: INUIAddVoiceShortcutViewControllerDelegate {
    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController, didFinishWith voiceShortcut: INVoiceShortcut?, error: Error?) {
        voiceShortcutManager.updateVoiceShortcuts(completion: nil)
        controller.dismiss(animated: true, completion: nil)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true, completion: nil)
    }
}

@available(iOS 12.0, *)
extension SiriAddViewController: INUIEditVoiceShortcutViewControllerDelegate {
    func editVoiceShortcutViewController(_ controller: INUIEditVoiceShortcutViewController, didUpdate voiceShortcut: INVoiceShortcut?, error: Error?) {
        voiceShortcutManager.updateVoiceShortcuts(completion: nil)
        controller.dismiss(animated: true, completion: nil)
    }

    func editVoiceShortcutViewController(_ controller: INUIEditVoiceShortcutViewController, didDeleteVoiceShortcutWithIdentifier deletedVoiceShortcutIdentifier: UUID) {
        voiceShortcutManager.updateVoiceShortcuts(completion: nil)
        controller.dismiss(animated: true, completion: nil)
    }

    func editVoiceShortcutViewControllerDidCancel(_ controller: INUIEditVoiceShortcutViewController) {
        voiceShortcutManager.updateVoiceShortcuts(completion: nil)
        controller.dismiss(animated: true, completion: nil)
    }
}
This code was inspired by and adapted from this webpage:
https://www.nodesagency.com/test-drive-a-siri-shortcuts-intro/
My experience with solving this was a little different. Some intents added via the Add to Siri button worked and updated to "Added to Siri", while others didn't. I realised the actions that worked didn't require parameters.
After setting default values for the intents that expose parameters, which are then passed into the INShortcut (and assigned to the INUIAddVoiceShortcutButton), all buttons updated their state correctly!
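To illustrate, reusing the placeholder MyIntent from the earlier answer, that roughly means something like:
// Sketch only: MyIntent, myId and myName are the placeholder names from the earlier answer.
let intent = MyIntent()
intent.myId = 1234                 // give every exposed parameter a default value
intent.myName = "banana"
intent.suggestedInvocationPhrase = "Do the thing"

let button = INUIAddVoiceShortcutButton(style: .whiteOutline)
button.shortcut = INShortcut(intent: intent)   // the button can now match an already-added shortcut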
So I'm playing around a bit with iMessage apps, and have hit a weird issue. I want to try and use TouchID authentication inside of iMessage, and am able to pop the TouchID alert fine from the iMessage app. However, when I go to insert a message showing the result of TouchID, it won't insert the message for me. Here's the relevant code:
@IBAction func authenticateTapped(_ sender: Any) {
    let context = LAContext()
    var wasSuccessful = false

    self.group.enter()
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, localizedReason: "Testing authentication") { (successful, _) in
        wasSuccessful = successful
        self.group.leave()
    }

    self.group.notify(queue: DispatchQueue.main) {
        self.sendResult(wasSuccessful)
    }
}

@IBAction func sendMessageTapped(_ sender: Any) {
    sendResult(true)
}

func sendResult(_ successful: Bool) {
    guard let conversation = self.activeConversation else { fatalError("expected conversation") }

    var components = URLComponents()
    components.queryItems = [URLQueryItem(name: "successful", value: successful.description)]

    let layout = MSMessageTemplateLayout()
    layout.image = UIImage(named: "green_checkmark")
    layout.caption = "Authentication Result"

    let message = MSMessage(session: conversation.selectedMessage?.session ?? MSSession())
    message.url = components.url!
    message.layout = layout

    print("queryParts: \(String(describing: components.queryItems))")
    print("message: \(message)")
    print("activeConversation: \(String(describing: conversation))")

    conversation.insert(message) { (error) in
        print("in completion handler")
        print(error ?? "no error")
    }
}
When authenticateTapped is triggered, the TouchID prompt shows, I successfully authenticate, and then I see every log message inside the sendResult method, except for the ones in the completion handler of the insert method.
The weird thing is, when the sendMessageTapped method is fired, everything works as expected. Does anyone know what's going on here, and why I can't seem to insert a message after I successfully authenticate using TouchID?
The only thing I can think of that's different between the two is that the view controller is disappearing when the TouchID prompt comes up. However, if that were the cause, I would expect none of my print statements to show up in the console, whereas every one of them does except those in the completion handler.
Edit: I've done a bit more digging. When presenting the Touch ID authentication in compact mode, your view controller resigns active. When presenting in expanded mode, it stays active, allowing you to insert the message.
Does anyone know if resigning active when presenting the Touch ID alert is a bug or intended behavior?
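If the compact-mode deactivation is indeed the cause, one workaround worth trying (an untested sketch, not a confirmed fix) is to ask the extension for the expanded presentation style before showing the Touch ID prompt, since MSMessagesAppViewController exposes requestPresentationStyle(_:):
// Sketch: inside the MSMessagesAppViewController subclass, before evaluating the policy.
@IBAction func authenticateTapped(_ sender: Any) {
    if presentationStyle == .compact {
        requestPresentationStyle(.expanded)   // keep the extension active during Touch ID
    }

    let context = LAContext()
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Testing authentication") { success, _ in
        DispatchQueue.main.async {
            self.sendResult(success)
        }
    }
}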
I want to build a radio app, so I would like to use the stop button instead of the pause button in Control Center, like Apple's radio does in the native Music app.
Here is what I did in my RadioPlayer class:
private var shoutcastStream = NSURL(string: "http://shoutcast.com:PORT/;stream.mp3")

var playerItem: AVPlayerItem?
var player: AVPlayer?

let commandCenter = MPRemoteCommandCenter.sharedCommandCenter()

override init() {
    super.init()
    do {
        // Allow background audio
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch _ as NSError {
        }

        // Disable Next, Prev and Pause
        commandCenter.pauseCommand.enabled = false
        commandCenter.nextTrackCommand.enabled = false
        commandCenter.previousTrackCommand.enabled = false

        // Enable Play
        commandCenter.playCommand.enabled = true
        commandCenter.playCommand.addTarget(self, action: #selector(RadioPlayer.play))

        // Enable Stop
        commandCenter.stopCommand.enabled = true
        commandCenter.stopCommand.addTarget(self, action: #selector(RadioPlayer.stop))
    } catch _ as NSError {
    }
}
It's now working fine but the stop button isn't showing. Instead, I have the Pause button, which doesn't make sense for a radio player haha.
Note that in the above case, even if the control center is showing the pause button, nothing happens when pause button is pressed, because no target is attached to it (I attached it to the stopCommand).
So the question is: how to use that Stop button? Thank you.
EDIT:
I think the "stop" command is only displayed when MPNowPlayingInfoPropertyIsLiveStream = true (available only from iOS 10) /:
It does not matter if you disable the "pause" or "togglePlayPause" commands. From iOS 10, the "stop" command will be displayed if MPNowPlayingInfoPropertyIsLiveStream = true.
You may need to handle the "pause" or the "togglePlayPause" command too (for earlier versions). Good luck!
OK, I had this doubt too and couldn't find on the internet how to do what I wanted, so I started reading more about MPRemoteCommandCenter and MPNowPlayingInfoCenter.
I tried disabling all the buttons I didn't use. I also read about MPNowPlayingInfoPropertyIsLiveStream, and I'm sharing it in this post in case anyone finds it useful (look at the comments in the code):
Swift 3
MPNowPlayingInfoCenter (for metadata):
var songInfo = [:] as [String: Any]

if NSClassFromString("MPNowPlayingInfoCenter") != nil {
    songInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(image: UIImage(named: "your_image_name")!)
    songInfo[MPMediaItemPropertyTitle] = "Title"
    songInfo[MPMediaItemPropertyArtist] = "Artist name"

    // If this is a live broadcast, you can set a newer property (iOS 10+):
    // MPNowPlayingInfoPropertyIsLiveStream indicates that it is a live broadcast
    if #available(iOS 10.0, *) {
        songInfo[MPNowPlayingInfoPropertyIsLiveStream] = true
    } else {
        // Fallback on earlier versions
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = songInfo
} // end if MPNowPlayingInfoCenter
MPRemoteCommandCenter:
if #available(iOS 9.1, *) {
    let center = MPRemoteCommandCenter.shared()

    // Disable all buttons you will not use (including pause and togglePlayPause commands)
    [center.pauseCommand, center.togglePlayPauseCommand, center.nextTrackCommand, center.previousTrackCommand,
     center.changeRepeatModeCommand, center.changeShuffleModeCommand, center.changePlaybackRateCommand,
     center.seekBackwardCommand, center.seekForwardCommand, center.skipBackwardCommand, center.skipForwardCommand,
     center.changePlaybackPositionCommand, center.ratingCommand, center.likeCommand, center.dislikeCommand,
     center.bookmarkCommand].forEach {
        $0.isEnabled = false
    }

    // For the "play" command
    center.playCommand.addTarget { (commandEvent) -> MPRemoteCommandHandlerStatus in
        // play the song here
        return MPRemoteCommandHandlerStatus.success
    }

    // For the "stop" command
    center.stopCommand.addTarget { (commandEvent) -> MPRemoteCommandHandlerStatus in
        // stop the song here
        return MPRemoteCommandHandlerStatus.success
    }
} else {
    // Fallback on earlier versions
}
That's it. I hope this helps you and others (:
According to this question and answer, apparently Control Center isn't customizable (at least for now).