I'm having a problem with CallKit integration. I am creating a configuration like this:
let providerConfiguration = CXProviderConfiguration(localizedName: "XXX")
providerConfiguration.supportsVideo = false
providerConfiguration.maximumCallsPerCallGroup = 1
providerConfiguration.supportedHandleTypes = [.phoneNumber]
if let callKitIcon = UIImage(named: "IconMask") {
    providerConfiguration.iconTemplateImageData = callKitIcon.pngData()
}
providerConfiguration.ringtoneSound = "Ringtone.caf"
And then creating the provider like so:
self.provider = CXProvider(configuration: providerConfiguration)
self.provider.setDelegate(self, queue: nil)
The problem is that all of this seems to be ignored. The custom ringtone does not play, and IconMask does not show up in the iOS UI (it's just blank). IconMask is correct, and the 3 images are 40, 80 and 120 pixels with an alpha channel. Ringtone.caf is a valid sound file copied into the bundle.
Nothing in this CXProviderConfiguration seems to have any impact at all. Very frustrating! I get called back on the delegate function:
func providerDidBegin(_ provider: CXProvider)
And there I can inspect provider.configuration and it all looks correct.
What am I doing wrong?
The inbound call actually works; I am integrating with TwilioVoice and VoIP push. It's just that the UI is not picking up anything from the configuration.
Can you check that both the ringtone and the icon file have Target Membership selected in the File Inspector?
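If you want to verify that at runtime, here is a quick sanity check (a sketch, assuming the file names from the question):
// Sanity check using the names from the snippet above: if either assert fires,
// the resource is missing Target Membership and never made it into the app bundle.
assert(UIImage(named: "IconMask") != nil, "IconMask is not in the app bundle")
assert(Bundle.main.url(forResource: "Ringtone", withExtension: "caf") != nil,
       "Ringtone.caf is not in the app bundle")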
I'm trying to share a story with a background image and a sticker image via URL scheme in my iOS app. I am using the attached code and it does not work.
When I try to share just a background image or just a sticker, it works. But when I try to share both a background image and a sticker on top of it, it does not work.
Any ideas?
func shareToInstagram(deepLinkString: String) {
    let url = URL(string: "instagram-stories://share")!
    if UIApplication.shared.canOpenURL(url) {
        let backgroundData = UIImageJPEGRepresentation(UIImage(named: "shop_placeholder")!, 1.0)!
        let creditCardImage = UIImage(named: "share_instagram")!
        let stickerData = UIImagePNGRepresentation(creditCardImage)!
        let pasteBoardItems = [
            ["com.instagram.sharedSticker.backgroundImage" : backgroundData],
            ["com.instagram.sharedSticker.stickerImage" : stickerData],
        ]
        if #available(iOS 10.0, *) {
            UIPasteboard.general.setItems(pasteBoardItems, options: [.expirationDate: Date().addingTimeInterval(60 * 5)])
        } else {
            UIPasteboard.general.items = pasteBoardItems
        }
        UIApplication.shared.openURL(url)
    }
}
I copy-pasted OP's code into my own app (only substituting different UIImages) and found only one issue: the pasteboard items must be contained in a single dictionary, otherwise Instagram will render only the first item (in this case the background layer). To fix this, replace the declaration of pasteBoardItems with the following code:
let pasteBoardItems = [
    ["com.instagram.sharedSticker.backgroundImage" : backgroundData,
     "com.instagram.sharedSticker.stickerImage" : stickerData]
]
(Basically, just remove the closing and opening brackets separating the two items.)
Also, as a previous answer stated, make sure "instagram-stories" is included in LSApplicationQueriesSchemes in the Info.plist file.
I use this exact code in my app and it now works perfectly.
Everything is correct; my code is similar and it works on iOS 11+. I suggest the following:
check the image data you pass to the pasteboard (JPEG data can't be created with UIImagePNGRepresentation, and vice versa)
check the Info.plist: the "instagram-stories" scheme must be enabled in it (the LSApplicationQueriesSchemes key)
1. Like Alec said, you need to put all of the Instagram data in one list, not multiple lists. Look at the example from the Meta documentation:
NSArray *pasteboardItems = @[@{@"com.instagram.sharedSticker.stickerImage" : stickerImage,
                               @"com.instagram.sharedSticker.backgroundTopColor" : backgroundTopColor,
                               @"com.instagram.sharedSticker.backgroundBottomColor" : backgroundBottomColor}];
2. For more recent readers: as of Swift 4.2 and iOS 12, UIImageJPEGRepresentation is replaced by jpegData(compressionQuality:). Replace
let backgroundData = UIImageJPEGRepresentation(yourImage, 1.0)
with
let backgroundData = yourImage.jpegData(compressionQuality: 1.0)
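Putting both fixes together, here is a minimal sketch of the whole flow in current Swift (assuming iOS 10+, the same asset names as OP, and "instagram-stories" whitelisted in LSApplicationQueriesSchemes):
func shareToInstagram() {
    guard let url = URL(string: "instagram-stories://share"),
        UIApplication.shared.canOpenURL(url), // requires the LSApplicationQueriesSchemes entry
        let backgroundData = UIImage(named: "shop_placeholder")?.jpegData(compressionQuality: 1.0),
        let stickerData = UIImage(named: "share_instagram")?.pngData() else { return }
    // Both layers go in one dictionary inside a single-element array
    let pasteBoardItems: [[String: Any]] = [[
        "com.instagram.sharedSticker.backgroundImage": backgroundData,
        "com.instagram.sharedSticker.stickerImage": stickerData
    ]]
    UIPasteboard.general.setItems(pasteBoardItems,
                                  options: [.expirationDate: Date().addingTimeInterval(60 * 5)])
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}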
I'm just trying to have a form for users to create a ticket in my app. Nothing else is needed besides that.
I have used the methods here https://developer.zendesk.com/embeddables/docs/ios_support_sdk/requests
Here is my code
//Create a configuration object
let config = RequestUiConfiguration()
config.subject = "App Ticket"
config.tags = ["ios", "testing"]
config.ticketFormID = 20
//Present the SDK
let requestScreen = RequestUi.buildRequestUi(with: [config])
self.navigationController?.pushViewController(requestScreen, animated: true)
But the screen that shows in my app doesn't have a Send button or a back button. I also set the theme color to orange, but I don't see that change either:
Theme.currentTheme.primaryColor = UIColor.orange
That bottom bar with the Zendesk logo just takes me to a Zendesk page.
I have the proper setup in the app delegate as well:
Zendesk.initialize(appId: "myAppID",
                   clientId: "myClientID",
                   zendeskUrl: "myURL")
Support.initialize(withZendesk: Zendesk.instance)
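For what it's worth, a hedged variant to try (same SDK calls as above; the wrapping navigation controller and the ordering are my assumptions, not a confirmed fix): set the theme before building the UI, and present the request screen inside its own navigation controller so the SDK's bar buttons have a navigation bar to appear in.
// Sketch only: the ordering and the wrapping UINavigationController are assumptions
Theme.currentTheme.primaryColor = UIColor.orange // set the theme before the UI is built

let config = RequestUiConfiguration()
config.subject = "App Ticket"
config.tags = ["ios", "testing"]

let requestScreen = RequestUi.buildRequestUi(with: [config])
let nav = UINavigationController(rootViewController: requestScreen)
self.present(nav, animated: true, completion: nil)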
I followed these steps to get rich push notifications:
Created a Notification Service Extension with this plist:
NotificationService didReceive:
override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
    // Deliver the unmodified content if anything goes wrong
    func failEarly() {
        contentHandler(request.content)
    }
    self.contentHandler = contentHandler
    bestAttemptContent = (request.content.mutableCopy() as? UNMutableNotificationContent)
    // Get the custom data from the notification payload
    guard let data = request.content.userInfo as? [String: AnyObject],
        let urlString = data["attachment-url"] as? String,
        let fileUrl = URL(string: urlString) else {
            return failEarly()
    }
    // Download the attachment
    URLSession.shared.downloadTask(with: fileUrl) { (location, response, error) in
        if let location = location {
            // Move the temporary file so it keeps the original file extension
            let tmpUrl = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(fileUrl.lastPathComponent)
            try? FileManager.default.moveItem(at: location, to: tmpUrl)
            // Add the attachment to the notification content
            if let attachment = try? UNNotificationAttachment(identifier: "video", url: tmpUrl, options: nil) {
                self.bestAttemptContent?.attachments = [attachment]
            } else if let attachment = try? UNNotificationAttachment(identifier: "image", url: tmpUrl, options: nil) {
                self.bestAttemptContent?.attachments = [attachment]
            } else if let attachment = try? UNNotificationAttachment(identifier: "audio", url: tmpUrl, options: nil) {
                self.bestAttemptContent?.attachments = [attachment]
            } else if let attachment = try? UNNotificationAttachment(identifier: "image.gif", url: tmpUrl, options: nil) {
                self.bestAttemptContent?.attachments = [attachment]
            }
        }
        // Serve the notification content
        self.contentHandler!(self.bestAttemptContent!)
    }.resume()
}
Configured the App ID and provisioning profile for the extension.
The rich notification arrives correctly:
But here are the issues I am facing:
didReceive is not getting called. To debug it, I attached the service extension process to the app target and ran the app.
Note: The extension is launched as soon as the notification arrives, but didReceive is not called.
On opening the push notification (which has a video attachment), nothing happens. Ideally, the video should play.
If I have to open the video and play it, do I have to do something explicitly, or will the extension take care of that?
Payload:
{
    aps = {
        alert = "This is what your message will look like! Type in your message in the text area and get a preview right here";
        badge = 1;
        "mutable-content" = 1;
        sound = default;
    };
    "attachment-url" = "https://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4";
    deeplinkurl = "";
    "message_id" = 1609;
}
I tried going through the following posts, but they didn't help:
iOS10 UNNotificationServiceExtension not called
NotificationServiceExtension not called
UNNotificationServiceExtension not working on iPhone 5 (iOS 10)
Good news! Your service extension is indeed being called - the image on your notification is evidence of that. What is probably happening here is that you are unable to debug the extension using the workflow you are used to with applications.
Debugging notification extensions is not like debugging an app. Extensions are plug-ins to an iOS process outside your application. Just setting a breakpoint is not a reliable way to debug them. Instead:
Debugging A Notification Service Extension
Launch the app from Xcode or the device
In Xcode, select Attach To Process or PID By Name... from the Debug menu
Enter the name of your notification extension
Trigger a notification (by sending a push, etc.).
When the notification is delivered, the service extension should launch into the debugger. Service extensions are only relevant to remote (push) notifications, so you will need a physical device to troubleshoot them.
Debugging A Notification Content Extension
There are at least two ways. The steps shown above for a service extension also work for a content extension. The second method is more familiar but less reliable.
Select the extension scheme in Xcode using the toolbar
In the Product menu, select Edit Scheme...
Set the Executable to the parent application.
Set a breakpoint inside the content extension.
Now build and run your extension. It will launch the parent application.
Trigger a notification that will cause the content extension to load.
It's worth noting that adding logging using the logging framework can be very useful for debugging and troubleshooting as well.
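For example, a minimal sketch using os_log (the subsystem string is a placeholder); the output shows up in Console.app even when the debugger never attaches:
import os.log

let pushLog = OSLog(subsystem: "com.example.app.NotificationService", category: "push")

// Inside didReceive(_:withContentHandler:):
os_log("Service extension received request %{public}@", log: pushLog, type: .debug, request.identifier)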
Why The Video May Not Be Playing
iOS limits the size of content that can be presented in notifications. This is described in the documentation for UNNotificationAttachment. For video it is generally 50 MB. Make sure your video is as small as you can make it in terms of bytes, and of course provide a video that is sized appropriately for the device it will be played on. Do not try to play a 1080p video in a notification that is 400 points wide!
In practice it is almost always better to use HLS instead of downloading video, and present it in a content extension.
Another thing in your code that may be problematic is the identifiers you are assigning to your attachments. Identifiers should be unique. Typically this would be a reverse-domain notation string like your bundle ID followed by a UUID string. You could also use the original URL of the content followed by a UUID string. If you provide an empty string iOS will create a unique identifier for you.
With the user notifications framework having non-unique identifiers (for notifications, attachments, etc.) tends to cause difficult to track down issues inside the framework. For example, this can cause an attached watchOS device to crash.
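Applied to the download code above, that could look like this (a sketch; the reverse-domain prefix is a placeholder):
// One unique identifier per downloaded attachment; pass "" to let iOS generate one instead
let identifier = "com.example.app.attachment." + UUID().uuidString
if let attachment = try? UNNotificationAttachment(identifier: identifier, url: tmpUrl, options: nil) {
    self.bestAttemptContent?.attachments = [attachment]
}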
If you want to implement "auto play" for your video - it is not clear from your question whether that is what you are describing - you will need to implement your own player functionality in a content extension.
If you are going to do that, again, HLS is the preferred way to display video in a notification. It usually uses less RAM, offers a better user experience and tends to be more stable.
I am developing an iOS application with a button to report an issue using SMS/iMessage. I am using MFMessageComposeViewController to present the message composition interface using the following code (Swift 3):
if MFMessageComposeViewController.canSendText() {
    let controller = MFMessageComposeViewController()
    controller.messageComposeDelegate = self
    controller.body = "Example Message"
    controller.recipients = ["2345678901"]
    self.present(controller, animated: true, completion: nil)
}
I have also implemented the MFMessageComposeViewControllerDelegate function to dismiss properly. A standard text message / iMessage sends successfully, but the user does not have the option to attach an image. The buttons for camera, iMessage Apps, etc. are there, but they are disabled and cannot be pressed. How can I enable these buttons (camera, specifically) to allow my users to attach images to messages composed with the app?
The Buttons in Question:
EDIT:
Thanks Abdelahad for the suggestion. I've modified his response to allow multiple recipients and to include a message body. I also updated it to remove the deprecated addingPercentEscapes(using:) method.
Here is a solution using a url to open the Messages app. NOTE: This takes users out of the app.
let recipients = "2345678901,3456789012" //Phone Numbers
let messageBody = "This is a test"
let sms: String = "sms://open?addresses=\(recipients)&body=\(messageBody)"
let smsEncoded = sms.addingPercentEncoding(withAllowedCharacters: .urlFragmentAllowed)
let url = URL(string: smsEncoded!)
UIApplication.shared.openURL(url!)
But still I would like a solution that does not take the user out of the app. Is this possible? Why would the MFMessageComposeViewController show the buttons without enabling them?
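One option worth noting for the in-app part of the question: MFMessageComposeViewController can attach an image programmatically via addAttachmentData(_:typeIdentifier:filename:), which keeps the user in the app even if the compose UI's own camera button stays disabled. A sketch in the question's Swift 3 style (the asset name is a placeholder):
if MFMessageComposeViewController.canSendText() {
    let controller = MFMessageComposeViewController()
    controller.messageComposeDelegate = self
    controller.body = "Example Message"
    controller.recipients = ["2345678901"]
    // Attach an image without leaving the app
    if MFMessageComposeViewController.canSendAttachments(),
        let image = UIImage(named: "IssueScreenshot"), // placeholder asset name
        let data = UIImagePNGRepresentation(image) {
        controller.addAttachmentData(data, typeIdentifier: "public.png", filename: "issue.png")
    }
    self.present(controller, animated: true, completion: nil)
}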
Don't use MFMessageComposeViewController; use UIApplication.shared.openURL(url!) instead, but note this will take the user out of the app:
var phoneToCall: String = "sms: +201016588557"
var phoneToCallEncoded = phoneToCall.addingPercentEscapes(using: String.Encoding.ascii)
var url = URL(string: phoneToCallEncoded!)
UIApplication.shared.openURL(url!)
I am trying to play a video using MPMoviePlayerController in an iOS app in Swift.
My goal is to be able to play system music with something like Apple Music, then open my app and have the audio mix in, but I want my app to be able to take control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps mixes in audio while setting MPNowPlayingInfoCenter. Below is how I am trying to set the MPNowPlayingInfoCenter:
func setMeta() {
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The above function works when I am not playing outside music with the .MixWithOthers option at the same time; with .MixWithOthers set, the info center does not update.
Edit 1: Just to make things super clear: I already have video playing properly. I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
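For context, the session setup being described looks like this written out in current Swift (a sketch; in the Swift 2 era it was the setCategory(_:withOptions:) spelling from the question):
import AVFoundation

do {
    // Play alongside other audio instead of interrupting it
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default,
                                                    options: [.mixWithOthers])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}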
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add each item to the dictionary if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this upon a movie being loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, MPMoviePlayerController has been deprecated and replaced with AVPlayerViewController, so that's also worth looking into.
Edit: Also I would check to make sure that you are properly receiving remote control events, as this impacts the data being displayed by the info center.
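A quick way to check that part (a sketch in current Swift; MPRemoteCommandCenter is the modern route, and registering at least play/pause handlers is what ties your app to the Now Playing UI):
import MediaPlayer

func setupRemoteCommands() {
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in
        // Resume playback here (e.g. your MPMoviePlayerController/AVPlayer)
        return .success
    }
    _ = center.pauseCommand.addTarget { _ in
        // Pause playback here
        return .success
    }
}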
To play the video in Swift, use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2, width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        // One-shot timer; "update" is a method on this class
        NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
    }
}
// And Use the function to play Video
playVideoEffect()