Allow user to change app notification sound - iOS

I have an app that uses Firebase Messaging to send notifications, and I have a set of sounds added to my Xcode project resources to play on the device.
I send notifications to users subscribed to specific topics from my server, like this:
"apns": {
"headers": {
"apns-priority": "10",
},
"payload": {
"aps": {
"alert": {
"title": titleMessage,
"body": bodyMessage,
},
"sound": 'alert.wav',
},
},
}
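For reference, the whole message can be assembled on the server with the firebase-admin SDK. The sketch below just builds the object that admin.messaging().send() accepts; the function name buildMessage and the topic "news" are illustrative assumptions, and building the object in one place makes the sound field easy to vary per user:

```javascript
// Build an FCM message for a topic with a configurable notification sound.
// The object shape matches what firebase-admin's messaging().send() expects.
function buildMessage(topic, titleMessage, bodyMessage, soundFile) {
  return {
    topic: topic,
    apns: {
      headers: { "apns-priority": "10" },
      payload: {
        aps: {
          alert: { title: titleMessage, body: bodyMessage },
          // must name a file in the app bundle or in <container>/Library/Sounds
          sound: soundFile,
        },
      },
    },
  };
}

const msg = buildMessage("news", "Hello", "Don't miss this", "alert.wav");
// then: admin.messaging().send(msg)
```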
Now the sound "alert.wav" plays fine on the device when the notification is received.
What I want to do inside my app is allow users to change the notification sound, choosing from different sets of sounds.
Example: an option to play sounds from set 1 or set 2, where the sets live in my app as separate folders containing the same file names.
Is this possible in iOS, and how can I achieve it in React Native?

I was able to do it, and this is how:
1. Added the sound files in Xcode, in my app resources.
I added a reference to the folder that contains all my sounds rather than linking each file individually, since I want the app to play sounds later from the /Library/Sounds directory.
2. Used rn-fetch-blob to copy the sounds and replace them.
With rn-fetch-blob I was able to create the /Library/Sounds directory and then move the sounds from the main bundle to /Library/Sounds, which is where the system looks for sounds when a notification is received.
I then created an interface that lets users replace these sounds.
// Function: set up the sounds directory and copy the default sound
setupSounds() {
  const dirs = RNFetchBlob.fs.dirs;
  // iOS looks for custom notification sounds in <container>/Library/Sounds
  const soundsDir = dirs.CacheDir.replace('Caches', 'Sounds');
  RNFetchBlob.fs.isDir(soundsDir)
    // create the directory if it does not exist yet
    .then((isDir) => (isDir ? null : RNFetchBlob.fs.mkdir(soundsDir)))
    // copy the bundled sound into /Library/Sounds once the directory exists
    // ('uri' tells writeFile to copy the contents of the source file)
    .then(() => RNFetchBlob.fs.writeFile(
      soundsDir + '/default.wav',
      dirs.MainBundleDir + '/sounds/jingle.wav',
      'uri'
    ));
}
Now if the app receives a notification with sound: 'default.wav', it will play default.wav, which is a copy of jingle.wav.
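Switching between sound sets then reduces to copying a different source folder over the same destination file names, so the server payload never has to change. A minimal sketch, assuming set folders like sounds/set1 and sounds/set2 and a fixed file list (all names here are assumptions):

```javascript
// Compute the copy jobs needed to activate one sound set.
// Each job would then be applied with rn-fetch-blob, e.g.:
//   RNFetchBlob.fs.writeFile(job.to, job.from, 'uri')
const SOUND_FILES = ['default.wav', 'alert.wav'];

function soundSetCopyJobs(bundleDir, soundsDir, setName) {
  return SOUND_FILES.map((file) => ({
    from: bundleDir + '/sounds/' + setName + '/' + file,
    // destination names stay constant, so "sound": "alert.wav" keeps working
    to: soundsDir + '/' + file,
  }));
}

const jobs = soundSetCopyJobs('/bundle', '/Library/Sounds', 'set2');
```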

Related

How to play a custom sound in Electron

I am using Electron 8.0.3 and I am trying to play a custom sound. Here's what I am doing:
const notif = new Notification({
  title: 'Finished Download',
  body: 'test',
  sound: 'vapp/assets/sounds/mighty_sound.mp3',
});
notif.show();
It doesn't seem to play that sound; instead it plays a default macOS sound. I've tried:
- Using an absolute path like '/Users/<name>/Desktop/workspace/proj/vapp/assets/sounds/sound.mp3'
- Packaging the application so that the sound is bundled
- Playing different file types: .wav, .mp3, .aiff
- Choosing other macOS sounds that might exist in /System/Library/Sounds
For whatever reason, it plays the same sound.
I have referenced this documentation
My approach was to set the silent attribute of Electron's Notification module to true, so that the OS sound doesn't play, and then use the sound-play npm package to play my own sound.
const sound = require("sound-play");

const showNotification = () => {
  new Notification({
    title: "Elon Musk is Tesla CEO",
    body: "The automaker just got...",
    silent: true, // disable the operating system's own sound
  }).show();
  // Play the custom sound instead
  sound.play("./src/quite-impressed.mp3");
};

How to access native audio sounds in ios / Cordova?

Are iOS native sounds (Ringtone, Text Tone, New Mail, etc.) available to play directly via Cordova (with the native audio plugin, for instance)?
All the examples I can find require a direct URL to the sound file in your www/audio directory, like this:
// preload the media
window.plugins.NativeAudio.preloadComplex('music', 'audio/music.mp3', 1, 1, 0, function (msg) {
}, function (msg) {
  console.log('error: ' + msg);
});
window.plugins.NativeAudio.loop('music');
Can't they be accessed and played directly? Let's say the iPhone's Text Tone preference is set to "Aurora": I'd want the app to be able to trigger the Text Tone, which would play the sound "Aurora".
You can access ringtones now via this plugin: cordova-plugin-native-ringtones.

Export audiofiles via “open in:” from Voice Memos App

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!
After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
1. Create a new target within your project for the app extension by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:", choose "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
2. Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (MainInterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
3. Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
4. Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
5. Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then launch "Voice Memos". If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap the "More" button in that row and set the switch for your action extension to "On". Tapping your action extension will just bring up an empty view with a "Done" button; the template code looks for an image file and, finding none, does nothing. We'll fix this in the next step.
6. Edit ActionViewController.swift to make the following changes:
6a. Add import statements for AVFoundation and AVKit near the top of the file:
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()
    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")
    var audioFound: Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
                //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
                // the audio format(s) we expect to receive and that we can handle
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                    options: nil, completionHandler: { (audioURL, error) in
                        NSOperationQueue.mainQueue().addOperationWithBlock {
                            if let audioURL = audioURL as? NSURL {
                                // in our sample code we just present and play the audio in our app extension
                                let theAVPlayer: AVPlayer = AVPlayer(URL: audioURL)
                                let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                                theAVPlayerViewController.player = theAVPlayer
                                self.presentViewController(theAVPlayerViewController, animated: true) {
                                    theAVPlayerViewController.player!.play()
                                }
                            }
                        }
                })
                audioFound = true
                break
            }
        }
        if audioFound {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
7. Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
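For the audio-only case, a predicate along these lines is one option. This is a sketch only; the UTI ("public.audio") and the attachment count are assumptions you may need to adjust for your app:

```xml
<key>NSExtensionAttributes</key>
<dict>
    <key>NSExtensionActivationRule</key>
    <string>SUBQUERY(extensionItems, $extensionItem,
        SUBQUERY($extensionItem.attachments, $attachment,
            ANY $attachment.registeredTypeIdentifiers UTI-CONFORMS-TO "public.audio"
        ).@count == 1
    ).@count == 1</string>
</dict>
```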
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
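Behind the scenes, turning the capability on adds an .entitlements file to each target. After both steps, each target's entitlements should contain the same group, roughly like this (using the example identifier from above):

```xml
<key>com.apple.security.application-groups</key>
<array>
    <string>group.com.mycompany.myapp.sharedcontainer</string>
</array>
```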
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
References:
Creating an App Extension
Sharing Data with Your Containing App

Apple Watch OS 2 how to show/ hide action button from custom notification

I am building an Apple Watch OS 2 app extension and I need to implement notifications. Reading the Apple docs, I saw that what I need is a custom notification, so I have built the UI for it. The thing is that I have to handle multiple notification cases, and not all of them have one action button. My question is: how do I change the name of an action button, and is it possible to hide it?
I know that action buttons can be created using the PushNotificationPayload.apns file, and that in the apns file you set up the action button and function, but is it possible to also do this in code (hide/show an action button and adjust its name)?
I was thinking that I could do some implementation in didReceiveRemoteNotification:withCompletion:?
And here is some background info:
Update 1
After reading this Apple doc again, I made the following changes:
- Implemented another Notification Interface Controller and checked "Has Dynamic Interface"
- Added a "Notification Category Name"
- Created another subclass of WKUserNotificationInterfaceController and associated it with the new custom notification in the storyboard
- Managed to test that the correct category is loaded when the notification arrives, using the "category" key and the correct value for each custom category, in my case "Cat1" and "Cat2"
And now the only question that I have is:
How do I manage the custom notification buttons locally? (How many buttons are displayed and what their titles are, so that I am not depending on the server as with the .apns test notification.)
If this is not possible, then my question becomes:
How should the notification from the server look (what format) regarding the actions that are displayed? Should I still send the actions as:
"WatchKit Simulator Actions": [
  {
    "title": "First Button",
    "identifier": "firstButtonAction"
  }
]
or should I pass them in the aps key as:
"alert": {
  "body": "Test message",
  "title": "Optional title",
  "action": [
    {
      "title": "First Button",
      "identifier": "firstButtonAction"
    }
  ]
}
Update 2
After reading the documentation for Adding Action Buttons to Notifications, I think this is what I need to do; I also read this tutorial.
But the main difference is that doing so will also affect my iOS app, and I want the action buttons only for the notifications I receive on my watch. Is it possible to add actions in the iOS app delegate and only affect the look of the notifications in my watch app?

How to customize messages with static variables for push notifications (parse.com)

I have an application, and using parse.com I established a back-end.
My problem is this: is there a way to send customized push notifications based on the information I've gathered?
Something like this:
[NSString stringWithFormat:@"Hello %@, Don't forget to brush your teeth", userName]
or
a system that lets me use the gathered data to send push notifications. For example:
CTTelephonyNetworkInfo *netInfo = [[CTTelephonyNetworkInfo alloc] init];
CTCarrier *carrier = [netInfo subscriberCellularProvider];
testUser[@"Carrier"] = [carrier carrierName];
I get the user's carrier like this, and let's say I want to send a push notification to those who use carrier A.
Is it possible? I tried to figure it out with the Apple documentation, but it didn't help me.
If not with parse.com, is there any system that lets me do it?
Yes, you can use Parse Cloud Code. Keep in mind that you can create channels; each channel in your case could be a carrier, so if the first user tries to subscribe to a channel you have to ensure that the channel exists, and if not, just create it. Then you can target notifications at the channels you want from the Parse push console or from Parse Cloud Code. It's all in the docs.
So, if you're currently using Parse, to subscribe a user to a channel you can use:
PFInstallation *currentInstallation = [PFInstallation currentInstallation];
[currentInstallation addUniqueObject:@"TMobile" forKey:@"channels"];
[currentInstallation saveInBackground];
Then from the Parse Push console you will see that the "TMobile" channel has been added and you can send push just to the users subscribed to that channel.
Otherwise, if you want to send the notifications dynamically from your web server, you can call a Cloud Function similar to this:
Parse.Push.send({
  channels: ["TMobile"],
  data: {
    alert: "Alert message"
  }
}, {
  success: function () {
    // success!
  },
  error: function (err) {
    console.log(err);
  }
});
It looks like you are asking two different questions.
For the first question, Apple Push Notifications support this.
...the client application can store in its bundle the alert-message
strings translated for each localization it supports. The provider
specifies the loc-key and loc-args properties in the aps dictionary of
the notification payload. When the device receives the notification
(assuming the application isn’t running), it uses these aps-dictionary
properties to find and format the string localized for the current
language, which it then displays to the user.
Here’s how that second option works in a little more detail.
An application can internationalize resources such as images, sounds,
and text for each language that it supports. Internationalization
collects the resources and puts them in a subdirectory of the bundle
with a two-part name: a language code and an extension of .lproj (for
example, fr.lproj). Localized strings that are programmatically
displayed are put in a file called Localizable.strings. Each entry in
this file has a key and a localized string value; the string can have
format specifiers for the substitution of variable values. When an
application asks for a particular resource—say a localized string—it
gets the resource that is localized for the language currently
selected by the user. For example, if the preferred language is
French, the corresponding string value for an alert message would be
fetched from Localizable.strings in the fr.lproj directory in the
application bundle. (The application makes this request through the
NSLocalizedString macro.)
To make this clearer, let’s consider an example. The provider
specifies the following dictionary as the value of the alert property:
{"aps" :
{"alert" : {
"loc-key" : "GAME_PLAY_REQUEST_FORMAT",
"loc-args" : [ "Jenna", "Frank"]
}
}
When the device receives the notification, it uses
"GAME_PLAY_REQUEST_FORMAT" as a key to look up the associated string
value in the Localizable.strings file in the .lproj directory for the
current language. Assuming the current localization has an
Localizable.strings entry such as this:
"GAME_PLAY_REQUEST_FORMAT" = "%@ and %@ have invited you to play Monopoly";
the device displays an alert with the message “Jenna and Frank have
invited you to play Monopoly”.
In addition to the format specifier %@, you can use %n$@ format specifiers for positional substitution of string variables. The n is the index (starting with 1) of the array value in loc-args to substitute. (There's also the %% specifier for expressing a percentage sign (%).)
So if the entry in Localizable.strings is this:
"GAME_PLAY_REQUEST_FORMAT" = "%2$@ and %1$@ have invited you to play Monopoly";
the device displays an alert with the message “Frank and Jenna have
invited you to play Monopoly”.
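The substitution rules above are applied by iOS itself, but they are easy to state precisely. The following is an illustrative re-implementation (not Apple code) of how %@, %n$@, and %% are expanded against loc-args:

```javascript
// Illustrative sketch of the APNs loc-key/loc-args substitution rules.
function localizedAlert(format, args) {
  let next = 0; // index of the next value consumed by a plain %@
  return format.replace(/%%|%(\d+)\$@|%@/g, (match, pos) => {
    if (match === "%%") return "%";                      // %% is a literal percent sign
    if (pos !== undefined) return args[Number(pos) - 1]; // %n$@ is 1-based
    return args[next++];                                 // plain %@ takes args in order
  });
}

// "%2$@ and %1$@ have invited you to play Monopoly" with ["Jenna", "Frank"]
// swaps the names, exactly as described above.
```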
