How to access native audio sounds in iOS / Cordova?

Are the iOS native sounds (Ringtone, Text Tone, New Mail, etc.) available to play directly via Cordova (with the native audio plugin, for instance)?
All the examples I can find require a direct URL to the sound file in your www/audio directory, like this:
//preload the media
window.plugins.NativeAudio.preloadComplex('music', 'audio/music.mp3', 1, 1, 0, function (msg) {
}, function (msg) {
    console.log('error: ' + msg);
});
window.plugins.NativeAudio.loop('music');
Can't they be accessed and played directly? Let's say the iPhone has its Text Tone preference set to "Aurora". I'd want the app to be able to trigger the Text Tone, which would play the sound "Aurora".

You can access ringtones now via this plugin: cordova-plugin-native-ringtones.
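A rough sketch of how that plugin is typically used is below; the method names (getRingtone, playRingtone) and the 'notification' type string follow the plugin's README, so verify them against the version you install:
document.addEventListener('deviceready', function () {
    // List the device's notification sounds; the plugin returns entries
    // whose URIs can be passed back to playRingtone().
    cordova.plugins.NativeRingtones.getRingtone(function (ringtones) {
        console.log(JSON.stringify(ringtones));
        // Playing one of them would then look like:
        // cordova.plugins.NativeRingtones.playRingtone(uri, successCb, errorCb);
    }, function (err) {
        console.log('error: ' + err);
    }, 'notification');
});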

Related

Find and share a downloaded video on Flutter iOS without going through the picker?

I have a Flutter app that can view mp4 files from a URL (using a video controller playing directly from the URL). I want the user to be able to share them if they wish. As best I can tell, the file has to actually exist on the device, so for now I have broken the process down into two steps: download the file, then invoke share.
I'm using this guide: https://retroportalstudio.medium.com/saving-files-to-application-folder-and-gallery-in-flutter-e9be2ebee92a
I need this to work on iOS and Android. The problem is that on iOS, neither the filename I get from the dio downloader nor the one from ImageGallerySaver seems to "work" when passed to the system ShareSheet.
I'm using the Flutter packages dio, share_plus, cross_file, and image_gallery_saver, as I've seen recommended in various places.
File saveFile = File(directory.path + "/$fileName");
developer.log("starting download...");
await dio.download(url, saveFile.path,
    onReceiveProgress: (value1, value2) {
  developer.log("got progress " + value1.toString());
  setState(() {
    downloadProgress = value1 / value2;
  });
});
_permaFile = saveFile.path;
if (Platform.isIOS) {
  var galleryResult = await ImageGallerySaver.saveFile(saveFile.path,
      isReturnPathOfIOS: true);
  developer.log("gallery save result = " + galleryResult.toString());
  _permaFile = galleryResult['filePath'];
}
After getting a directory, we use dio to download the file, do some log chirping, and then save the path to an object member called _permaFile.
Then the share button triggers:
void _shareAction() async {
  final box = context.findRenderObject() as RenderBox?;
  final files = <XFile>[];
  if (_permaFile == null) {
    return;
  }
  developer.log("sharing file: " + _permaFile.toString());
  files.add(XFile(_permaFile!));
  await Share.shareXFiles(files,
      text: "Event",
      // subject: "Subject for Event",
      sharePositionOrigin: box!.localToGlobal(Offset.zero) & box.size);
}
This works on an Android device: after the download I hit share, and I can share the video to a third-party app like WhatsApp.
On iOS the ShareSheet is invoked, but when I share I only get the text "Event", not the video file that should go along with it.
Note that I have tried both results: setting _permaFile to what comes back from ImageGallerySaver, and also just using what the dio downloader gives back.
Note also that ImageGallerySaver seems to work: the video really does land in the iOS video library. If I go into the Photos app I can share from there to WhatsApp and have the video get sent.
In each case I get errors like this:
[ShareSheet] error fetching item for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error fetching file provider domain for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error loading metadata for documentURL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// error:Error Domain=NSFileProviderInternalErrorDomain Code=0 "No valid file provider found from URL file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///." UserInfo={NSLocalizedDescription=No valid file provider found from URL file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///.}
In order to test this further I built the share_plus demo app:
https://github.com/fluttercommunity/plus_plugins/tree/main/packages/share_plus/share_plus
I modified it to share videos to see what was different. The share plus example (sp_example) works for sharing videos that have been selected by the picker.
For this reason I think the problem is something I'm missing about iOS video filenames/formats, and possibly a built-in conversion step that happens.
Here are what the filenames look like that I see in my app:
dio download result:
file:///var/mobile/Containers/Data/Application/223BF2B9-DDF0-490E-932F-09D5F03B98B3/Library/Caches/test.mp4
ImageGallerySaver result:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0019.MP4
This is what video filenames look like when they are picked and shared in sp_example:
/private/var/mobile/Containers/Data/Application/E5CB4D7C-6CDF-4AA2-8134-C4322ED7C886/tmp/trim.E6633D68-44E3-4853-A29E-A71AC95A0913.MOV
Note that it has been converted to a .MOV extension, and the user gets a trim step right in the picker, which is what puts "trim" in the name.
For my purposes I don't want to go through the picker; the user is already on the screen showing the video and shouldn't have to re-pick it. So where do I get the post-conversion iOS filename that references what I just saved?

How to play a custom sound in Electron

I am using Electron 8.0.3 and I am trying to play a custom sound. Here's what I am doing:
const { Notification } = require('electron'); // main-process Notification API
const notif = new Notification({
  title: 'Finished Download',
  body: 'test',
  sound: 'vapp/assets/sounds/mighty_sound.mp3',
});
notif.show();
It doesn't seem to play that sound but instead a default macOS sound. I've tried:
Using an absolute path like '/Users/<name>/Desktop/workspace/proj/vapp/assets/sounds/sound.mp3'
Packaging the application so that the sound is bundled
Playing different file types: .wav, .mp3, .aiff
Choosing other macOS sounds that might exist in /System/Library/Sounds
For whatever reason, it plays the same sound.
I have referenced this documentation
My approach was to set the silent attribute of Electron's Notification module to true so that the OS sound doesn't play, and then use the sound-play npm package to play my own sound.
const { Notification } = require("electron");
const sound = require("sound-play");

const showNotification = () => {
  new Notification({
    title: "Elon Musk is Tesla CEO",
    body: "The automaker just got...",
    silent: true, // disable the operating system's notification sound
  }).show();

  // Play the custom sound instead
  sound.play("./src/quite-impressed.mp3");
};
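One thing to watch with the relative path above: sound-play hands the path to the operating system, so './src/quite-impressed.mp3' is resolved against the process's working directory, which changes once the app is packaged. A small sketch of a safer approach (assuming the mp3 ships next to the main-process code):
const path = require("path");
const sound = require("sound-play");

// Build an absolute path relative to the bundled application code
// rather than relying on the current working directory.
sound.play(path.join(__dirname, "quite-impressed.mp3"));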

Why won't this encrypted HLS video play on iOS (but works on Windows Chrome via hls.js library)?

The original video is "Sample Video 5" from https://www.appsloveworld.com/download-sample-mp4-video-mp4-test-videos/.
My /home/vagrant/Code/example/public/hls_hls.keyInfo is:
https://example.com/hls.key
/home/vagrant/Code/example/public/hls_hls.key
467216aae8a26fb699080812628031955e304a66e9e4480f9b70d31d8fe94e9a
My /home/vagrant/Code/example/public/hls_hls.key was generated using PHP: hex2bin('467216aae8a26fb699080812628031955e304a66e9e4480f9b70d31d8fe94e9a')
The ffmpeg command for encrypting the video as HLS playlist with "ts" files:
'/usr/bin/ffmpeg' '-y' '-i' 'storage/app/sample_media2/2020-02-27/Sample_Videos_5.mp4'
'-c:v' 'libx264' '-s:v' '1920x1080' '-crf' '20' '-sc_threshold' '0' '-g' '48'
'-keyint_min' '48' '-hls_list_size' '0'
'-hls_time' '10' '-hls_allow_cache' '0' '-b:v' '4889k' '-maxrate' '5866k'
'-hls_segment_type' 'mpegts' '-hls_fmp4_init_filename' 'output_init.mp4'
'-hls_segment_filename' 'storage/app/public/test/output_1080p_%04d.ts'
'-hls_key_info_file' '/home/vagrant/Code/example/public/hls_hls.keyInfo'
'-strict' '-2' '-threads' '12' 'storage/app/public/test/output_1080p.m3u8'
Then, I know from https://caniuse.com/#search=hls that Windows Chrome won't be able to play the HLS video without a library, so I use https://github.com/video-dev/hls.js/, and Windows Chrome successfully plays the encrypted video!
However, iOS Safari is unable to play it (with or without the hls.js library).
On iOS Safari, when I try to play the video, I see just a quick glimpse (less than a second) where the screen shows 0:15, so it must be reading and decrypting enough to know the correct duration of the video.
So, to debug, I log events:
const nativeHlsEvents = ['play', 'playing', 'abort', 'error', 'canplaythrough', 'waiting', 'loadeddata', 'loadstart', 'progress', 'timeupdate', 'volumechange'];
$.each(nativeHlsEvents, function (i, eventType) {
    // https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement
    video.addEventListener(eventType, (event) => {
        console.log(eventType, event);
        if (eventType === 'error') {
            // https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/error
            console.error(video.error);
        }
    });
});
I see in the console log:
loadstart, {"isTrusted":true}
progress, {"isTrusted":true}
play, {"isTrusted":true}
waiting, {"isTrusted":true}
error, {"isTrusted":true}
video.error, {}
I don't know how to find more details about the error.
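For reference, the MediaError object does not JSON-serialize (hence the empty {} above), but its code field narrows things down; a small sketch of logging it:
video.addEventListener('error', function () {
    var err = video.error; // MediaError
    // 1 = MEDIA_ERR_ABORTED, 2 = MEDIA_ERR_NETWORK,
    // 3 = MEDIA_ERR_DECODE, 4 = MEDIA_ERR_SRC_NOT_SUPPORTED
    console.log('MediaError code=' + err.code + ' message=' + (err.message || '(none)'));
});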
Note that even though Windows Chrome successfully plays the video, it too shows warnings in the console log:
{"type":"mediaError","details":"fragParsingError","fatal":false,"reason":"TS packet did not start with 0x47","frag":{"...
{"type":"mediaError","details":"fragParsingError","fatal":false,"reason":"no audio/video samples found","frag":{...
Where is my problem?
I would need to buy a newer iPhone.
I see at https://en.wikipedia.org/wiki/IPhone_6#Software and https://support.apple.com/guide/iphone/supported-iphone-models-iphe3fa5df43/ios that the "6s" is the oldest hardware that iOS 13 supports, and https://caniuse.com/#search=hls says HLS needs >= 13.2.
iPhone devices do not support Media Source Extensions, which is what hls.js relies on.
You should check the user agent to detect whether the device is an iPhone.
If it is, use the iPhone's native video player (the <video> element's built-in HLS support) to play the HLS stream, or other video types.
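A minimal sketch of that fallback, assuming a <video id="player"> element and hls.js loaded as Hls (the playlist URL here is just a placeholder):
var video = document.getElementById('player');
var src = 'https://example.com/output_1080p.m3u8'; // placeholder playlist URL

if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari on iPhone/iPad plays HLS natively; just set the source.
    video.src = src;
} else if (Hls.isSupported()) {
    // Browsers with Media Source Extensions (e.g. Chrome on Windows) use hls.js.
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
} else {
    console.error('Neither native HLS nor MSE playback is available.');
}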

Allow user to change app notification sound

I have an app that uses Firebase Messaging to send notifications, and I have a set of sounds added to my xcode project resources, to play on the device.
I send the notification to users subscribed to specific topics, from my server like this:
"apns": {
"headers": {
"apns-priority": "10",
},
"payload": {
"aps": {
"alert": {
"title": titleMessage,
"body": bodyMessage,
},
"sound": 'alert.wav',
},
},
}
Now the sound "alert.wav" plays fine on the device, when the notification is received.
What I want to do inside my app:
I want to allow users to change the notification sound, choosing from different sets of sounds.
Example: an option to play the sound from set 1 or set 2, where each set would be a separate folder in my app containing files with the same names.
Is this possible in iOS, and how can I achieve it in React Native?
I was able to do it, and this is how:
1. Added the sound files in Xcode, in my app resources.
I just added a reference to the folder that contains all my sounds, without linking each file individually, since I want the app to play sounds later from the /Library/Sounds directory.
2. Used rn-fetch-blob to copy sounds and replace them.
Using rn-fetch-blob I was able to create the /Library/Sounds directory and then move the sounds from MainBundleDir to /Library/Sounds, which is where the app looks for sounds when a notification is received.
I created an interface to allow users to replace these sounds.
//Function: set up sounds
setupSounds() {
    //sounds directory
    const dirs = RNFetchBlob.fs.dirs
    const soundsDir = dirs.CacheDir.replace('Caches', 'Sounds')
    //check whether the sounds directory exists
    RNFetchBlob.fs.isDir(soundsDir).then((isDir) => {
        //create it if it does not
        if (!isDir) RNFetchBlob.fs.mkdir(soundsDir)
    })
    //add the sound file to /Library/Sounds
    RNFetchBlob.fs.writeFile(soundsDir + '/default.wav', dirs.MainBundleDir + '/sounds/jingle.wav', 'uri');
}
Now if the app receives a notification with sound: 'default.wav', it will play default.wav, which is a copy of jingle.wav.
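To let the user switch between sound sets, one approach (only a sketch; the set folder names and the setNotificationSound helper are mine, not part of the original answer) is to copy the chosen set's file over /Library/Sounds/default.wav the same way:
//Hypothetical helper: overwrite the file that notifications reference
//as 'default.wav' with the alert sound from the chosen set
setNotificationSound(setName) { //e.g. 'set1' or 'set2'
    const dirs = RNFetchBlob.fs.dirs
    const soundsDir = dirs.CacheDir.replace('Caches', 'Sounds')
    const source = dirs.MainBundleDir + '/sounds/' + setName + '/alert.wav'
    //'uri' tells rn-fetch-blob to copy the contents of the source file
    return RNFetchBlob.fs.writeFile(soundsDir + '/default.wav', source, 'uri')
}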

AudioContext.createMediaStreamSource alternative for iOS?

I've developed an app using Cordova and the Web Audio API, that allows the user to plug in headphones, press the phone against their heart, and hear their own heartbeat.
It does this by using audio filter nodes.
//Set up userMedia
context = new (window.AudioContext || window.webkitAudioContext);
navigator.getUserMedia = (navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia);
navigator.getUserMedia(
    {audio: true},
    userMediaSuccess,
    function (e) {
        alert("error2 " + e.message);
    });

function userMediaSuccess(stream) {
    //set microphone as input
    input = context.createMediaStreamSource(stream);
    //amplify the incoming sounds
    volume = context.createGain();
    volume.gain.value = 10;
    //lowpass filter with a 25Hz cutoff
    lowPass = context.createBiquadFilter();
    lowPass.type = 'lowpass';
    lowPass.frequency.value = 25;
    //highpass filter with a 425Hz cutoff
    highPass = context.createBiquadFilter();
    highPass.type = 'highpass';
    highPass.frequency.value = 425;
    //apply the filters and amplification to the microphone input
    input.connect(lowPass);
    input.connect(highPass);
    input.connect(volume);
    //send the result of these filters to the phone's speakers
    highPass.connect(context.destination);
    lowPass.connect(context.destination);
    volume.connect(context.destination);
}
It runs fine when I deploy to Android, but it seems most of these features aren't available on iOS mobile browsers.
I managed to make getUserMedia function using the iosRTC plugin, but createMediaStreamSource is still "not a function."
So, I'm looking for an alternative to the Web Audio API that can filter out frequencies, or if there are any plugins I could use, that would be perfect.
There's no way to do this on the iOS web. You'd need a native app, since Apple doesn't support audio input in Safari.
Did you try to use:
document.addEventListener('deviceready', function () {
    // Just for iOS devices.
    if (window.device.platform === 'iOS') {
        cordova.plugins.iosrtc.registerGlobals();
    }
});
You asked this question quite a while ago, but sadly createMediaStreamSource is still not supported in Safari Mobile (will it ever be?).
As previously said, a plugin is the only way to achieve this, and there is actually a Cordova/PhoneGap plugin that does exactly that: cordova-plugin-audioinput gives you access to the sound from the microphone, either through the Web Audio API or via callbacks that deliver raw audio data chunks, and it supports iOS as well as Android.
Since I don't want to post the same answer twice, I'll instead point you to the following answer here on stackoverflow, where you'll also find a code example: https://stackoverflow.com/a/38464815/6609803
I'm the creator of the plugin and any feedback is appreciated.
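As a rough sketch of how the plugin can replace createMediaStreamSource in the code above (the streamToWebAudio option and the connect() call are taken from the plugin's README; treat the exact configuration keys as something to verify against the current version):
document.addEventListener('deviceready', function () {
    var context = new (window.AudioContext || window.webkitAudioContext)();

    // Amplify the microphone input, as in the original code.
    var volume = context.createGain();
    volume.gain.value = 10;
    volume.connect(context.destination);

    // Let the plugin capture the microphone and expose it as a Web Audio node.
    audioinput.start({
        streamToWebAudio: true,
        audioContext: context
    });

    // Route the plugin's output into the gain node instead of
    // using context.createMediaStreamSource(stream).
    audioinput.connect(volume);
});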
Good news: there is now full support for this in iOS Safari:
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
