I'm trying to create a list of all the alert sounds available on an iOS device.
On Xamarin's website, there's only one example:
https://developer.xamarin.com/recipes/ios/media/sound/syssound-example/
I would like to know which additional sounds I can access on my Apple device.
Edit:
How do I get the list of built-in alert sounds?
Found the answer here:
https://www.theiphonewiki.com/wiki//System/Library/Audio/UISounds
e.g.:
// Play a built-in alert sound from the system sounds directory,
// and vibrate when playback completes.
string notificationSoundPath = "/System/Library/Audio/UISounds/sms-received6.caf";
SystemSound notificationSound = SystemSound.FromFile(notificationSoundPath);
notificationSound.AddSystemSoundCompletion(SystemSound.Vibrate.PlaySystemSound);
notificationSound.PlaySystemSound();
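If you'd rather build the list at runtime than copy paths from the wiki, a sketch along the following lines can enumerate the .caf files (Swift shown). Treat this as an assumption: the directory is undocumented, there is no public API for listing system sounds, and sandbox restrictions may block enumeration on some iOS versions.
import Foundation

// Hedged sketch: enumerate the undocumented system sounds directory.
// The path is not public API and enumeration may be blocked by the sandbox.
let soundsPath = "/System/Library/Audio/UISounds"
if let enumerator = FileManager.default.enumerator(atPath: soundsPath) {
    for case let file as String in enumerator where file.hasSuffix(".caf") {
        print(file) // e.g. "sms-received6.caf" or "New/Anticipate.caf"
    }
}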
What I'd like to do is write some code like:
var googleHome = new GoogleHomeSdk();
var devices = ['kitchen', 'lounge', 'bedroom'];
googleHome.broadcast({
    devices : devices,
    message : 'The front door has been open for 30 seconds'
});
Which would then do something similar to the native broadcast feature, i.e. where you type 'broadcast ' into the Google Assistant app on phones/tablets.
I can't seem to find any Google documentation for an API like this, but it seems like such a basic/obvious requirement that surely something like this must exist?
Note: I've seen a workaround where people cast an audio file (obtained via a text-to-speech service), but this isn't really what I'm after.
The platform does not provide a mechanism to programmatically broadcast messages to different devices.
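For completeness, the casting workaround mentioned in the question looks roughly like the sketch below. This is not a broadcast API: it assumes you have already rendered the message to an audio file with some text-to-speech service (audioURL here is hypothetical), and the builder names are recalled from the Google Cast iOS SDK v4 and may differ between SDK versions.
import GoogleCast

// Hedged sketch: "broadcast" by casting a pre-rendered TTS clip to a device
// that already has an active Cast session. audioURL is a hypothetical URL
// of audio you generated yourself with a text-to-speech service.
func broadcast(audioURL: URL) {
    let builder = GCKMediaInformationBuilder(contentURL: audioURL)
    builder.streamType = .buffered
    builder.contentType = "audio/mpeg"

    GCKCastContext.sharedInstance()
        .sessionManager.currentCastSession?
        .remoteMediaClient?.loadMedia(builder.build())
}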
I would like to know how to programmatically search for something on iOS devices (Spotlight?), using Universal Links or direct methods (native code) to open the search app the way I can on Android.
In my Android app I have
// Launch the Google search app (Quick Search Box) with a query.
Intent intent = new Intent(Intent.ACTION_WEB_SEARCH);
intent.setPackage("com.google.android.googlequicksearchbox");
intent.setClassName("com.google.android.googlequicksearchbox",
        "com.google.android.googlequicksearchbox.SearchActivity");
intent.putExtra("query", query);
activity.startActivity(intent);
This code performs a search that sometimes returns assistant-like results, with the option to open relevant apps, the web browser, and so on.
I think iOS has this feature too. How can I do that?
You can make your app searchable with Siri, but you cannot, in code, trigger a Siri search.
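For the "making your app searchable" half, a minimal Core Spotlight sketch looks like this; the identifiers and attribute values are made up for illustration:
import CoreSpotlight
import MobileCoreServices

// Index one item so it appears in Spotlight search results.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Front Door"                             // hypothetical content
attributes.contentDescription = "Status of the front door sensor"

let item = CSSearchableItem(uniqueIdentifier: "front-door", // hypothetical ID
                            domainIdentifier: "sensors",    // hypothetical group
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}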
I am writing an app that includes text-to-speech using AVSpeechSynthesizer. The code for generating the utterance and using the speech synthesizer has been working fine.
let utterance = AVSpeechUtterance(string: text)
utterance.voice = currentVoice // an AVSpeechSynthesisVoice chosen elsewhere in the app
speechSynthesizer.speak(utterance)
Now with iOS 11, I want to match the voice to the one selected by the user in the phone's Settings app, but I do not see any way to get that setting.
I have tried getting the list of installed voices and looking for one that has a quality of .enhanced, but sometimes there is no enhanced voice installed, and even when there is, it may or may not be the voice selected by the user in the Settings app.
static var enhanced: AVSpeechSynthesisVoice? {
    // Return the first enhanced-quality voice, if any is installed.
    return AVSpeechSynthesisVoice.speechVoices().first { $0.quality == .enhanced }
}
The questions are twofold:
How can I determine which voice has been selected by the user in the Setting app?
Why on some iOS 11 phones that are using the new Siri voice am I not finding an "enhanced" voice installed?
I suppose that if there were a method for selecting the same voice as in the Settings app, it would be shown in the documentation for the AVSpeechSynthesisVoice class under the Finding Voices topic. Jumping to the definition of AVSpeechSynthesisVoice in code, I couldn't find any other methods for retrieving voices.
Here's my workaround for getting an enhanced voice in the app I am working on:
Enhanced versions of voices are probably not present on new iOS devices by default, in order to save storage. Iterating through the available voices on my brand-new iPhone, I only found default-quality voices such as:
[AVSpeechSynthesisVoice 0x1c4e11cf0] Language: en-US, Name: Samantha, Quality: Default [com.apple.ttsbundle.Samantha-compact]
I found this article on how to enable additional VoiceOver voices and downloaded the one named "Samantha (Enhanced)" among them. Checking the list of available voices again, I noticed the following addition:
[AVSpeechSynthesisVoice 0x1c4c03060] Language: en-US, Name: Samantha (Enhanced), Quality: Enhanced [com.apple.ttsbundle.Samantha-premium]
I was then able to select an enhanced voice in Xcode. Given that AVSpeechSynthesisVoice.currentLanguageCode() exposes the currently selected language, I ran the following code to select the first enhanced voice I could find, falling back to the available default if no enhanced version was installed. (The code below is from a custom class I'm creating to handle all speech in my app; it updates the class's voice variable.)
var voice: AVSpeechSynthesisVoice!

// Look for the enhanced version of the voice for the currently selected
// language amongst the available voices. Usually there's only one.
for availableVoice in AVSpeechSynthesisVoice.speechVoices() {
    if availableVoice.language == AVSpeechSynthesisVoice.currentLanguageCode() &&
        availableVoice.quality == .enhanced {
        self.voice = availableVoice
        print("\(availableVoice.name) selected as voice for uttering speeches. Quality: \(availableVoice.quality.rawValue)")
    }
}

if let selectedVoice = self.voice {
    // Successfully unwrapped: the loop above found an enhanced voice.
    print("The following voice identifier has been loaded:", selectedVoice.identifier)
} else {
    // No enhanced voice found; fall back to any voice that matches the
    // device's current language selection.
    self.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
}
I am also hoping Apple will expose a method to directly load the selected voice, but this workaround may serve you in the meantime. I guess Siri's enhanced voice is downloaded on the fly, so maybe that's why it takes so long to answer my voice commands :)
Best regards.
It looks like the new Siri voice in iOS 11 isn't part of the AVSpeechSynthesis API, and isn't available to developers.
In macOS 10.13 High Sierra (which also gets the new voice), there seems to be a new SiriTTS framework that's probably related to this functionality, but it's in PrivateFrameworks so it doesn't have a developer API.
I'll try to provide a more detailed answer. AVSpeechSynthesizer cannot use the Siri voice. Apple has locked this voice down, presumably to protect privacy: a malicious app could impersonate Siri and extract private information that way.
Apple hasn't changed this for years, but there is an ongoing initiative regarding it. iOS already has a mechanism for accessing privacy-sensitive features through permissions, and there is no reason the Siri voice couldn't likewise be gated behind user permission. You can vote for this via the petition below, and with some luck Apple may implement it in the future: https://www.change.org/p/apple-apple-please-allow-3rd-party-apps-to-use-siri-voices-for-improved-accessibility
How can I detect which app my custom keyboard is being used in, so I can show a different button?
E.g. in Twitter I would add # to the string I post into the input field, and in Reddit /r/.
It is possible with the following code, which gets the bundle identifier of the app in which your custom keyboard is being used.
Swift
// Relies on the private "_hostBundleID" key; this is not public API
// and can break in any iOS release.
if let hostBundleID = self.parent?.value(forKey: "_hostBundleID") as? String {
    print(hostBundleID)
}
From the bundle identifier you can easily find the app name.
Edit: See above. Things have changed.
This is not possible. An extension runs sandboxed, is only fed information through the extension API, and cannot access anything else. A keyboard extension only receives text-context changes and activate/deactivate calls. Detecting the host app lies outside the extension sandbox and is therefore impossible.
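To make that boundary concrete, here is a minimal sketch of what a keyboard extension does get through public API: the text around the insertion point via textDocumentProxy, with nothing identifying the host app.
import UIKit

class KeyboardViewController: UIInputViewController {
    override func textDidChange(_ textInput: UITextInput?) {
        super.textDidChange(textInput)
        // The only context the extension legitimately receives:
        // text around the insertion point, not the host app's identity.
        let before = textDocumentProxy.documentContextBeforeInput ?? ""
        let after = textDocumentProxy.documentContextAfterInput ?? ""
        print("Context: \(before)|\(after)")
    }
}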
I'm trying to show apps from a UIDocumentInteractionController: default Apple apps like SMS, email, or UIPasteboard (Facebook and Twitter are optional), just like the Dropbox app does.
How can I handle this kind of situation? I can already open apps by their UTIs, and open just email or just SMS, but I don't know how to show all of them in one sheet.
Thanks
I believe you're looking for UIActivityViewController instead of UIDocumentInteractionController, which I believe Apple is trying to phase out.
Check out the docs here: https://developer.apple.com/library/ios/documentation/uikit/reference/UIActivityViewController_Class/Reference/Reference.html
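A minimal usage sketch, assuming fileURL is the document to share and this runs inside the presenting view controller:
import UIKit

// Present the standard share sheet. Which activities appear (Messages, Mail,
// copy-to-pasteboard, Facebook, Twitter, ...) depends on the item type and
// on what is installed/enabled on the device.
let activityVC = UIActivityViewController(activityItems: [fileURL],
                                          applicationActivities: nil)
present(activityVC, animated: true, completion: nil)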
You should try QLPreviewController. This will give you the desired result.
https://developer.apple.com/library/ios/Documentation/NetworkingInternet/Reference/QLPreviewController_Class/Reference/Reference.html
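A minimal sketch, assuming fileURL points to a local document; QLPreviewController previews the file and adds its own share button that surfaces a similar app list:
import QuickLook

// Data source vending a single local file for preview.
class SingleFilePreview: NSObject, QLPreviewControllerDataSource {
    let fileURL: URL
    init(fileURL: URL) { self.fileURL = fileURL }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        return fileURL as NSURL // NSURL conforms to QLPreviewItem
    }
}

// Usage (from a view controller; keep a strong reference to the data source):
// let dataSource = SingleFilePreview(fileURL: fileURL)
// let preview = QLPreviewController()
// preview.dataSource = dataSource
// present(preview, animated: true, completion: nil)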