iOS 13 Safari Web Speech API bug: SpeechSynthesisUtterance won't use the provided locale

There seems to be a bug in iOS 13 (Safari and WKWebView) that makes iOS use the device-language voice rather than finding a suitable voice from the 'lang' provided on the SpeechSynthesisUtterance.
I worked around the issue by setting a suitable voice myself.
This is not needed in other browsers/platforms (e.g. macOS Safari, iOS < 13, Chrome, etc.).
this._getUtteranceRate().then((rate) => {
  let utterance = new SpeechSynthesisUtterance(words);
  utterance.rate = rate;
  utterance.lang = 'sv-SE';
  utterance.voice = this.voice; // iOS 13 fix: set the voice explicitly
  window.speechSynthesis.speak(utterance);
});

// Voices load asynchronously; pick ours once the list is available.
window.speechSynthesis.onvoiceschanged = () => {
  this.setVoice();
};

setVoice() {
  this.voice = window.speechSynthesis.getVoices().find((voice) => {
    return voice.lang === 'sv-SE';
  });
}

It seems one needs to set the voice explicitly on the SpeechSynthesisUtterance on iOS 13, as the locale alone is not used to find a voice.
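A self-contained sketch of the same workaround, using a hypothetical loadVoice() helper (not part of the original code) that resolves once getVoices() is populated, since the voice list can be empty until onvoiceschanged fires:

// Hypothetical helper: resolves with the first voice matching `lang`,
// waiting for onvoiceschanged if the voice list is still empty.
function loadVoice(lang) {
  return new Promise((resolve) => {
    const pick = () =>
      window.speechSynthesis.getVoices().find((v) => v.lang === lang);
    const voice = pick();
    if (voice) {
      resolve(voice);
    } else {
      window.speechSynthesis.onvoiceschanged = () => resolve(pick());
    }
  });
}

loadVoice('sv-SE').then((voice) => {
  const utterance = new SpeechSynthesisUtterance('Hej världen');
  utterance.lang = 'sv-SE';
  utterance.voice = voice; // explicit voice: the iOS 13 fix
  window.speechSynthesis.speak(utterance);
});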

Related

iOS 13 doesn't play a notification sound using FirebasePushNotificationPlugin

I use Firebase to push notifications to the users at a certain time. They receive the notification, but no alert sound is played. In the settings, sound/notifications are allowed, and other iOS 13 apps play sounds.
Version Number of FirebasePushNotificationPlugin Plugin: 3.3.10
Device Tested On: iPhone X, OS: 13.4.1
Simulator Tested On: N/A (simulators don't receive notifications)
Version of VS: VS for Mac Community, 8.6.6 (build 11)
Version of Xamarin: Xamarin.iOS 13.18.2.1, Xamarin.Forms v4.6.0.847
AppDelegate.cs:
public override bool FinishedLaunching(UIApplication app, NSDictionary options)
{
    bool fbaseStarted = false;
    try
    {
        // This method does all the UNUserNotificationCenter.Current.RequestAuthorization() code so we don't have to.
        FirebasePushNotificationManager.Initialize(options, true);
        fbaseStarted = true;
    }
    catch
    { }
    LoadApplication(new App());
    // Retry initialization once if the first attempt failed.
    if (!fbaseStarted)
    {
        try
        {
            FirebasePushNotificationManager.Initialize(options, true);
        }
        catch { }
    }
    FirebasePushNotificationManager.CurrentNotificationPresentationOption =
        UNNotificationPresentationOptions.Badge |
        UNNotificationPresentationOptions.Alert |
        UNNotificationPresentationOptions.Sound;
    return base.FinishedLaunching(app, options);
}
Within one of the pages of my code, I subscribe to a list of tags (note that I unsubscribe first because, the first time the code runs, it fails silently if notifications aren't approved, leaving the model thinking the notifications were subscribed when they weren't):
CrossFirebasePushNotification.Current.UnsubscribeAll();
CrossFirebasePushNotification.Current.Subscribe(Constants.NotificationTagsArray);
I keep coming across payload JSON solutions but, unless I am wrong, I don't think that applies to me as I am using Xamarin and the FirebasePushNotificationPlugin. Are there any additional permissions, added in iOS 13 for playing notifications with sound, that I have missed?
I have also posted here: https://github.com/CrossGeeks/FirebasePushNotificationPlugin/issues/348 but nobody has been able to assist me yet.
Thanks
The issue actually lies with the sending of the notifications, not with the Xamarin app. It resided in the service that sends the notifications to Firebase (to then be sent out to the phones).
In the service we were sending a FirebaseNet.Messaging.Message() to the phones:
Message FireBasemessage = new Message()
{
    To = "/topics/" + PushNote.Tag,
    TimeToLive = 86400,
    Priority = MessagePriority.high,
    ContentAvailable = true,
    Notification = new AndroidNotification()
    {
        Tag = "/topics/" + PushNote.Tag,
        Body = enhancedMessage,
        Title = xtitle,
    },
    Data = new Dictionary<string, string>
    {
        { "param", PushNote.Tag },
        { "text", enhancedMessage }
    }
};
The AndroidNotification() object required Sound = "default" to be added for it to work. Please note that this works for both Android and iOS notifications, despite the fact that it is an AndroidNotification object.
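For clarity, the corrected Notification block from the message above, with only the Sound property added:

Notification = new AndroidNotification()
{
    Tag = "/topics/" + PushNote.Tag,
    Body = enhancedMessage,
    Title = xtitle,
    Sound = "default" // required for the alert sound to play, on both Android and iOS
},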

AudioComponentGetIcon unavailable on Catalyst (iOS to macOS porting)

I would like to port an iOS app to macOS using Catalyst.
The app is an audio host for the AUv3 plugins.
The problem is that I can't get the plugin icon using the AudioComponentGetIcon API, since it is unavailable on macOS.
How can I get the plugin icon?
while (true) {
    comp = AudioComponentFindNext(comp, &rau->_desc);
    if (comp == NULL) break;
    AudioComponentDescription desc = { 0, 0, 0, 0, 0 };
    if (AudioComponentGetDescription(comp, &desc) != noErr) continue;
#if !TARGET_OS_MACCATALYST
    rau->_image = AudioComponentGetIcon(comp, 76);
    rau->_lastActiveTime = AudioComponentGetLastActiveTime(comp);
#else
#warning CATALYST WHAT I SHOULD DO here?
    //rau->_image = AudioComponentGetIcon(comp);
#endif
    if (rau->_image == nil) {
        rau->_image = [UIImage imageNamed:DEFAULT_AU_IMAGE];
    }
}
Apple's official answer: AudioComponentGetIcon was only ever supported on iOS for Inter-App Audio, not for audio unit extensions. Since Inter-App Audio is now deprecated on iOS, the functionality is not available in Catalyst, including AudioComponentGetIcon.
So, yes, in a Mac Catalyst app, it is not possible to get the AU icon.
In a macOS app (non-Catalyst), the AudioComponentGetIcon function is still available, but that doesn’t provide a solution for a Catalyst app. There is no “bridging” mechanism that allows your Catalyst app to cross over to the macOS function.
I’ve also figured out this workaround, but __comp.icon and __comp.iconURL are still unavailable in Catalyst:
AVAudioUnitComponentManager* av = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];
NSArray<AVAudioUnitComponent *>* comps = [av componentsMatchingDescription:desc];
AVAudioUnitComponent* __comp = [comps objectAtIndex:0];
NSImage *iconIMG = __comp.icon; //not available on Catalyst
NSURL *iconURL = __comp.iconURL; //not available on Catalyst
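Since the icon itself is unreachable, a fallback is to use the component's metadata, which AVAudioUnitComponent does expose on Catalyst, together with a bundled placeholder image — a minimal sketch, assuming DEFAULT_AU_IMAGE names an image in the app bundle as in the question's code:

AVAudioUnitComponent *component = [comps objectAtIndex:0];
NSString *pluginName = component.name;                 // metadata is available on Catalyst
NSString *maker = component.manufacturerName;          // usable for labeling the plugin
UIImage *icon = [UIImage imageNamed:DEFAULT_AU_IMAGE]; // bundled placeholder icon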

Safari WebRTC support in iOS 11

We have an application developed using WebRTC. iOS 11 is said to support WebRTC, but the application is not working in Safari on iOS 11. Is there anything required on our end to support this in the Safari browser? Do we have to make any changes in the script? Please help.
Did you get the latest adapter.js, which handles browser compatibility differences?
Regards
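For reference, the shim is typically included from the adapter project's published build — a one-line sketch; verify the URL against the webrtc/adapter repository before relying on it:

<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>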
Here is some sample code that worked for me:
// create video element first
var video = document.createElement('video');
video.style.width = window.innerWidth + 'px';
video.style.height = window.innerHeight + 'px';
video.setAttribute('autoplay', '');
video.setAttribute('muted', '');
video.setAttribute('playsinline', ''); // required for inline playback on iOS Safari
document.body.appendChild(video);

// set up your constraints
var constraints = {
  audio: false,
  video: {
    facingMode: 'user' // front camera; use 'environment' for the rear camera
  }
};

// ask the navigator to allow camera access
navigator.mediaDevices.getUserMedia(constraints).then(function success(stream) {
  video.srcObject = stream;
});

How to detect that haptic feedback is disabled on an iOS device?

I want to show a message in my application when haptic feedback is disabled in the phone settings. How can I detect that haptic feedback is disabled in the device settings?
It's kludgy, but might this work?
- (BOOL)isHapticFeedbackDisabled {
    BOOL result = NO;
    UISelectionFeedbackGenerator *feedbackGenerator = [[UISelectionFeedbackGenerator alloc] init];
    [feedbackGenerator prepare];
    // Kludge: inspect the generator's description for its private "prepared" flag.
    if ([feedbackGenerator.description containsString:@"prepared=0"]) result = YES;
    feedbackGenerator = nil;
    return result;
}
There is no way to check whether haptic feedback is enabled/disabled, but there is a private _feedbackSupportLevel value in UIKit for checking whether the device supports it:
func logFeedbackSupported() {
    let supportLevel = UIDevice.current.value(forKey: "_feedbackSupportLevel")
    print(supportLevel ?? "")
}
0: Not available,
1: First generation available (< iPhone 7),
2: Second generation available.
I advise you not to use Apple's private APIs because:
The API could change in any version without you knowing about it.
Apple scans your app's code to find out whether you're using private APIs, so be aware: your app could be rejected.

Xamarin.Forms ZXing ZXingScannerView on iOS

I am writing a mobile app in Xamarin.Forms, and I have half the screen continuously scanning barcodes using ZXingScannerView. This works great on Android; however, on iOS it will not pick up any barcodes using ZXingScannerView, even though iOS does pick up barcodes using the full-page ZXingScannerPage. In my example code below, the method Scanner_OnScanResult is never hit. How can I get this to work on iOS? Am I missing something?
ZXingScannerView scanner = new ZXingScannerView
{
    HorizontalOptions = LayoutOptions.FillAndExpand,
    VerticalOptions = LayoutOptions.FillAndExpand,
    AutomationId = "zxingScannerView",
    IsScanning = true,
    Options = new ZXing.Mobile.MobileBarcodeScanningOptions
    {
        UseFrontCameraIfAvailable = false, // update later to come from settings
        PossibleFormats = new List<ZXing.BarcodeFormat>(),
        TryHarder = true
    }
};
ZXingDefaultOverlay overlay = new ZXingDefaultOverlay();
scanner.Options.PossibleFormats.Add(ZXing.BarcodeFormat.QR_CODE);
scanner.OnScanResult += Scanner_OnScanResult;

private void Scanner_OnScanResult(ZXing.Result result)
{
    // OnScanResult fires on a background thread; marshal UI work to the main thread.
    Device.BeginInvokeOnMainThread(() => DisplayAlert("Exit", "TEST", "Yes", "No"));
}
I eventually got this working. I'm not sure if it's a bug or just inconsistent design, but on iOS, IsAnalyzing must be set to true manually when working in a view.
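A minimal sketch of that fix applied to the scanner from the question — IsAnalyzing is a property on ZXingScannerView alongside IsScanning:

ZXingScannerView scanner = new ZXingScannerView
{
    IsScanning = true,
    IsAnalyzing = true // iOS: must be enabled explicitly when the view is embedded in a page
};
scanner.OnScanResult += Scanner_OnScanResult;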
