Text-to-speech not working in the background on iOS

I am unable to play audio generated in the background with the text-to-speech plugin flutter_tts. This happens on iOS only, and I get the following error:
2023-01-11 09:52:04.095673+0400 Runner[12016:2350386] [AXTTSCommon] Failure starting audio queue alp!
2023-01-11 09:52:05.218595+0400 Runner[12016:2350386] [AXTTSCommon] _BeginSpeaking: couldn't begin playback
Background Settings in Info.plist
<key>UIBackgroundModes</key>
<array>
<string>audio</string>
<string>fetch</string>
<string>location</string>
<string>processing</string>
</array>
Speaker Configuration
Future<void> initializeSpeaker() async {
  speakerPlugin = FlutterTts();
  speakerPlugin.setQueueMode(0);

  // Select a voice that matches the device locale.
  // Each voice looks like {name: ur-PK-language, locale: ur-PK}.
  var voices = await speakerPlugin.getVoices;
  int defaultVoiceIx = 0;
  String userLocale = "en-gb";
  try {
    String languageCode = Platform.localeName.split('_')[0];
    String countryCode = Platform.localeName.split('_')[1];
    userLocale = "$languageCode-$countryCode";
  } catch (exception) {
    userLocale = "en-gb";
  }
  for (var i = 0; i < voices.length; i++) {
    String currentLocale = voices[i]["locale"];
    if (currentLocale.toLowerCase() == userLocale.toLowerCase()) {
      defaultVoiceIx = i;
    }
  }

  await speakerPlugin.setSharedInstance(true);
  await speakerPlugin.setVoice(voices[defaultVoiceIx]);
  await speakerPlugin.setVolume(1.0);
  await speakerPlugin.setSpeechRate(1.0);
  await speakerPlugin.setPitch(1.0);
  await speakerPlugin.setLanguage(userLocale);
  await speakerPlugin.setIosAudioCategory(IosTextToSpeechAudioCategory.playback, [
    IosTextToSpeechAudioCategoryOptions.mixWithOthers,
    IosTextToSpeechAudioCategoryOptions.duckOthers
  ]);
}
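A session tweak that is often suggested for this error (a hedged sketch, not a confirmed fix): drop the mixWithOthers/duckOthers options so the audio session is non-mixable, which iOS generally requires before the audio background mode will keep speech running. The optional third mode argument to setIosAudioCategory only exists in newer flutter_tts versions, so treat that as an assumption to verify against the plugin's API docs:

Future<void> configureForBackgroundSpeech(FlutterTts tts) async {
  await tts.setSharedInstance(true);
  // Non-mixable playback: no mixWithOthers/duckOthers options, so the
  // app is treated as the primary audio player in the background.
  await tts.setIosAudioCategory(
    IosTextToSpeechAudioCategory.playback,
    [
      IosTextToSpeechAudioCategoryOptions.allowBluetooth,
      IosTextToSpeechAudioCategoryOptions.allowBluetoothA2DP,
      IosTextToSpeechAudioCategoryOptions.allowAirPlay,
    ],
    IosTextToSpeechAudioMode.voicePrompt,
  );
}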

Related

Safari browsers above iOS 14 cannot play m3u8 videos and cannot load .ts files

Safari below iOS 14 can play them.
The flow: first request the m3u8 file, then request the corresponding decryption key, replace the key URI and the .ts resource domain in the playlist, generate a base64 data URL from the result, and put it in the video src.
This works below iOS 14, but not on iOS 14 and above.
async iosAutoKey(xhr) {
  const me = this;
  // Rewrite the .ts resource domain in the playlist text.
  const res = me.addVideoFilePrefix(xhr.response);
  // Fetch the decryption key and expose it through a blob URL.
  const resKey = await api.downloadCertificateKeyH5({
    videoId: me.videoData.id,
  });
  const key = new Blob([resKey], {
    type: "text/plain",
  });
  const keyUrl = URL.createObjectURL(key);
  // Point the playlist's key URI at the local blob URL.
  const blob = new Blob(
    [res.replace(/URI="[\d]{13}"|URI="{REMOTE_KEY}"/, `URI="${keyUrl}"`)],
    {
      type: "application/vnd.apple.mpegurl",
    }
  );
  const reader = new FileReader();
  reader.readAsDataURL(blob);
  reader.onloadend = function () {
    const url = reader.result;
    me.arrayBufferMap.set(me.videoData.id, url);
    if (Hls.isSupported()) {
      me.hlsInstance.loadSource(url);
    } else if (isIOS()) {
      me.theVideo.src = url;
    } else {
      me.$dialog.alert({
        message: "The current browser does not support playing m3u8, please use the latest version of Chrome",
      });
    }
  };
},
addVideoFilePrefix(res) {
  let result;
  const reg = new RegExp("(http|https)://.+/group");
  const prefix = `${this.vipFileSource}/group`;
  if (reg.test(res)) {
    result = res.replace(/(http|https):\/\/.+\/group/g, prefix);
  } else {
    result = res.replace(/\/group/g, prefix);
  }
  return result;
},
Video trigger events below iOS 14 (screenshot)
Video trigger events on iOS 14 and above (screenshot)
First m3u8 load (screenshot)
On iOS 14 and above, the decrypted m3u8 cannot load the video and the .ts files fail to load.
The base64 produced by the m3u8 decryption request (screenshot)

How to control background audio from a DependencyService in Xamarin.iOS

I made a DependencyService for playing audio and controlling it in the background in Xamarin.iOS.
Audio plays in the background fine, but the lock screen controller is not shown, and I don't know what the problem is.
I registered "UIBackgroundModes" with "audio" in Info.plist and set the AVAudioSession category to Playback.
I use the Play_Pause(string url) method to play audio and to set MPNowPlayingInfo.
[assembly: Dependency(typeof(AudioService))]
namespace BibleHymn.iOS
{
    class AudioService : IAudio
    {
        private bool isPlaying = false;
        private bool interrupted = false;
        int audioNum = 0;
        AVFoundation.AVPlayer player;

        public AudioService()
        {
            // Configure the shared audio session for background playback.
            var avSession = AVAudioSession.SharedInstance();
            avSession.SetCategory(AVAudioSessionCategory.Playback);
            NSError activationError = null;
            avSession.SetActive(true, out activationError);
        }

        public bool Play_Pause(string url)
        {
            if (player == null)
            {
                this.player = AVFoundation.AVPlayer.FromUrl(Foundation.NSUrl.FromString(url));
                this.player.Play();
                UpdateNotification();
            }
            isPlaying = !isPlaying; // was `isPlaying !=isPlaying;`, which does not compile
            return isPlaying;
        }

        public void UpdateNotification()
        {
            var item = new MPNowPlayingInfo
            {
                Title = "My Title"
            };
            MPNowPlayingInfoCenter.DefaultCenter.NowPlaying = item;
            Device.BeginInvokeOnMainThread(() => {
                UIKit.UIApplication.SharedApplication.BeginReceivingRemoteControlEvents();
            });
        }
    }
}
I call Play_Pause from a ViewModel through a button control's command.
It is initialized like:
public IAudio streamService = DependencyService.Get<IAudio>();
and invoked in the command:
streamService.Play_Pause(url);
You need to register background support for audio in Info.plist, as in the following screenshot.
Also, did you import MediaPlayer.framework?
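Beyond the Info.plist entry, the lock screen transport controls on modern iOS generally only appear once the app registers handlers with MPRemoteCommandCenter while an active playback session is running. A minimal sketch of that registration (my addition, not part of the original answer; adapt it to your AudioService, whose player field it assumes):

using MediaPlayer;

void RegisterRemoteCommands()
{
    var commandCenter = MPRemoteCommandCenter.Shared;
    // Handle the lock screen play button.
    commandCenter.PlayCommand.Enabled = true;
    commandCenter.PlayCommand.AddTarget(evt =>
    {
        player?.Play();
        return MPRemoteCommandHandlerStatus.Success;
    });
    // Handle the lock screen pause button.
    commandCenter.PauseCommand.Enabled = true;
    commandCenter.PauseCommand.AddTarget(evt =>
    {
        player?.Pause();
        return MPRemoteCommandHandlerStatus.Success;
    });
}

Calling this once (for example from the AudioService constructor) and keeping MPNowPlayingInfo updated is usually what makes the lock screen controller show up.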

How to send mail with attachments in Flutter?

I found a Dart plugin called mailer3: "^1.1.9". I first create an image in the mobile temp directory, and in the Flutter app I try to send that saved picture as a mail attachment using the mailer3 plugin. The mail reaches its destination and I get no error, but the attachment seems to get lost along the way.
In plain Dart the same approach works and sends the attachment. In Flutter I can display the image from the temp directory in the app, but I cannot attach it to the mail.
The image location is in the temp folder of the device:
'/data/user/0/com.myApp.myApp/app_flutter/20180700087.jpg'
I can show the image using the code below:
new FileImage(File('$newDekontImage'))
Error:
E/flutter (21184): [ERROR:topaz/lib/tonic/logging/dart_error.cc(16)] Unhandled exception:
E/flutter (21184): FileSystemException: Cannot open file, path = '/data/user/0/com.myApp.myApp/app_flutter/20180700087.jpg' (OS Error: No such file or directory, errno = 2)
How to send mail with attachments in Flutter with provided information in this question?
The Flutter code:
// TODO: SEND MAIL
void _sendMail() async {
  if (!_formKey.currentState.validate()) {
    return;
  } else {
    _formKey.currentState.save();
    var _options = new GmailSmtpOptions()
      ..username = "mymailaddress#gmail.com"
      ..password = "myPassword";
    var _emailTransport = new SmtpTransport(_options);
    var _envelope = new Envelope()
      ..from = "mymailaddress#gmail.com"
      ..recipients.add(_receiverMailAddress)
      ..subject = "${_userDekontDetails[0][0].toString()} - Receipt"
      ..attachments.add(new Attachment(file: new File('$newDekontImage')))
      ..text = "${_userDekontDetails[0][0].toString()} - Receipt"
      ..html = '<h3>${_userDekontDetails[0][0].toString()} Receipt.</h3>'
          '<p>Hi, registered under my name, I am sending the receipt (${widget._currentUserReceiptNo}) attached to this mail.</p>'
          '<p></p>'
          '<h5>Regards, </br></h5>'
          '${_userDekontDetails[0][0].toString()}';
    _emailTransport.send(_envelope)
        .then((envelope) => print('Email sent'))
        .catchError((e) => print('Error occurred: $e'));
  }
}
As of this writing, the mailer3 plugin is outdated and mailer is the most up-to-date package for sending emails; it contains important fixes that mailer2 and mailer3 lack. I suggest using the mailer package instead of mailer3.
Here's a port of your code snippet from mailer3 to mailer:
_sendMail(String username, String accessToken) async {
  // Read https://pub.dev/documentation/mailer/latest/smtp_server_gmail/gmailSaslXoauth2.html
  var _emailTransport = gmailSaslXoauth2(username, accessToken);
  var _envelope = new Message()
    ..from = "mymailaddress#gmail.com"
    ..recipients.add("recepient#gmail.com")
    ..subject = '{EMAIL_SUBJECT_GOES_HERE}'
    // Read https://pub.dev/documentation/mailer/latest/mailer/FileAttachment-class.html
    ..attachments.add(FileAttachment(File('{FILE_PATH}')))
    ..text = '{PLAIN_TEXT_GOES_HERE}'
    ..html = '{HTML_CONTENT_GOES_HERE}';
  send(_envelope, _emailTransport)
      .then((envelope) => print('Email sent'))
      .catchError((e) => print('Error occurred: $e'));
}
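Since the original error is a FileSystemException (No such file or directory), it is also worth confirming the file still exists at that path before attaching it. A small sketch (my addition; it assumes the path_provider package and that the image lives in the app documents directory, which is what the app_flutter path segment suggests):

import 'dart:io';
import 'package:path_provider/path_provider.dart';

Future<File?> resolveAttachment(String fileName) async {
  // app_flutter maps to the application documents directory.
  final dir = await getApplicationDocumentsDirectory();
  final file = File('${dir.path}/$fileName');
  // Only hand the file to the mailer if it is really there.
  return await file.exists() ? file : null;
}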
I have used enough_mail; hope it will be helpful.
MessageBuilder messageBuilder = MessageBuilder();

Future<bool> onFileSelect(BuildContext context) async {
  final result = await FilePicker.platform
      .pickFiles(type: FileType.any, allowMultiple: true, withData: true);
  if (result == null) {
    return false;
  }
  for (final file in result.files) {
    // Guess the MIME type from the file extension, falling back to
    // application/octet-stream.
    final lastDotIndex = file.path.lastIndexOf('.');
    MediaType mediaType;
    if (lastDotIndex == -1 || lastDotIndex == file.path.length - 1) {
      mediaType = MediaType.fromSubtype(MediaSubtype.applicationOctetStream);
    } else {
      final ext = file.path.substring(lastDotIndex + 1);
      mediaType = MediaType.guessFromFileExtension(ext);
    }
    messageBuilder.addBinary(file.bytes, mediaType, filename: file.name);
  }
  return true;
}
Future<void> sendMail(BuildContext buildContext) async {
  setState(() {
    needToFreezeUi = true;
  });
  MySnackBar.show(buildContext, MySnackBar.loadingIcon, "Please wait...!");
  SmtpClient smtpClient = SmtpClient(domain, isLogEnabled: true);
  try {
    await smtpClient.connectToServer(
      "$serverPrefix.${userInfo.domainName}",
      smtpServerPort,
      isSecure: isSmtpServerSecure,
    );
    await smtpClient.ehlo();
    await smtpClient.authenticate(userInfo.email, userInfo.password);
    messageBuilder.from = [MailAddress('', userInfo.email)];
    messageBuilder.to = [MailAddress('', toTextEditCtl.text)];
    messageBuilder.cc = selectedCCEmailInfos
        .map((e) => MailAddress('', e.emailAddress))
        .toList();
    messageBuilder.bcc = selectedBCCEmailInfos
        .map((e) => MailAddress('', e.emailAddress))
        .toList();
    messageBuilder.subject = subjectTextEditCtl.text;
    String htmlText = await htmlEditorController.getText();
    messageBuilder.addTextHtml(htmlText);
    // Make sure the message carries a multipart/alternative part for the
    // HTML body, whether or not attachments are present.
    if (messageBuilder.hasAttachments) {
      messageBuilder.getPart(
        MediaSubtype.multipartAlternative,
        recursive: false,
      );
    } else {
      messageBuilder.addPart(
        mediaSubtype: MediaSubtype.multipartAlternative,
        insert: true,
      );
      messageBuilder.setContentType(
        MediaType.fromSubtype(MediaSubtype.multipartAlternative),
      );
    }
    MimeMessage mimeMessage = messageBuilder.buildMimeMessage();
    SmtpResponse smtpResponse = await smtpClient.sendMessage(mimeMessage);
    MySnackBar.hide(buildContext);
    if (smtpResponse.isOkStatus) {
      MySnackBar.show(buildContext, MySnackBar.successIcon, "Mail sent successfully");
      clearInputFields(buildContext);
    } else {
      MySnackBar.show(buildContext, MySnackBar.errorIcon, "Something went wrong, please try again!");
    }
  } on SmtpException catch (e) {
    MySnackBar.show(buildContext, MySnackBar.errorIcon, "Something went wrong, please try again!");
  }
  setState(() {
    needToFreezeUi = false;
  });
}
pubspec.yaml dependencies:
dependencies:
  enough_mail: ^1.3.4
  html_editor_enhanced: ^1.4.0
  file_picker: ^3.0.2+2

Multiple StreamingRecognizeRequest

I'm trying to set up StreamingRecognize with multiple requests. Is that possible?
The point is that I want to stream audio from the mic for an unknown duration, so I think I must chain multiple requests, considering that a streaming session has a maximum duration of about 65 seconds.
Can anyone help me with this?
Thanks a lot ;)
Google sample code:
static async Task<object> StreamingMicRecognizeAsync(int seconds)
{
    if (NAudio.Wave.WaveIn.DeviceCount < 1)
    {
        Console.WriteLine("No microphone!");
        return -1;
    }
    var speech = SpeechClient.Create();
    var streamingCall = speech.StreamingRecognize();
    // Write the initial request with the config.
    await streamingCall.WriteAsync(
        new StreamingRecognizeRequest()
        {
            StreamingConfig = new StreamingRecognitionConfig()
            {
                Config = new RecognitionConfig()
                {
                    Encoding = RecognitionConfig.Types.AudioEncoding.Linear16,
                    SampleRateHertz = 16000,
                    LanguageCode = "en",
                },
                InterimResults = true,
            }
        });
    // Print responses as they arrive.
    Task printResponses = Task.Run(async () =>
    {
        while (await streamingCall.ResponseStream.MoveNext(
            default(CancellationToken)))
        {
            foreach (var result in streamingCall.ResponseStream.Current.Results)
            {
                foreach (var alternative in result.Alternatives)
                {
                    Console.WriteLine(alternative.Transcript);
                }
            }
        }
    });
    // Read from the microphone and stream to the API.
    object writeLock = new object();
    bool writeMore = true;
    var waveIn = new NAudio.Wave.WaveInEvent();
    waveIn.DeviceNumber = 0;
    waveIn.WaveFormat = new NAudio.Wave.WaveFormat(16000, 1);
    waveIn.DataAvailable +=
        (object sender, NAudio.Wave.WaveInEventArgs args) =>
        {
            lock (writeLock)
            {
                if (!writeMore) return;
                streamingCall.WriteAsync(
                    new StreamingRecognizeRequest()
                    {
                        AudioContent = Google.Protobuf.ByteString
                            .CopyFrom(args.Buffer, 0, args.BytesRecorded)
                    }).Wait();
            }
        };
    waveIn.StartRecording();
    Console.WriteLine("Speak now.");
    await Task.Delay(TimeSpan.FromSeconds(seconds));
    // Stop recording and shut down.
    waveIn.StopRecording();
    lock (writeLock) writeMore = false;
    await streamingCall.WriteCompleteAsync();
    await printResponses;
    return 0;
}
In Cloud Speech-to-Text, the audio length limit for each streaming request is around one minute [1]. You can either use asynchronous speech recognition [2] for audio files up to 180 minutes, or renew the streaming request before it reaches the time limit for streaming speech recognition [3].
Here is a Python example of how to renew the streaming request and stream audio for more than one minute [4].
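Applied to the C# sample above, the renewal idea is just a loop that keeps each streaming session under the limit and then opens a fresh one. A minimal sketch (my addition, reusing the sample's StreamingMicRecognizeAsync; note that words spoken across a session boundary can be split, which real implementations mitigate by overlapping buffers):

static async Task StreamIndefinitelyAsync(CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        // Keep each session safely under the ~60-second limit,
        // then immediately start a new one.
        await StreamingMicRecognizeAsync(55);
    }
}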

FCM Xamarin.Forms, iOS background notification doesn't show

Platform: iOS 10.2+
Xamarin plugin: Firebase iOS Cloud Messaging https://components.xamarin.com/view/firebaseioscloudmessaging
Problem: when I send a notification from the Firebase console or from my code calling Firebase, my iPhones don't receive background notifications (as a banner). But if the application is in the foreground, I receive the notification as a DisplayAlert from ApplicationReceivedRemoteMessage(RemoteMessage remoteMessage).
So the device is registered with FCM, it has the token, and it can receive notifications, but it does not receive them in the background.
In Visual Studio 2017, in the .iOS project manifest, I have background mode enabled with remote notifications activated.
Is this a common issue? How can I solve it in my project?
Code of AppDelegate
[Register("AppDelegate")]
public partial class AppDelegate : global::Xamarin.Forms.Platform.iOS.FormsApplicationDelegate, IUNUserNotificationCenterDelegate, IMessagingDelegate
{
protected SQLiteAsyncConnection conn;
//
// This method is invoked when the application has loaded and is ready to run. In this
// method you should instantiate the window, load the UI into it and then make the window
// visible.
//
// You have 17 seconds to return from this method, or iOS will terminate your application.
//
public override bool FinishedLaunching(UIApplication app, NSDictionary options)
{
global::Xamarin.Forms.Forms.Init();
global::Xamarin.FormsMaps.Init();
CachedImageRenderer.Init();
LoadApplication(new App());
UITabBar.Appearance.SelectedImageTintColor = UIColor.FromRGB(139, 194, 77);
UINavigationBar.Appearance.TintColor = UIColor.FromRGB(139,194,77);
CrossVersionTracking.Current.Track();
// Firebase component initialize
Firebase.Analytics.App.Configure();
// Register your app for remote notifications.
if (UIDevice.CurrentDevice.CheckSystemVersion(10, 0))
{
// iOS 10 or later
var authOptions = UNAuthorizationOptions.Alert | UNAuthorizationOptions.Badge | UNAuthorizationOptions.Sound;
UNUserNotificationCenter.Current.RequestAuthorization(authOptions, (granted, error) => {
Console.WriteLine(granted);
});
// For iOS 10 display notification (sent via APNS)
UNUserNotificationCenter.Current.Delegate = this;
// For iOS 10 data message (sent via FCM)
Messaging.SharedInstance.RemoteMessageDelegate = this;
}
else
{
// iOS 9 or before
var allNotificationTypes = UIUserNotificationType.Alert | UIUserNotificationType.Badge | UIUserNotificationType.Sound;
var settings = UIUserNotificationSettings.GetSettingsForTypes(allNotificationTypes, null);
UIApplication.SharedApplication.RegisterUserNotificationSettings(settings);
}
UIApplication.SharedApplication.RegisterForRemoteNotifications();
Firebase.InstanceID.InstanceId.Notifications.ObserveTokenRefresh((sender, e) =>
{
newToken = Firebase.InstanceID.InstanceId.SharedInstance.Token;
// Connect to the database.
database = new SQLiteClient();
conn = database.GetConnection();
usuario = null;
try
{
var task = Task.Run(async () =>
{
usuario = await conn.Table<Usuario>()
.FirstOrDefaultAsync();
});
task.Wait();
if (usuario != null)
{
usuario.token = newToken;
task = Task.Run(async () =>
{
await conn.InsertOrReplaceAsync(usuario);
});
task.Wait();
}
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine("TOKEN ERROR\tNo se ha podido Guardar el Token" + ex.Message);
}
System.Diagnostics.Debug.WriteLine("TOKEN\t" + newToken);
connectFCM();
});
return base.FinishedLaunching(app, options);
}
public override void DidEnterBackground(UIApplication uiApplication)
{
Messaging.SharedInstance.Disconnect();
Console.WriteLine("Disconnected from FCM");
}
public override void OnActivated(UIApplication uiApplication)
{
connectFCM();
base.OnActivated(uiApplication);
}
public override void RegisteredForRemoteNotifications(UIApplication application, NSData deviceToken)
{
#if DEBUG
Firebase.InstanceID.InstanceId.SharedInstance.SetApnsToken(deviceToken, Firebase.InstanceID.ApnsTokenType.Sandbox);
#endif
#if RELEASE
Firebase.InstanceID.InstanceId.SharedInstance.SetApnsToken(deviceToken, Firebase.InstanceID.ApnsTokenType.Prod);
#endif
}
// iOS 9 and earlier: fires when a notification is received in the foreground
public override void DidReceiveRemoteNotification(UIApplication application, NSDictionary userInfo, Action<UIBackgroundFetchResult> completionHandler)
{
Messaging.SharedInstance.AppDidReceiveMessage(userInfo);
// Generate custom event
NSString[] keys = { new NSString("Event_type") };
NSObject[] values = { new NSString("Recieve_Notification") };
var parameters = NSDictionary<NSString, NSObject>.FromObjectsAndKeys(keys, values, keys.Length);
// Send custom event
Firebase.Analytics.Analytics.LogEvent("CustomEvent", parameters);
if (application.ApplicationState == UIApplicationState.Active)
{
System.Diagnostics.Debug.WriteLine(userInfo);
var aps_d = userInfo["aps"] as NSDictionary;
var alert_d = aps_d["alert"] as NSDictionary;
var body = alert_d["body"] as NSString;
var title = alert_d["title"] as NSString;
debugAlert(title, body);
}
}
// iOS 10+: fires when a notification is received in the foreground
[Export("userNotificationCenter:willPresentNotification:withCompletionHandler:")]
public void WillPresentNotification(UNUserNotificationCenter center, UNNotification notification, Action<UNNotificationPresentationOptions> completionHandler)
{
System.Console.WriteLine(notification.Request.Content.UserInfo);
var title = notification.Request.Content.Title;
var body = notification.Request.Content.Body;
debugAlert(title, body);
}
private void connectFCM()
{
Console.WriteLine("connectFCM\tEjecutandose la función.");
Messaging.SharedInstance.Connect((error) =>
{
if (error == null)
{
//TODO: Change Topic to what is required
Messaging.SharedInstance.Subscribe("/topics/all");
}
Console.WriteLine("connectFCM\t" + (error != null ? "error occurred " + error.DebugDescription : "connect success"));
});
}
private void debugAlert(string title, string message)
{
var alert = new UIAlertView(title ?? "Title", message ?? "Message", null, "Cancel", "OK");
alert.Show();
}
public void ApplicationReceivedRemoteMessage(RemoteMessage remoteMessage)
{
Console.WriteLine("\n*******************************************");
Console.WriteLine("AplicationReceivedRemoteMessage\t" + remoteMessage.AppData);
Console.WriteLine("\n*******************************************");
var title = remoteMessage.AppData.ValueForKey(new NSString("title"));
var text = remoteMessage.AppData.ValueForKey(new NSString("text"));
debugAlert("" + title, "" + text);
}
[Export("userNotificationCenter:didReceiveNotificationResponse:withCompletionHandler:")]
public void DidReceiveNotificationResponse(UNUserNotificationCenter center, UNNotificationResponse response, Action completionHandler)
{
debugAlert("DidreceiveNotificationResponse", response + "" );
}
}
For more info, my Info.plist contains this key:
<key>UIBackgroundModes</key>
<array>
<string>location</string>
<string>bluetooth-central</string>
<string>bluetooth-peripheral</string>
<string>fetch</string>
<string>remote-notification</string>
</array>
After a long investigation I solved it.
It is necessary to add this key to the Entitlements.plist file:
<dict>
<key>aps-environment</key>
<string>development</string>
</dict>
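One more thing worth verifying (my addition, not part of the original answer): the message sent through FCM must carry an APNs notification payload for iOS to display it as a banner while the app is backgrounded, and data-only messages additionally need content_available. With the legacy FCM HTTP API, a payload that produces a background banner typically looks like:

{
  "to": "<device-token>",
  "priority": "high",
  "content_available": true,
  "notification": {
    "title": "Title",
    "body": "Body"
  }
}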
