Alarm clock or timing notification in Flutter - dart

Hello, I'm fairly new to Flutter and I need to build an alarm or a timed notification, something like AlarmManager in Android Studio.
I'm getting the hour from my server, so I need the alarm or notification to fire at that hour. I'd be grateful if someone could help. Thanks.

You can try using flutter_local_notifications to configure a scheduled notification in Flutter.
Here's a code snippet that schedules a notification one hour from now.
import 'package:flutter_local_notifications/flutter_local_notifications.dart';
import 'package:timezone/timezone.dart' as tz;

final FlutterLocalNotificationsPlugin flutterLocalNotificationsPlugin =
    FlutterLocalNotificationsPlugin();

Future<void> _showNotificationHourly() async {
  // Fire one hour from now (set to either `hours: 1` or `minutes: 60`).
  // zonedSchedule expects a timezone-aware tz.TZDateTime, so the timezone
  // package's data must be initialised (tz.initializeTimeZones()) beforehand.
  final tz.TZDateTime scheduledNotificationDateTime =
      tz.TZDateTime.now(tz.local).add(const Duration(hours: 1));
  const AndroidNotificationDetails androidPlatformChannelSpecifics =
      AndroidNotificationDetails(
          'your other channel id',
          'your other channel name',
          'your other channel description');
  const IOSNotificationDetails iOSPlatformChannelSpecifics =
      IOSNotificationDetails();
  const MacOSNotificationDetails macOSPlatformChannelSpecifics =
      MacOSNotificationDetails();
  const NotificationDetails platformChannelSpecifics = NotificationDetails(
      android: androidPlatformChannelSpecifics,
      iOS: iOSPlatformChannelSpecifics,
      macOS: macOSPlatformChannelSpecifics);
  // https://pub.dev/documentation/flutter_local_notifications/latest/#scheduling-a-notification
  await flutterLocalNotificationsPlugin.zonedSchedule(
      0,
      'scheduled title',
      'scheduled body',
      scheduledNotificationDateTime,
      platformChannelSpecifics,
      androidAllowWhileIdle: true,
      uiLocalNotificationDateInterpretation:
          UILocalNotificationDateInterpretation.absoluteTime);
}
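Since the trigger time comes from your server, you can convert the server timestamp into a tz.TZDateTime and pass that to zonedSchedule instead of a fixed one-hour offset. A minimal sketch, assuming the server returns an ISO-8601 string; scheduleServerAlarm and the channel strings are placeholder names:

import 'package:flutter_local_notifications/flutter_local_notifications.dart';
import 'package:timezone/data/latest.dart' as tzdata;
import 'package:timezone/timezone.dart' as tz;

/// Hypothetical helper: schedule a notification at a time string received
/// from the server, e.g. "2023-05-01T14:00:00Z".
Future<void> scheduleServerAlarm(
    FlutterLocalNotificationsPlugin plugin, String serverIsoTime) async {
  tzdata.initializeTimeZones(); // the timezone database must be loaded once
  final tz.TZDateTime when =
      tz.TZDateTime.from(DateTime.parse(serverIsoTime), tz.local);
  await plugin.zonedSchedule(
      1,
      'Alarm',
      'It is time!',
      when,
      const NotificationDetails(
          android: AndroidNotificationDetails('your alarm channel id',
              'your alarm channel name', 'your alarm channel description')),
      androidAllowWhileIdle: true,
      uiLocalNotificationDateInterpretation:
          UILocalNotificationDateInterpretation.absoluteTime);
}

If you instead want a notification that simply repeats every hour, the plugin also provides periodicallyShow with RepeatInterval.hourly (RepeatInterval.Hourly in older plugin versions).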

Related

Flutter (iOS): Start audio_service when the app receives a notification in terminated and background states

I need to play audio to app users when they receive a Firebase Cloud Messaging notification while the app is in the background or terminated. The FCM background handler fires in background, foreground and terminated states, and everything works on Android. On the iOS side I have checked all the states and it works erratically. It behaves a bit better on pre-iOS 16 devices, but the process is still unreliable.
I think the audio session is not being started because of a decision by the iOS system. An iOS 15.5 device mostly plays audio when the app receives the notification, but only with developer mode enabled. I cannot play sound on a device with developer mode disabled.
When the app is terminated, the handler is never invoked at all, so I cannot play the announcement when the notification arrives.
The actual question is: how can I play the announcement when the app receives a notification? The sound files are longer than one minute and no longer than ten minutes.
My minimal code is below.
The background handler that FCM uses; it should start the AudioService:
late AudioHandler _audioHandler;

@pragma('vm:entry-point')
Future<void> backgroundHandler(RemoteMessage message) async {
  WidgetsFlutterBinding.ensureInitialized();
  final session = await AudioSession.instance;
  //await session.setActive(false);
  await session.configure(AudioSessionConfiguration.music());
  await session.setActive(true);
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(
        notMediaItem: MediaItem(
            id: DateTime.now().microsecondsSinceEpoch.toString(),
            extras: {
              "url": "${ApiConstant.FILES}${message.data["audio_path"]}"
            },
            title: message.data["title"] ?? "Belediye Duyurusu",
            album: message.data["type_name"] ?? "Duyuru",
            artist: message.data["hall_name"] ?? "Belediye",
            artUri: message.data["image_path"] == null
                ? null
                : Uri.tryParse(
                    "${ApiConstant.FILES}${message.data["image_path"]}"))),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  await _audioHandler.play();
}
And the main function:
void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await Firebase.initializeApp();
  final session = await AudioSession.instance;
  await session.configure(AudioSessionConfiguration.music());
  await session.setActive(true);
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(
        notMediaItem: MediaItem(
          id: DateTime.now().microsecondsSinceEpoch.toString(),
          extras: {
            "url": "****url/mp3",
          },
          title: "*****",
          album: "******",
          artist: "*********",
        )),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  await _audioHandler.play();
  FirebaseMessaging.onBackgroundMessage(backgroundHandler);
  DynamicLinkServices().handledynamiclink();
  FlutterError.onError = FirebaseCrashlytics.instance.recordFlutterError;
  PlatformDispatcher.instance.onError = (error, stack) {
    FirebaseCrashlytics.instance.recordError(error, stack, fatal: true);
    return true;
  };
}
I used the sample audio_service handler:
/// An [AudioHandler] for playing a single item.
class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
  MediaItem? notMediaItem;
  final _player = AudioPlayer();

  /// Initialise our audio handler.
  AudioPlayerHandler({required this.notMediaItem}) {
    final _item = notMediaItem;
    // So that our clients (the Flutter UI and the system notification) know
    // what state to display, here we set up our audio handler to broadcast all
    // playback state changes as they happen via playbackState...
    _player.playbackEventStream.map(_transformEvent).pipe(playbackState);
    // ... and also the current media item via mediaItem.
    mediaItem.add(_item);
    // Load the player.
    //_player.setAudioSource(AudioSource.uri(Uri.parse(_item!.id)));
    _player.setAudioSource(AudioSource.uri(Uri.parse(_item!.extras!["url"])));
  }

  // In this simple example, we handle only 4 actions: play, pause, seek and
  // stop. Any button press from the Flutter UI, notification, lock screen or
  // headset will be routed through to these 4 methods so that you can handle
  // your audio playback logic in one place.
  @override
  Future<void> onNotificationDeleted() async {
    await stop();
    _player.dispose();
    return super.onNotificationDeleted();
  }

  Future<void> dispose() => _player.dispose();

  @override
  Future<void> onTaskRemoved() => _player.dispose();

  @override
  Future<void> playFromUri(Uri uri, [Map<String, dynamic>? extras]) async {
    await setAudioSource(uri);
    return _player.play();
  }

  @override
  Future<void> play() {
    return _player.play();
  }

  @override
  Future<void> pause() => _player.pause();

  Future<void> setAudioSource(Uri path) async =>
      _player.setAudioSource(AudioSource.uri(path));

  @override
  Future<void> seek(Duration position) => _player.seek(position);

  @override
  Future<void> stop() => _player.stop();

  /// Transform a just_audio event into an audio_service state.
  ///
  /// This method is used from the constructor. Every event received from the
  /// just_audio player will be transformed into an audio_service state so that
  /// it can be broadcast to audio_service clients.
  PlaybackState _transformEvent(PlaybackEvent event) {
    return PlaybackState(
      controls: [
        MediaControl.rewind,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.fastForward,
      ],
      systemActions: const {
        MediaAction.seek,
        MediaAction.seekForward,
        MediaAction.seekBackward,
        MediaAction.playPause,
        MediaAction.stop,
        MediaAction.play,
        MediaAction.pause,
      },
      androidCompactActionIndices: const [0, 1, 3],
      processingState: const {
        ProcessingState.idle: AudioProcessingState.idle,
        ProcessingState.loading: AudioProcessingState.loading,
        ProcessingState.buffering: AudioProcessingState.buffering,
        ProcessingState.ready: AudioProcessingState.ready,
        ProcessingState.completed: AudioProcessingState.completed,
      }[_player.processingState]!,
      playing: _player.playing,
      updatePosition: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
      queueIndex: event.currentIndex,
    );
  }
}

Receiving and handling media messages in Twilio-dialogflow WhatsApp integration

I have deployed a server on GCP to receive message traffic from Twilio via webhook and integrated it with Google's Dialogflow. You can see the original project here: https://github.com/GoogleCloudPlatform/dialogflow-integrations#readme
The function works fine for receiving and responding via intent detection, but it can't handle any media input from a user because Dialogflow can't interpret it. I've been trying to write a simple if statement that converts any incoming media into a URL before it is processed by Dialogflow. The full server.js file is given below:
const express = require('express');
const request = require('request');
const app = express();
const dialogflowSessionClient = require('../botlib/dialogflow_session_client.js');
const bodyParser = require('body-parser');

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

const projectId = 'PROJECT-ID';
const phoneNumber = "+1##########";
const accountSid = '*********************';
const authToken = '*******************';
const client = require('twilio')(accountSid, authToken);
const MessagingResponse = require('twilio').twiml.MessagingResponse;
const sessionClient = new dialogflowSessionClient(projectId);

const listener = app.listen(process.env.PORT, function() {
  console.log('listner marker');
  console.log('Your Twilio01 integration server is listening on port ' + listener.address().port);
});

app.post('/', async function(req, res) {
  const body = req.body;
  const text = body.Body;
  const id = body.From;
  console.log('body marker');
  const dialogflowResponse = (await sessionClient.detectIntent(text, id, body)).fulfillmentText;
  const twiml = new MessagingResponse();
  const message = twiml.message(dialogflowResponse);
  res.send(twiml.toString());
});

process.on('SIGTERM', () => {
  listener.close(() => {
    console.log('Closing http server.');
    process.exit(0);
  });
});
I have tried to add my if statement as below, but it fails to execute when placed into the main file.
if (MessagingResponse.NumMedia != "0") {
  console.log(MessagingResponse.MediaUrl0);
  MessagingResponse = MessagingResponse.MediaUrl0;
  console.log(response.toString());
}
You cannot change a const variable's value in a later statement, so
change your
const text;
to
var text;
and add
if(body.MediaUrl0){
text = body.MediaUrl0;
}
before
const dialogflowResponse = (await sessionClient.detectIntent(
text, id, body)).fulfillmentText;
Twilio developer evangelist here.
You're trying to read the NumMedia and other properties from the MessagingResponse class. You should be trying to read it from the body of the incoming request: req.body.NumMedia.
Edit
Checked your code from the repo. The const still doesn't seem right. Try this:
app.post("/", async function (req, res) {
const body = req.body;
let text;
if (body.NumMedia != "0") {
text = body.MediaUrl0;
} else {
text = body.Body;
}
const id = body.From;
const dialogflowResponse = (await sessionClient.detectIntent(text, id, body))
.fulfillmentText;
const twiml = new MessagingResponse();
twiml.message(dialogflowResponse);
res.send(twiml.toString());
});
In this case we define text with let instead of const and outside of the conditional. That way we can reassign to text and use it later in the function too.

Flutter LocalNotification Plugin, Showing notification every 2 hours

How do I change this _repeatNotification() method to repeat every 2 hours?
Future _repeatNotification() async {
  var androidPlatformChannelSpecifics = AndroidNotificationDetails(
      'repeating channel id',
      'repeating channel name',
      'repeating description');
  var iOSPlatformChannelSpecifics = IOSNotificationDetails();
  var platformChannelSpecifics = NotificationDetails(
      androidPlatformChannelSpecifics, iOSPlatformChannelSpecifics);
  await flutterLocalNotificationsPlugin.periodicallyShow(0, 'repeating title',
      'repeating body', RepeatInterval.EveryMinute, platformChannelSpecifics);
}
Referring to: https://pub.dartlang.org/documentation/flutter_local_notifications/latest/flutter_local_notifications/FlutterLocalNotificationsPlugin/periodicallyShow.html
This package only offers four RepeatInterval values for periodically showing a notification.
What you can do is edit its source code.
Here: https://github.com/MaikuB/flutter_local_notifications/blob/master/android/src/main/java/com/dexterous/flutterlocalnotifications/FlutterLocalNotificationsPlugin.java#L266
change 60000 * 60 to 60000 * 60 * 2.
Remember to keep a backup of the original package.
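If you would rather not patch the plugin source, one workaround is to chain one-off schedules: each time a notification is scheduled (or handled), schedule the next one two hours out. A minimal sketch, assuming a newer plugin version that exposes zonedSchedule plus the timezone package; the channel strings and the _scheduleNextInTwoHours name are placeholders:

import 'package:flutter_local_notifications/flutter_local_notifications.dart';
import 'package:timezone/data/latest.dart' as tzdata;
import 'package:timezone/timezone.dart' as tz;

Future<void> _scheduleNextInTwoHours(
    FlutterLocalNotificationsPlugin plugin) async {
  tzdata.initializeTimeZones(); // load the timezone database once at startup
  await plugin.zonedSchedule(
      0,
      'repeating title',
      'repeating body',
      // one-off schedule two hours from now; call this again to chain the next one
      tz.TZDateTime.now(tz.local).add(const Duration(hours: 2)),
      const NotificationDetails(
          android: AndroidNotificationDetails('repeating channel id',
              'repeating channel name', 'repeating description')),
      androidAllowWhileIdle: true,
      uiLocalNotificationDateInterpretation:
          UILocalNotificationDateInterpretation.absoluteTime);
}

Note that zonedSchedule fires once, so this has to be re-invoked (for example on app launch or when the notification is handled); if you need guaranteed periodic background work on Android, the android_alarm_manager_plus package is another option to consider.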

Push Notifications using Firebase/Google Cloud Functions

I am trying to send a notification when a number is written to _random. I am able to get the device token, and the cloud function works perfectly. However, I do not receive the notification on the simulator. I am using the notification token that is pushed to Firebase to test with. If anybody can help, it will be highly appreciated.
https://i.stack.imgur.com/OB94c.png
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
//Initial function call:
exports.makeRandomFigures = functions.https.onRequest((req, res) => {
    //create database ref
    var rootRef = admin.database().ref();
    var doc_count_temp = 0;
    var keys = [];
    var random_num = 0;
    //get document count
    rootRef.once('value', (snapshot) => {
        doc_count_temp = snapshot.numChildren();
        //real number of member. if delete _timeStamp then minus 2 not 3!
        var doc_count = doc_count_temp - 3;
        //get num array previous generated
        var xRef = rootRef.child("_usedFigures");
        xRef.once('value', function(snap) {
            snap.forEach(function(item) {
                var itemVal = item.val();
                keys.push(itemVal);
            });
            //get non-duplicated random number
            var is_equal = true;
            while (is_equal) {
                random_num = Math.floor((Math.random() * doc_count) + 1);
                is_equal = keys.includes(random_num);
            }
            //insert new random vaule to _usedFigures collection
            rootRef.child('_usedFigures').push(random_num);
            rootRef.child('_random').set(random_num);
        });
    });
    //send back response
    res.redirect(200);
});
exports.sendFigureNotification = functions.database.ref('_random').onWrite(event => {
    const payload = {
        notification: {
            title: 'Title',
            body: `Test`, //use _random to get figure at index key
            badge: '1',
            sound: 'default'
        }
    };
    const options = {
        priority: "high",
        timeToLive: 60 * 60 * 24, //24 hours
        content_available: true
    };
    const token = "cge0F9rUTLo:APA91bGNF3xXI-5uxrdj8BYqRPkxUPA5x9IQALtm3VEFJAdV2WQrQufNkzIclT5B671mBcvR6IDMbgSKyL7iG2jAuxRM3qR3MXhkNp1_utlXhCpE2VZqTw6Yw3d4iMMvHl1B-Cvik6NY";
    console.log('Sending notifications');
    return admin.messaging().sendToDevice(token, payload, options);
});
You can't receive push notifications on the iOS simulator.
For alternative ways to test, check this link:
How can I test Apple Push Notification Service without an iPhone?

Notifications ANE not displaying delayed notification

I have the latest Notifications ANE for iOS at the moment, and I have a couple of questions about setting up and displaying these notifications.
When I set up categories, is it necessary to set all the parameters that appear in the example in order to show actions? (For example, can we set up categories without including .setIcon( "ic_stat_distriqt_default" )?) This is the code I am using to set up a category:
service.categories.push(
    new CategoryBuilder()
        .setIdentifier( Constants.REMINDER_CATEGORY )
        .addAction(
            new ActionBuilder()
                .setTitle( "Snooze" )
                .setIdentifier( "ACCEPT_IDENTIFIER" )
                .build()
        )
        .addAction(
            new ActionBuilder()
                .setTitle( "Accept" )
                .setDestructive( true )
                .setIdentifier( "DELETE_IDENTIFIER" )
                .build()
        )
        .build()
);
On the other hand, I am using two types of notification for now, simple and delayed. The simple notification is displayed normally in the notifications box and the event is dispatched; however, the delayed notification is not displayed in the notifications box even though the event is dispatched (in both cases with the app open).
Is this the normal behaviour of a delayed notification?
Will the app dispatch and receive a delayed notification even if it's closed?
Here is my code for both cases:
//simple notification
public function notifySimple(vo:LocalNotificationVO):void{
    if (Notifications.isSupported)
    {
        _service.notify(
            new NotificationBuilder()
                .setId( vo.id )
                .setAlert( vo.type )
                .setTitle( vo.tittle +": "+vo.body)
                .setBody( vo.body )
                // .setSound( vo.soundPath )
                .enableVibration(vo.enableVibration)
                .setPayload( vo.payload )
                .build()
        );
    }
}

//Delayed notification
public function setReminderNotification(vo:LocalNotificationVO):void
{
    if (Notifications.isSupported)
    {
        _service.notify(
            new NotificationBuilder()
                .setId( vo.id )
                .setDelay( vo.delay ) // Show the notification vo.delay seconds in the future
                .setRepeatInterval( NotificationRepeatInterval.REPEAT_NONE ) // Do not repeat
                .setAlert( vo.tittle )
                .setTitle( vo.body ) //
                .setBody( vo.body ) //Body is not displayed
                // .setSound( vo.soundPath ) //SOUNDS DISABLED FOR NOW
                // .enableVibration(vo.enableVibration)
                // .setPayload( vo.payload )
                // .setCategory( Constants.REMINDER_CATEGORY )
                .build()
        );
    }
}
Where LocalNotificationVO is (getters not included):
public function LocalNotificationVO(id:int,
                                    type:String,
                                    tittle:String,
                                    body:String,
                                    payload:String,
                                    delay:int = 0,
                                    enableVibration:Boolean = true,
                                    soundPath:String = "") {
    _id = id;
    _type = type;
    _tittle = tittle;
    _body = body;
    _payload = payload;
    _delay = delay;
    _enableVibration = enableVibration;
    _soundPath = soundPath;
}
Thank you very much in advance
