I am currently developing an app that requires real-time face detection. Right now I have the ML Kit library in the app and I am using the Firebase face detector. At the moment, it produces an error every time I try to detect a face from a file:
DynamiteModule(13840): Local module descriptor class for com.google.android.gms.vision.dynamite.face not found.
As for the real-time part, I tried using a RepaintBoundary in Flutter to take a screenshot of the camera widget (almost) every frame and convert it into a binary file for face detection. But for some reason, Flutter crashed every time I tried to screenshot the camera widget, even though it worked for other widgets.
After running into both of these problems and spending quite a while trying to solve them, I've been thinking about just doing the camera part of the app in Android/iOS native code (I would do this with OpenCV so that I can have real-time detection). Is there a way I could use platform channels to implement a camera view in Kotlin and Swift and import that into a Flutter widget? Or is there another, easier way to implement this?
For real-time access to the camera image stream, I answered in another question (How to access camera frames in Flutter quickly) that you want to use CameraController#startImageStream:
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
void main() => runApp(MaterialApp(home: _MyHomePage()));
class _MyHomePage extends StatefulWidget {
@override
_MyHomePageState createState() => _MyHomePageState();
}
class _MyHomePageState extends State<_MyHomePage> {
dynamic _scanResults;
CameraController _camera;
bool _isDetecting = false;
CameraLensDirection _direction = CameraLensDirection.back;
@override
void initState() {
super.initState();
_initializeCamera();
}
Future<CameraDescription> _getCamera(CameraLensDirection dir) async {
return await availableCameras().then(
(List<CameraDescription> cameras) => cameras.firstWhere(
(CameraDescription camera) => camera.lensDirection == dir,
),
);
}
void _initializeCamera() async {
_camera = CameraController(
await _getCamera(_direction),
defaultTargetPlatform == TargetPlatform.iOS
? ResolutionPreset.low
: ResolutionPreset.medium,
);
await _camera.initialize();
_camera.startImageStream((CameraImage image) {
if (_isDetecting) return;
_isDetecting = true;
try {
// await doOpenCVDetectionHere(image)
} catch (e) {
// await handleException(e)
} finally {
_isDetecting = false;
}
});
}
@override
Widget build(BuildContext context) {
// Placeholder; build your camera preview here.
return Container();
}
}
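The CameraImage handed to the stream callback arrives as separate YUV planes. As a minimal sketch of the Dart-side packaging (the doOpenCVDetectionHere helper above is hypothetical), you could flatten the planes into one byte buffer before passing them to native detection code:
import 'dart:typed_data';
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';

// Concatenates the YUV planes of a CameraImage into a single contiguous
// buffer. How the native side interprets it (row strides, pixel format)
// is up to your OpenCV code; this only shows the Dart-side packaging.
Uint8List concatenatePlanes(CameraImage image) {
  final WriteBuffer allBytes = WriteBuffer();
  for (Plane plane in image.planes) {
    allBytes.putUint8List(plane.bytes);
  }
  return allBytes.done().buffer.asUint8List();
}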
I did something like this with OpenCV before; my solution was:
Start a new Activity or ViewController on Android and iOS respectively via a platform channel. Example:
class FaceScanPlugin(val activity: Activity) : MethodCallHandler, PluginRegistry.ActivityResultListener {
var result: Result? = null
companion object {
@JvmStatic
fun registerWith(registrar: Registrar): Unit {
val channel = MethodChannel(registrar.messenger(), "com.example.facescan")
val plugin = FaceScanPlugin(registrar.activity())
channel.setMethodCallHandler(plugin)
registrar.addActivityResultListener(plugin)
}
}
override fun onMethodCall(call: MethodCall, result: Result): Unit {
if (call.method.equals("scan")) {
this.result = result
showFaceScanView()
} else {
result.notImplemented()
}
}
private fun showFaceScanView() {
val intent = Intent(activity, FaceScannerActivity::class.java)
activity.startActivityForResult(intent, 100)
}
override fun onActivityResult(code: Int, resultCode: Int, data: Intent?): Boolean {
if (code == 100) {
if (resultCode == Activity.RESULT_OK) {
return true
}
}
return false
}
}
Refer to the Flutter QR scanner plugin for how to navigate to an Android Activity or an iOS view.
Then do your OpenCV real-time face detection via Camera2 and AVFoundation.
Other than that, I suppose you could try out the new AndroidView or UiKitView if you want to embed the Android or iOS view into your Flutter app.
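A minimal Dart-side sketch of that embedding, assuming a hypothetical viewType string 'com.example/nativeCameraView' that you would register with a platform view factory on each side:
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';

// Renders a natively implemented camera view inside the Flutter tree.
// The Android/iOS side must register a factory for this viewType.
Widget buildNativeCameraView() {
  if (defaultTargetPlatform == TargetPlatform.android) {
    return AndroidView(viewType: 'com.example/nativeCameraView');
  }
  return UiKitView(viewType: 'com.example/nativeCameraView');
}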
Evening guys,
I'm looking into building a plugin for Flutter that detects if the device is shaking. Now I've found how to technically do it in Swift (Detect shake gesture iOS Swift), but I'm stuck on how to hook it up as a Flutter plugin, because I don't have direct access to the view controller lifecycle events.
I need a way to hook up:
viewDidLoad
canBecomeFirstResponder
motionEnded
Can anyone nudge me in the right direction?
The Flutter Team has already published a plugin called sensors, which can be used to detect motion from the accelerometer (and gyroscope).
import 'package:sensors/sensors.dart';
accelerometerEvents.listen((AccelerometerEvent event) {
// "calculate" "shakes" here
});
The event contains x, y and z values. Combining this with time will make it possible to check for shakes.
I am just pointing this out because it is far less work than creating a full plugin from scratch.
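For instance, here is a minimal sketch of such a check; the threshold and the debounce interval are arbitrary values you would tune:
import 'dart:math';
import 'package:sensors/sensors.dart';

// Reports a shake when the acceleration magnitude exceeds a threshold,
// debounced so a single shake is not reported repeatedly.
const double shakeThreshold = 25.0; // m/s^2, arbitrary; tune per device.
DateTime _lastShake = DateTime.fromMillisecondsSinceEpoch(0);

void listenForShakes(void Function() onShake) {
  accelerometerEvents.listen((AccelerometerEvent event) {
    final magnitude =
        sqrt(event.x * event.x + event.y * event.y + event.z * event.z);
    final now = DateTime.now();
    if (magnitude > shakeThreshold &&
        now.difference(_lastShake) > const Duration(milliseconds: 500)) {
      _lastShake = now;
      onShake();
    }
  });
}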
You could try this plugin: shake_event
It's pretty simple to work with and works on both iOS and Android.
I ran into the same issue, so I figured Reactive Programming and RxDart could help.
You can create a BLoC (Business Logic Component) called sensor_bloc.dart:
import 'dart:async';
import 'dart:math';
import 'package:rxdart/rxdart.dart';
import 'package:sensors/sensors.dart';
class SensorBloc {
StreamSubscription<dynamic> _accelerometerStream;
//INPUT
final _thresholdController = StreamController<int>();
Sink<int> get threshold => _thresholdController.sink;
// OUTPUT
final _shakeDetector = StreamController<bool>();
Stream<bool> get shakeEvent => _shakeDetector.stream.transform(ThrottleStreamTransformer(Duration(seconds: 2)));
SensorBloc() {
const circularBufferSize = 10;
double detectionThreshold = 70.0;
// Circular buffer of the most recent x-axis readings, plus running min/max.
List<double> circularBuffer = List.filled(circularBufferSize, 0.0);
int index = 0;
double minX = 0.0, maxX = 0.0;
_thresholdController.stream.listen((value) {
// Safety: ignore implausibly low thresholds.
if (value > 30) detectionThreshold = value * 1.0;
});
_accelerometerStream = accelerometerEvents.listen((AccelerometerEvent event) {
index = (index == circularBufferSize - 1) ? 0 : index + 1;
// If the evicted value was an extreme, recompute min/max over the buffer.
var oldX = circularBuffer[index];
if (oldX == maxX) {
maxX = circularBuffer.reduce(max);
}
if (oldX == minX) {
minX = circularBuffer.reduce(min);
}
circularBuffer[index] = event.x;
if (event.x < minX) minX = event.x;
if (event.x > maxX) maxX = event.x;
// A large swing within the buffer window counts as a shake.
if (maxX - minX > detectionThreshold) {
_shakeDetector.add(true);
circularBuffer.fillRange(0, circularBufferSize, 0.0);
minX = 0.0;
maxX = 0.0;
}
});
}
void dispose() {
_shakeDetector.close();
_accelerometerStream.cancel();
_thresholdController.close();
}
}
Then, just subscribe to its events in your widget:
Declare StreamSubscription<bool> shakeSubscriber; in your state, and hook into the lifecycle events
(NB: I use an InheritedWidget that gives me access to the umbrella BLoC via the static function MainWidget.bloc(context)):
StreamSubscription<bool> shakeSubscriber;
@override
Widget build(BuildContext context) {
if (shakeSubscriber == null) {
shakeSubscriber = MainWidget.bloc(context).sensorBloc.shakeEvent.listen((_) {
print("SHAKE ! *************************");
});
}
return _buildMainScaffold();
}
@override
void dispose() {
if (shakeSubscriber != null) shakeSubscriber.cancel();
super.dispose();
}
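The threshold sink from the BLoC lets you tune sensitivity at runtime, e.g. from a settings screen (again via the MainWidget.bloc accessor shown above):
// Somewhere in a widget with access to the BLoC, e.g. a settings handler:
void raiseThreshold(BuildContext context) {
  // Values of 30 or below are ignored by the BLoC's safety check.
  MainWidget.bloc(context).sensorBloc.threshold.add(80);
}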
I'm using FirebaseMessaging to send push notifications to my Flutter app. Those notifications contain chat details.
If the app is currently active and the user is at the chat page, the page should be updated to show the new message (this I can handle).
If the app is on any other page, a local notification/toast should be shown.
My problem: how do I forward the notification to the chat page?
I have FirebaseMessaging listening on the root page. And I can use ModalRoute.of(context).isCurrent to determine if the root page is the current page when the notification comes in. How can I broadcast the notification to the chat page when it is the active page?
In Swift, I'd use NotificationCenter to send data from the app delegate and the chat page would listen for it. I'm hoping something similar is available for Flutter.
Try this: dart-event-bus.
An Event Bus using Dart Streams for decoupling applications
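A minimal sketch of that pattern, assuming the event_bus package and a hypothetical ChatMessageEvent class:
import 'package:event_bus/event_bus.dart';

// One shared bus; subscribers filter by event type.
final EventBus eventBus = EventBus();

class ChatMessageEvent {
  final String messageID;
  ChatMessageEvent(this.messageID);
}

// Publisher side, e.g. the FirebaseMessaging onMessage handler:
void publishChat(String messageID) {
  eventBus.fire(ChatMessageEvent(messageID));
}

// Subscriber side, e.g. the chat page's initState:
void listenForChat() {
  eventBus.on<ChatMessageEvent>().listen((event) {
    print('new chat message: ${event.messageID}');
  });
}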
I've found a solution and I'm hoping it can help someone else in a similar situation.
I found the package Dart Message Bus
It does everything I need and makes handling Streams much easier.
I did have to add one additional method (see the end).
Since the instructions were a bit cryptic, here's how I got it working.
//main.dart
import 'package:dart_message_bus/dart_message_bus.dart';
//global variable
final globalBus = new MessageBus();
class _MyHomePageState extends State<MyHomePage> {
// set up firebase messaging here
@override
void initState() {
super.initState();
_setupPushNotifs();
}
_setupPushNotifs() {
_firebaseMessaging.configure(
onMessage: (Map<String, dynamic> message) async {
_processChatPush(message);
},
);
}
_processChatPush(Map<String, dynamic> message) async {
String messageID = message['messageID'];
globalBus.publish(new Message('newChat', data: "$messageID"));
}
}
//chat.dart
import 'package:dart_message_bus/dart_message_bus.dart';
import 'main.dart';
class _ChatPageState extends State<ChatPage> {
StreamSubscription streamListener;
@override
void initState() {
super.initState();
_listen();
}
@override
void dispose() {
streamListener.cancel();
streamListener = null;
super.dispose();
}
_listen() async {
streamListener = globalBus.subscribe('newChat', (Message m) async {
// m.data is the messageID string published from main.dart.
String messageID = m.data;
});
}
The dispose method is very important or the listener will keep listening and cause problems.
If you need verification that a subscriber is actually listening, modify the calling and listen methods:
// main.dart
_processChatPush(Map<String, dynamic> message) async {
String messageID = message['messageID'];
var callbackMessage = await globalBus.publish(
new Message('newChat', data: "$messageID"),
waitForKey: 'ackChat',
timeout: const Duration(seconds: 2)
);
if (callbackMessage.isEmpty) {
// The other service's message was not received
// and a timeout occurred.
// Process the push notification here.
} else {
// The callback from the other service was received
// and callbackMessage.data contains the callback data.
// The push notification has been handled by chat.dart.
}
}
// chat.dart
_listen() async {
streamListener = globalBus.subscribe('newChat', (Message m) async {
String messageID = m.data;
var data = "ack";
var ack = new Message('ackChat', data: data);
globalBus.publish(ack);
});
}
I had to add one additional method in order to close the publishing stream when it's no longer needed.
Add to the end of the MessageBus class in message_bus.dart in the package source:
close() {
_streamController.close();
_streamController = null;
}
and then you can dispose the stream:
void dispose() {
globalBus.close();
super.dispose();
}
I ended up putting the globalBus variable in a library file, then importing that library in main.dart and chat.dart and removing the import of main.dart from chat.dart.
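For example, a minimal sketch of that library file (the name globals.dart is my choice, not from the package):
// globals.dart
import 'package:dart_message_bus/dart_message_bus.dart';

// The single shared bus; main.dart and chat.dart both import this file.
final globalBus = new MessageBus();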
I'm new to Dart/Flutter, and I need to write an app that can pick the user's location in the background after, say, an hour.
I have tried working with the geolocation plugin and the Timer class, which worked fine but only picks the location once.
I need to know if there is a service workaround or the best way to go about this.
Thank you.
class HomePage extends StatefulWidget{
static const timeout = const Duration(seconds: 5);
@override
State createState() => new HomePageState();
HomePage(){
startTimeout();
}
startTimeout() async {
final GeolocationResult result = await
Geolocation.isLocationOperational();
if(result.isSuccessful) {
return new Timer(timeout, handleTimeout);
}
else {
debugPrint(result.error.toString());
GeolocationResult res = await
Geolocation.requestLocationPermission(const LocationPermission(
android: LocationPermissionAndroid.fine,
ios: LocationPermissionIOS.always,
));
if(res.isSuccessful) {
new Timer.periodic(timeout, (Timer t) => handleTimeout());
} else {
debugPrint(res.error.toString());
}
}
}
void handleTimeout() {
debugPrint("Print after 5 seconds");
Geolocation.currentLocation(accuracy:
LocationAccuracy.best).listen((result) {
if(result.isSuccessful) {
var posts = new postservice.Post();
var userid=100;
posts.sendLocation(result.location.latitude,result.location.longitude,userid);
}
});
}
}
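Note that the success path above returns a one-shot Timer, which is why the location is only picked once; Timer.periodic repeats until cancelled. A minimal sketch of the difference:
import 'dart:async';

void main() {
  // Fires handleTick exactly once, five seconds from now.
  Timer(const Duration(seconds: 5), handleTick);
  // Fires handleTick every five seconds until the timer is cancelled.
  Timer.periodic(const Duration(seconds: 5), (Timer t) => handleTick());
}

void handleTick() => print('tick');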
I'm new to both Flutter and Dart, and I'm trying to use the Camera Plugin to understand how things work. All examples I find have this part:
List<CameraDescription> cameras;
Future<Null> main() async {
cameras = await availableCameras();
runApp(new CameraApp());
}
Is there some way I could do this inside the initState() method? I guess this is also a more general question regarding async work required before the initState method is run (as the initState method cannot be async).
My goal is to create a StatefulWidget containing a feed from the camera, that is used from another file. Here's what I have so far. Any help appreciated!
List<CameraDescription> cameras;
@override
void initState() {
super.initState();
getCameras();
controller = new CameraController(cameras[0], ResolutionPreset.medium);
controller.initialize().then( (_) {
if (!mounted) {
return;
}
setState(() {});
});
}
Future<Null> getCameras() async {
cameras = await availableCameras();
}
You can't do async work in initState, but you can kick off async work in other functions and then signal completion with a setState call. Using await, you can make sure the cameras and controller are set up in the correct order. Calling setState at the end makes sure the widget gets rebuilt, at which point you can pass your initialized camera controller wherever you want.
class _CameraState extends State<CameraWidget> {
List<CameraDescription> cameras;
CameraController controller;
bool _isReady = false;
@override
void initState() {
super.initState();
_setupCameras();
}
Future<void> _setupCameras() async {
try {
// initialize cameras.
cameras = await availableCameras();
// initialize camera controllers.
controller = new CameraController(cameras[0], ResolutionPreset.medium);
await controller.initialize();
} on CameraException catch (_) {
// do something on error.
}
if (!mounted) return;
setState(() {
_isReady = true;
});
}
@override
Widget build(BuildContext context) {
if (!_isReady) return new Container();
return ...
}
}
You also want to make sure you handle any errors; the package includes a CameraException, which is thrown when the platform-specific code fails.
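For example, a minimal sketch of handling it; CameraException exposes a code and a description you can log:
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';

Future<void> initWithErrorHandling(CameraController controller) async {
  try {
    await controller.initialize();
  } on CameraException catch (e) {
    // e.code and e.description identify the platform-side failure.
    debugPrint('Camera error ${e.code}: ${e.description}');
  }
}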
I am trying to scan a QR code with my code. It runs fine on OS 5.0 (Bold) and OS 7.1 (Torch) phones, but gives a problem on an OS 6.0 phone (Bold 9700): the app scans the QR code, but the camera screen doesn't pop and remains in front, and it can't even be hidden with the Esc key. Please help me resolve the issue with OS 6.
Edit:
Code while opening camera screen for QR code scan:
Hashtable hints = new Hashtable();
// The first thing going in is a list of formats. We could look for
// more than one at a time, but it's much slower.
Vector formats = new Vector();
formats.addElement(BarcodeFormat.QR_CODE);
hints.put(DecodeHintType.POSSIBLE_FORMATS, formats);
// We will also use the "TRY_HARDER" flag to make sure we get an
// accurate scan
hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
// We create a new decoder using those hints
BarcodeDecoder decoder = new BarcodeDecoder(hints);
// Finally we can create the actual scanner with a decoder and a
// listener that will handle the data stored in the QR code. We put
// that in our view screen to handle the display.
try {
_scanner = new BarcodeScanner(decoder, new LeadQRcodeDecoderListener());
_QRcodeScreen = new LeadQRcodeScannerViewScreen(_scanner);
// If we get here, all the QR code scanning infrastructure should be set
// up, so all we have to do is start the scan and display the viewfinder
_scanner.startScan();
UiApplication.getUiApplication().pushScreen(_QRcodeScreen);
}
catch (Exception e) {
e.printStackTrace();
return;
}
The code for closing the screen is:
UiApplication.getUiApplication().invokeLater(new Runnable() {
public void run() {
UiApplication.getUiApplication().popScreen(_QRcodeScreen);
}
});
I call this code after the QR code has been scanned.
This is a problem with OS 6 on some devices that has been asked about before on this site. The last one was two days ago:
Blackberry OS6 camera wont shut down after capture
AFAIK there's no API to close the camera app, so it has to be done with key-injection hacks, which are tricky because they need accurate timing, because CPUs differ between models, and because the camera app has a different design in some OS versions.
So either use JSR-135 and a renamed ZXing package to provide a camera view contained in your app, or follow your approach but, instead of closing the camera app, just bring your own app to the foreground.
I have solved the same issue for OS 6. After scanning the QR code, close all player and scanner connections. You can use:
if (_scanner != null && _scanner.getPlayer() != null) {
_scanner.getPlayer().close();
}
It was helpful to me.
This will definitely help you.
Here is my code; it's working perfectly on an OS 6.0 device (9830).
/**
* First Invoke the QR Scanner
*/
ViewFinderScreen _viewFinderScreen =
new ViewFinderScreen(ShoopingCartScreen.this); // ShoopingCartScreen.this Current Screen Object
UiApplication.getUiApplication().pushScreen(_viewFinderScreen);
package com.application.qrScanner;
import java.util.Hashtable;
import java.util.Vector;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;
import net.rim.device.api.barcodelib.BarcodeDecoder;
import net.rim.device.api.barcodelib.BarcodeDecoderListener;
import net.rim.device.api.barcodelib.BarcodeScanner;
import net.rim.device.api.io.Base64InputStream;
import net.rim.device.api.ui.Field;
import net.rim.device.api.ui.FieldChangeListener;
import net.rim.device.api.ui.Keypad;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.ButtonField;
import net.rim.device.api.ui.container.MainScreen;
import com.application.log.Log;
import com.application.main.MessageScreen;
import com.application.main.shoopingSection.ShoopingCartScreen;
import com.google.zxing.BarcodeFormat;
import com.google.zxing.DecodeHintType;
public class ViewFinderScreen extends MainScreen
{
private BarcodeScanner _scanner;
private short _frequency = 1046;
private short _duration = 200;
private int _volume = 100;
private VideoControl vc;
private ButtonField _btnCancel;
private ShoopingCartScreen _shoopingCartScreen;
/**
* Creates a new ViewFinderScreen object
*/
public ViewFinderScreen(ShoopingCartScreen _shoopingCartScreen)
{
this._shoopingCartScreen = _shoopingCartScreen;
_btnCancel = new ButtonField("Cancel" , ButtonField.USE_ALL_WIDTH)
{
protected boolean navigationClick(int status, int time)
{
fieldChangeNotify(1);
return true;
}
};
_btnCancel.setChangeListener(new FieldChangeListener()
{
public void fieldChanged(Field field, int context)
{
stopScan();
UiApplication.getUiApplication().popScreen(ViewFinderScreen.this);
}
});
// Initialize Hashtable used to inform the scanner how to
// recognize the QR code format.
Hashtable hints = new Hashtable();
Vector formats = new Vector(1);
formats.addElement(BarcodeFormat.QR_CODE);
hints.put(DecodeHintType.POSSIBLE_FORMATS, formats);
// Initialize the BarcodeDecoder
BarcodeDecoder decoder = new BarcodeDecoder(hints);
// Create a custom instance of a BarcodeDecoderListener to pop the
// screen and display results when a QR code is recognized.
BarcodeDecoderListener decoderListener = new BarcodeDecoderListener()
{
/**
* @see BarcodeDecoderListener#barcodeDecoded(String)
*/
public void barcodeDecoded(String rawText)
{
try {
String encoded = rawText;
byte[] decoded = Base64InputStream.decode( encoded );
rawText = new String(decoded);
System.out.println( new String( decoded ) );
}
catch (Throwable t) {
System.out.println( "Unable to decode string: " + t.getMessage() );
}
displayMessage(rawText);
ViewFinderScreen.this._shoopingCartScreen.beep();
}
};
try
{
// Initialize the BarcodeScanner object and add the associated
// view finder.
_scanner = new BarcodeScanner(decoder, decoderListener);
vc = _scanner.getVideoControl();
vc.setDisplayFullScreen(true);
add(_scanner.getViewfinder());
setStatus(_btnCancel);
}
catch(Exception e)
{
displayMessage("Initilize Scanner: " + e.getMessage());
}
startScan();
}
/**
* Informs the BarcodeScanner that it should begin scanning for QR Codes
*/
public void startScan()
{
try
{
_scanner.startScan();
}
catch(MediaException me)
{
displayMessage(" Start Scan Error: " + me.getMessage());
}
}
public void stopScan()
{
try
{
Player p = _scanner.getPlayer() ;
if(p != null)
{
p.stop();
p.deallocate();
p.close();
}
}
catch (Exception e)
{
MessageScreen.msgDialog("Exception in Stop Scanning "+e.toString());
}
}
/**
* Logs the decoded text and stops the scan on the event thread
*
* @param text Text decoded from the QR code
*/
private void displayMessage(final String text)
{
Log.d("QR Code String ", text);
UiApplication.getUiApplication().invokeLater(new Runnable()
{
public void run()
{
stopScan();
}
});
}
protected boolean keyDown(int keycode, int time)
{
if (Keypad.key(keycode) == Keypad.KEY_ESCAPE)
{
stopScan();
return true;
}
return super.keyDown(keycode, time);
}
}