I've searched for and tried these plugins:
qr_bar_code_scanner_dialog: ^0.0.5
ai_barcode_scanner: ^0.0.1+1
But none of them worked when I opened my web app in my phone's browser. The screen just turns black when I press the button that should start the scanner.
I've seen that many others have this problem, so I hope there is a solution out there that works for me.
The whole reason I use Flutter is that it seems to be the easiest way to develop an app for both Android and iOS, alongside the web.
In debug mode everything works fine, in both the desktop and the phone browser.
But it stops working once the app is deployed with Firebase Hosting and opened in the phone's browser.
IconButton(
  onPressed: () {
    // scanQR();
    _qrBarCodeScannerDialogPlugin.getScannedQrBarCode(
      context: context,
      onCode: (code) {
        print(code);
        setState(() {
          this.code = code;
        });
      },
    );
  },
  icon: const Icon(Icons.qr_code_scanner),
  iconSize: 130,
  tooltip: 'Scan',
  color: const Color.fromRGBO(28, 37, 44, 1),
),
You can combine two plugins, camera and flutter_barcode_sdk, to implement a web QR scanner app.
flutter_barcode_sdk: provides an API to decode barcodes and QR codes from a byte array, supporting Windows, Linux, macOS, Android, iOS, and the web:
List<BarcodeResult> results = await _barcodeReader.decodeImageBuffer(
    byteData.buffer.asUint8List(),
    image.width,
    image.height,
    byteData.lengthInBytes ~/ image.height,
    ImagePixelFormat.IPF_ARGB_8888.index);
camera: the official camera plugin, which can return camera frames in real time:
await _controller.startImageStream((CameraImage availableImage) async {
  int format = ImagePixelFormat.IPF_NV21.index;
  switch (availableImage.format.group) {
    case ImageFormatGroup.yuv420:
      format = ImagePixelFormat.IPF_NV21.index;
      break;
    case ImageFormatGroup.bgra8888:
      format = ImagePixelFormat.IPF_ARGB_8888.index;
      break;
    default:
      format = ImagePixelFormat.IPF_RGB_888.index;
  }
  _barcodeReader
      .decodeImageBuffer(
          availableImage.planes[0].bytes,
          availableImage.width,
          availableImage.height,
          availableImage.planes[0].bytesPerRow,
          format)
      .then((results) {
    // TODO: handle the decoded results
  }).catchError((error) {
    // TODO: handle decoding errors
  });
});
Related
I'm using the multi_image_picker2 library. It works fine; however, there is no option to access the camera or photo library (iOS-specific here). Here is a snippet of my code:
import 'dart:typed_data';

import 'package:multi_image_picker2/multi_image_picker2.dart';

class ImagePickerFacade {
  Future<List<Asset>?> loadImage() async {
    try {
      var resultList = await MultiImagePicker.pickImages(
        maxImages: 300,
        enableCamera: true,
        cupertinoOptions: const CupertinoOptions(takePhotoIcon: "chat"),
      );
      if (resultList.isEmpty) {
        return null;
      } else {
        return resultList;
      }
    } on NoImagesSelectedException catch (e) {
      // User pressed cancel; update the UI or show an alert.
      print(e);
      return null;
    } on Exception catch (e) {
      // Handle any other error.
      print(e);
      return null;
    }
  }
}
I have also included the following keys in my Info.plist file:
<key>NSCameraUsageDescription</key>
<string>App requires access to your phone camera.</string>
<key>NSMicrophoneUsageDescription</key>
<string>App requires access to your phone audio.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>App requires access to your phone photo library.</string>
I have tested my app on iOS 9, iOS 10, and iOS 13, and the problem persists. I also tried entering the keys above through Xcode instead of editing the Info.plist file manually, but nothing seems to work. This is what I currently see when the image picker pops up:
Here is a link to the package I am using: https://pub.dev/packages/multi_image_picker2
Cheers
You can use the file_picker package for picking multiple images, videos, or any other file type: https://pub.dev/packages/file_picker
Define a list to hold the picked files:
List<PlatformFile> images = [];
When you press the button, the gallery opens and you can choose an image, video, or file (with allowMultiple you can pick several files at once).
RaisedButton(
  color: Colors.red,
  shape: RoundedRectangleBorder(
      borderRadius: BorderRadius.circular(5.0),
      side: BorderSide(color: Colors.yellow)),
  onPressed: () async {
    // pickFiles returns null if the user cancels the dialog.
    FilePickerResult? result = await FilePicker.platform.pickFiles(
      allowMultiple: true,
      allowCompression: true,
      type: FileType.any,
    );
    if (result != null) {
      setState(() {
        model.images = result.files;
      });
    }
  },
  child: Text(
    "Import Image/Video",
    style: TextStyle(color: Colors.white, fontWeight: FontWeight.w500),
  ),
)
The problem is that accessing the camera does not work on iPhone, while on Android it does. I'm building the app with Flutter and trying to make it work on both platforms. Every time I press the camera button, the app exits by itself. I'm trying to send the picture to the Firebase database, but this way I can't access the camera on iOS.
I tried removing the photo library functionality, but that did not help. The camera was working before. The camera dependency is in pubspec.yaml.
Future getImage() async {
  var tempimage = await ImagePicker.pickImage(source: ImageSource.camera);
  setState(() {
    sampleimage = tempimage;
  });
}

Future getLibraryImage() async {
  var libraryimage = await ImagePicker.pickImage(source: ImageSource.gallery);
  setState(() {
    sampleimage = libraryimage;
  });
}
shape: RoundedRectangleBorder(borderRadius: BorderRadius.circular(9.0)),
onPressed: () { getImage(); },
child: Row(
  children: [
    Text(" Camera "),
    Image.asset("images/camera.png", height: 15.0),
  ],
),
You have to ask the user for permission to use the camera.
To do that, add these lines to the Info.plist file in ios/Runner:
<key>NSPhotoLibraryUsageDescription</key>
<string>Using the library for ...</string>
<key>NSMicrophoneUsageDescription</key>
<string>Using the mic for ...</string>
<key>NSCameraUsageDescription</key>
<string>Using the camera for ...</string>
UPDATE: Problem solved with the newly released Google Maps API 3.35.
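For anyone hitting the same crash: a specific version of the Maps JavaScript API can be requested through the loader URL's `v` parameter (YOUR_API_KEY below is a placeholder):

```html
<!-- Pin the Maps JavaScript API to a fixed release via the v parameter. -->
<script src="https://maps.googleapis.com/maps/api/js?v=3.35&key=YOUR_API_KEY"></script>
```

Note that, as described below, Google eventually removes retired versions, so pinning is only a temporary safeguard.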
Problem:
When trying to pinch (or double-tap quickly) to zoom in, the app may crash if there is a marker on the map view.
It didn't happen until Google released JS API 3.32, which uses a new map renderer. We used to force version 3.31, but it was removed in August.
Using:
Phonegap: Phonegap cli-7.1.0, Cordova iOS 4.5.4
iOS: 11, 12 (possibly all iOS versions)
Devices: iPhone 8, iPhone X, iPad Pro (possibly all Apple devices)
GoogleMaps JavaScript API: 3.32 - 3.34
React: 16.2.0
Google Maps Implementation (tried in 4 ways):
react-google-maps: crash
google-map-react: crash
normal js implementation: not crash
var marker1 = new window.google.maps.Marker({
  position: { lat: 39.304, lng: -76.617 },
  map: this.map
});
var marker2 = new window.google.maps.Marker({
  position: { lat: 39.305, lng: -76.616 },
  map: this.map
});
a React component wrapping the normal JS implementation: crash
My Marker Component:
class Marker extends Component {
  componentDidMount() {
    const { position, icon } = this.props;
    this.marker = new window.google.maps.Marker({ position, icon });
  }

  componentWillReceiveProps(newProps) {
    const { map, position } = newProps;
    if (map && !this.marker.map) {
      this.marker.setMap(map);
    }
    if (position.lat !== this.props.position.lat ||
        position.lng !== this.props.position.lng) {
      this.marker.setPosition(position);
    }
  }

  componentWillUnmount() {
    this.marker.setMap(null);
  }

  render() {
    return false;
  }
}
How I use this Component:
{_.map(studyAreas, (s, i) => (
  <Marker
    icon={s.icon}
    position={s.position}
    map={this.map}
    key={i}
  />
))}
Relevant Links
IOS : Cordova App Crash on Google Map api zoom in ios 11.3 iphone x
https://github.com/ionic-team/cordova-plugin-ionic-webview/issues/75
In the discussion of this GitHub issue, it seems they had this crashing problem because of some JS libraries. However, I'm having the problem even with a simple new PhoneGap app that doesn't reference any extra libraries.
Any help, discussion, thoughts would be appreciated, thanks!
I need a Cordova plugin or function that can stream video from the device camera on iOS. getUserMedia is not supported on iOS.
I tried the Crosswalk project, which should support WebRTC (getUserMedia), but it doesn't work with PhoneGap Build.
So I tried cordova-camera-preview, but I am not able to make it fullscreen because the camera stream is shrunk.
Has anybody used fullscreen video in a Cordova application?
<script>
  document.addEventListener("deviceready", function () {
    app_ready();
  }, false);

  function app_ready() {
    var vw = window.innerWidth;
    var vh = window.innerHeight;
    var tapEnabled = true;   // enable tap to take a picture
    var dragEnabled = true;  // enable dragging the preview box across the screen
    var toBack = true;       // send the preview box behind the webview
    var rect = { x: 0, y: 0, width: vw, height: vh };
    cordova.plugins.camerapreview.startCamera(rect, "front", tapEnabled, dragEnabled, toBack);
  }
</script>
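One thing worth checking with the code above: when toBack is true, the preview is rendered behind the webview, so if the page background is opaque the stream will be partly or fully hidden. The camera-preview plugin's documentation suggests making the page transparent; a minimal sketch (the exact selectors depend on your app's markup):

```css
/* Let the camera preview behind the webview show through. */
html, body {
  background-color: transparent !important;
}
```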
After some deep searching I found an augmented-reality plugin called "ezAR videooverlay"; check it out on npm: https://www.npmjs.com/package/com.ezartech.ezar.videooverlay
Hope it helps someone.
It's driving me crazy that this is not working; it should be very simple.
I added the plugin using the CLI command. I'm sure it's a beginner question, but I just don't get it.
I have a button called "prendrephoto1" that should take a picture when clicked. But when I click the button, nothing happens in the Xcode iOS Simulator.
I'm using jQuery Mobile and Cordova.
$(document).ready(function () {
  var photo1;
  var photo2;

  $("#prendrephoto1").bind("click", prendrephoto);

  function prendrephoto() {
    navigator.camera.getPicture(onSuccess, onFail, {
      quality: 50,
      destinationType: Camera.DestinationType.DATA_URL
    });

    function onSuccess(imageData) {
      var image = document.getElementById('myImage');
      image.src = "data:image/jpeg;base64," + imageData;
    }

    function onFail(message) {
      alert('ca ne marche pas'); // "it doesn't work"
      alert('Failed because: ' + message);
    }
  }
  // LAST JQUERY TAG
});
The iOS Simulator does not simulate camera capture; you'll need to run your app on an actual device to retrieve a picture from the camera.
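For development, you can still exercise the flow in the simulator by falling back to the photo library there. A sketch, assuming cordova-plugin-device is also installed (its `device.isVirtual` flag reports whether the app is running in a simulator); the `pickSourceType` helper is hypothetical, not part of any plugin:

```javascript
// Pick the Cordova picture source: the iOS Simulator has no camera,
// so fall back to the photo library when running on a virtual device.
// 'sourceTypes' is expected to be Camera.PictureSourceType at runtime.
function pickSourceType(isVirtual, sourceTypes) {
  return isVirtual ? sourceTypes.PHOTOLIBRARY : sourceTypes.CAMERA;
}

// Inside a deviceready handler this could be used as:
// navigator.camera.getPicture(onSuccess, onFail, {
//   quality: 50,
//   sourceType: pickSourceType(window.device && window.device.isVirtual,
//                              Camera.PictureSourceType),
//   destinationType: Camera.DestinationType.DATA_URL
// });
```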