The problem is that accessing the camera on iPhone does not work, while accessing it on Android does. I'm building the app with Flutter and trying to make it work on both platforms. Every time I press the camera button, the app exits by itself. I'm trying to send the picture to the Firebase database, but doing it this way won't let me access the camera functionality on iOS.
I have tried removing the photo library functionality, but that did not work. The camera was working before. The camera dependency is declared in pubspec.yaml.
Future getImage() async {
  var tempimage = await ImagePicker.pickImage(source: ImageSource.camera);
  setState(() {
    sampleimage = tempimage;
  });
}

Future getLibraryImage() async {
  var libraryimage = await ImagePicker.pickImage(source: ImageSource.gallery);
  setState(() {
    sampleimage = libraryimage;
  });
}
shape: RoundedRectangleBorder(borderRadius: BorderRadius.circular(9.0)),
onPressed: () { getImage(); },
child: Row(
  children: [
    Text(" Camera "),
    Image.asset("images/camera.png", height: 15.0),
  ],
),
),
),
),
You have to ask the user for permission to use the camera.
To do that, add these lines to the Info.plist file in ios/Runner:
<key>NSPhotoLibraryUsageDescription</key>
<string>Using the library for ...</string>
<key>NSMicrophoneUsageDescription</key>
<string>Using the mic for ...</string>
<key>NSCameraUsageDescription</key>
<string>Using the camera for ...</string>
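If you also want to check the camera permission at runtime before opening the picker (this is optional and an assumption on my part; it is not required by the Info.plist fix above), a minimal sketch using the permission_handler package could look like this:
import 'package:image_picker/image_picker.dart';
import 'package:permission_handler/permission_handler.dart';

// Hypothetical guarded version of getImage(): request the camera permission
// first, and only open the camera if the user granted it.
Future<void> getImageGuarded() async {
  final status = await Permission.camera.request();
  if (!status.isGranted) {
    // Denied (or permanently denied) - show a message instead of crashing.
    return;
  }
  var tempimage = await ImagePicker.pickImage(source: ImageSource.camera);
  // ...use tempimage exactly as in the original getImage()
}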
I've searched and tried these plugins:
qr_bar_code_scanner_dialog: ^0.0.5
ai_barcode_scanner: ^0.0.1+1
But none of them worked when I opened my web app in my phone's browser. The screen turns black when I press the button that should start the scanner.
I've seen that many others have problems with this, so I hope there is a solution out there that works for me.
The whole reason I use Flutter is that it seems to be the easiest way to develop an app for both Android and iOS by targeting the web.
In debug mode everything works fine, both in the desktop browser and on the phone.
But not when I have deployed my app with Firebase Hosting and use the phone's browser.
IconButton(
  onPressed: () {
    //scanQR();
    _qrBarCodeScannerDialogPlugin.getScannedQrBarCode(
      context: context,
      onCode: (code) {
        print(code);
        setState(() {
          this.code = code;
        });
      },
    );
  },
  icon: const Icon(Icons.qr_code_scanner),
  iconSize: 130,
  tooltip: 'Scan',
  color: const Color.fromRGBO(28, 37, 44, 1),
),
You can combine two plugins - camera and flutter_barcode_sdk - to implement a web QR scanner app.
flutter_barcode_sdk: provides an API to decode barcodes and QR codes from a byte array, supporting Windows, Linux, macOS, Android, iOS and Web:
List<BarcodeResult> results = await _barcodeReader.decodeImageBuffer(
    byteData.buffer.asUint8List(),
    image.width,
    image.height,
    byteData.lengthInBytes ~/ image.height,
    ImagePixelFormat.IPF_ARGB_8888.index);
camera: the official camera plugin that can return camera frames in real time:
await _controller.startImageStream((CameraImage availableImage) async {
  int format = ImagePixelFormat.IPF_NV21.index;

  switch (availableImage.format.group) {
    case ImageFormatGroup.yuv420:
      format = ImagePixelFormat.IPF_NV21.index;
      break;
    case ImageFormatGroup.bgra8888:
      format = ImagePixelFormat.IPF_ARGB_8888.index;
      break;
    default:
      format = ImagePixelFormat.IPF_RGB_888.index;
  }

  _barcodeReader
      .decodeImageBuffer(
          availableImage.planes[0].bytes,
          availableImage.width,
          availableImage.height,
          availableImage.planes[0].bytesPerRow,
          format)
      .then((results) {
    // TODO
  }).catchError((error) {
  });
});
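For context, both snippets assume a _controller (the camera plugin's CameraController) and a _barcodeReader already exist. A rough setup sketch follows; the FlutterBarcodeSdk class and the setLicense call are taken from the package docs and should be treated as assumptions here, as is the placeholder license key:
import 'package:camera/camera.dart';
import 'package:flutter_barcode_sdk/flutter_barcode_sdk.dart';

late CameraController _controller;
late FlutterBarcodeSdk _barcodeReader;

Future<void> initScanner() async {
  // Use the first available camera (usually the back camera on a phone).
  final cameras = await availableCameras();
  _controller = CameraController(cameras.first, ResolutionPreset.medium);
  await _controller.initialize();

  // Create the barcode reader; the SDK typically requires a license key.
  _barcodeReader = FlutterBarcodeSdk();
  await _barcodeReader.setLicense('YOUR-LICENSE-KEY'); // placeholder
}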
I'm using the multi_image_picker2 library. It is functioning fine; however, there is no option to access the camera or photo library (iOS specific here). Here is a snippet of my code:
import 'dart:typed_data';

import 'package:multi_image_picker2/multi_image_picker2.dart';

class ImagePickerFacade {
  Future<List<Asset>?> loadImage() async {
    try {
      var resultList = await MultiImagePicker.pickImages(
        maxImages: 300,
        enableCamera: true,
        cupertinoOptions: const CupertinoOptions(takePhotoIcon: "chat"),
      );

      if (resultList.isEmpty) {
        return null;
      } else {
        return resultList;
      }
    } on NoImagesSelectedException catch (e) {
      // User pressed cancel, update ui or show alert
      print(e);
    } on Exception catch (e) {
      // Do something
      print(e);
    }
  }
}
I have also included the following keys in my Info.plist file:
<key>NSCameraUsageDescription</key>
<string>App requires access to your phone camera.</string>
<key>NSMicrophoneUsageDescription</key>
<string>App requires access to your phone audio.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>App requires access to your phone photo library.</string>
I have tested my app on iOS 9, iOS 10 and iOS 13, and the problem still persists. I also tried entering the keys above through Xcode instead of manually through the Info.plist file; however, nothing seems to work. This is what I currently see when the image picker pops up:
Here is a link to the package I am using: https://pub.dev/packages/multi_image_picker2
Cheers
You can use this package for picking multiple images, videos or any other file type: https://pub.dev/packages/file_picker
Define a variable to hold the picked images:
List<PlatformFile> images = [];
When you press the button, the gallery will open and you can choose an image, video or file (if you want, you can pick multiple files).
RaisedButton(
  color: Colors.red,
  shape: RoundedRectangleBorder(
      borderRadius: BorderRadius.circular(5.0),
      side: BorderSide(color: Colors.yellow)),
  onPressed: () async {
    FilePickerResult result = await FilePicker.platform.pickFiles(
      allowMultiple: true,
      allowCompression: true,
      type: FileType.any,
    );
    setState(() {
      images = result.files;
    });
  },
  child: Text(
    "Import Image/Video",
    style: TextStyle(color: Colors.white, fontWeight: FontWeight.w500),
  ),
)
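If you then want to display one of the picked images, PlatformFile exposes the file path on mobile and desktop (on web the path is null and you would use its bytes with Image.memory instead). A small sketch, assuming a non-web platform:
import 'dart:io';

import 'package:file_picker/file_picker.dart';
import 'package:flutter/material.dart';

Widget buildPreview(List<PlatformFile> images) {
  if (images.isEmpty) return const SizedBox.shrink();
  // PlatformFile.path is non-null on mobile/desktop, so Image.file works here.
  return Image.file(File(images.first.path!), height: 120, fit: BoxFit.cover);
}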
I have a grid of images and I want to open a simple dialog when I long-press an image, and have it close automatically when my finger no longer touches the screen (like Instagram's quick image preview).
I attached a long-press handler to all the images and it works fine, so a dialog opens up when I long-press an image. However, when I lift my finger nothing happens, even though I attached events like onTapUp, onLongPressEnd and onPointerUp. Because of the newly opened dialog, all of those events are lost and no longer fire.
I tried adding the pointer-up events to the opened dialog instead, but there is a catch: I must tap and release again to make it work, because Flutter is unable to recognize that my finger is already in contact with the screen; the opened dialog caused Flutter to forget about this fact.
You can insert an OverlayEntry into the Overlay stack by using Overlay.of(context).insert(overlayEntry).
In this overlay, you can catch gestures when required and take actions accordingly. As overlays always sit on top of anything else, the dialog will not cancel your long press gesture and you will be able to respond to longPressEnd.
You will only need to calculate which image has been pressed, or use the Offsets provided by onTapDown and the positions of the images.
To get the global position of your images, you can assign GlobalKey's to your images and get their global positions in the following way:
final RenderBox renderBox = globalKey.currentContext.findRenderObject() as RenderBox;
final Offset position = renderBox.localToGlobal(Offset.zero);
final Size size = renderBox.size;
To get the position of your long press, you will need to store the position of onTapDown:
onTapDown: (details) => position = details.globalPosition
Now you have everything you need to figure out which bounds the long press happened in.
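Putting those pieces together, a small sketch of the hit test (imageKeys is a placeholder name for however you store the GlobalKeys of your images):
import 'package:flutter/widgets.dart';

/// Returns the index of the image whose bounds contain [position],
/// or null if the long press landed outside every image.
int? hitTestImages(List<GlobalKey> imageKeys, Offset position) {
  for (var i = 0; i < imageKeys.length; i++) {
    final renderBox =
        imageKeys[i].currentContext?.findRenderObject() as RenderBox?;
    if (renderBox == null) continue;
    final rect = renderBox.localToGlobal(Offset.zero) & renderBox.size;
    if (rect.contains(position)) return i;
  }
  return null;
}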
I found a way to make it work. It can be done with the Overlay widget.
In the widget with the GestureDetector, when onLongPress is called, create an OverlayEntry object with your dialog and insert it into the Overlay.
When onLongPressEnd is called, call the remove function of the OverlayEntry object.
// Implement a function to create an OverlayEntry
OverlayEntry getMyOverlayEntry({
  @required BuildContext context,
  SomeData someData,
}) {
  return OverlayEntry(
    builder: (context) {
      return AlertDialog(child: SomeWidgetAgain());
    },
  );
}

// In the widget where you want to support the long-press feature
OverlayEntry myOverlayEntry;

GestureDetector(
  onLongPress: () {
    myOverlayEntry = getMyOverlayEntry(context: context, someData: someData);
    Overlay.of(context).insert(myOverlayEntry);
  },
  onLongPressEnd: (details) => myOverlayEntry?.remove(),
  child: SomeWidgetHere(),
)
Here's the gist on Github:
https://gist.github.com/plateaukao/79aa39854dc4eabf1220bdfa9a0334b6
You can use an AnimatedContainer and put a GestureDetector inside.
Change the width and height using setState and it's done.
Center(
  child: AnimatedContainer(
    width: containerWidth,
    height: containerHeight,
    color: Colors.red,
    duration: Duration(seconds: 1),
    child: GestureDetector(
      onLongPress: () {
        print("Long Press");
        setState(() {
          containerWidth = 200;
          containerHeight = 200;
        });
      },
      onLongPressUp: () {
        print("On Long Press UP");
        setState(() {
          containerWidth = 100;
          containerHeight = 100;
        });
      },
    ),
  ),
)
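For completeness, the snippet assumes two state fields on the enclosing State class, for example:
// Initial size of the preview container; it grows to 200x200 on long press.
double containerWidth = 100;
double containerHeight = 100;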
I completed the Flutter name generator codelab and wanted to extend it to remove items directly from the "Saved Suggestions" list.
To do so, I've added the onTap handler below which removes the pair from the list.
However, the list doesn't update until I navigate back and reopen the screen again.
How do I immediately update the list on the second screen?
void _pushSaved() {
  Navigator.of(context).push(MaterialPageRoute<void>(
    builder: (BuildContext context) {
      final Iterable<ListTile> tiles = _saved.map((WordPair pair) {
        return ListTile(
          title: Text(
            pair.asPascalCase,
            style: _biggerFont,
          ),
          onTap: () => setState(() {
            _saved.remove(pair);
          }),
        );
      });
      final List<Widget> divided = ListTile.divideTiles(
        context: context,
        tiles: tiles,
      ).toList();

      return Scaffold(
        appBar: AppBar(
          title: const Text('Saved Suggestions'),
        ),
        body: ListView(children: divided),
      );
    },
  ));
}
Why your code doesn't work
The reason your list doesn't update is that it's a different screen pushed on the Navigator.
Because your _pushSaved method is inside the original screen, you call setState on that screen and rebuild all the widgets of the original screen.
The pushed screen isn't affected because it's not a child of your original screen.
Rather, the original screen told the Navigator to create a new screen, so it's some subtree of the Navigator of your MaterialApp and not accessible to you.
Solution
Accessing the same live data on different screens is something that's not that easy to do just with StatefulWidgets.
Basically, your project has grown complex enough so that it's time to think about a more sophisticated state management solution.
Here's a video from Google I/O about state management that you could check out for some inspiration.
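If you don't want to pull in a full state management package yet, one lightweight option (my own suggestion, not part of the codelab) is to give the pushed route its own rebuild scope with a StatefulBuilder, so the tap handler can rebuild just that screen:
void _pushSaved() {
  Navigator.of(context).push(MaterialPageRoute<void>(
    builder: (BuildContext context) {
      // StatefulBuilder provides a local setState that rebuilds this route.
      return StatefulBuilder(
        builder: (BuildContext context, StateSetter setInnerState) {
          final tiles = _saved.map((WordPair pair) {
            return ListTile(
              title: Text(pair.asPascalCase, style: _biggerFont),
              onTap: () => setInnerState(() {
                _saved.remove(pair); // updates this screen immediately
              }),
            );
          });
          return Scaffold(
            appBar: AppBar(title: const Text('Saved Suggestions')),
            body: ListView(
              children: ListTile.divideTiles(context: context, tiles: tiles)
                  .toList(),
            ),
          );
        },
      );
    },
  ));
}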
Let's say I have an image on the device's disk and load it on the screen providing its path to a FileImage.
I edit that image and save it to the same path, expecting that calling setState(() {}) will reload it. But it doesn't.
I tried clearing the image cache by calling imageCache.clear() and also imageProvider.evict(), but it made no difference.
If I close that page and open it again, I see the updated images.
I assume the images being displayed on the screen are held in memory; if that assumption is correct, how do I reload them?
I know I'm a bit late, but I used this workaround:
_tmpImageFile.existsSync() ? Image.memory(
Uint8List.fromList(_tmpImageFile.readAsBytesSync()),
alignment: Alignment.center,
height: 200,
width: 200,
fit: BoxFit.contain,
) : Container(),
This is also working for me:
Image previewImage = Image.file(
File(scenePreviewFilename),
);
And when pressing a button or something just evict the cached image from imageCache like this:
setState(() {
imageCache.evict(previewImage.image, includeLive: true);
});
PS: I had some issues getting this to work (similar to the initial post) because I was generating the image in another thread and the image was loaded before it was generated, resulting in the imageCache appearing not to have any effect ... so make sure you are not in this race condition.
Already tested - adding both of these image cache calls (inside setState) will solve the problem:
setState(() {
  imageCache.clear();
  imageCache.clearLiveImages();
});
Would changing the key of the Image do the trick?
Set a key to the image
Image.file(
_imageFile,
key: _key,
fit: BoxFit.cover,
),
and change the key every time you change _imageFile; for example, set it to a timestamp:
setState(() {
_imageFile = newImageFile;
_key = DateTime.now().millisecondsSinceEpoch;
});
This is a bit of an ugly workaround, but it works for me. Here we're resetting the image cache and force-reloading the image from disk:
class _MyAppState extends State<MyApp> {
  final imageKey = GlobalKey();
  bool forceLoad = false;

  @override
  Widget build(BuildContext context) {
    final app = MaterialApp(
      home: Scaffold(
        body: forceLoad
            ? Image.memory(currentFile.readAsBytesSync(), key: imageKey)
            : Image.file(currentFile, key: imageKey),
        floatingActionButton: FloatingActionButton(
          child: const Icon(Icons.refresh),
          onPressed: () async {
            // Reset the cache for the current image.
            await (imageKey.currentWidget as Image).image.evict();
            setState(() {
              forceLoad = true; // reload the current image from disk
            });
          },
        ),
      ),
    );
    forceLoad = false;
    return app;
  }
}
The other answers were missing step 2. Full explanation here.
1. Call imageCache.clear(); and then imageCache.clearLiveImages(); when the new image needs to be reloaded.
2. Add a value key to the Image widget to ensure the new image gets rebuilt:
Image(image: FileImage(File(pathName)), key: ValueKey(File(pathName).lengthSync()))
The image should reload after that.
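Put together, a rough sketch of the refresh trigger (this assumes pathName points at the file that was just rewritten on disk, and that the rewritten file has a different length so the ValueKey above changes):
// After overwriting the file at pathName:
imageCache.clear();
imageCache.clearLiveImages();
setState(() {}); // rebuild; the changed ValueKey forces the Image to be recreated from disk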