I am developing a screen in Flutter that sends an image to my Node.js server.
I have a File (from the image_picker package) and I want to convert the image to a base64 string.
I wrote this code:
List<int> imageBytes = await widget.picture.readAsBytes();
String base64Image = base64Encode(imageBytes);
When the base64Encode function starts, the UI of my application gets stuck.
I couldn't find an async version of the function, and Dart does not have threads, so I really don't know what to do.
Is anyone familiar with this problem?
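One possible approach (a minimal sketch, not from the original post) is to run the encoding on a background isolate with Flutter's compute() helper, so the main isolate stays free to render the UI:
import 'dart:convert';
import 'dart:io';

import 'package:flutter/foundation.dart';

Future<String> encodeImageToBase64(File picture) async {
  final List<int> imageBytes = await picture.readAsBytes();
  // base64Encode is a top-level function, so it can be handed to compute(),
  // which runs it on a separate isolate and completes with the result.
  return compute(base64Encode, imageBytes);
}
compute() accepts a top-level or static function plus a message, runs it on another isolate, and returns a Future with the result, so awaiting it does not block the UI.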
I am working on an app that reads a Firestore document ID from an NFC card and displays the data contained in that document.
I have tested it on a Samsung J6 and an Oppo for Android, and on an iPhone 7 and a 12 Pro Max, and everything worked perfectly with react-native-nfc-manager v3.10.1.
After I updated this package to the latest version (3.13.2), the methods for reading data from the NFC card changed, e.g.:
let tech = Platform.OS === 'ios' ? NfcTech.MifareIOS : NfcTech.MifareUltralight;
...
// for Android
const resp = await NfcManager.mifareUltralightHandlerAndroid.mifareUltralightReadPages(6);
// instead of
const resp = NfcManager.transceive([...]);
// for iOS
const resp = await NfcManager.?.mifareUltralightReadPages(6);
// instead of
const resp = NfcManager.sendMifareCommandIOS([...]);
The problem arises for iOS (where I have put the ?). Apparently there is no handler for the MifareIOS NfcTech type.
Does anyone have any idea whether it will come in the future or whether, at the moment, another handler could be used instead?
Thanks a lot!!!
In case you don't know, there was previously a problem with this library not rendering local images on Android as well, but apparently it was solved. Now I'm facing the exact same issue on iOS, with the difference that I can use static images like assets/src/assets/images/logo.png. But when the image paths start with something like file:///, storage://, or ph://, the images simply do not get rendered.
What I'm trying to do is generate a PDF report file, which must be generated regardless of whether the user has an internet connection. That is why I have to use local images.
The static image is the company's logo, and the local image that is not getting rendered is an image saved to the phone's storage through Image Picker or Camera Roll. The React Native Image component displays the image perfectly, so I don't think I'm using the wrong path.
What I have tried so far:
Removing the file:/// or storage:// or ph:// from the beginning of the path string;
In some cases, when I save an image to the phone's library with Camera Roll, it returns a path that starts with ph:// but has no extension, such as .jpg or .png. I tried adding the extension manually, but it still made no difference;
I tried to convert the image to base64 using rn-fetch-blob (with RNFetchBlob.fs.base64.encode(path)), but still had no success.
Devices:
iPhone SE with iOS 14 (also simulator iPhone 11 with iOS 15)
MacBook Air 2017 Core i5 1.8GHz and 8gb RAM (macOS Big Sur 11.5.2)
Environment
node: 12.22.7
npm: 6.14.15
react: 16.9.0
react-native: 0.61.5
react-native-html-to-pdf: ^0.11.0 (updating it to 0.12.0 gave me the same result)
Code:
sharePDF = async () => {
  try {
    this.changeVisibilityOptions(false);
    this.changeVisibilityLoading('Gerando PDF...');

    let htmlTemplate = '';
    htmlTemplate = await getPDFDespesa(this.state);

    const pdfOptions = {
      html: htmlTemplate,
      fileName: 'RelatorioDespesas',
      directory: 'Relatorios'
    };

    let pdfFile = await RNHTMLtoPDF.convert(pdfOptions);
    this.changeVisibilityLoading(false);

    const shareOptions = {
      title: 'Compartilhar com:',
      url: `file://${pdfFile.filePath}`,
      type: 'application/pdf',
      failOnCancel: false
    };

    const ShareResponse = await Share.open(shareOptions);
  } catch (error) {
    this.setState({ visibilityLoadingScreen: false });
    console.log('Error =>', error);
  }
}
Final thoughts:
Well, since the code is stored in a private repository, I can't show the whole thing here for ethical reasons, but I'm doing my best to give you as many details as possible.
The code produces an almost complete PDF; the only problem is that I see broken image icons where the images were supposed to be. For Android it now works perfectly.
I think this might be an issue related to WebView, since react-native-html-to-pdf uses WebView to generate the PDF from HTML code. I reached this conclusion after another developer at my job, who was trying to create a screen with a preview of the PDF before sharing it, ran into the very same problem on both Android and iOS. The library he used was react-native-webview.
Update with solution
Alright, after a long time of research, a colleague and I got to a solution which may not be the best, but it does what we expected.
First of all, we discovered that we had to split the problem in two, because we actually had two problems.
Images from react-native-image-picker: After a long time trying to find what was preventing the local images from being rendered, I tried updating the library to version 4.7.3 (the latest version at that time) and made a number of required changes to the code, as the version we were using was considerably outdated. To my surprise, it worked, even though the format of the response URIs had not changed;
Images from @react-native-community/cameraroll: This one was a bit more complicated. It took me some time to realize that iOS PHAssets are not supported in WebView or react-native-html-to-pdf (which uses WebView in the background). So, after some research, my colleague and I found a workaround that led us to a relatively easy solution. Basically, we used react-native-fs to copy the PHAsset media file to a temporary directory, which would return a URI that starts with file:// and can be rendered by WebView. This is the code we used to do it:
// Helper: extracts the file name from a URL. Each of these helpers lives in
// its own module (hence the two default exports).
export default function getImageNameFromUrl(imageUrl = "") {
  if (imageUrl) {
    const splittedImageUrl = imageUrl.split('/');
    return splittedImageUrl.pop();
  }
  return null;
};

// Helper (separate module): copies the PHAsset into the app's temporary
// directory and returns a path that WebView can render.
import RNFS from 'react-native-fs';

export default async function copyAssetsFileIOSAndReturnURI(remoteURL = '', localURI = '') {
  try {
    if (remoteURL && localURI) {
      const imageName = getImageNameFromUrl(remoteURL);
      // Width/height of 0 keeps the original size of the copied asset.
      const imgPath = await RNFS.copyAssetsFileIOS(localURI, RNFS.TemporaryDirectoryPath + imageName, 0, 0);
      return imgPath;
    }
    return null;
  } catch (err) {
    console.log(err);
    return null;
  }
}
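For context, here is roughly how such a helper can be glued into the HTML generation step (a sketch, not the exact production code; buildImgTag, photoUrl and assetUri are illustrative names):
// Copy the PHAsset into the temp directory so WebView / react-native-html-to-pdf
// can load it, then reference the copied file in the generated HTML.
async function buildImgTag(photoUrl, assetUri) {
  const copiedPath = await copyAssetsFileIOSAndReturnURI(photoUrl, assetUri);
  if (!copiedPath) {
    return ''; // copy failed: render the report without this image
  }
  const src = copiedPath.startsWith('file://') ? copiedPath : `file://${copiedPath}`;
  return `<img src="${src}" width="200" />`;
}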
I'm an iOS developer starting to look into Flutter. I'm following this tutorial from Ray Wenderlich and I'm facing strange behaviour with this code.
_loadData() async {
  var dataURL = "https://api.github.com/orgs/raywenderlich/members";
  var response = await http.get(dataURL);
  setState(() {
    _members = json.decode(response.body);
  });
}
The problem is that execution stops at var response = await http.get(dataURL);
I know it is related to the await, but I'm not sure why it is happening. The example code from the http package uses similar code.
Can anyone help?
Thanks
So, the unresponsive await was caused by the Android emulator. I don't know why, but the network connection does not work on the emulator. I tested with the iOS simulator and it worked.
If it is stopped, it might be because the URL does not return anything, maybe because the URL is wrong? Also, I think the string should use single quotation marks ' instead of "; try replacing them with single quotation marks.
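For anyone hitting the same hang, one way to make the failure visible (a sketch, not from the tutorial or the answers above; it keeps the string URL accepted by the older http package used here, while newer versions require Uri.parse) is to add a timeout and a try/catch so the request fails loudly instead of awaiting forever:
_loadData() async {
  var dataURL = "https://api.github.com/orgs/raywenderlich/members";
  try {
    // Give up after 10 seconds instead of awaiting forever when the
    // emulator has no working network connection.
    var response =
        await http.get(dataURL).timeout(const Duration(seconds: 10));
    setState(() {
      _members = json.decode(response.body);
    });
  } catch (e) {
    print('Request failed: $e');
  }
}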
This is the Java code that makes the socket call, but I want to know how I can replicate this, or something similar, in iOS (Swift or Objective-C):
public String MakeSocketRequest() throws Exception {
    DataInputStream inputSt;
    DataOutputStream outputSt;
    Socket socket = new Socket(InetAddress.getByName("socketurl.io"), 40008);
    String jsonStr = "{\"id\":1,\"method\":\"themethod\"}";
    inputSt = new DataInputStream(socket.getInputStream());
    outputSt = new DataOutputStream(socket.getOutputStream());
    PrintWriter pw = new PrintWriter(outputSt);
    pw.println(jsonStr);
    Log.d("PrintWriter", jsonStr);
    pw.flush();
    BufferedReader bfr = new BufferedReader(new InputStreamReader(inputSt));
    JSONObject json = new JSONObject(bfr.readLine());
    Log.d("Json", json.toString());
    inputSt.close();
    outputSt.close();
    return json.toString();
}
If you want to do it natively without third-party libraries, you can use the CFStreamCreatePairWithSocketToHost function to create input and output streams (no socket object is needed). There is plenty of example code online showing how to set this up.
On iOS you can't write to or read from the streams immediately; you have to wait until the socket is connected and you are allowed to read/write. This is done by implementing NSStreamDelegate.
If you get an NSStreamEventHasSpaceAvailable event there, you can write your string to the output stream. You don't need a PrintWriter just to write a string, because it is easy to convert an NSString to NSData and write the NSData.
If you get an NSStreamEventHasBytesAvailable event, it means you can try to read data from the input stream into some buffer (like NSMutableData). There is no built-in BufferedReader with a readLine method, so you will have to buffer the data yourself and detect when a newline character appears. After that you can cut off the part of the buffer up to the newline and convert the NSData to an NSString (or to a JSON object using NSJSONSerialization).
Note: the scheduleInRunLoop calls might look confusing, but they are required to start receiving events via the delegate. They essentially tell the system on which thread you want to receive them.
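Here is a minimal Swift sketch of the approach described above (the host, port and payload are taken from the Java snippet; class and property names are illustrative and the code is untested against that server):
import Foundation

final class SocketClient: NSObject, StreamDelegate {
    private var input: InputStream?
    private var output: OutputStream?
    private var buffer = Data()
    private var didSend = false
    private let payload = "{\"id\":1,\"method\":\"themethod\"}\n"

    func connect() {
        Stream.getStreamsToHost(withName: "socketurl.io", port: 40008,
                                inputStream: &input, outputStream: &output)
        for case let stream? in [input as Stream?, output as Stream?] {
            stream.delegate = self
            // Scheduling on a run loop is what makes the delegate events arrive.
            stream.schedule(in: .current, forMode: .default)
            stream.open()
        }
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        switch eventCode {
        case .hasSpaceAvailable where aStream === output && !didSend:
            // Connected and writable: send the JSON request once.
            let bytes = Array(payload.utf8)
            _ = output?.write(bytes, maxLength: bytes.count)
            didSend = true
        case .hasBytesAvailable where aStream === input:
            // Buffer incoming data until a newline arrives, then parse that line as JSON.
            var chunk = [UInt8](repeating: 0, count: 1024)
            let read = input?.read(&chunk, maxLength: chunk.count) ?? 0
            if read > 0 { buffer.append(contentsOf: chunk[0..<read]) }
            if let newline = buffer.firstIndex(of: 0x0A) {
                let line = buffer.prefix(upTo: newline)
                let json = try? JSONSerialization.jsonObject(with: line)
                print("Json:", json ?? "parse error")
            }
        default:
            break
        }
    }
}
connect() should be called from the thread whose run loop you want the delegate events delivered on, which is exactly what the scheduleInRunLoop note above refers to.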
P.S. I agree with commenters that if you have control over the server code, it's better to use a standard protocol like Socket IO or msgpack instead of inventing your own, because they have better and nicer libraries and wider community support.
I'm looking for a way to download large PDF files from an external server with a Flutter application for offline storage.
But downloading a large file (sometimes 100 MB+) takes some time, and I don't want the app to be stuck in a wait function while it downloads. What I'm looking for is a download function that has a callback with a progress report (something like "250000/500000 bytes done"; it doesn't have to be exactly that, just something I can work with and make a progress bar out of).
Is this even possible in Flutter? The only things I came across were the http library, which does not seem to have a progress callback, and plainly reading the contents of an HTTP call (which also doesn't give a progress report). I hope someone has a method I can use to make this happen.
Kind regards,
Kevin Walter
EDIT:
C# has the perfect example of what I mean
https://stackoverflow.com/a/9459441/2854656
https://docs.flutter.io/flutter/dart-io/HttpClient-class.html
https://docs.flutter.io/flutter/dart-io/HttpClientResponse-class.html
import 'dart:convert';
import 'dart:io';

int fileSize;
int downloadProgress = 0;

new HttpClient().get('localhost', 80, '/file.txt')
    .then((HttpClientRequest request) => request.close())
    .then((HttpClientResponse response) {
      // contentLength comes from the Content-Length header (-1 if unknown).
      fileSize ??= response.contentLength;
      response.transform(utf8.decoder).listen((contents) {
        downloadProgress += contents.length;
        // handle data
      });
    });
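For a binary download such as a PDF, a slightly fuller sketch (assuming dart:io's HttpClient and a save path you already know; downloadWithProgress and onProgress are illustrative names) writes the raw bytes to a file and reports received/total so a progress bar can be driven from it:
import 'dart:io';

Future<void> downloadWithProgress(
    Uri url,
    String savePath,
    void Function(int received, int total) onProgress) async {
  final request = await HttpClient().getUrl(url);
  final response = await request.close();
  final total = response.contentLength; // -1 if the server did not send it
  final sink = File(savePath).openWrite();
  var received = 0;
  // HttpClientResponse is a Stream<List<int>>, so each chunk can be written
  // to disk as it arrives and counted towards the progress callback.
  await for (final chunk in response) {
    sink.add(chunk);
    received += chunk.length;
    onProgress(received, total);
  }
  await sink.close();
}
A progress bar can then be updated from received / total inside onProgress (guarding against total == -1 when the server sends no Content-Length).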