Here is the BLoC that has my BehaviorSubject and getters for the stream and the latest stream value.
Stream<User> get currentUser => _currentUserSubject.stream;
User get currentUserValue => _currentUserSubject.stream.value;
final _currentUserSubject = BehaviorSubject<User>();
Here is the StreamBuilder that uses currentUser as the stream source and currentUserValue as initialData:
StreamBuilder<User>(
  initialData: _loginAuthBloc.currentUserValue,
  stream: _loginAuthBloc.currentUser,
  ...)
I make sure that currentUser has a latest value before the page that returns this StreamBuilder is opened. Because there is a short wait between the stream being subscribed and its latest value arriving, I set initialData to currentUserValue so that we skip that waiting part. So far so good. But the thing is, once the stream is subscribed, its latest value arrives as well, and it is the same value as initialData; I want to ignore that duplicate. Do you have a suggestion about this?
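One possible approach (my own sketch, not from the question): capture the value used as initialData once and skip the replayed event while it still equals that value. This assumes User implements == meaningfully.

final initialUser = _loginAuthBloc.currentUserValue;

StreamBuilder<User>(
  initialData: initialUser,
  // skipWhile drops the replayed value as long as it equals what the
  // builder is already showing, then lets everything after it through
  stream: _loginAuthBloc.currentUser.skipWhile((user) => user == initialUser),
  ...)

Ideally the skipWhile stream is created once (for example in initState) rather than on every build, so the StreamBuilder does not resubscribe on each rebuild.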
I am implementing a tokenizer. It parses the document, tokenizes it on a set of possible delimiters, and then provides me with combinations of 1-, 2- and 3-word tokens.
I was able to achieve my goal, but only in one specific way:
Stream<String> contentStr = file.openRead().transform(utf8.decoder);
Stream<String> tokens = contentStr.transform(charSplitter).transform(tokenizer).asBroadcastStream();
var twoWordTokens = tokens.transform(sliding(2));
var threeWordTokens = tokens.transform(sliding(3));
StreamController<String> merger = StreamController();
tokens.forEach((token) => merger.add(token));
threeWordTokens.forEach((token) => merger.add(token));
twoWordTokens.forEach((token) => merger.add(token));
merger.stream.forEach(print);
As you can see, I do the following:
broadcast the original stream of tokens
transform it into 2 additional streams with a sliding-window transformation
create a StreamConsumer (a StreamController, to be precise) and pump every event from the 3 streams into that consumer
then print every element of the consumer's stream to test
It works, but I don't like that I add each element from the source streams via the StreamController.add method. I wanted to use StreamController.addStream instead, but that somehow does not work.
The following code gives me a "Bad state: Cannot add event while adding a stream" error, and I understand why:
StreamController<String> merger = StreamController();
merger.addStream(tokens);
merger.addStream(twoWordTokens);
merger.addStream(threeWordTokens);
merger.stream.forEach(print);
This is per API documentation of the StreamController.addStream.
So I need to wait for the Future returned by each addStream to complete:
StreamController<String> merger = StreamController();
await merger.addStream(tokens);
await merger.addStream(twoWordTokens);
await merger.addStream(threeWordTokens);
await merger.stream.forEach(print);
But in this case I get nothing printed in the console.
If I do this:
StreamController<String> merger = StreamController();
merger.stream.forEach(print);
await merger.addStream(tokens);
await merger.addStream(twoWordTokens);
await merger.addStream(threeWordTokens);
Then only the 1-word tokens, i.e. the elements of the original broadcast stream, are printed. Elements of the derived streams are not.
I kind of understand why this happens, because all my streams are derived from the original broadcast stream.
Is there a better way to implement such a pipeline?
Probably my problem can be reformulated in terms of stream duplication/forking, but I can't see a way to clone a stream in Dart. If you can advise on that, please do.
I hope to allow concurrent addStream at some point, but until then, you need to handle the events independently:
var allAdds = [
  tokens.forEach(merger.add),
  twoWordTokens.forEach(merger.add),
  threeWordTokens.forEach(merger.add),
];
Future.wait(allAdds).then((_) { merger.close(); });
merger.stream.forEach(print);
That's if you want to control everything yourself. You can also use the StreamGroup class from package:async. It collects a number of streams and emits their events as a single stream.
This assumes that you have no error events.
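For reference, a minimal sketch of the StreamGroup approach, assuming the same tokens, twoWordTokens and threeWordTokens streams as above:

import 'package:async/async.dart';

// StreamGroup.merge emits the events of all three streams as one stream
// and closes once all of them are done.
Stream<String> merged =
    StreamGroup.merge([tokens, twoWordTokens, threeWordTokens]);
merged.forEach(print);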
I have the Firestore structure below and a method to get the user's feed from the DB.
I need to chain my streams like this:
First, get all feed IDs from the User/FeedIDs collection.
Then, for every feed ID, get the documents with the feed details and return them as a list.
I couldn't find a way to solve this, because toList() is not working, or I am doing something wrong.
// User Collection
- User
  - RandomDocumentID
    - Feed
      - FeedIDasDocumentID
        - field1
        - field2
        .
        .
// Feed Collection
- Feed
  - RandomDocumentID
    - field1
    - field2
    .
    .
// Method in my repository to get the feed for a User
Observable<Feed> getCurrentUserFeed(String uid) {
  return Observable(Firestore.instance
          .collection('User')
          .document(uid)
          .collection("FeedIDs")
          .snapshots()
          .expand((snapshots) => snapshots.documents)
          .map((document) => UserFeed.fromMap(document.data)))
      .flatMap((userFeed) => Firestore.instance
          .collection("Feed")
          .document(userFeed.id)
          .snapshots())
      .map((document) => Feed.fromMap(document.data));
  // ????
  // I tried to put .toList() at the end of the stream but it is not working,
  // I want to return List<Feed> instead of every single Feed object
}
// in my BLoC
// I had to do this because I couldn't get the stream's elements as a list
//
List<Feed> feedList = List();
FirebaseUser user = await _feedRepository.getFirebaseUser();
_feedRepository.getCurrentUserFeed(user.uid).listen((feed) {
  feedList.add(feed);
  dispatch(UserFeedResultEvent(feedList));
});
If there is any other approach for chaining, I would really appreciate it if you shared it. Thank you.
I think the issue here is that Firestore is set up to send updates whenever records change. When you query for snapshots, you get a Stream that will never send a done event, because new updates could always come in.
Some of the methods on Stream that return a Future will never complete if the stream does not send a done event. These include .single and .toList(). You are probably looking for .first, which will complete after the first event is sent through the stream (the current state of the records in the database) and stop listening for changes.
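A minimal sketch of that idea, reusing the queries and the UserFeed/Feed models from the question (their exact shapes are assumptions on my part), so the method completes with the current data as a Future<List<Feed>>:

Future<List<Feed>> getCurrentUserFeedOnce(String uid) async {
  // .first completes with the current query result instead of waiting
  // for a done event that the snapshots stream never sends
  final userFeedSnapshot = await Firestore.instance
      .collection('User')
      .document(uid)
      .collection('FeedIDs')
      .snapshots()
      .first;

  final feeds = <Feed>[];
  for (final document in userFeedSnapshot.documents) {
    final userFeed = UserFeed.fromMap(document.data);
    final feedDocument = await Firestore.instance
        .collection('Feed')
        .document(userFeed.id)
        .snapshots()
        .first;
    feeds.add(Feed.fromMap(feedDocument.data));
  }
  return feeds;
}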
I'm trying to set a timestamp in the Firebase Realtime Database, but when I retrieve the data, it is not ordered by timestamp.
I did it like so.
FirebaseDatabase.instance.reference().child('path').push().set({
'timestamp': ServerValue.timestamp
});
This is the node
Then I retrieve like so.
FirebaseDatabase.instance.reference().child('path').orderByChild('timestamp').once().then((snap) {
print(snap.value);
});
But the output is this:
{-LJhyfmrWVDD2ZgJdfMR: {timestamp: 1534074731794}, -LJhyWVi6LddGwVye48K: {timestamp: 1534074689667}, -LJhzDlvEMunxBpRmTkI: {timestamp: 1534074875091}}
The entries are not ordered by timestamp.
Am I missing something?
Or is this a Firebase error, or a Flutter one?
The data is retrieved in the right order into a DataSnapshot. But when you call snap.value, the information from the snapshot has to be converted into a Map<String, Object>, which can no longer hold information about the order of the child nodes.
To maintain the order, you have to process the child nodes from the DataSnapshot with a loop. I'm not an expert in Flutter (at all), but I can't quickly find a way to do this for a value event. So you might want to instead listen for .childAdded:
FirebaseDatabase.instance
    .reference()
    .child('path')
    .orderByChild('timestamp')
    .onChildAdded
    .listen((Event event) {
  print(event.snapshot.value);
});
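If you need the results as an ordered list rather than printing them one by one, a small sketch (my own addition, not part of the answer) that accumulates the events as they arrive:

final ordered = <dynamic>[];
FirebaseDatabase.instance
    .reference()
    .child('path')
    .orderByChild('timestamp')
    .onChildAdded
    .listen((Event event) {
  // onChildAdded delivers the children in query order, so the list
  // ends up sorted by timestamp
  ordered.add(event.snapshot.value);
});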
I've been struggling with this for a couple of hours and can't seem to find a solution. I'll appreciate any help.
I'm building a Flutter app, trying to follow the BLoC pattern.
I have a widget with two text fields: Country code and PhoneNumber.
I have defined a BLoC with two Sinks (one for each field) and a Stream with a state. The state Stream is a merge of the two Sinks, such as:
factory AuthenticationBloc(AuthenticationRepository authRepository) {
  final onPhoneChanged = PublishSubject<String>();
  final onCountryChanged = PublishSubject<String>();
  // oldState would be the last entry in the state stream.
  final stateChangeFromThePhone =
      onPhoneChanged.map((newPhone) => oldState.copyWith(phone: newPhone));
  final stateChangeFromTheCountry =
      onCountryChanged.map((newCountry) => oldState.copyWith(country: newCountry));
  final state = Observable.merge([stateChangeFromThePhone, stateChangeFromTheCountry]);
}
This is pseudocode, but the idea is there. My question is: how can I get access to the latest event from the state stream, represented in the code by oldState?
I could define a variable in which I store this value on each new event in the state stream, but that looks ugly... :(
Any advice?
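One common RxDart pattern for this (a sketch under my own assumptions, not your exact code) is to map each sink event to a function that transforms the previous state, merge those streams, and fold them with scan. The accumulator that scan passes back in on every event is always the latest state, so no external oldState variable is needed. The AuthenticationState class and its fields below are illustrative.

import 'package:rxdart/rxdart.dart';

// Illustrative state class; the question only shows copyWith(phone:, country:),
// so the field names here are assumptions.
class AuthenticationState {
  final String phone;
  final String country;
  const AuthenticationState({this.phone = '', this.country = ''});

  AuthenticationState copyWith({String phone, String country}) =>
      AuthenticationState(
        phone: phone ?? this.phone,
        country: country ?? this.country,
      );
}

class AuthenticationBloc {
  final onPhoneChanged = PublishSubject<String>();
  final onCountryChanged = PublishSubject<String>();
  Observable<AuthenticationState> state;

  AuthenticationBloc() {
    // Each field change becomes a function from old state to new state.
    final changes =
        Observable.merge<AuthenticationState Function(AuthenticationState)>([
      onPhoneChanged.map((newPhone) =>
          (AuthenticationState s) => s.copyWith(phone: newPhone)),
      onCountryChanged.map((newCountry) =>
          (AuthenticationState s) => s.copyWith(country: newCountry)),
    ]);

    // scan hands the previously accumulated state to each step, which plays
    // the role of oldState in the pseudocode above.
    state = changes.scan(
      (AuthenticationState acc, change, _) => change(acc),
      const AuthenticationState(),
    );
  }
}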
I have this app that fetches some data from a remote API. The data that I am going to receive and display comes back as JSON:
{"status":200,"out":{"summary":[{"bc":"1876","wc":"488679","pc":"731904"}],"last":[{"id":"1877","place":"7","publisher":"-1","bookid":"01877","title":"Neither Civil Nor Servant","author":"Peh","region":"\u65b0\u52a0\u5761","copyrighter":"","translated":"0","purchdate":"2017-04-18","price":"200.00","pubdate":"2016-01-01","printdate":"2016-01-01","ver":"1.1","deco":"\u666e\u901a","kword":"0","page":"220","isbn":"978-981-4642-63-7","category":"","location":"","intro":"TT\u8d60\u4e66\u3002","instock":"1","p_name":"\uff08\u672a\u6307\u5b9a\uff09"}]}}
I will extract the out field from this JSON and assign summary and last to two variables:
initState() async {
  var getter = createHttpClient();
  String uri = 'http://api.rsywx.com/book/summary';
  var res = await getter.get(uri);
  Map data = JSON.decode(res.body);
  var out = data['out'];
  setState(() {
    _today = formatDate(new DateTime.now());
    _lb = out['last'][0];
    _bs = out['summary'][0];
    _lb['purchdate'] = formatDate(DateTime.parse(_lb['purchdate']));
  });
}
So _bs and _lb are both compound objects.
In my widget build function, I will display the contents of these two objects:
new TextSpan(
  text: numFormatter.format(int.parse(_bs['bc'])),
  style: aboutTextStyle,
),
The program compiles OK, but when launched, a quick flash of a RED error screen appears.
And soon enough, the correct screen appears.
I know that during the initial build, the objects _bs and _lb are not there yet, and the async call to the remote API is still waiting for the response, so in this case _bs['bc'] is definitely not accessible. Thus the non-blocking error pops up.
Workaround
I can eliminate this error by declaring a bunch of variables and assigning them in the initState function; instead of rendering _bs['bc'], I render a new variable _bookCount. This way, the rendering is done without the RED screen, and the value of that variable is initially null and soon becomes the correct value fetched from the remote API.
But this is too cumbersome, if you get what I mean: a lot of used-only-once variables.
Or, should I fetch the data at the parent level, so that it is passed to this widget as props? I haven't tried that yet.
Would appreciate your best practice input.
Update
The issue really comes from int.parse. If I take out that call, the program runs peacefully.
So the question now becomes:
How do I stop int.parse from throwing an error before the value it is going to parse becomes valid?
Not sure what you mean with your workaround. In your example, setState() won't be called before await getter.get(uri); returns a value.
I guess this should do it:
new TextSpan(
  text: _bs != null && _bs['bc'] != null
      ? numFormatter.format(int.parse(_bs['bc']))
      : '0',
  style: aboutTextStyle,
),
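Another pattern that is often used for this (not part of the answer above) is to wrap the request in a FutureBuilder, so the parsing code only runs once the data has actually arrived. A minimal sketch, assuming a hypothetical fetchSummary() helper that performs the same HTTP call as initState and returns the decoded JSON map:

// fetchSummary() is a hypothetical helper returning the decoded JSON map
new FutureBuilder<Map>(
  future: fetchSummary(),
  builder: (context, snapshot) {
    if (!snapshot.hasData) {
      // nothing to parse yet, so int.parse is never reached
      return new CircularProgressIndicator();
    }
    final bs = snapshot.data['out']['summary'][0];
    return new Text(numFormatter.format(int.parse(bs['bc'])));
  },
)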