I have a widget that shows a list of stores sorted by distance from the user's current location, with filtering applied as well.
Data in:
Store data coming from a stream of a Firestore collection
Current user location from geolocator.
Filtering options from shared preferences
(can be changed any time)
List sorting mode selected by user
Data out: a filtered, sorted list of stores.
What pattern is best practice in this case?
rxdart : https://pub.dartlang.org/packages/rxdart
If you want to combine data together, you can use:
var myObservable = Observable.combineLatest3(
    myFirstStream,
    mySecondStream,
    myThirdStream,
    (firstData, secondData, thirdData) =>
        print("$firstData $secondData $thirdData"));
You can use any of combineLatest2 up to combineLatest9,
or
CombineLatestStream, as in this example:
CombineLatestStream.list<String>([
  Stream.fromIterable(["a"]),
  Stream.fromIterable(["b"]),
  Stream.fromIterable(["C", "D"])
]).listen(print);
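Applied to the original question, a minimal sketch could look like the following. The stream parameters and the applyFilterAndSort callback are assumptions standing in for the four inputs and for the actual filtering/sorting logic:
import 'package:rxdart/rxdart.dart';

/// Sketch only: combines the four inputs from the question into one output
/// stream. The caller supplies the real filtering/sorting in
/// [applyFilterAndSort]; the type parameters are placeholders.
Stream<List<TStore>> storeList<TStore, TLocation, TFilter, TSort>(
  Stream<List<TStore>> stores,    // Firestore collection stream
  Stream<TLocation> location,     // geolocator position stream
  Stream<TFilter> filterOptions,  // shared-preferences filter stream
  Stream<TSort> sortMode,         // user-selected sort mode stream
  List<TStore> Function(List<TStore>, TLocation, TFilter, TSort)
      applyFilterAndSort,
) {
  return CombineLatestStream.combine4(
      stores, location, filterOptions, sortMode, applyFilterAndSort);
}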
The user location, filtering options and sorting mode (inputs 2, 3 and 4) are inputs to the bloc that you'd send in through sinks. The bloc listens on those sinks and updates the Firestore query accordingly. This alone might be enough to make Firestore send the appropriate snapshots to the output stream the widget is listening to.
If you can't sort or filter how you want directly with Firestore's APIs, you can use stream.map or apply a StreamTransformer on it. The transformer gives you a lot of flexibility to listen to a stream and change or ignore events on the fly by implementing its bind method.
So you can do something like:
Stream<Store> get stores => _firestoreStream
    .transform(filter)
    .transform(sort);
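For example, a filtering transformer along those lines might look like this rough sketch (the predicate is a stand-in for whatever filter options are currently active):
import 'dart:async';

/// Sketch of a `filter`-style transformer; the predicate stands in for the
/// filter options loaded from shared preferences.
class FilterTransformer<T> extends StreamTransformerBase<T, T> {
  final bool Function(T) predicate;

  FilterTransformer(this.predicate);

  @override
  Stream<T> bind(Stream<T> stream) => stream.where(predicate);
}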
Have a look at this page for streams in Dart in general, and look into rxdart for more complex stream manipulations.
From personal experience, I found that having multiple inputs to a bloc leads to hard-to-test code. The implicit concurrency concerns inside the bloc lead to confusing scenarios.
The way I built it out in my Adding testing to a Flutter app post was to create a single input stream, but add markers to the messages indicating which data stream each message was part of. It made testing sane.
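A rough sketch of that idea (the Tagged wrapper and Source enum are hypothetical names, not from the post):
import 'dart:async';

// Hypothetical marker for which data stream a message belongs to.
enum Source { stores, location, filterOptions, sortMode }

// Hypothetical wrapper that tags every message with its source.
class Tagged<T> {
  final Source source;
  final T data;
  Tagged(this.source, this.data);
}

// All inputs go through a single sink; the bloc switches on `source`.
final input = StreamController<Tagged<Object>>();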
In this situation there are multiple asynchronous processes, so the implementation can get complicated and there is a possibility of race conditions.
I would implement it as follows:
Separate the stream of the Model coming from Firestore from the user-visible ViewModel in the Bloc. Widgets listen only to the ViewModel (e.g. with StreamBuilder).
Keep business logic processing only in the Bloc. In particular, move the SharedPreferences handling into the Bloc.
Create a UserControl class just for user input.
Branch the processing depending on which subclass of UserControl the user input is.
I hope this will help you.
For example:
import 'dart:async';

import 'package:rxdart/rxdart.dart';

class ViewModel {}

class DataFromFirestoreModel {}

abstract class UserControl {}

class UserRequest extends UserControl {}

class UserFilter extends UserControl {
  final String keyWord;
  UserFilter(this.keyWord);
}

enum SortType { ascending, descending }

class UserSort extends UserControl {
  final SortType sortType;
  UserSort(this.sortType);
}

class Bloc {
  final controller = StreamController<UserControl>();
  final viewModel = BehaviorSubject<ViewModel>();
  final collection = StreamController<DataFromFirestoreModel>();

  Bloc() {
    controller.stream.listen(_handleControl);
  }

  _handleControl(UserControl control) {
    if (control is UserRequest) {
      _handleRequest();
    } else if (control is UserFilter) {
      handleFilter(control.keyWord);
    } else if (control is UserSort) {
      handleSort(control.sortType);
    }
  }

  _handleRequest() {
    // get location
    // get data from sharedPreferences
    // get data from firestore
    ViewModel modifiedViewModel; // input modifiedViewModel
    viewModel.add(modifiedViewModel);
  }

  handleSort(SortType sortType) {
    final oldViewModel = viewModel.value;
    // sort oldViewModel
    ViewModel newViewModel; // input sorted oldViewModel
    viewModel.add(newViewModel);
  }

  handleFilter(String keyWord) {
    // store data to sharedPreferences
    // get data from Firestore
    ViewModel modifiedViewModel; // input modifiedViewModel
    viewModel.add(modifiedViewModel);
  }
}
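A minimal sketch of how a widget might consume this Bloc (assuming Flutter's StreamBuilder; StoreListView is a hypothetical widget name):
import 'package:flutter/material.dart';

class StoreListView extends StatelessWidget {
  final Bloc bloc; // the Bloc sketched above
  StoreListView(this.bloc);

  @override
  Widget build(BuildContext context) {
    return StreamBuilder<ViewModel>(
      stream: bloc.viewModel.stream,
      builder: (context, snapshot) {
        if (!snapshot.hasData) return CircularProgressIndicator();
        // Render the filtered, sorted store list held by the ViewModel here.
        return ListView(children: []);
      },
    );
  }
}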
Related
I have a class that holds all Firebase-related functions, and another class to manage state (a bloc). How can I make them work together consistently?
class UserFirebase {
  Stream<List<User>> fetchUsers() {
    // I want this function to return a Stream<List<User>> that I can
    // listen to in UserBloc
    return Firestore.instance.collection('users').snapshots();
  }
}

class UserBloc {
  UserFirebase _sService;

  Observable<List<User>> get users => _sService.fetchUsers();
}
My approach might not be correct, but I wanted to describe the problem.
You can watch this tutorial and just replace the source of the data with Firebase:
https://www.youtube.com/watch?v=fahC3ky_zW0
That is what I have done and it works for me. The difference from the tutorial is that I stream a List, while the tutorial streams an UnmodifiableListView. Before you stream, you have to map your data to a list.
You can then use the bloc everywhere you want.
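As a rough sketch (the User class and its fromMap constructor are placeholders; the API shown is the older cloud_firestore one used in the question):
import 'package:cloud_firestore/cloud_firestore.dart';

// Placeholder model; fromMap and the 'name' field are hypothetical.
class User {
  final String name;
  User(this.name);
  factory User.fromMap(Map<String, dynamic> data) => User(data['name']);
}

class UserBloc {
  // Map every snapshot to a List<User> before exposing the stream.
  Stream<List<User>> get users => Firestore.instance
      .collection('users')
      .snapshots()
      .map((snapshot) =>
          snapshot.documents.map((doc) => User.fromMap(doc.data)).toList());
}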
We are using the gcloud Dart library to access the Datastore. We would like to display the total number of users.
Of course, the simplest way to do this is:
db.query(User).run().length
but this would fetch all users.
Is there a way to run this query efficiently with the dart gcloud library? If not, will querying for all entities be a big performance issue and should we store the total number of users in a separate entity?
Google Cloud Datastore provides a number of special entity kinds which use the reserved __xxx__ names and can be used to query datastore metadata.
Using this mechanism it is possible, for example, to query all namespaces by using __namespace__ or all kinds by using __kind__. package:gcloud already contains the special Kind and Namespace kinds for this purpose.
Any other metadata kind can simply be defined by the user, for example for querying kind counts.
Here is a snippet which allows one to count entities in Dart:
import 'dart:async';

import 'package:appengine/appengine.dart';
import 'package:gcloud/db.dart';

Future main(List<String> args) async {
  await withAppEngineServices(() async {
    print(await getStats('Package'));
    print(await getStats('PackageVersion'));
  });
}

Future<Stats> getStats(String kind) async {
  final query = dbService.query(Stats)..filter('kind_name =', kind);
  final Stats stats = (await query.run().toList()).first;
  return stats;
}

@Kind(name: '__Stat_Kind__', idType: IdType.String)
class Stats extends ExpandoModel {
  @StringProperty(propertyName: 'kind_name')
  String kindName;

  @IntProperty()
  int count;

  String toString() => 'Stats(kind: "$kindName", count: $count)';
}
As a follow-up question to the following question and answer:
https://stackoverflow.com/questions/31156774/about-key-grouping-with-groupbykey
I'd like to confirm with the Google Dataflow engineering team (@jkff) whether the third option proposed by Eugene is at all possible with Google Dataflow:
"have a ParDo that takes these keys and creates the BigQuery tables, and another ParDo that takes the data and streams writes to the tables"
My understanding is that a ParDo/DoFn processes each element; how could we specify a table name (a function of the keys passed in via side inputs) when writing out from processElement of a ParDo/DoFn?
Thanks.
Updated with a DoFn, which obviously does not work, since c.element().getValue() is not a PCollection.
PCollection<KV<String, Iterable<String>>> output = ...;

public class DynamicOutput2Fn extends DoFn<KV<String, Iterable<String>>, Integer> {

    private final PCollectionView<List<String>> keysAsSideinputs;

    public DynamicOutput2Fn(PCollectionView<List<String>> keysAsSideinputs) {
        this.keysAsSideinputs = keysAsSideinputs;
    }

    @Override
    public void processElement(ProcessContext c) {
        List<String> keys = c.sideInput(keysAsSideinputs);
        String key = c.element().getKey();

        // The below is not working! How could we write the value out to a sink,
        // be it a GCS file or a BQ table?
        c.element().getValue().apply(ParDo.of(new FormatLineFn()))
            .apply(TextIO.Write.to(key));

        c.output(1);
    }
}
The BigQueryIO.Write transform does not support this. The closest thing you can do is to use per-window tables, and encode whatever information you need to select the table in the window objects by using a custom WindowFn.
If you don't want to do that, you can make BigQuery API calls directly from your DoFn. With this, you can set the table name to anything you want, as computed by your code. This could be looked up from a side input, or computed directly from the element the DoFn is currently processing. To avoid making too many small calls to BigQuery, you can batch up the requests using finishBundle().
You can see how the Dataflow runner does the streaming import here:
https://github.com/GoogleCloudPlatform/DataflowJavaSDK/blob/master/sdk/src/main/java/com/google/cloud/dataflow/sdk/util/BigQueryTableInserter.java
Problem:
I have a Dart file defining some data structures, which I need to use both on the client and on the server. I'd like to make these data structures observable with Polymer. However, the server cannot import the file, because Polymer pulls in dart:html.
Context:
I am working on a client/server (RESTful) application, where I want the server to make the defined data structures available as resources. The client should display these resources and have the possibility of sending modifications back to the server. For that, Polymer is invaluable.
The reason I want to have this library available for the server is that I want the server to be able to validate the resources to be stored.
Possible solutions:
I don't yet know Polymer's internals well enough, but if my data structures could inherit from Map, I could use toObservable in the client-side code to make them observable. However, instead of accessing members by dot notation I'd have to access them by key, which makes it rather fragile.
I was wondering if I could use dart:mirrors to add the observable annotation on the client.
Of course, maintaining duplicate code is really not a solution.
You can use the observe package.
With ChangeNotifier you initiate the change notification yourself by calling notifyPropertyChange when a value changes. The changes get delivered synchronously.
Observable needs dirtyCheck() to be called to deliver changes.
Polymer calls Observable.dirtyCheck() repeatedly to get the changes automatically.
An example of each:
import 'package:observe/observe.dart';

class Notifiable extends Object with ChangeNotifier {
  String _input = '';

  @reflectable
  get input => _input;

  @reflectable
  set input(val) {
    _input = notifyPropertyChange(#input, _input, val + " new");
  }

  Notifiable() {
    this.changes.listen((List<ChangeRecord> record) => record.forEach(print));
  }
}

class MyObservable extends Observable {
  @observable
  String counter = '';

  MyObservable() {
    this.changes.listen((List<ChangeRecord> record) => record.forEach(print));
  }
}

void main() {
  var x = new MyObservable();
  x.counter = "hallo";
  Observable.dirtyCheck();

  Notifiable notifiable = new Notifiable();
  notifiable.input = 'xxx';
  notifiable.input = 'yyy';
}
I have a class like this:
class BaseModel {
  Map objects;

  // define constructor here

  fetch() {
    // fetch json from server and then load it into objects
    // emit an event here
  }
}
Like in Backbone.js, I want to emit a change event when I call fetch, and create a listener for the change event in my view.
But from reading the documentation I don't know where to start, since there are so many things that point to events, like Event, Events, EventSource and so on.
Can you guys give me a hint?
I am assuming you want to emit events that do not require the presence of the dart:html library.
You can use the Streams API to expose a stream of events for others to listen for and handle. Here is an example:
import 'dart:async';

class BaseModel {
  Map objects;

  StreamController fetchDoneController = new StreamController.broadcast();

  // define constructor here

  fetch() {
    // fetch json from server and then load it into objects
    // emit an event here
    fetchDoneController.add("all done"); // send an arbitrary event
  }

  Stream get fetchDone => fetchDoneController.stream;
}
Then, over in your app:
main() {
  var model = new BaseModel();
  model.fetchDone.listen((_) => doCoolStuff(model));
}
Using the native Streams API is nice because it means you don't need the browser in order to test your application.
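For instance, a minimal test of the model above might look like this sketch (assuming package:test, and that BaseModel from the snippet above is importable):
import 'package:test/test.dart';

void main() {
  test('fetch emits a done event', () {
    var model = new BaseModel(); // BaseModel from the snippet above
    // Register the expectation before fetching, since the stream is broadcast.
    expect(model.fetchDone, emits("all done"));
    model.fetch();
  });
}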
If you are required to emit a custom HTML event, you can see this answer: https://stackoverflow.com/a/13902121/123471
There's a package for it:
https://pub.dev/packages/event
This can be better than using Streams directly, as 'event' is more readable.