Angular Material Data Table - How To Setup filterPredicate For A Column With Type Ahead / Auto Complete Search?

I've read the various implementations of filterPredicate on SO, GitHub, etc., but they haven't helped me understand what to do with type-ahead searches.
I enter a letter into an input form field, say p, and I receive all the rows with last names starting with p from the db. That part of my setup works fine. However, I don't want to hit the db again when I type the next letter, say r. I want to filter the data table for last names starting with pr. This is where the trouble starts.
When I type the second letter, an if/else statement checks whether the search string is longer than one character. When it is, I pass params to a function that does custom filtering on the table with the data already downloaded from the db, so I avoid a db call for every letter; that part works. What I don't understand is "(data, filter)". They look like params, but I never pass them. How do they work? What code is needed to finish this?
(I have `dataSource.filter = filterValue;` working fine elsewhere.)
Params explained:
column = user_name
filterValue = pr...
The confusion:
public filterColumn(column: string, filterValue: string, dataSource) {
  dataSource.filterPredicate = (data, filter) => {
    console.log('data in filter column: ', data); // Never called.
    // What goes here?
    // return ???;
  }
}
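For orientation (this is my own sketch, not the missing code): MatTableDataSource only calls filterPredicate after dataSource.filter is assigned, and it then invokes it once per row, with data being the row object and filter the string that was assigned. Something along these lines would wire the two together, assuming the rows expose the column as a plain string property:

public filterColumn(column: string, filterValue: string, dataSource: MatTableDataSource<any>) {
  // `data` is one table row, `filter` is whatever string was assigned to
  // dataSource.filter. Return true to keep the row visible.
  dataSource.filterPredicate = (data: any, filter: string) =>
    (data[column] || '').toString().trim().toLowerCase().startsWith(filter);
  // The predicate only runs after dataSource.filter changes, which is why the
  // console.log above never fires without this assignment:
  dataSource.filter = filterValue.trim().toLowerCase();
}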
My dataSource object. I can see filterPredicate, data, and filter properties to work with, but it's rather abstract how to use them.
dataSource in filterColumn: MatTableDataSource {_renderData: BehaviorSubject, _filter: BehaviorSubject, _internalPageChanges: Subject, _renderChangesSubscription: Subscriber, sortingDataAccessor: ƒ, …}
filterPredicate: (data, filter) => {…} // [[FunctionLocation]]: data-utilities.service.ts:43
filteredData: (3) [{…}, {…}, {…}]
sortData: (data, sort) => {…}
sortingDataAccessor: (data, sortHeaderId) => {…}
_data: BehaviorSubject {_isScalar: false, observers: Array(1), closed: false, isStopped: false, hasError: false, …}
_filter: BehaviorSubject {_isScalar: false, observers: Array(1), closed: false, isStopped: false, hasError: false, …}
_internalPageChanges: Subject {_isScalar: false, observers: Array(1), closed: false, isStopped: false, hasError: false, …}
_paginator: MatPaginator {_isInitialized: true, _pendingSubscribers: null, initialized: Observable, _disabled: false, _intl: MatPaginatorIntl, …}
_renderChangesSubscription: Subscriber {closed: false, _parentOrParents: null, _subscriptions: Array(1), syncErrorValue: null, syncErrorThrown: false, …}
_renderData: BehaviorSubject {_isScalar: false, observers: Array(1), closed: false, isStopped: false, hasError: false, …}
data: (...) filter: (...) paginator: (...) sort: (...)
__proto__: DataSource

I've included most of the component I made in Angular for typeahead search. The guts of the typeahead code are in the shared utilities service at the bottom. I used a shared service because I'll use this in many places. However, I think it's a hack and a more elegant answer is possible. It works and it's easy, just not very pretty, and I can't afford more time to make it pretty now. I suspect the answer involves a RegEx.
In typeahead.component.ts, inside the .pipe, you'll find how I call the code in the utility.
This code is in a shared component typeahead.component.ts
public searchLastName$ = new Subject<string>(); // Binds to the html text box element.

ngAfterViewInit() {
  // -------- For Column Incremental Queries --------- //
  // searchLastName$ binds to the html element.
  this.searchLastName$.subscribe(result => {
    this.queryLastName(result);
  });
}
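As an aside (my own sketch, not part of the original code, using the rxjs debounceTime and distinctUntilChanged operators): the subject can be piped so that fast typing doesn't trigger a query on every keystroke:

// Sketch only: import { debounceTime, distinctUntilChanged } from 'rxjs/operators';
this.searchLastName$
  .pipe(
    debounceTime(200),       // wait for a short pause in typing
    distinctUntilChanged()   // skip repeats of the same text
  )
  .subscribe(result => {
    this.queryLastName(result);
  });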
// --------- LAST NAME INCREMENTAL QUERY --------------- //
private queryLastName(filterValue) {
  // Custom filter for this function. If in ngOnInit on the calling component then it applies
  // to the whole calling component. We need various filters so that doesn't work.
  this.membersComponent.dataSource.filterPredicate = (data: { last_name: string }, filterValue: string) =>
    data.last_name.trim().toLowerCase().indexOf(filterValue) !== -1;
  // When the first letter is typed then get data from db. After that just filter the table.
  if (filterValue.length === 1) {
    filterValue = filterValue.trim(); // Remove whitespace
    // filterValue = filterValue.toUpperCase(); // MatTableDataSource defaults to lowercase matches
    const lastNameSearch = gql`
      query ($last_name: String!) {
        lastNameSearch(last_name: $last_name) {
          ...membersTableFrag
        }
      }
      ${membersTableFrag}
    `;
    this.apollo
      .watchQuery({
        query: lastNameSearch,
        variables: {
          last_name: filterValue,
        },
      })
      .valueChanges
      .pipe(
        map(returnedArray => {
          // console.log('returnedArray in map: ', returnedArray); // All last_name's with the letter in them someplace.
          const membersArray = returnedArray.data['lastNameSearch']; // extract items array from GraphQL JSON array
          // For case insensitive search
          const newArray = membersArray.filter(this.utilitiesService.filterBy(filterValue, 'last_name'));
          return newArray;
        })
      )
      .subscribe(result => {
        this.membersComponent.dataSource.data = result;
      });
  } else {
    // Filter the table instead of calling the db for each letter entered.
    // Note: Apollo client doesn't seem able to query the cache with this kind of search.
    filterValue = filterValue.trim(); // Remove whitespace
    filterValue = filterValue.toLowerCase(); // MatTableDataSource defaults to lowercase matches
    // Interface and redefinition of filterPredicate in the ngOnInit
    this.membersComponent.dataSource.filter = filterValue; // Filters all columns unless modified by filterPredicate.
  }
}
utilities.service.ts
// -------------- DATABASE COLUMN SEARCH -------------
// Shared with other components with tables.
// For case insensitive search.
// THIS NEEDS TO BE CLEANED UP BUT I'M MOVING ON, MAYBE LATER
public filterBy = (filterValue, column) => {
  return (item) => {
    const charTest = item[column].charAt(0);
    if (charTest === filterValue.toLowerCase()) {
      return true;
    } else if (charTest === filterValue.toUpperCase()) {
      return true;
    } else {
      return false;
    }
  }
};
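As a side note (my own untested sketch, not part of the question's code): for the single-letter filter values this is called with, the two case branches collapse into one comparison:

public filterBy = (filterValue: string, column: string) => {
  // Equivalent for plain ASCII letters: compare both first characters in lower case.
  return (item: any) =>
    item[column].charAt(0).toLowerCase() === filterValue.charAt(0).toLowerCase();
};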

Related

Unsorted keys in note will be sorted

I'm creating a stave note with multiple keys:
const staveNote: vexflow.Flow.StaveNote = new this.VF.StaveNote({
  keys: this.renderNotesSortedByPitch(placedChord.notes),
  duration: chordDuration,
  auto_stem: true,
  clef: Clef.TREBLE
});

private renderNotesSortedByPitch(notes: Array<Note>): Array<string> {
  const vexflowNotes: Array<string> = new Array<string>();
  notes
    // this.sortNotesByPitch(notes)
    .forEach((note: Note) => {
      vexflowNotes.push(this.renderNote(note));
    });
  return vexflowNotes;
}

private sortNotesByPitch(notes: Array<Note>): Array<Note> {
  return notes.sort((noteA: Note, noteB: Note) => {
    return noteA.pitch.chroma.value - noteB.pitch.chroma.value; // <-- No arithmetic operation on strings
  });
}
and I get the following warning in the browser console:
Warning: Unsorted keys in note will be sorted. See https://github.com/0xfe/vexflow/issues/104 for details. Error
at Function.b.StackTrace (http://localhost:4200/vendor.js:93990:4976)
at Function.b.W (http://localhost:4200/vendor.js:93990:5134)
at http://localhost:4200/vendor.js:93990:255605
at Array.forEach (<anonymous>)
at e.value (http://localhost:4200/vendor.js:93990:255572)
at new e (http://localhost:4200/vendor.js:93990:250357)
at SheetService.vexflowRenderSoundtrack (http://localhost:4200/main.js:2083:51)
at SheetService.createSoundtrackSheet (http://localhost:4200/main.js:2004:14)
at SheetComponent.createSheet (http://localhost:4200/main.js:2465:35)
at SheetComponent.ngAfterViewInit (http://localhost:4200/main.js:2452:14)
I understand I need to provide the keys already sorted the way Vexflow is sorting them.
A similar issue is also described there.
How can I sort the keys when note.pitch.chroma.value is a string?
It'd be nice to have some method in the same fashion as:
staveNote.setKeyStyle(0, { fillStyle: 'red' });
Say, something like:
staveNote.setDotted(0);
Or:
staveNote.setKeyStyle(0, { fillStyle: 'red', dotted: true });
UPDATE: Following a suggestion, I created these methods to sort the notes before adding them as keys in the stave:
private getNoteFrequency(note: Note): number {
  return Tone.Frequency(note.renderAbc()).toFrequency();
}

private sortNotesByPitch(notes: Array<Note>): Array<Note> {
  return notes.sort((noteA: Note, noteB: Note) => {
    return this.getNoteFrequency(noteA) - this.getNoteFrequency(noteB);
  });
}
The Vexflow warning message was no longer displayed in the browser console.
Vexflow expects your notes to be sorted vertically, no way around that.
You need to write your own function to compare two notes given as strings.
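For illustration, a rough sketch of my own (untested, ignoring accidentals) that compares VexFlow key strings such as "c/4" and "e/4":

const NOTE_ORDER = ['c', 'd', 'e', 'f', 'g', 'a', 'b'];

// Higher octave wins; otherwise the letter's position within the octave decides.
function compareKeys(a: string, b: string): number {
  const [letterA, octaveA] = a.split('/');
  const [letterB, octaveB] = b.split('/');
  if (Number(octaveA) !== Number(octaveB)) {
    return Number(octaveA) - Number(octaveB);
  }
  return NOTE_ORDER.indexOf(letterA.toLowerCase()) - NOTE_ORDER.indexOf(letterB.toLowerCase());
}

// Usage: keys.sort(compareKeys) before passing them to the StaveNote constructor.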
Here's a working note-string comparison function which doesn't take accidentals into account: repl.it/repls/WobblyFavorableYottabyte
Edited for clarity, thanks @gristow for the correction!

Get error on GML options

I'm trying to create a WFS-T service. I have used the ol.format.WFS#writeTransaction method to serialize the WFS-T XML, but my linter always flags an error on the GML format options. Is it possible that I am initializing the ol.format.WFS object incorrectly?
Or maybe I am passing the wrong options to the writeTransaction method? Or maybe it's a bug in OpenLayers 4? Here is the detail of my WFS-T service using the Angular http service:
private _transactWFS(feature: any, operation: any): any {
  let payload;
  try {
    const formatWFS = new ol.format.WFS({});
    const formatGML = new ol.format.GML({
      featureNS: operation.featureNS,
      featureType: operation.featureType,
      srsName: operation.srsName
    });
    const xs = new XMLSerializer();
    let node: any = null;
    switch (operation.mode) {
      case 'insert':
        node = formatWFS.writeTransaction([feature], null, null, formatGML);
        break;
      case 'update':
        node = formatWFS.writeTransaction(null, [feature], null, formatGML);
        break;
      case 'delete':
        node = formatWFS.writeTransaction(null, null, [feature], formatGML);
        break;
    }
    payload = xs.serializeToString(node);
  } catch (error) {}
  return payload;
}
lint message:
[ts]
Argument of type 'GML' is not assignable to parameter of type 'WFSWriteTransactionOptions'.
Property 'featureNS' is missing in type 'GML'.
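Reading the message literally, the fourth argument to writeTransaction is typed as a plain WFSWriteTransactionOptions object (featureNS, featureType, srsName, ...), not a GML format instance, so a sketch along these lines may satisfy the typings (untested, insert case only):

// Untested sketch: pass the serialization options directly instead of formatGML.
const options = {
  featureNS: operation.featureNS,
  featureType: operation.featureType,
  srsName: operation.srsName
};
node = formatWFS.writeTransaction([feature], null, null, options);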
From the OpenLayers WFS-T example:
// Word to the Wise from an anonymous OpenLayers hacker:
//
// The typename in the options list when adding/loading a wfs
// layer not should contain the namespace before, (as in the
// first typename parameter to the wfs consctructor).
//
// Specifically, in the first parameter you write typename:
// 'topp:myLayerName', and in the following option list
// typeName: 'myLayerName'.
//
// If you have topp included in the second one you will get
// namespace 14 errors when trying to insert features.
//
wfs = new OpenLayers.Layer.WFS(
"Cities",
"http://sigma.openplans.org/geoserver/wfs",
{typename: 'topp:tasmania_cities'},
{
typename: "tasmania_cities",
featureNS: "http://www.openplans.org/topp",
extractAttributes: false,
commitReport: function(str) {
OpenLayers.Console.log(str);
}
}
);
Seems to indicate you are building your WFS object wrong.
I gave up on using the WFS format to build the WFS transaction request and solved the problem myself: I found the lib geojson-to-wfs-t-2. This library solved my problem nicely.

Select2 search against current list of items

Using Select2's search defaults, I see it runs ajax calls with the characters I typed as the query term. It does not filter against the current list of items. I didn't see anything in the docs to control this behavior. Is it possible to change it?
For ajax'd data, Select2 doesn't have an option to filter against the current, cached result set; it always fetches a fresh data set.
My solution was to write my own filter over the fresh data. It works, but every search term still triggers a new ajax call, which just returns the same data every time.
processResults: function (data, query) {
  let normalizedData = hj.gic.swapFieldDataConditioner('name', 'text', data.results);
  if (query.term === undefined) {
    return { results: normalizedData };
  } else {
    let term = new RegExp(query.term, 'gi'),
        matchedResults = [];
    normalizedData.forEach(function (item) {
      if (item.text.match(term)) {
        matchedResults.push(item);
      }
    });
    return { results: matchedResults };
  }
}
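One caveat with building a RegExp straight from query.term: regex metacharacters in the typed text (e.g. "c++") will throw or mismatch. A small hypothetical helper (escapeRegExp is my own name, not a Select2 API) guards against that:

// Hypothetical helper: escape regex metacharacters before building the RegExp above.
function escapeRegExp(term) {
  return term.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}
// let term = new RegExp(escapeRegExp(query.term), 'gi');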

How to rereduce with Mongoid to aggregate data from two distinct fields?

The context
Two MongoDB documents are mapped onto Rails/Mongoid classes. The two classes are Task and Subscription. For performance reasons, Subscription.current_task stores a Task::CurrentTask which contains a subset of the attributes of a Task, but the real current task matching a subscription is the one with the highest Task#pos for a given Task#subscription_id.
The problem
Some inconsistencies appeared between attributes of the Subscription.current_task and the Task it should match, notably the state field.
Goal
List all current tasks from Subscriptions that don't match the last Task for their subscription.
Solution aimed for
First, map/reduce Task to get the last one for each subscription and store that into a temporary collection. Second, rereduce with this temporary collection on Subscription to obtain, for each subscription, an object containing both the actual last task and the currently embedded subset copy. Third, create the report for elements where the actual and copied task mismatch.
Difficulty encountered
While I have read the official MongoDB and Mongoid documentation, and other examples in misc. blog posts like MongoDB Map Re-Reduce and joins – performance tuning and MongoDB, Mongoid, MapReduce and Embedded Documents, I'm still unable to come up with a working solution for the rereduce step.
The nonfunctional solution written so far:
# map/reduce of tasks to get the last one of each subscripton
last_task_map = %Q{
function() {
var key = this.subscription_id;
var value = {
task: {
pos: this.pos,
task_id: this._id,
state: this.state
},
current_task: null
};
emit(key, value);
}
}
last_task_reduce = %Q{
function(key, tasks) {
var last_task = tasks[0];
for ( var i=1; i < tasks.length; i++ ) {
if(tasks[i].pos > last_task.pos) {
last_task = tasks[i];
}
}
var value = {
task: {pos: last_task.pos, task_id: last_task.task_id, state: last_task.state},
current_task: null
};
return value;
}
}
# map/reduce of `current_task`s to merged with previous results
subscription_map = %Q{
function() {
if(!this.current_task) {
return;
}
var key = this._id;
var value = {
task: null,
current_task: {
pos: this.current_task.pos,
task_id: this.current_task.task_id,
state: this.current_task.state,
source: 'current_task',
}
};
emit(key, value);
};
}
reduce = %Q{
function(key, tasks) {
if(tasks[0].current_task == nill) {
return {task: tasks[0].task, current_task: tasks[1].current_task};
}
return {task: tasks[1].task, current_task: tasks[0].current_task};
}
}
buffer = 'current_task_consistency'
# temporary collection seems unremoved when serially calling the script with
# `load` in a `rails c` prompt, so we drop it to avoid unwanted glitch merge
Mongoid.default_client[buffer].drop
t = Task.map_reduce(last_task_map, last_task_reduce).out(replace: buffer)
s = Subscription.map_reduce(subscription_map, reduce).out(reduce: buffer)
t.each{ |e| puts e } # ok: `{"_id"=>BSON::ObjectId('592dd603e138236671587b04'), "value"=>{"task"=>{"pos"=>0.0, "task_id"=>BSON::ObjectId('592dd604e138236671587b0f'), "state"=>40.0}, "current_task"=>nil}}`
puts t.counts # ok: {"input"=>83900, "emit"=>83900, "reduce"=>36115, "output"=>28625}
s.each{ |e| puts e } # ko: {"_id"=>BSON::ObjectId('592dd603e138236671587b04'), "value"=>{"task"=>nil, "current_task"=>{"pos"=>0.0, "task_id"=>BSON::ObjectId('592dd604e138236671587b0f'), "state"=>40.0, "source"=>"current_task"}}}
puts s.counts # ko: {"input"=>28632, "emit"=>28624, "reduce"=>0, "output"=>28624}
The expected result for the second map/reduce is a merge of the current_task_consistency collection with the subscription_map results, which should all pass through the reduce. Yet according to counts no reduce is performed, and indeed the output of the s elements shows that no task key was assigned from the current_task_consistency value.
Questions related to the exposed problem
What does the implementation lack?
As far as I understand, this solution does provide reduce functions which are idempotent and whose output is consistent with what the corresponding map function emits. What might I be misunderstanding about how the out parameter works and how the rereduce input/output should be managed?
Additional remarks
The third step, generating the report, is intended to be implemented as a finalize function applied to the second map/reduce, but maybe a third map/reduce would be a better way. As a whole the implementation might be poorly structured, at least from a performance point of view, and feedback is welcome on that point too.
The first problem with the proposed solution was simply a Ruby/JS syntax mix-up: nill instead of null. Unfortunately the script was failing silently, at least in the pry console where I was running load current_task_consistency.rb.
Here is a working solution, with two map/reduce and a query on the resulting temporary collection.
# map/reduce of tasks to get the last one of each subscripton
last_task_map = %Q{
function() {
var key = this.subscription_id;
var value = {
task: {
pos: this.pos,
task_id: this._id,
state: this.state
},
current_task: null
};
emit(key, value);
}
}
last_task_reduce = %Q{
function(key, tasks) {
var last_task = tasks[0];
for ( var i=1; i < tasks.length; i++ ) {
if(tasks[i].pos > last_task.pos) {
last_task = tasks[i];
}
}
var value = {
task: {pos: last_task.pos, task_id: last_task.task_id, state: last_task.state},
current_task: null
};
return value;
}
}
# map/reduce of `current_task`s merged side by side with the corresponding
# subscription last task
subscription_map = %Q{
function() {
if(!this.current_task) {
return;
}
var key = this._id;
var value = {
task: null,
current_task: {
pos: this.current_task.pos,
task_id: this.current_task.task_id,
state: this.current_task.state,
}
};
emit(key, value);
};
}
subscription_reduce = %Q{
function(key, tasks) {
if(tasks[0].current_task == null) {
return {task: tasks[0].task, current_task: tasks[1].current_task};
}
return {task: tasks[1].task, current_task: tasks[0].current_task};
}
}
buffer = 'current_task_consistency'
# temporary collection seems unremoved when serially calling the script with
# `load` in a `rails c` prompt, so we drop it to avoid unwanted merge glitch
Mongoid.default_client[buffer].drop
Task.map_reduce(last_task_map, last_task_reduce).
out(replace: buffer).
execute
Subscription.
map_reduce(subscription_map, subscription_reduce).
out(reduce: buffer).
execute
ascertain_inconsistency = %Q{
this.value.current_task == null ||
this.value.current_task.state != this.value.task.state
}
inconsistencies = Mongoid.default_client['current_task_consistency'].
find( { "$where": ascertain_inconsistency } )

Search functionality using Relay

How do I implement search functionality with Relay?
So, the workflow is:
the user navigates to the search form.
there should not be any query (as in a Relay container) when initializing the view.
the user fills in the field values and presses the action/search button.
a Relay query is sent to the server.
results are received from the server.
the page displays them and Relay reconciles the filtered results with the local cache.
I have not seen an example of an ad hoc query, only queries as part of a Relay container (which are resolved before component initialization). So how should I model it? Should it be like a mutation?
If I understand correctly, you'd like to not send any query at all for the component until the user enters some search text, at which point the query should be sent. This can be accomplished with the example posted by @Xuorig, with one addition: use GraphQL's @include directive to skip the fragment until a variable is set. Here's the extended example:
export default Relay.createContainer(Search, {
  initialVariables: {
    count: 3,
    query: null,
    hasQuery: false, // `@include(if: ...)` takes a boolean
  },
  fragments: {
    viewer: () => Relay.QL`
      fragment on Viewer {
        # add `@include` to skip the fragment unless $query/$hasQuery are set
        items(first: $count, query: $query) @include(if: $hasQuery) {
          edges {
            node {
              ...
            }
          }
        }
      }
    `,
  },
});
This query will be skipped initially since the include condition is falsy. Then the component can call setVariables({query: someQueryText, hasQuery: true}) when the text input changes, at which point the @include condition will become true and the query will be sent to the server.
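For illustration, a minimal sketch of such a handler (my own, assuming the container above and a plain text input bound to it):

// Hypothetical change handler: flipping hasQuery to true activates the
// @include'd fragment, so Relay fetches items for the typed text.
handleSearchChange(event) {
  const text = event.target.value;
  this.props.relay.setVariables({
    query: text,
    hasQuery: text.length > 0, // fall back to skipping the fragment when cleared
  });
}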
This is the way I've implemented simple search in my project:
export default Relay.createContainer(Search, {
  initialVariables: {
    count: 3,
    title: null,
    category: null,
  },
  fragments: {
    viewer: () => Relay.QL`
      fragment on Viewer {
        items(first: $count, title: $title, category: $category) {
          edges {
            node {
              ...
            }
          }
        }
      }
    `,
  },
});
Your search form simply has to update the initialVariables using this.props.relay.setVariables and Relay will query the new data.
