Joining Firebase tables in React

I am hoping to display a list of a user's notes from a Firebase DB inside of a React app.
After reading through the Firebase recommended approach on structuring data, I've created my database in the flattened format they recommend. The data structure looks something like this:
notes
- [noteKey]
  - note: [noteData]
  - created_at: [date]
  - updated_at: [date]
- ...
users
- [userKey]
  - name: [userName]
  - notes
    - [noteKey]: true
- ...
...
Each user has an array called notes, which lists the noteKeys of the notes that they own.
So far I've been able to get the full list of notes (from all users, which isn't what I want), and the user's list of noteKeys. The issue I'm having is combining those two. I have seen the question about joining tables, but mine is a more React-focused question:
In which React function does the join happen?
Right now my code looks like this:
getInitialState: function(){
  return {
    notesList: []
  };
},
componentWillMount: function() {
  base = Rebase.createClass('https://appName.firebaseio.com');
  base.syncState('notes', {
    context: this,
    state: 'notesList',
    asArray: true,
    queries: {
      limitToLast: 20
    }
  });
  this.state.notesList.map(function(item, i) {
    base.child("notes/" + item['.key'] + "/note").on('child_added', function(snapshot) {
      console.log(item['.key']);
    });
  });
},
I see two issues with this:
1. When the this.state.notesList.map function is called in componentWillMount, the array hasn't been populated with the Firebase data yet, so it maps over an empty array and returns an error.
2. Once I solve #1, I'm not sure how to get the user-specific notes into their own array that's accessible by the rest of the component.
--
In which React lifecycle function should the join be happening?
How do the second table items (the user's notes) get added to an array that is accessible by the rest of the component?

You're working with an async library (re-base) here, but you've written synchronous code.
What this means is that base.syncState fires off a request to your Firebase instance, and in the meantime your JavaScript keeps happily executing down the line, with or without results. It follows that this.state.notesList.map is going to map over an empty array, since JS executes faster than a round trip to the server.
Looking at the options available for the syncState method, there's one called then that executes a callback.
then: (function - optional) The callback function that will be invoked when the initial listener is established with Firebase. Typically used (with syncState) to change this.state.loading to false.
This makes me think that it fires after you get your data from Firebase.
Try running your .map in there since you'll actually have the data you want.
componentWillMount: function() {
  base = Rebase.createClass('https://appName.firebaseio.com');
  base.syncState('notes', {
    context: this,
    state: 'notesList',
    asArray: true,
    queries: {
      limitToLast: 20
    },
    then: function() {
      // Runs once the initial sync has completed; the `context` option
      // above makes `this` the component inside this callback.
      this.state.notesList.map(function(item, i) {
        base.child("notes/" + item['.key'] + "/note").on('child_added', function(snapshot) {
          console.log(item['.key']);
        });
      });
    }
  });
}
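That covers the timing half of the question. For the second half (getting the user-specific notes into their own array), here is a hedged sketch using the plain Firebase v2 SDK rather than re-base; userKey, userNotesList, and the use of componentDidMount are assumptions, not part of the original code:

componentDidMount: function() {
  var ref = new Firebase('https://appName.firebaseio.com');
  var self = this;
  // 1. Read the index of noteKeys this user owns (userKey is assumed).
  ref.child('users/' + self.props.userKey + '/notes').once('value', function(indexSnap) {
    var keys = Object.keys(indexSnap.val() || {});
    // 2. Load each referenced note once.
    Promise.all(keys.map(function(key) {
      return new Promise(function(resolve) {
        ref.child('notes/' + key).once('value', function(noteSnap) {
          resolve(noteSnap.val());
        });
      });
    })).then(function(notes) {
      // 3. Store the joined result so the rest of the component can use it.
      self.setState({ userNotesList: notes });
    });
  });
}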

Related

Would it be possible to do lazy loading in Relay?

I have a Parent component which contains many Child components in an array. Each child component contains a huge amount of data, so I decided not to load all of them when Parent gets loaded. The data fetched by the Parent container is as follows:
{
  ...
  childs: [childId1, childId2, ...] // array of child ids
  ...
}
Then, I would like to send one request per child by passing the child's id to the back-end APIs. Each child will show up in the UI whenever its data gets back; until then, a spinner icon is displayed to indicate that the data is loading.
Would it be possible to achieve this in Relay?
UPDATED:
Here is an example of option 1:
Child container:
export default Relay.createContainer(Child, {
  initialVariables: {
    id: null,
    hasQuery: false
  },
  fragments: {
    viewer: () => Relay.QL`
      fragment on Viewer {
        child(id: $id) @include(if: $hasQuery) {
          ...
        }
      }
    `,
  },
});
Child component:
const Child = React.createClass({
  componentWillMount() {
    this.props.relay.setVariables({ id: this.props.childId, hasQuery: true });
  }
});
Parent container:
export default Relay.createContainer(Parent, {
  fragments: {
    viewer: () => Relay.QL`
      fragment on Viewer {
        childIds  # returns an array of child ids
        ${Child.getFragment('viewer')}
      }
    `,
  },
});
Parent component:
const Parent = React.createClass({
  render() {
    return this.props.viewer.childIds.map(childId =>
      <Child key={childId} childId={childId} viewer={this.props.viewer} />
    );
  }
});
The problem is that when each Child got rendered, it fetched its data and replaced the last Child's data with its own. For example, if childIds = [1, 2, 3], the screen displayed the data of child 3 three times: 3 3 3
There are two typical patterns for delayed data fetching in open-source Relay:
1. Use @include or @skip directives where the condition is initially set to false. After the UI loads, or in response to user action, set the condition to true (e.g. with setVariables).
2. Use nested <Relay.Renderer> components. The top-level <Relay.Renderer> would fetch the minimum "required" data in one round trip and then display it, which would render additional <Relay.Renderer>s that would fetch more data.
The second option seems best-suited to your use case: the top-level renderer would fetch the list of IDs only. Then it would render a list of UI components, each of which fetched more data about its ID. List items would render as their data resolves.
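A rough sketch of that second option (ChildRoute, ChildItem, and Spinner are hypothetical names, and this assumes ChildContainer keeps its child(id: $id) fragment without the $hasQuery gate): the parent renders one item per id, and each item owns a <Relay.Renderer> that fetches that child's data and shows a spinner until it resolves.

// Hypothetical per-item route; its `id` param is forwarded as a fragment
// variable when Relay composes the route query with the container fragment.
class ChildRoute extends Relay.Route {
  static routeName = 'ChildRoute';
  static paramDefinitions = { id: { required: true } };
  static queries = {
    viewer: () => Relay.QL`query { viewer }`,
  };
}

// Each list item gets its own renderer, so items resolve independently.
function ChildItem({ childId }) {
  return (
    <Relay.Renderer
      Container={ChildContainer}
      queryConfig={new ChildRoute({ id: childId })}
      environment={Relay.Store}
      render={({ error, props }) => {
        if (error) return <span>failed to load</span>;
        if (props) return <ChildContainer {...props} />;
        return <Spinner />; // this child's data is still in flight
      }}
    />
  );
}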
One potential downside of this approach is that the data for all the items will be fetched in parallel; the first item in the list won't necessarily be the first one to get its data and render. To mitigate this an application would have to maintain greater control of the ordering of fetches; Relay accommodates this via an injectable network layer. For example, you could batch requests to the server and/or ensure ordering (for example by intentionally delaying resolving responses of "later" requests until previous queries have completed). You might check out the community-driven react-relay-network-layer which implements some of these ideas and supports pluggable middleware to help achieve the rest.

Delete objects in other classes when a User object is deleted (Parse.com)

I have an app in Parse.com which contains 3 simple classes: User, Alpha, and Beta.
User class is populated when new users sign up for the application.
Alpha class is populated when these users create stuff (upload photos, sounds, videos, etc)
Beta class is populated when other users perform activities on objects of Alpha class (share, favourite, etc).
Task at hand:
Find a way to delete all objects in Alpha and Beta classes for a particular user, when they are deleted from the User class. Right now, if a user is deleted, child objects in Alpha and Beta classes remain as orphan objects, and other users are still able to see these objects. This is causing inconsistencies in the application.
With my negligible understanding of the Parse backend, I think I should be using Cloud Code with Background Jobs (if I'm right), but I'm not sure how to go about it.
A before/after delete hook is the way to go. Here's some more particular advice relating to the description of the app, assuming that Alpha objects have a pointer to _User called "createdBy".
I think it's good to get in the habit of using small, promise-returning functions to carry out the asynchronous steps. Something like the following...
Parse.Cloud.beforeDelete(Parse.User, function(request, response) {
  var user = request.object;
  deleteAlphasForUser(user).then(function(result) {
    return deleteBetasForUser(user);
  }).then(function(result) {
    response.success(result);
  }, function(error) {
    response.error(error);
  });
});

function deleteAlphasForUser(user) {
  return alphasForUser(user).then(function(alphas) {
    return Parse.Object.destroyAll(alphas);
  });
}

function alphasForUser(user) {
  var query = new Parse.Query("Alpha");
  query.equalTo("createdBy", user);
  return query.find();
}
I didn't supply deleteBetasForUser or the function that fetches betas, but they ought to be very similar to the functions for the Alpha class; a sketch follows.
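For completeness, here is a hedged sketch of the Beta analogue; it assumes Beta objects also point to the user via a "createdBy" column, so adjust the field name to your actual schema:

// Hypothetical Beta analogue of the Alpha helpers above.
function deleteBetasForUser(user) {
  return betasForUser(user).then(function(betas) {
    return Parse.Object.destroyAll(betas);
  });
}

function betasForUser(user) {
  var query = new Parse.Query("Beta");
  query.equalTo("createdBy", user); // assumption: same pointer name as Alpha
  return query.find();
}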
CloudCode "afterDelete" triggers might be a good option. That way, the orphaned objects can be cleaned up immediately, while there is still an existing reference to the user being deleted. The Parse Cloud Code docs include a great example of a very similar solution:
Parse.Cloud.afterDelete("Post", function(request) {
  var query = new Parse.Query("Comment");
  query.equalTo("post", request.object.id);
  query.find({
    success: function(comments) {
      Parse.Object.destroyAll(comments, {
        success: function() {},
        error: function(error) {
          console.error("Error deleting related comments " + error.code + ": " + error.message);
        }
      });
    },
    error: function(error) {
      console.error("Error finding related comments " + error.code + ": " + error.message);
    }
  });
});
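Adapted to the classes in the question (a sketch that assumes the same "createdBy" pointer used above), the pattern would look like:

Parse.Cloud.afterDelete(Parse.User, function(request) {
  var user = request.object;
  var query = new Parse.Query("Alpha");
  query.equalTo("createdBy", user); // assumption: Alpha points at the user via createdBy
  query.find().then(function(alphas) {
    return Parse.Object.destroyAll(alphas);
  }).then(function() {
    // Repeat the same query/destroyAll pair for Beta here.
  }, function(error) {
    console.error("Error cleaning up orphaned objects " + error.code + ": " + error.message);
  });
});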
While a background job could work, it would have the disadvantage of having the clean up delayed until that background job is scheduled. Also, since the user was already deleted, the query for and processing of orphaned objects might be a bit inefficient.

Knockout mapping is not updating my model

I'm having trouble with a knockout model that is not binding on a subscribed update. I have a C# MVC page that delivers a model to the template which is parsed to Json and delivered raw as part of a ViewModel assignment for ko.applyBindings. I have a subscription to an observable that calls a method to perform an update of the viewModel's data. Irrelevant stuff pulled out and renamed for example usage:
var myViewModel = function (data) {
  var self = this;
  self.CurrentPage = ko.observable();
  self.SomeComplexArray = ko.observableArray([]);
  self.Pager().CurrentPage.subscribe(function (newPage) {
    self.UpdateMyViewModel(newPage);
  });
  self.UpdateMyViewModel = function (newPage) {
    var postData = { PageNumber: newPage };
    $.post('/Article/GetMyModelSearchByPage', postData, function (data) {
      ko.mapping.fromJS(data, {}, self);
    });
  };
};
When I perform logging, I can see all of the data, and it all looks correct. The same method is used to produce both the initial model and the updated model. I've used this technique on other pages and it worked flawlessly each time. In this case, however, I'm looking for it to bind/update SomeComplexArray, and that's just not happening. If I attempt to do it manually, I don't get a proper bind on the array; I get blank. I'm wondering if there is something obvious that I'm doing wrong that I'm just flat out missing.
Edit: I don't know that ko.mapping can be pointed to as the culprit. Standard model changes are also not affecting the interface. Here is something that is not working in a bound sense. I have a p element with visible bound to the length of the array and a div element with a click bound to a function that pops items off of SomeComplexArray. I can see in the console log that it is performing its function (and subsequent clicks result in 'undefined' not having that function). However, the p element never displays. The initial array has only 2 items so a single click empties it:
<p data-bind="visible: SomeComplexArray().length === 0">nothing found</p>
<div data-bind="click: function() { UpdateArray(); }">try it manually</div>
-- in js model
self.UpdateArray = function () {
  console.log(self.SomeComplexArray());
  console.log(self.SomeComplexArray().pop());
  console.log(self.SomeComplexArray());
  console.log(self.SomeComplexArray().pop());
  console.log(self.SomeComplexArray());
};
Edit 2: From the comment by @MattBurland, I've modified how the pop is called, and the manual method now works to modify the elements dynamically. However, the ko.mapping is still not functioning as I would expect. In a test, I did a console.log of a specific row before calling ko.mapping and after. No change was made to the observableArray.
I created a test of your Knockout situation in JSFiddle.
You have to call pop on the array function itself, without parentheses: self.SomeComplexArray.pop() goes through Knockout's observableArray wrapper, which notifies subscribers, whereas self.SomeComplexArray().pop() mutates the underlying array without telling Knockout. I tested this part:
self.UpdateArray = function () {
  self.SomeComplexArray.pop();
};
It seems to be working on the JSFiddle side.
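A short illustration of the difference (the array contents here are made up):

var arr = ko.observableArray(['a', 'b']);

arr().pop(); // pops the underlying array silently: no subscribers are notified
arr.pop();   // Knockout's own pop: mutates the array AND notifies subscribers,
             // so bindings like visible: arr().length === 0 re-evaluate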
I'm not really sure why, but it would seem that ko.mapping is having difficulty remapping the viewmodel at all. Since none of the fields are being mapped into self, my assumption is that there is an exception occurring somewhere that ko.mapping is simply swallowing, or it is not being reported for some other reason. Given that I could manually manipulate the array with a helpful tip from @MattBurland, I decided to backtrack a bit and update only the elements that needed to change directly on the data load. I ended up creating an Init function for my viewModel and using ko.mapping to populate the items directly there:
self.Init = function (jsonData) {
  self.CurrentPage(0);
  self.Items(ko.mapping.fromJS(jsonData.Items)());
  self.TotalItems(jsonData.TotalItems);
  // More stuff below here not relevant to question
};
The primary difference here is that the ko.mapping.fromJS result needed to be called as a function before the observableArray would recognize it as such. Given that this worked and that my controller would be providing an identical object back during the AJAX request, it was almost copy/paste:
self.UpdateMyViewModel = function (newPage) {
  var postData = { PageNumber: newPage };
  $.post('/Article/GetMyModelSearchByPage', postData, function (data) {
    self.Items(ko.mapping.fromJS(JSON.parse(data).Items)());
  });
};
This is probably not ideal for most situations, but since there is not a large manipulation of the viewModel occurring during the update this provides a working solution. I would still like to know why ko.mapping would not remap the viewModel at the top level, but in retrospect it probably would have been a disaster anyway since there was "modified" data in the viewModel that the server would have had to replace. This solution is quick and simple enough.

Event listener for multiple elements - jQuery

In the ASP MVC page I'm currently working on, the values of three input fields determine the value of a fourth: zip code, state code, and something else called a Chanel Code together determine the value of the fourth field, called the Territory Code.
I just started learning jQuery a couple weeks ago, so my first thought is to put a .change event on each field that checks for values in the other two fields and, if they exist, calls a separate method that compares the three and determines the Territory Code. However, I'm wondering if there is a more elegant way to approach this, since it seems like writing a lot of the same code in different places.
You can bind a callback to multiple elements by specifying multiple selectors:
$(".field1, .field2, .field3").click(function() {
return field1 +
field2 +
field3;
});
If you need to perform specific actions depending on which element was clicked, another option would be to create a function which performs the actual computation and then invoke that from each callback.
var calculate = function() {
  return field1 + field2 + field3;
};
And then invoke this function on each click:
$(".field1").click(function() {
// Perform field1-specific logic
calculate();
});
$(".field2").click(function() {
// Perform field2-specific logic
calculate();
});
// etc..
This means that you do not repeat yourself.
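Applied to the scenario in the question, a minimal sketch might look like the following; every selector and the lookupTerritory function are hypothetical stand-ins for the real markup and mapping logic:

// All names here are assumptions; adjust the selectors to the real fields.
var $zip = $("#ZipCode"),
    $state = $("#StateCode"),
    $chanel = $("#ChanelCode"),
    $territory = $("#TerritoryCode");

function updateTerritory() {
  var zip = $zip.val(),
      state = $state.val(),
      chanel = $chanel.val();
  // Only look up the territory once all three inputs have values.
  if (zip && state && chanel) {
    $territory.val(lookupTerritory(zip, state, chanel)); // your own mapping logic
  }
}

// One handler bound to all three fields, so nothing is repeated.
$zip.add($state).add($chanel).on("change", updateTerritory);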
This works for me
jQuery(document).on('scroll', ['body', window, 'html', document], function() {
  console.log('multiple');
});
Adding another possibility, just in case this may help someone. This version should work on dynamically created fields.
$("#form").on('change', '#Field1, #Field2, #Field3', function (e) {
  e.preventDefault();
  console.log('something changed');
});

jQuery UI Sortable: Revert changes if update callback makes an AJAX call that fails?

I am using the sortable widget to re-order a list of items. After an item is dragged to a new location, I kick off an AJAX form post to the server to save the new order. How can I undo the sort (e.g. return the drag item to its original position in the list) if I receive an error message from the server?
Basically, I only want the re-order to "stick" if the server confirms that the changes were saved.
Try the following, from within one of the sortable's own callbacks (e.g. stop or receive); by the time an asynchronous AJAX callback fires, it is too late for cancel to take effect:
$(this).sortable('cancel');
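A minimal illustration, with a synchronous confirm standing in for the server-side check:

$("#list").sortable({
  stop: function(event, ui) {
    // Synchronous check purely for illustration.
    if (!confirm("Keep the new order?")) {
      $(this).sortable("cancel"); // reverts to the pre-drag order
    }
  }
});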
I just encountered this same issue, and for the sake of a complete answer, I wanted to share my solution to this problem:
$('.list').sortable({
  items: '.list:not(.loading)',
  start: function(event, ui) {
    var element = $(ui.item[0]);
    element.data('lastParent', element.parent());
  },
  update: function(event, ui) {
    var element = $(ui.item[0]);
    if (element.hasClass('loading')) return;
    element.addClass('loading');
    $.ajax({
      url: '/ajax',
      context: element,
      complete: function(xhr, status) {
        $(this).removeClass('loading');
        if (xhr.status != 200) {
          $($(this).data('lastParent')).append(this);
        }
      }
    });
  }
});
You'll need to modify it to suit your codebase, but the loading class guards against overlapping requests, and this solution works very well for me.
I'm pretty sure that sortable doesn't have any undo-last-drop function -- but it's a great idea!
In the meantime, though, I think your best bet is to write some sort of start callback that stores the ordering, and then on failure call a revert function. I.e. something like this:
$("list-container").sortable({
start: function () {
/* stash current order of sorted elements in an array */
},
update: function () {
/* ajax call; on failure, re-order based on the stashed order */
}
});
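Fleshing that out (a sketch: the /save-order endpoint and the list markup are assumptions), serialize the order in start, post the new order in update, and restore the stash if the request fails:

var stashedOrder;

$("#list-container").sortable({
  start: function() {
    // Stash the pre-drag DOM order.
    stashedOrder = $(this).children().toArray();
  },
  update: function() {
    var $list = $(this);
    $.post("/save-order", { order: $list.sortable("toArray") }) // hypothetical endpoint
      .fail(function() {
        // Server rejected the change: re-appending the stashed elements
        // moves them back into their original order.
        $list.append(stashedOrder);
      });
  }
});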
Would love to know if others have a better answer, though.
