Why isn't my updated observable List reflected in the template? - dart

I've got:
my-app
community-list
On attached, my-app gets the user and loads it into app.user. In the meantime, community-list is attached (even before app.user is loaded), so I haven't been able to get the user's starred communities yet. Therefore, the solution I'm working on is as follows.
In community-list.attached():
app.changes.listen((List<ChangeRecord> records) {
  if (app.user != null) {
    getUserStarredCommunities();
  }
});
Elsewhere in community-list is said method:
// This is triggered by an app.changes.listen.
void getUserStarredCommunities() {
  // Determine if this user has starred the community.
  communities.forEach((community) {
    var starredCommunityRef = new db.Firebase(firebaseLocation + '/users/' + app.user.username + '/communities/' + community['id']);
    starredCommunityRef.onValue.listen((e) {
      if (e.snapshot.val() == null) {
        community['userStarred'] = false;
      } else {
        community['userStarred'] = true;
      }
    });
  });
}
Note that communities is an observable list in community-list:
@observable List communities = toObservable([]);
Which is initially populated in community-list.attached():
getCommunities() {
  var f = new db.Firebase(firebaseLocation + '/communities');
  var communityRef = f.limit(20);
  communityRef.onChildAdded.listen((e) {
    var community = e.snapshot.val();
    // If no updated date, use the created date.
    if (community['updatedDate'] == null) {
      community['updatedDate'] = DateTime.parse(community['createdDate']);
    }
    // snapshot.name is Firebase's ID, i.e. "the name of the Firebase location"
    // So we'll add that to our local item list.
    community['id'] = e.snapshot.name();
    // Insert each new community into the list.
    communities.add(community);
    // Sort the list by the item's updatedDate, then reverse it.
    communities.sort((m1, m2) => m1["updatedDate"].compareTo(m2["updatedDate"]));
    communities = communities.reversed.toList();
  });
}
In summary, I load the list of communities even before I have a user, but once I have a user I want to update each community (Map) in the list of communities with the userStarred = true/false, which I then use in my community-list template.
Alas, it doesn't seem like the List updates. How do I achieve this?
This whole app.changes.listen business is expensive. What's the proper practice in a case like this, where an element is loaded before the objects (like app.user) that will modify it in some way?

1)
toList() creates a copy of the list. You need to apply toObservable again to get an observable list.
communities = toObservable(communities.reversed.toList());
This also assigns a new list to communities, which is covered by @observable, so I think it should trigger a change notification anyway.
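Applied to the onChildAdded handler from the question, the tail end would then look something like this (a sketch; only the last line changes):
// Insert each new community into the list.
communities.add(community);
// Sort the list by the item's updatedDate, then reverse it.
communities.sort((m1, m2) => m1["updatedDate"].compareTo(m2["updatedDate"]));
// reversed.toList() returns a plain List, so wrap it in toObservable again.
communities = toObservable(communities.reversed.toList());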
2) You update your communities explicitly. It shouldn't be necessary to listen for changes. You can call a method containing
if (app.user != null) {
  getUserStarredCommunities();
}
explicitly each time you change the list.
You also call Firebase for each community when a change in communities occurs. I don't know Firebase, but it seems you send a request to a server each time, which is of course expensive.
You should remember for which user+community combination you already made the call and use the remembered result instead.
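A minimal sketch of such a cache, assuming a hypothetical _userStarredCache map in community-list keyed by username and community id:
// Hypothetical cache: key is '<username>/<communityId>', value is the starred flag.
Map<String, bool> _userStarredCache = {};

void getUserStarredCommunities() {
  communities.forEach((community) {
    var key = app.user.username + '/' + community['id'];
    if (_userStarredCache.containsKey(key)) {
      // Reuse the remembered result instead of hitting Firebase again.
      community['userStarred'] = _userStarredCache[key];
      return;
    }
    var starredCommunityRef = new db.Firebase(firebaseLocation + '/users/' + app.user.username + '/communities/' + community['id']);
    starredCommunityRef.onValue.listen((e) {
      var starred = e.snapshot.val() != null;
      _userStarredCache[key] = starred;
      community['userStarred'] = starred;
    });
  });
}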
With app.changes.listen you listen to any update of any @observable field in your component. If you have other observable fields besides communities, this method might be called too often.
If you are only interested in changes to communities you should put this code into a method like
communitiesChanged(oldVal, newVal) {
  if (app.user != null) {
    getUserStarredCommunities();
  }
}
but the better option is to not listen to changes at all: use a differently named method and call it explicitly, as stated above, whenever possible.
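As a rough sketch of that explicit approach (updateUserStars is a hypothetical name, not something from the original code), the element could call one method from every place that matters, with no change listener at all:
void updateUserStars() {
  if (app.user != null) {
    getUserStarredCommunities();
  }
}

// In getCommunities(), right after the list is modified:
communities = toObservable(communities.reversed.toList());
updateUserStars();

// ...and call updateUserStars() again from wherever app.user becomes available
// (for example, once my-app has finished loading the user).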

Related

Listing WorkItem State Reasons programmatically

We have a customised TFS workflow, and I want to be able to access the Reasons for closing a Bug (changing the state from Active to Closed) from TFS, so that we don't have to update our code every time we want to tweak our process.
This is what I have so far:
WorkItemType wiType = this.GetWorkItemStore().Projects[this.ProjectName].WorkItemTypes["Bug"];
var reason = wiType.FieldDefinitions["Reason"];
var state = wiType.FieldDefinitions["State"];
var filterList = new FieldFilterList();
FieldFilter filter = new FieldFilter(wiType.FieldDefinitions[CoreField.State], "Active");
filterList.Add(filter);
var allowedReasons = reason.FilteredAllowedValues(filterList);
However, I'm not getting any results. I'd like to get a list of all the reasons why I can close a bug (Not Reproducible, Fixed, etc.).
As far as I know, there isn't any easy way to get the transitions directly via the API, since the API reads the allowed values straight from the database.
The alternative would be to export the work item type definition via the WorkItemType.Export() method and then get the information from that. Vaccano's answer in this question provides the entire code sample you can use.
Edited to give an example of how I solved this using the above recommendation:
public static List<Transition> GetTransistions(this WorkItemType workItemType)
{
    List<Transition> currentTransistions;
    // See if this WorkItemType has already had its transitions figured out.
    _allTransistions.TryGetValue(workItemType, out currentTransistions);
    if (currentTransistions != null)
        return currentTransistions;
    // Get this work item type as XML.
    XmlDocument workItemTypeXml = workItemType.Export(false);
    // Create a dictionary to allow us to look up the "to" state using a "from" state.
    var newTransistions = new List<Transition>();
    // Get the transitions node.
    XmlNodeList transitionsList = workItemTypeXml.GetElementsByTagName("TRANSITIONS");
    // As there is only one transitions item we can just get the first.
    XmlNode transitions = transitionsList[0];
    // Iterate all the transitions.
    foreach (XmlNode transition in transitions)
    {
        XmlElement defaultReasonNode = transition["REASONS"]["DEFAULTREASON"];
        var defaultReason = defaultReasonNode.Attributes["value"].Value;
        var otherReasons = new List<string>();
        XmlNodeList otherReasonsNodes = transition["REASONS"].SelectNodes("REASON");
        foreach (XmlNode reasonNode in otherReasonsNodes)
        {
            var reason = reasonNode.Attributes["value"].Value;
            otherReasons.Add(reason);
        }
        // Save off the transition.
        newTransistions.Add(new Transition
        {
            From = transition.Attributes["from"].Value,
            To = transition.Attributes["to"].Value,
            DefaultReason = defaultReason,
            OtherReasons = otherReasons
        });
    }
    // Save off this transition so we don't do it again if it is needed.
    _allTransistions.Add(workItemType, newTransistions);
    return newTransistions;
}

Umbraco Published Event Performance

I have a comments-type structure where users are able to post replies to an Article (one article can have many discussion replies). When a user adds a reply, I want the parent article's last updated date to also change, so that the article is placed at the top of the list when viewed from the frontend, indicating that it has had recent activity. To achieve this, the comment is added through a custom controller, and I have used the ContentService Published event to update the parent, though I am finding my event handler to be a bit of a bottleneck, taking up to six seconds to run:
public void OnApplicationStarted(UmbracoApplicationBase umbracoApplication, ApplicationContext applicationContext)
{
    ContentService.Published += ContentServicePublished;
}

private void ContentServicePublished(IPublishingStrategy sender, PublishEventArgs<IContent> e)
{
    foreach (var node in e.PublishedEntities)
    {
        // Handle updating the parent node's last edited date to address ordering.
        if (node.ContentType.Alias == "DiscussionReply")
        {
            var contentService = new Umbraco.Core.Services.ContentService();
            var parentNode = contentService.GetById(node.ParentId);
            int intSiblings = parentNode.Children().Count() + 1;
            if (parentNode.HasProperty("siblings"))
            {
                parentNode.SetValue("siblings", intSiblings);
                contentService.SaveAndPublishWithStatus(parentNode, 0, false);
            }
        }
    }
}
Is there anything obvious with this code that may be causing the performance issue?
Many thanks,
You should be using the Services Singleton for accessing the various services including ContentService.
One way to do so is to access the Services on ApplicationContext.Current like so:
var contentService = ApplicationContext.Current.Services.ContentService;
However, your bottleneck is going to be in retrieving the parent node and its properties, which requires multiple calls to the database. On top of that, you're retrieving the parent's children here:
int intSiblings = parentNode.Children().Count() + 1;
The better solution is to use the PublishedContent cache which doesn't hit the database at all and provides significantly superior performance.
If you're using a SurfaceController, use its Umbraco property (and you also have access to Services as well):
// After you've published the comment node:
var commentNode = Umbraco.TypedContent(commentNodeId);
// We already know this is a DiscussionReply node, no need to check.
int intSiblings = commentNode.Parent.Children.Count() + 1;
if (commentNode.Parent.HasProperty("siblings"))
{
    // It's only now that we really need to grab the parent node from the ContentService so we can update it.
    var parentNode = Services.ContentService.GetById(commentNode.ParentId);
    parentNode.SetValue("siblings", intSiblings);
    Services.ContentService.SaveAndPublishWithStatus(parentNode, 0, false);
}
If you're implementing a WebApi based on UmbracoApiController then the same Umbraco and Services properties are available to you there as well.
I'm using Umbraco 7.3.4 and here's my solution:
// Create a list of objects to be created or updated.
var newContentList = new List<MyCustomModel>() {
    new MyCustomModel { Id = 1, Name = "Document 1", Attribute1 = ... },
    new MyCustomModel { Id = 2, Name = "Document 2", Attribute1 = ... },
    new MyCustomModel { Id = 3, Name = "Document 3", Attribute1 = ... }
};
// Get old content from cache
var oldContentAsIPublishedContentList = (new UmbracoHelper(UmbracoContext.Current)).TypedContent(ParentId).Descendants("YourContentType").ToList();
// Get only modified content items
var modifiedItemIds = from x in oldContentAsIPublishedContentList
                      from y in newContentList
                      where x.Id == y.Id
                         && (x.Name != y.Name || x.Attribute1 != y.Attribute1)
                      select x.Id;
// Get modified items as an IContent list.
var oldContentAsIContentList = ApplicationContext.Services.ContentService.GetByIds(modifiedItemIds).ToList();
// Create final content list.
var finalContentList = new List<IContent>();
// Update or insert items
foreach (var item in newContentList) {
    // For each new content item, find an old IContent by the ID.
    // If the old IContent is found and the values are modified, add it to the finalContentList.
    // Otherwise, create a new instance using the API.
    IContent content = oldContentAsIContentList.FirstOrDefault(x => x.Id == item.Id) ?? ApplicationContext.Services.ContentService.CreateContent(item.Name, ParentId, "YourContentType");
    // Update content
    content.Name = item.Name;
    content.SetValue("Attribute1", item.Attribute1);
    finalContentList.Add(content);
    // The following code is required
    content.ChangePublishedState(PublishedState.Published);
    content.SortOrder = 1;
}
// If the finalContentList has some items, call the Sort method to commit and publish the changes
ApplicationContext.Services.ContentService.Sort(finalContentList);

Where in the Breeze entity materialization pipeline can I set a non-entity object's prototype?

This question is basically the same question I asked a few weeks ago, how to tap into MappingContext.processAnonType. I marked that question as answered by mistake and since then have not been able to get any follow-up.
Basically, what I am trying to figure out is a location within the Breeze pipeline where I can set a non-entity object's prototype when the object is materialized from server results. When Breeze processes results from the server that are not entities, it ends up calling the method below on the helper MappingContext class:
function processAnonType(mc, node) {
  // node is guaranteed to be an object by this point, i.e. not a scalar
  var keyFn = mc.metadataStore.namingConvention.serverPropertyNameToClient;
  var result = {};
  __objectForEach(node, function (key, value) {
    var newKey = keyFn(key);
    var nodeContext = { nodeType: "anonProp", propertyName: newKey };
    visitNode(value, mc, nodeContext, result, newKey);
  });
  return result;
}
Up above, the value of "result" is what the client ends up receiving from Breeze. This would be a perfect place to do what I want, because I have access to both the final object ("result") AND the node.$type property. I basically want to parse the node.$type property in order to figure out the prototype of the non-entity object. Unfortunately, it does not appear that processAnonType is an interception point within the pipeline. In the previous question I asked, I was directed to look at a custom JsonResultsAdapter. I did that, but I don't think it will work, simply because the JsonResultsAdapter never appears to be in a position to change the value of "result" (the final object returned). So even if I implement a custom JsonResultsAdapter and return new nodes, the value of "result" up above stays the same. Can anyone please clue me in? Thank you.
EDIT #1: I already tried using a custom JsonResultsAdapter, but this does NOT work for what I am SPECIFICALLY trying to do, unless I am using a very old version of Breeze (unlikely) or am missing something really obvious (more likely). Below I have provided two snippets of Breeze code that will hopefully help me explain my conclusion. The first snippet is the visitAndMerge method of the MappingContext. Towards the bottom of that method you'll see a call to jra.visitNode. That's great: it calls my custom JsonResultsAdapter implementation, which returns a "node" property in the result, so the following line uses that node rather than the original one. So far so good. Then at the end a call is made to processMeta, passing in my custom node. OK, fine. But if you look at the processMeta code, in my case the last "else" block ends up being invoked and a call is made to processAnonType. This is where the problem is: at that point my custom node is discarded for purposes of creating/instantiating the final object returned to the client. I understand my custom node will be used to create properties for the final object, but that's not what I am after; instead I need to manipulate the final object myself by setting its prototype. As I mentioned previously, if you look at the processAnonType method, it creates a new object (var result = {};) and that object is returned to the client, NOT my custom node, which is in line with what the documentation says. Please see all the comments I left in the previous post. Do you understand what my problem is? I am probably missing something really obvious here. Can you please clue me in? Thanks again.
proto.visitAndMerge = function (nodes, nodeContext) {
  var query = this.query;
  var jra = this.jsonResultsAdapter;
  nodeContext = nodeContext || {};
  var that = this;
  return __map(nodes, function (node) {
    if (query == null && node.entityAspect) {
      // don't bother merging a result from a save that was not returned from the server.
      if (node.entityAspect.entityState.isDeleted()) {
        that.entityManager.detachEntity(node);
      } else {
        node.entityAspect.acceptChanges();
      }
      return node;
    }
    var meta = jra.visitNode(node, that, nodeContext) || {};
    node = meta.node || node;
    if (query && nodeContext.nodeType === "root" && !meta.entityType) {
      meta.entityType = query._getToEntityType && query._getToEntityType(that.metadataStore);
    }
    return processMeta(that, node, meta);
  });
};
function processMeta(mc, node, meta, assignFn) {
  // == is deliberate here instead of ===
  if (meta.ignore || node == null) {
    return null;
  } else if (meta.nodeRefId) {
    var refValue = resolveEntityRef(mc, meta.nodeRefId);
    if (typeof refValue === "function" && assignFn != null) {
      mc.deferredFns.push(function () {
        assignFn(refValue);
      });
      return undefined; // deferred and will be set later;
    }
    return refValue;
  } else if (meta.entityType) {
    var entityType = meta.entityType;
    if (mc.mergeOptions.noTracking) {
      node = processNoMerge(mc, entityType, node);
      if (entityType.noTrackingFn) {
        node = entityType.noTrackingFn(node, entityType);
      }
      if (meta.nodeId) {
        mc.refMap[meta.nodeId] = node;
      }
      return node;
    } else {
      if (entityType.isComplexType) {
        // because we still need to do serverName to client name processing
        return processNoMerge(mc, entityType, node);
      } else {
        return mergeEntity(mc, node, meta);
      }
    }
  } else {
    if (typeof node === 'object' && !__isDate(node)) {
      node = processAnonType(mc, node);
    }
    // updating the refMap for entities is handled by updateEntityRef for entities.
    if (meta.nodeId) {
      mc.refMap[meta.nodeId] = node;
    }
    return node;
  }
}
You should NOT need to modify the processAnonType method.
The parameters to the visitNode method in the jsonResultsAdapter have all of the information regarding the node being visited that you say you need. (See the link at the bottom of this post). The result from the visitNode is an object with the following properties:
entityType: you should return null for an anonymous type
nodeId and nodeRefId: (probably not needed for anonymous objects unless you plan to return multiple refs to the same object)
ignore: boolean - (if you want to completely ignore the node).
node: This is where you can take the incoming node (the first parameter in the visitNode parameter list) and modify it, or return a completely new node object that represents your anonType instance. This object will be returned to the client unchanged, so you can create a new instance of your object with whatever prototype you want. If you don't set this property then the original incoming node will be used.
passThru: (avail in breeze versions > v 1.5.4) boolean - you should return true to return the node (above) intact without ANY further processing.
So your visitNode will look something like this:
visitNode: function(node, mappingContext, nodeContext) {
  // 'isAnonType' is your method that determines if this is an anon type
  var isAnon = isAnonType(node.$type);
  if (isAnon) {
    // 'createCustomAnonNode' is your method where you create a copy of the node
    // with whatever prototype you want.
    var newNode = createCustomAnonNode(node);
    return { passThru: true, node: newNode };
  } else {
    // assuming that you kept track of the default JsonResultsAdapter
    return defaultAdapter.visitNode(node, mappingContext, nodeContext);
  }
}
For more detail, see:
http://www.getbreezenow.com/documentation/jsonresultsadapters

Mahout recommendation engine recommending products and their quantities to customers

I am working on a Mahout recommendation engine use case. I precomputed recommendations and stored them in a database, and now I am planning to expose them with the Taste REST services to .NET. I have a limited number of customers and products; it is a distributor-level recommendation use case. My question is: if a new distributor comes in, how would I suggest recommendations to him, and how would I suggest the quantity of each recommended product to each distributor? Could you people give me some guidance? Am I going to face performance issues?
One way is, when a new user comes, to precompute the recommendations from scratch, either for all users or only for this user. You should know that this new user might change the recommendations for the other users too. It's up to your needs how frequently you want to do the pre-computations.
However, if you have a limited number of users and items, another way is to have an online recommender that computes the recommendations in real time. If you use the FileDataModel, there is a way to get the data from the new user periodically (see the book Mahout in Action). If you use an in-memory data model, which is faster, you can override the methods setPreference(long userID, long itemID, float value) and removePreference(long userID, long itemID), and whenever a new user comes and likes or removes some items you should call these methods on your data model.
EDIT: Basically you can get the GenericDataModel, and add this to the methods setPreference and removePreference. This will be your lower level data model. You can wrap it afterwards with ReloadFromJDBCDataModel by setting your data model in the reload() method like this:
DataModel newDelegateInMemory =
    delegate.hasPreferenceValues()
        ? new MutableDataModel(delegate.exportWithPrefs())
        : new MutableBooleanPrefDataModel(delegate.exportWithIDsOnly());
The overridden methods:
@Override
public void setPreference(long userID, long itemID, float value) {
  userIDs.add(userID);
  itemIDs.add(itemID);
  setMinPreference(Math.min(getMinPreference(), value));
  setMaxPreference(Math.max(getMaxPreference(), value));
  Preference p = new GenericPreference(userID, itemID, value);
  // User preferences
  GenericUserPreferenceArray newUPref;
  int existingPosition = -1;
  if (preferenceFromUsers.containsKey(userID)) {
    PreferenceArray oldPref = preferenceFromUsers.get(userID);
    newUPref = new GenericUserPreferenceArray(oldPref.length() + 1);
    for (int i = 0; i < oldPref.length(); i++) {
      // If the item does not exist in the liked user items, add it!
      if (oldPref.get(i).getItemID() != itemID) {
        newUPref.set(i, oldPref.get(i));
      } else {
        // Otherwise remember the position
        existingPosition = i;
      }
    }
    if (existingPosition > -1) {
      // And change the preference value
      oldPref.set(existingPosition, p);
    } else {
      newUPref.set(oldPref.length(), p);
    }
  } else {
    newUPref = new GenericUserPreferenceArray(1);
    newUPref.set(0, p);
  }
  if (existingPosition == -1) {
    preferenceFromUsers.put(userID, newUPref);
  }
  // Item preferences
  GenericItemPreferenceArray newIPref;
  existingPosition = -1;
  if (preferenceForItems.containsKey(itemID)) {
    PreferenceArray oldPref = preferenceForItems.get(itemID);
    newIPref = new GenericItemPreferenceArray(oldPref.length() + 1);
    for (int i = 0; i < oldPref.length(); i++) {
      if (oldPref.get(i).getUserID() != userID) {
        newIPref.set(i, oldPref.get(i));
      } else {
        existingPosition = i;
      }
    }
    if (existingPosition > -1) {
      oldPref.set(existingPosition, p);
    } else {
      newIPref.set(oldPref.length(), p);
    }
  } else {
    newIPref = new GenericItemPreferenceArray(1);
    newIPref.set(0, p);
  }
  if (existingPosition == -1) {
    preferenceForItems.put(itemID, newIPref);
  }
}
@Override
public void removePreference(long userID, long itemID) {
  // User preferences
  if (preferenceFromUsers.containsKey(userID)) {
    List<Preference> newPu = new ArrayList<Preference>();
    for (Preference p : preferenceFromUsers.get(userID)) {
      if (p.getItemID() != itemID) {
        newPu.add(p);
      }
    }
    preferenceFromUsers.remove(userID);
    preferenceFromUsers.put(userID, new GenericUserPreferenceArray(newPu));
  }
  if (preferenceFromUsers.get(userID).length() == 0) {
    preferenceFromUsers.remove(userID);
    userIDs.remove(userID);
  }
  if (preferenceForItems.containsKey(itemID)) {
    List<Preference> newPi = new ArrayList<Preference>();
    for (Preference p : preferenceForItems.get(itemID)) {
      if (p.getUserID() != userID) {
        newPi.add(p);
      }
    }
    preferenceForItems.remove(itemID);
    preferenceForItems.put(itemID, new GenericItemPreferenceArray(newPi));
  }
  if (preferenceForItems.get(itemID).length() == 0) {
    // Not sure if this is needed, but it works without removing the item
    //preferenceForItems.remove(itemID);
    //itemIDs.remove(itemID);
  }
}
If by "new distributor" you mean that you have no data for them, no historical data. Then you cannot make recommendations using Mahout's recommenders.
You can suggest other items once they chose one. Use Mahout's "itemsimilarity" driver to calculate similar items for everything in your catalog. Then if they choose something you can suggest similar items.
The items that come from the itemsimilarity driver can be stored in you DB as a column value containing ids for similar items for every item. Then you can index the column with a search engine and use the user's first order as the query. This will return realtime personalized recommendations and is the most up-to-date method suggested by the Mahout people.
See a description of how to do this in this book by Ted Dunning, one of the leading Mahout Data Scientists. http://www.mapr.com/practical-machine-learning

BreezeJS - Using expand

I am querying the server to get an entity with expand
function _loadIncidents() {
  var deffered = Q.defer(),
      queryObj = new breeze.EntityQuery().from('Incidents').expand(['Deployments', 'IncidentComments', 'DTasks', 'ExtendedProperties', 'IncidentEvents']);
  dataRepository.fetchEntitiesByQuery(queryObj, true).then(function (incidents) {
    var query = breeze.EntityQuery.from("DTasks"),
        incidentIds = dataRepository.getEntitiesByQuerySync(query);
    deffered.resolve();
  }, function(err) {
    deffered.reject(err);
  });
  return deffered.promise;
};
I am getting the results and all is fine; however, when I query the Breeze cache to get the entities, I get an empty collection. So when using expand, are the expanded entities added to the cache?
Yes the related entities identified in the expand should be in cache ... if the query is "correct" and the server interpreted your request as you intended.
Look at the payload of the response from the first request. Are the related entities present? If not, perhaps the query was not well received on the server. As a general rule, you want to make sure the data are coming over the wire before wondering whether Breeze is doing the right thing with those data.
I do find myself wondering about the spelling of the items in your expand list. They are all in PascalCase. Are these the names of navigation properties of the Incident type? Or are they the names of the related EntityTypes? They need to be the former (nav property names), not the latter.
I had a problem with the navigation properties. As I am not using the OData Web API and not using EF, there is a problem with the navigation properties, so for the time being I just wrote:
Object.defineProperty(this, 'Deployments', {
  get: function () {
    return (this.entityAspect && this.entityAspect.entityManager) ?
      this.entityAspect.entityManager.executeQueryLocally(new breeze.EntityQuery("Deployments").
        where('IncidentID', 'eq', this.IncidentID)) :
      [];
  },
  set: function (value) { // used only when loading incidents from the server
    if (!value.results) {
      return;
    }
    var i = 0,
        dataRepository = require('sharedServices/dataRepository');
    for (i; i < value.results.length; i++) {
      dataRepository.addUnchangedEntity('Deployment', value.results[i]);
    }
  },
  enumerable: true
});
