Ember - Initializing table data from ember-data then updating via websocket - ruby-on-rails

I have a table of data that I want to update dynamically.
When the user goes to the page, I want to initialize the table with data from my rails backend. Easy with the model hook and ember data.
I then want to keep this information refreshed using the connected websocket stream.
How should I manage this? Should I be updating the model with the websocket updates (without committing the data to the backend)? The table data is an object array in the component; should I just initialize it from the model in setupController and then keep the array updated directly?
Is there an easy way to map the websocket data JSON into the model or table array?

Yes, you should be able to do this with Ember Data. Caveat: I have not tried this.
Somewhere you are opening your websocket stream and adding an "onmessage" handler. In that handler, you will receive the payload from the server, where you can use store.pushPayload() to update the record in the store. If the record (identified by the id field) is already in the store, it will be replaced. Otherwise it will be added as a new record. If you are displaying the record in the current template, you should just see the values change when the new record is pushed.
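As a rough sketch of what that handler could look like (the service, the "row" model name, and the socket URL below are assumptions for illustration, not taken from your app):

// app/services/table-stream.js — minimal sketch, not tested
import Service, { inject as service } from '@ember/service';

export default class TableStreamService extends Service {
  @service store;

  connect() {
    this.socket = new WebSocket('wss://example.com/stream');
    this.socket.onmessage = (event) => {
      // Expecting a payload shaped for your serializer,
      // e.g. { "rows": [{ "id": 1, ... }] } for the REST serializer
      const payload = JSON.parse(event.data);
      // Updates the record with a matching id, or adds a new one
      this.store.pushPayload('row', payload);
    };
  }
}

Any template that is already rendering those records should re-render as the pushes land.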
This Ember guide page describes this scenario, where you are streaming data from the server and you want to see instant updates to the user interface.
Additional reading: the API for pushPayload

Related

How to handle CRUD operation with Diffable Data Source & NSFetchedResultsController

I have a simple product store screen of 4 rows with each row containing 4-5 products.
Currently I have the following setup:
Controller calls the backend server and gets the data
Save the models to Core Data
This triggers an update of NSFetchedResultsController
Apply the new snapshot using Diffable Data Source
This setup works great when new products are added to any row or are updated in some way.
But I am unable to find a way to handle the scenario where a product is removed from the API response, because whenever I receive a response I save it to the Core Data store and then rely on the FRC trigger to apply the snapshot.
So if an existing product is removed from the response, it still continues to show in the store screen, since it also needs to be removed from the database. That means that before saving to the database I would always have to delete all the existing data and then save the new data for the change to take effect.
Can anybody suggest a change to my setup, or a particular flow, to handle this scenario?

Dynamics365 Operations: Created/Updated timestamps with Data Entities

I am new to Dynamics FnO, and recently followed the articles on accessing data through OData, which worked.
What I find missing from the data objects, compared to what I normally receive in integrations outside the Microsoft world, is the created/updated timestamps.
I am trying to set up a data flow from FnO to my Node.js application, so that my app keeps polling FnO for data whenever there is a change. This could be achieved easily if the data that flows in carried timestamps.
Is there a way to set up those timestamps somewhere?
You have to make sure that the underlying table you are querying has the fields added to it, and also that the data entity you are accessing through OData has the fields set up on it as well.
Make sure this is set up on the table:
Then drag and drop the field(s) from the data source field list to the exposed field list in the data entity:
After this, you will have these fields available on the entity.
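Once those fields are exposed on the entity, the polling from Node.js can filter on them. A rough sketch (the entity name, field name and bearer token are placeholders, and it assumes Node 18+ for the global fetch):

// poll.js — illustration only; CustomersV3 and ModifiedDateTime are assumed names
const BASE_URL = 'https://yourorg.operations.dynamics.com/data';

let lastSync = new Date(0).toISOString();

async function pollChanges(token) {
  // Ask only for records modified since the previous poll
  const filter = encodeURIComponent(`ModifiedDateTime gt ${lastSync}`);
  const res = await fetch(`${BASE_URL}/CustomersV3?$filter=${filter}`, {
    headers: { Authorization: `Bearer ${token}`, Accept: 'application/json' },
  });
  const body = await res.json();

  lastSync = new Date().toISOString();
  return body.value; // only the records created/updated since last time
}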

Create an activity from an update to an already created object in Stream-Rails?

I am using the high-level stream-rails Ruby on Rails client for Stream to create a notifications section on my web app, specifically for friend requests.
I understand that when an AR model instance of something like a FriendRequest model is created, it is stored in the feed as an activity. However, I would also like an activity to be added to the feed when that FriendRequest instance is modified (ex. updating an attribute). Is there a way to do this?
Your approach could differ based on what you're updating in the activity. Normally you'd send us a foreign_id and object_id that point at the object in your database. You can certainly send metadata in the activity itself, of course. If what is changing isn't already in the activity payload, then enrichment should take care of it by pulling your update out of the database later, when you fetch the feed.
If it IS metadata within the activity: save the same activity data (with whatever updated metadata you want to track) with the same foreign_id and timestamp (you could also track an updated date in a metadata field if that's important). Our addActivity API call acts as an upsert as long as the foreign_id and timestamp are the same as before and will overwrite your previous activity with any new data.
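To make that concrete, the upsert behaviour looks roughly like this (shown with Stream's JavaScript server client purely for illustration; the feed group, ids and the status field are made up, and the same foreign_id/time semantics apply when calling from stream-rails):

// illustration of the addActivity upsert; names and fields are placeholders
const stream = require('getstream');

const client = stream.connect('api_key', 'api_secret');
const notifications = client.feed('notification', 'user_42');

async function upsertFriendRequest() {
  const activity = {
    actor: 'User:7',
    verb: 'friend_request',
    object: 'FriendRequest:123',
    foreign_id: 'FriendRequest:123',
    time: '2024-01-15T10:00:00',
    status: 'pending', // metadata stored on the activity itself
  };

  // First call creates the activity in the feed
  await notifications.addActivity(activity);

  // A later call with the same foreign_id and time overwrites it, so the
  // status change updates the existing activity instead of adding a new one
  await notifications.addActivity({ ...activity, status: 'accepted' });
}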

How to disable the cache in the OData v2 model in SAPUI5

The OData v2 model caches the data it has read. I want to disable this caching; I don't want to keep the data in my model. Any suggestions?
You cannot disable the cache, but you also don't really need to.
If you're afraid that the cached data is stale, just use the refresh method to make the model retrieve fresh data from the server. When you call refresh, each binding reloads its data from the server; for list and element bindings, a new request to the backend is triggered.
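For example, from a controller you could wire a refresh action like this (a sketch; the controller name and the button handler are assumptions):

sap.ui.define(["sap/ui/core/mvc/Controller"], function (Controller) {
  "use strict";

  return Controller.extend("my.app.controller.Products", {
    // Hypothetical handler for a "Refresh" button in the view
    onRefreshPress: function () {
      // refresh(true) forces every binding of this ODataModel to re-request
      // its data from the backend, even if the model thinks nothing changed
      this.getView().getModel().refresh(true);
    }
  });
});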

How to avoid an unsaved entity while fetching data in Core Data?

This is my first project where I am using Core Data with sqlite as the backing store.
Here are the quick details of the scenario:
There is a feedback form that gets filled in one screen.
There is a screen where I can see saved forms.
The form data can be synced with back-end server.
I am using MKNetworkKit for interacting with the REST API. (Looks like I should have looked at RestKit, but I don't have time to go back.)
When I save the form, I save the data through the application's main managedObjectContext to the persistent store.
In the form screen I have a sync button that syncs the application data with backend.
Also while saving the data locally, I check for connectivity and push the rest of the unsynced data to the server.
In the screen where I have saved forms, there also I have a sync button to sync data.
My problem is that in the form screen, while I am still filling in a form, the entity's info is incomplete and the entity is in an inconsistent state.
I use the same method of my dataManager singleton to do the syncing.
In other screens where I sync data, my managed object context is in a consistent state and I can sync the data, but while filling in the form I want to exclude the entity I am working on and have not yet saved.
What should I be doing now to get things done quickly?
Also, what would be the ideal way of designing such an application using Core Data?
Don't create the actual entity until the form data is complete and validated. If you need an intermediate place to store it while editing is happening, invent an object with the same data fields but which isn't a managed object. (Java people used to do this regularly with the Data Transfer Object pattern.)
