TypeORM: Last update date with children

We have an application that manages what we call "jobs".
If a job is updated on the backend, we want to trigger a synchronization to our frontend.
This is done through a polling mechanism (WebSockets are forbidden for security reasons).
We consider that a job must be synced if the job's last update date is greater than the last synchronization date.
We are using @UpdateDateColumn to keep track of the last update date on each entity.
However, to get the last update date of a job, we have to check the last update date on the job itself but also on all of its sub-entities (relations).
As we want to keep our system as performant as possible, we want to avoid complex queries on each polling request.
That's why we wanted to implement a mechanism that keeps the job's last update date current each time a child entity is updated.
For the implementation, we first thought about SQL triggers (Postgres), but they are not yet supported by TypeORM.
Then we had a look at @EventSubscriber, but there are some limitations: it fires for remove but not for delete, for save but not for update, and it can miss updates done through custom queries...
Can you think of any other way to solve this problem?
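
One partial workaround, within the limitations described above, is an entity subscriber that "touches" the parent job whenever a child row is written through TypeORM's persistence API. Below is a minimal sketch: the entity names (Job, Step) and fields (jobId, updatedAt) are hypothetical, the subscriber still has to be registered in the DataSource's subscribers option, and it will still miss updates done through custom queries.

import {
  Entity, PrimaryGeneratedColumn, UpdateDateColumn, Column, ManyToOne,
  EntitySubscriberInterface, EventSubscriber, EntityManager,
  InsertEvent, UpdateEvent, RemoveEvent,
} from "typeorm";

// Hypothetical entities, trimmed to the relevant columns.
@Entity()
export class Job {
  @PrimaryGeneratedColumn() id: number;
  @UpdateDateColumn() updatedAt: Date;
}

@Entity()
export class Step {
  @PrimaryGeneratedColumn() id: number;
  @Column() jobId: number;
  @ManyToOne(() => Job) job: Job;
}

@EventSubscriber()
export class StepSubscriber implements EntitySubscriberInterface<Step> {
  listenTo() {
    return Step;
  }

  // Set the parent's update column explicitly rather than relying on
  // an empty UPDATE, which would change no columns.
  private touchJob(manager: EntityManager, jobId: number) {
    return manager.update(Job, jobId, { updatedAt: new Date() });
  }

  async afterInsert(event: InsertEvent<Step>) {
    await this.touchJob(event.manager, event.entity.jobId);
  }

  async afterUpdate(event: UpdateEvent<Step>) {
    const step = (event.entity ?? event.databaseEntity) as Step | undefined;
    if (step) await this.touchJob(event.manager, step.jobId);
  }

  async afterRemove(event: RemoveEvent<Step>) {
    const step = event.databaseEntity ?? event.entity;
    if (step) await this.touchJob(event.manager, step.jobId);
  }
}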

Related

Zapier: How to make sure that I return items only once for polling trigger

I am implementing a Zapier integration's polling trigger. I have built a trigger and an API which serves the data correctly. However, my concern is how to make sure that I provide only new data when Zapier polls.
I know about the deduplication mechanism: I provide ids on all the items, and Zapier makes sure that each item is used only once. However, in my application the items can go into the hundreds very quickly, and within months they will be in the thousands and beyond. I want an optimised solution where I serve only the items that will eventually be used by Zapier, thus reducing the memory usage in my application.
A timestamp could be saved for every call and stored inside my application, but that would not be a foolproof solution: the same API can be used by a user in multiple Zaps, plus there are sample calls, etc.
Great question! The simplest way to do this is to add a date parameter to your API that lets you filter for items created after that date.
Then, in your Zapier code, provide that param for all trigger calls. I'd set the time to 24 hours ago, so when the trigger fires, it'll only get items created in the last 24 hours. That could be a big list, but items will cycle out after a day.
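
A minimal sketch of what that looks like in a Zapier CLI trigger's perform function (zapier-platform-core); the created_after parameter name and the endpoint URL are assumptions about your API:

// Sketch of a Zapier CLI polling trigger.
const perform = async (z: any, bundle: any) => {
  // Only ask the API for items created in the last 24 hours.
  const dayAgo = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
  const response = await z.request({
    url: "https://api.example.com/items", // hypothetical endpoint
    params: { created_after: dayAgo },    // hypothetical filter param
  });
  // Items must still carry a unique id so Zapier's deduplication
  // can skip anything it has already seen.
  return response.data;
};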

Automatically remove entry from Firebase that is out of date

I have an iOS social app that uses Firebase as the main database to store all the posts, with a timestamp included.
What I want to achieve is to remove anything that is more than 10 days old from my database.
Currently, I am checking this in a super inefficient way (the only way I know): every time the user queries Firebase, I have Swift code that also queries the ENTIRE database and deletes all entries that are more than 10 days old. This works, but it is really inefficient...
What you're trying to do is currently best done on a server you control, with a job that runs periodically to scan for and delete the old items. You can use the Admin SDK for that.
You should also have an index on the time field you're using to determine age, in order to optimize the query that generates the results.
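
For example, a periodic cleanup with the Firebase Admin SDK for Node.js might look like the sketch below, assuming posts live under /posts with a numeric timestamp field in milliseconds; adjust paths and field names to your schema.

import * as admin from "firebase-admin";

admin.initializeApp(); // credentials from the environment

async function deleteOldPosts(): Promise<void> {
  const cutoff = Date.now() - 10 * 24 * 60 * 60 * 1000; // 10 days ago
  // The index on the timestamp field (".indexOn" in your rules)
  // is what makes this range query efficient.
  const snapshot = await admin
    .database()
    .ref("posts")
    .orderByChild("timestamp")
    .endAt(cutoff)
    .once("value");
  const updates: Record<string, null> = {};
  snapshot.forEach((child) => {
    updates[child.key as string] = null; // writing null deletes the node
  });
  if (Object.keys(updates).length > 0) {
    await admin.database().ref("posts").update(updates);
  }
}

// Run from cron or Cloud Scheduler, e.g. once a day.
deleteOldPosts().catch(console.error);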

Logging data changes for synchronization

I am looking for a solution for logging data changes for a public API.
There is a need to tell the client app which tables from the DB have changed and need to be synchronized since the app last synchronized, scoped to a specific brand and country.
Current Solution:
A Version table with the class_names of models, which is touched by every model on create, delete, touch, and save actions.
When we touch the Version for a specific model, we also look at the reflected associations and touch them too.
The Version model is scoped to brand and country.
The REST API responds to a request that includes a last_sync_at timestamp, brand, and country.
Rails looks at Version with the given attributes and returns the class_names of the models that were changed since the last_sync_at timestamp.
This solution works, but the problems are performance and maintainability.
UPDATE 1:
Maybe the simpler question is:
What is the best practice for finding out, and telling frontend apps, when and what needs to be synchronized, in terms of the whole concept?
Conditions:
Frontend apps need to download only their own content changes, not the whole dataset.
Changes for one country or brand must not trigger synchronization for applications of a different country or brand.
Thank you.
I think the best solution would be to use Redis (or some other key-value store) and save your information there. Writing to Redis is much faster than to any SQL DB. You could write a service class that saves the data like:
RegisterTableUpdate.set(table_name, country_id, brand_id, timestamp)
Such a call would save the given timestamp under a key that could look like, e.g., table-update-1-1-users, where the first number is the country id and the second is the brand id, followed by the table name (or you could use country and brand names if needed). If you want to find out which tables have changed, you just need to find the Redis keys matching the pattern "table-update-1-1-*", iterate through them, and check which are newer than the timestamp sent through the API.
It is worth remembering that Redis is not as reliable as SQL databases; its reliability depends on configuration, so you might want to read the Redis guidelines and decide whether you would like to go for it.
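
The question is about Rails, but the idea is language-agnostic; for illustration, here is the same key scheme sketched in TypeScript with the ioredis client (the key layout and names are just the ones suggested above):

import Redis from "ioredis";

const redis = new Redis();

// Record that a table changed for a given country/brand.
async function registerTableUpdate(
  table: string, countryId: number, brandId: number
): Promise<void> {
  await redis.set(`table-update-${countryId}-${brandId}-${table}`, Date.now());
}

// Return the names of tables changed since the client's last sync.
async function changedTables(
  countryId: number, brandId: number, lastSyncAt: number
): Promise<string[]> {
  const prefix = `table-update-${countryId}-${brandId}-`;
  // KEYS is fine for a small keyspace; prefer SCAN in production.
  const keys = await redis.keys(`${prefix}*`);
  const stamps = await Promise.all(keys.map((k) => redis.get(k)));
  return keys
    .filter((_, i) => Number(stamps[i]) > lastSyncAt)
    .map((k) => k.slice(prefix.length));
}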
You can take advantage of the fact that ActiveRecord automatically records the time of every table row update (the updated_at column).
When checking what needs to be updated, select the objects you are interested in and compare their updated_at with the timestamp from the client app.
The advantage of this approach is that you don't need to keep an additional table listing all the updates to models, which should speed things up for the API users and be easier to maintain.
The disadvantage is that you cannot see the changes in the data over time; you only know that a change occurred and you can access the latest version. If you need to track changes in the data over time efficiently, then I'm afraid you'll have to rework things from the top.
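
Translated to the TypeORM setting of the first question, that check is a simple MoreThan filter on the @UpdateDateColumn; the entity and field names here are hypothetical:

import { DataSource, MoreThan } from "typeorm";
import { Job } from "./entities/Job"; // hypothetical entity with an updatedAt @UpdateDateColumn

// Return only the jobs touched since the client's last sync.
async function jobsChangedSince(dataSource: DataSource, lastSyncAt: Date) {
  return dataSource.getRepository(Job).find({
    where: { updatedAt: MoreThan(lastSyncAt) },
  });
}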
(read the last part - that is what you are interested in)
I would recommend that you use the decorator design pattern for changing the client queries, so the client sends a query for what it wants and the server decides what to give it based on the client's last update.
So:
the client sends a query that includes the time it last synced
the server sees the query and takes into account the client's nature (device, country)
the server decorates (changes accordingly) the query to request only the relevant data from the DB, and if that is not possible:
after the data are returned from the database manager, they are trimmed to be relevant to where they are going
the server returns to the client all the new stuff the client cares about.
I assume that you have a time-entered field on your DB entries.
In that case, the "decoration" of the query would (abstractly) just be to add something like a WHERE clause stating that you want data entered after the last update.
Finally, if you want this done for many devices/locales/whatever, implement a decorator for the query and one for the result of the query, and serve them to your clients as they should be served. (Keep in mind that, in contrast with a subclassing approach, you will only have to implement one decorator per device/locale/whatever - not one for every combination!)
Hope this helped!
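
A toy sketch of that decorator composition in TypeScript (the names are hypothetical, and the string-built SQL is for illustration only; use bound parameters in real code):

// Each concern wraps the base query independently, so decorators
// compose instead of multiplying into one class per combination.
interface ItemQuery {
  toSQL(): string;
}

class BaseQuery implements ItemQuery {
  toSQL(): string {
    return "SELECT * FROM items WHERE 1=1";
  }
}

class SinceDecorator implements ItemQuery {
  constructor(private inner: ItemQuery, private lastSyncAt: string) {}
  toSQL(): string {
    // Assumes a time-entered / updated_at column on the table.
    return `${this.inner.toSQL()} AND updated_at > '${this.lastSyncAt}'`;
  }
}

class CountryDecorator implements ItemQuery {
  constructor(private inner: ItemQuery, private countryId: number) {}
  toSQL(): string {
    return `${this.inner.toSQL()} AND country_id = ${this.countryId}`;
  }
}

// Composed per request, based on the client's nature and last sync:
const query = new CountryDecorator(
  new SinceDecorator(new BaseQuery(), "2024-01-01T00:00:00Z"),
  42
);
console.log(query.toSQL());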

What are CourseCompletions and when are they created?

I see the entries in the API documentation for getting "CourseCompletion" objects, but I do not see how these are entered in the Learning Environment. Can you explain what these objects are?
CourseCompletion records are essentially metadata-style notes that you can attach to a user/course-offering combination to record that a user "completed" a course on such-and-such a date. The course completion record can also carry an expiry date for when the "completion" becomes out of date or no longer relevant. These features are not heavily used by D2L customers and are not exposed through the Web UI.
I don't believe there is any automation within the back-end service around the creation or modification of these records (for example, there isn't an event in the system when a course completion record would get created: a client would need to manually create such a record when it wants one to exist).

CoreData Entity Updates at App Login

My app talks to a web server. At login, I pull down JSON and populate Core Data with 4 entities (about 1000 rows each). The data changes on the server, so with every login I have to update my existing Core Data store.
What is the best approach to find out whether records exist and insert new ones if need be?
To be smart about updates (not blindly updating every time), you need some intelligence on the server side.
Here is one approach:
1. The server has a master table that records the modification timestamp of each of the 4 entities, plus an API that exposes the master table. Every time a change occurs to one of the 4 entities, the corresponding entry in the master table has to be updated as well.
2. You keep the same copy of the master table on the application side.
3. On application launch, you query the API from step 1 and compare with the values from step 2 to see if a timestamp has been updated on the server side.
4. If YES, then download and replace the corresponding entity.
Another approach, which allows finer control:
1. Add a timestamp column to the 4 entities on the server side. Every time an entry is added or updated, the timestamp is updated.
2. Prepare an API for each entity that returns only items newer than a specified timestamp.
3. On application launch, you query the API from step 2 and update.
The hole in the second approach is that it cannot handle deletions on the server side. Maybe you can combine it with something like the first approach to support this.
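
For the server side of the second approach, the per-entity API can be as small as the sketch below (TypeScript/Express; the route shape and the since parameter are assumptions, and the data access is a stub):

import express from "express";

// Placeholder data access; replace with a real query such as
// SELECT * FROM <entity> WHERE updated_at > $1.
async function fetchNewerThan(entity: string, since: Date): Promise<unknown[]> {
  return []; // stub
}

const app = express();

// e.g. GET /api/jobs/changes?since=2024-01-01T00:00:00Z
app.get("/api/:entity/changes", async (req, res) => {
  const since = new Date(String(req.query.since ?? "1970-01-01"));
  // As noted above, this cannot report deletions; removed rows
  // simply stop appearing in the response.
  res.json(await fetchNewerThan(req.params.entity, since));
});

app.listen(3000);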
