TypeORM using SoftDelete @BeforeRemove - typeorm

Why is the @BeforeRemove listener only executed by the manager.remove() method, which permanently deletes the record from the database table?
Why is this @BeforeRemove listener not executed by manager.softRemove(), for example?
It makes no sense to fill a deleted_at column and then delete the record using manager.remove()!
@BeforeRemove()
delete() {
  this.deleted_by = 'MyName';
}
// Runs only with manager.remove()
async remove(request: Request, _response: Response, next: NextFunction) {
  const userToRemove: any = await this.userRepository.findOne(request.params.id);
  return this.userRepository.softRemove(userToRemove);
}
// softRemove() does not run @BeforeRemove /:

Resolved in release 0.2.42:
https://github.com/typeorm/typeorm/releases/tag/0.2.42
https://github.com/typeorm/typeorm/pull/8403
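That release adds entity listeners for the soft-remove path; a minimal sketch of how the entity could hook into it, assuming the @BeforeSoftRemove decorator that PR introduces (the entity and column layout here are illustrative, not taken from the question):
import { Entity, PrimaryGeneratedColumn, Column, DeleteDateColumn, BeforeSoftRemove } from 'typeorm';

@Entity()
export class User {
  @PrimaryGeneratedColumn()
  id: number;

  @Column({ nullable: true })
  deleted_by: string;

  // Filled in automatically by softRemove(); this is what marks the row as soft-deleted.
  @DeleteDateColumn()
  deleted_at: Date;

  // Runs on manager.softRemove()/repository.softRemove(), unlike @BeforeRemove.
  @BeforeSoftRemove()
  markDeletedBy() {
    this.deleted_by = 'MyName';
  }
}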

Related

Firestore batch.commit adding more than 500 documents at a time

I am new to Firestore and am trying to figure out a fast way to add some documents in Firestore using Dart.
I used the code below. I had about 3000 strings in a list, and it ended up adding all 3000 documents, but it took a long time (about 10 minutes), and I also got an error message after batch.commit saying that the 500 limit was exceeded, even though all 3000 documents were added.
I know I should break it down into 500 at a time, but the fact that it added all the documents in the list does not make sense to me. I checked in the Firestore console, and all 3000 documents were there.
I need to create a document ID every time I add a document. What am I doing wrong? Is it OK to use add() to get a document reference and then batch.setData()?
Future<void> addBulk(List<String> stringList) async {
  WriteBatch batch = Firestore.instance.batch();
  for (String str in stringList) {
    // Check if str already exists; if it does not exist, add it to Firestore
    if (str.isNotEmpty) {
      QuerySnapshot querySnapshot = await myCollection
          .where('name', isEqualTo: str)
          .getDocuments(source: Source.cache);
      if (querySnapshot.documents.length == 0) {
        UserObject obj = UserObject(name: str);
        DocumentReference ref = await myCollection.add(obj.toMap());
        batch.setData(ref, obj.toMap(), merge: true);
      }
    }
  }
  batch.commit();
}
Your code is actually just adding each document separately, regardless of what you're doing with the batch. This line of code does it:
DocumentReference ref = await myCollection.add(obj.toMap());
If you remove that line (which, again, is not touching your batch), I'm sure you will just see a failure due to the batch size.
If you are just trying to generate a unique ID for the document to be added in the batch, use document() with no parameters instead:
DocumentReference ref = myCollection.document();
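Once the references come from document() instead of add(), the 500-write limit still applies, so the list has to be committed in chunks. A rough sketch of that chunking idea, shown here in TypeScript with the Firebase Admin SDK rather than the Dart plugin, purely as an illustration (the collection name is made up, and the duplicate check from the question is omitted):
import * as admin from 'firebase-admin';

admin.initializeApp();
const db = admin.firestore();

async function addBulk(stringList: string[]): Promise<void> {
  const collection = db.collection('users'); // hypothetical collection name
  for (let i = 0; i < stringList.length; i += 500) {
    const batch = db.batch();
    for (const str of stringList.slice(i, i + 500)) {
      const ref = collection.doc(); // generates a new ID without writing anything
      batch.set(ref, { name: str });
    }
    await batch.commit(); // one round trip per chunk of at most 500 writes
  }
}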

How to check if a record was updated using Zend Framework 2's Sql Adapter class

I'm trying to test whether an update query was successful with Zend Framework 2. I'm using the getAdapter()->query() methods, but I'm unsure how to check whether anything was returned or whether it actually executed. I know it is executing (I can see the update working via MySQL Workbench), but I'm not sure how to count or verify the result. Here is the code I have in place (which I know is wrong, but I don't know what else to do):
$update = $this->update->table('stores')
    ->set(array('number_of_items' => $number))
    ->where(array('store_name' => $this->store_name));
$query = $this->sql->getAdapter()->query($this->sql->buildSqlString($update), Adapter::QUERY_MODE_EXECUTE);
if ($query->count() > 0) {
    // insert the items into the items table
    $insert = $this->insert->into('items')
        ->columns(array('store_id', 'price', 'description'))
        ->values(array($row['store_id'], $price, $item_desc));
    $query = $this->sql->getAdapter()->query(
        $this->sql->buildSqlString($insert),
        Adapter::QUERY_MODE_EXECUTE
    );
    if ($query->count() > 0) {
        return true;
    } else {
        throw new \Exception("Error adding your item to the items table, please try again.");
    }
} else {
    // this is the exception being thrown
    throw new \Exception("An error occurred while adding your item(s) to the store, please try again");
}
Now, I know count() most likely only works on select queries, but I am unsure how to test whether the update and insert were successful.
Any help would be appreciated.
Thanks
To test whether the update and insert were successful, based on your code:
try {
    $insert = $this->insert->into('items')
        ->columns(array('store_id', 'price', 'description'))
        ->values(array($row['store_id'], $price, $item_desc));
    $result = $this->sql->getAdapter()->query(
        $this->sql->buildSqlString($insert),
        Adapter::QUERY_MODE_EXECUTE
    );
    $affectedRows = $result->getAffectedRows(); // number of rows affected by the insert
    var_dump($affectedRows);
} catch (\Exception $e) {
    var_dump($e->getMessage()); exit; // see if there is any exception or error in the query
}
The same applies to delete and update: after successful execution they also report the number of affected rows, so you can use that to check whether your query succeeded.
Thanks.

Service Workers and IndexedDB

In a simple JavaScript Service Worker I want to intercept a request and read a value from IndexedDB before the event.respondWith.
But the asynchronous nature of IndexedDB does not seem to allow this.
Since indexedDB.open is asynchronous, we have to await it, which is fine. However, the callback (onsuccess) happens later, so the function exits immediately after the await on open.
The only way I have found to get it to work reliably is to add:
var wait = ms => new Promise((r, j) => setTimeout(r, ms));
await wait(50)
at the end of my readDB function to force a wait until the onsuccess has completed.
This is completely stupid!
And please don't even try to tell me about promises. They DO NOT WORK in this circumstance.
Does anyone know how we are supposed to use this properly?
A sample readDB is here (all error checking removed for clarity). Note that we cannot use await inside onsuccess, so the two inner IndexedDB calls are not awaited!
async function readDB(dbname, storeName, id) {
  var result;
  var request = await indexedDB.open(dbname, 1); // indexedDB.open is an asynchronous function
  request.onsuccess = function (event) {
    let db = event.target.result;
    var transaction = db.transaction([storeName], "readonly"); // This is also asynchronous and needs await
    var store = transaction.objectStore(storeName);
    var objectStoreRequest = store.get(id); // This is also asynchronous and needs await
    objectStoreRequest.onsuccess = function (event) {
      result = objectStoreRequest.result;
    };
  };
  // Without this wait, this function returns BEFORE the onsuccess has completed
  console.warn('ABOUT TO WAIT');
  var wait = ms => new Promise((r, j) => setTimeout(r, ms));
  await wait(50);
  console.warn('WAIT DONE');
  return result;
}
And please don't even try to tell me about promises. They DO NOT WORK in this circumstance.
...
...
...
I mean, they do, though. Assuming that you're okay putting the promise-based IndexedDB lookups inside of event.respondWith() rather than before event.respondWith(), at least. (If you're trying to do this before calling event.respondWith(), to figure out whether or not you want to respond at all, you're correct in that it's not possible, since the decision as to whether or not to call event.respondWith() needs to be made synchronously.)
It's not easy to wrap IndexedDB in a promise-based interface, but https://github.com/jakearchibald/idb has already done the hard work, and it works quite well inside of a service worker. Moreover, https://github.com/jakearchibald/idb-keyval makes it even easier to do this sort of thing if you just need a single key/value pair, rather than the full IndexedDB feature set.
Here's an example, assuming you're okay with idb-keyval:
importScripts('https://cdn.jsdelivr.net/npm/idb-keyval@3/dist/idb-keyval-iife.min.js');
// Call idbKeyval.set() to save data to your datastore in the `install` handler,
// in the context of your `window`, etc.
self.addEventListener('fetch', event => {
  // Optionally, add in some *synchronous* criteria here that examines event.request
  // and only calls event.respondWith() if this `fetch` handler can respond.
  event.respondWith(async function() {
    const id = someLogicToCalculateAnId();
    const value = await idbKeyval.get(id);
    // You now can use `value` however you want.
    const response = generateResponseFromValue(value);
    return response;
  }());
});
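If you'd rather not pull in a library, the same idea can be sketched by hand by wrapping each IndexedDB request in a Promise yourself (a rough sketch, with error handling kept minimal):
function readDB(dbName, storeName, id) {
  return new Promise((resolve, reject) => {
    const openRequest = indexedDB.open(dbName, 1);
    openRequest.onerror = () => reject(openRequest.error);
    openRequest.onsuccess = () => {
      const db = openRequest.result;
      const tx = db.transaction([storeName], 'readonly');
      const getRequest = tx.objectStore(storeName).get(id);
      getRequest.onerror = () => reject(getRequest.error);
      getRequest.onsuccess = () => resolve(getRequest.result);
    };
  });
}
// Inside the async function passed to event.respondWith():
// const value = await readDB('mydb', 'mystore', id);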

Wait for future to complete

I use my Postgres database query to determine my next action, and I need to wait for the results before I can execute the next line of code. Now conn.query returns a Future, but I can't manage to get the result asynchronously when I place my code in another function.
main() {
  // get the database connection string from the settings.ini in the project root folder
  db = getdb();
  geturl().then((String url) => print(url));
}
Future geturl() {
  connect(db).then((conn) {
    conn.query("select trim(url) from crawler.crawls where content IS NULL").toList()
        .then((result) { return result[0].toString(); })
        .catchError((err) => print('Query error: $err'))
        .whenComplete(() {
          conn.close();
        });
  });
}
I just want geturl() to wait for the returned value but whatever I do; it fires immediately. Can anyone point me a of a piece of the docs that explains what I am missing here?
You're not actually returning a Future in geturl currently. You have to actually return the Futures that you use:
Future geturl() {
  return connect(db).then((conn) {
    return conn.query("select trim(url) from crawler.crawls where content IS NULL").toList()
        .then((result) { return result[0].toString(); })
        .catchError((err) => print('Query error: $err'))
        .whenComplete(() {
          conn.close();
        });
  });
}
To elaborate on John's comment, here's how you'd implement this using async/await. (The async/await feature was added in Dart 1.9)
main() async {
  try {
    var url = await getUrl();
    print(url);
  } on Exception catch (ex) {
    print('Query error: $ex');
  }
}
Future getUrl() async {
  // get the database connection string from the settings.ini in the project root folder
  db = getdb();
  var conn = await connect(db);
  try {
    var sql = "select trim(url) from crawler.crawls where content IS NULL";
    var result = await conn.query(sql).toList();
    return result[0].toString();
  } finally {
    conn.close();
  }
}
I prefer, in scenarios with multiple-chained futures (hopefully soon a thing of the past once await comes out), to use a Completer. It works like this:
Future geturl() {
  final c = new Completer(); // declare a completer.
  connect(db).then((conn) {
    conn.query("select trim(url) from crawler.crawls where content IS NULL").toList()
        .then((result) {
          c.complete(result[0].toString()); // use the completer to return the result instead
        })
        .catchError((err) => print('Query error: $err'))
        .whenComplete(() {
          conn.close();
        });
  });
  return c.future; // return the future from the completer instead
}
To answer your 'where are the docs' question: https://www.dartlang.org/docs/tutorials/futures/
You said that you were trying to get your geturl() function to 'wait for the returned value'. A function that returns a Future (as in the example in the previous answer) will execute and return immediately; it will not wait. In fact, that is precisely what Futures are for: to avoid code doing nothing or 'blocking' while waiting for data to arrive or an external process to finish.
The key thing to understand is that when the interpreter gets to a call to then() or catchError() on a Future, it does not execute the code inside; it puts it aside to be executed later when the future 'completes', and then just keeps right on executing any following code.
In other words, when using Futures in Dart you are setting up chunks of code that will be executed non-linearly.
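The same non-linear ordering shows up with promise-based code in other languages too; a tiny self-contained illustration in TypeScript rather than Dart (the URL and delay are made up for the example):
function getUrl(): Promise<string> {
  // Simulate an asynchronous query that completes after 100 ms.
  return new Promise((resolve) => setTimeout(() => resolve('http://example.com'), 100));
}

console.log('calling getUrl()');
getUrl().then((url) => console.log('then() callback runs later: ' + url));
console.log('getUrl() has already returned; execution continues here first');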

Grails, Promise API and two open sessions

I am trying to clear out a collection and update it at the same time. It has children and finding the current items in the collection and deleting them asynchronously would save me a lot of time.
Step 1. Find all the items in the collection.
Step 2. Once I know what the items are, fork a process to delete them.
def memberRedbackCriteria = MemberRedback.createCriteria()
// #1 Find all the items in the collection.
def oldList = memberRedbackCriteria.list { fetchMode("memberCategories", FetchMode.EAGER) }
// #2 Delete them.
Promise deleteOld = task {
    oldList.each { MemberRedback rbMember ->
        rbMember.memberCategories.clear()
        rbMember.delete()
    }
}
The error message is: Illegal attempt to associate a collection with two open sessions
I am guessing that because I find the items, then fork, this creates a new session so that the collection is built before forking and a new session is used to delete the items.
I need to collect the items in the current thread, otherwise I am not sure what the state would be.
Note that using one async task for all the deletions is effectively running all the delete operations in series in a single thread. Assuming your database can handle multiple connections and concurrent modification of a table, you could parallelize the deletions by using a PromiseList, as in the following (note untested code follows).
def deletePromises = new PromiseList()
redbackIds.each { Long rbId ->
    deletePromises << MemberRedback.async.task {
        withTransaction {
            def memberRedbackCriteria = createCriteria()
            MemberRedback memberRedback = memberRedbackCriteria.get {
                idEq(rbId)
                fetchMode("memberCategories", FetchMode.EAGER)
            }
            memberRedback.memberCategories.clear()
            memberRedback.delete()
        }
    }
}
deletePromises.onComplete { List results ->
    // do something with the results, if you want
}
deletePromises.onError { Throwable err ->
    // do something with the error
}
Found a solution. Put the ids into a list and collect them as part of the async closure.
Note also that you cannot reuse the criteria as per http://jira.grails.org/browse/GRAILS-1967
// #1 Find the ids
def redbackIds = MemberRedback.executeQuery(
    'select mr.id from MemberRedback mr', [])
// #2 Delete them.
Promise deleteOld = task {
    redbackIds.each { Long rbId ->
        def memberRedbackCriteria = MemberRedback.createCriteria()
        MemberRedback memberRedback = memberRedbackCriteria.get {
            idEq(rbId)
            fetchMode("memberCategories", FetchMode.EAGER)
        }
        memberRedback.memberCategories.clear()
        memberRedback.delete()
    }
}
deleteOld.onError { Throwable err ->
    println "deleteAllRedbackMembers An error occurred ${err.message}"
}
