How to subscribe to and see events from Hyperledger Composer transactions

I am running a Node.js server with the following code to connect to the Hyperledger Composer runtime:
const BusinessNetworkConnection = require("composer-client")
  .BusinessNetworkConnection;

this.businessNetworkConnection = new BusinessNetworkConnection();
this.CONNECTION_PROFILE_NAME = "hlfv1";
this.businessNetworkIdentifier = "testNetwork";

this.businessNetworkConnection
  .connect(
    this.CONNECTION_PROFILE_NAME,
    this.businessNetworkIdentifier,
    "admin",
    "adminpwd"
  )
  .then(result => {
    this.businessNetworkDefinition = result;
    console.log("BusinessNetworkConnection: ", result);
  })
  .then(() => {
    // Subscribe to events.
    this.businessNetworkConnection.on("events", events => {
      console.log("**********business event received**********", events);
    });
  })
  // and catch any exceptions that are triggered
  .catch(function(error) {
    throw error;
  });
I can see data returned in the result object after the connection has been made, and it is the correct network data that has been deployed.
However, when I submit transactions and make requests via my generated REST APIs, no events are seen by my server. In the Historian I can see that events are emitted. Is there something else I should be doing to see the events emitted by my transactions?

I tried the same kind of test and I could receive events. Comparing my test code with yours, I found the following difference:
this.bizNetworkConnection.on('events'
this.bizNetworkConnection.on('event'
The BusinessNetworkConnection emits 'event' (singular), so a listener registered for 'events' never fires. I hope it helps.
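For reference, a minimal sketch of the corrected subscription (same connection object and credentials as in the question; only the event name changes):

this.businessNetworkConnection
  .connect(this.CONNECTION_PROFILE_NAME, this.businessNetworkIdentifier, "admin", "adminpwd")
  .then(() => {
    // Composer's BusinessNetworkConnection emits 'event' (singular) for each
    // event emitted by a transaction processor function.
    this.businessNetworkConnection.on("event", event => {
      console.log("**********business event received**********", event);
    });
  })
  .catch(error => {
    console.error(error);
  });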

Related

Background sync taking too much time after getting internet connection, workbox

I am using service worker to achieve background sync functionality. Following is my code:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.6.3/workbox-sw.js')

const queue = new workbox.backgroundSync.Queue('registerQueue', {
  callbacks: {
    queueDidReplay: function (requestArray) {
      let requestSynced = 0
      requestArray.forEach(item => {
        if (!item.error) {
          requestSynced++
        }
      })
      if (requestSynced > 0) {
        new BroadcastChannel('backgroundSynBroadCastChannel').postMessage(requestSynced)
      }
    }
  }
})

const GraphQLMatch = /graphql(\S+)?/

self.addEventListener('fetch', event => {
  if (
    null !== event.request.url.match(GraphQLMatch) &&
    navigator.onLine === false
  ) {
    const promiseChain = fetch(event.request.clone()).catch(err => {
      return queue.addRequest(event.request)
    })
    event.waitUntil(promiseChain)
  }
})

self.addEventListener('message', event => {
  if (!event.data) {
    return
  }
  switch (event.data) {
    case 'skipWaiting':
      self.skipWaiting()
      break
    default:
      break
  }
})

workbox.precaching.precacheAndRoute([])

/* Alternate for navigateFallback & navigateFallbackBlacklist */
workbox.routing.registerNavigationRoute('/index.html', {
  blacklist: [/^\/__.*$/]
})
When the internet connection drops, the requests are queued in IndexedDB. The problem is that after the connection comes back, the background sync happens at least 5-10 minutes later. Is there any way to trigger the background sync immediately upon reconnection, or at least to reduce the time before syncing?
Thanks in advance.
You could manually trigger a replay of a queue as soon as your connection is back by sending an event to the service worker.
In your service worker:
self.addEventListener('message', (event) => {
if (event.data.type === 'replayQueue') {
queue.replayRequests();
}
});
In your app (using workbox-window):
import { Workbox } from 'workbox-window';

const wb = 'serviceWorker' in navigator ? new Workbox('/service-worker.js') : null;

window.addEventListener('online', () => {
  if (wb) wb.messageSW({ type: 'replayQueue' });
});
Unfortunately, it doesn't look like it's possible right now to change the timing of the sync. According to Google's Workbox documentation:
Browsers that support the BackgroundSync API will automatically replay
failed requests on your behalf at an interval managed by the browser,
likely using exponential backoff between replay attempts.
If Google's documentation is correct (at least for Chrome), that also means that the longer the user has been offline, the longer the wait for the sync event is likely to be.
@cyril-hanquez's idea works well as long as the user is still on your site when they come back online. You might also want to add a fetchDidFail callback to handle more network-outage edge cases, as sketched below. Along those lines, you may want to avoid relying on navigator.onLine, since it doesn't always do what one would expect.
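A minimal sketch of that fetchDidFail suggestion, assuming the Workbox v3 setup loaded above (the networkOnly strategy and the route registration are my assumptions, not part of the original service worker):

// Let Workbox queue any failed GraphQL POST via the fetchDidFail lifecycle
// callback, instead of pre-checking navigator.onLine in a hand-written handler.
workbox.routing.registerRoute(
  /graphql(\S+)?/,
  workbox.strategies.networkOnly({
    plugins: [
      {
        // Runs whenever the network request throws, whatever the reason.
        fetchDidFail: async ({ request }) => {
          await queue.addRequest(request)
        },
      },
    ],
  }),
  'POST'
)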

How to dispatch a Paypal IPN to a Google Cloud function?

I've read here that it's possible to send an IPN directly to a Google cloud function. I have my Google Cloud functions running on Firebase on an index.js file.
I've set up my Paypal buttons to send the IPN to a page on my webapp.
Here is an example of one of the functions I'm running off Google Cloud Functions/Firebase:
// UPDATE ROOMS INS/OUTS
exports.updateRoomIns = functions.database.ref('/doors/{MACaddress}').onWrite((change, context) => {
  const beforeData = change.before.val();
  const afterData = change.after.val();
  const roomPushKey = afterData.inRoom;
  const insbefore = beforeData.ins;
  const insafter = afterData.ins;
  if ((insbefore === null || insbefore === undefined) && (insafter === null || insafter === undefined) || insbefore === insafter) {
    return 0;
  } else {
    const updates = {};
    Object.keys(insafter).forEach(key => {
      updates['/rooms/' + roomPushKey + '/ins/' + key] = true;
    });
    return admin.database().ref().update(updates); // do the update
  }
  return 0;
});
Now the questions:
1) I want to add another function to process the IPN from PayPal as soon as I have a transaction. How would I go about this?
I'll mark the answer as correct if it solves this first question.
2) What would that Google Cloud function even look like?
I'll create another question if you can solve this one.
Note: I am using Firebase (no other databases and no PHP).
IPN is simply PayPal's server trying to reach a given endpoint of yours.
First, you have to make sure that your Firebase plan supports outbound third-party requests (this is unavailable on the free plan).
After that, you need to create an HTTPS endpoint, like so:
exports.ipn = functions.https.onRequest((req, res) => {
  // req and res are instances of the req and res of Express.js
  // You can validate the request and update your database accordingly.
});
It will be available at https://www.YOUR-FIREBASE-DOMAIN.com/ipn
Based on @Eliya Cohen's answer:
In your Firebase functions, create a function such as:
exports.ipn = functions.https.onRequest((req, res) => {
var reqBody = req.body;
console.log(reqBody);
// do something else with the req.body i.e: updating a firebase node with some of that info
res.sendStatus(200);
});
When you deploy your functions, go to your Firebase console project and check the Functions section; you should see the deployed ipn function along with its trigger URL.
Copy that URL, go to PayPal, edit the button that triggers the purchase, scroll down to Step 3, and at the bottom type:
notify_url= paste that url here
Save changes.
You can now test your button and check the req.body in the Log tab of your Firebase Cloud Functions.
Thanks to the answers here, and especially to this gist: https://gist.github.com/dsternlicht/fdef0c57f2f2561f2c6c477f81fa348e, I finally worked out a solution to verify the IPN request in a cloud function:
let CONFIRM_URL_SANDBOX = 'https://ipnpb.sandbox.paypal.com/cgi-bin/webscr';

exports.ipn = functions.https.onRequest((req, res) => {
  let body = req.body;
  logr.debug('body: ' + StringUtil.toStr(body));
  let postreq = 'cmd=_notify-validate';
  // Iterate the original request payload object
  // and prepend its keys and values to the post string
  Object.keys(body).map((key) => {
    postreq = `${postreq}&${key}=${body[key]}`;
    return key;
  });
  let request = require('request');
  let options = {
    method: 'POST',
    uri: CONFIRM_URL_SANDBOX,
    headers: {
      'Content-Length': postreq.length,
    },
    encoding: 'utf-8',
    body: postreq
  };
  res.sendStatus(200);
  return new Promise((resolve, reject) => {
    // Make a post request to PayPal
    return request(options, (error, response, resBody) => {
      if (error || response.statusCode !== 200) {
        reject(new Error(error));
        return;
      }
      let bodyResult = resBody.substring(0, 8);
      logr.debug('bodyResult: ' + bodyResult);
      // Validate the response from PayPal and resolve / reject the promise.
      if (resBody.substring(0, 8) === 'VERIFIED') {
        return resolve(true);
      } else if (resBody.substring(0, 7) === 'INVALID') {
        return reject(new Error('IPN Message is invalid.'));
      } else {
        return reject(new Error('Unexpected response body.'));
      }
    });
  });
});
Also thanks to:
https://developer.paypal.com/docs/classic/ipn/ht-ipn/#do-it
IPN listener request-response flow: https://developer.paypal.com/docs/classic/ipn/integration-guide/IPNImplementation/
To receive IPN message data from PayPal, your listener must follow this request-response flow:
1) Your listener listens for the HTTPS POST IPN messages that PayPal sends with each event.
2) After receiving the IPN message from PayPal, your listener returns an empty HTTP 200 response to PayPal. Otherwise, PayPal resends the IPN message.
3) Your listener sends the complete message back to PayPal using HTTPS POST. Prefix the returned message with the cmd=_notify-validate variable, but do not change the message fields, the order of the fields, or the character encoding from the original message.
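One caveat when adapting the gist above to Cloud Functions (my note, not from the gist or PayPal's docs): work scheduled after res.sendStatus(200) is not guaranteed to finish, because the platform may terminate the instance once the response is sent. A minimal variation that only replies after the verification round trip settles, reusing the names from the code above:

exports.ipn = functions.https.onRequest((req, res) => {
  // Rebuild the notify-validate payload exactly as in the answer above.
  const postreq = Object.keys(req.body).reduce(
    (acc, key) => `${acc}&${key}=${req.body[key]}`,
    'cmd=_notify-validate'
  );
  const request = require('request');
  request({ method: 'POST', uri: CONFIRM_URL_SANDBOX, body: postreq }, (error, response, resBody) => {
    if (!error && resBody && resBody.substring(0, 8) === 'VERIFIED') {
      // TODO: record the verified IPN in the database here.
    }
    // Always answer 200 so PayPal stops retrying the IPN.
    res.sendStatus(200);
  });
});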
Extremely late to the party, but for anyone still looking for this, PayPal has published a sample in the JavaScript folder of their IPN code samples GitHub repo.
You can find this at:
https://github.com/paypal/ipn-code-samples/blob/master/javascript/googlecloudfunctions.js

Atomic update of Realtime Database from Google Cloud Functions

I use Google Cloud Functions to create an API endpoint for my users to interact with the Realtime Database.
The problem is that I'm not sure how this code behaves under concurrency. I have a helper function doSomething that needs to be called only once, but I suspect there are cases where it could be called twice or more (when multiple users call the API at the same time and the update operation hasn't yet been processed by the DB). Is that possible? Does it mean I need to use a transaction method? Thank you!
DB structure
{
  somePath: {
    someSubPath: null
  }
}
Google Cloud Functions code
const functions = require('firebase-functions')
const admin = require('firebase-admin')
const cors = require('cors')({origin: true});

admin.initializeApp(functions.config().firebase)

// API ENDPOINT
exports.test = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    admin.database().ref('/somePath/someSubPath').once('value')
      .then(snapshot => {
        const value = snapshot.val()
        if (value) return res.status(400).send({ message: 'doSomethingAlreadyCalled' })
        doSomething()
        const updates = { '/somePath/someSubPath': true }
        return admin.database().ref().update(updates)
          .then(() => res.status(200).send({ message: 'OK' }))
      })
      .catch(error => res.status(400).send({ message: 'updateError' }))
  })
})

// HELPERS
const doSomething = () => {
  // needs to be called only once
}
I believe you were downvoted because the pseudocode above doesn't quite make complete sense and there is no log or output in your question showing what the code actually does. Not having the complete picture makes it hard for us to help you.
Just going from the structure in your question, doSomething could indeed be called more than once, since nothing stops two concurrent requests from both reading the flag before either of them writes it. Whenever I run into this kind of issue, I go back to the API documentation and restructure my code after rereading it.
HTH
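To address the transaction question directly, here is a minimal sketch of how it could look (my assumption, reusing the paths, cors wrapper and helpers from the question): a transaction makes the check-and-set atomic, so doSomething runs for at most one of the concurrent requests.

// API ENDPOINT, transactional variant (sketch)
exports.test = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    admin.database().ref('/somePath/someSubPath')
      .transaction(current => {
        if (current) return   // abort: another request already claimed the flag
        return true           // claim the flag atomically
      })
      .then(result => {
        if (!result.committed) {
          return res.status(400).send({ message: 'doSomethingAlreadyCalled' })
        }
        doSomething()
        return res.status(200).send({ message: 'OK' })
      })
      .catch(() => res.status(400).send({ message: 'updateError' }))
  })
})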

Service Worker caching not recognizing timeout as a function

I was watching Steve Sanderson's NDC presentation on up-and-coming web features, and saw his caching example as a prime candidate for an application I am developing. I couldn't find the code, so I have typed it up off the Youtube video as well as I could.
Unfortunately it doesn't work in Chrome (which is also what he is using in the demo) It fails with Uncaught TypeError: fetch(...).then(...).timeout is not a function
at self.addEventListener.event.
I trawled through Steve's Github, and found no trace of this, nor could I find anything on the NDC Conference page
//inspiration:
// https://www.youtube.com/watch?v=MiLAE6HMr10
//self.importScripts('scripts/util.js');

console.log('Service Worker script running');

self.addEventListener('install', event => {
  console.log('WORKER: installing');
  const urlsToCache = ['/ServiceWorkerExperiment/', '/ServiceWorkerExperiment/scripts/page.js'];
  caches.delete('mycache');
  event.waitUntil(
    caches.open('mycache')
      .then(cache => cache.addAll(urlsToCache))
      .then(_ => self.skipWaiting())
  );
});

self.addEventListener('fetch', event => {
  console.log(`WORKER: Intercepted request for ${event.request.url}`);
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then(networkResponse => {
        console.log(`WORKER: Updating cached data for ${event.request.url}`);
        var responseClone = networkResponse.clone();
        caches.open('mycache').then(cache => cache.put(event.request, responseClone));
        return networkResponse;
      })
      //if network fails or is too slow, return cached data
      //reference for this code: https://youtu.be/MiLAE6HMr10?t=1003
      .timeout(200)
      .catch(_ => {
        console.log(`WORKER: Serving ${event.request.url} from CACHE`);
        return caches.match(event.request);
      })
  );
});
As far as I can tell from the fetch() documentation, there is no timeout function, so my assumption is that the timeout function is added in the util.js that is never shown in the presentation. Can anyone confirm this? And does anyone have an idea of how it is implemented?
Future:
It's coming.
According to Jake Archibald's comment on whatwg/fetch the future syntax will be:
Using the abort syntax, you'll be able to do:
const controller = new AbortController();
const signal = controller.signal;
const fetchPromise = fetch(url, {signal});
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000);
const response = await fetchPromise;
// …
If you only wanted to timeout the request, not the response, add:
clearTimeout(timeoutId);
// …
And from another comment:
Edge & Firefox are already implementing. Chrome will start shortly.
Now:
If you want to try the solution that works now, the most sensible way is to use this module.
It allows you to use syntax like:
return fetch('/path', {timeout: 500}).then(function() {
// successful fetch
}).catch(function(error) {
// network request failed / timeout
})
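As for how such a timeout can be implemented today without util.js (a sketch of my own, not Sanderson's code): race the fetch against a timer. Note that this only rejects the promise after the delay; it does not abort the underlying request.

// Hypothetical helper: resolve with the network response, or reject after `ms`.
function fetchWithTimeout(request, ms) {
  const timer = new Promise((_, reject) =>
    setTimeout(() => reject(new Error('timeout')), ms)
  );
  return Promise.race([fetch(request), timer]);
}

// Usage in the fetch handler above, in place of the .timeout(200) call:
// event.respondWith(
//   fetchWithTimeout(event.request, 200)
//     .then(networkResponse => { /* update the cache as before */ return networkResponse; })
//     .catch(() => caches.match(event.request))
// );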

Pass custom data to service worker sync?

I need to make a POST request and send some data. I'm using service worker sync to handle the offline situation.
But is there a way to pass the POST data to the service worker, so it makes the same request again?
Because apparently the current solution is to store requests in some client-side storage and, after the client gets a connection, read the request info from the storage and then send the requests.
Any more elegant way?
PS: I thought about just making the service worker send a message to the application code so it does the request again... but unfortunately it doesn't know the exact client that registered the service worker :(
You can use fetch-sync.
Or do what I do and use postMessage to solve this problem (I agree that IndexedDB seems like trouble).
First of all, I send the message from the page:
function sync (url, options) {
navigator.serviceWorker.controller.postMessage({type: 'sync', url, options})
}
I receive this message in the service worker and store it:
const syncStore = {}

self.addEventListener('message', event => {
  if (event.data.type === 'sync') {
    // get a unique id to save the data
    const id = uuid()
    syncStore[id] = event.data
    // register a sync and pass the id as tag for it to get the data
    self.registration.sync.register(id)
  }
  console.log(event.data)
})
In the sync event, I look up the data and fetch:
self.addEventListener('sync', event => {
// get the data by tag
const {url, options} = syncStore[event.tag]
event.waitUntil(fetch(url, options))
})
It works well in my tests. What's more, you can delete the stored entry after the fetch.
You may also want to send the result back to the page; I do this the same way, with postMessage.
Since the two sides now have to communicate with each other, I change the sync function like this:
// use MessageChannel to communicate
sendMessageToSw (msg) {
  return new Promise((resolve, reject) => {
    // Create a Message Channel
    const msg_chan = new MessageChannel()
    // Handler for receiving the message reply from the service worker
    msg_chan.port1.onmessage = event => {
      if (event.data.error) {
        reject(event.data.error)
      } else {
        resolve(event.data)
      }
    }
    navigator.serviceWorker.controller.postMessage(msg, [msg_chan.port2])
  })
}
// send message to serviceWorker
// note the extra `parse` argument: it tells the service worker
// how to parse the response data
function sync (url, options, parse) {
  return sendMessageToSw({type: 'sync', url, options, parse})
}
I also have to change the message event handler, so that I can pass the port on to the sync event:
self.addEventListener('message', event => {
  if (isObject(event.data)) {
    if (event.data.type === 'sync') {
      // in this way, you can decide your tag
      const id = event.data.id || uuid()
      // pass the port into the in-memory store
      syncStore[id] = Object.assign({port: event.ports[0]}, event.data)
      self.registration.sync.register(id)
    }
  }
})
Now we can handle the sync event:
self.addEventListener('sync', event => {
  const {url, options, port, parse} = syncStore[event.tag] || {}
  // delete the stored entry
  delete syncStore[event.tag]
  event.waitUntil(fetch(url, options)
    .then(response => {
      // clone the response, because a body can only be parsed once
      const copy = response.clone()
      if (response.ok) {
        // parse it as you like (text, json, blob, ...)
        copy[parse]()
          .then(data => {
            // on success, post the parsed data back to the page
            port.postMessage(data)
          })
      } else {
        port.postMessage({error: response.status})
      }
    })
    .catch(error => {
      port.postMessage({error: error.message})
    })
  )
})
Finally, note that you cannot post the Response object back directly with postMessage, because a Response is not cloneable; you need to parse it first (as text, json, blob, etc.). I think that's enough.
As you mentioned, you may also want to open a window. I'd advise using the service worker to show a notification:
self.addEventListener('push', function (event) {
  const title = 'i am a test'
  const options = {
    body: 'Yay it works.',
  }
  event.waitUntil(self.registration.showNotification(title, options))
})

self.addEventListener('notificationclick', function (event) {
  event.notification.close()
  event.waitUntil(
    clients.openWindow('https://yoursite.com')
  )
})
When the client clicks the notification, we can open the window.
To communicate with the service worker I use a trick: in the fetch event listener I put this:
self.addEventListener('fetch', event => {
  if (event.request.url.includes("sw_messages.js")) {
    var zib = "some data";
    event.respondWith(new Response("window.msg=" + JSON.stringify(zib) + ";", {
      headers: {
        'Content-Type': 'application/javascript'
      }
    }));
  }
  return;
});
Then, in the main HTML I just add:
<script src="sw_messages.js"></script>
As the page loads, the global variable msg will contain (in this example) "some data".
