PostHog: does the "Filter Out Plugin" drop events before or after they are added to the DB?

For security reasons, I don't want some of my events going into PostHog, so I thought of using the "Filter Out Plugin".
The question is: how does the plugin work? I couldn't find specific info about it.
Is it like a middleware that drops the events before they are added to the DB, or after?
Thanks

The Filter Out app runs in the PostHog app/plugin server.
That means any events filtered by it are filtered before they are stored in the database, but those events will have passed through API code and a Kafka topic.
The plugin server is similar to a functions-as-a-service platform. It reacts to each event ingested, and plugins let you edit or reject each event before it is stored. You can read more about the plugin server here: https://posthog.com/docs/runbook/services/plugin-server
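For illustration, a filtering plugin is essentially a processEvent function: returning the event keeps it, and returning null drops it before storage. A minimal sketch, with made-up event names:

export function processEvent(event: Record<string, any>) {
    // Hypothetical list of event names to block for security reasons
    const blocked = ['sensitive_event', 'internal_debug']
    if (blocked.includes(event.event)) {
        return null // rejected: never written to the database
    }
    return event // kept: continues through ingestion and is stored
}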
If you need even more security, you can avoid sending the events at all; you can read more about how to do that in the PostHog documentation.
Since that tutorial was written, PostHog has added allowlists to the JS SDK to help further reduce automatic capture: https://github.com/PostHog/posthog-js/pull/481
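For example, autocapture can be narrowed at init time. A sketch, assuming the option names from the current posthog-js configuration docs (they may differ from the exact shape that PR introduced):

import posthog from 'posthog-js'

posthog.init('<your-project-api-key>', {
    autocapture: {
        url_allowlist: ['https://example.com/app/.*'], // only capture on matching URLs
        dom_event_allowlist: ['click'],                // only capture click events
        element_allowlist: ['button', 'a'],            // only capture these element types
    },
})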
(Full disclosure - I work at PostHog)

Related

Embedded Orbeon: blank test page

I have an embedded (Java/JSP) Form Builder (version 2019.2.0.201912301747) and I would like to use the test button, but I get an empty iframe in the popup window; indeed, the src of the embedded fb-test-iframe iframe is about:blank.
However, when I drop the orbeon.war into an empty Tomcat and run Form Builder there, the test page works correctly. (I should add that I have implemented a custom persistence API for my Orbeon instance; maybe I should have specified some related URL somewhere, as I have only specified oxf.fr.persistence.${my persistence id}.uri.)
Unfortunately, there is no network activity in the browser debugger (maybe because of the iframe) and no browser error; it just does not work :(
The documentation does not really explain how the test button works.
What should I configure in Orbeon, and where, to make it raise an event that loads something into that iframe (ideally a Form Runner instance that loads the currently edited form)?
Update #1:
After switching to orbeon-2018.2.4.201911212304-PE.zip (but keeping the 2019 libs where it is embedded), I was able to get it to communicate, so I am a bit further. Now, as visible in the picture, the embedded Orbeon sends data to its backend, and the request arrived at the Orbeon backend according to its logs.
orbeon.war log
But there is no answer to the last request; I think this is the key, but I don't understand why. It is as if there were a further call (maybe towards the Orbeon CRUD API backend) with no connect/read timeout on that connection.
Yet there is no further communication towards the CRUD API; there are no new entries in the CRUD API backend log.
My properties-local-prod.xml
Thanks in advance.
OP mentioned in a comment that this doesn't happen with an out-of-the-box install of Orbeon Forms, and thus suspects that the issue is due to some change they inadvertently made to Orbeon Forms.

Send Docker Entrypoint logs to APP in realtime

I'm looking for ideas on how to send the Docker logs of each run to my application in realtime.
I want to build a feature similar to Netlify or Vercel, where all build logs are shown on the UI in realtime. I want something similar for my Node application. Let me know if you have done this already or know how this can be achieved.
You can achieve this with Vercel and Log Drains.
Log Drains make it easy to collect logs from your deployments and forward them to archival, search, and alerting services by sending them via HTTPS, HTTP, TLS, and TCP once a new log line is created.
At the time of writing, we support 3 types of Log Drains:
JSON
NDJSON
Syslog
Along with Log Drains, we are introducing two new open-source integrations with logging services for you to start using them today: LogDNA and Datadog.
Install the integration: https://vercel.com/integrations?category=logging
See the announcement blog post: https://vercel.com/blog/log-drains
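As an illustration, an NDJSON drain is just an HTTPS endpoint receiving one JSON object per line. A minimal receiver sketch; the field names here are assumptions, as the exact payload shape is defined by Vercel:

import http from 'node:http'

http.createServer((req, res) => {
    let body = ''
    req.on('data', (chunk) => (body += chunk))
    req.on('end', () => {
        for (const line of body.split('\n')) {
            if (!line.trim()) continue
            const entry = JSON.parse(line)      // each line is a standalone JSON log entry
            console.log(entry.message ?? entry) // forward to your own storage/UI here
        }
        res.writeHead(200).end('ok')
    })
}).listen(8080)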
Note that Vercel does not allow Docker deployments, but does support Serverless Functions.
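If you instead want to stream logs from your own Docker containers, one simple approach, assuming your Node app runs somewhere with access to the Docker CLI, is to follow the container's logs and relay each line to the browser, for example over Server-Sent Events. A rough sketch with a placeholder container name:

import { spawn } from 'node:child_process'
import http from 'node:http'

const CONTAINER = 'my-build-container' // placeholder

http.createServer((req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        Connection: 'keep-alive',
    })
    // `docker logs --follow` keeps emitting lines as the entrypoint writes them
    const logs = spawn('docker', ['logs', '--follow', CONTAINER])
    const forward = (chunk: Buffer) => {
        for (const line of chunk.toString().split('\n')) {
            if (line) res.write(`data: ${line}\n\n`) // one SSE event per log line
        }
    }
    logs.stdout.on('data', forward)
    logs.stderr.on('data', forward)
    req.on('close', () => logs.kill()) // stop following when the client disconnects
}).listen(3000)

On the UI side, a plain EventSource subscription can append each line as it arrives; for a more robust setup you could talk to the Docker Engine API directly (for example via the dockerode package) instead of shelling out.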

Can I use PubNub to update and delete messages without the "Storage & Playback" feature?

I am trying to build a realtime CRUD app. I have a Rails app integrated with AngularJS. It uses a PostgreSQL database, and Angular is connected to the backend via a JSON API. I'm running PubNub commands from Angular.
So far, I have PubNub subscribe and publish working, and the published data gets saved in the PostgreSQL backend. In other words, I have the "create" and "read" parts done, and I still have "update" and "delete" to implement.
I've been searching on Google and PubNub, but the only examples I found were either for a project with PubNub + Backbone.js, or they required discarding my PostgreSQL backend + JSON API in favor of PubNub's Storage & Playback feature.
Is there any way I can implement update/edit and delete on my current setup?
PubNub doesn't currently support update and delete. History keeps a full record of all publishes, and they get automatically deleted based on the storage policy (keep for 1 day, 3 days, 30 days, or forever).
PubNub stores all messages per channel with the Storage & Playback feature add-on. Since it stores all messages in a time series and doesn't expose updating and deleting, you need a simple pattern to implement this functionality. You can find details of this implementation here.
So basically, you will need Storage & Playback, and you can use this Message Updates and Deletes pattern to implement update and delete. PubNub doesn't support these functions directly, but you can use the pattern to "mark" message IDs as updated or deleted.
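A sketch of that pattern with the current PubNub JavaScript SDK; the id/action message shape is an application-level convention (not a PubNub API), and the keys are placeholders:

import PubNub from 'pubnub'

const pubnub = new PubNub({
    publishKey: 'pub-key',   // placeholder
    subscribeKey: 'sub-key', // placeholder
    userId: 'example-user',
})

// Publish an update that references the original message's id
async function updateMessage(id: string, text: string) {
    await pubnub.publish({ channel: 'wall', message: { id, action: 'update', text } })
}

// Publish a "tombstone": readers treat any id with a delete action as removed
async function deleteMessage(id: string) {
    await pubnub.publish({ channel: 'wall', message: { id, action: 'delete' } })
}

Subscribers (and any history replay) then reduce the stream per id, applying the latest action, so the UI reflects edits and deletes even though the stored messages are immutable.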

SignalR not working properly after deploying on Azure

I have gone through the questions on here but have not found a solution. I am using a SignalR persistent connection to broadcast to particular clients to update the page (some grids). It works well on localhost, and the relevant parts get updated.
When deployed to Azure it works fine initially, but if the user is idle for some time, or logs in again after a few hours, the page no longer gets updated through SignalR.
Even if the user waits a long time, there are no updates on the page.
It looks like it's not broadcasting to the particular client.
Is it related to a Service Bus issue (as mentioned in one article on the internet)?
Or am I doing something wrong? I need some suggestions or help.
Specifically for Azure, there's a NuGet package and instructions that help you set up scaling SignalR using the Azure Service Bus to keep things in sync. Here's a link to the detailed instructions specifically for Azure, though dfowler's link works too if you want to scale out using another method.
http://www.asp.net/signalr/overview/performance-and-scaling/scaleout-with-windows-azure-service-bus
Are you using multiple roles? If you are, you need to use scaleout. You can learn more about scaling out here: http://www.asp.net/signalr/overview/performance/scaleout-in-signalr

Can I use the github-services hook to post my feeds to other services?

GitHub has developed the github-services hook to push commits to other services like Bugzilla, Campfire, Basecamp...
Can one use the same github-services hook to push my application's data to other services? If yes, how may I integrate github-services into my Rails application?
Any help? Any suggestions?
Update: Can I integrate the github-services hook source code as a Sinatra application inside my Rails application? How may I call the hooks of other services (Bugzilla, Campfire, Basecamp, Twitter) from my application's triggers?
For example, when one user posts something on another user's wall, then a message should be sent to the other services like Bugzilla, Campfire, Basecamp, Twitter...
The Post-Receive URL is the simplest hook to perform such notifications. It triggers a POST to a pre-configured URL whenever a push is performed on the repository.
You could start with this GitHub help page on testing web hooks to understand the format of what is being POSTed and how the service reacts. This is done thanks to a very useful service: PostBin.
This help page gives a simple example of what one would have to implement on a Sinatra server to parse the POSTed JSON:
require 'sinatra'
require 'json'

# GitHub sends the hook payload as a form parameter named "payload"
post '/' do
  push = JSON.parse(params[:payload])
  "I got some JSON: #{push.inspect}"
end
This gist goes a little further and shows some really basic JSON data extraction.
If you want to go further, you can configure, through the GitHub API, some additional hooks to listen to more events (new issue, new fork, download, ...).
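For instance, a repository webhook can be created with a single authenticated POST to the GitHub REST API; the token, owner, and repo below are placeholders:

// Sketch: create a repository webhook via the GitHub REST API
const res = await fetch('https://api.github.com/repos/OWNER/REPO/hooks', {
    method: 'POST',
    headers: {
        Authorization: 'Bearer <token>',
        Accept: 'application/vnd.github+json',
    },
    body: JSON.stringify({
        name: 'web', // "web" is the generic webhook service
        active: true,
        events: ['push', 'issues'], // listen to pushes and new issues
        config: { url: 'https://example.com/hook', content_type: 'json' },
    }),
})
console.log(res.status) // 201 when the hook is created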
I think you are looking for an easy way to post your app's data to many other web services.
github-services is designed to take git commit information and push it to other services that accept that commit information... so if your app's data looks enough like GitHub's payload, then those other services that work with github-services will work with your app.
But I suspect your app is not like GitHub and your data is different from a git commit. In that case, you could use the code in 'services/' as examples of how to implement event handlers in your app. This one for Campfire uses the Tinder gem, for example: https://github.com/github/github-services/blob/master/services/campfire.rb
Then your WallPostsController#create could call a method that posts data, in the format you choose, to the various services. If you're going to post to many services, you may want to do it in an asynchronous job (DelayedJob, Resque, etc.), because calls to many external services will take quite a while.
