Using mixpanel to build custom analytics dashboard for users - ruby-on-rails

I love graphs.
I'd love to get my hands on some data and make it look pretty. But alas, I'm a little lost on what would be considered best practice.
I've selected mixpanel (only as an example) as it seems wonderfully easy to track custom events, and doesn't have any subdomain limitation like Google Analytics.
Say I had 100-1000+ users who have an account (which is publicly facing), and I'm currently tracking the public interactions their pages get. With mixpanel, I can see the data which is lovely, and I've segmented it to individual accounts. So far, so good!
But then, I want to show my users this information. And here my head begins to hurt. Do I schedule a cron job, pulling in the data from mixpanel and writing it to their respective accounts? Or is there a better way? I've looked into mixpanel's api (I'm using Ruby), but they keep telling me I should use the javascript api. But in using JS, how does one prevent others from getting the data (i.e. what's stopping someone faking mixpanel api posts in the console, or viewing my private key?).
What would you consider a practical solution in such a case?

You can achieve this by storing the user-specific events of each user with a $bucket property attached, which has a value unique to each user, as explained in the Mixpanel docs. If you still want to use Ruby to serve the events, have a look at Mixpanel's recommended Ruby client libraries.
mixpanel_client looks like the better-maintained option of the two mentioned. If you go with it, you can serve user-specific events as shown in the example below (which is also in the gem's readme):
require 'mixpanel_client'

# Configure the client with your Mixpanel API credentials
# (setup is also shown in the gem's readme)
client = Mixpanel::Client.new(
  'api_key'    => 'changeme',
  'api_secret' => 'changeme'
)

data = client.request do
  # Available options
  resource 'events/properties'
  event    '["test-event"]'
  name     'hello'
  values   '["uno", "dos"]'
  timezone '-8'
  type     'general'
  unit     'hour'
  interval 24
  limit    5
  bucket   'contents'
  from_date '2011-08-11'
  to_date   '2011-08-12'
  on        'properties["product_id"]'
  where     '1 in properties["product_id"]'
  buckets   '5'
end

You could try a service like Keen IO that will allow you to generate encrypted scoped write and read API keys. Keen IO is built for customizable and programmatic analytics features such as exposing analytics to your customers, whereas Mixpanel is more for exploring your data in their UI. The idea with an encrypted scoped key is that they will never be able to access your account, only the data you want them to see. You could easily tag your events with a customer ID and then use the scoped keys to ensure that you only ever show customers their own data.
https://keen.io/docs/security/#scoped-key
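For example, a sketch based on the keen gem's documented ScopedKey helper: it mints a read-only key locked to one customer's events ("customer_id" here is a hypothetical property you'd tag your events with):

require 'keen'

# A key that can only read events where customer_id == "123"
scoped_key = Keen::ScopedKey.new(Keen.master_key, {
  'filters' => [{
    'property_name'  => 'customer_id',
    'operator'       => 'eq',
    'property_value' => '123'
  }],
  'allowed_operations' => ['read']
}).encrypt!

You'd hand that key to the client-side code for customer 123; it can query but never see anyone else's events.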
Also, Keen IO has an "importer" which allows you to import your existing Mixpanel events into your Keen IO database.

Related

Ruby on Rails. Using Google Client API to parse emails

I am new to Ruby and have a question about how to do a specific task in Rails.
I have a list of inventory and each item has a specific Stock ID that is emailed to my personal Gmail account. I want my web application to listen for emails from a specific email account. When my Gmail receives an email from that account, I want my application to parse it for a couple of fields and insert the stock ID into my database.
For example:
Let's say my database has an item with style code: A5U31 and size:10.
The email will say something like "item with style code: A5U31 and size:10 has Stock ID:329193020".
I want my Rails application to search the database for an entry with that specific style code and size, and when it finds one, simply insert the stock ID into that row.
I am trying to use the Google-API-Client gem to do this, but I am struggling because I am still a beginner. So far I have followed this quick-start guide to authenticate my Gmail account with my Rails app:
https://developers.google.com/gmail/api/quickstart/ruby?authuser=2
If someone could help me figure out how to write this sort of code and where to put it in my app (models, controllers, views), that would be very appreciated. Thanks!
I know it's been several months since you posted this, so you probably already have it worked out, but in case you still need a solution (or if someone else wants to do something similar), I'll share my thoughts:
At a high level, it sounds like your plan is:
1. Identify when a new email has come in (either by polling or by using a push notification).
2. Grab the new email's content.
3. Parse the email's content in order to extract relevant data.
4. Use the data to query and update a database.
Based on the documentation for the Gmail API, it does look like you should be able to set up push notifications, so you won't have to poll the endpoint to get the information you need.
However, my big takeaway from this list is that none of the items on it really require Rails, since you're not exposing an external web API for requests. I suppose that you could leverage ActiveRecord to create an item model and use that to manage the database; however, since it seems like you'd only need to make some basic SQL queries (and the same ones each time), I'm not sure that bringing in ActiveRecord adds much value.
If I were trying to solve this problem myself, I would probably create a simple Ruby program that (a) uses the gem you mentioned to handle push notifications from the Gmail API, and (b) uses another gem to connect to whatever kind of database you're using (e.g. pg for Postgres) and make the necessary queries.
(All of this assumes, of course, that you aren't specifically using Rails for some other reason, e.g. adding this feature to an existing Rails application).
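For illustration, here is a minimal polling sketch along those lines (push notifications would additionally require a Cloud Pub/Sub topic, so polling keeps the example short). The sender address, the items table and its columns are all hypothetical, and the body extraction assumes a simple, non-multipart email:

require 'google/apis/gmail_v1'
require 'pg'

# `authorize` comes from Google's Ruby quickstart linked above
gmail = Google::Apis::GmailV1::GmailService.new
gmail.authorization = authorize

db = PG.connect(dbname: 'inventory') # hypothetical database

# Poll for unread messages from the (hypothetical) sender address
result = gmail.list_user_messages('me', q: 'from:supplier@example.com is:unread')

(result.messages || []).each do |msg|
  full = gmail.get_user_message('me', msg.id)
  body = full.payload.body&.data.to_s # assumes a non-multipart message

  # Expected format: "item with style code: A5U31 and size:10 has Stock ID:329193020"
  next unless body =~ /style code:\s*(\S+) and size:\s*(\S+) has Stock ID:\s*(\d+)/

  db.exec_params(
    'UPDATE items SET stock_id = $1 WHERE style_code = $2 AND size = $3',
    [Regexp.last_match(3), Regexp.last_match(1), Regexp.last_match(2)]
  )
end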

Static : Creating a solution for document rating (Redis/Rails/NodeJS ?)

I'm building a static website with 10,000+ pages generated from a JSON file with Middleman.
Each page is a document (PDF) with its own id, summary and a download link.
I need to give anonymous users the ability to rate a document and show global rating for each document.
Since the website is static, I've been looking for a solution like Disqus that handles page ratings with the ability to get counts via an API. I didn't find any.
Let's say I'll have to create a separate server that handles rating. Which technology should I use? I'm thinking about Redis, but the big problem is: what if a visitor creates a script that rates a document up/down a million times? How can I make sure there will be no flooding?
I know about captcha, but will it be effective in this scenario?
Are there other solutions?
So here are my questions:
Is Redis the right choice for this?
NodeJS/Rails? I'm pretty comfortable with Rails, but is NodeJS faster?
Is captcha enough to ensure there will be no (or minimal) flooding? Any other solutions?
Thank you.
Redis is an awesome solution for this. You can use Redis to store each rating given, plus another key to store the calculated rating for an easy fetch.
If you are familiar with JavaScript, NodeJS is a fun choice. If you are more comfortable with Ruby, Sinatra would do the trick just fine.
As for the captcha solution: keeping anonymous users from flooding your rating queue is a PITA. You should probably allow only registered/logged-in users to rate, and only after a successful download request. In that case, a captcha would be just fine.
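A minimal Sinatra + Redis sketch of that "store every rating plus a precomputed total" idea (route paths and key names are arbitrary choices):

require 'sinatra'
require 'redis'

redis = Redis.new

# Record a +1/-1 rating for a document: keep the full history in a list
# and maintain a running total so reads are a single GET.
post '/documents/:id/rate' do
  value = params[:value].to_i
  halt 400, 'rating must be 1 or -1' unless [-1, 1].include?(value)

  redis.rpush("ratings:#{params[:id]}", value)
  redis.incrby("rating_total:#{params[:id]}", value)
  redis.get("rating_total:#{params[:id]}")
end

# Fetch the aggregate rating for a document
get '/documents/:id/rating' do
  redis.get("rating_total:#{params[:id]}") || '0'
end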

Simperium multiple users accessing data

In the Simperium documentation/help section there is the following text:
All the data that is created seems like it must be tied to a user - is that correct? Is it possible to have data that isn't tied to a user - say a database of locations or beers?
Yes, though this isn't very clear yet. You can create a public user (i.e., a public namespace) with an access token you share with other users of your app so anyone can read/write to that namespace. It's possible to limit this to read-only access as well if you need to authoritatively publish data from a backend service.
Is there an actual example with this?
The scenario I have is as follows
My app will have a calendar
The primary user can add and remove data from the calendar
They will want to invite other users to add and remove data; my thought is that they can give them a token, and the invited user can use their email address and this token to sign in.
Am I on the right track?
You're definitely on the right track, but a little too far ahead on that track. The scenario you described is a great fit for Simperium, but sharing and collaboration features aren't yet released.
The help text you quoted is for authoritatively pushing content, for example from a custom backend to all users of your app. An example of this would be a news stream that updates on all clients as new content is added.
This is quite different from sharing calendar data among a group of users who have different access permissions, which is actually a better use of Simperium's strengths. We have a solution for this that we've tested internally, but we're using what we've learned to build a better version that will be more scalable for production use.
There's no timeline for this yet, but you'll see it announced on your dashboard at simperium.com.

Integrating twitter, facebook and other services in one single site

I need to develop an application which should help me get all the statuses/messages from different services like Twitter, Facebook, etc. into my application, and also, when I post a message, it should get posted to all the services. I am using authlogic for authentication. Can anyone suggest what gems/plug-ins I can use?
I need API help to get all the tweets/messages displayed in my application, and also ways to post messages to the corresponding services from my application. Can anyone help me from a design point of view?
Walk through what you'd want to do in your head. Imagine the working site, imagine your webapp working before you start. So your user logs in (handled by authlogic) and sees a textbox called "What are you doing right now?". The user fills in a status message and clicks "post". The status message appears at the top of their previously posted messages.
Start with the easy part. Create a class that posts to two services. Use the twitter gem and rfacebook to post to two already defined services. In the future, you'll want to let the user associate services to their account and you would iterate through the associated services and post the message to each. Once you have this working, you can refactor or polish the UI a bit to round out this feature. I personally would do the "add a social media account to my profile" feature towards the end.
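A minimal sketch of that "one class posts to several services" idea. It uses the modern twitter gem (whose REST client responds to update); rfacebook is long unmaintained, so any object responding to update can be slotted into the list:

require 'twitter'

# One class fans a status update out to every configured service.
class StatusBroadcaster
  def initialize(services)
    @services = services
  end

  def post(message)
    @services.each { |service| service.update(message) }
  end
end

twitter = Twitter::REST::Client.new do |config|
  config.consumer_key        = ENV['TWITTER_CONSUMER_KEY']
  config.consumer_secret     = ENV['TWITTER_CONSUMER_SECRET']
  config.access_token        = ENV['TWITTER_ACCESS_TOKEN']
  config.access_token_secret = ENV['TWITTER_ACCESS_TOKEN_SECRET']
end

StatusBroadcaster.new([twitter]).post('Hello from my app!')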
Harder is the reading of the data (strangely enough) because you're going to have to figure out how to store it. You could store nothing but I suspect you'd run into API limits just searching all the time (could design around this). I would keep a little cache of posts associated to the user's social media account. In this way, the data model would look like this:
A user has many social media accounts.
A social media account has many posts. (cache)
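In ActiveRecord terms, a hedged sketch of that model (all class and table names are illustrative):

class User < ActiveRecord::Base
  has_many :social_media_accounts
  has_many :posts, through: :social_media_accounts
end

class SocialMediaAccount < ActiveRecord::Base
  belongs_to :user
  has_many :posts # the locally cached copies
end

class Post < ActiveRecord::Base
  belongs_to :social_media_account
end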
Of course, now you need to schedule the caching of the posts. This could be done manually, based on an event (like when they login) or time based. So when the update happens, you load up the posts for that social media account and the user will see the posts the next time they hit the page. For real-time push to the client's browser while they stare at the screen, use faye (non-trivial) and ajax to pull the new posts to the top of the social media stream view.
The time-based one is tricky because you'd either have to have a cron job run or have Rails handle it all with a gem like clockwork (but then you have to leave a process running). I've also solved this by having a class in /lib do all the work, with a simple web call kicking off the update, but that wasn't in a multi-user use case, so it might not work here. In any case, you'll want some nice reusable code for these problems since update requests can come from many different sources.
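If you go the clockwork route, the clock process is a small standalone file. A hedged sketch (the RefreshPosts job is hypothetical, and the 15-minute interval is arbitrary):

require 'clockwork'
require_relative 'config/boot'
require_relative 'config/environment'

module Clockwork
  # Re-cache every account's posts on a fixed schedule
  every(15.minutes, 'social.posts.refresh') do
    SocialMediaAccount.find_each do |account|
      RefreshPosts.call(account) # hypothetical update job
    end
  end
end

You'd run it alongside the app with `clockwork clock.rb`.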
You'll also have to deal with the API limits. When pulling down content from twitter, you won't get everything. That will just have to be known by the user or you'll have to indicate a "break in time" somehow.
The UI should be pretty easy (functionally anyway), because you know which source the post/content is coming from. It'd be easy to throw a little icon next to the post to display which social media site it's coming from.
Anyway, good luck, sounds like a fun project.

Rails - Tracking Referrals to Conversions

We just launched and are looking to better understand where the users who are converting to registered users are actually coming from. We can see our traffic sources and referrals via Google Analytics and our other web statistics programs, but in volume, it's difficult to tie these specifically to which users in our database have converted and from where.
We have several "goals" in Google Analytics setup to better help track conversions, but what are others doing to associate user signups with inbound traffic sources?
One thought we've been kicking around: capturing the referrer on the first page load and passing it along in the session to the registration form, where you store it in the user record.
Any other solutions that are working successfully for you?
Thanks!
Indeed, I would suggest storing the referrer in the user record. Then you can write some code to sensibly draw out additional data from the URL. For instance, you could parse Google URLs to determine the keywords used to discover your site. And your code could detect things like referrals from ad runs, specific SEO campaigns you're running, or partner deals you have going.
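For instance, a small sketch of extracting keywords from a Google referrer URL (note that Google has since moved toward stripping the q parameter from most searches):

require 'uri'
require 'cgi'

# Extract search keywords from a Google referrer URL, or nil if absent.
def google_keywords(referrer)
  uri = URI.parse(referrer.to_s)
  return nil unless uri.host.to_s.include?('google.')
  CGI.parse(uri.query.to_s)['q'].first
rescue URI::InvalidURIError
  nil
end

google_keywords('http://www.google.com/search?q=rails+referral+tracking')
# => "rails referral tracking"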
It would be beneficial to spend some time building an admin-only page to visualize these conversions to help you better learn what is working and what isn't. And when things are going well, such a page is encouraging for the whole team!
Capturing the referral is a good start. You should capture it in a persistent cookie instead of the session, so that if the user returns tomorrow you still have the same referral information.
I've created a gem to automate tracking and saving referral infos. See https://github.com/holli/referer_tracking for more info.
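If you want to roll it by hand instead, a minimal sketch of the persistent-cookie approach in a Rails controller (the cookie name and one-year expiry are arbitrary choices):

class ApplicationController < ActionController::Base
  before_action :capture_referrer

  private

  # Remember the first referrer we ever see for this visitor so a
  # later signup can still be attributed to it.
  def capture_referrer
    return if cookies[:first_referrer].present?
    cookies[:first_referrer] = {
      value:   request.referer.to_s,
      expires: 1.year.from_now
    }
  end
end

At signup you would then copy cookies[:first_referrer] into the new user record.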
Some notes when designing tracking (I've tried to catch these with the gem already)
It might be better to save tracking data to a separate table, so that when you delete a user account you won't delete the information about how that account was created. That way you can answer questions like "where do bogus user accounts come from?"
Also save cookies to the db. If you are using Google Analytics you can parse Google's cookies to get additional information about the visitor, like the number of visits or campaign information.
It's also good to save user agents etc. to be able to differentiate between mobile and desktop browsers.
In the end it's good to visualize the information and conversions. But in the beginning it's hard to know what data you want to visualize and how, so try to capture as much data as possible and decide later how to crunch that data with scripts.
