Structure of a rails app that uses third party API's - ruby-on-rails

I'm new to RoR and I can't seem to grasp how to structure my app.
I have an app that pulls data from Google Analytics using garb. After doing some number crunching with the data, the app will populate a Report model and display the report to the user.
Right now, I'm separating the Google Analytics logic using concerns. In my concerns folder, I have a GoogleAnalytics module that is responsible for pulling the data. The Report model includes the GoogleAnalytics module. Before the number crunching in the Report model takes place, I need to clean up and reformat the data. Should this be a responsibility of the GoogleAnalytics module or maybe a helper?
Is there a better practice for including third party services?

The reformatting should live in whatever is responsible for pulling the data from Google Analytics. None of the rest of your app should have to know the format in which Google Analytics returns its data - the module should convert it into a sensible, standard interface and hide all of that from everyone else.
I would also strongly consider putting this stuff into a service object rather than a module. Including modules gets messy because when you call a method on an object you don't know where that method is defined. I would only use that pattern if you were including the same module in lots of other models and it was a true DRY play.
A service object would look something like (depending on what params you need to use to pull the data):
class GoogleAnalyticsDataFetcher
  attr_accessor :data

  def initialize(ga_id)
    @ga_id = ga_id
  end

  def fetch
    @data = do_some_stuff
  end
end
and then you could call it either from your controller or wrap it up inside the Report model somewhere. Then when you go GoogleAnalyticsDataFetcher.new(id).fetch it's incredibly obvious what is going on and where it is defined.
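Building on that, the cleanup and reformatting step from the question can live inside the same fetcher, so callers only ever see the normalized shape. A minimal sketch (the raw-row format and field names here are made up for illustration; a real garb call would replace raw_rows):

```ruby
# Hypothetical sketch: the fetcher owns both fetching and normalizing,
# so nothing else in the app sees Google Analytics' raw row format.
class GoogleAnalyticsDataFetcher
  attr_reader :data

  def initialize(ga_id)
    @ga_id = ga_id
  end

  def fetch
    @data = raw_rows.map { |row| normalize(row) }
    self
  end

  private

  # Stand-in for the real garb call; returns GA-style rows.
  def raw_rows
    [{ "ga:pagePath" => "/home", "ga:pageviews" => "42" }]
  end

  # Convert GA's naming and string values into a plain hash.
  def normalize(row)
    { path: row["ga:pagePath"], pageviews: row["ga:pageviews"].to_i }
  end
end
```

The Report model then number-crunches plain hashes and never has to change if Google alters its response format.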

Related

Can ActiveRecord sub classes be used to hold data from other sources such as Facebook too?

I am creating a new application that allows users to either create content in the local application database or directly on Facebook.
Imagine the user has a CRUD interface for a Post. I have created a model for Post that subclasses ActiveRecord::Base. Objects of this class have methods for saving the post to the local database.
However, the user is also able to "tick" an option in my application that says "connect to Facebook". When it is checked, the content will not be stored in my local database but will go directly to the Facebook Graph API.
The service layer and controller layer are not aware of where the data actually goes. At least that is my idea.
My question is whether I can use the same Post class for both local data and Facebook data. Some of the methods in the Post class, such as the save method, would not make sense when the post object contains data from Facebook.
Does that design seem stupid? Should I create another Post class that is just a normal Ruby class without sub classing ActiveRecord::Base? Or are there other better ways?
When designing a class you should make it as lean as possible. A good way to look at it is counting the nouns and verbs that are included in your model. A post can be saved or deleted, but if your save and delete start having logic related to Facebook it's a good sign that this should belong to a different class altogether.
One more note related to Facebook: the new guidelines don't allow posting 'pre-written' posts for a user, meaning that you won't be able to post to a user's wall without prompting them through Facebook either way.
I don't see any problems with having Post < ActiveRecord::Base - that is the standard Rails way and it does look like you should implement the standard ways of storing data to your DB and look into the Facebook posting from another angle.
There are some definite problems with that approach - the first is that you will end up with a lot of tight couplings to the Facebook API. I have had tons of grief from the FB APIs changing without warning or not working as documented.
Another issue is performance - loading data from FB is often painfully slow, even compared to reading from a DB across the wire. It is also prone to outages (at least in my experience).
I would definitely use "proxy objects" which are stored in your database and regularly hydrated from Facebook instead.
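To make the proxy-object idea concrete, here is a plain-Ruby sketch (class name, field names, and the graph_api client method are all hypothetical; in a real app this would be an ActiveRecord model hydrated by a background job):

```ruby
# Hypothetical "proxy object": the post lives in the local database and
# is periodically refreshed ("hydrated") from Facebook, so reads in the
# request cycle never block on the Facebook API.
class FacebookPost
  attr_reader :fb_id, :message, :synced_at

  STALE_AFTER = 3600 # seconds before a re-hydration is due

  def initialize(fb_id)
    @fb_id = fb_id
  end

  def stale?
    @synced_at.nil? || Time.now - @synced_at > STALE_AFTER
  end

  # Called from a background job, not from the request cycle.
  # `graph_api` is an assumed client responding to #fetch_post.
  def hydrate!(graph_api)
    attrs = graph_api.fetch_post(@fb_id)
    @message = attrs[:message]
    @synced_at = Time.now
    self
  end
end
```

Your controllers and views only ever touch the local copy; a scheduled job walks the stale records and re-hydrates them.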

API and Rails - Best way to go

Currently my application uses the FlickRaw gem to pull data from Flickr. I want to refactor and tidy up parts of it more and remove the FlickRaw dependency.
So, what I am using now is HTTParty, calling the Flickr API methods directly.
I have made a Flickr class and in there I have methods such as self.collection, self.photoset and so on to pull the appropriate stuff.
My concern is that from the locations in my app where I actually need to integrate with Flickr using those methods, I have to call Flickr.collection(id). If Flickr suddenly ceased to exist, I would have to go and find all those references to Flickr and alter them. I would prefer to rename my Flickr class to something like PhotoHost and have methods in there which DO call Flickr, but since the Flickr-specific part is all together, it should be easier to alter in the future if required. Does this sound like a sensible way to deal with this? How would you go about it?
Also, would the methods in that class map directly 1:1 with the real API methods, or would you make your own methods and build an array of the data you actually want before using it elsewhere in the app, or would you just send the Flickr response to those other areas and deal with it there? Actually, after typing that, it seems the best way to go would be to write a method called (for example) self.photos_for_album(album_id), have it call the appropriate Flickr API methods to build up an array of photos, and then have the method return that. I guess that would allow me to write the other code around the app to interact with the photos, and if I ever change the photo host from Flickr then, as long as the new methods return an array in the same format, all should be good.
Sorry, this question has been a bit of a braindump, but I would be very interested in hearing how others would go about this.
What you really want to use is something like a Strategy or Adapter pattern. You should create a FlickrPhotoHost module that implements methods that make sense for your app (not necessarily 1-1 with Flickr's api):
module FlickrPhotoHost
  def get_albums
    # fetch from flickr
  end

  def get_photo(album_id)
    # fetch from flickr
  end
end
And then you should have a PhotoHost class that includes this Flickr adapter:
class PhotoHost
  include FlickrPhotoHost
end
And then use it wherever it makes sense in your app.
source = PhotoHost.new
album = source.get_albums.first
photos = source.get_photo(album.id)
# and so on...
When flickr dies, you can replace it with InstagramPhotoHost:
module InstagramPhotoHost
  def get_albums
    # fetch from instagram
  end

  def get_photo(album_id)
    # fetch from instagram
  end
end
And the only part of the rest of your app you'll need to change is PhotoHost:
class PhotoHost
  include InstagramPhotoHost
end
This may be a bit of overkill for just one host, but imagine you wanted to implement Instagram and Flickr at the same time? Let PhotoHost decide at runtime:
class PhotoHost
  def initialize(type)
    extend case type
           when :instagram then InstagramPhotoHost
           when :flickr then FlickrPhotoHost
           end
  end
end
And on instantiation, you can do this for flickr albums:
PhotoHost.new(:flickr).get_albums
Then BAM, instagram:
PhotoHost.new(:instagram).get_albums
Abstracting to the next higher level is often a fine idea, e.g. your app would use PhotoHost rather than Flickr.
The opposing view is to watch out for premature optimization. E.g. does your app have an active need to replace Flickr with an alternative? What else could you be spending your time on that might provide more features/benefits to your app and its users?
Also: think about how your abstract PhotoHost will enable your app to talk with more than one implementation at a time (being able to easily display both Flickr and Instagram photos).
You also asked about the API methods.
I would be careful about changing methods just for the sport of it. Hopefully some thought went into the Flickr api methods.
But you should certainly change methods to add additional functionality that your app can make use of. Eg if the Flickr api is synchronous, you might want to add a set of asynchronous methods that would enable your end users to multitask (or at least get feedback) while the PhotoHost completes the requested operations.
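One lightweight way to add such an asynchronous variant is to wrap the synchronous host in a decorator that runs the call in a background thread. A sketch (AsyncPhotoHost is a made-up name; photo_host is assumed to respond to #get_albums like the PhotoHost class above):

```ruby
# Sketch of an asynchronous wrapper over a synchronous photo host.
class AsyncPhotoHost
  def initialize(photo_host)
    @photo_host = photo_host
  end

  # Kicks off the fetch in a background thread and returns immediately;
  # call #value on the returned Thread to block until the result is in.
  def get_albums_async
    Thread.new { @photo_host.get_albums }
  end
end
```

Thread#value also re-raises any exception from the block, so error handling stays with the caller rather than disappearing into the background.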

Using mixpanel to build custom analytics dashboard for users

I love graphs.
I'd love to get my hands on some data and make it look pretty. But alas, I'm a little lost on what would be considered best practice.
I've selected Mixpanel (only as an example) as it seems wonderfully easy to track custom events, and it doesn't have any subdomain limitation like Google Analytics.
Say I had 100-1000+ users who have an account (which is publicly facing), and I'm currently tracking the public interactions their pages get. With mixpanel, I can see the data which is lovely, and I've segmented it to individual accounts. So far, so good!
But then, I want to show my users this information. And here my head begins to hurt. Do I schedule a cron job, pulling in the data from Mixpanel and writing it to their respective accounts? Or is there a better way? I've looked into Mixpanel's API (I'm using Ruby), but they keep telling me I should use the JavaScript API. But in using JS, how does one prevent others from getting the data (i.e. what's stopping someone faking Mixpanel API posts in the console, or viewing my private key)?
What would you consider a practical solution in such a case?
You can achieve this by storing the user-specific events of each user with a $bucket property attached, which has a value unique to each user, as explained in the Mixpanel docs. If you still want to use Ruby to serve the events, have a look at Mixpanel's recommended Ruby client libraries.
mixpanel_client looks like the better-maintained option of the two mentioned. If you go with that, you can serve user-specific events as shown in the example below (which is also in the gem's README):
data = client.request do
  # Available options
  resource  'events/properties'
  event     '["test-event"]'
  name      'hello'
  values    '["uno", "dos"]'
  timezone  '-8'
  type      'general'
  unit      'hour'
  interval  24
  limit     5
  bucket    'contents'
  from_date '2011-08-11'
  to_date   '2011-08-12'
  on        'properties["product_id"]'
  where     '1 in properties["product_id"]'
  buckets   '5'
end
You could try a service like Keen IO that will allow you to generate encrypted scoped write and read API keys. Keen IO is built for customizable and programmatic analytics features such as exposing analytics to your customers, whereas Mixpanel is more for exploring your data in their UI. The idea with an encrypted scoped key is that customers will never be able to access your account, only the data you want them to see. You could easily tag your events with a customer ID and then use the scoped keys to ensure that you only ever show customers their own data.
https://keen.io/docs/security/#scoped-key
Also, Keen IO has an "importer" which allows you to export your Mixpanel events into your Keen IO database.

How many ways to share data among activities in monodroid?

I need to share some sensitive data among activities.
I have two EditText fields, which are basically username and password.
I am consuming a web service which, on the basis of the provided username and password, returns some user info (DataType: String), like userid, useremail etc., which is basically in CSV format.
I need these pieces of information throughout my application, but I can't figure out which is the better way.
-- One way I have found out so far is to use SQLite with Mono for Android
-- The other way I found out is using the Application class
I just started to learn Android today, but I want to know if there are some other ways to share data?
As you mentioned, a global Application class and the database are two good ways to share application-wide data. One thing to be careful with is that your Application class could be recycled when the app is in the background, so you would lose any data that hasn't been persisted to something more permanent.
In addition to the database, you can also persist data down to the filesystem as well. This recipe from Xamarin has an example of writing directly to a file. Most of the classes you'll need to do file access are found in the System.IO namespace. Mono for Android also supports isolated storage, which provides a higher level API for reading and writing files.
If you simply need to pass data directly between activities, you can do so by adding it as an extra to the intent. This recipe explains how to do that.
If you want to wrap up access to a particular resource in a more managed fashion that can be accessed by either other parts of your application or even external applications, you can look into implementing a content provider. Android itself provides several built-in content providers for resources like contacts and media, if you need an example of what it's like to use one. This recipe explains how to read from the contacts provider.

How to test google analytics (garb) API with Rspec?

I'm using the garb gem to pull some basic stats, like pageviews, from Google Analytics. Everything's working correctly, but I can't figure out the best way to test my API calls. Here's a pared-down version of my Analytics class:
class Analytics
  extend Garb::Model

  metrics :pageviews
  dimensions :page_path

  Username      = 'username'
  Password      = 'password'
  WebPropertyId = 'XX-XXXXXXX-X'

  # Start a session with Google Analytics.
  Garb::Session.login(Username, Password)

  # Find the correct web property.
  Property = Garb::Management::Profile.all.detect { |p| p.web_property_id == WebPropertyId }

  # Returns the number of pageviews for a given page.
  def self.pageviews(path)
    Analytics.results(Property, :filters => { :page_path.eql => path }).first.pageviews.to_i
  end

  # ... a bunch of other methods to pull stats from google analytics ...
end
Pretty simple. But beyond ensuring that the constants are set, I haven't been able to write effective tests. What's the best way to test something like this? Here are some of the problems:
I'd prefer not to actually hit the API in my tests. It's slow and requires an internet connection.
The stats obviously change all the time, making it difficult to set an expectation even if I do hit the API when testing.
I think I want a mock class? But I've never used that pattern before. Any help would be awesome, even just some links to get me on the right path.
FakeWeb is a good place to start. It can isolate your SUT from the network so that slow connections don't affect your tests.
It's hard to know what else to say without knowing more about Garb. Obviously you'll need to know the format of the data to be sent and received from the API, so you can make the appropriate mocks/stubs.
I would suggest creating a testing interface that mimics the actual calls to the Google API. The other option would be to use mocks to create sample data.
I agree that it's best not to hit the actual API, since this does not gain you anything. A call to the actual API might succeed one day and fail the next because the API owners change the response format. Since GA probably won't change its versioned API, I think it's safe to create an interface that you can use in your test environments for faster testing.
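To see the mocking idea in isolation, here is a plain-Ruby sketch of stubbing out the one method that hits Google - the stripped-down Analytics class below is a stand-in for the question's class, not the real garb interface. In an actual spec, rspec-mocks does the same thing more cleanly with `allow(Analytics).to receive(:results)`:

```ruby
# Stripped-down stand-in for the question's Analytics class; .results
# is the one method that would talk to the Google Analytics API.
class Analytics
  def self.results(_property, _opts = {})
    raise "this would hit the Google Analytics API"
  end

  def self.pageviews(path)
    results(nil, filters: { page_path: path }).first.pageviews.to_i
  end
end

# Test-side stub: redefine .results with a canned response, so the
# pageviews logic can be exercised without any network traffic.
FakeRow = Struct.new(:pageviews)

def Analytics.results(_property, _opts = {})
  [FakeRow.new("123")]
end
```

The test then exercises pageviews against fixed, deterministic data - which also solves the problem of live stats changing between runs.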
