Integrating Twitter, Facebook and other services in one single site - ruby-on-rails

I need to develop an application that pulls all the statuses and messages from different services like Twitter and Facebook into my application, and when I post a message it should get posted to all of those services as well. I am using Authlogic for authentication. Can anyone suggest what gems/plug-ins I can use?
I need API help to get all the tweets/messages displayed in my application, and also ways to post messages to the corresponding services from my application. Can anyone help me from a design point of view?

Walk through what you'd want to do in your head. Imagine the working site, imagine your webapp working before you start. So your user logs in (handled by authlogic) and sees a textbox called "What are you doing right now?". The user fills in a status message and clicks "post". The status message appears at the top of their previously posted messages.
Start with the easy part. Create a class that posts to two services. Use the twitter gem and rfacebook to post to two already defined services. In the future, you'll want to let the user associate services to their account and you would iterate through the associated services and post the message to each. Once you have this working, you can refactor or polish the UI a bit to round out this feature. I personally would do the "add a social media account to my profile" feature towards the end.
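A minimal sketch of that posting class. The Twitter adapter follows the twitter gem's REST client (update posts a status); the Facebook adapter's method name is hypothetical, since rfacebook's interface varies by version, so check its docs:

```ruby
require 'twitter'

# Adapters so every service exposes the same #post interface.
class TwitterService
  def initialize(credentials)
    @client = Twitter::REST::Client.new(credentials) # twitter gem's REST client
  end

  def post(message)
    @client.update(message) # posts a status update
  end
end

class FacebookService
  def initialize(session)
    @session = session # an rfacebook session object
  end

  def post(message)
    @session.set_status(message) # hypothetical method name; check rfacebook's docs
  end
end

class StatusBroadcaster
  def initialize(services)
    @services = services
  end

  # Post to every associated service; one failure shouldn't block the rest.
  def post(message)
    @services.each do |service|
      begin
        service.post(message)
      rescue StandardError => e
        Rails.logger.warn("#{service.class} post failed: #{e.message}")
      end
    end
  end
end
```

Later, when users can associate their own accounts, the services array simply becomes whatever you build from the user's associated accounts.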
Harder is the reading of the data (strangely enough) because you're going to have to figure out how to store it. You could store nothing but I suspect you'd run into API limits just searching all the time (could design around this). I would keep a little cache of posts associated to the user's social media account. In this way, the data model would look like this:
A user has many social media accounts.
A social media account has many posts. (cache)
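In Rails terms, that could look like this (names illustrative):

```ruby
# app/models/user.rb
class User < ActiveRecord::Base
  has_many :social_media_accounts
end

# app/models/social_media_account.rb
class SocialMediaAccount < ActiveRecord::Base
  belongs_to :user
  has_many :posts # the local cache of fetched content
end

# app/models/post.rb
class Post < ActiveRecord::Base
  belongs_to :social_media_account
end
```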
Of course, now you need to schedule the caching of the posts. This could be done manually, based on an event (like when they log in), or time-based. So when the update happens, you load up the posts for that social media account and the user will see them the next time they hit the page. For real-time push to the client's browser while they stare at the screen, use faye (non-trivial) and ajax to pull the new posts to the top of the social media stream view.
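If you do go the faye route, the server half is small. Below is a sketch assuming a standalone Faye server mounted at /faye on port 9292 and an illustrative per-user channel name; the HTTP publish trick follows Faye's documented pattern, but double-check it against the version you install. The browser side subscribes with Faye's JavaScript client.

```ruby
# config.ru -- a standalone Faye server (run with: rackup -s thin -E production)
require 'faye'
run Faye::RackAdapter.new(:mount => '/faye', :timeout => 25)
```

```ruby
# After caching a new post, publish it to that user's channel. Faye's server
# accepts a form-encoded "message" parameter over plain HTTP.
require 'net/http'
require 'json'

message = { 'channel' => "/users/#{user.id}/posts",
            'data'    => new_post.attributes }
Net::HTTP.post_form(URI.parse('http://localhost:9292/faye'),
                    'message' => message.to_json)
```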
The time-based one is tricky because you'd either have to have a cron job run or have Rails handle it all with a gem like clockwork, but then you have to leave Rails running. I've also solved this by having a class in /lib do all the work and a simple web call kick off the update, but that wasn't in a multi-user use case, so it might not work here. In any case, you'll want some nice reusable code for these problems since update requests can come from many different sources.
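A minimal clockwork sketch; PostCache here is a hypothetical class of your own in /lib that does the actual fetching:

```ruby
# clock.rb -- run with `clockwork clock.rb`; note this process has to stay up.
require 'clockwork'
require File.expand_path('../config/environment', __FILE__) # boot Rails

module Clockwork
  every(15.minutes, 'social_posts.refresh') do
    # PostCache is a hypothetical class in /lib that pulls new posts for
    # every SocialMediaAccount and stores them in the local posts cache.
    PostCache.refresh_all
  end
end
```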
You'll also have to deal with the API limits. When pulling down content from Twitter, you won't get everything. That will just have to be understood by the user, or you'll have to indicate a "break in time" somehow.
The UI should be pretty easy (functionally anyway), because you know which source the post/content is coming from. It'd be easy to throw a little icon next to the post to display which social media site it's coming from.
Anyway, good luck, sounds like a fun project.

Related

Create an internal message system with Rails

I have Users that can create DinnerEvents that contain Foods. Users specify preferred Foods using a join table. I would like to create an internal message system that automatically sends out a notice to other Users who "prefer" the Foods in the DinnerEvent that was created. Can anyone provide some guidance on how to approach this, or point to any good reference resources (I haven't had much luck searching)? I thought about ActionMailer but decided I wouldn't want people to get spammed all the time in their email inbox. I would prefer to only use Rails to achieve this.
There are a lot of options here and many use cases to think through. Maybe you can start with something very simple that:
Tracks the last date/time of login for each user
On some page (specific to the logged-in user), display all DinnerEvents created since last login that match their Food preferences. It should be simple Active Record to pull this (see the sketch after this list).
Continue to show this list until they dismiss it (record this date/time) or log in again
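A sketch of that Active Record pull, with illustrative association names. It assumes a last_login_at column on users that you update at sign-in (or let your auth library maintain), and that DinnerEvent has_many :foods through its own join table:

```ruby
class User < ActiveRecord::Base
  has_many :food_preferences
  has_many :preferred_foods, :through => :food_preferences, :source => :food

  # DinnerEvents created since this user's last login that include
  # at least one of their preferred Foods.
  def relevant_dinner_events
    DinnerEvent.joins(:foods)
               .where(:foods => { :id => preferred_food_ids })
               .where('dinner_events.created_at > ?', last_login_at)
               .uniq
  end
end
```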
A full blown messaging system will probably require more complex stuff like queues for each user that are subscribed to a master queue. And, possibly an additional backend data store like Redis. I'm purposefully leaving out the details of something like this for now; it's a much bigger topic.

Twitter API - trigger on new post?

I have never worked with the Twitter API, so I have no idea if this is possible. What I want to do is trigger a URL every time something new happens on a user's timeline. Is this possible, and if so, how do I do it?
Yes, but it takes a bit of work. You need to use the Twitter streaming API, specifically the follow option.
From Twitter:
Example: create a file called 'following' that contains, exactly and excluding the quotation marks, "follow=12,13,15,16,20,87", then execute:
curl -d @following https://stream.twitter.com/1/statuses/filter.json -uAnyTwitterUser:Password
Basically you pass a list of user IDs you want to follow, open a long-lived connection, and Twitter sends back anything those users post publicly. You can monitor this connection and do things when a user posts something.
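Here's a bare-bones Ruby equivalent of that curl command, a sketch assuming the same v1 endpoint and basic auth shown above (credentials and IDs are placeholders). One caveat: the streaming API delivers newline-delimited JSON, so production code should buffer chunks up to newline boundaries rather than parsing each chunk directly as below.

```ruby
require 'net/http'
require 'json'

uri = URI.parse('https://stream.twitter.com/1/statuses/filter.json')

Net::HTTP.start(uri.host, uri.port, :use_ssl => true) do |http|
  request = Net::HTTP::Post.new(uri.request_uri)
  request.basic_auth('AnyTwitterUser', 'Password')
  request.set_form_data('follow' => '12,13,15,16,20,87')

  # The connection stays open; Twitter pushes a chunk whenever someone posts.
  http.request(request) do |response|
    response.read_body do |chunk|
      next if chunk.strip.empty?              # keep-alive newlines
      status = JSON.parse(chunk) rescue next  # simplification: see caveat above
      # This is where you'd trigger your URL for each new post.
      puts "#{status['user']['screen_name']}: #{status['text']}"
    end
  end
end
```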
You have another option, called a User Stream, which gets you far more information about everything a user does, but it requires the user's approval and a much more complex authentication process via OAuth. So I would only use that if you need it.
How you keep a persistent connection open to Twitter depends very much on your programming language and software. In Python, I really like tweepy, but even for Python there are several different libraries, or you can just use curl or pycurl and do it yourself like in the example above.

A web app with logins in a kiosk situation

We are looking at setting up an internal web application (ASP.NET MVC) as a kiosk for the employees that don't have a dedicated computer. We currently do not have this kiosk set up. Each employee will have their own login to look at some basic payroll information and request leaves of absence. This same web application will be used by the office workers with a dedicated PC at their desk.
I am going to go out on a limb and say that no matter how many times we tell the employees, the employees will not click log off when they walk away from the kiosk. What would you do to help prevent this from happening?
Let's try to fix the users instead of the code :). I guess your logout button is like the one here on Stack Overflow: a little text link "logout" somewhere in the upper right corner. That's perfect for people who use web apps day in and day out and know they need to log out before someone comes along and wreaks havoc on their Facebook profile, but less tech-savvy users won't think of it and will just walk away.
You need to draw your users' attention to the logout button and teach them that logging out is a good thing.
Try the following:
Give the logout button more visual weight than usual: make it bigger, make it a real button instead of a text link, and even change its color to something more alerting (red, orange, whatever fits your CI).
If they don't log out, use the session timeout and some JavaScript to refresh the page after a set amount of inactivity, but also set a flag that this user did not log out after their last visit. That way you can greet them on their next login with a nice confirmation dialog and tell them once again why logging out is so important and where your logout button is located.
The naive solution would be to enforce a timeout. If there's no activity from the user within a certain time limit (say, a minute or so), log them out. Of course, this won't prevent someone from walking up immediately after an employee is done and seeing how much money they make.
ATMs handle this, I think, by timing out after a minute or two, which isn't super-secure but at least offers some minimal security.
If the employees have any kind of RFID card or other security token, you could require them to put it in a reader slot, and log them out whenever the card disappears. Handling this within a web app, though, could get complicated.
The simple way is to use a little JavaScript.
Just have it set to something like 30 seconds of inactivity. If the user hasn't clicked on anything, have the JavaScript send them back to the login page.
Here's a link to get you started.
Assuming you've already thought of the obvious (aggressive session timeouts, non-persistent authentication cookies, etc); how about a bit of an "out there" suggestion?
I'm not sure how doable this would be with a web-based interface, but what about using some form of IR sensor with a USB/serial interface and an API you can tie into? This may make it possible to invoke some form of "logout" operation when someone walks away from the kiosk.
Perhaps someone has a better suggestion for external hardware, but this was the first thing that leapt to my mind as an out-of-the-box approach.
I found a jQuery version that seems to work quite well. I'll start by using that and see how that goes.

Searching for a song while using multiple APIs

I'm going to attempt to create an open project which compares the most common MP3 download providers.
This will require a user to enter a track/album/artist name, e.g. Deadmau5; the app will then pull the relevant prices from the APIs.
I have a few questions that some of you may have encountered before:
Should I have one server-side page that requests all the data so it all loads simultaneously? If so, how would you deal with timeouts or any other problems that may arise? Or should the page load first, and then each price get pulled in one by one (ajax)? What are your experiences when running a comparison check?
The main feature will be to compare prices, but how can I be sure that the products are the same? I was thinking running time and track numbers, but I would still have to set one source as my primary.
I'm making this a wiki; please add and edit any issues you can think of.
Thanks for your help. Look out for a future blog!
I would check Amazon first. They will give you a SKU (the barcode on the back of the album; I think Amazon calls it an EAN). If the other providers use this, you can make sure they are looking at the right item.
I would cache all results into a database, and expire them after a reasonable time. This way when you get 100 requests for Britney Spears, you don't have to hammer the other sites and slow down your application.
You should also make sure you are parallelizing whatever requests you are doing server-side. curl, for instance, allows you to pull multiple URLs at once and assign a user-defined callback. I'd have the callback send back some data so you can update your page as the results come in: the callback returns some data for each URL while the connection is open, and you parse it on the client side.
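In Ruby, one way to do that parallel fetch is the typhoeus gem, a libcurl wrapper with per-request callbacks. A sketch with placeholder provider URLs (the timeout units depend on the typhoeus version you install):

```ruby
require 'typhoeus'

# Queue all provider lookups at once; callbacks fire as responses arrive.
def fetch_prices(urls)
  hydra   = Typhoeus::Hydra.new
  results = {}

  urls.each do |provider, url|
    request = Typhoeus::Request.new(url, :timeout => 5) # cap slow providers
    request.on_complete do |response|
      results[provider] = response.success? ? response.body : nil
    end
    hydra.queue(request)
  end

  hydra.run # blocks until every request completes or times out
  results
end

# Usage -- URLs are placeholders for the real provider endpoints:
fetch_prices(:amazon => 'https://api.example.com/amazon?q=deadmau5',
             :other  => 'https://api.example.com/other?q=deadmau5')
```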

Rails - Tracking Referrals to Conversions

We just launched and are looking to better understand where the users who are converting to registered users are actually coming from. We can see our traffic sources and referrals via Google Analytics and our other web statistics programs, but in volume, it's difficult to tie these specifically to which users in our database have converted and from where.
We have several "goals" in Google Analytics setup to better help track conversions, but what are others doing to associate user signups with inbound traffic sources?
One thought we've been kicking around: capture the referral on the first page load and pass it along in the session to the registration form, where you store it in the user record.
Any other solutions that are working successfully for you?
Thanks!
Indeed, I would suggest storing the referrer in the user record. Then you can write some code to sensibly draw additional data out of the URL. For instance, you could parse Google URLs to determine the keywords used to discover your site. And your code could detect things like referrals from ad runs, specific SEO campaigns you're running, or partner deals you have going.
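For example, a small sketch of pulling the search keywords out of a stored Google referrer (Google exposed the q= parameter in referrer URLs at the time):

```ruby
require 'uri'
require 'cgi'

# Extract the search terms from a Google referrer, or nil if not applicable.
def google_keywords(referrer)
  uri = URI.parse(referrer)
  return nil unless uri.host.to_s.include?('google.')
  CGI.parse(uri.query.to_s)['q'].first
rescue URI::InvalidURIError
  nil
end

google_keywords('http://www.google.com/search?q=compare+mp3+prices')
# => "compare mp3 prices"
```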
It would be beneficial to spend some time building an admin-only page to visualize these conversions to help you better learn what is working and what isn't. And when things are going well, such a page is encouraging for the whole team!
Capturing the referral is a good start. You should capture it in a persistent cookie instead of the session, so that if the user returns tomorrow you still have the same referral information.
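A minimal Rails sketch of that: remember the first referrer you ever see for a browser in a long-lived cookie, then copy it onto the user at signup. The cookie name and one-year expiry are arbitrary choices:

```ruby
# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  before_filter :capture_referrer

  private

  # Remember the first referrer we ever see for this browser.
  def capture_referrer
    return if cookies[:original_referrer].present?
    cookies[:original_referrer] = { :value   => request.referer.to_s,
                                    :expires => 1.year.from_now }
  end
end

# Then at signup, copy it onto the new user record:
#   @user.referrer = cookies[:original_referrer]
```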
I've created a gem to automate tracking and saving referral infos. See https://github.com/holli/referer_tracking for more info.
Some notes when designing tracking (I've tried to catch these with the gem already):
It might be better to save tracking data to a separate table, so that when you delete a user account you don't also delete the information about how that account was created. That way you can still answer questions like "where do bogus user accounts come from?" (see the sketch after this list).
Also save cookies to the db. If you are using Google Analytics, you can parse Google's cookies to get additional information about the visitor, like the number of visits or campaign information.
It's also good to save user agents etc. so you can differentiate between mobile and desktop browsers.
In the end it's good to visualize the information and conversions, but in the beginning it's hard to know what data you want to visualize and how. So try to capture as much data as possible and decide later how to crunch it with scripts.
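A sketch of that separate table, with all names illustrative; the point is that nothing cascades when a user row is deleted:

```ruby
class CreateSignupTrackings < ActiveRecord::Migration
  def change
    create_table :signup_trackings do |t|
      t.integer :user_id          # plain id, no cascading delete
      t.string  :referrer
      t.string  :user_agent
      t.text    :cookies_snapshot # e.g. serialized Google Analytics cookies
      t.timestamps
    end
  end
end

# Deleting a User leaves its SignupTracking rows intact, so "where do bogus
# accounts come from?" stays answerable after cleanup.
class SignupTracking < ActiveRecord::Base
end
```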
