How Twitter updates retweets and favorites without refreshing the page - iOS

Does anyone who uses the current version of Twitter know what they use to update retweets and favorites without having the user refresh the page? Or does anyone at least know how I can do this with Swift? Would this be possible by using AJAX on the backend?

There are a couple of ways to get that working. You can run a script that hits a URL every second, gets the new retweet and favorite counts, and updates the info on the page.
There is also a relatively new technology called WebRTC that provides a different way to connect two devices through the web (RTC stands for Real-Time Communication).
Of course, you can use ready-made plugins or frameworks that are suited to this kind of work and spare you the pain, but ideally you should understand how these things work under the hood first.
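As a rough illustration of the polling approach in Swift (which the question asks about), here is a minimal sketch that hits an endpoint on a timer and decodes the counts. The URL and the JSON shape are made up for the example; substitute whatever your backend actually exposes.

```swift
import Foundation

// A minimal polling sketch. The endpoint and JSON shape are assumptions
// for illustration only, not a real Twitter API route.
struct Counts: Decodable {
    let retweets: Int
    let favorites: Int
}

final class CountsPoller {
    private var timer: Timer?
    // Hypothetical endpoint; replace with whatever your backend exposes.
    private let url = URL(string: "https://example.com/status/123/counts")!

    func start(onUpdate: @escaping (Counts) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            URLSession.shared.dataTask(with: self.url) { data, _, _ in
                guard let data = data,
                      let counts = try? JSONDecoder().decode(Counts.self, from: data) else { return }
                // Hop back to the main thread before touching any UI labels.
                DispatchQueue.main.async { onUpdate(counts) }
            }.resume()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```

In practice you would poll less aggressively than once per second and invalidate the timer when the view goes away; polling is the simplest option, not the most efficient one.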
I hope I've helped.

Related

Database for Real Time Queries/Push Notifications

I am looking to build an iOS app and website that work 'together'.
What the plan is for each:
On the iOS side, the app will push information to the server in the form of posts. Users will then be able to vote the posts up and down, which also implies they will be able to see other users' information (in real time).
The website will display this information in real time and make use of the posts. If a post gets enough downvotes, the server should tell the website and apps to remove it.
I have experience with SQL, although SQL does not seem to be the appropriate choice for what I want to do, given my experience with it. (I could definitely be wrong.)
I would like to host the information myself; however, I have heard that Parse is good at holding information for iOS apps. I just don't know whether it gives you enough freedom to work with websites as well.
TL;DR: What kind of database/datastore should I use for real-time queries that allows for push notifications?
All suggestions are welcome. Thank you.
Try using Firebase: firebase.google.com (see the documentation).
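For the real-time behaviour described in the question (posts, votes, and removals propagating to every client), a rough sketch of how this might look with the Firebase Realtime Database iOS SDK is below. The "posts"/"votes" schema is purely illustrative, not something Firebase prescribes, and FirebaseApp.configure() is assumed to have been called at app launch.

```swift
import FirebaseCore
import FirebaseDatabase

// Rough sketch of the Firebase Realtime Database approach.
// "posts" and its fields are an assumed schema for illustration only.
// FirebaseApp.configure() is assumed to have been called at app launch.
final class PostsService {
    private let postsRef = Database.database().reference(withPath: "posts")

    // The iOS app pushes a new post to the server.
    func createPost(text: String) {
        postsRef.childByAutoId().setValue(["text": text, "votes": 0] as [String: Any])
    }

    // Any client (app or website) hears about vote changes as they happen,
    // so nobody has to refresh or poll.
    func observeVoteChanges(onChange: @escaping (String, Any?) -> Void) {
        postsRef.observe(.childChanged) { snapshot in
            onChange(snapshot.key, snapshot.value)
        }
    }

    // When server-side logic deletes a post that collected too many downvotes,
    // every listener receives a childRemoved event and can drop it from the UI.
    func observeRemovals(onRemove: @escaping (String) -> Void) {
        postsRef.observe(.childRemoved) { snapshot in
            onRemove(snapshot.key)
        }
    }
}
```

The website can attach equivalent listeners through Firebase's JavaScript SDK, which is what lets both clients react to the same removal without polling.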

What is a good strategy for staying up to date with external APIs?

My project relies on several APIs, like Twitter and YouTube for example. Recently, YouTube deprecated their old API, and it caused issues with my team's iPad app.
We could have stayed ahead of the change if we had been paying attention to YouTube's announcements of the upcoming deprecation. But alas, we were not, and the idea of staying up to date with all of our dependencies manually (by browsing the web) seems exhausting and inefficient.
I have found the following tool, https://libraries.io, which helps notify you when changes occur in external library dependencies. However, this does not help with API dependencies.
Besides checking the API source webpages every so often, I was wondering if anyone had suggestions on how to stay notified and up to date with news regarding updates to a specified list of external APIs?
After some time spent looking at different options, I have found a solution that is not perfect but seems to fit this need best.
Solution Description
This solution uses a combination of Twitter, Google Scripts, and the website blogtrottr.com. I created a Twitter list of reliable dev handles that often post updates about their APIs; for example, I made a list that contained @TwitterAPI and @YouTubeDev. I then used Google Scripts to create an online feed out of the Twitter list, and used Blogtrottr to email me every time that feed gets a new posting.
Steps to Implement
Create a Twitter list of reliable handles that often post about updates to their APIs.
Create an RSS feed from that Twitter list. The details for how to do this can be found here.
Plug the URL that you get from Google Scripts into Blogtrottr.
I did find some other ways to do this, but so far this is the only solution that is 100% free!

Tracking users' clicks and page visits in Rails

I would like to monitor users' page visits and clicks in my Rails app to make recommendations. My questions are:
Is there a Rails gem for this, or is Google Analytics the standard? If the latter is true, then how should I link a page visit to a particular user profile?
It is typical in Rails to have a <head> section in application.html.erb, which is shared by all pages. If I add the Google Analytics pageview tracking code to the <head> in application.html.erb, will it be able to track all individual pages?
There are other ways, but the vast majority probably use Google Analytics. Several gems exist that help you integrate with GA to get at the data. See here: https://www.ruby-toolbox.com/categories/Web_Analytics.
Based on your first question, it seems you may want more insight than GA can provide. I've used ClickTale (http://www.clicktale.com) and Woopra (http://www.woopra.com) before, to good effect. This article lists several other alternatives, too - notice the high marks for Clicky: http://imimpact.com/web-stats-alternatives-to-google-analytics/.
Google Analytics (and almost all of these others) will take care of your second question automatically whenever the user loads a new page, since it is keyed by URL. That means that, although you put the GA script code in a single place, each unique page is tracked individually.
If you have AJAX requests that change the page without changing the URL, you'll need to dig into the GA script API. Essentially, you'll need to push a new URL (possibly with a # in it) whenever you want to track an AJAX-driven link/button click. See here: http://davidwalsh.name/ajax-analytics
I am biased, but I would recommend checking out impressionist if you need to integrate the page views into the app in real time. With analytics you will always have some lag time, and you are also relying on an external dependency. Impressionist is good if you need this kind of control, but if you are just looking for simple metrics and don't need to pull them into the app, then analytics is probably the way to go.
Check out Ahoy, at https://github.com/ankane/ahoy. With just a few lines of code in your app, you can track page views and tie them to user accounts.
You can further customize Ahoy to track custom events, both on the client (with JavaScript) and on the server.
Ahoy does not depend on any third-party services.

When did someone follow - Twitter API

I've been attempting to go over the Twitter API, although it has taken me a while and I'm being thrown back and forth between the old and the new site. However, I was wondering if there is a date at all for when a user decided to follow you, or if you're able to tell when a user stopped following you?
I've been looking through here https://apiwiki.twitter.com/Twitter-API-Documentation to no avail, but I wondered if anyone knew of a way of doing it (outside of a separate monitoring system of course!)
Cheers,
Dan
The Twitter API doesn't explicitly provide dates for when a user started following you or stopped following you. This is something that you would need to monitor in some fashion.
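If you do build that monitoring yourself, the core of it is just snapshotting your follower ID set on a schedule and diffing it against the previous snapshot. Below is a minimal Swift sketch of that diff; the actual fetch (for example, a call to the followers/ids endpoint) is left to the caller, and the recorded timestamps are only as precise as how often you run the check.

```swift
import Foundation

// Minimal monitoring sketch: snapshot the follower ID set on some schedule,
// diff it against the previous snapshot, and record when the change was seen.
struct FollowerChange {
    let userID: Int64
    let startedFollowing: Bool   // true = new follower, false = unfollowed
    let observedAt: Date         // only as precise as your polling interval
}

final class FollowerMonitor {
    private var knownFollowers: Set<Int64>

    init(initialFollowers: Set<Int64> = []) {
        knownFollowers = initialFollowers
    }

    // Call this each time you fetch the current follower IDs.
    func record(currentFollowers: Set<Int64>) -> [FollowerChange] {
        let now = Date()
        let gained = currentFollowers.subtracting(knownFollowers)
        let lost = knownFollowers.subtracting(currentFollowers)
        knownFollowers = currentFollowers
        let changes = gained.map { FollowerChange(userID: $0, startedFollowing: true, observedAt: now) } +
                      lost.map { FollowerChange(userID: $0, startedFollowing: false, observedAt: now) }
        return changes
    }
}
```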

Best way to display a Twitter feed (with history) on a Rails site

On a Rails site, I'd like to display a certain Twitter feed, with pagination so the visitor can see previous tweets (as far back as needed).
I implemented it using the Twitter gem's Search method, which has a nice pagination method, but hit a limitation: Twitter will only return statuses from the last two weeks. So after going back a couple of pages, it won't fetch any more.
I could use the user_timeline method with max_id and then do my own pagination (passing the max_id of the last item viewed back to the controller to fetch the next batch).
Or, I could have a rake task that polls the Twitter feed frequently (with cron), and stores the tweets in the DB. The site would serve those up from the DB instead of querying Twitter at all.
Which would be the best or recommended method? I don't like having to store the tweets in the DB, but that would also take care of the latency problem of querying Twitter (though I could use fragment caching to overcome that, except that I haven't been able to get it working with AJAX).
Thanks for the advice.
I take the opposite view here: storing tweets in your database is not a good idea, for a range of reasons.
You can never be sure that you got all the recently added tweets, as a whole bunch could be added in quick succession. Sure, you can just make the cron job run more frequently, but then we get to the next problem.
If tweets are deleted, for whatever reason, your app will still cache them, which to me is not good practice, as they would have been removed for a reason.
To be honest, I would not have your app serve the tweets, but have a 'widget' (jQuery or such) on the page which would load them once the page has loaded, and look at implementing some form of pagination there instead.
I'd go for storing the tweets in the database.
That way, even if Twitter is offline you won't have long load times; you'll just rely on your database and the tweets will be displayed appropriately.
Only your background job will fail when Twitter is unavailable, and that's not really a problem.
We download and store the tweets in a local database. I recently wrote a blog post about how we achieved this:
http://www.arctickiwi.com/blog/16-download-you-twitter-feed-using-ruby-on-rails-with-oauth
You can then use will_paginate to handle your pagination and go back as far as you want.
