I want to create an app that lets the user read feeds from multiple sites. The user can add a new feed, and the news from all stored feeds will be presented in a table view. So I wanted to ask: is there a way to load the content from multiple XML feeds into one combined feed and order it by publish date?
What would be the best approach?
Thank you very much!
Grouping all feeds into one is usually a bad call if you want reasonable latency, as grouping usually means caching each individual feed before they're grouped.
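For what it's worth, the "merge and order by publish date" part is straightforward on any platform. Here is a minimal sketch in Ruby using the standard RSS library, assuming RSS 2.0 feeds and placeholder URLs:

```ruby
require "rss"
require "open-uri"

# Placeholder URLs; in the app these would be the user's stored feeds.
urls = ["https://site1.example/feed.xml", "https://site2.example/feed.xml"]

# Parse every feed and flatten all items into a single array.
items = urls.flat_map { |url| RSS::Parser.parse(URI.open(url).read, false).items }

# Order by publish date, newest first; items without a date sink to the bottom.
merged = items.sort_by { |item| item.date || Time.at(0) }.reverse
merged.each { |item| puts "#{item.date}  #{item.title}" }
```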
Related
I am new to Twitter and need some tips.
I need to display tweet feed from multiple users on some webpage.
The first thing I stumbled upon is Embedded Timelines. It allows displaying tweets from a list of users, but the gotcha is that those lists have to be maintained on the Twitter side (i.e. I cannot just specify #qwe and #asd on my side and get a timeline without adding those users to a list on the Twitter side).
The thing is that the list of users to include in the timeline is dynamic, and managing those lists through the Twitter API will probably be painful. Not to mention that my website will probably generate tons of those lists, and I feel that I will violate some API quotas sooner or later.
So, my question is: am I stuck with using Embedded Timelines that refer to some user list on the Twitter side and managing those lists through, say, the Twitter REST API, or is there a simpler way to do what I want?
It's pretty simple to display tweets for multiple users.
Links to start with
This post explains some of the search queries you can make
This post covers a simple library for making requests to the Twitter API that 'just works'
Your Query
Okay, so you want multiple users. The endpoint you're looking at using is the search/tweets one: https://api.twitter.com/1.1/search/tweets.json.
The query string uses the from: operator, and you can combine multiple from: clauses with AND/OR.
An example query for the GET request:
?q=from:user1+OR+from:user2
Read more about the search API queries here.
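To make the query concrete, here is a rough sketch (in Ruby, to keep a single language for the examples on this page) of building that request URL; user1/user2 and count are placeholders, and the actual request still has to be OAuth-signed as API 1.1 requires, which the library linked above can handle:

```ruby
require "uri"

query = "from:user1 OR from:user2"
url   = "https://api.twitter.com/1.1/search/tweets.json?" +
        URI.encode_www_form(q: query, count: 20)

puts url
# => https://api.twitter.com/1.1/search/tweets.json?q=from%3Auser1+OR+from%3Auser2&count=20
```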
Your "over-the-quote" issue
This is something you're going to need to figure out yourself. Depending on the number of requests you expect to make and the Twitter-imposed limits, you may want some sort of caching: save information when you hit your limit, and only pull from the cache while you're still at the limit.
Is there a way you can dynamically combine multiple Twitter timelines into a single display? For example, I want to allow a user to set a preference for which timelines they want to be displayed, and then have the results displayed in a single table.
I have seen the posts about creating a list as a way to combine multiple Twitter timelines into one request and displaying that. But that means I am predefining which timelines the user gets, and they display all or nothing.
I'd like each user to be able to pick between TimelineA, TimelineB, and TimelineC, and then have the table dynamically update to display only those chosen. I was hoping there was a way to manipulate the GET statuses/user_timeline parameters so that it would return results for multiple screen_name values, but I've not been able to sort it out.
I'm targeting iOS 6, using the Twitter 1.1 API, and currently have a single timeline displaying successfully in a table, thanks to the Techotopia tutorial.
Thanks for any suggestions.
You can create an array containing the tweets from all the timelines sorted by a date parameter like created_at. You can sort this array using something like:
How to sort an NSMutableArray with custom objects in it?
This array would be your data source for a UITableView.
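The linked question covers the Objective-C side; purely to illustrate the merge-and-sort idea (sketched in Ruby for consistency with the other examples on this page, with made-up tweets), note that created_at arrives as a string in Twitter's date format, so parse it before sorting:

```ruby
require "time"

# Made-up tweets from two different timelines, trimmed to two fields each.
tweets = [
  { "text" => "from TimelineA", "created_at" => "Wed Aug 27 13:08:45 +0000 2014" },
  { "text" => "from TimelineB", "created_at" => "Thu Aug 28 09:12:01 +0000 2014" },
]

# Twitter's created_at uses the "%a %b %d %H:%M:%S %z %Y" format; newest first.
sorted = tweets.sort_by { |t| Time.strptime(t["created_at"], "%a %b %d %H:%M:%S %z %Y") }
               .reverse

sorted.each { |t| puts t["text"] }
```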
I am looking for a solution that would provide subscription-like responses containing results for a particular tag from Twitter.
I saw plenty of REST/stream scripts for node.js, but these scripts connect to Twitter just once.
I would like not to be worried about rate limits.
Basically what I want to do is set up a notification (ex. console log) if there is a new search result for 20 different tags.
Is that possible?
With the streaming API you are only supposed to use a single connection; however, it can contain multiple keywords (comma-delimited) in the track attribute, and can be combined with other parameters such as location and user filters.
The idea is that you use the one stream to collect all the data you need, and then process/filter that data independently to display or store as you please.
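The question mentions node.js, but to keep these sketches in a single language, here is roughly what that one connection with multiple tracked keywords looks like using the Ruby twitter gem (version 5+); the credentials and tags are placeholders:

```ruby
require "twitter"

client = Twitter::Streaming::Client.new do |config|
  config.consumer_key        = "YOUR_CONSUMER_KEY"
  config.consumer_secret     = "YOUR_CONSUMER_SECRET"
  config.access_token        = "YOUR_ACCESS_TOKEN"
  config.access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"
end

# One connection, many comma-delimited keywords in the track attribute.
client.filter(track: "tag1,tag2,tag3") do |object|
  # Notify however you like; a console log is enough to start with.
  puts "new result: #{object.text}" if object.is_a?(Twitter::Tweet)
end
```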
I've seen sites like this (http://www.tradename.net/) on the web that seem to be nothing more than a collection of news articles pulled in from different places, all seemingly automated... I would like to know how I can create something like this that:
(a) either automatically, on its own, pulls data from different news feeds and creates these articles/news content, OR
(b) lets me run a program periodically to update all its content
I am looking for ready-to-run software or a module that I can take, plug the keywords or links to news feeds into, and get working... I'm not interested in one of those paid template sites.
Another example: http://www.limitedliability.org/
You can just make your own website like that. Just use RSS feeds from topics / news websites that you'd like to show your users. Customize your website however you want using one of the scripting languages. It's not very hard to loop through all the news flashes in an RSS feed and show them to your users.
You can use PHP
Or .NET
Or JavaScript
And obviously there are more ways to do this. Just take a good look around and check which scripting language you feel most comfortable with.
Create a script that parses the RSS feeds from the news sites and only stores the items that you are interested in.
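A rough sketch of that script in Ruby with the standard RSS library (the feed URLs and keywords are placeholders):

```ruby
require "rss"
require "open-uri"

FEEDS    = ["https://news.example.com/rss"]        # feeds you want to pull from
KEYWORDS = [/startup/i, /technology/i]             # topics you care about

FEEDS.each do |url|
  feed = RSS::Parser.parse(URI.open(url).read, false)
  feed.items.each do |item|
    next unless KEYWORDS.any? { |keyword| item.title =~ keyword }
    puts "#{item.date}  #{item.title}  #{item.link}"   # or store it in your database
  end
end
```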
Or just create your own Google News feed and add it to your site. There are free feeds for non-commercial use.
Available Google News Feeds
RSS Feeds: Incorporate feeds onto my site
I'm using the RSS library so I can parse Atom and RSS in Ruby and Rails and store it in a model.
I've looked at the standard RSS library, but is there a library that will auto-detect that there is a new RSS feed, so I can update my database?
What is the best practice for triggering an instruction in order to store the new RSS feed?
Should I use threads to handle that problem? Is it going to be slow?
Thank you for your help.
OK, here's the deal.
If you want a really fast feed parser, go for Feedzirra (it does not work on Windows): http://github.com/pauldix/feedzirra
Autodiscovery?
- There's truffle-hog if you don't want to do GET redirects: http://github.com/pauldix/truffle-hog
- There's feedbag if you want to do GET redirects to find feeds from given URLs (this is slower, though): http://github.com/damog/feedbag
Feedzirra is the best bet if you want to poll for new entries for your feed. But if you want a more non-polling solution to your problem, then I would suggest going through the PubSubHubbub spec. While parsing your feeds, make sure they are PubSubHubbub enabled: check for the link tag. If it points to pubsubhubbub.appspot.com or any other PubSubHubbub-enabled hub, just subscribe to the feed by sending a subscription request to the hub. You can then define an endpoint in your app which will in turn receive updated-entry pings for your feed subscription from the hub. Just read the raw POST data and store it in your database. The stats are that 95% of Blogger blogs are PubSubHubbub enabled. That is a lot of data in your hands already. :)
If you are polling for changes, then you should check the Last-Modified or ETag headers rather than parse the entire feed again. That saves you from wasting resources; Feedzirra takes care of this for you.
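A minimal polling sketch with Feedzirra (the feed URL is a placeholder); update re-sends the stored etag/last-modified, so unchanged feeds are not parsed again:

```ruby
require "feedzirra"

# First fetch: parse the feed and keep the object (or persist its attributes).
feed = Feedzirra::Feed.fetch_and_parse("http://example.com/atom.xml")

# On each poll: update does the conditional GET and flags any new entries.
updated = Feedzirra::Feed.update(feed)
if updated.updated?
  updated.new_entries.each do |entry|
    puts "#{entry.published}  #{entry.title}  #{entry.url}"   # store in your database
  end
end
```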
I am not sure what you mean by "auto-detect" a new feed.
Are you looking for code that can discover when someone creates a new feed on a site? Or, do you mean discover when an existing feed has a new article?
The first is tough because your code needs to know what site to look at, so it needs some sort of auto-discovery of sites with new feeds. Searching Google for "new rss feeds" doesn't return anything that looks useful, at least not on the first page. If you, or your users, know of a new site, then you can have an interface to add new sites to search. Then you grab the page at that URL, look for the RSS/Atom auto-discovery links, and go from there. Auto-discovery links can open a can of worms because of duplicate content being served using different protocols (RDF, RSS and Atom), so you have to decide which to use, or handle multiple feeds with alternate content listed.
If you mean you want to discover when an existing feed has new articles, then you have to keep track of the last time your code looked at the feed, and the last article that was seen, then retrieve the feed and see if any articles were not in your list of previously seen articles. Your code needs to be sensitive to the time-to-live information in a lot of feeds too. Hitting the feed every fifteen minutes when they update once a week is bad form. Most aggregation code can do those things already but you might need to configure a database and tell the code how to find it.
Generally, for this sort of task I set up a crontab entry on a production Linux or Unix system and fire off the job periodically, looking in the database for feeds whose last-run-time plus the stored time-to-live value is in the past.
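That cron-driven check might look something like this; the Feed model, its columns, and refresh_feed are all assumptions, not anything from a specific library:

```ruby
# Run from a crontab entry, e.g. every 15 minutes.
Feed.find_each do |feed|
  due_at = feed.last_run_at + feed.ttl_minutes * 60
  next if due_at > Time.now

  refresh_feed(feed)   # hypothetical helper: fetch, diff against seen articles, store new ones
  feed.update_attribute(:last_run_at, Time.now)
end
```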
Does that help any?
A very easy solution is to use dynamic attribute-based finders.
When you are filling your model with RSS feed data, instead of Model.create(...) use Model.find_or_create_by_column(value, :other_column => other_value).
You can use a date as the unique value, or the RSS message title ... (whatever you want).
I think this is pretty easy. You can make a cron task to fill your model once per hour, for example. Only new entries will be added.
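For example (a sketch only; the Entry model and its columns are assumptions, and the dynamic finder matches the older Rails style mentioned above):

```ruby
require "rss"
require "open-uri"

feed = RSS::Parser.parse(URI.open("https://example.com/feed.xml").read, false)

feed.items.each do |item|
  # The guid (or the title/date) acts as the unique value, so re-runs skip known items.
  Entry.find_or_create_by_guid(item.guid.content,
                               :title        => item.title,
                               :link         => item.link,
                               :published_at => item.date)
end
```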
There is no way to get an "event" when an RSS feed is updated without downloading the whole feed again.