Does Twitter make Twitter outage data available?

I'm looking to find Twitter outage data, a dataset that can tell you when, where, and for how long Twitter becomes offline or difficult to use/access. Does Twitter make this information available somewhere, maybe through its APIs?
Thanks!

No, there is no Twitter API for that. I suspect you could use something like Down Detector, but I don't know what historical data it has available (and we are in the realm of recommending software, which is off topic for Stack Overflow).

Related

Downloading Twitter corpus

I am working on a data mining system, and one of the requirements is that it be able to perform the analysis without using the API. Is there a way to download the Twitter database (or at least a big part of it) and work with it locally?
There is a paper about creating corpora from Twitter called “TWORPUS – An Easy-to-Use Tool for the Creation of Tailored Twitter Corpora”. I recommend reading it because it also covers licensing issues, etc. They also provide their code on GitHub.
In fact, you cannot download Twitter data dumps directly. You can download single tweets and store them in a corpus, but you are not allowed to share that data. Therefore, the authors built the Tworpus client to create private Twitter corpora.
The APIs are the official way of getting Twitter data, and they work really well, so it is hard to understand why you do not want to use them. Web scraping is a workaround, but it is not recommended, and since you want a big part of the data, I do not think you will be satisfied with it. You can also buy the data from Gnip.

How to crawl Twitter data

I've searched through Stack Overflow, but the answers are dated. I was wondering if anyone knows how to crawl a topic like security. How do I do this using Twitter? Do I just follow people who tweet about this topic, retweet, and tweet new things, or is there an exact way of doing this? I then need to do statistical analysis on the data I gather.
You can use Puppeteer to crawl Twitter data.
Check out the GitHub repository here; it crawls Twitter data using Puppeteer.
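The linked repository is Node/Puppeteer; as a rough Python stand-in for the same headless-browser idea (my substitution, using pyppeteer, a Python port of Puppeteer, not the code from that repository), something like this dumps the rendered text of a search page:

    # Rough Python stand-in for the Puppeteer approach, using pyppeteer (a Python
    # port of Puppeteer); this is my substitution, not the code from the linked repo.
    import asyncio

    from pyppeteer import launch

    async def dump_search_page(query):
        browser = await launch(headless=True)
        page = await browser.newPage()
        await page.goto("https://twitter.com/search?q=" + query)
        # Twitter's markup changes often, so no selectors are hard-coded here;
        # we just grab the rendered text and leave the parsing to you.
        text = await page.evaluate("() => document.body.innerText")
        await browser.close()
        return text

    print(asyncio.run(dump_search_page("security")))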
How about using the Twitter search API (https://dev.twitter.com/docs/api/1.1/get/search/tweets)?
You need to create an app first (or rather, register an app) on dev.twitter.com and use the search API to query for tweets that contain "security" (assuming I understood your crawling correctly). Once you have your tweets, you can do statistical analysis on the gathered data.
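In case it helps, here is a minimal Python sketch of that flow (my own illustration, using requests and requests_oauthlib rather than any library mentioned in this thread); the credential strings are placeholders for the keys you get when you register your app:

    # Minimal sketch of querying the v1.1 search endpoint; the credential strings
    # are placeholders for the keys you get when you register your app.
    import requests
    from requests_oauthlib import OAuth1

    auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
                  "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

    resp = requests.get(
        "https://api.twitter.com/1.1/search/tweets.json",
        params={"q": "security", "count": 100},
        auth=auth,
    )
    resp.raise_for_status()

    # Each result carries the tweet text plus user metadata for analysis.
    for tweet in resp.json().get("statuses", []):
        print(tweet["user"]["screen_name"], tweet["text"])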
I use the twitteR package in R to crawl Twitter data (https://github.com/geoffjentry/twitteR). It includes simple and useful functions for getting Twitter data.

Getting twitter replies

Is it possible to get the replies (tweets) to a given tweet on Twitter? I searched for an API for this but couldn't find one. Can someone help me with this?
Thanks
https://api.twitter.com/1/related_results/show/172019363942117377.json?include_entities=1
That is an experimental API.
By experimental API, I mean that until we officially document it on dev.twitter.com, it's not necessarily production-ready and could be unstable both in the parameters it takes and the format of its responses. It may also just disappear one day.
As for related_results itself, it won't necessarily return every reply for a tweet nor are its responses necessarily limited to just replies. That said, for your own personal use or experimentation you may find some utility in the method. If you choose to use it in any software you're developing, I would proactively wrap its use with significant exception handling.
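Purely to illustrate that advice, here is a hedged Python sketch (mine, using requests; not official sample code) that calls the endpoint above and wraps it in broad exception handling:

    # Sketch of calling the experimental related_results endpoint; because it is
    # undocumented and may change or vanish, everything is wrapped in exception
    # handling, as suggested above. Credentials are placeholders.
    import requests
    from requests_oauthlib import OAuth1

    auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
                  "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    url = "https://api.twitter.com/1/related_results/show/172019363942117377.json"

    try:
        resp = requests.get(url, params={"include_entities": 1}, auth=auth, timeout=10)
        resp.raise_for_status()
        results = resp.json()
    except (requests.RequestException, ValueError) as err:
        # The endpoint may 404, rate-limit, or change its response format at any time.
        print("related_results call failed:", err)
        results = []

    print(results)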

Twitter Data Archive

Is there any service from which we can download tweets?
UPDATE:
Googling for some time gave me these results:
a.) http://snap.stanford.edu/data/twitter7.html
b.) http://140kit.com/datasets
Yes, there is. It's called the Twitter API.
Since we have access to only a limited number of tweets via the Twitter API, we should make use of third-party resellers like Topsy for just past data, GNIP for just streaming data, or DataSift for both streaming and past data.
You might also want to check the following sites:
http://www.infochimps.com/collections/twitter-census
http://www.tweetarchivist.com/
The Twitter API provides only partial results; it gives you the last 100 or even 500 tweets for every search. If you need to keep tweets long term, the Twitter API shows its limits.
I had the same need as you apparently have, so I developed a tool that queries the Twitter API periodically and stores the search results in a WordPress database.
I called the tool twittcorder, and you can find a live demo at twittcorder.com.
I hope this helps.
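For anyone who wants to roll their own version of that idea, here is a rough Python sketch of the same poll-and-store loop (my own stand-in: it writes to SQLite rather than a WordPress database, and the credentials, query, and interval are placeholders):

    # Rough sketch of the poll-and-store idea behind twittcorder: query the search
    # API periodically and persist the results. This stand-in writes to SQLite
    # instead of WordPress; credentials, query, and interval are placeholders.
    import sqlite3
    import time

    import requests
    from requests_oauthlib import OAuth1

    auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
                  "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    db = sqlite3.connect("tweets.db")
    db.execute("CREATE TABLE IF NOT EXISTS tweets (id INTEGER PRIMARY KEY, user TEXT, text TEXT)")

    while True:
        resp = requests.get(
            "https://api.twitter.com/1.1/search/tweets.json",
            params={"q": "security", "count": 100},
            auth=auth,
        )
        for tweet in resp.json().get("statuses", []):
            # INSERT OR IGNORE so a tweet returned by several polls is stored once.
            db.execute(
                "INSERT OR IGNORE INTO tweets (id, user, text) VALUES (?, ?, ?)",
                (tweet["id"], tweet["user"]["screen_name"], tweet["text"]),
            )
        db.commit()
        time.sleep(15 * 60)  # poll every 15 minutes; adjust to respect rate limits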
These other data sources are probably shared in violation of the Twitter TOS. I wouldn't want to invest my time and effort building something on datasets that are non-repeatable. The Twitter Streaming API allows collection of a sample of Tweets.
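For reference, collecting that sample can look roughly like this (a sketch of mine using requests against the v1.1 statuses/sample stream; the credentials are placeholders):

    # Sketch of reading the public sample stream; credentials are placeholders.
    import json

    import requests
    from requests_oauthlib import OAuth1

    auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
                  "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

    # statuses/sample delivers a small random sample of all public tweets.
    resp = requests.get(
        "https://stream.twitter.com/1.1/statuses/sample.json",
        auth=auth,
        stream=True,
    )

    for line in resp.iter_lines():
        if not line:           # keep-alive newlines
            continue
        message = json.loads(line)
        if "text" in message:  # skip delete/limit notices
            print(message["text"])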
There's also Gnip: http://gnip.com/.
Sysomos is there for complete data analysis, including Twitter, Facebook, and various boards and forums.

How to aggregate analytics from Google, Twitter, YouTube, Facebook, etc [closed]

I have a video blog for which I would like to track certain statistics, including stats from Google Analytics, Twitter, YouTube, Facebook, etc.
The problem is that the various stats are on different websites, which require different logins, etc. It takes a long time to actually view everything. I am looking for a way to be able to aggregate all of this information in one place.
I have searched quite a bit on Google, Mashable, Delicious, etc and I haven't found any websites that do what I want. Are my searching skills bad, or does this really not exist?
The data in which I am interested appears to be available in readily parsable forms (see below), but I am hesitant to write an app to do this myself, because of an already more than full workload.
Data I want to aggregate:
Google Analytics -- tracking on my website
number of visitors
traffic sources
use Data Export API -- http://code.google.com/apis/analytics/docs/gdata/gdataDeveloperGuide.html
Twitter
number of followers
number of retweets
new # messages
new direct messages
Twitter API -- (sorry, I can only post one hyperlink because I am new)
Facebook fan page
number of fans
new posts on wall
Facebook API -- (sorry, I can only post one hyperlink because I am new)
Tumblr
number of followers
Video
number of views
view location
number of comments
number of channel subscribers
do this for
YouTube -- CSV report available at (sorry, I can only post one hyperlink because I am new)
MetaCritic
Feed burner (RSS)
number of subscribers
CSV report available at (sorry, I can only post one hyperlink because I am new)
SEO stuff
Google PageRank
Alexa rankings
So is there an app that does this already, or should I do this myself? I would like a quick and dirty way to do this -- I was thinking something like Yahoo pipes, but it appears to not be up to the task. I could probably get it done in Grails, but that might be more trouble than it's worth. Other ideas?
I have a better answer. YQL has community data tables for all the services you listed. You can pull in all the different values through their API.
http://developer.yahoo.com/yql/
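As a rough illustration of pulling values through YQL over HTTP (my sketch, using the public YQL endpoint and the built-in rss table as a stand-in for the community data tables; the feed URL is a placeholder):

    # Sketch of querying YQL over HTTP; the built-in rss table is only a stand-in
    # for the community data tables mentioned above, and the feed URL is a placeholder.
    import requests

    query = 'select title from rss where url="http://example.com/feed.xml"'
    resp = requests.get(
        "https://query.yahooapis.com/v1/public/yql",
        params={"q": query, "format": "json"},
    )
    print(resp.json())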
You could try creating a Google Spreadsheet and use their external data import tools.
http://docs.google.com/support/bin/answer.py?hl=en&answer=75507
The biggest problem will probably be accessing authenticated APIs.
Presuming that all of the services above have a statistics API, I would advise you to write it yourself rather than battle an integration war with a bunch of aggregating programs.
Here's an iPhone app that does at least a bit of this:
http://ego-app.com/
I don't know of a single tool that can do this off the top of my head, but you can chain a few tools together to do it.
1- If you're on Windows, use Website Watcher. It has a macro-recording tool to log in to a web page, a regex-based tool to filter content, and a scripting language that lets you email/export the result. IMO, this will let you extract data from just about any web page/RSS feed/forum.
2- Then use Dropbox to automatically upload the result files to your Dropbox public folder (because you will need the public links to these files).
3- Use Yahoo Pipes to consolidate/aggregate the result files.
I suggest you try Metricly (http://metricly.com/), which natively integrates Facebook & Google Analytics data. It is extensible by nature, and with a little bit of tweaking you can push any metric to it. I enjoy it.
I originally suggested this as an edit to abraham's answer but it was rejected:
Mikael Thuneberg has written a freely available google script for pulling GA data into Google Docs using the GA API: http://www.automateanalytics.com/2010/04/google-analytics-data-to-google-docs.html
I use it for creating client dashboards all the time. I suspect there may be others for pulling in twitter/facebook data etc.
And Google have just released this tool for importing GA data into Google Docs:
http://analytics.blogspot.co.uk/2012/08/automate-google-analytics-reporting.html
Also see SEOTools for Excel, which can pull some Facebook and Twitter data, as well as Google Analytics data, through the API.
YouTube has a public API http://developers.google.com/youtube/analytics to retrieve reports for your videos and channels.
