Bigger Twitter user profile image

I'm developing an app that displays users' tweets along with their profile image, but the image being displayed is very small. I want the larger image (usually 128x128) that is shown on their profiles. Here's my relevant code:
foreach ($ret1->results as $x)
{
    echo "<div class='ttl'><div class='ttlpadding'><div class='item'><img src=\"", $x->profile_image_url, "\" title=\"", $x->from_user." (".$x->from_user_name.")", "\" />\n";
    $text = preg_replace('/\s+#(\w+)/', ' #$1', $x->text);
    echo "<div class='clr'></div>";
    echo "<div class='tweet'>".$text."</div></div></div></div><div class='clrflt'></div>";
}

You may try stripping the "_normal" suffix from the image filename, e.g.
str_replace('_normal', '', $x->profile_image_url)
This will return the original images uploaded by the users.
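Applied to the loop in the question, that might look like the following sketch ($img is just a hypothetical local variable):

// Sketch: swap the small avatar for the originally uploaded image.
foreach ($ret1->results as $x)
{
    $img = str_replace('_normal', '', $x->profile_image_url);
    echo "<img src=\"" . $img . "\" title=\"" . $x->from_user . " (" . $x->from_user_name . ")\" />\n";
}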

For fetching big images, call
http://api.twitter.com/1/users/profile_image/username.json?size=original
replacing username with the Twitter handle.
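That endpoint answers with a redirect to the image itself. If you want to resolve the redirect server-side, here is a minimal PHP sketch (the handle is a placeholder and error handling is omitted); get_headers() reads the Location header without downloading the image:

// Sketch only: resolve the profile_image redirect to the real image URL.
$handle = 'username'; // placeholder Twitter handle
$endpoint = "http://api.twitter.com/1/users/profile_image/{$handle}?size=original";
$headers = get_headers($endpoint, 1); // 1 = parse into an associative array
if (isset($headers['Location'])) {
    // Location becomes an array when there is more than one redirect hop.
    $image = is_array($headers['Location']) ? end($headers['Location']) : $headers['Location'];
    echo $image;
}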

Bookmarks parsing issue

I have a LARGE number of bookmarks and wanted to export them and share them with a group I work with. The issue is that when I export them, there are ADD_DATE and LAST_MODIFIED fields added by the browser (Firefox). I was hoping to just use cut or awk to pull the fields I want but the lack of a space before the >(website_name) is making that difficult. And my regex skills are weak.
How do I add a single space before the second to last > at the end of the line so that I can use cut or awk to pull out the fields I want into a new file?
Ex: 123456">SecurityTrails would become 123456" >SecurityTrails
Please see below for examples of what I'm working with. Any help is greatly appreciated!
<DT><A HREF="https://securitytrails.com/" ADD_DATE="123456" LAST_MODIFIED="123456">SecurityTrails</A>
I use Firefox myself. It frequently also embeds favicons into the exported bookmarks.html file via base64 encoding, so to account for the different scenarios (beyond just the one mentioned by the OP), maybe something like
{mawk/mawk2/gawk} 'BEGIN { FS = "\042" } $1 = $1'
then do whatever cutting you want. That's just assuming the OP wanted to keep every bit of it and simply remove the quotation marks.
Now, if the objective is just to take out URL+Name of it,
{mawk/mawk2/gawk} 'BEGIN { DBLQT = "\042"; FS = "(<A HREF=" DBLQT "|>)" } /<A HREF=/ {
    url = substr($2, 1, index($2, DBLQT) - 1);
    sitename = $(NF-1);
    sub(/<\/A$/, "", sitename);
    print url " > " sitename; }' # or whatever way you want the output to be
I just typed it out with extra verbosity to show what \042 means: the ASCII octal escape for the double quote.

How to get "bigger-size" user image with Twitter API 1.1?

I'm trying to get the last 5 tweets from a person. I did it, but the profile picture doesn't look right; the resolution is degraded, like this: http://i.hizliresim.com/wLQEJZ.jpg
var $twitter = $('#twitter');
$.getJSON('http://www.demo.net/twitter.php?username=yeniceriozcan&count=5', function(data){
    var total = data.length,
        i = 0;
    $twitter.html(''); // clear the contents first, then print the tweets
    for ( i; i < total; i++ ){
        var tweet = data[i].text; // tweet text
        var date = parseTwitterDate(data[i].created_at); // date
        var image = data[i].user.profile_image_url; // profile image
        var url = 'https://twitter.com/' + data[i].user.screen_name + '/status/' + data[i].id_str;
        $twitter.append('<div class="tweet"><a target="_blank" href="' + url + '"><img src="' + image + '" alt="" class="profile-image" />' + tweet + '</a> <span class="tweet-date">(' + date + ')</span></div>');
    }
});
This is my code. To get the profile picture, I tried
var image = data[i].user.profile_image_url;
And in the other file that fetches the tweets,
$tweets = $twitter->get('https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name='.$username.'&count='.$count);
print json_encode($tweets);
This is the API I used, but I cannot get the pictures at a normal resolution. How can I fix it?
Thanks.
I found that using "_bigger" still brought the picture back in a relatively small format. Using this answer enables you to resize the image, allowing you to have a big image without distortion.
If you want the image at full size, just get rid of the "_normal" part completely:
http://pbs.twimg.com/profile_images/429221067630972918/ABLBUS9o_normal.jpeg
Goes to
http://pbs.twimg.com/profile_images/429221067630972918/ABLBUS9o.jpeg
Note: the URLs in this answer are modified so as not to reflect my Twitter details, hence why entering them will give you a "no page exists" error.
When you read the URL from data[i].user.profile_image_url, replace "_normal" with "_bigger". Here's the explanation from the Twitter docs:
User Profile Images and Banners
Update, December 22nd 2020: this still works.
Update, May 6th 2017: when you get the user's profile image via profile_image_url or profile_image_url_https (see User Profile Images and Banners), try replacing "_normal" at the end of the filename with "_400x400". It seems that they now scale images down to 400x400 and then delete the original file.
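Since the JSON here comes from the question's small PHP proxy, one convenient place to do that substitution is server-side, before json_encode. A rough sketch against the user_timeline structure shown above ($tweets is the result of the question's get() call; swap in "_bigger" if you prefer that size):

// Sketch: rewrite each tweet's avatar URL before handing the JSON to the browser.
foreach ($tweets as $tweet) {
    $tweet->user->profile_image_url = str_replace('_normal', '_400x400', $tweet->user->profile_image_url);
}
print json_encode($tweets);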
In case you still need this question answered (since none of the previous ones are marked as accepted), this is how I managed to deal with the Twitter API image URL sizing:
Grabbing the URL from the API will return the "http" and "https" URLs in the following format(s):
http:
http://pbs.twimg.com/profile_images/378812345851234567/Ay2SHEYz_normal.png
https:
https://pbs.twimg.com/profile_images/378812345851234567/Ay2SHEYz_normal.png
For whatever reason Twitter decided that a 48x48 PNG was good enough.
If you want the full resolution, you need to remove the "_normal" tag from either URL, as angryTurtle stated above. This will give you the URL of the full-size image. Here is a quick example of accomplishing this:
/// remove '_normal' from the picture URL to get the full size
NSString *TWTRPicStringF = [twitterProfilePictureURLStringN stringByReplacingOccurrencesOfString:@"_normal" withString:@""];
Hopefully this helps you and you can mark one of these as the accepted answer to close the question!
FYI: I have also modified the URL contents to protect my own Twitter information, explaining the "no page exists" when using one of the URLs provided above.

New Google Spreadsheets publish limitation

I am testing the new Google Spreadsheets as there is a new feature I really need: the 200 sheets limit has been lifted (more info here: https://support.google.com/drive/answer/3541068).
However, I can't publish a spreadsheet to CSV like you can in the old version. I go to 'File > Publish to the web' and there are no longer options to publish 'all sheets' or certain sheets, and you can't specify cell ranges to publish to CSV, etc.
This limitation is not mentioned in the published 'Unsupported Features' documentation found at: https://support.google.com/drive/answer/3543688
Is there some other way this gets enabled or has it in fact been left out of the new version?
My use case: we retrieve BigQuery results into the spreadsheets and publish the sheets as CSV automatically using the "publish automatically on update" feature, which produces the CSV URL that gets placed into charting tools that read it to generate the visuals.
Does anyone know how to do this?
The new Google spreadsheets use a different URL (just copy your <KEY>):
New sheet : https://docs.google.com/spreadsheets/d/<KEY>/pubhtml
CSV file : https://docs.google.com/spreadsheets/d/<KEY>/export?gid=<GUID>&format=csv
The GUID of your spreadsheet relates to the tab number.
/!\ You have to share your document using the Anyone with the link setting.
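For example, pulling that CSV straight into PHP (a sketch: <KEY> and the gid are placeholders for your own key and tab number, and the sheet must be shared as described above):

// Sketch: fetch one tab of the sheet as CSV and parse it into rows.
$key = '<KEY>'; // your spreadsheet key
$gid = 0;       // tab number
$csv = file_get_contents("https://docs.google.com/spreadsheets/d/{$key}/export?gid={$gid}&format=csv");
// Naive newline split; fine for sheets without embedded newlines in cells.
$rows = array_map('str_getcsv', explode("\n", trim($csv)));
print_r($rows);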
Here is the solution, just write it like this:
https://docs.google.com/spreadsheets/d/<KEY>/export?format=csv&id=<KEY>
I know it's weird to write the KEY twice, but it works perfectly. A teammate from work discovered this by opening the Excel file in Google Docs, then File -> Download as -> Comma separated values. Then a link to the CSV file appears in the browser's downloads section, like this:
https://docs.google.com/spreadsheets/d/<KEY>/export?format=csv&id=<KEY>&gid=<SOME NUMBER>
But it doesn't work in that format; what my friend did was remove "&gid=<SOME NUMBER>" and it worked! Hope it helps everyone.
If you enable "Anyone with the link" sharing for the spreadsheet, here is a simple method to get a range of cells or columns (or whatever you feel like) exported as HTML, CSV, XML, or JSON via a query:
https://docs.google.com/spreadsheet/tq?key=YOUR-KEY&gid=1&tq=select%20A,%20B&tqx=reqId:1;out:html;%20responseHandler:webQuery
For tq variable read query language reference.
For tqx variable read request format reference.
The downside is that your doc is still available in full via the public link, but if you want to export/import data to, say, Excel, this is a perfect way.
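As a sketch of that, the same endpoint can also hand back CSV directly by putting out:csv in the tqx variable, which is easier to consume programmatically (YOUR-KEY is a placeholder; link sharing still has to be on):

// Sketch: run a query-language select and ask the endpoint for CSV output.
$key = 'YOUR-KEY';
$tq  = urlencode('select A, B');
echo file_get_contents("https://docs.google.com/spreadsheet/tq?key={$key}&gid=1&tq={$tq}&tqx=out:csv");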
It's not going to help everyone, but I've made a PHP script to read the published HTML into an array. I've added a conversion back to CSV at the end. Hopefully this will help some people who have access to PHP.
$html_link = "https://docs.google.com/spreadsheets/d/XXXXXXXXXX/pubhtml";
$local_html = "sheets.html";
$file_contents = file_get_contents($html_link);
file_put_contents($local_html, $file_contents);

$dom = new DOMDocument();
$html = @$dom->loadHTMLFile($local_html); // Added an @ to hide warnings - you might remove this when testing
$dom->preserveWhiteSpace = false;
$tables = $dom->getElementsByTagName('table');
$rows = $tables->item(0)->getElementsByTagName('tr');
$cols = $rows->item(0)->getElementsByTagName('td'); // You'll need to edit the (0) to reflect the row that your headers are in.

// Collect the header row (the $i > 0 check skips the first cell).
$row_headers = array();
foreach ($cols as $i => $node) {
    if ($i > 0) $row_headers[] = $node->textContent;
}

// Read each data row into an associative array keyed by header.
$table = array();
foreach ($rows as $i => $row) {
    if ($i == 0) continue; // skip the header row
    $cols = $row->getElementsByTagName('td');
    $row = array();
    foreach ($cols as $j => $node) {
        $row[$row_headers[$j]] = $node->textContent;
    }
    $table[] = $row;
}

// Convert back to CSV, quoting values that contain commas.
$csv = "";
foreach ($table as $row_index => $row_details) {
    $comma = false;
    foreach ($row_details as $value) {
        $value_quotes = str_replace('"', '""', $value);
        $csv .= ($comma ? "," : "") . (strpos($value, ",") === false ? $value_quotes : '"'.$value_quotes.'"');
        $comma = true;
    }
    $csv .= "\r\n";
}

// Save to a file and/or output
file_put_contents("result.csv", $csv);
print $csv;
Here is another temporary, non-PHP workaround:
Go to an existing new-style Google Sheet
Go to "File -> New -> Spreadsheet"
Under "File -> Publish to the web..." there is now the option to publish a CSV version
I believe this actually creates an old-style Google Sheet, but for my purposes (importing Google Sheet data from clients or myself into R for statistical analysis) it works until they hopefully update this feature.
I posted this in a Google Groups forum also, please find it here:
https://productforums.google.com/forum/#!topic/docs/An-nZtjaupU
The correct URL for downloading a Google spreadsheet as CSV is:
https://docs.google.com/spreadsheets/export?id=<ID>&exportFormat=csv
The current answers no longer work. The following worked for me:
Go to File -> "Publish to the web", select "Start publishing", and pick the format. I chose text (which is TSV).
Now just copy the URL there, which will be similar to https://docs.google.com/spreadsheet/pub?key=YOUR_KEY&single=true&gid=0&output=txt
That new feature appears to have disappeared. I don't see any option to publish a CSV/TSV version. I can download TSV/CSV with the export, but that's not available to other people with merely the link (it redirects them to a Google Docs sign-in form).
I found a fix! I discovered that old spreadsheets from before this change still allow publishing only certain sheets. So I made a copy of an old spreadsheet, cleared the data out, copied and pasted my current info into it, and now I'm happily publishing just a single sheet of my large spreadsheet. Yay!
I was able to apply a query to the result; see this table:
https://docs.google.com/spreadsheets/d/1LhGp12rwqosRHl-_N_N8eTjTwfFsHHIBHUFMMyhLaaY/gviz/tq?tq=select+A,B,I,J,K+where+B%3E=4.5&pli=1
The spreadsheet fetches earthquake data, but I only want magnitude 4.5+ quakes, so the query selects those rows and columns. Just one problem: I cannot parse the result; I tried to decode it as JSON but wasn't able to.
I would like to get this as HTML or CSV. How can I parse it, for example to plot it on a Google Map?
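For what it's worth, the response from that /gviz/tq URL is not plain JSON: it arrives wrapped in a google.visualization.Query.setResponse(...) call, which is likely why json_decode chokes on it. Here is a hedged PHP sketch that strips the wrapper (the wrapper format is an assumption based on current responses and may change; appending &tqx=out:csv to the URL should also return CSV directly, as in the earlier answer):

// Sketch: unwrap the JSONP-style gviz response so the payload can be json_decode'd.
$raw = file_get_contents('https://docs.google.com/spreadsheets/d/1LhGp12rwqosRHl-_N_N8eTjTwfFsHHIBHUFMMyhLaaY/gviz/tq?tq=' . urlencode('select A,B,I,J,K where B >= 4.5'));
if (preg_match('/setResponse\((.*)\);?\s*$/s', $raw, $m)) {
    $data = json_decode($m[1]);
    // $data->table->cols and $data->table->rows hold the column metadata and cell values.
    foreach ($data->table->rows as $row) {
        echo $row->c[0]->v, "\n"; // first selected column of each row
    }
}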

Fetching bigger images of Twitter users

I'm developing an app that displays users' tweets along with their profile image, but the image being displayed is very small. I want the larger image (usually 128x128) that is shown on their profiles to appear instead of the small one. I'm using JSON. Here's my relevant code:
...
...
$url = "http://search.twitter.com/search.json?q=".$qe."&geocode=".$geo."&rpp=10";
...
...
foreach ($ret1->results as $x)
{
    echo "<div class='ttl'><div class='ttlpadding'><div class='item'><img src=\"", $x->profile_image_url, "\" title=\"", $x->from_user." (".$x->from_user_name.")", "\" />\n";
    $text = preg_replace('/\s+#(\w+)/', ' #$1', $x->text);
    echo "<div class='clr'></div>";
    echo "<div class='tweet'>".$text."</div></div></div></div><div class='clrflt'></div>";
}
For fetching big images, call
http://api.twitter.com/1/users/profile_image/username.json?size=original
replacing username with the Twitter handle.
The Twitter API documentation on getting profile images
A similar Stack Overflow question is here. Avoid reposting questions.

Using get_video on YouTube to download a video

I am trying to get the video URL of any YouTube video like this:
Open
http://youtube.com/get_video_info?video_id=VIDEOID
then take the account_playback_token token value and open this URL:
http://www.youtube.com/get_video?video_id=VIDEOID&t=TOKEN&fmt=18&asv=2
This should open a page with just the video or start a download of the video. But nothing happens; Safari's activity window says 'Not found', so there is something wrong with the URL. I want to integrate this into an iPad app, and the JavaScript method I use to get the video URL in the iPhone version of the app isn't working, so I need another solution.
YouTube changes all the time, and I think the URL is just outdated. Please help :)
Edit: It seems like the get_video method doesn't work anymore. I'd really appreciate if anybody could tell me another way to find the video URL.
Thank you, I really need help.
Sorry, that is not possible anymore. They limit the token to the IP that got it.
Here's a workaround using the get_headers() function, which gives you an array containing the link to the video. I don't know anything about iOS, so hopefully you can rewrite this PHP code yourself.
<?php
if (empty($_GET['id'])) {
    echo "No id found!";
}
else {
    // Crude existence check: try to read zero bytes from the URL.
    function url_exists($url) {
        if (file_get_contents($url, FALSE, NULL, 0, 0) === false) return false;
        return true;
    }
    $id = $_GET['id'];
    $page = @file_get_contents('http://www.youtube.com/get_video_info?&video_id='.$id);
    preg_match('/token=(.*?)&thumbnail_url=/', $page, $token);
    $token = urldecode($token[1]);
    $url_array = array(
        "http://youtube.com/get_video?video_id=".$id."&t=".$token,
        "http://youtube.com/get_video?video_id=".$id."&t=".$token."&fmt=18"
    );
    if (url_exists($url_array[1]) === true) {
        $file = get_headers($url_array[1]);
    }
    elseif (url_exists($url_array[0]) === true) {
        $file = get_headers($url_array[0]);
    }
    // Header 19 is the final Location: header; strip its prefix to get the direct video URL.
    $url = substr($file[19], strlen("Location: "));
    echo '<a href="'.$url.'">Download video</a>';
}
?>
I use this and it rocks: http://rg3.github.com/youtube-dl/
Just copy a YouTube URL from your browser and execute this command with the YouTube URL as the only argument. It will figure out how to find the best quality video and download it for you.
Great! I needed a way to grab a whole playlist of videos.
In Linux, this is what I used:
y=http://www.youtube.com;
f="http://gdata.youtube.com/feeds/api/playlists/PLeHqhPDNAZY_3377_DpzRSMh9MA9UbIEN?start-index=26";
for i in $(curl -s $f | grep -o "url='$y/watch?v=[^']*'"); do
    d=$(echo $i | sed "s|url='$y/watch?v=\(.*\)&.*'|\1|")
    youtube-dl --restrict-filenames "$y/watch?v=$d"
done
You have to find the playlist ID in a common YouTube URL like:
https://www.youtube.com/playlist?list=PLeHqhPDNAZY_3377_DpzRSMh9MA9UbIEN
Also, this technique uses the GData API, which limits results to 25 records per page,
hence the ?start-index=26 parameter (to get page 2 in my example).
This could use some cleaning, and extra logic to iterate through all sets of 25, too.
Credits:
https://stackoverflow.com/a/8761493/1069375
http://www.commandlinefu.com/commands/view/3154/download-youtube-playlist (which itself didn't quite work)
