How to receive updates to the Twitter social graph?

I just got some crazy ideas for analyzing the Twitter social graph (i.e., representing follow relations as the edges of a graph). Interestingly, the Twitter API provides methods for building that graph, but it only lets you read out a static snapshot, whereas Twitter is a very dynamic network. It would be great if one could update the graph dynamically. So my question is: is there any way to get notified by Twitter when anyone starts or stops following anyone?

If such a notification feature existed, I believe the documentation you linked to would definitely mention it.
I'm quite certain that you need to do your own follower-list checking and compare the results on a regular basis.
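A minimal sketch of that polling-and-diffing approach (JavaScript; getFollowerIds() is a placeholder for however you fetch the current follower ID list, since the exact endpoint and authentication depend on your API access):

// Compare two snapshots of follower IDs and report who started or stopped following.
function diffFollowers(previousIds, currentIds) {
    const prev = new Set(previousIds);
    const curr = new Set(currentIds);
    return {
        started: [...curr].filter(id => !prev.has(id)),   // new followers since the last check
        stopped: [...prev].filter(id => !curr.has(id)),   // people who unfollowed
    };
}

// Poll on a schedule and keep the last snapshot around.
let lastSnapshot = [];
setInterval(async () => {
    const current = await getFollowerIds();               // placeholder, not a real API call
    const { started, stopped } = diffFollowers(lastSnapshot, current);
    if (started.length || stopped.length) {
        console.log("graph changed:", { started, stopped });
    }
    lastSnapshot = current;
}, 15 * 60 * 1000);                                        // keep the interval within the rate limits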

I do something similar: I check whether someone follows me or not and how many followers they have, and I generate this chart.
public function existsFriendship($username, $friend)
{
    try
    {
        // ask the Twitter API whether $username follows $friend
        if ($this->twitter->existsFriendship($username, $friend))
            return true;
    }
    catch (Exception $e)
    {
        $this->debug($e->getMessage());
    }
    return false;
}
For the chart generation I use pChart.
In the Smarty template the code looks like this:
include("pChart/pData.class");
include("pChart/pChart.class"); ![alt text][1]
// Initialise the graph
$Test = new pChart(700,230);
$Test->setFontProperties("Fonts/tahoma.ttf",13);
$Test->setGraphArea(40,30,680,200);
$Test->drawGraphArea(252,252,252,TRUE);
$Test->drawScale($DataSet->GetData(),$DataSet->GetDataDescription(),SCALE_NORMAL,150,150,150,TRUE,0,2);
$Test->drawGrid(4,TRUE,230,230,230,70);
// Draw the line graph
$Test->drawLineGraph($DataSet->GetData(),$DataSet->GetDataDescription());
$Test->drawPlotGraph($DataSet->GetData(),$DataSet->GetDataDescription(),3,2,255,255,255);
// Finish the graph
$Test->setFontProperties("Fonts/tahoma.ttf",12);
$Test->drawLegend(45,35,$DataSet->GetDataDescription(),255,255,255);
$Test->setFontProperties("Fonts/tahoma.ttf",12);
$Test->drawTitle(60,22,"Twitter Graph",50,50,50,585);
$example = $Test->Render("templates/example1.png");
$smarty->assign("example",$example);
$smarty->display('index.tpl');
Finally, the result:
(chart image: http://img691.imageshack.us/img691/6749/example1k.png)

Related

YouTube API - retrieve more than 5k items

I just want to fetch all my liked videos, roughly 25k items. As far as my research goes, this is not possible via the YouTube v3 API.
I have already found multiple issues (issue, issue) on the same problem. Some claim to have fixed it, but their fix only works because they have fewer than 5,000 items in their liked-videos list.
The playlistItems list API endpoint with the playlist ID set to "liked videos" (LL) has a limit of 5,000.
The videos list API endpoint has a limit of 1,000.
Unfortunately those endpoints don't provide parameters that I could use to paginate the requests myself (e.g. give me all the liked videos between dates x and y), so I'm forced to take the provided order, which doesn't let me get past 5k entries.
Is there any way I can fetch all my likes via the API?
Some more thoughts on the reply from #Yarin_007:
If there are deleted videos in the timeline, they appear as "Liked https://...url". The script doesn't like that format and fails, because the underlying elements don't have the same structure as those of existing videos.
This can easily be fixed with a try/catch:
function collector(all_cards) {
    var liked_videos = {};
    all_cards.forEach(card => {
        try {
            // ignore Dislikes
            if (card.innerText.split("\n")[1].startsWith("Liked")) {
                // ... (same parsing as in the full script below)
            }
        }
        catch {
            console.log("error, probably a deleted video");
        }
    })
    return liked_videos;
}
To scroll down to the bottom of the page I've used this simple script; no need to spin up anything big:
var millisecondsToWait = 1000;
setInterval(function() {
    window.scrollTo(0, document.body.scrollHeight);
    console.log("scrolling");
}, millisecondsToWait);
If more people want to retrieve this kind of data, one could think about building a proper script that is more convenient to use. If you check the network requests, you can find the desired data in the responses of requests called batchexecute. One could copy the authentication from one of them, provide it to a script that queries those endpoints, and have it prepare the data like the script I currently inject manually.
Hmm. perhaps Google Takeout?
I have verified that the YouTube data contains a CSV called "liked videos.csv". The header is Video Id,Time Added, and the rows are
dQw4w9WgXcQ,2022-12-18 23:42:19 UTC
prvXCuEA1lw,2022-12-24 13:22:13 UTC
for example.
So you would need to retrieve video metadata per video ID. Not too bad though.
Note: the export could take a while, especially with 25k videos (select only the YouTube data).
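If you go the Takeout route, you could then turn the video IDs from that CSV into metadata with the videos list endpoint, batching the IDs. A rough sketch (the 50-IDs-per-request limit, the part values, and YOUR_API_KEY are my assumptions; check the current v3 documentation):

// Fetch snippet metadata for the video IDs taken from Takeout's "liked videos.csv".
// Assumes a plain API key is sufficient for videos.list; apiKey is a placeholder.
async function fetchVideoMetadata(videoIds, apiKey) {
    const items = [];
    for (let i = 0; i < videoIds.length; i += 50) {        // assumed limit: 50 IDs per request
        const batch = videoIds.slice(i, i + 50).join(",");
        const url = "https://www.googleapis.com/youtube/v3/videos" +
            "?part=snippet,contentDetails&id=" + batch + "&key=" + apiKey;
        const response = await fetch(url);
        const data = await response.json();
        items.push(...(data.items || []));
    }
    return items;                                           // roughly 500 requests for 25k videos
}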
I also had an idea that involves scraping the actual liked-videos page (which would save you 25k HTTP requests), but I'm unsure whether it breaks with more than 5,000 entries. Also, emulating the POST requests on that page may prove quite difficult, albeit not impossible (they fetch /browse?key=..., and have some kind of obfuscated/encrypted base64 strings in the request body, among other parameters).
EDIT:
Look, there's probably a normal way to get a complete dump of all your Google data (I mean, other than Takeout; email them? I don't know).
Anyway, the following is the other idea...
Follow this deep link to your liked videos history.
Scroll to the bottom... maybe with Selenium, maybe with AutoIt, maybe by putting something on the End key of your keyboard, until you reach your first liked video.
Hit F12 and run this in the developer console:
// https://www.youtube.com/watch?v=eZPXmCIQW5M
// https://myactivity.google.com/page?utm_source=my-activity&hl=en&page=youtube_likes
// go over all "cards" in the activity webpage (after scrolling down to the absolute bottom of it)
// create a dictionary - the key is the video ID, the value is a list of the video's properties
function collector(all_cards) {
    var liked_videos = {};
    all_cards.forEach(card => {
        // ignore Dislikes
        if (card.innerText.split("\n")[1].startsWith("Liked")) {
            // horrible parsing. your mileage may vary. I tried to avoid using any gibberish class names.
            let a_links = card.querySelectorAll("a")
            let details = a_links[0];
            let url = details.href.split("?v=")[1]
            let video_length = a_links[3].innerText;
            let time = a_links[2].parentElement.innerText.split(" • ")[0];
            let title = details.innerText;
            let date = card.closest("[data-date]").getAttribute("data-date")
            liked_videos[url] = [title, video_length, date, time];
            // console.log(title, video_length, date, time, url);
        }
    })
    return liked_videos;
}

// https://stackoverflow.com/questions/57709550/how-to-download-text-from-javascript-variable-on-all-browsers
function download(filename, text, type = "text/plain") {
    // Create an invisible A element
    const a = document.createElement("a");
    a.style.display = "none";
    document.body.appendChild(a);
    // Set the HREF to a Blob representation of the data to be downloaded
    a.href = window.URL.createObjectURL(
        new Blob([text], { type })
    );
    // Use the download attribute to set the desired file name
    a.setAttribute("download", filename);
    // Trigger the download by simulating a click
    a.click();
    // Cleanup
    window.URL.revokeObjectURL(a.href);
    document.body.removeChild(a);
}

function main() {
    // gather relevant elements
    var all_cards = document.querySelectorAll("div[aria-label='Card showing an activity from YouTube']")
    var liked_videos = collector(all_cards)
    // download json
    download("liked_videos.json", JSON.stringify(liked_videos))
}

main()
Basically it gathers all the liked videos' details and creates an entry of the form video_ID: [title, video_length, date, time] for each liked video.
It then automatically downloads the JSON as a file.

How can I visualise tweets on a map in Processing?

I'm trying to use the Twitter API to search for a keyword and get the location of each matching tweet so I can then visualise it on a map.
I've successfully created my map using Unfolding Maps and TileMill; I'm just struggling with the Twitter part. Using the Twitter4J library, I've tried the following code, but I'm not able to get the location as coordinates. Also, once I've got the coordinates, I don't know how to go about visualising them on my map.
import twitter4j.*;
import twitter4j.api.*;
import twitter4j.auth.*;
import twitter4j.conf.*;
import twitter4j.json.*;
import twitter4j.management.*;
import twitter4j.util.*;
import twitter4j.util.function.*;
ConfigurationBuilder cb = new ConfigurationBuilder();
Twitter twitterInstance;
Query queryForTwitter;

void setup() {
  cb.setOAuthConsumerKey("**********");
  cb.setOAuthConsumerSecret("*******");
  cb.setOAuthAccessToken("*********");
  cb.setOAuthAccessTokenSecret("*********");
  twitterInstance = new TwitterFactory(cb.build()).getInstance();
  queryForTwitter = new Query("#nature");

  size(640, 440);
  background(0);
  FetchAndDrawTweets();
}

void FetchAndDrawTweets() {
  try {
    QueryResult result = twitterInstance.search(queryForTwitter);
    ArrayList tweets = (ArrayList) result.getTweets();
    for (int i = 0; i < tweets.size(); i++) {
      Status t = (Status) tweets.get(i);
      String user = (t.getUser()).getLocation();
      text(user + ":", 20, 15 + i * 15);
    }
  }
  catch (TwitterException te) {
    println("Couldn't connect: " + te);
  }
}
I'm quite new to Processing, so please be patient with me, as I don't even know if I'm going about this the right way.
Any help would be greatly appreciated!
Nice work on separating your problem into smaller sub-problems; it's a good idea to keep the "get tweets and locations" step separate from the "show stuff on a map" step.
As for getting the location of a tweet, there are two places you should look:
The Twitter4J JavaDocs: This is a list of every class, function, and variable available to you from the Twitter4J library. This should be your first stop. Do you see anything that looks useful in here? Specifically, the Status class has a getGeoLocation() function that looks pretty promising.
The Twitter API: this is the underlying REST API that the Twitter4J library is built on. Check out this documentation for more details on what's going on underneath the Twitter4J library. Specifically, this page says that the geo object is deprecated and that you should use the coordinates field instead.
So the first thing I would try is the getGeoLocation() function. But note that not every tweet will have a location, since users can disable location tracking. Also note that if the underlying API no longer provides the geo object, and if the Twitter4J library is using that to populate its getGeoLocation() function, then you won't be able to get at the location through Twitter4J. I haven't tested that at all though.

How do I fire a list var from a DTM direct call?

I am trying to fire values into a list var in Adobe Analytics from a DTM direct call but can't seem to get any values to appear.
In my custom code in the direct call rule I have
cTS = _satellite.getVar('conversionTypeShown');
s.list1 = cTS;
and the Data Element conversionTypeShown is getting information from the digitalData layer on the page (which is updated just before the direct call)
if ((digitalData.searchResults !== undefined) && (digitalData.searchResults !== ""))
{
return digitalData.otherJobsType + digitalData.searchResults;
}
I know that these values are being populated correctly because I am firing an eVar with the same data in it (within the same rule), which is coming through OK into Adobe Analytics. But I am not getting any values for the list var.
Does a direct call not allow me to use custom code in this manner?
Any help would be gratefully received.
Owen.
Many thanks to Owen. I didn't find this hint in the Adobe documentation: the list vars have to be listed in s.linkTrackVars, otherwise they are not sent with the direct call's link-tracking beacon.
Finally, my code looks like this and works:
s.linkTrackVars="list1,list2";
s.list1=_satellite.getVar("FieldsSubmitted");
s.list2=_satellite.getVar("FieldsAborted");
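Applied back to the snippet from the question, the same idea would look roughly like this (the eVar number is illustrative, not taken from the original posts):

// custom code in the direct call rule: with link/direct-call beacons, only variables
// named in s.linkTrackVars are sent, so the list var has to be listed there too
s.linkTrackVars = "eVar10,list1";                 // eVar10 is an illustrative placeholder
cTS = _satellite.getVar('conversionTypeShown');
s.eVar10 = cTS;                                   // the eVar that was already coming through
s.list1 = cTS;                                    // the list var that was being dropped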

Sales order total differs from actual total

Just need to know whether any one of you is experiencing this issue with the sales order document in Acumatica ERP 4.2:
The header-level total is wrong when compared to the total of the lines. Is there any way we can recalculate the totals in code, as I couldn't find a fix from Acumatica yet?
If the document is not yet closed, you can just modify a quantity or add/remove a line so the totals are recalculated.
If the document is closed, I do not see any possible way except changing the data in the DB.
I am adding my recent experience to this topic in hopes it might help others.
Months ago, I wrote the code shown below anticipating its need when called by RESTful services. It was clearly not needed, and even worse, merely written and forgotten...
The code was from a SalesOrderEntryExt graph extension.
By removing the code block, the doubling of Order Total was resolved.
It's also an example of backing out custom code until finding the problem.
protected void _(Events.RowInserted<SOLine> e, PXRowInserted del)
{
    // call the base BLC event handler...
    del?.Invoke(e.Cache, e.Args);

    SOLine row = e.Row;
    if (!Base.IsExport) return;

    if (row != null && row.OrderQty > 0m)
    {
        // via RESTful API, raise event
        SOLine copy = Base.Transactions.Cache.CreateCopy(row) as SOLine;
        copy.OrderQty = 0m;
        Base.Transactions.Cache.RaiseRowUpdated(row, copy);
    }
}

How to get user's geolocation?

On many sites I have seen my current city printed out (e.g. "Hello to Berlin."). How do they do that? What is needed for that?
I guess the main part here is JavaScript, but what do I need to implement something like this in my own app? (Or is there some gem for Rails?)
Also, I would like to ask one more thing: I am interested in a list of states (usually in a select box), where the user selects his state (let's say Germany); based on that value, another select displays all regions in Germany, and after choosing a region the respective cities in the selected region are displayed.
Is it possible to obtain such a huge database of states/cities/regions anywhere? It would be interesting to have something similar in our app, but I don't know where to get those lists...
You need a browser which supports the Geolocation API to obtain the location of the user, and you need the user's consent (an example here) to do so. Most newer browsers support that feature, including IE9+ and most mobile OS browsers, including Windows Phone 7.5+.
All you have to do then is use JavaScript to obtain the location:
if (window.navigator.geolocation) {
    var failure, success;
    success = function(position) {
        console.log(position);
    };
    failure = function(message) {
        alert('Cannot retrieve location!');
    };
    navigator.geolocation.getCurrentPosition(success, failure, {
        maximumAge: Infinity,
        timeout: 5000
    });
}
The position object will hold the latitude and longitude of the user's position (however, this can be highly inaccurate in less densely populated areas on desktop browsers, as they do not have a GPS device built in). To explain further: here in Leipzig I get an accuracy of about 300 meters on a desktop computer, while I get an accuracy of about 30 meters with my cell phone's GPS device.
You can then go on and use the coordinates with the Google Maps API (see here for reverse geocoding) to look up the location of the user. There are a few gems for Rails, if you want. I never felt the need to use them, but some people seem to like them.
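A rough sketch of that reverse-geocoding step (the endpoint and response shape are what I recall of the Google Geocoding web service; YOUR_API_KEY is a placeholder):

// Turn the coordinates from the geolocation callback into a human-readable place name.
function reverseGeocode(latitude, longitude, apiKey) {
    const url = "https://maps.googleapis.com/maps/api/geocode/json" +
        "?latlng=" + latitude + "," + longitude + "&key=" + apiKey;
    return fetch(url)
        .then(response => response.json())
        .then(data => data.results.length ? data.results[0].formatted_address : null);
}

// Example: feed it the position from getCurrentPosition().
navigator.geolocation.getCurrentPosition(position => {
    reverseGeocode(position.coords.latitude, position.coords.longitude, "YOUR_API_KEY")
        .then(address => console.log("Hello to " + address));
});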
As for a list of countries/cities, we used the data obtainable from Geonames once in a project, but we needed to convert it for our needs first.
Internet Service Providers buy up big chunks of IP addresses, so what you're most likely seeing is a trace of your IP back to a known ISP. These services have a database of ISPs and their locations in the world, so they can estimate where you're from. You could try to use a site like http://www.ipaddresslocation.org/ to do your work. If you look around, there is bound to be a site that lets you enter an IP and get a location, so you just send a POST request to that site with your visitor's IP and scrape the location from the response.
Alternatively, you could look for an ISP database that has locations and the chunks of the IP range each ISP has been allocated. You could probably find one for money, but a free one might be harder to find.
Alternatively, check out this free database http://www.maxmind.com/app/geolite
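If you would rather do the lookup yourself against that kind of database, a minimal Node.js sketch could look like the following (the geoip-lite package is my suggestion of a convenient wrapper around GeoLite-style data, not something from the original answer):

// Server-side lookup of a visitor's IP against a local GeoLite-style database.
// Assumes the third-party geoip-lite npm package is installed (npm install geoip-lite).
const geoip = require("geoip-lite");

function locateVisitor(ip) {
    const result = geoip.lookup(ip);       // returns null if the IP is not in the database
    if (!result) return null;
    return {
        country: result.country,           // e.g. "DE"
        city: result.city,                 // e.g. "Berlin"
        coordinates: result.ll,            // [latitude, longitude]
    };
}

console.log(locateVisitor("8.8.8.8"));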
I've found getCurrentPosition() to often be inaccurate, since it doesn't spend a lot of time waiting for the GPS to acquire a good accuracy. I wrote a small piece of JavaScript that mimics getCurrentPosition() but actually uses watchPosition() and monitors the results coming back until they reach a better accuracy.
Here's how it looks:
navigator.geolocation.getAccurateCurrentPosition(onSuccess, onError, {desiredAccuracy:20, maxWait:15000});
Code is here - https://github.com/gwilson/getAccurateCurrentPosition
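The idea behind it is roughly the following (a simplified sketch of the approach, not the code from the linked repository):

// Watch the position until the reported accuracy is good enough or a timeout expires,
// then hand the best fix seen so far to the success callback.
function getAccuratePosition(onSuccess, onError, desiredAccuracy, maxWait) {
    var best = null;

    var watchId = navigator.geolocation.watchPosition(function(position) {
        if (best === null || position.coords.accuracy < best.coords.accuracy) {
            best = position;                               // keep the most accurate fix so far
        }
        if (position.coords.accuracy <= desiredAccuracy) {
            finish();                                      // good enough, stop watching
        }
    }, onError, { enableHighAccuracy: true });

    var timer = setTimeout(finish, maxWait);               // give up waiting eventually

    function finish() {
        clearTimeout(timer);
        navigator.geolocation.clearWatch(watchId);
        if (best !== null) {
            onSuccess(best);
        } else {
            onError("No position fix within the allowed time");
        }
    }
}

// Usage, mirroring the call above:
getAccuratePosition(console.log, console.error, 20, 15000);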
The correct syntax would be:
navigator.geolocation.getCurrentPosition(successCallBack, failureCallBack);
Use:
navigator.geolocation.getCurrentPosition(
    function(position) {
        var latitude = position.coords.latitude;
        var longitude = position.coords.longitude;
        console.log("Latitude: " + latitude + " Longitude: " + longitude);
    },
    function() {
        alert("Geo Location not supported");
    }
);
If you prefer to use ES6 and promises, here is another version:
function getPositionPromised() {
    function successCb(cb) {
        return position => cb(position);
    }
    function errorCb(cb) {
        return () => cb('Could not retrieve geolocation');
    }
    return new Promise((resolve, reject) => {
        if (window.navigator.geolocation) {
            navigator.geolocation.getCurrentPosition(successCb(resolve), errorCb(reject));
        } else {
            return reject('No geolocation support');
        }
    });
}
And you can use it like this:
getPositionPromised()
.then(position => {/*do something with position*/})
.catch(() => {/*something went wrong*/})
Here is another API to find out the location in PHP:
http://ipinfodb.com/ip_location_api.php
I have been using geoip.maxmind.com for quite a while and it works 100%. It can be accessed via HTTP requests.
