I'm developing a Google spreadsheet that automatically requests information from a site; the code is below. The variable 'tokens' is an array of about 60 unique three-letter identifiers. The problem I keep running into is that the code fails to request all of the information from the site. Instead it falls back (at random) to the validation part and fills the array with "ERROR!" strings. Sometimes it's row 5, then rows 10-12, then row 3, then multiple rows, and so on. When I run it in debug mode everything is fine, so I can't reproduce the problem.
I already tried adding a sleep (100 ms), but that fixed nothing. I also looked at the amount of traffic the API accepts (10 requests per second, 1,200 per minute, 100,000 per day), so that shouldn't be the problem.
Runtime is limited, so I need this to be as efficient as possible. I suspect it is an issue of computational power after I push all the values from the JSON response into the 'tokens' array. Is there a way to let the script wait as long as necessary for the changes to be committed?
function newGetOrders() {
  var starttime = new Date().getTime().toString();
  var refreshTime = new Date();
  var tokens = retrieveTopBin();
  var sheet = SpreadsheetApp.openById('aaafFzbXXRzSi-eXBu9Xh81Ne2r09vM8rLFkA4fY').getSheetByName("Sheet37");
  sheet.getRange('A2:OL101').clear();
  for (var i = 0; i < tokens.length; i++) {
    // Fetch the order book for this token and append it to the token's row
    var request = UrlFetchApp.fetch("https://api.binance.com/api/v1/depth?symbol=" + tokens[i][0] + "BTC", {muteHttpExceptions: true});
    var json = JSON.parse(request.getContentText());
    tokens[i].push(refreshTime);
    Utilities.sleep(100);
    for (var k in json.bids) {
      tokens[i].push(json.bids[k][0]);
      tokens[i].push(json.bids[k][1]);
    }
    for (var k in json.asks) {
      tokens[i].push(json.asks[k][0]);
      tokens[i].push(json.asks[k][1]);
    }
    // Validation: pad the row to 402 columns when the response was incomplete
    if (tokens[i].length < 402) {
      for (var x = tokens[i].length; x < 402; x++) {
        tokens[i].push("ERROR!");
      }
    }
  }
  sheet.getRange(2, 1, tokens.length, 402).setValues(tokens);
}
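Since the failures happen on random rows and never in the debugger, one option worth trying is to retry a request whenever the response is missing the order book data, instead of padding the row straight away. Below is a minimal sketch of that idea; fetchDepthWithRetry is a hypothetical helper (not part of the original script), and the retry count and backoff values are purely illustrative.
// Hedged sketch: retry a depth request a few times with a growing pause
// before giving up, so a single flaky response does not fill the row with "ERROR!".
function fetchDepthWithRetry(symbol, maxRetries) {
  var url = "https://api.binance.com/api/v1/depth?symbol=" + symbol + "BTC";
  for (var attempt = 0; attempt < maxRetries; attempt++) {
    var response = UrlFetchApp.fetch(url, {muteHttpExceptions: true});
    if (response.getResponseCode() === 200) {
      var json = JSON.parse(response.getContentText());
      // Only accept the payload if it actually contains order book data
      if (json && json.bids && json.asks) {
        return json;
      }
    }
    // Simple linear backoff: wait a little longer after each failed attempt
    Utilities.sleep(200 * (attempt + 1));
  }
  return null; // the caller decides how to mark the failed row
}
The main loop could then call var json = fetchDepthWithRetry(tokens[i][0], 3); and only pad the row with "ERROR!" when the helper returns null.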
Having a bit of trouble using importJSON for the first time in Google Sheets. My data is importing as truncated and I can't find any way to really filter things the way I'd like.
API source: https://prices.runescape.wiki/api/v1/osrs/1h
I'm using the following command: =IMPORTJSON(B1;B2)
where B1 is the source link, and B2 references any filters I've applied. So far I have no filters.
My result is a truncated list that displays as such:
data/2/avgHighPrice 166
data/2/highPriceVolume 798801
data/2/avgLowPrice 162
data/2/lowPriceVolume 561908
data/6/avgHighPrice 182132
data/6/highPriceVolume 7
data/6/avgLowPrice 180261
data/6/lowPriceVolume 37
data/8/avgHighPrice 195209
data/8/highPriceVolume 4
data/8/avgLowPrice 192880
data/8/lowPriceVolume 40
In the examples I've seen and worked with (primarily the example provided by the add-on), the data naturally pivots into a table. I can't even achieve that, which would be workable, although I'm really only looking to pull the avgHighPrice and avgLowPrice markers.
EDIT:
I'm looking for results along the lines of this:
#    /avgLowPrice    /avgHighPrice
2    162             166
6    180261          182132
8    192880          195209
EDIT2:
So I have one more thing I was hoping to figure out. Using your script, I created another script to pull the names and item IDs
function naming(url){
  //var url='https://prices.runescape.wiki/api/v1/osrs/mapping'
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText())
  var result = []
  result.push(['#','id','name'])
  for (let p in data) { // the mapping endpoint returns a plain array, so iterate over it directly
    try{result.push([p,data.item(p).id,data.item(p).name])}catch(e){}
  }
  return result
}
Object.prototype.item=function(i){return this[i]};
I'm wondering if it is possible to correlate the item name with the item ID from the initial pricing script. To start, the 1st script only lists items that are tradeable, while the 2nd lists ALL item IDs in the game. I'd essentially like to correlate the 1st and 2nd scripts to show something like this:
ID    Name           avgHighPrice    avgLowPrice
2     Cannonball     180261          192880
6     Cannon Base    182132          195209
Try this script (without any add-on):
function prices(url){
  //var url='https://prices.runescape.wiki/api/v1/osrs/1h'
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText())
  var result = []
  result.push(['#','avgHighPrice','avgLowPrice'])
  for (let p in data.data) {
    try{result.push([p,data.data.item(p).avgHighPrice,data.data.item(p).avgLowPrice])}catch(e){}
  }
  return result
}
Object.prototype.item=function(i){return this[i]};
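Since the function lives in the spreadsheet's script editor, it can presumably also be called as a custom function straight from a cell, e.g. =prices("https://prices.runescape.wiki/api/v1/osrs/1h") or =prices(B1) if B1 holds the URL; the =pricesV2() example further down is used the same way.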
You can retrieve the information for naming from the mapping endpoint as follows:
function naming(url){
  //var url='https://prices.runescape.wiki/api/v1/osrs/mapping'
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText())
  var result = []
  result.push(["id","name","examine","members","lowalch","limit","value","highalch"])
  data.forEach(function(elem){
    result.push([elem.id.toString(),elem.name,elem.examine,elem.members,elem.lowalch,elem.limit,elem.value,elem.highalch])
  })
  return result
}
https://docs.google.com/spreadsheets/d/1HddcbLchYqwnsxKFT2tI4GFytL-LINA-3o9J3fvEPpE/copy
Integrated function
=pricesV2()
function pricesV2(){
  // Build a map of item id -> item name from the mapping endpoint
  var url = 'https://prices.runescape.wiki/api/v1/osrs/mapping';
  var data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  let myItems = new Map();
  data.forEach(function(elem){ myItems.set(elem.id.toString(), elem.name); });
  // Fetch the 1h prices and join them with the names by item id
  url = 'https://prices.runescape.wiki/api/v1/osrs/1h';
  data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
  var result = [];
  result.push(['#', 'name', 'avgHighPrice', 'avgLowPrice']);
  for (let p in data.data) {
    try { result.push([p, myItems.get(p), data.data.item(p).avgHighPrice, data.data.item(p).avgLowPrice]); } catch(e) {}
  }
  return result;
}
Object.prototype.item = function(i){ return this[i]; };
I am trying to query 2 long columns for agents' names; the issue is that the names are repeated in 2 tables, one for the total sum of Productivity and the other for the total sum of Utilization.
The thing is, when I query the columns it returns the numbers for Productivity and Utilization all together.
How can I make the query search only for Productivity alone and for Utilization alone?
Link is here: https://docs.google.com/spreadsheets/d/12Sydw6ejFobySHUj5JoYkAPbhr0mKoInCWxtHY1W4lk/edit#gid=0
Apps Script would be a better solution in this case. The code below works as follows:
Gets the names from Column D and Column A.
For each name in Column D, it compares it with each name in Column A (that's the two for loops).
If the names coincide (first if), it checks the background color (second if) of the Column A cell to accumulate Total Prod and Total Util.
Once it reaches the end of Column A, it writes the values of Total Prod and Total Util (Columns E and F) for each name in D.
function onOpen() { //Will run every time you open the sheet
  //Gets the active Spreadsheet and sheet
  let sprsheet = SpreadsheetApp.getActiveSpreadsheet();
  let sheet = sprsheet.getActiveSheet();
  var lastRow = sheet.getLastRow();
  var getNames = sheet.getRange(3, 1, lastRow).getValues(); //Names from row 3, col 1, down to the last row
  var totalNames = sheet.getRange("D4:D5").getValues(); //Change the range for more names
  let prodColor = '#f2f4f7'; //Hexadecimal codes of the background colors of the names in A
  let utilColor = '#cfe2f3';
  for (var i = 0; i < totalNames.length; i++) {
    var totalProd = 0, totalUtil = 0; //Starts at 0 for each name in D
    for (var j = 0; j < getNames.length; j++) {
      if (totalNames[i][0] == getNames[j][0]) {
        if (sheet.getRange(j + 3, 1).getBackgroundObject().asRgbColor().asHexString() == prodColor) { //If the colors coincide
          totalProd += sheet.getRange(j + 3, 2).getValue();
        } else if (sheet.getRange(j + 3, 1).getBackgroundObject().asRgbColor().asHexString() == utilColor) {
          totalUtil += sheet.getRange(j + 3, 2).getValue();
        }
      }
    }
    sheet.getRange(i + 4, 5, 1, 2).setValues([[totalProd, totalUtil]]);
  }
}
Note: You will have to run the code manually and accept permissions the first time you run it. After that it will run automatically each time you open the Sheet. It might take a few seconds for the code to run and to reflect changes on the Sheet.
To better understand loops and 2D arrays, I recommend you take a look at this.
References:
Range Class
Get Values
Get BackgroundObject
Set Values
You can learn more about Apps Script and Sheets by following the Quickstart.
I have a Ruby on Rails project where I use a DHTMLX Grid.
Is there a way, using the "onFullSync" event handler provided by the grid API, to show the updated data?
Let me explain a little better... I know I can do something like:
dp.attachEvent("onFullSync", function(){
alert("update complete");
})
But what I want is something more complex. I want, after each completed update, to alter a div by adding information like this:
Field 2 was updated to XYZ and field 3 was updated to XER on line X
Field 1 was updated to 123 and field 3 was updated to XSD on line Y
Is this possible?
Thanks
There is an onAfterUpdate event that can be used similarly to onFullSync:
http://docs.dhtmlx.com/api__dataprocessor_onafterupdate_event.html
It will fire after each data saving operation (if you are saving 5 rows, it will fire 5 times).
Still, info about the updated columns will not be available there.
Also, you can try the onEditCell event of the grid. It fires after the data is changed in the grid, but before it is actually saved to the database. Here you can get all the necessary info: row, column, old value and new value.
http://docs.dhtmlx.com/api__link__dhtmlxtreegrid_oneditcell_event.html
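A minimal sketch of how the two events might be combined, assuming the documented onEditCell signature (stage, rowId, cellIndex, newValue, oldValue); the changes array and the update-log div are illustrative names, not part of the grid API:
var changes = []; // illustrative buffer for the update messages
grid.attachEvent("onEditCell", function(stage, rId, cInd, nValue, oValue){
  if (stage == 2 && nValue != oValue) { // stage 2 = editor closed, value confirmed
    changes.push("Field " + cInd + " was updated to " + nValue + " on line " + rId);
  }
  return true; // let the edit go through
});
dp.attachEvent("onFullSync", function(){
  // show everything collected since the last save, then reset the buffer
  document.getElementById("update-log").innerHTML = changes.join("<br>");
  changes = [];
});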
So, what I ended up doing was:
After creating the grid I created an array:
var samples = [];
Then, as per @Aquatic's suggestion, I added the following line to the "onEditCell" event:
samples[samples.length] = grid.cells(rId, 5).getValue();
This allowed me to add to the array the value in column 5 of the changed row. Then, on the "onFullSync" event, I hide or show the div created in the view with the messages (I distinguish whether one row or more were changed on the grid).
//Deals with messages after update
dp.attachEvent("onFullSync", function(){
  unique_samples = uniq_fast(samples.sort());
  if (unique_samples.length == 1){
    $('#updated-samples').text("");
    $(".messages").show();
    $('#updated-samples').text("A seguinte amostra foi actualizada: " + unique_samples[0]); // "The following sample was updated: "
    //to clear the array
    samples = [];
  } else if (unique_samples.length > 1){
    $('#updated-samples').text("");
    $(".messages").show();
    $('#updated-samples').text("As seguintes amostras foram actualizadas: " + unique_samples.sort().join(", ")); // "The following samples were updated: "
    //to clear the array
    samples = [];
  } else {
    $('#updated-samples').text("");
    $(".messages").hide();
    //to clear the array
    samples = [];
  }
})
The problem with using "onEditCell" is that every time a field in that row is changed I get a repeated value in my "samples" array, so I had to remove duplicates from that array. For that I used one of the suggestions in this answer:
// remove duplicates in array
function uniq_fast(a) {
  var seen = {};
  var out = [];
  var len = a.length;
  var j = 0;
  for (var i = 0; i < len; i++) {
    var item = a[i];
    if (seen[item] !== 1) {
      seen[item] = 1;
      out[j++] = item;
    }
  }
  return out;
}
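As an aside, if the target browsers support ES6, the same deduplication could presumably be done without a helper, for example unique_samples = Array.from(new Set(samples)).sort(); (just an alternative, not what the code above uses).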
Then I have, at the beginning of the view, the following to show the messages:
<div class="alert alert-success messages" id="updated-samples">
And that's it. It may not be the most efficient way, but it works for what I wanted. I will leave the question open a few more days to see if any other option appears.
The Awesomium answer forums seem pretty much dead, so I'm reposting this here.
First of all, before starting to learn Awesomium I used the HtmlAgilityPack library for all my parsing needs, but that library is not being updated anymore, so I decided to move to Awesomium (my approach is therefore based on my experience with HAP).
I figured out how to parse lists of objects with Awesomium, but I can't figure out how to work with them. For example:
public dynamic FindNodes(string xpath, dynamic node = null, WebView wv = null)
{
    if (wv == null) wv = mainView;
    dynamic nodes = (JSObject)wv.ExecuteJavascriptWithResult(
        String.Format("document.evaluate(\"{0}\", {1}, null, XPathResult.UNORDERED_NODE_SNAPSHOT_TYPE, null)", xpath, "document"));
    int length = nodes.snapshotLength;
    for (int i = 0; i < length; i++)
    {
        Console.WriteLine(nodes.snapshotItem(i).innerText);
    }
    return nodes;
}
The problems start after I return the nodes. I want to perform a series of searches for each node, so after returning them I decided that the following should work:
dynamic weakCounters = ap.FindNodes("//div[@id='weaklist']/ul/li");
for (int i = 0; i < weakCounters.snapshotLength; i++)
{
    ap.FindNodes("//h3[@class='black']", weakCounters.snapshotItem(i));
}
But it did not. The part where I'm trying to get the length of the list and of course, if I try to get a snapshot of the item directly I get an error.
I understand, that I'm making a HUGE mistake somewhere. I just can't understand where.
Edit: Surprisingly if I do the following, everything seems fine, but it just doesn't look right to create a new variable everytime I need to access it (that's just bananas)
dynamic weakCounters = ap.FindNodes("//div[@id='weaklist']/ul/li");
dynamic nodes = weakCounters;
for (int i = 0; i < nodes.snapshotLength; i++)
{
Also, how can I pass the result (element) that I have extracted back to Awesomium so that I can do a "subsearch"?
Cross-posted answer from http://answers.awesomium.com/questions/4276/parsing-with-awesomium.html
Why do you need Awesomium for HTML parsing? What's wrong with HtmlAgilityPack?
Download the page with Awesomium (if that is why you need it), get the HTML, and parse it with HtmlAgilityPack.
Parsing like this should be very slow (if it returns many elements).
I want to get all the video information possible from YouTube for my project. I know that the page limit is 100.
I wrote the following code:
ArrayList<String> videos = new ArrayList<>();
int i = 1;
int videosTotales = 0;
String peticion = "http://gdata.youtube.com/feeds/api/videos?category=Comedy&alt=json&max-results=50&page=" + i;
URL oracle = new URL(peticion);
URLConnection yc = oracle.openConnection();
BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
String inputLine = in.readLine();
while (in.readLine() != null) {
    inputLine = inputLine + in.readLine();
}
System.out.println(inputLine);
JSONObject jsonObj = new JSONObject(inputLine);
JSONObject jsonFeed = jsonObj.getJSONObject("feed");
JSONArray jsonArr = jsonFeed.getJSONArray("entry");
while (i <= 100) {
    for (int j = 0; j < jsonArr.length(); j++) {
        videos.add(jsonArr.getJSONObject(j).getJSONObject("id").getString("$t"));
        System.out.println("Numero " + videosTotales + jsonArr.getJSONObject(j).getJSONObject("id").getString("$t"));
        videosTotales++;
    }
    i++;
}
When the program finishes, I have 5000 videos per category, but I need much more, much much more, and the limit is page = 100.
So, how can I get more than 10 million videos?
Thank you!
Are those 5000 also unique IDs?
I see the use of max-results=50, but no start-index parameter in your URL.
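For what it's worth, with the v2 feed the next block of results would presumably be requested by adding that parameter, e.g. ...videos?category=Comedy&alt=json&max-results=50&start-index=51 for results 51-100 (the exact URL is illustrative; check the v2 documentation for the parameter details).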
There is a limit on the results you can get per request. There is also a limit on the number of requests that you can send within some time interval. By checking the status code of the response and any error message you can find these limits, as they may change over time.
Besides the category parameter, use some other parameters too. For instance, you may vary the q parameter (used with some keywords) and/or the order parameter to get a different result set.
See the documentation for available parameters.
Note that you are using API version 2, which is deprecated. There is an API version 3.