I want to import data from https://www.coinspeaker.com/ieo/feed/ into a Google Sheet.
function callCoinSpeaker() {
  var response = UrlFetchApp.fetch("https://www.coinspeaker.com/ieo/feed/");
  Logger.log(response.getContentText());
  var fact = response.getContentText();
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1,1).setValue([fact]);
}
The script works fine, but I don't know how to format the output, which all ends up in a single cell (A1).
I would like to write code that automatically formats the output, splitting it into columns and rows. Any example of formatting the output from an API request? Thanks all!
What I think about your issue is that when you're making a GET request to your link, the response comes back as a string.
To be able to use the data, you should parse your response with JSON.parse(fact).
Use Logger.log(JSON.parse(fact)) to see what is happening.
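If the response does turn out to be JSON, here is a minimal sketch of parsing it and spreading it over rows and columns with setValues (the writeFeedToSheet name is just an example; if the feed is actually RSS/XML, you would need XmlService instead of JSON.parse):
function writeFeedToSheet() {
  var response = UrlFetchApp.fetch("https://www.coinspeaker.com/ieo/feed/");
  // Assumption: the body is a JSON array of objects; adjust if it is RSS/XML
  var items = JSON.parse(response.getContentText());
  var sheet = SpreadsheetApp.getActiveSheet();
  var headers = Object.keys(items[0]);
  // One header row plus one row per item, written in a single setValues call
  var rows = [headers].concat(items.map(function(item) {
    return headers.map(function(h) { return item[h]; });
  }));
  sheet.getRange(1, 1, rows.length, headers.length).setValues(rows);
}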
I tried =IMPORTHTML("https://nepsealpha.com/investment-calandar/dividend","table",) and then =IMPORTXML("https://nepsealpha.com/investment-calandar/dividend",xpath). I found the XPath with the "SelectorGadget" extension for Google Chrome, but still couldn't import it. It shows either "empty content" or "formula parse error".
You can retrieve almost all the information this way
=IMPORTXML(url,"//div/@data-page")
and then parse the JSON.
By script : =getData("https://nepsealpha.com/investment-calandar/dividend")
function getData(url) {
  // The page embeds its data as HTML-escaped JSON in a data-page attribute
  var from = 'data-page="';
  var to = '"></div></body>';
  var jsonString = UrlFetchApp.fetch(url).getContentText().split(from)[1].split(to)[0].replace(/&quot;/g, '"');
  var json = JSON.parse(jsonString).props.today_prices_summary.top_volume;
  var headers = Object.keys(json[0]);
  // First row: headers; then one row per object
  return ([headers, ...json.map(obj => headers.map(header => obj[header]))]);
}
Edit
To update periodically, add this script:
function update(){
  var chk = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0].getRange('A1');
  chk.setValue(!chk.getValue());
}
Put a trigger as you wish on the update function and change the formula as follows:
=getData("https://nepsealpha.com/investment-calandar/dividend",$A$1)
I know that's not the answer you want to see.
It's impossible to get any content from this website using IMPORTXML or other tools included in Google Sheets.
It's generated using JavaScript, and once JavaScript is disabled no content is displayed.
It's done on purpose. Financial companies pay for live stock data and they don't want to share it with us for free.
So the site is protected against tools like IMPORTXML.
I would like to scrape data from a page, but cannot figure out the right XPath for Google Sheets. I would like to extract the number 202 from https://www.belvilla.nl/zoeken/?land=nl&rgo=frie (at the top of the page, "202 vakantiehuizen gevonden in Friesland", i.e. "202 holiday homes found in Friesland").
If I take the XPath, I get: //*[@id="result-container-items"]/div[1]/div/div/div[1]/div[1]/div[1]/strong
In Google Sheets I have tried =IMPORTXML(A1;"//*[@id="result-container-items"]/div[1]/div/div/div[1]/div[1]/div[1]/strong)") and some others like =IMPORTXML(A1;"//div[@class='search-numbers']"), but none of them are working. For the last one I get an error with 'Resource with URL content has exceeded the size limit.' but I'm guessing my XPath is wrong.
Can anyone help me out? Thanks!
IMPORTXML has its limitations, especially on JavaScript-rendered elements. However, if scripting is an option, try using UrlFetchApp.fetch() in Google Apps Script.
Code:
function fetchUrl(url) {
  var html = UrlFetchApp.fetch(url).getContentText();
  // startString and endString must be unique or at least the first result
  // enclosing the result we want
  var startString = 'search-result-div" ';
  var endString = 'alternate-dates-filter-bar';
  var startIndex = html.search(startString);
  var endIndex = html.search(endString);
  // regex for numbers and text content
  var numbers = /strong>([^<]+)<\/strong/;
  var text = /span>([^<]+)<\/span/;
  // clean content then combine matches of numbers and text
  var content = html.substring(startIndex, endIndex).replace(/\s\s+/g, ' ');
  var result = numbers.exec(content)[1] + ' ' + text.exec(content)[1];
  return result.trim();
}
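Hypothetical usage from a cell, assuming the function is saved in the spreadsheet's bound script project:
=fetchUrl("https://www.belvilla.nl/zoeken/?land=nl&rgo=frie")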
Note:
The code above is specific to what you are fetching. You will need to change how the script processes the response if you want anything else.
You can reuse this on other URLs, and it will fetch a similar value to the one located at the XPath in your post.
This doesn't make use of the xpath.
Google Sheets does not support the scraping of JavaScript elements. You can check this by disabling JS for a given URL: you will be left with the content you could import. In your case, this can't be achieved with IMPORTXML.
Let's say I have a Google Sheet with URLs to individual pins on Pinterest in column B. For example: https://www.pinterest.com/pin/146578162860144581/
I'd like to populate cells in column C with the main image from the URL in column B.
Currently I have to do this manually by clicking through to the URL in column B, copy the image URL, and insert image into the cell in column C.
Is there a way to automate this?
Solution
Yes, you can achieve this using Google Apps Script. Below the following piece of code you can find a brief explanation of how it works.
The code
function populateImage() {
  var sheet = SpreadsheetApp.getActiveSheet();
  // Get cell values of your link column
  var values = sheet.getRange('B1:B5').getValues();
  // Range where we want to insert the images
  var imgrange = sheet.getRange('C1:C5');
  for (var i = 0; i < 5; i++) {
    // Cell we want to insert the image into
    var cell = imgrange.getCell(i + 1, 1);
    // Pin id, which is at the end of each url provided, in this case 146578162860144581
    var number = values[i][0].substring(29, values[i][0].length - 1);
    // Url to make the fetch request to, which returns the json of that pin
    var url = 'https://widgets.pinterest.com/v3/pidgets/pins/info/?pin_ids=' + number;
    var response = UrlFetchApp.fetch(url);
    var json = response.getContentText();
    var data = JSON.parse(json);
    // Url of the image of the pin
    var imgurl = data.data[0].images["237x"].url;
    // Insert image from url
    cell.setFormula('=image("' + imgurl + '")');
  }
}
Explanation
To insert an image from a website into your sheet you will need the image URL. This is the trickiest part of this issue. Pinterest does not provide a good way to get this image URL by just fetching the pin URL, as that request returns HTML data and not JSON. To achieve this you will need to make a fetch request to this URL: https://widgets.pinterest.com/v3/pidgets/pins/info/?pin_ids=PINIDNUMBER. You can find more information about this in this Stack Overflow question, credit goes to @SambhavSharma.
When you fetch this URL you will get the pin's JSON, from which you can retrieve the desired image URL (along with a lot of other data about the pin). With it you can simply insert the image in the next column.
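For orientation, a trimmed illustration of the JSON shape the script relies on; only the fields accessed above are shown, and the image URL is hypothetical:
var sample = {
  "data": [
    {
      "images": {
        "237x": { "url": "https://i.pinimg.com/237x/<hash>.jpg" } // hypothetical URL
      }
    }
  ]
};
Logger.log(sample.data[0].images["237x"].url); // same path used in populateImage()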
I hope this has helped you, let me know if you need anything else or if you did not understand something.
How can I fetch data every two minutes from a URL? I have tried different methods to achieve this, but couldn't succeed.
=if(Minute(Now())=Minute(Now()),
ImportHtml("https://www.nseindia.com/live_market/dynaContent/live_watch/option_chain/optionKeys.jsp?symbolCode=-10006&symbol=NIFTY&symbol=NIFTY&instrument=-&date=-&segmentLink=7&symbolCount=2&segmentLink=17",
"table",1),"")
I tried the above formula, but it still does not update the data.
Need help with this.
This could do the trick... put =NOW() in some cell and set up an update rate in the spreadsheet settings:
function getData() {
  var sheetName = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("<sheet-name>");
  // The random query string makes the URL unique each run, forcing IMPORTHTML to re-fetch
  var queryString = Math.random();
  var cellFunction = '=IMPORTHTML("<url>?' + queryString + '","table",<index>)';
  sheetName.getRange('<import-cell>').setValue(cellFunction);
}
Replace <sheet-name> with the name of your sheet (the tab name, not the name of the file).
Replace <url> with the URL of your web page.
Replace <index> with the table position on the page, e.g. 1.
Replace <import-cell> with the cell where you want to put your import statement, e.g. A1. A filled-in sketch follows below.
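For example, filled in with the URL from the question (the tab name Sheet1, table index 1, and target cell A1 are assumptions; since that URL already has a query string, the cache-buster is appended with & instead of ?):
function getData() {
  var sheetName = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet1"); // assumed tab name
  var queryString = Math.random(); // forces IMPORTHTML to re-fetch
  var cellFunction = '=IMPORTHTML("https://www.nseindia.com/live_market/dynaContent/live_watch/option_chain/optionKeys.jsp?symbolCode=-10006&symbol=NIFTY&symbol=NIFTY&instrument=-&date=-&segmentLink=7&symbolCount=2&segmentLink=17&' + queryString + '","table",1)';
  sheetName.getRange('A1').setValue(cellFunction); // assumed target cell
}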
Now make a trigger for getData() to run every minute.
It will rewrite the IMPORTHTML formula in the cell every minute, forcing a refresh.
I am new to Google Sheets, and I have a Google Sheet that I have set up to dynamically place the present date in cell A1 and the time in cell A2. The sheet is "published to the web", and "Settings/Calculation" is set to Recalculate change every minute.
That all works fine, but I want to be able to read these values from the sheet using an API call. That also works perfectly, the FIRST time. Unfortunately, every time I try to call it again, I get the same answer as the first time, even a day later.
I'm using:
=int(hour(now()))&":"&int(minute(now()))&" "&int(SECOND(now()))
as the formula. I should also add that it's a JSON file that I'm reading and it is updating properly on the actual sheet.
I'm sure that I am missing something. Can someone please tell me what it is?
Thanks in advance.
Maybe you are not reading the JSON correctly. This gives me the correct result every time I run it.
function myFunction(){
  var url = "https://spreadsheets.google.com/feeds/cells/1TtXe1JXKsxHKUWb3bqniHkLQB0Po1fSUqsiib2yMv90/1/public/values?alt=json";
  try {
    var sh = SpreadsheetApp.getActive().getSheetByName("Sheet1");
    var response = UrlFetchApp.fetch(url);
    var str = response.getContentText();
    var data = JSON.parse(str);
    var entry = data.feed.entry;
    sh.getRange(1, 1).setValue(entry[0].content.$t);
    sh.getRange(1, 2).setValue(entry[1].content.$t);
  } catch(e) {
    Logger.log(e);
  }
}
You need to check the size of "entry" before reading it; I just wanted to show that it works.
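A minimal sketch of that check, reusing the variables from the function above:
if (entry && entry.length >= 2) {
  sh.getRange(1, 1).setValue(entry[0].content.$t);
  sh.getRange(1, 2).setValue(entry[1].content.$t);
} else {
  Logger.log("Unexpected feed structure: " + str);
}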
Thanks