I want a Google Sheet to look up and import values from an API - google-sheets

I have a Google Sheet with a list of information on customers, one field being a user count from a WaaS I run.
I would like to regularly run a script that makes an HTTP API request, gets back the user count from the platform, and updates the value in the Google Sheet.
How easy is this?

I've never worked with the Google Sheets API, but the documentation looks good. I hope this helps: https://developers.google.com/sheets/
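A minimal Google Apps Script sketch of that workflow (open the sheet, then Extensions > Apps Script). The endpoint URL, the auth header, the JSON field name, and the column positions are placeholders for whatever your WaaS API and sheet layout actually use:

function updateUserCounts() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Customers'); // placeholder tab name
  const rows = sheet.getDataRange().getValues();

  // Assumes column A holds a customer ID and column C should receive the user count.
  for (let i = 1; i < rows.length; i++) {
    const customerId = rows[i][0];
    if (!customerId) continue;

    const response = UrlFetchApp.fetch(
      'https://example.com/api/customers/' + customerId,       // placeholder endpoint
      { headers: { Authorization: 'Bearer YOUR_API_TOKEN' } }  // placeholder auth
    );
    const userCount = JSON.parse(response.getContentText()).userCount; // placeholder field name
    sheet.getRange(i + 1, 3).setValue(userCount);
  }
}

To run it regularly, attach a time-driven trigger to the function (Triggers > Add Trigger), e.g. hourly or daily.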

Related

Importing a website table into Google Spreadsheet

I am trying to import this table into a Google Spreadsheet:
The table is available here:
https://competitions.lta.org.uk/sport/drawsheet.aspx?id=8D598CDE-8579-4541-B7AD-48558BF6FEA3&draw=4
Before Google changed their Spreadsheet addresses, I had the import working with ImportHTML(URL, "table", 2) - but this no longer works, even though there appear to be only two 'table' tags in the page HTML.
Looking for a way to extract the table, I went to 'importXML' and tried several versions like 'importxml("https://competitions.lta.org.uk/sport/drawsheet.aspx?id=8D598CDE-8579-4541-B7AD-48558BF6FEA3&draw=4", "//div[contains(@id,'poule')]")'
and the same first part of the statement with "//table[contains(@class,'ruler')]")
but the formula fails with 'no content'
Would really appreciate some help to find a way to import this table!
Thanks in anticipation,
The reason you can't get the table data is the cookie consent page.
Every time Google Sheets tries to access that link it needs to accept cookies, and by default it won't do that.
To access the data you need to bypass or accept the site's cookies, which means implementing something more advanced in Python or Google Apps Script.
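As a rough illustration of that approach in Apps Script, the sketch below fetches the page with a consent cookie attached and pulls out the second table, roughly what ImportHTML(URL, "table", 2) used to do. The cookie name and value are placeholders; you would need to copy the real consent cookie from your browser's dev tools after accepting the banner, and you would still have to parse the table HTML into rows yourself:

function fetchDrawsheetTable() {
  const url = 'https://competitions.lta.org.uk/sport/drawsheet.aspx' +
              '?id=8D598CDE-8579-4541-B7AD-48558BF6FEA3&draw=4';
  const response = UrlFetchApp.fetch(url, {
    headers: { Cookie: 'CookieConsent=accepted' },  // placeholder: use the site's real consent cookie
    followRedirects: true,
    muteHttpExceptions: true
  });
  const html = response.getContentText();

  // Naive extraction: grab the second <table> block in the page.
  const tables = html.match(/<table[\s\S]*?<\/table>/gi) || [];
  return tables.length > 1 ? tables[1] : 'Table not found';
}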

Google Sheets IMPORTXML: Could not fetch URL

I am using IMPORTXML to get information on USPS tracking numbers for my orders and have been using it for about a month. It used to work on and off; sometimes it would stop, and all I had to do was refresh the page or add/remove the "s" in https and it would work again. But for about 5 days now it has not worked at all, no matter what I do, and doing this manually is a very tedious task. External third-party tracking apps won't work either, because we need everyone to use just the one sheet we have, since it contains not only the tracking info but everything else.
So is there any other way I can import some contents of the USPS tracking website that is reliable and won't stop working? I've seen some scripts here and there but haven't been able to adapt them to my needs. It would also be great if that script or workaround worked with other sites like UPS and FedEx, since IMPORTXML doesn't work with them (it always says the content is empty). Thanks in advance.
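The scripts floating around for this usually boil down to a custom Apps Script function that fetches the tracking page server-side with UrlFetchApp and pulls the status out with a regular expression. A rough sketch follows; the tracking URL and the regex are assumptions and will need adjusting if USPS changes its markup or blocks automated requests. Use it in a cell as =USPS_STATUS("9400..."):

function USPS_STATUS(trackingNumber) {
  // Assumed USPS tracking URL; adjust if it changes.
  const url = 'https://tools.usps.com/go/TrackConfirmAction?tLabels=' +
              encodeURIComponent(trackingNumber);
  const html = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();

  // Assumed markup: pull the first status banner text; adapt the pattern to the current page.
  const match = html.match(/class="[^"]*banner-header[^"]*"[^>]*>\s*([^<]+)</i);
  return match ? match[1].trim() : 'Status not found';
}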

Zapier pull Word Count of a given document and connect it with Google Sheets

Here is what I need help with:
I need to connect the word count of a given document with Google Sheets.
I tried this with Zapier's current settings, but apparently it doesn't pull the word count data.
I am struggling to find a solution, and I am wondering whether this is simply impossible with this platform because the data is not pulled, or whether there is something that can be done about it.
For some context on why I need this: I run a ghostwriting company and need to track the progress of projects. Having that information update periodically in Google Sheets would be really helpful.
Thank you
David here, from the Zapier Platform team.
I took a look and it seems like we're not pulling that right now. We've got an open feature request for the content. If you'd like, I can add your email to that request and you'll get notified if/when it gets fixed!
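If waiting on that feature request isn't an option, the word count can also be computed directly with Google Apps Script and written to the sheet on a schedule; this is outside Zapier entirely, and the document ID, spreadsheet ID, tab name, and target cell below are placeholders:

function logWordCount() {
  const doc = DocumentApp.openById('YOUR_DOCUMENT_ID');           // placeholder doc ID
  const words = doc.getBody().getText().trim().split(/\s+/).filter(String).length;

  SpreadsheetApp.openById('YOUR_SPREADSHEET_ID')                  // placeholder spreadsheet ID
    .getSheetByName('Progress')                                   // placeholder tab name
    .getRange('B2')                                               // placeholder target cell
    .setValue(words);
}

A time-driven trigger (e.g. hourly) keeps the count updating periodically.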

Get Google Reviews with IMPORTXML function in G Sheets

I'm trying to import to a Google Sheet the number of reviews and average rating of a certain venue on Google Maps.
Taking as an example this page:
https://www.google.com/maps?cid=8807257593070771217
From Chrome's inspector, the XPath for the average should be:
//*[@id='pane']/div/div[1]/div/div/div[1]/div[3]/div[2]/div/div[1]/span[1]/span/span
However it always returns empty.
Any idea why?
PS - This URL redirects to another, but that shouldn't be the problem as the same thing happens with Facebook and it returns the correct values.
Thanks in advance for any help
Per the comments here, you can't. If you want to scrape Google Maps, use Google's officially supported way to do that: their APIs.
You're probably interested in Place Details in particular.
If you have access to a business's Google My Business page, you can use that API to pull in reviews: https://developers.google.com/my-business/content/review-data
Otherwise, https://serpapi.com can scrape Google reviews for you.
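A minimal Apps Script sketch against the Place Details web service mentioned above. It assumes you have your own API key and the venue's place_id (the ?cid= URL doesn't give you one directly; you would look it up first, e.g. with a Find Place request). Use it in a cell as =PLACE_RATING("ChIJ..."):

function PLACE_RATING(placeId) {
  const url = 'https://maps.googleapis.com/maps/api/place/details/json' +
              '?place_id=' + encodeURIComponent(placeId) +
              '&fields=rating,user_ratings_total' +
              '&key=YOUR_API_KEY';                               // placeholder API key
  const result = JSON.parse(UrlFetchApp.fetch(url).getContentText()).result || {};
  // Returns a 1x2 array so the rating and review count spill into adjacent cells.
  return [[result.rating, result.user_ratings_total]];
}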

Performing Google Search In Spreadsheet And Scraping The SERP Data

I hope you guys are doing well. I want to build a simple spreadsheet, and while I thought I'd be able to make one myself, a blank sheet looks daunting to me. I'm sure you're kind enough to help me out.
I want to perform multiple Google search queries in a Google spreadsheet and parse the results of each search (the top 10 results per search).
Something like this: https://www.youtube.com/watch?v=tBwEbuMRFlI
But when I tried the formula given in the video description, Google returned #Error and I don't know why.
Can you please help me make a simple spreadsheet that handles multiple queries at once? For example, one column for keywords (where I could paste my list of keywords) and then 10 columns of search results, with all results for one keyword in one row.
Something like this:
My 1st Example Query = 1st search result, 2nd search result, 3rd result and so on.
My 2nd Example Query = 1st search result, 2nd search result, 3rd result and so on.
It should be easy to code, but it might be time-consuming, and I would be very grateful if any of you could help me with it.
Looking forward to your help guys.
The problem is that you want to scrape from inside Spreadsheets; that's a bad approach and is almost certainly not going to work. Even if you manage to write a scraper inside that limited environment, it will easily be spotted by Google.
Since you said time is not a problem, I would suggest another route:
Use a backend tool/script that scrapes the data
Use a backend tool/script that creates/modifies the Google spreadsheet
You can run such scripts manually on your PC, or fully automated from a server using a scheduler/cron job.
To create/modify spreadsheets look here: How do I access the Google Spreadsheets API in PHP?
To scrape Google look here: Is it ok to scrape data from Google results?
Those examples use PHP as the language of choice, but you can do exactly the same in Java, Python, or C#.
There are also third-party solutions like SerpApi you could use for this. It's a paid API with a free trial.
Google Sheets Add-on: SerpApi - Search Engine Results and Ranks
Example code to extract title from the first result:
=SERPAPI_RESULT("engine=google&q=coffee&location=Austin, Texas, United States&google_domain=google.com&gl=us&hl=en", "organic_results.0.title")
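If you'd rather fill all ten result columns per keyword in one pass instead of chaining formulas, a rough Apps Script sketch against SerpApi's JSON endpoint could look like the following. The tab name and column layout are assumptions, and you need your own api_key:

function fillSerpResults() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Keywords'); // placeholder tab name
  const keywords = sheet.getRange('A2:A').getValues()
    .map(function (r) { return r[0]; })
    .filter(String);

  keywords.forEach(function (keyword, i) {
    const url = 'https://serpapi.com/search.json?engine=google' +
                '&q=' + encodeURIComponent(keyword) +
                '&num=10&api_key=YOUR_SERPAPI_KEY';              // placeholder API key
    const data = JSON.parse(UrlFetchApp.fetch(url).getContentText());
    const links = (data.organic_results || []).slice(0, 10)
      .map(function (r) { return r.link; });
    if (links.length) {
      // Write the links across columns B onwards of the keyword's row.
      sheet.getRange(i + 2, 2, 1, links.length).setValues([links]);
    }
  });
}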
