I have filled a Google spreadsheet with around 10 URLs and XPaths. After discovering that IMPORTXML has some drawbacks (it gets stuck in perpetual loading errors, even with only one or so functions running), I am looking for another way to populate the sheet.
My first solution was to delete the IMPORTXML formulas and implement a write-and-repeat macro.
However, after about two months, all the sheets suddenly stopped working.
Is there a way to solve this with a script?
Here is a sample sheet that has stopped working:
https://docs.google.com/spreadsheets/d/10Ljo7SESFdGj1Xc7U_Tg3gljHhfqbuyVV5Whulezyv4/edit?usp=sharing
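For reference, a script-based replacement could look something like the minimal Apps Script sketch below. It assumes the URLs sit in column A starting at row 2 and the extracted values go into column B; the sheet name and the regex are placeholders, since Apps Script has no built-in XPath evaluator for arbitrary HTML, so you would swap the regex for whatever extraction matches your pages.

function refreshData() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1'); // placeholder sheet name
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return; // nothing to do
  var urls = sheet.getRange(2, 1, lastRow - 1, 1).getValues(); // URLs in column A
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i][0];
    if (!url) continue;
    var html = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();
    // No XPath here: extract with a regex instead (placeholder pattern, grabs the page title).
    var match = html.match(/<title>([^<]*)<\/title>/i);
    sheet.getRange(i + 2, 2).setValue(match ? match[1] : 'not found'); // result into column B
  }
}

Running that from a time-driven trigger writes plain values on a schedule, so nothing in the sheet is left perpetually loading.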
Related
For example, I've tried this and it works fine. The problem is that it loads my entire Steam library of around 2,700 items. I can't figure out how to get statistics for just one specific game, like its name or the number of hours I've played.
=IMPORTXML("https://api.steampowered.com/IPlayerService/GetOwnedGames/v0001/?key=STEAMAPIKEYHERE&steamid=STEAMIDHERE&format=xml&include_appinfo=1"; "//response/games/message/name")
I use GOOGLEFINANCE() to query the historical USD/GBP exchange rate at a set of fixed dates.
This works fine, except sometimes GOOGLEFINANCE returns #N/A, for whatever temporary upstream reason. When this happens, my spreadsheet becomes filled with #REF's, for all cells that depend on these exchange rates. The sheet is unreadable until the upstream data source is fixed. This can sometimes take hours.
This happens frequently and is especially annoying since I'm not using GOOGLEFINANCE to retrieve time-varying data. I'm just using it as a static dataset of historical exchange rates, so theoretically I have no reason to refresh the data at all.
Is there a way to locally cache the historical exchange rates in my sheet, and to fall back on those values if GOOGLEFINANCE returns #N/A?
(Bonus points for automatically updating the cached values if GOOGLEFINANCE changes its mind about the historical exchange rates.)
I know this is an old post and you probably don't care anymore, but I was having issues with my triggers updating my assets page every night - the totals would fail if any one stock price had an error.
I created a custom function which caches the GOOGLEFINANCE results, so it reverts to the last valid data point if GOOGLEFINANCE() fails.
However, this led to the custom function's Achilles' heel, 'Loading...', which came up occasionally as well. So I then modified it to update via triggers, using my new custom function code, which never fails.
I made it an open-source project, with one file you need to add to your Apps Script project.
Using it as a custom function would be something like:
=CACHEFINANCE(symbol, attribute, defaultValue)
For example:
=CACHEFINANCE("TSE:ZTL", "price", GOOGLEFINANCE("TSE:ZTL", "price"))
However, if you follow the instructions to create a trigger, it is far more reliable. It also has a built-in web scraper to track down info on stocks that GOOGLEFINANCE refuses to collect data for.
github cachefinance
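For anyone who just wants the core idea rather than the full project: a custom function can cache the last good GOOGLEFINANCE value and serve it back when the live lookup fails. A minimal sketch, assuming CacheService is an acceptable store and numeric attributes only (the real CACHEFINANCE project does considerably more than this):

function CACHEFINANCE(symbol, attribute, liveValue) {
  var key = symbol + '|' + attribute;          // e.g. "TSE:ZTL|price"
  var cache = CacheService.getScriptCache();
  // If the sheet handed us a usable number from GOOGLEFINANCE, cache it and return it.
  if (liveValue !== '' && liveValue !== null && !isNaN(liveValue)) {
    cache.put(key, String(liveValue), 21600);  // keep it for 6 hours (CacheService maximum)
    return Number(liveValue);
  }
  // Otherwise fall back to the last cached value; blank cell if nothing is cached yet.
  var cached = cache.get(key);
  return cached !== null ? Number(cached) : null;
}

From the sheet you would call it as =CACHEFINANCE("TSE:ZTL", "price", IFERROR(GOOGLEFINANCE("TSE:ZTL", "price"), "")) so that a failed GOOGLEFINANCE hands the function an empty string instead of an error. CacheService entries expire after at most six hours, so for a longer-lived fallback you would swap in PropertiesService or a hidden sheet.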
Well, you are working with historical data, i.e. data that won't change no matter what, so you can get the values you need and just hardcode them, i.e. get rid of GOOGLEFINANCE for good.
Another way would be to wrap anything that could produce #REF! in IFERROR, so when the blackout occurs you get a nice blank sheet instead of a sea of #REF! errors.
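For example, something along these lines in each rate cell (the currency pair and date are just illustrative, and the INDEX(...,2,2) is there because a historical GOOGLEFINANCE lookup returns a small table with a header row):

=IFERROR(INDEX(GOOGLEFINANCE("CURRENCY:USDGBP", "price", DATE(2020,1,15)), 2, 2), "")

The last argument of IFERROR could equally point at a cell holding a previously copied static rate, so the sheet falls back to a cached value instead of a blank.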
I have a sheet with a Google Finance lookup:
=googlefinance("USDZAR")
and a custom function that returns a constant string (abc). It doesn't take any parameters:
=test()
See here
Google Drive keeps syncing this sheet to my computer every 5-10 mins:
No actual content is being synced, since the synced Sheet files are only 176 bytes in size; they must just be references to the cloud data at Google:
I've compared subsequent files and they are identical.
Also, the Drive API keeps generating change events for this file every few minutes (https://developers.google.com/drive/api/v3/reference/changes/watch)
It's definitely the combination of the GOOGLEFINANCE lookup and the custom function; either one on its own doesn't cause this.
Does anyone know how I can fix this? It seems like a bug?
it's a "future" of GOOGLEFINANCE() to update in given intervals. it has nothing to do with your custom function.
and if GOOGLEFINANCE() constantly updating then it constantly syncing to your PC
you can try =GOOGLEFINANCE("USDZAR"; "daily") if that will do the trick, othervise you will need somehow to freeze GOOGLEFINANCE() formula
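One way to freeze it is a small Apps Script function that inserts the live formula just long enough to read the result, then overwrites the cell with the static number, so no live GOOGLEFINANCE formula remains in the sheet between runs. A minimal sketch, assuming the rate lives in A1 of a sheet named 'Rates' (placeholder names; GOOGLEFINANCE occasionally won't have finished calculating by the time the value is read, in which case this run just leaves the formula in place):

function freezeRate() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Rates'); // placeholder sheet name
  var cell = sheet.getRange('A1');
  cell.setFormula('=GOOGLEFINANCE("USDZAR")'); // insert the live formula temporarily
  SpreadsheetApp.flush();                      // force recalculation
  var value = cell.getValue();
  if (typeof value === 'number' && !isNaN(value)) {
    cell.setValue(value);                      // replace the formula with its static result
  }
}

Run it from a daily time-driven trigger and point the rest of the sheet at that static value; since the stored number only changes once a day, Drive has far less to sync.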
I'm using the Google Sheets API to update a spreadsheet every day.
For two months it worked perfectly, but suddenly the updates stopped working.
I tried to run it on my machine and received this error in the JSON response:
"This document is too large to continue editing."
I tried some things to fix it, but without success. As you can see, it claims the document is too large, but the document was the same size as always. I tried to update the document writing only ONE LINE and received the same error...
I also tried creating another spreadsheet to receive the complete update (the one that is supposedly "too large"), and it worked perfectly... so my code works on the other spreadsheet but not on the first one...
I also tried to edit the first spreadsheet manually, but I can't: the spreadsheet stays in the 'saving' or 'reconnecting' state and never saves.
Well... yes, I can use the second spreadsheet that is working, but the problem will probably come back, so I would like to resolve this permanently.
Can somebody help me?
I want to retrieve all stocks from a few exchanges, i.e. the stocks listed on those exchanges (taken from http://www.nasdaq.com/screening/company-list.aspx).
Then I will request quotes for all those stocks from Google or Yahoo.
My question is: if I request quotes for all of them every 5 or 10 seconds, will they block me?
What is the correct way to get all the stocks and their updated data?
Thanks!
David,
tl;dr - Yahoo Finance is OK with scraping 2,000 stocks if you insert pauses in your code.
I have some clumsy but working code (my first attempt at scraping) that pulls some data from Yahoo Finance. While I don't like the code and will rewrite it for nasdaq.com in the following weeks, I can tell you that I'm not getting blocked.
I have a few-years-old list of stocks for the Russell 2000, so there are around 2,000 tickers I'm slowly going through, pulling some data from the balance sheet. I'm using Selenium (see my question history; there is only one question, where you can see/get the working code). The code loads the Chromium web browser (on Linux), clicks on Balance Sheet, scrapes some data, clicks the Quarterly link, scrapes more data and then closes the browser - for every ticker (stock).
Just to be on the safe side, I put several pauses into my code: for every scrape or navigation on the site I added between 5 and 10 seconds. That way I'm slowly scraping the data and Yahoo seems to be OK with it :-) It takes about one minute per ticker. I've now been running this scrape job (for the first time!) for over 30 hours, lol, and I'm currently at a ticker that starts with T, so I have a few more hours to go.
I have read somewhere that some sites can spot this kind of slow scraping too. So as an idea, instead of a hard-coded pause of, say, 7 seconds, you could use a random number generator for somewhere between 7 and 15 seconds; that way the pauses will be more random and less likely to be spotted... Just a thought. Hope this helps a little bit, even if with some delay.
Ah, and if this answer does help you, please be so kind as to mark it as solved and upvote it. Maybe I can get a point or two for it. My points are so low I can't even vote on other posts that I like and that helped me.