How to locally cache GOOGLEFINANCE results? - google-sheets

I use GOOGLEFINANCE() to query the historical USD/GBP exchange rate at a set of fixed dates.
This works fine, except sometimes GOOGLEFINANCE returns #N/A, for whatever temporary upstream reason. When this happens, my spreadsheet becomes filled with #REF's, for all cells that depend on these exchange rates. The sheet is unreadable until the upstream data source is fixed. This can sometimes take hours.
This happens frequently and is especially annoying since I'm not using GOOGLEFINANCE to retrieve time-varying data. I'm just using it as a static dataset of historical exchange rates, so theoretically I have no reason to refresh the data at all.
Is there a way to locally cache the historical exchange rates in my sheet, and to fall back on those values if GOOGLEFINANCE returns #N/A?
(Bonus points for automatically updating the cached values if GOOGLEFINANCE changes its mind about the historical exchange rates.)

I know this is an old post and you probably don't care anymore, but I was having issues with my triggers updating my assets page every night - the totals would fail if any one stock price had an error.
I created a custom function which caches the GOOGLEFINANCE() results, so it reverts to the last valid data point if GOOGLEFINANCE() fails.
However, this led to the custom function's Achilles heel, 'Loading...', which came up occasionally as well. So I then modified it to update via triggers, using my new custom function code, which never fails.
I made it an open source project, with one file you need to add to your Apps Script project.
Using it as a custom function would be something like:
=CACHEFINANCE(symbol, attribute, defaultValue)
For example:
=CACHEFINANCE("TSE:ZTL", "price", GOOGLEFINANCE("TSE:ZTL", "price"))
However, if you follow the instructions to create a trigger, it is far more reliable. It also has a built-in web scraper to track down info on stocks GOOGLEFINANCE refuses to collect data for.
GitHub: cachefinance
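For reference, a minimal sketch of the caching idea looks like the function below. This is illustrative only, not the actual cachefinance code (the real project does a lot more), and CACHEDRATE and its arguments are made-up names. It stores the last good number in document properties and returns it whenever the live value is missing:

/**
 * Return the live value if it is a valid number and cache it; otherwise
 * return the last cached value stored under this key.
 * Usage (the IFERROR keeps an upstream #N/A from short-circuiting the call):
 * =CACHEDRATE("USDGBP", IFERROR(GOOGLEFINANCE("CURRENCY:USDGBP", "price")))
 * @customfunction
 */
function CACHEDRATE(key, liveValue) {
  var props = PropertiesService.getDocumentProperties();
  if (typeof liveValue === 'number' && isFinite(liveValue)) {
    props.setProperty(key, String(liveValue)); // refresh the cached copy
    return liveValue;
  }
  var cached = props.getProperty(key);         // fall back to the last good value
  return cached !== null ? Number(cached) : 'no cached value yet';
}

The same idea can also be driven from a time-based trigger instead of a cell formula, which is roughly how the project above dodges the 'Loading...' problem.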

Well, you are working with historical data, i.e. data that won't change no matter what, so you can fetch the values you need once, hard-code them, and get rid of GOOGLEFINANCE for good.
Another way would be to wrap anything that can produce #REF! in IFERROR, so when a blackout occurs you get a clean blank sheet instead of a sea of #REF! errors.
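For example (B2 here stands for any cell where you keep a hard-coded copy of that rate as a fallback - the cell reference is just an illustration):
=IFERROR(INDEX(GOOGLEFINANCE("CURRENCY:USDGBP", "price", DATE(2021,1,15)), 2, 2), B2)
Drop the second argument of IFERROR if you would rather see a blank cell than a stale value during a blackout.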

Related

Rails saving and/or caching complicated query results

I have an application that, at its core, is a sort of data warehouse and report generator. People use it to "mine" through a large amount of data with ad-hoc queries, produce a report page with a bunch of distribution graphs, and click through those graphs to look at specific result sets of the underlying items being "mined." The problem is that the database is now many hundreds of millions of rows of data, and even with indexing, some queries can take longer than a browser is willing to wait for a response.
Ideally, at some arbitrary cutoff, I want to "offline" the user's query, and perform it in the background, save the result set to a new table, and use a job to email a link to the user which could use this as a cached result to skip directly to the browser rendering the graphs. These jobs/results could be saved for a long time in case people wanted to revisit the particular problem they were working on, or emailed to coworkers. I would be tempted to just create a PDF of the result, but it's the interactive clicking of the graphs that I'm trying to preserve here.
None of the standard Rails caching techniques really captures this idea, so maybe I just have to do this all by hand, but I wanted to check to see if I wasn't missing something that I could start with. I could create a keyed model result in the in-memory cache, but I want these results to be preserved on the order of months, and I deploy at least once a week.
Presumably the data is being loaded from lots of joined tables - that's why it takes so long to load.
You are also performing calculation/visualization work on the data you fetch from the DB before showing it in the UI.
I'd recommend some approaches to your problem:
Minimize the number of joins/nested joins in your DB queries.
Add some denormalized tables/columns. For example, if you are showing the count of a user's comments, you can add a new column to the users table to store that count directly, and keep it up to date with a scheduled job or a callback.
Also try to minimize any calculations performed on the UI side.
You can also use lazy loading to fetch the data in chunks.
Thanks - hope this helps you decide where to start 🙂

Google Sheet with Google Finance lookup and custom function causes Google Drive to constantly resync

I have a sheet with a Google Finance lookup:
=googlefinance("USDZAR")
and a custom function that returns a constant string (abc). It doesn't take any parameters:
=test()
Google Drive keeps syncing this sheet to my computer every 5-10 minutes.
No actual content is being synced, since the Sheet files are only 176 bytes in size - they must just be references to the cloud data at Google.
I've compared subsequent files and they are identical.
Also, the Drive API keeps generating change events for this file every few minutes (https://developers.google.com/drive/api/v3/reference/changes/watch)
It's definitely the combination of the Google Finance and custom function - either separately doesn't cause this.
Does anyone know how I can fix this? It seems like a bug?
it's a "future" of GOOGLEFINANCE() to update in given intervals. it has nothing to do with your custom function.
and if GOOGLEFINANCE() constantly updating then it constantly syncing to your PC
you can try =GOOGLEFINANCE("USDZAR"; "daily") if that will do the trick, othervise you will need somehow to freeze GOOGLEFINANCE() formula
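One way to freeze it is a small Apps Script function run from a time-driven trigger: it writes the GOOGLEFINANCE() formula into a scratch cell, reads the result, then replaces it with the plain number so no live formula is left in the sheet between runs. This is a sketch only - the sheet name 'Rates' and the cells Z1/A1 are assumptions; point the rest of your sheet at the static cell:

function refreshUsdZar() {
  var sheet = SpreadsheetApp.getActive().getSheetByName('Rates'); // assumed sheet name
  var scratch = sheet.getRange('Z1');                             // temporary cell for the live formula
  scratch.setFormula('=GOOGLEFINANCE("USDZAR")');
  SpreadsheetApp.flush();                                         // force the formula to evaluate
  var live = scratch.getValue();
  scratch.clearContent();                                         // remove the live formula again
  if (typeof live === 'number' && isFinite(live)) {
    sheet.getRange('A1').setValue(live);                          // static value the rest of the sheet references
  }
}

Run it from a clock trigger (Triggers > Add Trigger in the Apps Script editor) at whatever interval you like; since the file then only changes when the script runs, Drive should stop re-syncing it every few minutes.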

UITableView + Large web data source

I'm using a UITableView which hooks into a rest API.
On first launch the app retrieves the data the UITableView will display and parses it in to a Core Data database.
This works fine for small datasets, but when the dataset grows above 300-500 items it does not perform very well, taking minutes to finish downloading and parsing. The app isn't deadlocked during this time, but the user likely won't wait for the parsing to complete.
I then decided to use paging. So now, I only retrieve the latest 20 items, and then the user can click "Load more" to go back further. The data is cached.
This seems to work well except for one problem.
Because I'm not downloading all the data on each load, I cannot tell when an item has been deleted on the server and I cannot tell when an item has changed (say the title may have changed).
Can anyone provide me with any suggestions to resolve this?
Thanks.
We routinely request a similar number of items and display them in a table view. However, in our case the API returns JSON and we store it in model objects, not Core Data. Once the data is downloaded, it takes less than a second to go from JSON to appearing in the table. Core Data is a bad idea for anything that isn't actually a database, or that isn't preserved past a single user session. But you need to identify which part of your transaction is actually taking the most time. In our case it's the backend behind the API, but once the data shows up, everything is quite fast.
Also, in our case the data is around 700K and we are going to GZIP it soon to minimize the network time even further.

Data sync between database and google calendar

I would like to sync my DB (tasks in my DB, which have a description, a date, a start time, an end time, and a user) with Google Calendar.
To sync with Google I plan to use these components (of course I could write the whole thing myself, but that is something I can only plan for the future as I am short of time now; alternatively, can you suggest some working code that connects to Google Calendar to send/receive data?).
Now, my main problem is not really linked to Delphi programming, but I must ask it as a Delphi-related question because other questions go unviewed (like this one I asked).
So I wonder how to do the sync. Note: I do a one-way sync and the generated calendar will be read-only.
I can set a maximum range in the past and future to be synced (for example 10 days in the past and 100 in the future). Then the idea I have is this:
When I start the sync app, I read all the Google Calendar items in the range, compare them one by one with what I have in the DB, and then "merge" the changes. Then, on a timer, I check for differences in my DB and upload the changes.
But I am not sure this is the best solution.
A simplification of the real case is this: imagine a CRM with tasks assigned to every user. Since there is logic behind every task, I want to manage that logic only in my application, but the point of publishing the calendar to Google is that it is then easily available from any mobile device. That is why it is a one-way sync. I could also make the calendar writable; in that case, at every sync I would "download" newly inserted tasks but ignore deleted and edited ones. In this second case it is not enough to track changes in the DB; I should also track changes on Google, at least to "intercept" the newly added tasks.
I am aware this is a generic question, but I would like to trigger an answer that can be useful, either redirecting me to a sync algorithm or to Delphi sample code or anything else that can help me make progress on this issue. Thanks.
Google: "calendar sync algorithms"
https://wiki.mozilla.org/Calendar:Syncing_Algorithm
http://today.java.net/pub/a/today/2007/01/16/synchronizing-web-client-database.html
Synchronisation algorithms
The last one is actually funny because it leads right back to Stack Overflow ;) Point is: I think there is no need to reinvent the wheel. PS: The first link contains some useful thoughts similar to yours.

How do you track page views on a view

Is there a plugin or gem for this that I can use? I was thinking about just writing to a table when a view is called in the controller. Is this the best way? I see Stack Overflow has this functionality - how do they do it?
Google Analytics - Let Google or some other third-party analytics provider handle it for you for free. I don't think you want to do file writes on every page load - potentially costly. Another option is to store the information in memory and write to the database periodically instead of on every page load.
[EDIT] This is an interesting question. I asked for help on this issue of what's more efficient - db writes vs file writes - there's some good feedback there too.
If you just want to get something in there easily, you could use a real-time analytics provider like W3 Counter.
It gives you real-time data (as opposed to Google Analytics) and is relatively simple to deploy (a few lines in your global template), but may not give you the granularity that you want. I guess it depends on whether you want this information programmatically to display/use in the app, or just for statistical purposes.
Obviously, there are third party statistics services (Google Analytics, Mint, etc...), but if you must do it yourself then doing a write each time someone hits a page will seriously impact your DB.
I'd write individual hits to an intermediate file on the filesystem or memcached, then fire a task every 10 - 15 minutes that will parse that data and insert it into the database.
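A minimal sketch of that buffer-and-flush idea, written in plain JavaScript just for illustration (the original question is about Rails, and flushHitsToDatabase is a hypothetical placeholder for whatever bulk insert your stack provides):

var pendingHits = [];

function recordPageView(path) {
  // Cheap in-memory append on every request instead of a DB write per hit.
  pendingHits.push({ path: path, at: new Date() });
}

function flushHitsToDatabase(hits) {
  // Placeholder: replace with a single bulk INSERT of all buffered hits.
  console.log('flushing ' + hits.length + ' page views');
}

// Every 10 minutes, drain the buffer and write it out in one batch.
setInterval(function () {
  if (pendingHits.length > 0) {
    flushHitsToDatabase(pendingHits.splice(0, pendingHits.length));
  }
}, 10 * 60 * 1000);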
