I am using a Google spreadsheet to collaborate on some common data, processing that data with spreadsheet macros, and saving the results back into cells. This approach is error-prone because the macro functions that process the data depend on inputs beyond what is passed as parameters.
Example of how the common data looks (screenshot of the Pages sheet):

Explanation of this common data:
- The three sheets are filled in by various collaborators.
- The Content sheet defines the base elements of a page; they are referred to (by their UUIDs) in the Sections sheet (via macros), and finally all the sections are combined into a publishable HTML page.
- The output HTML format varies by destination, of which there are several: static HTML, Google Docs, Google Slides, WordPress, MediaWiki, etc. The formatting is done with GAS macros.
I tried various solutions, but nothing worked properly. Finally I have decided to keep the Google spreadsheet as a data source only and move the entire computation to my local machine. This means I have to download the data of all three sheets, which is currently around 1,000 rows but may grow considerably over time.
Earlier, when the entire computation ran on Google Sheets, I only had to fetch the final output, which amounted to far fewer API calls. Referring to the example above, I would only fetch the output HTML from the Pages sheet.
Q1) Given that I plan to move the entire computation to my local machine, if I make only 3 API calls to bulk-fetch the data of the three sheets, does that still count as just 3 API calls, or do large requests have a different API cost? Are there any hidden risks of overrunning the quota with this approach?
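To make the question concrete, here is roughly what I mean by the bulk fetch, a sketch using google-api-python-client and the `spreadsheets.values.batchGet` endpoint (the spreadsheet ID, sheet names, and column span are placeholders for my actual data):

```python
# Placeholder sheet names; my real sheets are Content, Sections, and Pages.
SHEET_NAMES = ["Content", "Sections", "Pages"]

def build_ranges(sheet_names, last_col="Z"):
    """A1-notation ranges covering every row of each sheet."""
    return [f"{name}!A:{last_col}" for name in sheet_names]

def fetch_all(service, spreadsheet_id):
    """Fetch all three sheets in a single batchGet request.

    `service` is a Sheets API client, e.g.
        service = googleapiclient.discovery.build("sheets", "v4", credentials=creds)
    This is one request regardless of how many rows come back.
    """
    result = service.spreadsheets().values().batchGet(
        spreadsheetId=spreadsheet_id,
        ranges=build_ranges(SHEET_NAMES),
    ).execute()
    # Map "SheetName!A1:Z123" -> list of rows (missing "values" means empty sheet).
    return {vr["range"]: vr.get("values", []) for vr in result["valueRanges"]}
```

So the question is really whether this counts as one (or three) requests against the quota, or whether the volume of returned rows matters.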
Q2) If I use hooks, both spreadsheet hooks and Drive hooks, to sync with real-time changes, what is my risk of running out of quota? Does each hook invocation count as an API call? Any general advice on best practices for doing this?
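For context, this is the kind of hook I have in mind: a Drive push-notification channel registered via `files.watch`. A sketch (the file ID and callback URL are placeholders; `drive_service` would be a Drive v3 client built with my credentials):

```python
import uuid

def make_channel_body(callback_url):
    """Channel descriptor for a Drive push-notification ("hook") channel.

    callback_url must be an HTTPS endpoint I control, where Google will
    POST a notification whenever the watched file changes.
    """
    return {
        "id": str(uuid.uuid4()),  # unique channel id chosen by the client
        "type": "web_hook",
        "address": callback_url,
    }

def register_watch(drive_service, file_id, callback_url):
    """Register the channel on the spreadsheet file.

    Registering is itself one API call; my question is whether the
    notifications delivered to the callback afterwards also count,
    and how the follow-up fetches I make in response add up.
    """
    return drive_service.files().watch(
        fileId=file_id,
        body=make_channel_body(callback_url),
    ).execute()
```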
Thanks