
The problem I'm running into is that I'm hitting a quota when processing my spreadsheets. I process a batch of spreadsheets each day, and after I added a new system that sends my Google Apps Script more spreadsheets to process, I get the error:

Limit Exceeded DriveApp

The line it always fails on is orderedCsv.getBlob().getDataAsString(), where orderedCsv is the file currently being processed.

My questions are: 1. Which quota could I be hitting? 2. How can I check my current quota usage?

I think it could be the Properties read/write quota being exceeded, since I import the original data, which can be anywhere from 3,000 to 9,000 lines.

The error transcript it gives me is:

Error Transcript Pastebin

function ps_csvsToSheet ( currSheet, sheetCsvs, csvDict, sheetN, sheetOrderIndex){
  // import csvs into the sheet with formatting 
  lib_formatSheet(currSheet);

  var row = 39;
  var orderedCsv; 
  // loop for importing CSVs into one sheet in the order we want
  for (var i = 0; i < ps_statOrdering.length; i++) {
    // loop through all the sheets stored in a dictionary we created before
    for (var j = 0; j < sheetCsvs.length; j++) {
      var sheetName = sheetCsvs[j].getName();
      // additional test to ensure we match Draw and not DrawCalls
      if ( ps_statOrdering[i] == 'Draw') {  
        if ( sheetName.indexOf(ps_statOrdering[i]) !== -1 && sheetName.indexOf('DrawCalls') == -1) {
          orderedCsv = sheetCsvs[j];
          break;
        }
      } else if ( sheetName.indexOf(ps_statOrdering[i]) !== -1) {
        orderedCsv = sheetCsvs[j];
        break;
      }
    }

  try {
    // import the CSV data for this spreadsheet
    var strData = orderedCsv.getBlob().getDataAsString(); //**********[Line it ends on]***********
    var importedData = lib_importCSV(row+1, 1, strData, currSheet);
  }
  catch(error) {
    Logger.log("Catch Error : " + error);
    return;
  }
  // make formatting [][] for the importedData. Here we are working off
  // of pre-knowledge of what is expected 
  var nRows = importedData['rows'];
  var nCols = importedData['cols'];
  var c;
  var weightArr = new Array(nRows);
  var numFormatArr = new Array(nRows);
  for (var r = 0; r < nRows; r++) {  
    weightArr[r] = new Array(nCols);
    numFormatArr[r] = new Array(nCols);
    if (r == 0) {
      c = nCols;
      while(c--) { 
        weightArr[r][c] = "bold"; 
        numFormatArr[r][c] = '';
      }
    } else {
      c = nCols;
      while(c--) { 
        weightArr[r][c] = "normal"; 
        numFormatArr[r][c] = '0.00';
      }
      weightArr[r][0] =  "bold";
      numFormatArr[r][0] = '';
      if( sheetOrderIndex !== -1) {
        numFormatArr[r][0] =  'MMM.dd';
      }
    }
  }
  importedData['range'].setFontWeights(weightArr)
                       .setNumberFormats(numFormatArr);

  // Create the header of the sheet
  lib_inputSheetHeader(currSheet, row, nCols, (sheetN + " " + ps_statOrdering[i]
    + " Averages"), ps_profileColors[0]);
  // insert appropriate graph 
  var key = ps_statOrdering[i];
  if( sheetOrderIndex !== -1) {
    // this is a setting trend sheet, line chart
    lib_makeLineChart(importedData['range'], ps_statLocDict[key][0], ps_statLocDict[key][1],
                  (sheetN + " " + ps_statOrdering[i] ), currSheet, 
                  ps_statVRange[key][0], ps_statVRange[key][1], ps_statAxisDict[key]);
  } else { 
    // this is a map sheet, bar chart  
    // debugPrint(importedData['range'].getValues().toString());
    lib_makeBarChart(importedData['range'], ps_statLocDict[key][0], ps_statLocDict[key][1],
                 (sheetN + " " + ps_statOrdering[i] ), currSheet, 
                 ps_statVRange[key][0], ps_statVRange[key][1], ps_statAxisDict[key]);
  }     
  row += importedData['rows'] +2;
  } // for loop close, import csv ~~~~  
  Utilities.sleep(1000); // Apps Script has no global sleep(); use Utilities.sleep
  SpreadsheetApp.flush();
}
You can hit whatever quotas exist for whatever methods you use. You provide no code, so how are we supposed to know which specific limits you are hitting? The Script Editor has links to the existing quota values. – tehhowch

Thanks for the edit suggestions. I added the error transcript to the question. I think we are doing too many linear searches to find the files we want, which causes the DriveApp error. – Michael Isaza

How big are the files that you are getting as strings? Do you get the same issue if you use the Advanced Service? What happens if you wrap your call in a try/catch block and sleep for a couple of seconds before trying again? Because you don't post the relevant code, we can't offer more specific suggestions w.r.t. avoiding the error. – tehhowch

I'll see if they'll let me post a snippet of the code, but the files range from 1 KB to 2100 KB. I'll add the try/catch around the block and see if that helps. Thank you! – Michael Isaza
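The try/catch-and-sleep suggestion from the comments can be sketched as a small retry-with-backoff helper. This is a generic Python sketch (flaky_read is a hypothetical stand-in for the failing getDataAsString() call); in Apps Script the same loop would wrap the call in try/catch and use Utilities.sleep() instead of time.sleep().

```python
import time
import random

def with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Retry fn(), sleeping with exponential backoff between failures."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; let the caller see the error
            # back off 1s, 2s, 4s, ... plus a little jitter
            time.sleep(base_delay * (2 ** attempt) + random.random() * 0.1)

# Hypothetical flaky call standing in for orderedCsv.getBlob().getDataAsString():
# it fails twice with a quota-style error, then succeeds.
calls = {"n": 0}
def flaky_read():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("Limit Exceeded: DriveApp")
    return "csv,data"
```

The jitter avoids many scripts retrying in lockstep; tune base_delay and max_attempts to how aggressive the quota is.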

1 Answer


What I did was offload a lot of the calculations to Python with pandas. That way I can import a data sheet that is already pre-formatted and only run a couple of operations on it in Google Apps Script, which saves execution time.

The code I used to make this work is fairly large because of the specific data operations I had to do. Here is a quick summary of the Python side:

import os
import pandas as pd
import numpy as np                        # used in case we need np.nan in our data

class DataProcessing():

    def __init__(self):
        # pd.DataFrame.from_csv is deprecated; use pd.read_csv instead
        rawData = pd.read_csv( **<enter path to csv>** )

        # From here I run operations on the DataFrame named rawData
        # until it matches what I need it to look like.
        # PyCharm is a Python IDE that can help you visualize the
        # DataFrame through breakpoints.

        # After I'm done modifying the DataFrame, I send it to my Google Drive.
        # If you install the Google Drive client on your PC, you can write it
        # to a folder that auto-syncs files to your Google Drive.
        rawData.to_csv(os.path.join(**<GoogleFolder Path>**, csvName))

Learning pandas is a little tricky at the start, but here is a resource that helped me shape my data correctly:

https://github.com/pandas-dev/pandas/blob/master/doc/cheatsheet/Pandas_Cheat_Sheet.pdf
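A minimal, runnable sketch of the pipeline described above (the function name, the averaging step, and the paths are all hypothetical illustrations; the real script does whatever reshaping the sheet needs):

```python
import os
import pandas as pd

def preprocess_csv(in_path, out_dir, csv_name):
    """Read a raw CSV, do the heavy reshaping in pandas, and write the
    result to a folder (e.g. one synced by the Google Drive client)."""
    raw = pd.read_csv(in_path)
    # Example transformation: average each numeric column, so the
    # Apps Script side only has to import and format the result.
    summary = raw.mean(numeric_only=True).to_frame(name="average")
    out_path = os.path.join(out_dir, csv_name)
    summary.to_csv(out_path)
    return out_path
```

Doing the aggregation here means the Apps Script side imports a few summary rows instead of 3,000-9,000 raw lines, which is what saves the execution time.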

Hope this helps!