The following code (adapted from elsewhere):
function importCSVFromWeb() {
var csvUrl = "https://drive.google.com/uc?export=download&id=xxxxxxxxxxxxxxxxxxxxxxxxxx";
var csvContent = UrlFetchApp.fetch(csvUrl).getContentText();
var csvData = Utilities.parseCsv(csvContent);
var ss = SpreadsheetApp.getActiveSpreadsheet();
var sheet = ss.getSheetByName('data');
sheet.clear();
sheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}
It executes quickly until the setValues() call on the last line:
[19-12-07 13:45:09:093 PST] Starting execution
[19-12-07 13:45:10:463 PST] UrlFetchApp.fetch([https://drive.google.com/uc?export=download&id=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx]) [1.36 seconds]
[19-12-07 13:45:10:538 PST] UrlFetchApp.HTTPResponse.getContentText() [0.074 seconds]
[19-12-07 13:45:10:976 PST] Utilities.parseCsv([ColAHeader,ColBHeader,ColCHeader,ColDHeader,ColEHeader,ColFHeader,ColGHeader,ColHHeader,ColIHeader,ColJHeader,ColKHeader,ColLHeader,ColMHeader
aaa,bbb,ccc,ddd,eee,fff,ggg,hhh,...) [0.426 seconds]
[19-12-07 13:45:11:504 PST] SpreadsheetApp.getActiveSpreadsheet() [0 seconds]
[19-12-07 13:45:11:715 PST] SpreadsheetApp.Spreadsheet.getSheetByName([data]) [0.21 seconds]
[19-12-07 13:45:11:716 PST] SpreadsheetApp.Sheet.clear() [0 seconds]
[19-12-07 13:45:11:716 PST] SpreadsheetApp.Sheet.getRange([1, 1, 106974, 13]) [0 seconds]
[19-12-07 13:51:23:063 PST] SpreadsheetApp.Range.setValues([[[ColAHeader,ColBHeader,ColCHeader,ColDHeader,ColEHeader,ColFHeader,ColGHeader,ColHHeader,ColIHeader,ColJHeader,ColKHeader,ColLHeader,ColMHeader], [aaa, bbb, ccc, ddd...) [370.621 seconds]
[19-12-07 13:51:23:241 PST] Execution failed: Exceeded maximum execution time [373.963 seconds total runtime]
Other info:
- The array comes from a parsed CSV. This array has 106974 rows and 13 columns.
- setValues() writes approximately 35,000 rows to the sheet 'data' before the script reaches the execution limit.
The timeout is the reason for wanting to split the array. I'd like to process, say, 20,000 rows and then exit. Then create a new script that would process rows 20,001-40,000.
I have tried manually changing the third parameter of getRange() to 20,000, but I apparently don't understand the method well enough, because it gives me this error:
The number of rows in the data does not match the number of rows in the range. The data has 106974 but the range has 20000.
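Based on that error, my guess is that the array passed to setValues() also needs to be cut down to 20,000 rows so it matches the range, roughly like the sketch below (untested, and possibly not the right approach):
var batch = csvData.slice(0, 20000); // keep only the first 20,000 rows
sheet.getRange(1, 1, batch.length, batch[0].length).setValues(batch);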
Is there a way to split the array so I can process it in batches?
1. Approximately 35000 rows are successfully written, although you say "This array has 106974 rows and 13 columns." 2. If your goal is to put all CSV values into a Google Spreadsheet, although I'm not sure whether this is the direction you want, in your case, how about directly converting the CSV file to a Google Spreadsheet using the Drive API? If I misunderstood your question, I apologize. - Tanaike
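For reference, a minimal sketch of the Drive API approach Tanaike suggests, assuming the Advanced Drive service (Drive API v2) is enabled for the script project; the file title is illustrative, and note this creates a new spreadsheet from the CSV rather than filling the existing 'data' sheet:
function convertCSVToSheet() {
  var csvUrl = "https://drive.google.com/uc?export=download&id=xxxxxxxxxxxxxxxxxxxxxxxxxx";
  var csvBlob = UrlFetchApp.fetch(csvUrl).getBlob();
  // convert: true asks Drive to import the CSV as a native Google Sheet
  var file = Drive.Files.insert({title: "imported data"}, csvBlob, {convert: true});
  Logger.log(file.id);
}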