Question
I have designed a dashboard in Google Data Studio based on one Google Sheets file. The data that goes into the Google Sheets file comes from two .csv files delimited by ";". These .csv files are automatically updated at night and stored on Google Drive. For my dashboard to also update automatically based on the changes in the .csv files, I need to read them into the Google Sheets file automatically, which I do by triggering my script to run every night after the .csv files have been updated...
As I do not really know how to code beyond some basics, I built the following script from various sources around the internet. The nice thing is that it basically works:
// Replaces line breaks inside quoted fields with spaces so that
// Utilities.parseCsv does not split a record in the middle.
function parseCsv(csvString, delimiter) {
  var sanitizedString = csvString.replace(/(["'])(?:(?=(\\?))\2[\s\S])*?\1/g, function(e) {
    return e.replace(/\r?\n|\r/g, ' ');
  });
  return Utilities.parseCsv(sanitizedString, delimiter);
}
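To illustrate what the sanitizer does: it collapses line breaks that occur inside quoted fields, so one record does not get split into two rows. A quick demo (illustrative only, not part of the original script):

function demoParseCsv() {
  var raw = 'id;comment\n1;"line one\nline two"\n2;plain';
  var rows = parseCsv(raw, ';');
  Logger.log(rows); // [[id, comment], [1, line one line two], [2, plain]]
}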
//------------------------------------------------------------
function import_Sales() {
  var fileName = "exported_Sales.csv";
  var searchTerm = "title = '" + fileName + "'";
  var files = DriveApp.searchFiles(searchTerm);

  // Find the CSV file on Drive and read its contents as a string.
  var csvFile = "";
  while (files.hasNext()) {
    var file = files.next();
    if (file.getName() == fileName) {
      csvFile = file.getBlob().getDataAsString('ISO-8859-15');
      break;
    }
  }

  // Parse the semicolon-delimited CSV and write all rows to the sheet.
  var csvData = parseCsv(csvFile, ";");
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName('ExportSales');
  sheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
}
As already mentioned, I am importing 2 different .csv files, so there is not only the import_Sales function but also an import_Stock function.
For the stock CSV everything works really well (it looks exactly like the import_Sales() function above, apart from the names, of course).
The problem seems to be the size of my sales CSV file (23 columns x 56,000 rows), and the file gets longer every night when it is updated. So when I try to run the import_Sales function, after some minutes I get the error "maximum execution time"... I understand that this must have something to do with the size of the .csv or with the function (hopefully) being inefficient, and maybe one of you has an idea how it could run faster? The size of the .csv cannot be changed, and I can't imagine that it is impossible to import it into Google Sheets?!
Does anyone have an idea how I can manage to get the data from the CSV into Google Sheets automatically every night? Maybe I could skip the rows that are already in the Google Sheets file and just import the new lines from the CSV somehow, roughly like the sketch below? But that's about where my knowledge definitely ends, so I would be glad if you guys were able to help me!
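Here is a rough, untested sketch of what I mean (it assumes the CSV only ever grows by appending rows at the bottom; it still downloads and parses the whole file, but it skips rewriting the ~56,000 rows that are already in the sheet):

function import_Sales_incremental() {
  var fileName = "exported_Sales.csv";
  var file = DriveApp.getFilesByName(fileName).next();
  var csvData = parseCsv(file.getBlob().getDataAsString('ISO-8859-15'), ';');

  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('ExportSales');
  var lastRow = sheet.getLastRow(); // number of rows already imported

  // Append only the rows that are not yet in the sheet.
  var newRows = csvData.slice(lastRow);
  if (newRows.length > 0) {
    sheet.getRange(lastRow + 1, 1, newRows.length, newRows[0].length).setValues(newRows);
  }
}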
thanks & greetings!
Answer 1:
If it's already in your Drive, try copying the CSV and converting it to a Google Sheet, instead of "manually" copying the cells from the CSV into an existing sheet.
function main() {
  // Requires the Advanced Drive Service ("Drive") to be enabled
  // for the project (Services in the Apps Script editor).
  var file = DriveApp.searchFiles('title contains "your csv file name"').next();
  var name = file.getName();
  var fileBlob = file.getBlob();
  var newFile = { title: name + '_Sheet' };
  // convert: true tells Drive (API v2) to convert the uploaded CSV
  // into a native Google Sheet.
  Drive.Files.insert(newFile, fileBlob, { convert: true });
}
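If you go this route, a time-driven trigger can run the conversion every night. A sketch (the function name and hour are placeholders; pick a time after your CSV export finishes):

function installNightlyTrigger() {
  // Run main() once a day between 3 and 4 a.m. (script time zone).
  ScriptApp.newTrigger('main')
      .timeBased()
      .everyDays(1)
      .atHour(3)
      .create();
}

Note that Drive.Files.insert creates a new converted file on every run, so for a nightly job you would either trash the previous copy or keep one fixed spreadsheet as the Data Studio source and update its contents instead.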
See also the Apps Script quotas documentation for the script execution time limits (currently 6 minutes per execution for consumer accounts).
Source: https://stackoverflow.com/questions/56968783/problem-importing-a-large-csv-file-into-google-sheets