Question
I have created a Map/Reduce script that fetches customer invoices and deletes them. If I create a saved search in the UI based on the criteria below, it returns 4 million records. When I run the script, execution stops before the "getInputData" stage completes because that stage's maximum storage limit of 200 MB is exceeded. So I want to fetch only the first 4000 of the 4 million records, process them, and schedule the script to run every 15 minutes. Here is the code of the first stage (getInputData):
var count = 0;
var result = [];
var testSearch = search.create({
    type: 'customrecord1',
    filters: ['custrecord_date_created', 'notonorafter', 'startOfLastMonth'],
    columns: ['internalid']
});
do {
    var resultSearch = testSearch.run().getRange({
        start: count,
        end: count + 1000
    });
    for (var arr = 0; arr < resultSearch.length; arr++) {
        result.push(resultSearch[arr]);
    }
    count += 1000; // advance to the next page of results
} while (resultSearch.length >= 1000 && count < 4000);
return result;
Creating the saved search takes a long time. Is there any workaround to filter only the first 4000 records when the saved search is created?
Answer 1:
Why not a custom mass update?
It would be a 5-10 line script that grabs the internal id and record type of the current record from the mass update's criteria and then deletes the record.
Answer 2:
I believe this is what search.runPaged() and pagedData.fetch() are for.
search.runPaged
Runs the current search and returns summary information about the paginated results; it does not give you the result set or save the search.
pagedData.fetch
Retrieves the data within the specified page range.
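A minimal sketch of how that could replace the getRange() loop in getInputData, assuming the record type and filter from the question (only the getInputData entry point is shown; the 4-page cap mirrors the asker's 4000-record target):

// Sketch of a getInputData stage using runPaged()/fetch() (SuiteScript 2.x).
define(['N/search'], function (search) {
    function getInputData() {
        var ids = [];
        var pagedData = search.create({
            type: 'customrecord1',
            filters: ['custrecord_date_created', 'notonorafter', 'startOfLastMonth'],
            columns: ['internalid']
        }).runPaged({ pageSize: 1000 }); // summary info only; results are fetched page by page

        // fetch() retrieves one page at a time; stop after 4 pages (4000 rows).
        pagedData.pageRanges.slice(0, 4).forEach(function (pageRange) {
            pagedData.fetch({ index: pageRange.index }).data.forEach(function (result) {
                ids.push(result.id);
            });
        });
        return ids;
    }
    return { getInputData: getInputData };
});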
Answer 3:
If you are intent on the Map/Reduce you can just return your created search. NetSuite will run it and pass each result to the next phase. You can even use a saved search where you limit the number of lines and then, in your summarize phase, re-trigger the script if there's anything left to do.
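A minimal sketch of that approach, assuming the search from the question; the script and deployment ids ('customscript_delete_mr' / 'customdeploy_delete_mr') are placeholders, not from the answer:

/**
 * @NApiVersion 2.x
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record', 'N/task'], function (search, record, task) {
    function createSearch() {
        return search.create({
            type: 'customrecord1',
            filters: ['custrecord_date_created', 'notonorafter', 'startOfLastMonth'],
            columns: ['internalid']
        });
    }

    function getInputData() {
        // Returning the Search object itself lets NetSuite page through the results.
        return createSearch();
    }

    function map(context) {
        // Each mapped value is the JSON of one search result: { recordType, id, values }.
        var result = JSON.parse(context.value);
        record.delete({ type: result.recordType, id: result.id });
    }

    function summarize(summary) {
        // Re-submit while the search still returns rows. If NetSuite rejects
        // re-submitting a deployment that is still running, point this at a
        // second deployment of the same script.
        if (createSearch().runPaged().count > 0) {
            task.create({
                taskType: task.TaskType.MAP_REDUCE,
                scriptId: 'customscript_delete_mr',    // placeholder id
                deploymentId: 'customdeploy_delete_mr' // placeholder id
            }).submit();
        }
    }

    return { getInputData: getInputData, map: map, summarize: summarize };
});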
The 4k-record syntax, though, is:
var toDelete = [];
// testSearch is the search.Search object created earlier; ResultSet.each()
// stops iterating once the callback returns false (and never goes past 4000 results).
testSearch.run().each(function (r) {
    toDelete.push(r.id);
    return toDelete.length < 4000;
});
return toDelete;
Finally, I normally do this as a scheduled mass update. It tends to interfere less with any production scheduled and map/reduce scripts.
/**
 * @NApiVersion 2.x
 * @NScriptType MassUpdateScript
 */
define(["N/log", "N/record"], function (log, record) {
    // each() is invoked once per record matched by the mass update's criteria;
    // params carries the record type and internal id of that record.
    function each(params) {
        try {
            record.delete({
                type: params.type,
                id: params.id
            });
            log.audit({ title: 'deleted ' + params.type + ' ' + params.id, details: '' });
        } catch (e) {
            log.error({
                title: 'deleting: ' + params.type + ' ' + params.id,
                details: (e.message || e.toString()) + (e.getStackTrace ? (' \n \n' + e.getStackTrace().join(' \n')) : '')
            });
        }
    }
    return {
        each: each
    };
});
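If I recall the NetSuite UI correctly, once this is deployed as a Mass Update script it appears under Lists > Mass Update > Mass Updates in the Custom section; the record type and criteria (for example the date filter from the question) are chosen there, and the saved mass update can then be scheduled to recur.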
Source: https://stackoverflow.com/questions/51304600/how-to-delete-mass-records-using-map-reduce-script