Azure Data Lake Analytics IOutputter E_RUNTIME_USER_ROWTOOBIG

Submitted by 浪尽此生 on 2019-12-25 09:24:53

Question


I'm trying to write the results of my custom IOutputter to an intermediate file on the local disk.

After that I want to copy the database file (~20MB) to the ADL output store.

Sadly the script terminates with:

An unhandled exception of type 'Microsoft.Cosmos.ScopeStudio.BusinessObjects.Debugger.ScopeDebugException' occurred in Microsoft.Cosmos.ScopeStudio.BusinessObjects.Debugger.dll

Additional information: {"diagnosticCode":195887112,"severity":"Error","component":"RUNTIME","source":"User","errorId":"E_RUNTIME_USER_ROWTOOBIG","message":"The row has exceeded the maximum allowed size of 4MB","description":"","resolution":"","helpLink":"","details":"The row has exceeded the maximum allowed size of 4MB","internalDiagnostics":" 7ffe97231797\tScopeEngine!?ToStringInternal@KeySampleCollection@SSLibV3@ScopeEngine@@AEAA?AV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@XZ + 11b7\t\n 7ffe971d7261\tScopeEngine!??0ExceptionWithStack@ScopeEngine@@QEAA@W4ErrorNumber@1@AEBV?$initializer_list@VScopeErrorArg@ScopeCommon@@@std@@_N@Z + 121\t\n 7ffe971d7f6a\tScopeEngine!??0RuntimeException@ScopeEngine@@QEAA@W4ErrorNumber@1@PEBD@Z + aa\t\n 7ffe6de06aca\t(no module)!(no name)\t\n

    // Requires a reference to the U-SQL UDO interfaces.
    using Microsoft.Analytics.Interfaces;
    using System.IO;

    public class CustomOutputter : IOutputter
    {
        private Stream stream;

        public override void Close()
        {
            base.Close();

            // Copy the finished database file (~20MB) into the output
            // stream in one go; this write is what exceeds the 4MB row
            // limit and raises E_RUNTIME_USER_ROWTOOBIG.
            using (var fs = File.Open("mydb.data", FileMode.Open))
            {
                fs.CopyTo(stream);
            }
        }

        public override void Output(IRow input, IUnstructuredWriter output)
        {
            // Capture the underlying output stream on the first row.
            if (stream == null)
                stream = output.BaseStream;

            // myDb is the asker's local database; its declaration is
            // omitted from the snippet.
            myDb.Insert("somestuff");
        }
    }

Any ideas on this problem?


Answer 1:


As the error message indicates, there is currently a 4MB limit on the length of any row U-SQL reads or writes. If you use record-oriented files such as CSVs, you will hit this limit with any row larger than that.

There is an example of a byte-oriented file read/write UDO that can help you handle files as binaries at https://github.com/Azure/usql/tree/master/Examples/FileCopyUDOs/FileCopyUDOs. You can effectively chunk data using this approach; a sketch of the idea follows.
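
Here is a minimal sketch of that chunking idea (not the FileCopyUDOs code itself): the rowset carries the file contents as byte[] chunks, each kept well under the 4MB row limit, and the outputter concatenates them back into one binary file. The class name and the "data" column are hypothetical, not from the question.

    using Microsoft.Analytics.Interfaces;

    // Minimal sketch: assumes each input row holds one byte[] chunk of
    // the file, kept well under the 4MB row limit, in a hypothetical
    // column named "data".
    [SqlUserDefinedOutputter(AtomicFileProcessing = true)]
    public class BinaryChunkOutputter : IOutputter
    {
        public override void Output(IRow input, IUnstructuredWriter output)
        {
            var chunk = input.Get<byte[]>("data");

            // Stream each chunk straight through; no single row ever
            // approaches the 4MB limit.
            output.BaseStream.Write(chunk, 0, chunk.Length);
        }
    }

AtomicFileProcessing = true asks the runtime to hand the entire output file to a single instance, so the chunks are written out in row order rather than being split across vertices.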



Source: https://stackoverflow.com/questions/41533328/azure-data-lake-analytics-ioutputter-e-runtime-user-rowtoobig
