Question
I have the code below in my PL/SQL procedure. It calls API_XXX.put (which in turn calls utl_file.put) inside a WHILE loop. l_xmldoc is a CLOB returned by a function getReportXML, which produces the XML.
The code I use to write the XML to a file looks like this:
l_offset := 1;
WHILE (l_offset <= l_length)
LOOP
    l_char := dbms_lob.substr(l_xmldoc, 1, l_offset);
    IF (l_char = to_char(10))  -- I also tried IF (l_char = chr(10)) but it did not work
    THEN
        API_XXXX.new_line(API_XXX.output, 1);
    ELSE
        API_XXXX.put(fnd_API_XXX.output, l_char);
    END IF;
    l_offset := l_offset + 1;
END LOOP;
Please note that API_XXX is an existing package that I am not able to modify, and this API calls fflush at the end of put.
The relevant part of API_XXX.put looks like this (WHICH is the first parameter):
elsif WHICH = API_XXX.OUTPUT then
    temp_file := OUT_FNAME;
    utl_file.put(F_OUT, BUFF);
    utl_file.fflush(F_OUT);
API_XXX.new_line looks like this (LINES is the number of lines to write):
elsif WHICH = API_XXX.OUTPUT then
    temp_file := OUT_FNAME;
    utl_file.new_line(F_OUT, LINES);
    utl_file.fflush(F_OUT);
I have noticed that the put/new_line calls on my customer's side sometimes raise UTL_FILE.WRITE_ERROR inside the WHILE loop, for no obvious reason (perhaps because l_length is very large, up to 167465).
I read Oracle PL/SQL UTL_FILE.PUT buffering and found that this is the same cause: my l_xmldoc is very large and contains no newline terminator, so the buffer grows up to 32767 bytes even though fflush is called on every write.
So, how should I convert l_xmldoc into a VARCHAR2 with newline terminators?
PS: I have confirmed that my customer is using Oracle 11g.
Answer 1:
Post the Oracle version you are using! Otherwise we can only guess around...
Your fflush will not work as you expect. From the documentation:
FFLUSH physically writes pending data to the file identified by the file handle. Normally, data being written to a file is buffered. The FFLUSH procedure forces the buffered data to be written to the file. The data must be terminated with a newline character.
tbone is absolutely right: the line with TO_CHAR(10) is wrong! Just try
SELECT TO_CHAR(10) FROM DUAL;
and you will get 10, which you then compare to a single character. A single character will never equal '10', since '10' has two characters!
Your problem is most likely a buffer overflow caused by too-large XML files, but keep in mind that other problems on the target system can also lead to WRITE_ERROR, and those should be handled as well.
Solutions
Quick & dirty: Since you don't seem to care about performance anyway, you can simply close the file every X bytes and reopen it with 'a' for append. So just add this to the loop:
IF MOD(l_offset, 32000) = 0 THEN
    UTL_FILE.FCLOSE(f_out);
    f_out := UTL_FILE.FOPEN(out_fpath, out_fname, 'a', 32767);
END IF;
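Folded into the question's loop, the idea looks roughly like this. This is only a sketch: out_fpath, out_fname and f_out are placeholder names for whatever directory, file name and file handle the existing package really uses.
l_offset := 1;
WHILE (l_offset <= l_length)
LOOP
    l_char := dbms_lob.substr(l_xmldoc, 1, l_offset);
    IF (l_char = chr(10)) THEN           -- chr(10), not to_char(10)
        API_XXXX.new_line(API_XXX.output, 1);
    ELSE
        API_XXXX.put(API_XXX.output, l_char);
    END IF;
    -- close and reopen in append mode every ~32000 characters so the
    -- UTL_FILE buffer never reaches its 32767-byte limit
    IF MOD(l_offset, 32000) = 0 THEN
        UTL_FILE.FCLOSE(f_out);
        f_out := UTL_FILE.FOPEN(out_fpath, out_fname, 'a', 32767);
    END IF;
    l_offset := l_offset + 1;
END LOOP;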
Use the right tool for the job: UTL_FILE is not suited for handling complex data. The only use case for UTL_FILE is small, newline-separated lines of text. For everything else you should write RAW bytes! (This also gives you proper control over the encoding, which is currently just a lucky guess.)
Alternatively, write a Java stored procedure with NIO FileChannels: fast, safe, nice... But be careful, your program might run ten times as fast!
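If you stay in PL/SQL, a minimal PUT_RAW sketch could look like the block below. It is only an illustration: the directory object MY_DIR, the file name report.xml and the 8191-character chunk size are assumptions, and l_xmldoc stands for the CLOB from the question.
DECLARE
    l_file   UTL_FILE.FILE_TYPE;
    l_xmldoc CLOB;                        -- the XML CLOB (e.g. from getReportXML)
    l_chunk  PLS_INTEGER := 8191;         -- keeps each raw write under 32767 bytes
    l_pos    PLS_INTEGER := 1;
    l_len    PLS_INTEGER;
BEGIN
    -- l_xmldoc is assumed to be populated at this point
    l_len  := DBMS_LOB.GETLENGTH(l_xmldoc);
    -- 'wb' opens the file in byte mode, so no newline terminator is required
    l_file := UTL_FILE.FOPEN('MY_DIR', 'report.xml', 'wb', 32767);
    WHILE l_pos <= l_len LOOP
        UTL_FILE.PUT_RAW(l_file,
                         UTL_RAW.CAST_TO_RAW(DBMS_LOB.SUBSTR(l_xmldoc, l_chunk, l_pos)),
                         TRUE);           -- autoflush each chunk
        l_pos := l_pos + l_chunk;
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
END;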
Answer 2:
Just a guess, but instead of "to_char(10)" you might try chr(10) to determine/write a newline. Not sure if this will solve your problem, but sometimes very long lines (without newlines) can cause issues.
For example:
declare
    l_clob clob;
    l_char char;
begin
    l_clob := 'Line 1' || chr(10) || 'Line 2' || chr(10);
    for i in 1 .. DBMS_LOB.GETLENGTH(l_clob)
    loop
        l_char := dbms_lob.substr(l_clob, 1, i);
        if (l_char = chr(10)) then
        --if (l_char = to_char(10)) then
            dbms_output.put_line('Found a newline at position ' || i);
        end if;
    end loop;
end;
Notice the difference between chr(10) and to_char(10). Easy enough to test if this solves your problem anyway.
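A quick query (just an illustration, not part of the original answer) makes the difference visible:
SELECT TO_CHAR(10)          AS to_char_10,   -- the two-character string '10'
       LENGTH(TO_CHAR(10))  AS len_to_char,  -- 2
       LENGTH(CHR(10))      AS len_chr       -- 1 (a single linefeed character)
FROM dual;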
Source: https://stackoverflow.com/questions/25279365/utl-file-write-error-when-calling-utl-file-put-in-a-loop