I have a few Oracle procedures that generate/return a large amount of data that I need to write out to a file. I'm currently trying to accomplish this with a DataReader. It seems
Just a couple of general comments on the original version of your question:
If you're using the Microsoft .NET Framework's built-in System.Data.OracleClient provider classes, you might get better performance from Oracle's own provider, ODP.NET (Oracle Data Provider for .NET).
If the timing varies from run to run, the .NET garbage collector may be kicking in due to memory usage that isn't visible in your example (e.g., if many objects are being instantiated and discarded).
I'm assuming when you say "particular procedure" that you mean that you are calling an Oracle stored procedure that has an OUT parameter that is a REF CURSOR. Your DataReader is then fetching from the cursor returned by the procedure. Is that the case?
If so, can you eliminate the .NET code and write a PL/SQL block that calls the procedure and fetches all the data from the cursor, to see if you get the same behavior there? Oracle doesn't materialize data when the cursor is opened; it materializes the results as the client fetches them. So it is possible that Oracle has to do quite a bit of work after returning the Nth row if it has to materialize and filter out a lot of data before it finds the (N+1)th row. If you see the same behavior in PL/SQL running on the database, that is almost certainly what's going on. If you don't see any issues in the PL/SQL block, then something must be going on in the middle tier.
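As a rough sketch, a test block along these lines would time the fetch entirely on the database. The procedure name `my_pkg.get_report` and the row type are placeholders — substitute your actual procedure and a variable list matching your cursor's columns:

```sql
-- Hypothetical harness: replace my_pkg.get_report and my_table%ROWTYPE
-- with your actual procedure and cursor row structure.
DECLARE
  l_cursor SYS_REFCURSOR;
  l_row    my_table%ROWTYPE;   -- or individual variables matching the cursor
  l_count  PLS_INTEGER := 0;
  l_start  NUMBER;
BEGIN
  l_start := DBMS_UTILITY.GET_TIME;        -- hundredths of a second
  my_pkg.get_report(l_cursor);             -- procedure with an OUT SYS_REFCURSOR
  LOOP
    FETCH l_cursor INTO l_row;
    EXIT WHEN l_cursor%NOTFOUND;
    l_count := l_count + 1;
  END LOOP;
  CLOSE l_cursor;
  DBMS_OUTPUT.PUT_LINE('Rows fetched: ' || l_count);
  DBMS_OUTPUT.PUT_LINE('Elapsed (centiseconds): ' || (DBMS_UTILITY.GET_TIME - l_start));
END;
/
```

If this block takes roughly as long as your .NET loop, the time is being spent in the database, not in the client or the provider.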
What is the database actually doing?
A query with a GROUP BY or an ORDER BY may need to generate the full result set, then sort/aggregate it, before returning a single row. A query scanning a large table may find 50 rows in the first couple of blocks, then read another hundred thousand blocks before it finds the next one.
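As an illustration (the table and column names here are made up), both of these queries return the same rows, but their cost profiles differ:

```sql
-- Hypothetical table big_table; names are illustrative only.

-- Can start streaming rows to the client as soon as matches are found:
SELECT * FROM big_table WHERE status = 'OPEN';

-- Absent a suitable index, must read and sort the entire result set
-- before the first row can be returned to the client:
SELECT * FROM big_table WHERE status = 'OPEN' ORDER BY created_date;
```

In the second case, the "pause" you measure on the first fetch is the whole sort, even though subsequent fetches then look fast.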
I suggest you ignore the VB code and post the database code.