This is the scenario: we have huge encrypted files, on the order of gigabytes, that decrypt correctly as long as we read them all the way to the end. The problem arises when we stop reading partway through; disposing the CryptoStream then throws an exception.
My solution was to add this to the Dispose(bool) override in my derived class:
protected override void Dispose(bool disposing)
{
    // CryptoStream.Dispose(bool) has a bug in read mode: if the reader hasn't read all the
    // way to the end of the stream, it throws an exception while trying to process the final
    // block during Dispose(). We work around this here by reading to the end of the stream
    // on the caller's behalf. This avoids the thrown exception and allows everything to be
    // cleaned up (disposed, wiped from memory, etc.) properly.
    if (disposing &&
        CanRead &&
        (m_TransformMode == CryptoStreamMode.Read))
    {
        const int BUFFER_SIZE = 32768;
        byte[] buffer = new byte[BUFFER_SIZE];

        // Note: Read() may legitimately return fewer bytes than requested before the end
        // of the stream, so loop until it returns 0 rather than until it returns a short count.
        while (Read(buffer, 0, BUFFER_SIZE) > 0)
        {
        }
    }

    base.Dispose(disposing);
}
By making sure the stream is always read to the end, the internal issue in CryptoStream.Dispose is avoided. Of course, you need to weigh this against the nature of what you are reading, to be sure draining the remainder doesn't have a negative impact. Only use it against a source of known finite length; draining an unbounded stream would never return.
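Here is a minimal, self-contained sketch of how the workaround fits into a derived class and how it gets exercised. The class name SafeCryptoStream, the m_TransformMode field, and the in-memory round-trip are all illustrative assumptions, not part of the original code:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

// Hypothetical derived class applying the workaround above.
public sealed class SafeCryptoStream : CryptoStream
{
    private readonly CryptoStreamMode m_TransformMode;

    public SafeCryptoStream(Stream stream, ICryptoTransform transform, CryptoStreamMode mode)
        : base(stream, transform, mode)
    {
        m_TransformMode = mode;
    }

    protected override void Dispose(bool disposing)
    {
        // Drain the stream before disposing so CryptoStream's cleanup
        // doesn't choke on an unread final block.
        if (disposing && CanRead && (m_TransformMode == CryptoStreamMode.Read))
        {
            const int BUFFER_SIZE = 32768;
            byte[] buffer = new byte[BUFFER_SIZE];
            while (Read(buffer, 0, BUFFER_SIZE) > 0)
            {
            }
        }
        base.Dispose(disposing);
    }
}

class Demo
{
    static void Main()
    {
        using (var aes = Aes.Create())
        {
            // Encrypt a buffer in memory so the example is self-contained.
            byte[] plaintext = new byte[100_000];
            byte[] ciphertext;
            using (var ms = new MemoryStream())
            {
                using (var enc = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
                {
                    enc.Write(plaintext, 0, plaintext.Length);
                }
                ciphertext = ms.ToArray();
            }

            // Read only a small "header" from the decrypting stream, then dispose early.
            using (var src = new MemoryStream(ciphertext))
            using (var dec = new SafeCryptoStream(src, aes.CreateDecryptor(), CryptoStreamMode.Read))
            {
                byte[] header = new byte[16];
                dec.Read(header, 0, header.Length);
            } // the implicit Dispose() drains the remainder instead of throwing
        }
    }
}
```

The usage block deliberately stops after 16 bytes of a much larger stream, which is exactly the partial-read situation that triggers the Dispose-time exception the override guards against.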