Question
I have created an extension method called AddGZip which looks like the following:
public static void AddGZip(this HttpResponse response)
{
    response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
    response.AppendHeader("Content-Encoding", "gzip");
}
This is a very cut down version of the code:
var response = HttpContext.Current.Response;
var request = HttpContext.Current.Request;
var result = File.ReadAllText(path);

if (request.SupportsGZip)
{
    response.AddGZip();
}

response.Write(result);
response.Flush();
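(For context: SupportsGZip is not a built-in member of HttpRequest, so it is presumably the poster's own helper. A minimal sketch of an equivalent check against the Accept-Encoding header might look like the following; note that C# has no extension properties, so a helper like this would be called as request.SupportsGZip(). The class and logic here are illustrative, not the poster's actual code.)

// Hypothetical helper, assuming the check is based on the Accept-Encoding header;
// not part of the original question.
public static class HttpRequestCompressionExtensions
{
    public static bool SupportsGZip(this HttpRequest request)
    {
        string acceptEncoding = request.Headers["Accept-Encoding"];
        return !string.IsNullOrEmpty(acceptEncoding) &&
               acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0;
    }
}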
When you view the response in a web browser with GZip support you get an error like this:
"XML Parsing Error: unclosed token Location: http://webserver1/1234.xml Line Number 78, Column 1:"
When I view the source, it has basically missed out the last > at the end of the XML file, so 1 or 2 bytes.
If I comment out the AddGZip line it works fine. However, I really want to support GZip as the XML can be quite large.
Does anyone have a suggestion for me? I've tried checking lots of blogs but no solution seems to be out there for this type of error.
Dave
Answer 1:
There is an issue (or perhaps a really clever feature that I haven't seen justified anywhere) with DeflateStream (GZipStream builds on DeflateStream and inherits the issue), where flushing can lose data. Response.Flush() will flush the filter. The solution is to use a wrapper that is aware of both the zipping and the underlying sink, and only flushes the latter:
using System;
using System.IO;
using System.IO.Compression;

public enum CompressionType
{
    Deflate,
    GZip
}

/// <summary>
/// Provides GZip or Deflate compression, with further handling for the fact that
/// .NET's GZip and Deflate filters don't play nicely with chunked encoding (when
/// Response.Flush() is called or buffering is off).
/// </summary>
public class WebCompressionFilter : Stream
{
    private Stream _compSink;
    private Stream _finalSink;

    public WebCompressionFilter(Stream stm, CompressionType comp)
    {
        switch (comp)
        {
            case CompressionType.Deflate:
                _compSink = new DeflateStream((_finalSink = stm), CompressionMode.Compress);
                break;
            case CompressionType.GZip:
                _compSink = new GZipStream((_finalSink = stm), CompressionMode.Compress);
                break;
        }
    }

    public override bool CanRead
    {
        get { return false; }
    }

    public override bool CanSeek
    {
        get { return false; }
    }

    public override bool CanWrite
    {
        get { return true; }
    }

    public override long Length
    {
        get { throw new NotSupportedException(); }
    }

    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }

    public override void Flush()
    {
        // We do not flush the compression stream. At best this does nothing, at worst it
        // loses a few bytes. We do, however, flush the underlying stream to send bytes
        // down the wire.
        _finalSink.Flush();
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        throw new NotSupportedException();
    }

    public override void SetLength(long value)
    {
        throw new NotSupportedException();
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        throw new NotSupportedException();
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _compSink.Write(buffer, offset, count);
    }

    public override void WriteByte(byte value)
    {
        _compSink.WriteByte(value);
    }

    public override void Close()
    {
        _compSink.Close();
        _finalSink.Close();
        base.Close();
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _compSink.Dispose();
            _finalSink.Dispose();
        }
        base.Dispose(disposing);
    }
}
It's also worth noting that most user-agents that support gzip encoding also support deflate encoding. While the size improvement with deflate is negligible (literally a few bytes), some libraries on some architectures deal with deflate considerably better (this goes for both compressing and decompressing), so it's always worth favouring deflate over gzip for HTTP compression.
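To show how the filter above might be wired up while following that advice, here is a minimal usage sketch; the extension-method name AddCompression and the header handling are this sketch's own, not part of the answer:

// Illustrative only: assumes the WebCompressionFilter and CompressionType types above.
using System;
using System.Web;

public static class CompressionResponseExtensions
{
    public static void AddCompression(this HttpResponse response, HttpRequest request)
    {
        string acceptEncoding = request.Headers["Accept-Encoding"] ?? string.Empty;

        // Prefer deflate over gzip when the client accepts both, per the note above.
        if (acceptEncoding.IndexOf("deflate", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            response.Filter = new WebCompressionFilter(response.Filter, CompressionType.Deflate);
            response.AppendHeader("Content-Encoding", "deflate");
        }
        else if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            response.Filter = new WebCompressionFilter(response.Filter, CompressionType.GZip);
            response.AppendHeader("Content-Encoding", "gzip");
        }

        // Tell caches that the response body varies with the Accept-Encoding header.
        response.AppendHeader("Vary", "Accept-Encoding");
    }
}

The poster's snippet would then call response.AddCompression(request) in place of the gzip-only response.AddGZip().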
Answer 2:
Have you tried adding gzip through IIS? There is an existing question about it, so have a look at what it covers. Basically, IIS does all the compression so you don't have to.
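For reference, on IIS 7 and later this can be switched on from web.config, provided the static and dynamic compression features are installed on the server; a minimal sketch:

<configuration>
  <system.webServer>
    <!-- Requires the IIS Static/Dynamic Content Compression modules to be installed. -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>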
Source: https://stackoverflow.com/questions/3653250/gzipstream-is-cutting-off-last-part-of-xml