Question
I'd like to upload big files from ASP.NET to a WCF service. Up to 100 MB my configuration works perfectly, but above 100 MB it throws a System.OutOfMemoryException.
The upload method works with a FileStream, but before that I save the file to a temporary folder. I'm not sure whether this is the problem or something else. Below is the code of my controller, which takes care of calling the WCF service:
[HttpPost]
public ActionResult Upload()
{
    if (Request.Files.Count > 0)
    {
        var file = Request.Files[0];
        if (file != null && file.ContentLength > 0)
        {
            string fileName = Path.GetFileName(file.FileName);
            var path = Path.Combine(Server.MapPath("~/App_Data/Images"), fileName);
            file.SaveAs(path);

            // Re-open the saved file and stream it to the WCF service.
            using (FileStream fsSource = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                TileService.TileServiceClient client = new TileService.TileServiceClient();
                client.Open();
                client.UploadFile(fileName, fsSource);
                client.Close();
            }

            // Remove the temporary copy once the upload has finished.
            if (System.IO.File.Exists(path))
            {
                System.IO.File.Delete(path);
            }
        }
    }
    return RedirectToAction("");
}
The action is posted to from this form:
@using (Html.BeginForm("Upload", "Home", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input type="file" name="FileUploader" />
<br />
<input type="submit" name="Submit" id="Submit" value="Upload file" />
}
In the ASP.NET web.config I have already set executionTimeout, maxRequestLength, requestLengthDiskThreshold, and maxAllowedContentLength. Here is the binding part of the configuration:
<basicHttpBinding>
<binding name="BasicHttpBinding_ITileService"
         closeTimeout="24:01:00" openTimeout="24:01:00"
         receiveTimeout="24:10:00" sendTimeout="24:01:00"
         allowCookies="false" bypassProxyOnLocal="false"
         hostNameComparisonMode="StrongWildcard"
         maxBufferPoolSize="4294967295" maxBufferSize="2147483647"
         maxReceivedMessageSize="4294967295" textEncoding="utf-8"
         transferMode="Streamed" useDefaultWebProxy="true"
         messageEncoding="Text">
  <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                maxNameTableCharCount="2147483647" />
<security mode="None">
<transport clientCredentialType="None" proxyCredentialType="None" realm="" />
<message clientCredentialType="UserName" algorithmSuite="Default" />
</security>
</binding>
</basicHttpBinding>
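For reference, the web.config entries named above usually look like the fragment below. The values here are illustrative only, not the asker's actual settings; note that maxRequestLength and requestLengthDiskThreshold are in kilobytes, while maxAllowedContentLength is in bytes.

```xml
<system.web>
  <!-- executionTimeout is in seconds; maxRequestLength and
       requestLengthDiskThreshold are in kilobytes (2097152 KB = 2 GB) -->
  <httpRuntime executionTimeout="3600"
               maxRequestLength="2097152"
               requestLengthDiskThreshold="8192" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes (here ~2 GB) -->
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>
```

If maxAllowedContentLength is left at its IIS default (~28.6 MB), large uploads are rejected before they ever reach the application, so both limits must be raised together.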
Answer 1:
The problem was not in the code, I think. The ASP.NET project was hosted in IIS Express instead of local IIS. Since I changed that in the project properties, everything works smoothly.
I'm using @nimeshjm's code now, though. Thanks for your help!
Answer 2:
You can try reading the upload in chunks via Request.Files[0].InputStream instead of SaveAs, so only a small buffer is held in memory at a time. Something along these lines:
public ActionResult Upload()
{
    if (Request.Files.Count > 0)
    {
        var file = Request.Files[0];
        if (file != null && file.ContentLength > 0)
        {
            string fileName = Path.GetFileName(file.FileName);
            var path = Path.Combine(Server.MapPath("~/App_Data/Images"), fileName);

            // Copy the request stream to disk in 1 KB chunks, so only
            // one small buffer is ever held in memory.
            using (var fs = new FileStream(path, FileMode.OpenOrCreate))
            {
                var buffer = new byte[1024];
                int count;
                while ((count = file.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    fs.Write(buffer, 0, count);
                }
            }

            // Stream the saved file to the WCF service.
            using (FileStream fsSource = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                TileService.TileServiceClient client = new TileService.TileServiceClient();
                client.Open();
                client.UploadFile(fileName, fsSource);
                client.Close();
            }

            // Clean up the temporary copy.
            if (System.IO.File.Exists(path))
            {
                System.IO.File.Delete(path);
            }
        }
    }
    return RedirectToAction("");
}
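A further refinement, not from the original answers: since the binding already uses transferMode="Streamed", the temporary file on disk can be skipped entirely by passing the request's input stream straight to the proxy. This is only a sketch; it assumes the generated UploadFile(string, Stream) proxy method accepts any readable Stream, which matches its use in the question.

```csharp
[HttpPost]
public ActionResult Upload()
{
    if (Request.Files.Count > 0)
    {
        var file = Request.Files[0];
        if (file != null && file.ContentLength > 0)
        {
            string fileName = Path.GetFileName(file.FileName);
            var client = new TileService.TileServiceClient();
            try
            {
                client.Open();
                // With transferMode="Streamed", WCF reads the stream in
                // chunks rather than buffering the whole file in memory,
                // so no intermediate file is needed.
                client.UploadFile(fileName, file.InputStream);
                client.Close();
            }
            catch
            {
                // A faulted channel cannot be closed; abort it instead.
                client.Abort();
                throw;
            }
        }
    }
    return RedirectToAction("");
}
```

The trade-off is that a failed WCF call leaves nothing on disk to retry from, so the temp-file variant may still be preferable when uploads must be resumable.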
Source: https://stackoverflow.com/questions/33921019/asp-net-uploading-big-files-throws-system-outofmemoryexception