Question
TL;DR: Images converted to a Base64 string have a huge RAM footprint on the large object heap.
I have some code in a Windows service that consumes our product images uploaded by users, standardizes them into a web-grade format (users will upload 10 MB bitmaps), and does some other things like resizing them into a square and adding whitespace padding.
It then converts them to a Base64 string to upload them into our hosting environment via REST. The environment requires it be done this way; I cannot use URLs. When I do this, the strings get stored on the large object heap and the program's RAM usage skyrockets over time.
How do I get around this issue?
Here is the code:
private void HandleDocuments(IBaseProduct netforumProduct, MagentoClient client, bool isChild)
{
    if (netforumProduct.Documents == null) { return; }
    for (int idx = 0; idx < netforumProduct.Documents.Count; idx++)
    {
        JToken document = netforumProduct.Documents[idx]["Document"];
        if (document == null) { continue; }
        string fileName = document["URL"].ToString();
        // Skip photos on child products (the only identifier is part of the url string)
        if (fileName.ToLower().Contains("photo") && isChild) { continue; }
        using (HttpClient instance = new HttpClient { BaseAddress = client.NetforumFilesBaseAddress })
        {
            string trimStart = fileName.TrimStart('.');
            string base64String;
            using (Stream originalImageStream = instance.GetStreamAsync("iweb" + trimStart).Result)
            {
                using (MemoryStream newMemoryStream = new MemoryStream())
                {
                    using (Image img = Image.FromStream(originalImageStream))
                    {
                        using (Image retImg = Utility.Framework.ImageToFixedSize(img, 1200, 1200))
                        {
                            retImg.Save(newMemoryStream, ImageFormat.Jpeg);
                        }
                    }
                    newMemoryStream.Position = 0;
                    byte[] bytes = newMemoryStream.ToArray();
                    base64String = Convert.ToBase64String(bytes);
                }
            }
            // MediaGalleryEntry is a simple class with a few string properties
            MediaGalleryEntry mge = new MediaGalleryEntry
            {
                label = "Product_" + netforumProduct.Code + "_image_" + idx,
                content = new MediaGalleryContent
                {
                    base64_encoded_data = base64String,
                    name = "Gallery_Image_" + idx
                },
                file = trimStart
            };
            this.media_gallery_entries.Add(mge);
        }
    }
}
It's not the best code ever, and probably not highly optimized, but it's the best I can do.
Answer 1:
"TL;DR: Images converted to a Base64 string have a huge RAM footprint on the large object heap."
Yes, that is obviously true. All images are huge. Compression methods only apply to storage and transfer; when an image is loaded into memory, for display or further processing, all the compression has to be undone. This is a common pitfall for people working with images.
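For a sense of scale, here is a back-of-the-envelope sketch based on the 1200 x 1200 target size from the question's code, assuming the decoder uses 4 bytes per pixel (the actual pixel format depends on the source image). Keep in mind that any managed allocation above roughly 85,000 bytes, such as the MemoryStream buffer, the byte[] from ToArray(), and the Base64 string, lands on the large object heap.

using System;

// Rough in-memory size of one decoded 1200x1200 image, assuming 4 bytes per pixel (32bpp ARGB).
const int width = 1200;
const int height = 1200;
const int bytesPerPixel = 4; // assumption; the real pixel format depends on the decoder
long decodedBytes = (long)width * height * bytesPerPixel;
double decodedMb = decodedBytes / (1024.0 * 1024.0);
Console.WriteLine($"{decodedBytes:N0} bytes, roughly {decodedMb:F1} MB per decoded image");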
"It then converts them to a Base64 string to upload them into our hosting environment via REST. The environment requires it be done this way; I cannot use URLs. When I do this, the strings get stored on the large object heap and the program's RAM usage skyrockets over time." Base64 is inefficient, but it will not add a lot on top of this: roughly +33%, since it emits 4 characters for every 3 input bytes.
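A small sketch to verify that overhead (the input file name is hypothetical). The character count grows by about a third, and because .NET strings are UTF-16, each character also occupies two bytes of managed memory:

using System;
using System.IO;

// Base64 emits 4 characters for every 3 input bytes (rounded up);
// each char in a .NET string takes 2 bytes in memory.
byte[] jpegBytes = File.ReadAllBytes("example.jpg"); // hypothetical input file
string base64 = Convert.ToBase64String(jpegBytes);
Console.WriteLine($"{jpegBytes.Length:N0} bytes -> {base64.Length:N0} chars " +
                  $"(~{base64.Length * 2L:N0} bytes of string data)");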
The big question is whether you are really seeing an issue here, or are only misreading the memory footprint. @CodeCaster figured out that you kept a reference (which is a real problem and one of the few ways you can get a memory leak in .NET at all), but even if you lose those references, the strings will still stay in memory for some time.
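To illustrate the reference issue, here is a minimal sketch under one assumption: each entry can be pushed to the REST endpoint as soon as it is built, instead of accumulating every Base64 string in the long-lived media_gallery_entries list. UploadEntryAsync is a hypothetical stand-in for whatever call the service actually makes.

// Sketch, inside the same class as HandleDocuments (requires using System.Threading.Tasks):
// upload each entry right away and drop the reference, so the Base64 string on the
// large object heap becomes collectible the next time the GC runs.
private async Task UploadAndForgetAsync(MagentoClient client, MediaGalleryEntry entry)
{
    // UploadEntryAsync is an assumed method on MagentoClient; substitute the real REST call.
    await client.UploadEntryAsync(entry);

    // Nothing keeps a reference to 'entry' after this call returns, so the GC
    // is free to reclaim entry.content.base64_encoded_data.
}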
.NET uses garbage collection for memory management. That approach has one issue: while the GC collects, all other threads accessing the same managed area have to be paused. As a result, the GC is, for lack of a better term, very lazy about running. If it only runs once, at application shutdown, that is the ideal situation. The only things that can get it to run earlier are:
- calls to GC.Collect(), which should generally not be used in production code, only for debugging when you suspect a reference memory leak
- the danger of an OutOfMemory exception
- some of the alternative GC modes, in particular the server GC (see the sketch after this list)
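For the debugging case, a minimal sketch is below. GCSettings and the LOH compaction flag are standard .NET APIs; server GC itself is normally enabled through <gcServer enabled="true" /> under <runtime> in App.config on .NET Framework rather than in code. None of this belongs in the production path:

using System;
using System.Runtime;

static class GcDiagnostics
{
    // Debugging aid only: force a full, blocking collection (including LOH compaction)
    // to check whether the Base64 strings are actually unreachable.
    public static void DumpGcState()
    {
        Console.WriteLine($"Server GC enabled: {GCSettings.IsServerGC}");

        long before = GC.GetTotalMemory(forceFullCollection: false);

        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        long after = GC.GetTotalMemory(forceFullCollection: false);
        Console.WriteLine($"Managed heap: {before:N0} bytes before, {after:N0} bytes after");
    }
}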
All I can tell you is that it will run eventually, and you do not necessarily need to know the exact time.
Source: https://stackoverflow.com/questions/58486276/memory-usage-and-manipulating-images