How do I convert the encoding of a large file (>1 GB in size) to Windows-1252 without an out-of-memory exception?

你离开我真会死。 Submitted on 2019-12-20 18:05:33

Question


Consider:

public static void ConvertFileToUnicode1252(string filePath, Encoding srcEncoding)
{
    try
    {
        // Read the whole file, decoding it with the source encoding
        StreamReader fileStream = new StreamReader(filePath, srcEncoding);
        Encoding targetEncoding = Encoding.GetEncoding(1252);

        string fileContent = fileStream.ReadToEnd();
        fileStream.Close();

        // Saving file as ANSI 1252
        byte[] srcBytes = srcEncoding.GetBytes(fileContent);
        byte[] ansiBytes = Encoding.Convert(srcEncoding, targetEncoding, srcBytes);
        string ansiContent = targetEncoding.GetString(ansiBytes);

        // Now write the contents back to the file, in the target encoding
        StreamWriter ansiWriter = new StreamWriter(filePath, false, targetEncoding);
        ansiWriter.Write(ansiContent);
        ansiWriter.Close();
        // TODO -- log success details
    }
    catch (Exception e)
    {
        // TODO -- log failure details
        throw; // rethrow; "throw e" would reset the stack trace
    }
}

The above code throws an out-of-memory exception for large files; it only works for small ones.


Answer 1:


I think the most elegant solution is still to use a StreamReader and a StreamWriter, but to read blocks of characters rather than everything at once or line by line. This approach doesn't arbitrarily assume the file consists of lines of manageable length, and it doesn't break with multi-byte character encodings either.

public static void ConvertFileEncoding(string srcFile, Encoding srcEncoding, string destFile, Encoding destEncoding)
{
    using (var reader = new StreamReader(srcFile, srcEncoding))
    using (var writer = new StreamWriter(destFile, false, destEncoding))
    {
        char[] buf = new char[4096];
        while (true)
        {
            // Read returns the number of characters actually read; 0 means end of file
            int count = reader.Read(buf, 0, buf.Length);
            if (count == 0)
                break;

            writer.Write(buf, 0, count);
        }
    }
}
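
A call might look like this (the file paths and source encoding here are just placeholders for your own):

ConvertFileEncoding(@"C:\data\input.txt", Encoding.UTF8, @"C:\data\output.txt", Encoding.GetEncoding(1252));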

(I wish StreamReader had a CopyTo method like Stream does; if it did, this would essentially be a one-liner!)
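
For what it's worth, you can write that missing piece yourself as an extension method. This is just a minimal sketch; the TextReaderExtensions class and its CopyTo signature are my own invention, not part of the framework:

using System.IO;

public static class TextReaderExtensions
{
    // Copies all remaining characters from the reader to the writer in fixed-size chunks.
    public static void CopyTo(this TextReader reader, TextWriter writer, int bufferSize = 4096)
    {
        char[] buf = new char[bufferSize];
        int count;
        while ((count = reader.Read(buf, 0, buf.Length)) > 0)
            writer.Write(buf, 0, count);
    }
}

With that in scope, the body of ConvertFileEncoding reduces to reader.CopyTo(writer);.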




Answer 2:


Don't use ReadToEnd; read line by line or X characters at a time instead. If you read to the end, you put your whole file into memory at once.
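
For example, a line-by-line version might look like this (a minimal sketch; srcFile, destFile, and srcEncoding are placeholder names, and note that ReadLine/WriteLine will normalize the file's line endings):

using (var reader = new StreamReader(srcFile, srcEncoding))
using (var writer = new StreamWriter(destFile, false, Encoding.GetEncoding(1252)))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        writer.WriteLine(line); // note: rewrites line endings as Environment.NewLine
    }
}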




Answer 3:


Try this:

using (FileStream fileStream = new FileStream(filePath, FileMode.Open))
{
    int size = 4096;
    Encoding targetEncoding = Encoding.GetEncoding(1252);
    byte[] byteData = new byte[size];

    using (FileStream outputStream = new FileStream(outputFilepath, FileMode.Create))
    {
        int byteCounter = 0;

        do
        {
            byteCounter = fileStream.Read(byteData, 0, size);

            if (byteCounter > 0)
            {
                // Convert only the bytes actually read in this chunk.
                // (Caveat: a multi-byte character split across a chunk boundary
                // will be decoded incorrectly; the char-based answer avoids this.)
                byte[] converted = Encoding.Convert(srcEncoding, targetEncoding, byteData, 0, byteCounter);

                outputStream.Write(converted, 0, converted.Length);
            }
        }
        while (byteCounter > 0);
    }
}

It might have some syntax errors, as I've done it from memory, but this is how I work with large files: read a chunk at a time, do some processing, and save the chunk back. Streaming is really the only way to do it without the massive I/O overhead of reading everything at once and the huge RAM cost of holding it all in memory, converting it all, and then saving it all back.

You can always adjust the buffer size.

If you want your old method to work without throwing an OutOfMemoryException, you need to tell the garbage collector to allow very large objects (this setting only has an effect on 64-bit .NET 4.5 and later).

In App.config, under <runtime>, add the following line (you shouldn't need it with my code, but it's worth knowing):

<gcAllowVeryLargeObjects enabled="true" />
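
In context, the full config file looks like this (the surrounding elements are the standard App.config structure):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>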


Source: https://stackoverflow.com/questions/42551162/how-do-i-convert-encoding-of-a-large-file-1-gb-in-size-to-windows-1252-with
