Convert VERY large ppm files to JPEG/JPG/PNG?

Submitted by 为君一笑 on 2019-12-12 02:54:30

Question


So I wrote a C++ program that produces very high-resolution pictures (fractals). I use fstream to save all the data in a .ppm file.

Everything works fine, but when I go to really high resolutions (38400x21600) the PPM file is ~8 gigabytes. Even with my 16 gigabytes of RAM, however, I am not able to convert that picture. I downloaded a couple of converters, but they couldn't handle it. Even GIMP crashed when I tried to "export as...".

So, does anyone know a good converter that can handle really large PPM files? In fact, I eventually want to go above 100 gigabytes. I don't care if it's slow; it should just work.

If there is no such converter: is there a better way to use std::ofstream? Or maybe a library that produces a PNG file automatically?

Thanks for your help!

Edit: I also asked myself what the best format for saving these large images might be. From my research, JPEG looks quite attractive (small size, still good quality), but might there be a better format? Let me know. Thanks.


Answer 1:


A few thoughts...

An 8-bit PPM file of 38400x21600 should take 38400 × 21600 × 3 bytes ≈ 2.3 GB. A 16-bit PPM file of the same dimensions requires twice as much, i.e. 4.6 GB, so I am not sure where you got 8 GB from.

VIPS is excellent for processing large images. If I take a 38400x21600 PPM file and run the following command in Terminal (i.e. at the command line), I can see it peaks at 58 MB of RAM to do the conversion from PPM to JPEG:

vips jpegsave fractal.ppm fractal.jpg --vips-leak
memory: high-water mark 58.13 MB

That takes 31 seconds on a reasonably specced iMac and produces a 480 MB file from my (random) data, so you can expect your result to be much smaller, since mine is pretty incompressible.

ImageMagick, on the other hand, peaks at around 11.6 GB of memory (the maximum resident set size below) and does the same conversion in 74 seconds:

/usr/bin/time -l convert fractal.ppm fractal.jpg

       73.81 real        69.46 user         4.16 sys
11616595968  maximum resident set size
         0  average shared memory size
         0  average unshared data size
         0  average unshared stack size
   4051124  page reclaims
         4  page faults
         0  swaps
         0  block input operations
       106  block output operations
         0  messages sent
         0  messages received
         0  signals received
         9  voluntary context switches
     11791  involuntary context switches



Answer 2:


I'd suggest that a simpler and faster solution would be to just get more RAM - 128 GB is not prohibitively expensive these days - or add swap space.




Answer 3:


Go to the Baby X resource compiler and download the JPEG encoder, savejpeg.c. It takes an RGB buffer which has to be flat in memory. Hack into it and replace that with a version that accepts a stream of 16x16 blocks. Then write your own PPM loader that loads a 16-pixel-high strip at a time.

Now the system will scale up to huge images that don't fit in memory. How you're going to display them I don't know, but the JPEG will be to specification.

https://github.com/MalcolmMcLean/babyxrc
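The strip-loading side of this can be sketched in C++. This is a minimal sketch, assuming a binary 8-bit P6 PPM; encode_strip() is a hypothetical stand-in for the modified strip-accepting JPEG encoder (savejpeg.c itself is not reproduced here), and g_bytes_seen merely counts how much pixel data has been streamed:

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

static std::size_t g_bytes_seen = 0;  // total pixel bytes streamed so far

// Hypothetical hook for a strip-based encoder: receives `rows` rows of
// `width` RGB pixels (3 bytes per pixel). A real implementation would
// feed these rows into the JPEG encoder's 16x16 block pipeline.
static void encode_strip(const std::vector<std::uint8_t>& strip,
                         int width, int rows) {
    (void)strip;
    g_bytes_seen += static_cast<std::size_t>(width) * 3 * rows;
}

// Read the next PPM header token, skipping whitespace and '#' comments.
static std::string next_token(std::istream& in) {
    char c;
    while (in.get(c)) {
        if (c == '#') { std::string line; std::getline(in, line); continue; }
        if (!std::isspace(static_cast<unsigned char>(c))) { in.unget(); break; }
    }
    std::string tok;
    in >> tok;
    return tok;
}

// Stream a P6 PPM through encode_strip(), `strip_rows` rows at a time,
// so memory use stays at one strip no matter how tall the image is.
bool stream_ppm(const std::string& path, int strip_rows = 16) {
    std::ifstream in(path, std::ios::binary);
    if (!in) return false;
    if (next_token(in) != "P6") return false;           // binary RGB PPM only
    const int width  = std::stoi(next_token(in));
    const int height = std::stoi(next_token(in));
    const int maxval = std::stoi(next_token(in));
    if (maxval != 255) return false;                    // 8-bit samples only
    in.get();                                           // single whitespace ending the header

    const std::size_t row_bytes = static_cast<std::size_t>(width) * 3;
    std::vector<std::uint8_t> strip(row_bytes * strip_rows);
    for (int y = 0; y < height; y += strip_rows) {
        const int rows = std::min(strip_rows, height - y);
        in.read(reinterpret_cast<char*>(strip.data()),
                static_cast<std::streamsize>(row_bytes * rows));
        if (!in) return false;                          // truncated file
        encode_strip(strip, width, rows);
    }
    return true;
}
```

With 16-row strips of a 38400-pixel-wide image, the working buffer is only 38400 × 3 × 16 ≈ 1.8 MB, regardless of image height.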



Source: https://stackoverflow.com/questions/36376504/convert-very-large-ppm-files-to-jpeg-jpg-png
