How to force download of big files without using too much memory?

Backend · unresolved · 6 answers · 1218 views
Asked by 清歌不尽 · 2021-02-15 06:33

I'm trying to serve large zip files to users. When there are 2 concurrent connections, the server runs out of memory (RAM). I increased the amount of memory from 300MB to 4GB (

6 Answers
  •  無奈伤痛 · answered 2021-02-15 07:01

    You can't keep the whole file's contents in `$data`. Pass the function the file's path instead of its contents. Then send all the headers once, and after that read a chunk of the file with `fread()`, `echo` that chunk, call `flush()`, and repeat until the end of the file. If any other header is sent in the meantime, the transfer will end up corrupted.
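    A minimal sketch of that chunked approach, assuming a local path in `$filePath` (the function name and parameters here are illustrative, not from the original answer):

    ```php
    <?php
    // Stream a file to the client in small chunks so that only
    // $chunkSize bytes are ever held in memory at once.
    function streamFileDownload(string $filePath, int $chunkSize = 8192): void
    {
        // Send all headers once, before any body output.
        header('Content-Type: application/zip');
        header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');
        header('Content-Length: ' . (string) filesize($filePath));

        $handle = fopen($filePath, 'rb');
        while (!feof($handle)) {
            // Read and emit one chunk, then push it out to the client.
            echo fread($handle, $chunkSize);
            flush();
        }
        fclose($handle);
    }
    ```

    Because each iteration reads only `$chunkSize` bytes, peak memory stays roughly constant regardless of the file size, so two concurrent downloads no longer multiply memory use by the file size.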
