I'm trying to serve large zip files to users. When there are 2 concurrent connections, the server runs out of memory (RAM). I increased the amount of memory from 300MB to 4GB, but the problem remains.
You can't use $data with the whole file's contents inside it; that loads the entire file into memory. Instead, pass the file's path to the function, not its contents. Send all headers once, then read a chunk of the file with fread(), echo that chunk, call flush(), and repeat until the end of the file. If any other header or output is sent in the meantime, the transfer will be corrupted. See the sketch below.
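Here is a minimal sketch of that approach. The function name stream_zip, the file path, and the 8 KB chunk size are illustrative assumptions, not part of the original post:

```php
<?php
// Stream a file to the client in fixed-size chunks so memory use
// stays flat regardless of the file's size.
// Assumes $filePath points to an existing zip file on disk.
function stream_zip(string $filePath): void
{
    // Send all headers once, before any body output.
    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($filePath));
    header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');

    // Disable output buffering so each chunk goes to the client
    // immediately instead of accumulating in memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    $handle = fopen($filePath, 'rb');
    if ($handle === false) {
        http_response_code(500);
        return;
    }

    // Read 8 KB at a time, echo it, and flush; peak memory stays
    // near the chunk size rather than the full file size.
    while (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
    }
    fclose($handle);
}

stream_zip('/path/to/archive.zip'); // hypothetical path
```

With this pattern, two concurrent downloads cost roughly two chunk buffers of memory, not two full copies of the zip file.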