memory-limit

Serving Large Protected Files in PHP/Apache

与世无争的帅哥 submitted on 2019-12-10 05:22:30
Question: I need to serve up large files (> 2 GB) from an Apache web server. The files are protected downloads, so I need some way to authorize the user. The CMS I'm using checks cookies against a MySQL database to verify the user. On the server, I have no control over max_execution_time and only limited control over memory_limit. My technique has been working for small files: after the user has been authorized in PHP (by the CMS), I use readfile() to serve the file, which is stored above the document root to prevent direct access. I've read about techniques to chunk the download or to use
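
The usual fix (not confirmed as what this asker ultimately chose) is to stop pushing the whole file through readfile() and instead stream it in small chunks, so PHP's memory use stays flat regardless of file size. A minimal sketch, assuming the CMS authorization check has already passed and $path (hypothetical) points at the file above the document root:

```php
<?php
// Minimal sketch: stream a protected file in 8 KB chunks so memory
// use stays constant. Assumes authorization already succeeded and
// $path is the absolute path to a file outside the document root.
function stream_file(string $path): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, 8192); // one 8 KB chunk per iteration
        flush();               // push the chunk to the client immediately
    }
    fclose($fp);
}
```

Note that on 32-bit PHP builds, filesize() overflows past 2 GB, which is one reason handing the transfer off to the web server (for example via Apache's mod_xsendfile) is often recommended for files this large.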

Allowed memory size of 268435456 bytes exhausted [duplicate]

 ̄綄美尐妖づ submitted on 2019-12-10 01:24:13
Question: This question already has answers here: Closed 7 years ago. Possible Duplicate: Allowed memory size of X bytes exhausted. I'm handling a fairly big database import (87 MB), and to do that I use a PHP script. All the operations are done locally on an Apache installation on Ubuntu Lucid. When I run the script, after a few minutes I receive this error: Allowed memory size of 268435456 bytes exhausted. I've changed memory_limit to 2GB in the php.ini file and restarted Apache. After that I checked phpinfo() and I see that memory_limit is set to '2048M', so all is OK. But when I relaunch my
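
For context, 268435456 bytes is exactly 256 MB, so the failing run is still governed by the old 256M limit even though phpinfo() reports 2048M. A common culprit is that the script runs under a different SAPI (for example the CLI) that loads its own php.ini. A minimal sketch for checking, from inside the failing script, which configuration actually applies:

```php
<?php
// Minimal sketch: report which SAPI and php.ini the failing script
// really uses. If memory_limit still prints 256M here, the edited
// php.ini is not the one this SAPI loads.
echo 'SAPI:           ' . php_sapi_name() . PHP_EOL;
echo 'Loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;
echo 'memory_limit:   ' . ini_get('memory_limit') . PHP_EOL;

// Stopgap: raise the limit for this script only (works unless the
// host forbids changing this setting at runtime).
ini_set('memory_limit', '2048M');
```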

PHP Memory Limit 25MB Exhausted - File Upload/Crop/Resize

♀尐吖头ヾ submitted on 2019-12-09 23:47:31
Question: I'm using a single image upload/crop/resize script for files up to 10 MB. When testing, I set the php.ini memory limit to 25M, and it is exhausted when uploading a file of only about 1.4 MB. "Allowed memory size of 26214400 bytes exhausted (tried to allocate 10368 bytes)" This seems strange to me; isn't 10368 < 26214400? (Rhetorical question.) Or does that mean I went 10368 bytes over 25 MB? Should my script be using this much memory? Code: function make_thumbnails($updir, $img) { $thumbnail_width =
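
To answer the rhetorical part: the message means total usage had already reached the 25 MB cap, and the final 10368-byte allocation was the one that tipped it over; it is not claiming 10368 exceeds 26214400. The memory goes into GD's decoded bitmap: a 1.4 MB JPEG is compressed, but in RAM it costs roughly width × height × bytes-per-pixel. A minimal sketch of a widely used estimate (the 1.8 fudge factor is a rough rule of thumb, not an exact figure):

```php
<?php
// Minimal sketch: estimate the bytes GD needs to decode an image,
// so oversized uploads can be rejected before memory runs out.
function gd_memory_estimate(string $file): int
{
    $info = getimagesize($file);        // [0] => width, [1] => height
    $channels = $info['channels'] ?? 4; // assume RGBA when unknown
    return (int) ceil($info[0] * $info[1] * $channels * 1.8);
}

// Usage with a hypothetical upload path: a 3000x2000 photo needs
// roughly 3000 * 2000 * 4 * 1.8 ≈ 43 MB, far beyond a 25M limit.
$file = 'upload.jpg'; // hypothetical uploaded file
echo gd_memory_estimate($file) . ' bytes needed' . PHP_EOL;
```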

Integers of unlimited size?

丶灬走出姿态 submitted on 2019-12-07 16:39:10
Question: In Python, I can write a program to calculate integers of unlimited size. Just the other day I computed the one-millionth Fibonacci number, and it was so large it couldn't fit in the console. If this is possible in Python, which to my understanding is written in C, how could one do the same in C++? It has to be possible; otherwise I don't see how it could be done in Python. I also believe there is something similar in Java/C# called a BigInteger, but I couldn't find anything saying how
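
C++ has no built-in arbitrary-precision integer, but one can be built the same way Python's is: the number is stored as a sequence of fixed-size digits, with schoolbook carry propagation. A minimal sketch using base-10^9 "digits" in a std::vector, implementing only the addition that iterating Fibonacci needs:

```cpp
#include <cstdint>
#include <iomanip>
#include <iostream>
#include <vector>

// Minimal sketch of an unlimited-size integer: a vector of base-1e9
// digits, least significant first. Non-negative values only.
using BigInt = std::vector<std::uint32_t>;
constexpr std::uint64_t BASE = 1000000000;

BigInt add(const BigInt& a, const BigInt& b) {
    BigInt sum;
    std::uint64_t carry = 0;
    for (std::size_t i = 0; i < a.size() || i < b.size() || carry; ++i) {
        std::uint64_t d = carry;
        if (i < a.size()) d += a[i];
        if (i < b.size()) d += b[i];
        sum.push_back(static_cast<std::uint32_t>(d % BASE));
        carry = d / BASE;
    }
    if (sum.empty()) sum.push_back(0); // canonical zero
    return sum;
}

void print(const BigInt& n) {
    std::cout << n.back(); // most significant digit: no zero padding
    for (auto it = n.rbegin() + 1; it != n.rend(); ++it)
        std::cout << std::setw(9) << std::setfill('0') << *it;
    std::cout << '\n';
}

int main() {
    BigInt a{0}, b{1};                // fib(0), fib(1)
    for (int i = 2; i <= 1000; ++i) { // fib(1000) has 209 decimal digits
        BigInt next = add(a, b);
        a = std::move(b);
        b = std::move(next);
    }
    print(b); // the full number, no overflow anywhere
}
```

In practice one would reach for a library (GMP or Boost.Multiprecision's cpp_int in C++, java.math.BigInteger in Java); the sketch just shows the carry-propagation idea that all of them, including Python's int, are built on.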

Erlang: How to limit the memory assigned to a process

北战南征 submitted on 2019-12-07 10:43:37
Question: What I'm asking is whether it's possible to limit the memory (heap or stack) assigned to a specific process, so that the process can't exceed it. Maybe something like process_flag(min_heap_size, MinHeapSize), but for the maximum heap. Answer 1: You could put together some kind of process-tracking gen_server that periodically checks its assigned processes' memory footprints and kills them if they exceed a certain amount. Using a combination of process_info(Pid, memory) and exit(Pid, Reason) calls, this
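
A minimal sketch of the polling approach the answer describes, written as a plain watcher process rather than a full gen_server (the module name, limit, and interval are illustrative):

```erlang
-module(mem_watchdog).
-export([watch/3]).

%% Poll Pid's memory footprint every IntervalMs milliseconds and kill
%% it once it exceeds MaxBytes. process_info(Pid, memory) returns the
%% process size in bytes, including stack, heap, and internal data.
watch(Pid, MaxBytes, IntervalMs) ->
    case process_info(Pid, memory) of
        {memory, Bytes} when Bytes > MaxBytes ->
            exit(Pid, kill); %% kill is untrappable, so termination is guaranteed
        {memory, _Bytes} ->
            timer:sleep(IntervalMs),
            watch(Pid, MaxBytes, IntervalMs);
        undefined ->
            ok %% the watched process has already exited
    end.
```

Newer Erlang/OTP releases (19 and later, after this question was asked) added process_flag(max_heap_size, ...), which lets the VM itself kill a process whose heap grows past a configured size.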

Why is memory usage greater than what I set on the Kubernetes node?

懵懂的女人 submitted on 2019-12-07 07:01:08
Question: I allocated resources to only one pod, with 650 MB/30% of memory (together with the other built-in pods, the memory limit is only 69%). However, while the pod is handling a process, the pod's usage stays within 650 MB, but the overall usage of the node reaches 94%. Why does this happen when it is supposed to have an upper limit of 69%? Is it because the other built-in pods did not set a limit? How do I prevent this, since my pod sometimes errors when memory usage is > 100%? My allocation settings (kubectl describe nodes): Memory usage of the Kubernetes node and pods when idle (kubectl top nodes, kubectl top pods): Memory usage of the Kubernetes node and pods
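
Containers that declare no limits may consume any free memory on the node, which would explain node usage climbing past the expected ceiling. A minimal sketch of pinning both requests and limits on a container (the names, image, and sizes are illustrative, not taken from the question):

```yaml
# Minimal sketch: a pod whose container is guaranteed 650Mi (requests)
# and also capped at 650Mi (limits). Without the limits block, the
# container may use whatever memory the node has free.
apiVersion: v1
kind: Pod
metadata:
  name: capped-app              # illustrative name
spec:
  containers:
    - name: app
      image: example/app:latest # illustrative image
      resources:
        requests:
          memory: "650Mi"
        limits:
          memory: "650Mi"
```

For pods that omit their own limits, a LimitRange object in the namespace can inject a default memory limit, which keeps built-in or third-party pods from consuming the whole node.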

C++ Change Max RAM Limit

对着背影说爱祢 submitted on 2019-12-05 20:42:09
How would I change the maximum amount of RAM for a program? I'm constantly running out of memory (not the system max, the ulimit max), and I do not wish to change the global memory limit. I looked around and saw that the vlimit() function might work, but I'm unsure exactly how to use it. Edit: I'm on Linux 2.6.38-11-generic. This is not a memory leak; I literally must allocate 100k instances of a given class, there is no way around it. Sebastian: Do you allocate the objects on the stack and are in fact hitting the stack limit? Do you, e.g., write something like this: void SomeFunction() { MyObject aobject[100000]; // do something
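
vlimit() is the obsolete BSD predecessor of setrlimit(2), which is the interface Linux actually supports for per-process limits. A minimal sketch that raises the soft address-space limit up to the hard ceiling (an unprivileged process can raise its soft limit only up to the hard limit, never beyond):

```cpp
#include <sys/resource.h>
#include <cstdio>

// Minimal sketch (Linux): raise this process's soft RLIMIT_AS (total
// virtual address space) to whatever the hard limit allows. This is
// the programmatic equivalent of `ulimit -v` for the current process
// and its children only; the global limit is untouched.
int main() {
    rlimit rl{};
    if (getrlimit(RLIMIT_AS, &rl) != 0) {
        std::perror("getrlimit");
        return 1;
    }
    rl.rlim_cur = rl.rlim_max; // soft limit may rise to the hard limit
    if (setrlimit(RLIMIT_AS, &rl) != 0) {
        std::perror("setrlimit");
        return 1;
    }
    return 0;
}
```

If Sebastian's guess is right and the array lives on the stack, the simpler fix is moving it to the heap (for example std::vector<MyObject> aobject(100000);), which sidesteps the stack limit entirely.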
