php/timeout/connection to server reset?

甜味超标 2021-01-06 02:42

I have a php script that needs to run for quite some time.

What the script does:

  • connects to mysql
  • initiates anywhere from 100 to 100,000 cURL requests
6 Answers
  • 2021-01-06 02:49

    You can set the timeout to be indefinite by modifying your PHP.ini and adjusting the max_execution_time directive (the script execution time limit).
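
    For instance, a minimal php.ini change (a sketch, assuming you can edit the server's configuration; the value 0 disables the limit entirely):

    ; php.ini
    max_execution_time = 0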

    But you may also want to consider a slight architecture change. First, consider a "launch and forget" approach to issuing those 100,000 requests. Second, consider using "wget" instead of curl.

    You can issue a simple "wget URL -O UniqueFileName &". This retrieves the web page, saves it to a "unique" filename, and does it all in the background (wget's -O flag writes the page itself to the file; lowercase -o would only send the log there).

    Then you can iterate over a directory of those files, grepping (preg_match-ing) the data out and making your DB calls. Move each file into an archive as you process it, and continue iterating until there are no more files.

    Think of the directory as a "queue" and have one process just process the files. Have a second process simply go out and grab web-page data. You could add a third process as your "monitor", which works independently and simply reports snapshot statistics. The other two can just be "web services" with no interface.

    This type of multi-threading is really powerful and greatly under-utilized IMHO. To me this is the true power of the web.
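
    A minimal sketch of the "directory as queue" consumer, assuming pages were saved into a fetched/ directory by the background wget calls (the directory names and the regex are illustrative, not from the original script):

    <?php
    // Sketch only: treat a directory of downloaded pages as a work queue.
    $queueDir   = __DIR__ . '/fetched';
    $archiveDir = __DIR__ . '/archive';
    @mkdir($archiveDir, 0777, true);

    foreach (glob($queueDir . '/*.html') as $file) {
        $html = file_get_contents($file);

        // "grep" (preg_match) whatever you need out of the page
        if (preg_match_all('/<title>([^<]*)<\/title>/', $html, $m)) {
            foreach ($m[1] as $value) {
                echo $value, PHP_EOL;   // replace with your mysql insert
            }
        }

        // archive the processed file so the queue only ever shrinks
        rename($file, $archiveDir . '/' . basename($file));
    }

    Run it from cron or wrap it in a loop and it becomes the long-running worker, with no HTTP timeout involved at all.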

  • 2021-01-06 02:55

    I had the same problem when getting data from MySQL via PHP that contained special characters like umlauts (ä, ö, ü), ampersands, etc. The connection was reset and I found no errors in either the apache log or the php logs. First I made sure in PHP that I accessed the character set on the DB correctly with:

    mysql_query("SET NAMES 'latin1' COLLATE 'latin1_german2_ci'");
    
    mysql_query("SET CHARACTER SET 'latin1'");
    
    Then, finally, I resolved the problem with this line in PHP:
    
    mysql_query("SET character_set_connection='latin1'");
    
  • 2021-01-06 03:00

    100,000 cURL requests??? You are insane. Break that data up!
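
    If you do keep it in PHP, one way to "break it up" is to fire the requests in bounded batches with curl_multi. A rough sketch, where the urls.txt input file and the batch size of 50 are assumptions:

    <?php
    // Sketch only: work through a huge URL list in small curl_multi batches.
    $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    foreach (array_chunk($urls, 50) as $batch) {
        $mh = curl_multi_init();
        $handles = [];

        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 30);
            curl_multi_add_handle($mh, $ch);
            $handles[] = $ch;
        }

        // drive this batch to completion
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        foreach ($handles as $ch) {
            $body = curl_multi_getcontent($ch);
            // parse $body / write it to the DB here
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }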

  • 2021-01-06 03:11

    Lots of ideas:

    1) Don't do it inside an HTTP request. Write a command-line php script to drive it. You can use a web-bound script to kick it off, if necessary.

    2) You should be able to set max_execution_time to zero (or call set_time_limit(0)) to ensure you don't get shut down for exceeding a time limit.

    3) It sounds like you really want to refactor this into something more sane. Think about setting up a little job-queueing system, and having a php script that forks several children to chew through all the work (see the sketch after this answer).

    As Josh says, look at your error_log and see why you're being shut down right now. Try to figure out how much memory you're using -- that could be a problem. Try setting the max_execution_time to zero. Maybe that will get you where you need to be quickly.

    But in the long run, it sounds like you've got way too much work to do inside of one http request. Take it out of http, and divide and conquer!
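
    A bare-bones version of the fork-and-chew idea, assuming the pcntl extension is available on the CLI and that urls.txt and the child count of 4 are placeholders:

    <?php
    // Sketch only: CLI script that forks N children, each chewing through
    // its own slice of the URL list. Run this outside of Apache.
    $urls     = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $children = 4;
    $slices   = array_chunk($urls, max(1, (int) ceil(count($urls) / $children)));

    foreach ($slices as $slice) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            exit("fork failed\n");
        }
        if ($pid === 0) {
            // child: fetch its slice, then exit so the parent can reap it
            foreach ($slice as $url) {
                $body = file_get_contents($url); // or curl; parse/insert here
            }
            exit(0);
        }
    }

    // parent: wait for every child to finish
    while (pcntl_waitpid(-1, $status) > 0) {
        // nothing to do here; we are just reaping children
    }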

  • 2021-01-06 03:13

    Well, disregarding the fact that attempting 100,000 cURL requests is absolutely insane, you're probably hitting the memory limit.

    Try setting the memory limit to something more reasonable:

    ini_set('memory_limit', '256M');
    

    And as a side tip, don't set the execution time to something ludicrous; chances are you'll eventually find a way to hit it with a script like this. ;]

    Instead, just set it to 0; that is functionally equivalent to turning the execution limit off completely:

    ini_set('max_execution_time', 0);
    
  • 2021-01-06 03:13

    What's in the apache error_log? Are you reaching the memory limit?

    EDIT: Looks like you are reaching your memory limit. Do you have access to PHP.ini? If so, you can raise the memory_limit there. If not, try running the curl or wget binaries using the exec or shell_exec functions; that way they run as separate processes and don't use PHP's memory.
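
    A small sketch of that hand-off, where the URL and output path are placeholders; wget does the download in its own background process so PHP never holds the page in memory:

    <?php
    // Sketch only: delegate the download to wget in a separate process.
    $url  = 'http://example.com/page-1.html';      // placeholder URL
    $file = '/tmp/fetched/' . md5($url) . '.html'; // placeholder output path

    // -O writes the page body to $file; redirecting output and adding the
    // trailing & lets shell_exec return immediately instead of blocking.
    shell_exec(sprintf(
        'wget -q %s -O %s > /dev/null 2>&1 &',
        escapeshellarg($url),
        escapeshellarg($file)
    ));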
