curl-multi

understanding php curl_multi_exec

本秂侑毒 submitted on 2019-11-27 07:35:41
I'm trying to understand curl_multi_exec. I've copied a piece of the manual example below. So I'm wondering: how does it work? The first loop sends the HTTP requests, I guess? But then it is followed by a loop inside a loop, using functions with seemingly undocumented flags. I would like to download ~70 URLs in parallel. http://www.php.net/manual/en/function.curl-multi-exec.php

    <?php
    ...
    $active = null;
    // execute the handles
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);

    while ($active && $mrc == CURLM_OK) {
        if (curl_multi_select($mh) != -1) {
            do {
                $mrc = curl_multi_exec($mh, $active);
            } while ($mrc == CURLM_CALL_MULTI_PERFORM);
        }
    }
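To the "how does it work" part: the first do/while pumps curl_multi_exec until libcurl stops asking to be called again. CURLM_CALL_MULTI_PERFORM is not undocumented; it is a libcurl return code meaning "call curl_multi_exec again immediately before waiting". The outer loop then alternates between curl_multi_select (block until a socket has activity) and more curl_multi_exec calls, until $active drops to 0. On modern libcurl (7.20+) curl_multi_exec no longer returns CURLM_CALL_MULTI_PERFORM, so a simpler loop suffices. A minimal sketch for fetching a list of URLs in parallel; the helper name curl_multi_get and the timeout values are illustrative assumptions, not from the manual:

```php
<?php
// Sketch: download many URLs concurrently with the curl_multi API.
// Assumes a modern libcurl where curl_multi_exec never returns
// CURLM_CALL_MULTI_PERFORM.
function curl_multi_get(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // illustrative timeout
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers; $active counts handles still in flight.
    $active = null;
    do {
        $mrc = curl_multi_exec($mh, $active);
        if ($active && curl_multi_select($mh, 1.0) === -1) {
            usleep(100000); // select failed: back off briefly instead of busy-spinning
        }
    } while ($active && $mrc == CURLM_OK);

    // Collect results and release everything.
    $results = [];
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

For ~70 URLs you would call `$pages = curl_multi_get($urls);` once; all transfers run concurrently inside a single PHP process, so total time is roughly that of the slowest page rather than the sum of all of them.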

PHP Multiple Curl Requests

僤鯓⒐⒋嵵緔 submitted on 2019-11-27 04:29:24
Question: I'm currently using cURL in PHP a lot. It takes a lot of time to get results for about 100 pages each time. For every request I'm using code like this:

    $ch = curl_init();
    // get source
    curl_close($ch);

What are my options to speed things up? How should I use curl_multi_init() etc.?

Answer 1: Reuse the same cURL handle ($ch) without calling curl_close between requests. This will speed it up a little bit. Use curl_multi_init to run the requests in parallel; this can have a tremendous effect.

Answer 2: Take curl_multi, it is far better. Save the handshakes, they are not needed every time! Or use pcntl_fork to run the requests in forked child processes.
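The handle-reuse advice from the first answer can be sketched as follows. The function name fetch_all_reusing_handle and the timeout value are illustrative assumptions; the point is that keeping one handle alive lets cURL reuse connections, so repeated requests to the same host skip the TCP/TLS handshake:

```php
<?php
// Fetch a list of URLs sequentially while reusing a single cURL handle.
// The handle's connection cache persists between curl_exec calls, so
// requests to the same host avoid repeating the handshake.
function fetch_all_reusing_handle(array $urls): array
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // illustrative timeout

    $pages = [];
    foreach ($urls as $url) {
        curl_setopt($ch, CURLOPT_URL, $url); // re-point the same handle at the next URL
        $pages[$url] = curl_exec($ch);       // false on failure
    }

    curl_close($ch); // close once, after all requests
    return $pages;
}
```

This stays sequential, so it only shaves off per-request setup cost; for a real speed-up on ~100 pages the curl_multi approach above (or forked workers) is what runs the transfers in parallel.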
