PHP cURL multi_exec delay between requests

Submitted by 亡梦爱人 on 2019-12-01 21:27:08

I don't think you can. If you run this from the CLI, you could instead fork your script into 10 processes and fire regular curl requests from each. That would give you fine-grained control over the timing.
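
For example, here is a rough sketch of that forking idea. It assumes the pcntl extension is available, a $urls array holding the full URL list, and a 100 ms spacing between requests in each child; none of these details come from the answer itself.

// Split the URLs across 10 child processes; each child fires plain,
// blocking curl requests with its own usleep() spacing.
$workers = 10;
$chunks  = array_chunk($urls, (int) ceil(count($urls) / $workers));

foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === 0) {                       // child process
        foreach ($chunk as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_exec($ch);                 // regular blocking curl request
            curl_close($ch);
            usleep(100000);                 // 100 ms between requests in this child
        }
        exit(0);
    }
}

// Parent: wait for all children to finish.
while (pcntl_waitpid(-1, $status) > 0);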

Yes, this is possible. If you use the ParallelCurl library, you can add your 100 ms delay with usleep() much more easily, since each request is queued for download with a separate call.

// Assumes $pcurl is an existing ParallelCurl instance, e.g. $pcurl = new ParallelCurl(10);
foreach ($urls as $url) {
    $pcurl->startRequest($url, 'on_request_done');  // 'on_request_done' is your completion callback
    usleep(100000);                                 // wait 100 ms before queueing the next request
}

PHP is not the right tool for this, and forking the script won't help either. It works at first, but once you have a few more websites to grab this way you will find your server running very, very hot. In terms of cost and script stability, you should consider a different approach.

You can do this easily with Python, and for non-blocking, real-time calls to API endpoints you should use something like Socket.IO + Node.js, or just Node.js on its own.

If you have neither the time nor the inclination, you can use something like this:

http://framework.zend.com/manual/en/zendx.console.process.unix.overview.html

It really all depends on what you are trying to achieve.

mu3

You can try this:
Store a timestamp in the DB, add one handle and call curl_multi_exec.
Use CURLOPT_PROGRESSFUNCTION to check timings and add more handles when you need them.
Daniel Stenberg (the author of cURL and libcurl) has said it is possible to add more handles after curl_multi_exec has started running.
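
As an illustration, here is a minimal sketch of that idea. The variable names are my own, the 100 ms figure is taken from the question, and for simplicity the elapsed-time check lives in the curl_multi_exec loop rather than in a CURLOPT_PROGRESSFUNCTION callback; treat it as a starting point, not the exact approach described above.

$delay = 0.1;                        // 100 ms between queued requests
$mh    = curl_multi_init();

// Helper that creates a handle for a URL and adds it to the multi handle.
$addHandle = function ($url) use ($mh) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
};

// Queue the first request immediately and remember when we did it.
$addHandle(array_shift($urls));
$lastAdded = microtime(true);

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 0.05);    // wait briefly for network activity

    // Once 100 ms have passed since the last add, queue the next URL.
    if ($urls && (microtime(true) - $lastAdded) >= $delay) {
        $addHandle(array_shift($urls));
        $lastAdded = microtime(true);
        $running   = 1;              // make sure the loop keeps going for the new handle
    }
} while ($running > 0 || $urls);

curl_multi_close($mh);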
