I have a PHP script that needs to run for quite some time.
What the script does:
You can make the timeout indefinite by modifying your php.ini and setting the max_execution_time directive (a value of 0 means no limit).
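If you would rather not edit php.ini, a minimal sketch of doing the same thing at runtime (assuming you are allowed to override the limit, e.g. when running from the CLI):

```php
<?php
// Remove the execution time limit for this script only.
// Equivalent to max_execution_time = 0 in php.ini.
set_time_limit(0);
```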
But you may also want to consider a slight architecture change. First, consider a "launch and forget" approach for issuing your 100,000 curl requests. Second, consider using wget instead of curl.
You can issue a simple "wget URL -O UniqueFileName &". (Note the capital -O: it writes the downloaded page to the given file, while lowercase -o only redirects wget's log output.) This will retrieve a web page, save it to a "unique" filename, and do it all in the background.
Then you can iterate over a directory of files, grepping (preg_match'ing) out the data you need, and making your DB calls. Move each file to an archive as you process it, and keep iterating until there are no more files (see the sketch below).
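A rough sketch of that consumer loop, assuming downloads land in queue/ and processed files move to archive/; the regex, table, and connection details are placeholders for your own logic:

```php
<?php
// Process downloaded pages from the queue directory, then archive them.
$pdo = new PDO('mysql:host=localhost;dbname=scrape', 'user', 'pass');

while (true) {
    $files = glob('queue/*.html');
    if (empty($files)) {
        sleep(5);          // nothing to do yet; wait for the downloader
        continue;
    }
    foreach ($files as $file) {
        $html = file_get_contents($file);
        if (preg_match('/<title>(.*?)<\/title>/s', $html, $m)) {
            $stmt = $pdo->prepare('INSERT INTO pages (file, title) VALUES (?, ?)');
            $stmt->execute([basename($file), $m[1]]);
        }
        rename($file, 'archive/' . basename($file)); // move to the archive
    }
}
```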
Think of the directory as a "queue" and have one process do nothing but process the files. Have a second process simply go out and grab the web-page data. You could add a third process as your "monitor", which works independently and simply reports snapshot statistics. The other two can just be "web services" with no interface.
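A trivial monitor could just count the files in each directory on an interval (same assumed queue/ and archive/ directories as above):

```php
<?php
// Report a snapshot of how far along the pipeline is.
while (true) {
    $pending  = count(glob('queue/*.html'));
    $finished = count(glob('archive/*.html'));
    printf("[%s] pending: %d, processed: %d\n", date('H:i:s'), $pending, $finished);
    sleep(10);
}
```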
This type of multi-process design is really powerful and greatly under-utilized, IMHO. To me, this is the true power of the web.