I have a PHP script that runs a MySQL query, loops over the result, and inside that loop runs several more queries:
$sqlstr = "SELECT * FROM user_pred WHERE
The best approach is probably to dump all the userIds to a file, then run a separate script that forks into a number of worker processes connected by pipes. Hand each worker a small batch of userIds and feed it the next batch as it finishes its current one. With multiple CPUs/cores/servers the job finishes faster, and if a worker fails you simply start a new one. To use other servers as workers, call them with curl/fopen/SOAP/etc. from a worker process.
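A minimal sketch of the fork-and-distribute idea, assuming the ids were flushed to a file called userids.txt (one per line) and that process_user() stands in for the per-user queries from your loop. $workerCount, the connection credentials, and process_user() are illustrative placeholders, and for brevity this pre-chunks the list up front instead of streaming ids over pipes as workers finish:

    <?php
    // Requires the pcntl extension, so run from the CLI.
    // userids.txt and process_user() are assumed/placeholder names.

    $userIds = file('userids.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $workerCount = 4;  // tune to the cpus/cores you have
    $chunks = array_chunk($userIds, max(1, (int) ceil(count($userIds) / $workerCount)));

    $children = [];
    foreach ($chunks as $chunk) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        } elseif ($pid === 0) {
            // Child: open its own DB connection and work through its slice.
            $db = new mysqli('localhost', 'user', 'pass', 'dbname');
            foreach ($chunk as $userId) {
                process_user($db, $userId);  // run the per-user queries here
            }
            $db->close();
            exit(0);  // child must not fall through into the parent loop
        } else {
            $children[] = $pid;  // parent keeps track of its workers
        }
    }

    // Parent: wait for all workers; restart logic could go here if one fails.
    foreach ($children as $pid) {
        pcntl_waitpid($pid, $status);
    }

Pre-chunking is simpler but balances load less evenly than handing out small batches over pipes as workers complete them; the pipe-based hand-out (or curl calls to workers on other servers) would replace the array_chunk step above.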