I have a batch Flash upload script that uploads video files to a directory. Simple. After each upload completes, it creates a MySQL record for that file and moves on to the next.
The best way to implement this architecturally is a work queue: your PHP front end enqueues files to convert, and a daemon in the back end works through them. Decouple PHP from the conversion work, and your UI will always remain responsive.
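On the producer side, the simplest queue is the filesystem itself: the front end drops each finished upload into a spool directory that the daemon watches. A minimal sketch of that hand-off in Python (directory names and the `enqueue` helper are illustrative assumptions, not part of the original post); the key detail is staging the copy and then renaming, since `rename` is atomic within one filesystem, so the daemon never picks up a half-written file:

```python
import os
import shutil
import tempfile

def enqueue(src_path, spool_dir):
    """Hand a finished upload to the conversion daemon.

    Copy into a hidden staging file inside the spool directory,
    then os.rename() it into place. rename() is atomic on the
    same filesystem, so the watcher never sees a partial file.
    """
    os.makedirs(spool_dir, exist_ok=True)
    fd, staging = tempfile.mkstemp(dir=spool_dir, suffix=".part")
    os.close(fd)
    shutil.copy2(src_path, staging)          # may take a while for big videos
    final = os.path.join(spool_dir, os.path.basename(src_path))
    os.rename(staging, final)                # atomic: file appears fully formed
    return final
```

The daemon then only has to ignore `*.part` names to be safe against races with in-progress copies.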
I haven't written PHP in a long time, but it's my understanding that any process it starts falls under the maximum execution time limit and runs under the Web server's user. You don't want that to be the case; you certainly don't want a Web request to be capable of starting additional processes.
Write a simple daemon that runs outside the Web server and watches a folder for uploaded files. Your Web front end dumps files there, and your daemon spawns a worker for each conversion, up to the number of cores you have. Architecturally, this is the wiser choice.
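A minimal sketch of such a daemon in Python, under a few assumptions: the spool and output paths, the `ffmpeg` command as a stand-in for whatever encoder you use, and the `*.part` convention for in-progress uploads are all illustrative choices, not something the question prescribes. A `multiprocessing.Pool` sized to the core count caps concurrent conversions:

```python
import multiprocessing
import os
import subprocess
import time

SPOOL_DIR = "/var/spool/convert"     # hypothetical: front end drops files here
DONE_DIR = "/var/spool/converted"    # hypothetical: finished output goes here

def convert(path):
    """Run one conversion; ffmpeg is a placeholder for your encoder."""
    out = os.path.join(DONE_DIR, os.path.basename(path) + ".mp4")
    subprocess.run(["ffmpeg", "-y", "-i", path, out], check=True)
    os.remove(path)  # consume the queue entry once conversion succeeds

def scan(spool_dir, seen):
    """Return newly arrived files, skipping in-progress .part uploads."""
    new = []
    for name in sorted(os.listdir(spool_dir)):
        if name.endswith(".part") or name in seen:
            continue
        seen.add(name)
        new.append(os.path.join(spool_dir, name))
    return new

def main():
    # One worker per core keeps the box busy without thrashing.
    with multiprocessing.Pool(processes=os.cpu_count()) as pool:
        seen = set()
        while True:
            for path in scan(SPOOL_DIR, seen):
                pool.apply_async(convert, (path,))
            time.sleep(2)  # cheap polling; inotify/fswatch would be event-driven

if __name__ == "__main__":
    main()
```

Run it under your init system (systemd, supervisord, etc.) rather than from PHP, so it survives independently of any Web request. Polling is the crudest watch mechanism; on Linux an inotify-based watcher avoids the sleep loop, but the architecture is the same.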
See my answer to this question as well, which is also relevant.