I have split a large text file into a number of sets of smaller ones for performance testing that I'm doing. There are a number of directories like this:
/h
For this kind of thing I always use find together with xargs:
$ find output-* -name "*.chunk.??" | xargs -I{} ./myexecutable -i {} -o {}.processed
Since your script processes only one file at a time, using -exec (or -execdir) directly with find, as already suggested, is just as efficient here. I'm used to xargs, though, because it is generally much more efficient when the command being fed accepts many arguments at once: xargs packs as many file names as possible into each invocation instead of spawning one process per file. That makes it a very useful tool to keep in one's utility belt, so I thought it ought to be mentioned.
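A self-contained sketch of the difference, using a temporary directory with an assumed `output-*` layout and `cp`/`wc` standing in for `myexecutable` (all names here are illustrative, not from the original question):

```shell
# Build a small hypothetical directory tree matching the output-*/ pattern.
tmp=$(mktemp -d)
mkdir -p "$tmp/output-00" "$tmp/output-01"
printf 'one\n' > "$tmp/output-00/big.txt.chunk.aa"
printf 'two\n' > "$tmp/output-01/big.txt.chunk.ab"

# One process per file: with -I{}, xargs runs the command once per input
# line, equivalent in cost to find -exec ... \;
find "$tmp"/output-* -name "*.chunk.??" -print0 |
  xargs -0 -I{} cp {} {}.processed

# Batched: without -I, xargs packs many file names into a single
# invocation, so a command that accepts multiple arguments (here: wc)
# runs far fewer times.
find "$tmp"/output-* -name "*.chunk.??" -print0 | xargs -0 wc -l
```

The `-print0`/`-0` pairing makes the pipeline safe for file names containing spaces or newlines, which the plain newline-delimited form in the command above is not.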