I have a stored procedure:
DO_STUFF(obj rowFromMyTable)
It takes obj, processes some data, and saves the result in a separate table. How can I run it over many rows in parallel instead of sequentially?
Two ways to do this (both work on Windows / Linux / Mac):
PostgreSQL 9.6+ can parallelize queries automatically to some extent, so first check whether you need to go to the trouble of splitting the work yourself at all.
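Whether the planner actually parallelizes a given query is easy to check with EXPLAIN. Below is a minimal sketch, assuming a table named my_table (the table name and worker count are illustrative). One caveat: user-defined functions default to PARALLEL UNSAFE, and a function that writes to the database must stay unsafe, so a query calling DO_STUFF directly won't get a parallel plan unless the writing is restructured.

SET max_parallel_workers_per_gather = 4;  -- allow up to 4 workers per Gather node

EXPLAIN
SELECT do_stuff(m)   -- parallelizable only if do_stuff were marked PARALLEL SAFE
FROM my_table AS m;  -- look for "Gather" / "Parallel Seq Scan" nodes in the plan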
Use dblink and issue the work over multiple connections. The best part about dblink is that its calls can be fire-and-forget (i.e. asynchronous), so you can dispatch them in quick succession and then wait until they all complete (although you have to write the wait-for-result logic yourself). The drawback, unlike with synchronous calls, is that unless you keep track of process failures, timeouts, etc., you may wrongly assume that because all the calls were dispatched successfully all the data was processed, when in fact some of the calls may have failed asynchronously.
-- Open a named connection first (the connection string is just an example).
SELECT dblink_connect('testconn', 'dbname=mydb');
-- Dispatch the query asynchronously; this call returns immediately.
SELECT * FROM dblink_send_query('testconn', 'SELECT do_stuff_wrapper(0, 5000)') AS t1;
-- Poll: returns 1 while the query is still running, 0 once it has finished.
SELECT dblink_is_busy('testconn');
-- Fetch the result; the column list must match the dispatched query's output.
SELECT * FROM dblink_get_result('testconn') AS t1(c1 TEXT, c2 TEXT, ....);
SELECT dblink_disconnect('testconn');
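The call above references do_stuff_wrapper(), which the snippet doesn't define. A minimal sketch of what such a wrapper might look like, assuming the two arguments delimit an id range (the table and column names are placeholders for your schema):

CREATE OR REPLACE FUNCTION do_stuff_wrapper(from_id INT, to_id INT)
RETURNS void AS $$
DECLARE
    rec my_table%ROWTYPE;  -- my_table and its id column are assumptions
BEGIN
    -- Walk the assigned slice of rows; DO_STUFF saves its own results.
    FOR rec IN SELECT * FROM my_table WHERE id BETWEEN from_id AND to_id LOOP
        PERFORM do_stuff(rec);
    END LOOP;
END;
$$ LANGUAGE plpgsql;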
Update: added an example of dblink's asynchronous functions above.
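To actually run chunks in parallel, repeat that pattern across several named connections and collect the results at the end. A sketch under the same assumptions (the connection string, four connections, and 5000-row chunks are all illustrative):

-- Open four named connections.
SELECT dblink_connect('conn' || i, 'dbname=mydb') FROM generate_series(1, 4) AS i;

-- Dispatch one chunk per connection; the four queries now run concurrently.
SELECT dblink_send_query('conn' || i,
       format('SELECT do_stuff_wrapper(%s, %s)', (i - 1) * 5000, i * 5000 - 1))
FROM generate_series(1, 4) AS i;

-- Collect: dblink_get_result blocks until each connection's query has finished.
SELECT * FROM generate_series(1, 4) AS i,
              dblink_get_result('conn' || i) AS t(result TEXT);

-- Clean up.
SELECT dblink_disconnect('conn' || i) FROM generate_series(1, 4) AS i;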