More generally, this could also look like this:
import multiprocessing

def chunks(l, n):
    # yield successive n-sized chunks from l
    for i in range(0, len(l), n):
        yield l[i:i + n]

numberOfThreads = 4  # maximum number of processes running at once

if __name__ == '__main__':
    # f and params are assumed to be defined as in the snippet above
    jobs = []
    for i, param in enumerate(params):
        p = multiprocessing.Process(target=f, args=(i, param))
        jobs.append(p)
    # start the processes in batches of numberOfThreads and wait for
    # each batch to finish before starting the next one
    for batch in chunks(jobs, numberOfThreads):
        for j in batch:
            j.start()
        for j in batch:
            j.join()
Of course, that way is quite crude, since it waits for every process in a chunk to finish before it continues with the next chunk. Still, it works well when the individual function calls have roughly equal run times.
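If the run times vary a lot, a multiprocessing.Pool avoids that drawback, because an idle worker picks up the next job immediately instead of waiting for the rest of its chunk. A minimal sketch, assuming the same f and params as above (f taking the index and the parameter, as in the Process version):

import multiprocessing

if __name__ == '__main__':
    # at most 4 worker processes, reused for all jobs
    with multiprocessing.Pool(processes=4) as pool:
        # starmap unpacks each (i, param) tuple into f(i, param)
        results = pool.starmap(f, enumerate(params))

As a bonus, the Pool also collects the return values of f, which the plain Process approach does not.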