CoreyMSchafer/code_snippets

multiprocessing.Process vs concurrent.futures.ProcessPoolExecutor


Hi,

I wanted to discuss the multiprocessing video and script with you.

From what I understand, using multiprocessing.Process with start and join should yield roughly the same timing as using concurrent.futures.ProcessPoolExecutor with map. However, I'm observing different performance between the two. I was wondering if you have any insight into this. The code is below.

Thank you.

import time
import multiprocessing

def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    ans = f'Done Sleeping...{seconds}'
    print(ans)
    return ans

# The __main__ guard is required on platforms that spawn child
# processes (Windows, and macOS on recent Python versions).
if __name__ == '__main__':
    start = time.perf_counter()

    processes = []
    for i in range(5):
        p = multiprocessing.Process(target=do_something, args=[1 + i / 10])
        p.start()
        processes.append(p)

    # Wait for every child process to finish before stopping the timer.
    for process in processes:
        process.join()

    finish = time.perf_counter()
    print(f'Finished in {round(finish - start, 2)} second(s)')


import time
import concurrent.futures

def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    ans = f'Done Sleeping...{seconds}'
    print(ans)
    return ans

if __name__ == '__main__':
    start = time.perf_counter()

    secs = [1 + i / 10 for i in range(5)]

    # Exiting the with-block calls shutdown(wait=True), so the timer
    # only stops after all submitted tasks have completed.
    with concurrent.futures.ProcessPoolExecutor() as executor:
        results = executor.map(do_something, secs)

    finish = time.perf_counter()
    print(f'Finished in {round(finish - start, 2)} second(s)')
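One detail worth noting when comparing the two scripts: ProcessPoolExecutor defaults max_workers to the machine's CPU count, while the first script always starts five Process objects. On a machine with fewer than five cores the pool version can take longer simply because some tasks wait for a free worker. A sketch that pins the pool to five workers to match the first script (max_workers=5 here is an assumption for the comparison, not something from the video):

```python
import time
import concurrent.futures

def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    return f'Done Sleeping...{seconds}'

if __name__ == '__main__':
    start = time.perf_counter()

    secs = [1 + i / 10 for i in range(5)]

    # max_workers=5 matches the five Process objects in the first script;
    # the default is the number of CPUs, which may be smaller.
    with concurrent.futures.ProcessPoolExecutor(max_workers=5) as executor:
        # map returns results lazily, in submission order.
        for result in executor.map(do_something, secs):
            print(result)

    finish = time.perf_counter()
    print(f'Finished in {round(finish - start, 2)} second(s)')
```

With all five tasks running concurrently, both scripts should finish in roughly the duration of the longest sleep.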
