Multithreading and Multiprocessing
The concurrent.futures module provides two classes: ThreadPoolExecutor (thread pool) and ProcessPoolExecutor (process pool).
Thread pool: ThreadPoolExecutor
- The main thread can query the status of any worker thread (or task) and retrieve its return value.
- The main thread knows as soon as a worker thread finishes.
- Multithreaded and multiprocess code share the same interface, so either pool can be dropped in (see the ProcessPoolExecutor sketch after the first example below).
Common methods: submit() submits the function to execute along with its arguments; done() checks whether a task has finished; result() retrieves the return value.
from concurrent.futures import ThreadPoolExecutor
import time

def spider(page):
    time.sleep(page)
    print(f"crawl task{page} finished")
    return page

with ThreadPoolExecutor(max_workers=5) as t:
    task1 = t.submit(spider, 1)
    task2 = t.submit(spider, 2)
    task3 = t.submit(spider, 3)

    print(f"task1: {task1.done()}")
    print(f"task2: {task2.done()}")
    print(f"task3: {task3.done()}")

    time.sleep(2.5)
    print(f"task1: {task1.done()}")
    print(f"task2: {task2.done()}")
    print(f"task3: {task3.done()}")
    print(task1.result())
-----------------------result-------------------------------
task1: False
task2: False
task3: False
crawl task1 finished
crawl task2 finished
task1: True
task2: True
task3: False
1
crawl task3 finished
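Since the two executors expose the same interface, switching the first example to processes is mostly a one-line change. A minimal sketch, not from the original (the __main__ guard is needed because child processes re-import the module):

from concurrent.futures import ProcessPoolExecutor
import time

def spider(page):
    time.sleep(page)
    return page

if __name__ == '__main__':
    # Same submit()/done()/result() interface, but tasks run in separate
    # processes, so spider must be picklable (defined at module level).
    with ProcessPoolExecutor(max_workers=5) as t:
        task1 = t.submit(spider, 1)
        print(task1.result())  # blocks until the task finishes, then prints 1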
wait(fs, timeout=None, return_when=ALL_COMPLETED): fs is the sequence of futures to wait on; timeout is the maximum time to wait; return_when is the condition on which to return (FIRST_COMPLETED, FIRST_EXCEPTION, or ALL_COMPLETED).
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED, ALL_COMPLETED
import time

def spider(page):
    time.sleep(page)
    print(f"crawl task{page} finished")
    return page

with ThreadPoolExecutor(max_workers=5) as t:
    all_task = [t.submit(spider, page) for page in range(1, 5)]
    wait(all_task, return_when=FIRST_COMPLETED)
    print('finished')
    print(wait(all_task, timeout=2.5))
-----------------------result-------------------------------
crawl task1 finished
finished
crawl task2 finished
crawl task3 finished
DoneAndNotDoneFutures(done={<Future at 0x28c8710 state=finished returned int>, <Future at 0x2c2bfd0 state=finished returned int>, <Future at 0x2c1b7f0 state=finished returned int>}, not_done={<Future at 0x2c3a240 state=running>})
crawl task4 finished
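return_when also accepts FIRST_EXCEPTION: wait() returns as soon as any future raises an exception, or when all futures finish if none raise. A hedged sketch — the failing maybe_fail task is made up for illustration:

from concurrent.futures import ThreadPoolExecutor, wait, FIRST_EXCEPTION
import time

def maybe_fail(page):
    time.sleep(page)
    if page == 2:
        raise ValueError(f"task{page} failed")  # hypothetical failure
    return page

with ThreadPoolExecutor(max_workers=5) as t:
    tasks = [t.submit(maybe_fail, page) for page in range(1, 5)]
    # Returns at ~2s, once task2 raises; task3 and task4 are still running.
    done, not_done = wait(tasks, return_when=FIRST_EXCEPTION)
    print(len(done), len(not_done))  # 2 2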
as_completed() yields futures as their tasks finish. This generator method blocks while no task has completed, unless a timeout is set.
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

def spider(page):
    time.sleep(page)
    print(f"crawl task{page} finished")
    return page

def main():
    with ThreadPoolExecutor(max_workers=5) as t:
        obj_list = []
        for page in range(1, 5):
            obj = t.submit(spider, page)
            obj_list.append(obj)

        for future in as_completed(obj_list):
            data = future.result()
            print(f"main: {data}")

main()
-----------------------result-------------------------------
crawl task1 finished
main: 1
crawl task2 finished
main: 2
crawl task3 finished
main: 3
crawl task4 finished
main: 4
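The timeout mentioned above bounds the whole iteration: if the next future is not done within timeout seconds of the as_completed() call, a TimeoutError is raised. A minimal sketch (the 2.5-second budget is chosen just for illustration):

from concurrent.futures import ThreadPoolExecutor, as_completed, TimeoutError
import time

def spider(page):
    time.sleep(page)
    return page

with ThreadPoolExecutor(max_workers=5) as t:
    tasks = [t.submit(spider, page) for page in range(1, 5)]
    try:
        for future in as_completed(tasks, timeout=2.5):
            print(future.result())  # prints 1, then 2
    except TimeoutError as e:
        print(e)  # task3 and task4 are still unfinished at 2.5s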
map(fn, *iterables, timeout=None): like the built-in map(), but fn runs concurrently; results are yielded in the order the inputs were submitted, not the order the tasks finish.
import time
from concurrent.futures import ThreadPoolExecutor

def spider(page):
    time.sleep(page)
    return page

with ThreadPoolExecutor(max_workers=4) as executor:
    for i, result in enumerate(executor.map(spider, [2, 3, 1, 4]), start=1):
        print("task{}:{}".format(i, result))
-----------------------result-------------------------------
task1:2
task2:3
task3:1
task4:4
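The output above confirms the input-order guarantee: task3 sleeps only 1 second, yet it still prints third. map() also honors the timeout parameter; a hedged sketch (the 2.5-second limit is illustrative) — iterating the results raises TimeoutError as soon as the next in-order result is not ready in time:

import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def spider(page):
    time.sleep(page)
    return page

with ThreadPoolExecutor(max_workers=4) as executor:
    try:
        for result in executor.map(spider, [2, 3, 1, 4], timeout=2.5):
            print(result)  # prints 2; the 3-second task then misses the budget
    except TimeoutError:
        print("timed out waiting for the next result")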