Blocking Queues in Python

multiprocessing.Queue

The Queue class in Python's multiprocessing module is a blocking queue that can be used across multiple processes as well as across multiple threads (a thread-based sketch is included after the full example below).

API usage

  1. Creating a queue
import multiprocessing

# create an unbounded queue
queue = multiprocessing.Queue()
# create a size-limited queue
queue = multiprocessing.Queue(maxsize=100)
  2. Adding items with put

On a size-limited queue, put blocks while the queue is full and only adds the item once room becomes available.
Blocking is controlled by the block argument, and a timeout can be set with timeout; if the item cannot be added, the Full exception from the standard library's queue module is raised. A small runnable sketch follows the snippets below.

from queue import Full  # multiprocessing.Queue raises the standard library's queue.Full

# add an item to the queue (blocks until there is room)
queue.put(item)
# the same call with the default arguments spelled out
queue.put(item, block=True, timeout=None)
# add an item to a size-limited queue without blocking
try:
    queue.put(item, block=False)
except Full:
    ...  # handle the full queue
# add an item to a size-limited queue with a timeout
try:
    queue.put(item, timeout=5)
except Full:
    ...  # handle the full queue
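
A minimal runnable sketch of the non-blocking put behavior (the maxsize of 2 and the loop are only illustrative):

from multiprocessing import Queue
from queue import Full

if __name__ == '__main__':
    # a queue that can hold at most 2 items
    queue = Queue(maxsize=2)
    for i in range(3):
        try:
            # do not block; raise Full immediately if there is no room
            queue.put(i, block=False)
            print(f'put {i}')
        except Full:
            print(f'queue is full, dropped {i}')
    # drain what was accepted so the process exits cleanly
    print(queue.get(), queue.get())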
  3. Getting items with get

get blocks while the queue is empty and returns an item once one is available.
Blocking is controlled by the block argument, and a timeout can be set with timeout; if no item arrives in time, the Empty exception from the standard library's queue module is raised. A small runnable sketch follows the snippets below.

from queue import Empty  # multiprocessing.Queue raises the standard library's queue.Empty

# get an item from the queue (blocks until one is available)
item = queue.get()
# the same call with the default arguments spelled out
item = queue.get(block=True, timeout=None)
# get an item from the queue with a timeout
try:
    item = queue.get(timeout=10)
except Empty:
    ...  # handle the empty queue
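
A minimal runnable sketch of get with a timeout on an empty queue (the 1-second timeout is only illustrative):

from multiprocessing import Queue
from queue import Empty

if __name__ == '__main__':
    queue = Queue()
    try:
        # wait at most 1 second for an item, then give up
        item = queue.get(timeout=1)
    except Empty:
        print('no item arrived within 1 second')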
  4. Size-related methods
# check the (approximate) size of the queue
size = queue.qsize()
# check if the queue is empty
if queue.empty():
    ...  # no items available right now
# check if the queue is full
if queue.full():
    ...  # no room right now
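
A small sketch of the size-related calls. In a multi-process setting these values are only best-effort snapshots, and qsize can raise NotImplementedError on platforms such as macOS:

from multiprocessing import Queue

if __name__ == '__main__':
    queue = Queue(maxsize=2)
    print(queue.empty())   # True: nothing has been put yet
    queue.put('a')
    queue.put('b')
    print(queue.full())    # True: the queue now holds maxsize items
    print(queue.qsize())   # 2 (may raise NotImplementedError on macOS)
    # drain the queue so the process exits cleanly
    print(queue.get(), queue.get())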

Usage example

A small example of the producer-consumer pattern. The producer puts a None sentinel on the queue when it is done, which tells the consumer to stop.

# example of using the queue with processes
from time import sleep
from random import random
from multiprocessing import Process
from multiprocessing import Queue
 
# generate work
def producer(queue):
    print('Producer: Running', flush=True)
    # generate work
    for i in range(10):
        # generate a value
        value = random()
        # block
        sleep(value)
        # add to the queue
        queue.put(value)
    # all done
    queue.put(None)
    print('Producer: Done', flush=True)
 
# consume work
def consumer(queue):
    print('Consumer: Running', flush=True)
    # consume work
    while True:
        # get a unit of work
        item = queue.get()
        # check for stop
        if item is None:
            break
        # report
        print(f'>got {item}', flush=True)
    # all done
    print('Consumer: Done', flush=True)
 
# entry point
if __name__ == '__main__':
    # create the shared queue
    queue = Queue()
    # start the consumer
    consumer_process = Process(target=consumer, args=(queue,))
    consumer_process.start()
    # start the producer
    producer_process = Process(target=producer, args=(queue,))
    producer_process.start()
    # wait for all processes to finish
    producer_process.join()
    consumer_process.join()
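
As noted at the top, the same queue can also be shared between threads within a single process. A minimal thread-based sketch of the same idea (the worker function and item count are only illustrative):

from threading import Thread
from multiprocessing import Queue

# consume work in a thread
def worker(queue):
    while True:
        item = queue.get()
        # check for stop
        if item is None:
            break
        print(f'>thread got {item}', flush=True)

if __name__ == '__main__':
    queue = Queue()
    # start the worker thread
    thread = Thread(target=worker, args=(queue,))
    thread.start()
    # produce work in the main thread
    for i in range(5):
        queue.put(i)
    # signal the worker to stop
    queue.put(None)
    thread.join()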

More information

The full Queue API is described in the official documentation of the multiprocessing module.
