Python Threading and Thread Pool
Simple threadpool and processpool decorators that make use of ThreadPoolExecutor and ProcessPoolExecutor, bounded to a fixed number of concurrent workers through BoundedThreadPoolExecutor and BoundedProcessPoolExecutor:
_DEFAULT_POOL = BoundedThreadPoolExecutor(max_workers=5)
_PROCESS_POOL = BoundedProcessPoolExecutor(max_workers=5)
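
What makes these pools "bounded" is that submit() is meant to block once every worker is busy, instead of queueing an unlimited backlog the way a plain ThreadPoolExecutor does. The snippet below is only a rough sketch of that idea using a semaphore around submit(); the class name SketchBoundedThreadPoolExecutor is made up here, and the real util.bounded_pool_executor implementation may differ:

# Rough sketch only -- not the actual util.bounded_pool_executor code.
from concurrent.futures import ThreadPoolExecutor
from threading import BoundedSemaphore

class SketchBoundedThreadPoolExecutor(ThreadPoolExecutor):
    """ThreadPoolExecutor whose submit() blocks while all workers are busy."""

    def __init__(self, max_workers):
        super().__init__(max_workers=max_workers)
        self._slots = BoundedSemaphore(max_workers)

    def submit(self, fn, *args, **kwargs):
        self._slots.acquire()                      # wait for a free worker slot
        try:
            future = super().submit(fn, *args, **kwargs)
        except Exception:
            self._slots.release()
            raise
        future.add_done_callback(lambda _: self._slots.release())
        return future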

Import the threadpool decorator and apply it to your high-load task:

from thread_support import threadpool
@threadpool
def my_high_load_task():
    res = do_job()
    return res
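
For CPU-bound work there is the processpool counterpart; assuming it is exported by the same thread_support module and behaves like threadpool, the usage would look like this (illustrative sketch only):

# Assumes processpool exists in thread_support and mirrors threadpool.
from thread_support import processpool

@processpool
def my_cpu_bound_task(data):
    # heavy number crunching runs in a separate worker process
    return sum(x * x for x in data)

Remember that anything handed to a process pool must be picklable, so such functions have to live at module level.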

Call the function and get the result with a synchronous call to result():

task = my_high_load_task()
res = task.result()
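
A decorator like threadpool can be as small as handing the call over to the shared pool and returning the resulting concurrent.futures Future, which is what provides the result() call used above. A minimal sketch along those lines, built on the _DEFAULT_POOL shown earlier (thread_support's actual implementation may differ):

# Minimal sketch -- thread_support's real implementation may differ.
from functools import wraps

def threadpool(fn):
    # Submit calls of fn to the shared bounded thread pool and return the Future.
    @wraps(fn)
    def wrapper(*args, **kwargs):
        return _DEFAULT_POOL.submit(fn, *args, **kwargs)
    return wrapper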

To get at the thread or process executors directly, import BoundedThreadPoolExecutor and instantiate it, specifying the max_workers parameter:

from util.bounded_pool_executor import BoundedThreadPoolExecutor
thread_executor = BoundedThreadPoolExecutor(max_workers=5)

Do the same for the BoundedProcessPoolExecutor:
from util.bounded_pool_executor import BoundedProcessPoolExecutor
process_executor = BoundedProcessPoolExecutor(max_workers=5)
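
These executors are used like the standard concurrent.futures ones, so you can submit work with submit() and gather results with as_completed(). A small illustrative example, with do_job standing in for your own callable:

# Standard concurrent.futures usage on the bounded executor.
from concurrent.futures import as_completed

futures = [thread_executor.submit(do_job) for _ in range(10)]
for future in as_completed(futures):
    print(future.result())   # each result as its job finishes

Because the pool is bounded, the later submit() calls will typically block until a worker slot frees up instead of piling work into an unlimited queue.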

To use these pools in Tornado you can import the BoundedThreadPoolExecutor and use it in conjunction with Tornado's @tornado.concurrent.run_on_executor decorator, like this:

import tornado.concurrent
import tornado.web
from util.bounded_pool_executor import BoundedThreadPoolExecutor
class MyHandler(tornado.web.RequestHandler):
    executor = BoundedThreadPoolExecutor(max_workers=5)

    @tornado.concurrent.run_on_executor
    def get(self, *args):
        # runs on the handler's bounded executor instead of the IOLoop thread
        serve_request()

Or you can simply import the threadpool decorator:
import tornado.web
from thread_support import threadpool
@threadpool
def my_high_load_task():
    res = do_job()
    return res
class MyHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self, *args):
        task = my_high_load_task()   # schedules the job on the shared pool
        res = task.result()          # blocks until the job has finished
        response()

Source code adapted from different sources: