🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 174 (from laksa134)
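The report does not show the shard query itself, but shard assignment of this kind is typically a stable hash of the host reduced modulo the shard count. A hypothetical sketch; the hash function, the shard count, and the `shard_for_host` name are all assumptions, not this crawler's actual logic:

```python
# Hypothetical sketch of shard calculation: a stable hash of the host
# reduced modulo the shard count. The hash function, the shard count,
# and the input form are all assumptions -- the report does not show
# the actual query.
import hashlib

NUM_SHARDS = 256  # assumed; not confirmed by the report

def shard_for_host(host: str, num_shards: int = NUM_SHARDS) -> int:
    # SHA-1 gives a stable cross-process hash (unlike Python's hash()).
    digest = hashlib.sha1(host.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

print(shard_for_host("github.com"))
```

A real system would more likely hash a normalized form of the URL or the registered domain; the report's `Root Hash` field suggests a precomputed 64-bit value is already available.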

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

📄
INDEXABLE
CRAWLED
1 month ago
🤖
ROBOTS ALLOWED

Page Info Filters

| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 1.1 months ago (distributed domain, exempt) |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | `ml_spam_score = 0` |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
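The conditions above read like SQL predicates evaluated against a per-page record. As a rough illustration, here is the same filter set re-expressed in Python; the field names come from the table, but the evaluation order, the 6-month window, and the distributed-domain exemption handling are assumptions:

```python
# Rough re-expression of the filter table in Python. Field names are
# taken from the table; the distributed-domain age exemption and the
# exact 6-month window semantics are assumptions.
from datetime import datetime, timedelta

page = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 3, 5, 12, 34, 47),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "com,github!/dano/aioprocessing s443",
}

def indexable(p: dict, now: datetime, distributed_domain: bool = False) -> bool:
    checks = [
        p["download_http_code"] == 200,                       # HTTP status
        distributed_domain                                    # Age cutoff
        or p["download_stamp"] > now - timedelta(days=183),
        p["history_drop_reason"] is None,                     # History drop
        p["fh_dont_index"] != 1 and p["ml_spam_score"] == 0,  # Spam/ban
        p["meta_canonical"] in (None, "", p["src_unparsed"]), # Canonical
    ]
    return all(checks)

print(indexable(page, now=datetime(2026, 4, 10), distributed_domain=True))  # True
```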

Page Details

| Property | Value |
|---|---|
| URL | https://github.com/dano/aioprocessing |
| Last Crawled | 2026-03-05 12:34:47 (1 month ago) |
| First Indexed | 2014-10-25 20:45:14 (11 years ago) |
| HTTP Status Code | 200 |
| Meta Title | GitHub - dano/aioprocessing: A Python 3.5+ library that integrates the multiprocessing module with asyncio · GitHub |
| Meta Description | A Python 3.5+ library that integrates the multiprocessing module with asyncio - dano/aioprocessing |
| Meta Canonical | null |
Boilerpipe Text
aioprocessing provides asynchronous, asyncio compatible, coroutine versions of many blocking instance methods on objects in the multiprocessing library. To use dill for universal pickling, install using pip install aioprocessing[dill]. Here's an example demonstrating the aioprocessing versions of Event, Queue, and Lock:

```
import time
import asyncio
import aioprocessing

def func(queue, event, lock, items):
    """ Demo worker function.

    This worker function runs in its own process, and uses normal
    blocking calls to aioprocessing objects, exactly the way you
    would use oridinary multiprocessing objects.
    """
    with lock:
        event.set()
        for item in items:
            time.sleep(3)
            queue.put(item + 5)
    queue.close()

async def example(queue, event, lock):
    l = [1, 2, 3, 4, 5]
    p = aioprocessing.AioProcess(target=func, args=(queue, event, lock, l))
    p.start()
    while True:
        result = await queue.coro_get()
        if result is None:
            break
        print("Got result {}".format(result))
    await p.coro_join()

async def example2(queue, event, lock):
    await event.coro_wait()
    async with lock:
        await queue.coro_put(78)
        await queue.coro_put(None)  # Shut down the worker

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    queue = aioprocessing.AioQueue()
    lock = aioprocessing.AioLock()
    event = aioprocessing.AioEvent()
    tasks = [
        asyncio.ensure_future(example(queue, event, lock)),
        asyncio.ensure_future(example2(queue, event, lock)),
    ]
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()
```

The aioprocessing objects can be used just like their multiprocessing equivalents - as they are in func above - but they can also be seamlessly used inside of asyncio coroutines, without ever blocking the event loop.

What's new

v2.0.1 Fixed a bug that kept the AioBarrier and AioEvent proxies returned from AioManager instances from working. Thanks to Giorgos Apostolopoulos for the fix.
v2.0.0 Add support for universal pickling using dill, installable with pip install aioprocessing[dill]. The library will now attempt to import multiprocess, falling back to stdlib multiprocessing. Force stdlib behaviour by setting a non-empty environment variable AIOPROCESSING_DILL_DISABLED=1. This can be used to avoid errors when attempting to combine aioprocessing[dill] with stdlib multiprocessing based objects like concurrent.futures.ProcessPoolExecutor.

How does it work?

In most cases, this library makes blocking calls to multiprocessing methods asynchronous by executing the call in a ThreadPoolExecutor, using asyncio.run_in_executor(). It does not re-implement multiprocessing using asynchronous I/O. This means there is extra overhead added when you use aioprocessing objects instead of multiprocessing objects, because each one is generally introducing a ThreadPoolExecutor containing at least one threading.Thread. It also means that all the normal risks you get when you mix threads with fork apply here, too (see http://bugs.python.org/issue6721 for more info).

The one exception to this is aioprocessing.AioPool, which makes use of the existing callback and error_callback keyword arguments in the various Pool.*_async methods to run them as asyncio coroutines. Note that multiprocessing.Pool is actually using threads internally, so the thread/fork mixing caveat still applies.

Each multiprocessing class is replaced by an equivalent aioprocessing class, distinguished by the Aio prefix. So, Pool becomes AioPool, etc. All methods that could block on I/O also have a coroutine version that can be used with asyncio. For example, multiprocessing.Lock.acquire() can be replaced with aioprocessing.AioLock.coro_acquire(). You can pass an asyncio EventLoop object to any coro_* method using the loop keyword argument. For example, lock.coro_acquire(loop=my_loop).

Note that you can also use the aioprocessing synchronization primitives as replacements for their equivalent threading primitives, in single-process, multi-threaded programs that use asyncio.

What parts of multiprocessing are supported?

Most of them! All methods that could do blocking I/O in the following objects have equivalent versions in aioprocessing that extend the multiprocessing versions by adding coroutine versions of all the blocking methods: Pool, Process, Pipe, Lock, RLock, Semaphore, BoundedSemaphore, Event, Condition, Barrier, connection.Connection, connection.Listener, connection.Client, Queue, JoinableQueue, SimpleQueue, and all managers.SyncManager Proxy versions of the items above (SyncManager.Queue, SyncManager.Lock(), etc.).

What versions of Python are compatible?

aioprocessing will work out of the box on Python 3.5+.

Gotchas

Keep in mind that, while the API exposes coroutines for interacting with multiprocessing APIs, internally they are almost always being delegated to a ThreadPoolExecutor. This means the caveats that apply when using ThreadPoolExecutor with asyncio apply: namely, you won't be able to cancel any of the coroutines, because the work being done in the worker thread can't be interrupted.
Markdown
[dano](https://github.com/dano) / **[aioprocessing](https://github.com/dano/aioprocessing)** Public - Fork 37 - Star 662 - [Code](https://github.com/dano/aioprocessing) - [Issues 7](https://github.com/dano/aioprocessing/issues) - [Pull requests 0](https://github.com/dano/aioprocessing/pulls) - [Actions](https://github.com/dano/aioprocessing/actions) - [Projects](https://github.com/dano/aioprocessing/projects) - [Wiki](https://github.com/dano/aioprocessing/wiki) - [Security 0](https://github.com/dano/aioprocessing/security) - [Insights](https://github.com/dano/aioprocessing/pulse) # dano/aioprocessing master [**3** Branches](https://github.com/dano/aioprocessing/branches) [**3** Tags](https://github.com/dano/aioprocessing/tags) ## Folders and files Latest commit [dano](https://github.com/dano/aioprocessing/commits?author=dano) [Fix flake issue](https://github.com/dano/aioprocessing/commit/39e84ac016aa21d93835b44ca134b0791c8c13ba) Sep 16, 2022
[39e84ac](https://github.com/dano/aioprocessing/commit/39e84ac016aa21d93835b44ca134b0791c8c13ba) · Sep 16, 2022 History [148 Commits](https://github.com/dano/aioprocessing/commits/master/) Open commit details 148 Commits | | | | ## Repository files navigation - [README](https://github.com/dano/aioprocessing) - [License](https://github.com/dano/aioprocessing) # aioprocessing [![Build Status](https://github.com/dano/aioprocessing/workflows/aioprocessing%20tests/badge.svg?branch=master)](https://github.com/dano/aioprocessing/actions) `aioprocessing` provides asynchronous, [`asyncio`](https://docs.python.org/3/library/asyncio.html) compatible, coroutine versions of many blocking instance methods on objects in the [`multiprocessing`](https://docs.python.org/3/library/multiprocessing.html) library. To use [`dill`](https://pypi.org/project/dill) for universal pickling, install using `pip install aioprocessing[dill]`. Here's an example demonstrating the `aioprocessing` versions of `Event`, `Queue`, and `Lock`: ``` import time import asyncio import aioprocessing def func(queue, event, lock, items): """ Demo worker function. This worker function runs in its own process, and uses normal blocking calls to aioprocessing objects, exactly the way you would use oridinary multiprocessing objects. 
""" with lock: event.set() for item in items: time.sleep(3) queue.put(item+5) queue.close() async def example(queue, event, lock): l = [1,2,3,4,5] p = aioprocessing.AioProcess(target=func, args=(queue, event, lock, l)) p.start() while True: result = await queue.coro_get() if result is None: break print("Got result {}".format(result)) await p.coro_join() async def example2(queue, event, lock): await event.coro_wait() async with lock: await queue.coro_put(78) await queue.coro_put(None) # Shut down the worker if __name__ == "__main__": loop = asyncio.get_event_loop() queue = aioprocessing.AioQueue() lock = aioprocessing.AioLock() event = aioprocessing.AioEvent() tasks = [ asyncio.ensure_future(example(queue, event, lock)), asyncio.ensure_future(example2(queue, event, lock)), ] loop.run_until_complete(asyncio.wait(tasks)) loop.close() ``` The aioprocessing objects can be used just like their multiprocessing equivalents - as they are in `func` above - but they can also be seamlessly used inside of `asyncio` coroutines, without ever blocking the event loop. ## What's new `v2.0.1` - Fixed a bug that kept the `AioBarrier` and `AioEvent` proxies returned from `AioManager` instances from working. Thanks to Giorgos Apostolopoulos for the fix. `v2.0.0` - Add support for universal pickling using [`dill`](https://github.com/uqfoundation/dill), installable with `pip install aioprocessing[dill]`. The library will now attempt to import [`multiprocess`](https://github.com/uqfoundation/multiprocess), falling back to stdlib `multiprocessing`. Force stdlib behaviour by setting a non-empty environment variable `AIOPROCESSING_DILL_DISABLED=1`. This can be used to avoid [errors](https://github.com/dano/aioprocessing/pull/36#discussion_r631178933) when attempting to combine `aioprocessing[dill]` with stdlib `multiprocessing` based objects like `concurrent.futures.ProcessPoolExecutor`. ## How does it work? 
In most cases, this library makes blocking calls to `multiprocessing` methods asynchronous by executing the call in a [`ThreadPoolExecutor`](https://docs.python.org/3/library/concurrent.futures.html#threadpoolexecutor), using [`asyncio.run_in_executor()`](https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.BaseEventLoop.run_in_executor). It does *not* re-implement multiprocessing using asynchronous I/O. This means there is extra overhead added when you use `aioprocessing` objects instead of `multiprocessing` objects, because each one is generally introducing a `ThreadPoolExecutor` containing at least one [`threading.Thread`](https://docs.python.org/2/library/threading.html#thread-objects). It also means that all the normal risks you get when you mix threads with fork apply here, too (See <http://bugs.python.org/issue6721> for more info). The one exception to this is `aioprocessing.AioPool`, which makes use of the existing `callback` and `error_callback` keyword arguments in the various [`Pool.*_async`](https://docs.python.org/3/library/multiprocessing.html#multiprocessing.pool.Pool.apply_async) methods to run them as `asyncio` coroutines. Note that `multiprocessing.Pool` is actually using threads internally, so the thread/fork mixing caveat still applies. Each `multiprocessing` class is replaced by an equivalent `aioprocessing` class, distinguished by the `Aio` prefix. So, `Pool` becomes `AioPool`, etc. All methods that could block on I/O also have a coroutine version that can be used with `asyncio`. For example, `multiprocessing.Lock.acquire()` can be replaced with `aioprocessing.AioLock.coro_acquire()`. You can pass an `asyncio` EventLoop object to any `coro_*` method using the `loop` keyword argument. For example, `lock.coro_acquire(loop=my_loop)`. Note that you can also use the `aioprocessing` synchronization primitives as replacements for their equivalent `threading` primitives, in single-process, multi-threaded programs that use `asyncio`. 
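The delegation described above (a blocking call handed to a `ThreadPoolExecutor` via `run_in_executor()`) can be sketched with the stdlib alone. This is an illustration of the pattern, not aioprocessing's actual implementation; the `CoroLock` class and its `coro_acquire` method below are merely modeled on the library's naming:

```python
# Sketch of the delegation pattern described above: run the blocking
# multiprocessing call in a ThreadPoolExecutor via run_in_executor(),
# so the event loop is never blocked. Illustrative only -- this is not
# aioprocessing's actual implementation.
import asyncio
import multiprocessing
from concurrent.futures import ThreadPoolExecutor

class CoroLock:
    """A multiprocessing.Lock with a coroutine acquire, in the spirit
    of aioprocessing's AioLock.coro_acquire()."""

    def __init__(self):
        self._lock = multiprocessing.Lock()
        self._executor = ThreadPoolExecutor(max_workers=1)

    async def coro_acquire(self) -> bool:
        loop = asyncio.get_running_loop()
        # The blocking acquire() runs in a worker thread; the loop
        # stays free to run other coroutines while we wait.
        return await loop.run_in_executor(self._executor, self._lock.acquire)

    def release(self):
        self._lock.release()

async def main():
    lock = CoroLock()
    print("acquired:", await lock.coro_acquire())  # acquired: True
    lock.release()

asyncio.run(main())
```

Note the per-object executor: this mirrors the overhead point above, since each wrapped object carries at least one extra thread.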
## What parts of multiprocessing are supported? Most of them! All methods that could do blocking I/O in the following objects have equivalent versions in `aioprocessing` that extend the `multiprocessing` versions by adding coroutine versions of all the blocking methods. - `Pool` - `Process` - `Pipe` - `Lock` - `RLock` - `Semaphore` - `BoundedSemaphore` - `Event` - `Condition` - `Barrier` - `connection.Connection` - `connection.Listener` - `connection.Client` - `Queue` - `JoinableQueue` - `SimpleQueue` - All `managers.SyncManager` `Proxy` versions of the items above (`SyncManager.Queue`, `SyncManager.Lock()`, etc.). ## What versions of Python are compatible? `aioprocessing` will work out of the box on Python 3.5+. ## Gotchas Keep in mind that, while the API exposes coroutines for interacting with `multiprocessing` APIs, internally they are almost always being delegated to a `ThreadPoolExecutor`, this means the caveats that apply with using `ThreadPoolExecutor` with `asyncio` apply: namely, you won't be able to cancel any of the coroutines, because the work being done in the worker thread can't be interrupted. ## About A Python 3.5+ library that integrates the multiprocessing module with asyncio ### Topics [python](https://github.com/topics/python "Topic: python") [multiprocessing](https://github.com/topics/multiprocessing "Topic: multiprocessing") [coroutines](https://github.com/topics/coroutines "Topic: coroutines") [asyncio](https://github.com/topics/asyncio "Topic: asyncio") ### Resources [Readme](https://github.com/dano/aioprocessing#readme-ov-file) ### License [View license](https://github.com/dano/aioprocessing#License-1-ov-file) ### Uh oh\! There was an error while loading. [Please reload this page](https://github.com/dano/aioprocessing). 
[Activity](https://github.com/dano/aioprocessing/activity) ### Stars [**662** stars](https://github.com/dano/aioprocessing/stargazers) ### Watchers [**18** watching](https://github.com/dano/aioprocessing/watchers) ### Forks [**37** forks](https://github.com/dano/aioprocessing/forks) ## [Releases 3](https://github.com/dano/aioprocessing/releases) [v2.0.1 Latest Sep 15, 2022](https://github.com/dano/aioprocessing/releases/tag/v2.0.1) [\+ 2 releases](https://github.com/dano/aioprocessing/releases) ## [Packages 0](https://github.com/users/dano/packages?repo_name=aioprocessing) No packages published ## [Used by 570](https://github.com/dano/aioprocessing/network/dependents) ## [Contributors](https://github.com/dano/aioprocessing/graphs/contributors) ## Languages - [Python 100.0%](https://github.com/dano/aioprocessing/search?l=python) © 2026 GitHub, Inc.
Readable Markdown
[![Build Status](https://github.com/dano/aioprocessing/workflows/aioprocessing%20tests/badge.svg?branch=master)](https://github.com/dano/aioprocessing/actions) `aioprocessing` provides asynchronous, [`asyncio`](https://docs.python.org/3/library/asyncio.html) compatible, coroutine versions of many blocking instance methods on objects in the [`multiprocessing`](https://docs.python.org/3/library/multiprocessing.html) library. To use [`dill`](https://pypi.org/project/dill) for universal pickling, install using `pip install aioprocessing[dill]`. Here's an example demonstrating the `aioprocessing` versions of `Event`, `Queue`, and `Lock`: ``` import time import asyncio import aioprocessing def func(queue, event, lock, items): """ Demo worker function. This worker function runs in its own process, and uses normal blocking calls to aioprocessing objects, exactly the way you would use oridinary multiprocessing objects. """ with lock: event.set() for item in items: time.sleep(3) queue.put(item+5) queue.close() async def example(queue, event, lock): l = [1,2,3,4,5] p = aioprocessing.AioProcess(target=func, args=(queue, event, lock, l)) p.start() while True: result = await queue.coro_get() if result is None: break print("Got result {}".format(result)) await p.coro_join() async def example2(queue, event, lock): await event.coro_wait() async with lock: await queue.coro_put(78) await queue.coro_put(None) # Shut down the worker if __name__ == "__main__": loop = asyncio.get_event_loop() queue = aioprocessing.AioQueue() lock = aioprocessing.AioLock() event = aioprocessing.AioEvent() tasks = [ asyncio.ensure_future(example(queue, event, lock)), asyncio.ensure_future(example2(queue, event, lock)), ] loop.run_until_complete(asyncio.wait(tasks)) loop.close() ``` The aioprocessing objects can be used just like their multiprocessing equivalents - as they are in `func` above - but they can also be seamlessly used inside of `asyncio` coroutines, without ever blocking the event loop. 
What's new `v2.0.1` - Fixed a bug that kept the `AioBarrier` and `AioEvent` proxies returned from `AioManager` instances from working. Thanks to Giorgos Apostolopoulos for the fix. `v2.0.0` - Add support for universal pickling using [`dill`](https://github.com/uqfoundation/dill), installable with `pip install aioprocessing[dill]`. The library will now attempt to import [`multiprocess`](https://github.com/uqfoundation/multiprocess), falling back to stdlib `multiprocessing`. Force stdlib behaviour by setting a non-empty environment variable `AIOPROCESSING_DILL_DISABLED=1`. This can be used to avoid [errors](https://github.com/dano/aioprocessing/pull/36#discussion_r631178933) when attempting to combine `aioprocessing[dill]` with stdlib `multiprocessing` based objects like `concurrent.futures.ProcessPoolExecutor`. How does it work? In most cases, this library makes blocking calls to `multiprocessing` methods asynchronous by executing the call in a [`ThreadPoolExecutor`](https://docs.python.org/3/library/concurrent.futures.html#threadpoolexecutor), using [`asyncio.run_in_executor()`](https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.BaseEventLoop.run_in_executor). It does *not* re-implement multiprocessing using asynchronous I/O. This means there is extra overhead added when you use `aioprocessing` objects instead of `multiprocessing` objects, because each one is generally introducing a `ThreadPoolExecutor` containing at least one [`threading.Thread`](https://docs.python.org/2/library/threading.html#thread-objects). It also means that all the normal risks you get when you mix threads with fork apply here, too (See <http://bugs.python.org/issue6721> for more info). 
The one exception to this is `aioprocessing.AioPool`, which makes use of the existing `callback` and `error_callback` keyword arguments in the various [`Pool.*_async`](https://docs.python.org/3/library/multiprocessing.html#multiprocessing.pool.Pool.apply_async) methods to run them as `asyncio` coroutines. Note that `multiprocessing.Pool` is actually using threads internally, so the thread/fork mixing caveat still applies. Each `multiprocessing` class is replaced by an equivalent `aioprocessing` class, distinguished by the `Aio` prefix. So, `Pool` becomes `AioPool`, etc. All methods that could block on I/O also have a coroutine version that can be used with `asyncio`. For example, `multiprocessing.Lock.acquire()` can be replaced with `aioprocessing.AioLock.coro_acquire()`. You can pass an `asyncio` EventLoop object to any `coro_*` method using the `loop` keyword argument. For example, `lock.coro_acquire(loop=my_loop)`. Note that you can also use the `aioprocessing` synchronization primitives as replacements for their equivalent `threading` primitives, in single-process, multi-threaded programs that use `asyncio`. What parts of multiprocessing are supported? Most of them! All methods that could do blocking I/O in the following objects have equivalent versions in `aioprocessing` that extend the `multiprocessing` versions by adding coroutine versions of all the blocking methods. - `Pool` - `Process` - `Pipe` - `Lock` - `RLock` - `Semaphore` - `BoundedSemaphore` - `Event` - `Condition` - `Barrier` - `connection.Connection` - `connection.Listener` - `connection.Client` - `Queue` - `JoinableQueue` - `SimpleQueue` - All `managers.SyncManager` `Proxy` versions of the items above (`SyncManager.Queue`, `SyncManager.Lock()`, etc.). What versions of Python are compatible? `aioprocessing` will work out of the box on Python 3.5+. 
Gotchas Keep in mind that, while the API exposes coroutines for interacting with `multiprocessing` APIs, internally they are almost always being delegated to a `ThreadPoolExecutor`. This means the caveats that apply when using `ThreadPoolExecutor` with `asyncio` apply: namely, you won't be able to cancel any of the coroutines, because the work being done in the worker thread can't be interrupted.
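The cancellation caveat can be demonstrated with the stdlib alone: cancelling the future returned by `run_in_executor()` cancels the awaiter, but the worker thread still runs the job to completion. A sketch, not aioprocessing code:

```python
# Sketch of the cancellation caveat above: cancelling the awaiting
# future does not interrupt work already running in the executor
# thread -- the job still runs to completion.
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

done = []

def blocking_job():
    time.sleep(0.2)    # stands in for a blocking multiprocessing call
    done.append(True)  # reached even though the awaiter was cancelled

async def main():
    loop = asyncio.get_running_loop()
    pool = ThreadPoolExecutor(max_workers=1)
    fut = loop.run_in_executor(pool, blocking_job)
    await asyncio.sleep(0.05)  # let the worker thread start the job
    fut.cancel()               # cancels only the awaiting side
    try:
        await fut
    except asyncio.CancelledError:
        pass
    pool.shutdown(wait=True)   # the thread still finishes the job
    print("job finished anyway:", done == [True])

asyncio.run(main())
```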
| Property | Value |
|---|---|
| Shard | 174 (laksa) |
| Root Hash | 6325672905007345774 |
| Unparsed URL | `com,github!/dano/aioprocessing s443` |
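The unparsed form appears to reverse the host labels, join them with commas, and append the path after `!`, with `s443` marking HTTPS on port 443. A hypothetical reconstruction, inferred from this single example rather than any documented grammar:

```python
# Hypothetical reconstruction of the "Unparsed URL" form: host labels
# reversed and comma-joined, '!' before the path, and an "s443" suffix
# for HTTPS. The grammar is inferred from this single example and is
# not a documented format.
from urllib.parse import urlsplit

def to_unparsed(url: str) -> str:
    parts = urlsplit(url)
    host = ",".join(reversed(parts.hostname.split(".")))
    scheme_port = " s443" if parts.scheme == "https" else ""
    return f"{host}!{parts.path}{scheme_port}"

print(to_unparsed("https://github.com/dano/aioprocessing"))
# com,github!/dano/aioprocessing s443
```

This host-reversed layout (similar to SURT canonicalization used by web archives) keeps URLs from the same domain lexicographically adjacent, which is convenient for sharding and range scans.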