🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 20 (from laksa141)
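The shard number shown here is typically derived by hashing the page's domain and reducing modulo the shard count. The inspector's actual hash function and shard count are not shown in this output, so the following is only a generic sketch of hash-based sharding; `NUM_SHARDS` is a hypothetical value.

```python
import hashlib

NUM_SHARDS = 64  # hypothetical shard count; the real value is deployment-specific


def shard_for_domain(domain: str) -> int:
    """Map a domain to a shard by hashing it and reducing modulo the
    shard count. Generic sketch only: the inspector's real hash
    function is not shown above."""
    digest = hashlib.sha1(domain.encode("utf-8")).digest()
    # Take the first 8 bytes of the digest as an unsigned integer
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS


shard = shard_for_domain("livebook.manning.com")
```

The key property this relies on is determinism: the same domain always hashes to the same shard, so every lookup for a domain is routed to the same node.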

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:
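This step checks whether the site's robots.txt permits crawling the URL. A minimal sketch of the same check using Python's standard-library parser, with a hypothetical robots.txt body (the actual query and response are not captured above):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body; the real policy for the site is not shown.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# The chapter URL is not matched by any Disallow rule in this policy
allowed = rp.can_fetch(
    "*",
    "https://livebook.manning.com/book/python-concurrency-with-asyncio/chapter-6",
)
```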

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

📄 INDEXABLE
✅ CRAWLED (26 days ago)
🤖 ROBOTS ALLOWED

Page Info Filters

| Filter       | Status | Condition                                          | Details         |
|--------------|--------|----------------------------------------------------|-----------------|
| HTTP status  | PASS   | `download_http_code = 200`                         | HTTP 200        |
| Age cutoff   | PASS   | `download_stamp > now() - 6 MONTH`                 | 0.9 months ago  |
| History drop | PASS   | `isNull(history_drop_reason)`                      | No drop reason  |
| Spam/ban     | PASS   | `fh_dont_index != 1 AND ml_spam_score = 0`         | ml_spam_score=0 |
| Canonical    | PASS   | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set         |
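The filters above are evaluated in the crawler's database, but their combined logic can be sketched as a single Python predicate. Field names follow the conditions shown; the six-month cutoff is approximated as 182 days, which is an assumption.

```python
from datetime import datetime, timedelta
from typing import Optional


def is_indexable(
    download_http_code: int,
    download_stamp: datetime,
    history_drop_reason: Optional[str],
    fh_dont_index: int,
    ml_spam_score: float,
    meta_canonical: Optional[str],
    src_unparsed: str,
) -> bool:
    """Sketch of the five page-info filters as one predicate; the real
    evaluation happens in SQL, not Python."""
    return (
        download_http_code == 200                                  # HTTP status
        and download_stamp > datetime.now() - timedelta(days=182)  # age cutoff (~6 months)
        and history_drop_reason is None                            # history drop
        and fh_dont_index != 1 and ml_spam_score == 0              # spam/ban
        and meta_canonical in (None, "", src_unparsed)             # canonical
    )
```

A page passes only when every condition holds; any single failure (stale crawl, spam score, foreign canonical) makes it non-indexable.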

Page Details

| Property         | Value |
|------------------|-------|
| URL              | https://livebook.manning.com/book/python-concurrency-with-asyncio/chapter-6 |
| Last Crawled     | 2026-03-19 20:35:28 (26 days ago) |
| First Indexed    | 2023-05-17 21:54:08 (2 years ago) |
| HTTP Status Code | 200 |
| Meta Title       | 6 Handling CPU-bound work · Python Concurrency with asyncio |
| Meta Description | The multiprocessing library · Creating process pools to handle CPU-bound work · Using async and await to manage CPU-bound work · Using MapReduce to solve a CPU-intensive problem with asyncio · Handling shared data between multiple processes with locks · Improving the performance of work with both CPU- and I/O-bound operations |
| Meta Canonical   | null |
Boilerpipe Text
The multiprocessing library Creating process pools to handle CPU-bound work Using async and await to manage CPU-bound work Using MapReduce to solve a CPU-intensive problem with asyncio Handling shared data between multiple processes with locks Improving the performance of work with both CPU- and I/O-bound operations Until now, we’ve been focused on performance gains we can get with asyncio when running I/O-bound work concurrently. Running I/O-bound work is asyncio’s bread and butter, and with the way we’ve written code so far, we need to be careful not to run any CPU-bound code in our coroutines. This seems like it severely limits asyncio, but the library is more versatile than just handling I/O-bound work. asyncio has an API for interoperating with Python’s multiprocessing library. This lets us use async await syntax as well as asyncio APIs with multiple processes. Using this, we can get the benefits of the asyncio library even when using CPU-bound code. This allows us to achieve performance gains for CPU-intensive work, such as mathematical computations or data processing, letting us sidestep the global interpreter lock and take full advantage of a multicore machine. 6.1 Introducing the multiprocessing library 6.2 Using process pools 6.2.1 Using asynchronous results 6.3 Using process pool executors with asyncio 6.3.1 Introducing process pool executors 6.3.2 Process pool executors with the asyncio event loop 6.4 Solving a problem with MapReduce using asyncio 6.4.1 A simple MapReduce example 6.4.2 The Google Books Ngram dataset 6.4.3 Mapping and reducing with asyncio 6.5 Shared data and locks 6.5.1 Sharing data and race conditions 6.5.2 Synchronizing with locks 6.5.3 Sharing data with process pools 6.6 Multiple processes, multiple event loops Summary
Markdown
chapter six # 6 Handling CPU-bound work ### This chapter covers - The multiprocessing library - Creating process pools to handle CPU-bound work - Using `async` and `await` to manage CPU-bound work - Using MapReduce to solve a CPU-intensive problem with asyncio - Handling shared data between multiple processes with locks - Improving the performance of work with both CPU- and I/O-bound operations Until now, we’ve been focused on performance gains we can get with asyncio when running I/O-bound work concurrently. Running I/O-bound work is asyncio’s bread and butter, and with the way we’ve written code so far, we need to be careful not to run any CPU-bound code in our coroutines. This seems like it severely limits asyncio, but the library is more versatile than just handling I/O-bound work. asyncio has an API for interoperating with Python’s multiprocessing library. This lets us use async `await` syntax as well as asyncio APIs with multiple processes. Using this, we can get the benefits of the asyncio library even when using CPU-bound code. This allows us to achieve performance gains for CPU-intensive work, such as mathematical computations or data processing, letting us sidestep the global interpreter lock and take full advantage of a multicore machine. 
## 6\.1 Introducing the multiprocessing library ## 6\.2 Using process pools ### 6\.2.1 Using asynchronous results ## 6\.3 Using process pool executors with asyncio ### 6\.3.1 Introducing process pool executors ### 6\.3.2 Process pool executors with the asyncio event loop ## 6\.4 Solving a problem with MapReduce using asyncio ### 6\.4.1 A simple MapReduce example ### 6\.4.2 The Google Books Ngram dataset ### 6\.4.3 Mapping and reducing with asyncio ## 6\.5 Shared data and locks ### 6\.5.1 Sharing data and race conditions ### 6\.5.2 Synchronizing with locks ### 6\.5.3 Sharing data with process pools ## 6\.6 Multiple processes, multiple event loops ## Summary
Readable Markdown
- The multiprocessing library - Creating process pools to handle CPU-bound work - Using `async` and `await` to manage CPU-bound work - Using MapReduce to solve a CPU-intensive problem with asyncio - Handling shared data between multiple processes with locks - Improving the performance of work with both CPU- and I/O-bound operations Until now, we’ve been focused on performance gains we can get with asyncio when running I/O-bound work concurrently. Running I/O-bound work is asyncio’s bread and butter, and with the way we’ve written code so far, we need to be careful not to run any CPU-bound code in our coroutines. This seems like it severely limits asyncio, but the library is more versatile than just handling I/O-bound work. asyncio has an API for interoperating with Python’s multiprocessing library. This lets us use async `await` syntax as well as asyncio APIs with multiple processes. Using this, we can get the benefits of the asyncio library even when using CPU-bound code. This allows us to achieve performance gains for CPU-intensive work, such as mathematical computations or data processing, letting us sidestep the global interpreter lock and take full advantage of a multicore machine. 6\.1 Introducing the multiprocessing library 6\.2 Using process pools 6\.2.1 Using asynchronous results 6\.3 Using process pool executors with asyncio 6\.3.1 Introducing process pool executors 6\.3.2 Process pool executors with the asyncio event loop 6\.4 Solving a problem with MapReduce using asyncio 6\.4.1 A simple MapReduce example 6\.4.2 The Google Books Ngram dataset 6\.4.3 Mapping and reducing with asyncio 6\.5 Shared data and locks 6\.5.1 Sharing data and race conditions 6\.5.2 Synchronizing with locks 6\.5.3 Sharing data with process pools 6\.6 Multiple processes, multiple event loops Summary
Shard: 20 (laksa)
Root Hash: 7867033290496853820
Unparsed URL: com,manning!livebook,/book/python-concurrency-with-asyncio/chapter-6 s443
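The "unparsed URL" appears to be a reversed-host key (similar to SURT form): registrable domain labels reversed and comma-joined, subdomains after a `!`, then the path and a scheme/port suffix. This is inferred from the single example above; the real rules (in particular how the registrable domain is split off) are not shown, so the sketch below is an assumption that simply treats the last two labels as registrable.

```python
def to_unparsed(host: str, path: str, port: int = 443) -> str:
    """Build a reversed-host key like the 'Unparsed URL' above.

    Assumption: the registrable domain is the last two labels; real
    implementations use a public-suffix list instead.
    """
    labels = host.split(".")
    registrable = labels[-2:]   # e.g. ["manning", "com"]
    subdomains = labels[:-2]    # e.g. ["livebook"]
    key = ",".join(reversed(registrable))
    if subdomains:
        key += "!" + ".".join(subdomains)
    return f"{key},{path} s{port}"


key = to_unparsed(
    "livebook.manning.com",
    "/book/python-concurrency-with-asyncio/chapter-6",
)
```

Keys in this form sort all pages of a domain (and its subdomains) together, which is why crawlers commonly use it as a primary index key.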