Response content too large
$ python -c "import requests; len(requests.get('https://big.example.com/blob').content)"
# May raise MemoryError or cause process to be killed
Why this happens
Accessing .content (or .text) reads the entire response body into memory at once. For very large bodies this can raise MemoryError or get the process OOM-killed.
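One defensive option is to check the Content-Length header before downloading at all. This is a minimal sketch under assumptions: the cap MAX_BYTES and the helper body_too_large are hypothetical names, and servers may omit Content-Length entirely, so a missing header must be treated as "size unknown" rather than "safe".

```python
MAX_BYTES = 100 * 1024 * 1024  # hypothetical cap: 100 MiB

def body_too_large(headers, limit=MAX_BYTES):
    """Return True when Content-Length declares a body over the limit.

    Servers may omit Content-Length (e.g. chunked transfer encoding);
    a missing header means the size is unknown, so we return False
    and rely on streaming as the real safeguard.
    """
    length = headers.get("Content-Length")
    return length is not None and int(length) > limit

# Example with a plain dict standing in for response headers:
print(body_too_large({"Content-Length": str(200 * 1024 * 1024)}))  # True
print(body_too_large({"Content-Length": "1024"}))                  # False
print(body_too_large({}))                                          # False
```

With requests you would pass r.headers after a HEAD or streamed GET; the check is advisory only, since Content-Length can be absent or wrong.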
Fix
- Stream and process the body in chunks with stream=True and iter_content().
Wrong code
import requests
content = requests.get("https://big.example.com/blob").content
print(len(content))
Fixed code
import requests

with requests.get("https://big.example.com/blob", stream=True) as r:
    r.raise_for_status()
    for chunk in r.iter_content(chunk_size=1024 * 1024):
        # process each 1 MiB chunk without buffering the whole body
        pass
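The same chunked pattern works for any per-chunk processing, not just discarding data. Here is a self-contained sketch, using an in-memory io.BytesIO stream as a stand-in for the response body, that computes a SHA-256 digest in 1 MiB chunks so memory stays constant regardless of payload size; hash_stream is a hypothetical helper name.

```python
import hashlib
import io

def hash_stream(stream, chunk_size=1024 * 1024):
    """Hash a file-like object in fixed-size chunks (constant memory)."""
    digest = hashlib.sha256()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        digest.update(chunk)
    return digest.hexdigest()

# Simulate a 5 MiB body; with requests you could pass r.raw
# (from a stream=True response) or feed chunks from r.iter_content()
# into digest.update() the same way.
payload = b"x" * (5 * 1024 * 1024)
print(hash_stream(io.BytesIO(payload)))
```

Writing chunks to a file with open(path, "wb") inside the loop follows the identical shape: each chunk is consumed and released before the next one arrives.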