requests-cache is a transparent, persistent HTTP cache for the Python requests library. It's a convenient tool for web scraping, consuming REST APIs, working with slow or rate-limited sites, or any other scenario in which you're making lots of requests that are expensive and/or likely to be sent more than once.
See full project documentation at: https://requests-cache.readthedocs.io
- Ease of use: Use as a drop-in replacement for `requests.Session`, or install globally to add caching to all `requests` functions
- Customization: Works out of the box with zero config, but with plenty of options available for customizing cache expiration and other behavior (see the sketch after this list)
- Persistence: Includes several storage backends: SQLite, Redis, MongoDB, and DynamoDB
- Compatibility: Can be used alongside other popular libraries based on requests
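For example, here's a minimal sketch of a customized session, using the SQLite backend and a one-hour expiration; the cache name and settings here are illustrative choices, not required values:

```python
from datetime import timedelta

import requests_cache

# A sketch of a customized session: SQLite backend, responses expire after 1 hour
session = requests_cache.CachedSession(
    'demo_cache',                     # cache name; stored as demo_cache.sqlite
    backend='sqlite',                 # other backends: 'redis', 'mongodb', 'dynamodb'
    expire_after=timedelta(hours=1),  # responses older than this are refetched
)
```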
First, install with pip:

```bash
pip install requests-cache
```
Next, use `requests_cache.CachedSession` to send and cache requests. To quickly demonstrate how to use it:
This takes ~1 minute:
```python
import requests

session = requests.Session()
for i in range(60):
    session.get('http://httpbin.org/delay/1')
```
This takes ~1 second:
```python
import requests_cache

session = requests_cache.CachedSession('demo_cache')
for i in range(60):
    session.get('http://httpbin.org/delay/1')
```
The URL in this example adds a delay of 1 second, simulating a slow or rate-limited website. With caching, the response will be fetched once, saved to `demo_cache.sqlite`, and subsequent requests will return the cached response near-instantly.
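To confirm which responses came from the cache, requests-cache adds a `from_cache` attribute to each response. A small sketch, reusing the `demo_cache` session from above:

```python
import requests_cache

session = requests_cache.CachedSession('demo_cache')

response = session.get('http://httpbin.org/delay/1')
print(response.from_cache)  # False: fetched over the network and stored

response = session.get('http://httpbin.org/delay/1')
print(response.from_cache)  # True: served from the cache near-instantly
```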
If you don't want to manage a session object, requests-cache can also be installed globally:

```python
import requests
import requests_cache

requests_cache.install_cache('demo_cache')
requests.get('http://httpbin.org/delay/1')
```
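Global patching can also be temporarily bypassed or removed entirely; a short sketch using the `disabled()` context manager and `uninstall_cache()` from the patching API:

```python
import requests
import requests_cache

requests_cache.install_cache('demo_cache')

# Temporarily bypass the cache for a block of requests
with requests_cache.disabled():
    requests.get('http://httpbin.org/delay/1')  # not cached

# Remove the patch, restoring plain requests behavior
requests_cache.uninstall_cache()
```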
To find out more about what you can do with requests-cache, see:
- The User Guide and Advanced Usage sections
- A working example at Real Python: Caching External API Requests
- More examples in the examples/ folder