import grequests

urls = [
    'http://www.heroku.com',
    'http://python-tablib.org',
    'http://httpbin.org',
    'http://python-requests.org',
    'http://fakedomain/',
    'http://kennethreitz.com'
]

rs = (grequests.get(u) for u in urls)
grequests.map(rs)
grequests.get takes the same arguments as requests.get(), and post, put, delete, and the other verbs are supported as well. Failed requests show up as None in the results by default; you can pass an exception_handler callback to map to deal with them (see the sketch after the parameter list below).
Parameters accepted by map:
:param requests: a collection of Request objects.
:param stream: if True, the content will not be downloaded immediately.
:param size: specifies the number of requests to make at a time. If None, no throttling occurs.
:param exception_handler: callback function, called when an exception occurs. Params: Request, Exception.
:param gtimeout: gevent joinall timeout in seconds. (Note: unrelated to the requests timeout.)
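A minimal sketch of throttling and error handling with map, based on the parameters above (the URLs and the handler name are placeholders):

import grequests

def on_exception(request, exception):
    # called once per failed request, with the Request and the Exception
    print('request to {0} failed: {1}'.format(request.url, exception))

urls = ['http://httpbin.org/get', 'http://fakedomain/']
rs = (grequests.get(u) for u in urls)
# at most 2 requests in flight at a time; give gevent 10 seconds overall
results = grequests.map(rs, size=2, exception_handler=on_exception, gtimeout=10)
print(results)  # Response objects for successes, None for failures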
If you prefer a generator, use imap. Note that imap defaults size to 2 and does not offer a gtimeout argument.
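Since imap yields responses as they complete, it suits processing results as they arrive. A sketch (URLs are placeholders):

import grequests

urls = ['http://httpbin.org/get', 'http://httpbin.org/delay/1']
rs = (grequests.get(u) for u in urls)
# responses arrive in completion order, not submission order; size=2
# just makes the default pool size explicit. Failed requests are skipped
# unless an exception_handler is supplied.
for response in grequests.imap(rs, size=2):
    print(response.status_code, response.url)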
requests-threads
Summary
Another async client from the requests author, this time built on Twisted.
from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def main():
    # the with statement cleans up the session's thread pool and connection
    # pool after use; you can also call session.close() if you want to keep
    # the session around for longer
    with Session() as session:
        # the first request is started in the background
        d1 = session.get('http://httpbin.org/get')
        # the second request is started immediately
        d2 = session.get('http://httpbin.org/get?foo=bar')
        # wait for the first request to complete, if it hasn't already
        response_one = yield d1
        print('response one status: {0}'.format(response_one.status_code))
        print(response_one.content)
        # wait for the second request to complete, if it hasn't already
        response_two = yield d2
        print('response two status: {0}'.format(response_two.status_code))
        print(response_two.content)
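An inlineCallbacks function returns a Deferred, so a Twisted reactor must be running to drive it. One way to do that, as a sketch using Twisted's task.react with the main defined above:

from twisted.internet import task

# react starts the reactor, waits for the Deferred returned by the
# callable to fire, then stops the reactor
task.react(lambda reactor: main())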
treq
Summary
A Twisted-based HTTP client with an interface modeled after requests. Its documentation is extensive.
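A minimal sketch in the style of treq's own examples (the URL is a placeholder):

import treq
from twisted.internet.defer import ensureDeferred
from twisted.internet.task import react

async def main(reactor):
    # treq.get returns a Deferred that fires with a response object
    response = await treq.get('https://httpbin.org/get')
    print(response.code)
    # the body is fetched asynchronously as well
    body = await response.text()
    print(body)

react(lambda reactor: ensureDeferred(main(reactor)))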
gevent

gevent runs blocking-looking code concurrently in lightweight greenlets. The classic example resolves several hostnames in parallel:

import gevent
from gevent import socket

urls = ['www.google.com', 'www.example.com', 'www.python.org']
# spawn one greenlet per DNS lookup
jobs = [gevent.spawn(socket.gethostbyname, url) for url in urls]
# wait up to 2 seconds for all lookups to finish
gevent.joinall(jobs, timeout=2)
# job.value holds each result (None for lookups that did not finish)
print([job.value for job in jobs])
gevent can also monkey-patch the standard library, turning its blocking networking calls into cooperative, asynchronous operations:
from gevent import monkey
# patch the standard socket module so blocking calls yield to the event loop
monkey.patch_socket()
import urllib2  # urllib2 (Python 2) now does cooperative, non-blocking I/O
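Once patched, ordinary blocking code run in greenlets proceeds concurrently. A sketch, assuming Python 2 to match the urllib2 import above (patch_socket leaves ssl untouched, so plain http URLs are used):

from gevent import monkey
monkey.patch_socket()

import gevent
import urllib2

def fetch(url):
    # looks blocking, but yields to other greenlets while waiting on I/O
    return urllib2.urlopen(url).read()

urls = ['http://httpbin.org/get', 'http://example.com/']
jobs = [gevent.spawn(fetch, u) for u in urls]
gevent.joinall(jobs, timeout=10)
print([len(job.value) for job in jobs if job.value is not None])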