Python multithreaded and asynchronous crawlers - An asynchronous crawler experiment in Python [Celery, gevent, requests]

Published by weixin_39915267 on 2020-11-11

My crawlers used to run on a framework I wrote myself: a group of workers fetch tasks from a master and then start crawling. The number of processes equals the number of CPU cores, and crawl speed is raised by opening more threads.

Recently I looked at Celery. Its API is genuinely elegant, and it made me want to try writing a crawler with an asynchronous model.

Mock target

To make testing easy, I built a simple server with Tornado to simulate the site being crawled.

Its behavior is trivial: every request blocks for 6 seconds before replying.

```python
import time
from concurrent.futures import ThreadPoolExecutor

import tornado.web
import tornado.ioloop
import tornado.gen
from tornado.concurrent import run_on_executor


class MainHandler(tornado.web.RequestHandler):
    executor = ThreadPoolExecutor(40)

    @tornado.web.asynchronous
    @tornado.gen.coroutine
    def get(self):
        print(time.asctime())
        yield self.sleep(6)  # block 6 seconds before replying
        self.write("from server:" + time.asctime())
        self.finish()

    @run_on_executor
    def sleep(self, sec):
        time.sleep(sec)


if __name__ == "__main__":
    app = tornado.web.Application(handlers=[
        ("^/.*", MainHandler),
    ])
    app.listen(10240)
    tornado.ioloop.IOLoop.instance().start()
```

(Note: this 2016-era code uses the old Tornado API; `@tornado.web.asynchronous` was removed in Tornado 6, and `IOLoop.instance()` has since been superseded by `IOLoop.current()`.)

Consumer

The task module contains a single `spider` function, which uses gevent to request the given target.

```python
import gevent.monkey
gevent.monkey.patch_socket()

import socket

import gevent
import requests
from celery import Celery

app = Celery("tasks",
             broker="redis://127.0.0.1:6379/3",
             backend="redis://127.0.0.1:6379/3")


@app.task
def spider(url):
    # Run the blocking requests.get in a greenlet, then poll until it is done.
    resp = gevent.spawn(requests.get, url)
    tmp = 0
    while True:
        print("wait...", tmp)
        if resp.ready():
            return "from:" + socket.getfqdn() + " res:" + str(resp.value.text)
        gevent.sleep(1)
        tmp += 1
```

(`gevent.monkey.patch_socket()` only patches the socket module; `gevent.monkey.patch_all()` is the more common choice when third-party libraries such as requests are involved.)

Start Celery in gevent mode:

```shell
celery worker -A tasks --loglevel info -c 100 -P gevent
```

Here `-P gevent` selects the gevent execution pool and `-c 100` allows up to 100 greenlets per worker process.
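The spawn-then-poll structure inside `spider` can be sketched with the standard library's `concurrent.futures` — a rough thread-based analogue of gevent's greenlets, not the article's actual code; the names `fetch` and `spider_like` below are illustrative:

```python
import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(4)

def fetch(url):
    # Stand-in for a blocking requests.get: sleep briefly, return a fake body.
    time.sleep(0.3)
    return "body of " + url

def spider_like(url):
    # gevent.spawn(requests.get, url)  ->  executor.submit(fetch, url)
    future = executor.submit(fetch, url)
    waits = 0
    while not future.done():   # plays the role of resp.ready()
        waits += 1
        time.sleep(0.05)       # plays the role of gevent.sleep(1)
    return future.result(), waits  # plays the role of resp.value

body, waits = spider_like("http://example.test/1")
print(body, waits)
```

The key difference is that `future.done()` polls a real OS thread, whereas `resp.ready()` polls a greenlet that shares one thread with everything else.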

Producer

Use the `spider` function we just wrote to crawl the target.

During testing, six instances of the code below were started, and all of them returned within 7 seconds — proof that the experiment worked.

```python
import random
import time

from tasks import spider

res = spider.delay("http://127.0.0.1:10240/{}".format(random.randint(1, 999)))

i = 0
while True:
    if res.ready():
        print("res:", res.get())
        break
    else:
        print("wait...", i)
        time.sleep(1)
        i += 1
```

Part of Celery's log output:

You can see that within a single Celery process, multiple `spider` calls execute in turn:

```
[2016-08-20 21:27:11,281: INFO/MainProcess] Starting new HTTP connection (1): 127.0.0.1
[2016-08-20 21:27:11,313: INFO/MainProcess] Received task: tasks.spider[7b8b6f63-2bef-491e-a3a8-fdbcff824b9c]
[2016-08-20 21:27:11,314: WARNING/MainProcess] wait...
[2016-08-20 21:27:11,314: WARNING/MainProcess] 0
[2016-08-20 21:27:11,316: INFO/MainProcess] Starting new HTTP connection (1): 127.0.0.1
[2016-08-20 21:27:11,354: INFO/MainProcess] Received task: tasks.spider[5aa05e65-504d-4a04-8247-3f5708bfa46f]
[2016-08-20 21:27:11,356: WARNING/MainProcess] wait...
[2016-08-20 21:27:11,356: WARNING/MainProcess] 0
[2016-08-20 21:27:11,357: INFO/MainProcess] Starting new HTTP connection (1): 127.0.0.1
[2016-08-20 21:27:11,821: WARNING/MainProcess] wait...
[2016-08-20 21:27:11,821: WARNING/MainProcess] 1
[2016-08-20 21:27:11,989: WARNING/MainProcess] wait...
[2016-08-20 21:27:11,990: WARNING/MainProcess] 1
[2016-08-20 21:27:12,059: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,059: WARNING/MainProcess] 2
[2016-08-20 21:27:12,208: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,209: WARNING/MainProcess] 1
[2016-08-20 21:27:12,225: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,225: WARNING/MainProcess] 1
[2016-08-20 21:27:12,246: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,247: WARNING/MainProcess] 2
[2016-08-20 21:27:12,282: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,282: WARNING/MainProcess] 1
[2016-08-20 21:27:12,316: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,316: WARNING/MainProcess] 1
[2016-08-20 21:27:12,357: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,357: WARNING/MainProcess] 1
[2016-08-20 21:27:12,823: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,823: WARNING/MainProcess] 2
[2016-08-20 21:27:12,991: WARNING/MainProcess] wait...
[2016-08-20 21:27:12,992: WARNING/MainProcess] 2
[2016-08-20 21:27:13,061: WARNING/MainProcess] wait...
[2016-08-20 21:27:13,061: WARNING/MainProcess] 3
[2016-08-20 21:27:13,210: WARNING/MainProcess] wait...
[2016-08-20 21:27:13,211: WARNING/MainProcess] 2
[2016-08-20 21:27:13,227: WARNING/MainProcess] wait...
[2016-08-20 21:27:13,227: WARNING/MainProcess] 2
```
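The interleaving visible in the log — several `spider` calls advancing in turn inside one worker process — is the essence of cooperative scheduling. A toy round-robin scheduler over plain Python generators shows the same effect; this is purely illustrative and unrelated to Celery's actual gevent pool:

```python
from collections import deque

def task(name, steps):
    # Each yield is a cooperative "gevent.sleep": it hands control back.
    for i in range(steps):
        yield (name, i)

def run_round_robin(tasks):
    # Advance each task one step at a time, like greenlets taking turns.
    queue = deque(tasks)
    trace = []
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))
            queue.append(t)  # not finished: back to the end of the queue
        except StopIteration:
            pass             # finished: drop it
    return trace

trace = run_round_robin([task("a", 2), task("b", 2)])
print(trace)  # steps of "a" and "b" interleave, as in the Celery log
```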

Closing thoughts

With Celery, the crawler is easy to scale horizontally: just add consumer processes on more servers.

With gevent, requests became non-blocking within a single process, where I previously relied on multithreading to cope with blocking calls.
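The gain over serial execution can be shown with a stdlib timing sketch: several blocking calls overlap instead of adding up. Threads here play the role gevent's greenlets play in the article:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_call(sec):
    time.sleep(sec)   # stands in for a blocking requests.get
    return sec

start = time.monotonic()
with ThreadPoolExecutor(6) as pool:
    results = list(pool.map(slow_call, [0.3] * 6))
elapsed = time.monotonic() - start

# Six 0.3s calls finish in roughly 0.3s rather than 1.8s -- the same
# reason the article's six clients all returned within 7 seconds.
print(results, round(elapsed, 1))
```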

I have only been learning Celery and gevent for a day; now that this little toy works, it's time to dig into the documentation properly!

Author: spencer404

Link: https://www.jianshu.com/p/c1e53cc32d4d
