This is a Python implementation for crawling Weibo data (e.g., text, images, live photos, and videos) of one Sina Weibo user from the Weibo Mobile Client. It simulates user login with a session (username and password).

Many thanks to the Python Chinese Community for providing the source code `SourceCode_weibocrawler.py`.
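The login simulation is essentially a `requests` session that posts the login form data with suitable headers and then reuses the resulting cookies for all later requests. Below is a minimal sketch of that idea; the endpoint, header values, and form field names are illustrative assumptions, not the repository's exact code.

```python
import requests

# Assumed request headers (use a real mobile User-Agent; see the code comments).
S_HEADER = {
    "User-Agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X)",
}

# Assumed login form data (field names are placeholders for illustration).
S_DATA = {
    "username": "your_account",
    "password": "your_password",
}

session = requests.Session()
session.headers.update(S_HEADER)

# Hypothetical login endpoint; the crawler's actual request may differ.
response = session.post("https://passport.weibo.cn/sso/login", data=S_DATA)
response.raise_for_status()

# The session now carries the login cookies and is reused for every crawling request.
```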
The crawler supports:

- Crawling short text in original and retweeted Weibo posts.
- Crawling large (preferred) or small JPG/GIF images in original and retweeted Weibo posts.
- [New!] Crawling live photos (as JPG images, MOV videos, and/or GIF images) in original and retweeted Weibo posts.
- Crawling HD (preferred) or SD videos in original and retweeted Weibo posts.
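As a rough illustration of how these functions work, the sketch below queries the Weibo Mobile Client API for one page of a user's posts and collects the text plus the large image URLs (falling back to the small ones). The endpoint and JSON field names follow the public `m.weibo.cn` interface and are assumptions here; the actual parsing in `run_WeiboCrawler.py` may differ.

```python
import requests

session = requests.Session()  # in practice, the logged-in session from above

# Hypothetical user URL (the containerid identifies the target user's post list).
USER_URL = "https://m.weibo.cn/api/container/getIndex?containerid=107603XXXXXXXXXX"

resp = session.get(USER_URL, params={"page": 1}, timeout=30)
cards = resp.json().get("data", {}).get("cards", [])

for card in cards:
    mblog = card.get("mblog")
    if not mblog:
        continue  # skip cards that are not Weibo posts
    print(mblog.get("text", ""))                 # short text (an HTML fragment)
    for pic in mblog.get("pics", []):
        large = pic.get("large", {}).get("url")  # large image preferred
        print(large or pic.get("url"))           # otherwise fall back to the small image
```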
Dependencies:

- requests 2.21.0
- lxml 4.2.5
- cv2 4.1.0
- imageio 2.4.1
- PIL 5.3.0
To configure the crawler:

- Set `S_DATA` and `S_HEADER` of the session for simulating user login (see comments for details).
- Set `USER_URL` of the target Sina Weibo user (see comments for details).
- Set the number of pages (`PAGE_AMOUNT`) for crawling (see comments for details).
- Set the path (`PATH_FOLDER`) and the TXT file (`PATH_FILE_TXT`) for saving Weibo data.
- Set the types of Weibo data to crawl (`IF_IMAGE`, `IF_PHOTO`, and `IF_VIDEO` as 1).
- Set `IF_LIVE2GIF = True` if live photos (videos) need to be converted to GIF images.
- Set `TIME_DELAY` of the crawler to avoid `ConnectionError 104: ('Connection aborted.')`.
- If `ConnectionError 104: ('Connection aborted.')` occurs:
  - Set `IF_RECONNECT = True` to run the crawler in reconnection mode.
  - Set `TAG_STARTCARD` to the serial number of the starting Weibo post (according to the log information).
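For reference, a complete configuration might look like the sketch below; all values are placeholders that you would replace with your own account, target user, and paths.

```python
# Example configuration (placeholder values only).
S_HEADER = {"User-Agent": "...", "Cookie": "..."}   # session headers for login simulation
S_DATA = {"username": "...", "password": "..."}     # session login data

USER_URL = "https://m.weibo.cn/u/1234567890"        # hypothetical target user
PAGE_AMOUNT = 20                                    # number of pages to crawl

PATH_FOLDER = "Demo_WeiboData/"                               # folder for saving Weibo data
PATH_FILE_TXT = "Demo_WeiboData/Demo_WeiboPost_Records.txt"   # TXT file for post text

IF_IMAGE = 1            # 1 = crawl images
IF_PHOTO = 1            # 1 = crawl live photos
IF_VIDEO = 1            # 1 = crawl videos
IF_LIVE2GIF = True      # convert live photos (videos) to GIF images

TIME_DELAY = 5          # seconds between requests, to avoid ConnectionError 104

IF_RECONNECT = False    # set True to run in reconnection mode after ConnectionError 104
TAG_STARTCARD = 1       # serial number of the starting Weibo post (from the log)
```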
To run the crawler:

- Run `run_WeiboCrawler.py` to crawl Weibo data of the target Sina Weibo user.
- See `Log_run_WeiboCrawler.txt` for log information from running the code.
- The Weibo data will be saved in the pre-specified folder (e.g., `Demo_WeiboData/`).
- The text of Weibo posts will be saved in the TXT file (e.g., `Demo_WeiboData/Demo_WeiboPost_Records.txt`).
- The images, live photos, and videos will be saved in sub-folders (e.g., `1/`, `1_livephoto/`, and `1_video/`).
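After a run finishes, you can inspect the output with a few lines of Python; the sketch below assumes the example folder and file names given above.

```python
from pathlib import Path

PATH_FOLDER = Path("Demo_WeiboData")  # the pre-specified output folder

# Read the crawled post text from the TXT file.
records = (PATH_FOLDER / "Demo_WeiboPost_Records.txt").read_text(encoding="utf-8")
print(records[:500])

# List the per-post media sub-folders (e.g., 1/, 1_livephoto/, 1_video/) and their sizes.
for sub in sorted(p for p in PATH_FOLDER.iterdir() if p.is_dir()):
    n_files = sum(1 for _ in sub.iterdir())
    print(f"{sub.name}: {n_files} file(s)")
```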
Please open an issue if you have any questions about this repository; I will respond ASAP.

Please star this repository if you find its content useful. Thank you very much. ^_^