Stay strong, Wuhan! Stay strong, China!
Using the data provided by NetEase News, I scraped the data behind their epidemic map. The source page is
https://news.163.com/special/epidemic/spss=other&spssid=e259599be7482ba582c50d022764d829&spsw=1
I split the data above into fields and scraped it with the following code:
```python
import datetime

import pymysql
import requests

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 '
                  '(KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36'
}
url = 'https://c.m.163.com/ug/api/wuhan/app/index/feiyan-data-list?t=1580702781961'

# Fetch the JSON feed and pull out the per-region list
html = requests.get(url, headers=headers)
datas = html.json()['data']['list']
print(datas)

conn = pymysql.connect(host='localhost', user='root', password='123456',
                       database='feiyan', charset='utf8')
date = datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S')
print(date)

# '表名' below is a placeholder -- replace it with your actual table name.
sql = ("insert into 表名 (province, city, confirm, dead, suspect, heal, time) "
       "values (%s, %s, %s, %s, %s, %s, %s)")

with conn.cursor() as cursor:
    for data in datas:
        name = data['name']
        province = data['province']
        # Missing counts come back as None; store them as 0
        confirm = data['confirm'] or 0
        heal = data['heal'] or 0
        suspect = data['suspect'] or 0
        dead = data['dead'] or 0
        try:
            # Parameterized query avoids quoting issues and SQL injection
            cursor.execute(sql, (province, name, confirm, dead, suspect, heal, date))
            conn.commit()
        except Exception as e:
            print(e)
conn.close()
```
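The JSON parsing and None handling can be exercised offline against a payload shaped like the API response. The sample values below are made up purely for illustration; only the field names match the real feed:

```python
# A made-up payload mimicking the shape of the feiyan-data-list response
sample = {
    'data': {
        'list': [
            {'name': '武汉', 'province': '湖北',
             'confirm': 100, 'heal': 5, 'suspect': None, 'dead': 2},
            {'name': '黄冈', 'province': '湖北',
             'confirm': None, 'heal': None, 'suspect': None, 'dead': None},
        ]
    }
}

rows = []
for data in sample['data']['list']:
    # `x or 0` maps None counts to 0, matching the insert logic above
    rows.append((
        data['province'],
        data['name'],
        data['confirm'] or 0,
        data['dead'] or 0,
        data['suspect'] or 0,
        data['heal'] or 0,
    ))

print(rows)
# rows[1] is ('湖北', '黄冈', 0, 0, 0, 0)
```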
As for the database fields: province and city are varchar; confirmed, dead, suspected, and healed are all int; and the time column uses the datetime type to record when the data was scraped.
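A schema matching that description might look like the following sketch. The table and column names here are an assumption based on the insert statement above; adjust them to your own naming:

```sql
-- Hypothetical DDL matching the fields described above
CREATE TABLE feiyan_data (
    id       INT AUTO_INCREMENT PRIMARY KEY,
    province VARCHAR(50),
    city     VARCHAR(50),
    confirm  INT,
    dead     INT,
    suspect  INT,
    heal     INT,
    time     DATETIME
);
```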
However, this site only publishes a nationwide total for suspected cases, with no per-region breakdown, and I haven't found a regional breakdown of suspected cases on other sites either. If you come across one, please share it. Thanks!

May we pull through this together and defeat the epidemic soon!!!
This API was retired on 2020-02-27; for the latest version, please see https://blog.youkuaiyun.com/Hello_Bye/article/details/104553676