fei_Niu/stock_data_crawler

save_news_to_csv.py 703 Bytes
root committed on 2019-03-27 06:20: get news of top 100 stocks

import pymongo
import csv


def reformat(datestr):
    # Move the last four characters of the date string to the front
    # (e.g. '03272019' -> '20190327').
    return datestr[-4:] + datestr[:4]


# Dump every news document from the local MongoDB 'us' database
# (one collection per stock) into a single CSV file.
csvfile = open('news_reuters.csv', 'w', encoding='utf-8', newline='')
writer = csv.writer(csvfile)
client = pymongo.MongoClient()
db = client.us

col_count = 0
news_count = 0
for col_name in sorted(db.collection_names()):  # deprecated in newer pymongo; list_collection_names() is the replacement
    if col_name == 'system.indexes':  # skip MongoDB's internal index collection
        continue
    col = db[col_name]
    col_count += 1
    print('processing', col_name)
    for k in col.find():
        data = [k['symbol'], k['name'], reformat(k['date']),
                k['title'], k['content'], k['news_type']]
        writer.writerow(data)
        news_count += 1

print(col_count, 'stocks, and', news_count, 'pieces of news')
csvfile.close()
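
Each row written above contains, in order: symbol, name, date, title, content, news_type, and no header row is emitted. A minimal sketch for reading the export back (the column list here is an assumption mirroring that write order, not something stored in the file):

import csv

# Assumed column order, mirroring the writerow() call above; the CSV itself has no header.
COLUMNS = ['symbol', 'name', 'date', 'title', 'content', 'news_type']

with open('news_reuters.csv', encoding='utf-8', newline='') as f:
    for row in csv.reader(f):
        record = dict(zip(COLUMNS, row))
        print(record['symbol'], record['date'], record['title'])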