Result file stops updating after a thousand-plus records are written to the CSV #500
Comments
See #66.
This is an automatic vacation reply from QQ Mail; your email has been received.
Hello, author. Is it still the case that at most 50 pages of data can be crawled per one-hour time slot?
@liushuiji It should be, yes.
OK, thanks.
Hi, and thank you for this project! Last year, with the older version of the code, I could crawl all the data without any problem. These past few days, using the updated code, each run only writes one to two thousand records into the CSV before the result file stops updating, even though the program keeps running. What could be the reason?

I also have another question: this code seems to simulate Weibo's search interface, which uses one hour as its smallest time unit and returns up to 50 pages of results per slot. Is the updated code still like this, i.e. at most 50 pages can be crawled per one-hour slot? Thanks for your help!
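For background on that limit: Weibo's search only exposes up to 50 result pages for any single query, so a crawler has to narrow the time scope, down to one-hour slices, to collect more than 50 pages' worth of results overall. Below is a minimal sketch of that idea; `hourly_windows`, `build_search_urls`, and the URL pattern are illustrative assumptions, not the project's actual code.

```python
from datetime import datetime, timedelta

MAX_PAGES_PER_HOUR = 50  # assumed cap Weibo search returns for any single query


def hourly_windows(start, end):
    """Yield consecutive one-hour (begin, finish) windows covering [start, end)."""
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(hours=1), end)
        yield cur, nxt
        cur = nxt


def build_search_urls(keyword, start, end):
    """Yield one search URL per result page for each one-hour window, capped at 50 pages."""
    # Illustrative URL pattern only; the real crawler builds its requests differently.
    base = "https://s.weibo.com/weibo?q={kw}&timescope=custom:{b}:{e}&page={p}"
    for begin, finish in hourly_windows(start, end):
        b = begin.strftime("%Y-%m-%d-%H")
        e = finish.strftime("%Y-%m-%d-%H")
        for page in range(1, MAX_PAGES_PER_HOUR + 1):
            yield base.format(kw=keyword, b=b, e=e, p=page)


# Example: a two-hour range yields 2 windows x 50 pages = 100 candidate URLs.
urls = list(build_search_urls("keyword", datetime(2021, 1, 1, 0), datetime(2021, 1, 1, 2)))
print(len(urls))  # 100
```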