Got 0 tweets #344
Comments
I have the same issue. "twitterscraper Trump --limit 1000 --output=tweets.json" gives an empty JSON file.
Same issue :(
Same issue, getting 0 tweets.
Yes, I'm having the same problem. Alongside "Got 0 tweets" I get the following error: raise ConnectionError(e, request=request)
I have the same issue, 0 tweets using query_tweets...
It doesn't work and returns 0 tweets again. Please help.
Too bad, neither query_tweets_from_user() nor query_tweets() has worked since yesterday. Please help.
0 tweets again. Please help.
The same error.
Easy: Twitter enabled JavaScript and shut down the backdoor this code used. It was surprising that, with the Russia collusion and all the other nonsense, they didn't start working to protect the data earlier. So, as I said in April, the software is probably dead.
This doesn't work with the Twitter API, not even close. They changed the HTML page to prevent non-consented web scraping.
So is this library just donzo? It's over? |
Temporarily. Yeah. |
Thought I'd use this for my project. Guess that's not gonna happen anytime soon :( |
Same here. Trying the selenium branch as an alternative.
Is there another good similar library that people are using? |
Same for me.
Same issue. Twitter is disgusting.
I also got a 0 tweets count. I posted a new tweet and still got zero. I spent a lot of time understanding the workflow and got 0!
Just use tweepy instead:

import tweepy as tw
# Setting up Twitter API connection:
# ...
# Finding some AI Tweets
search_words = "artificial+intelligence"
for tweet in tweets:
    ...
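To flesh out the fragment above: a minimal sketch of the tweepy route, assuming the tweepy v3 API (`OAuthHandler`, `API.search`, `Cursor`). The credential values are hypothetical placeholders; the network code only runs once you fill them in, which is also why the helper below is kept independent of tweepy.

```python
# Sketch of a tweepy-based alternative (tweepy v3 API assumed).
API_KEY = ""        # hypothetical placeholder: your consumer key
API_SECRET = ""     # hypothetical placeholder: your consumer secret
ACCESS_TOKEN = ""
ACCESS_SECRET = ""

def tweet_to_row(tweet):
    """Flatten a tweet-like object into a plain dict; works with any
    object exposing .id, .created_at and .text attributes."""
    return {"id": tweet.id, "created_at": tweet.created_at, "text": tweet.text}

if API_KEY:  # only touch the network when real credentials are provided
    import tweepy as tw

    # Setting up the Twitter API connection:
    auth = tw.OAuthHandler(API_KEY, API_SECRET)
    auth.set_access_token(ACCESS_TOKEN, ACCESS_SECRET)
    api = tw.API(auth, wait_on_rate_limit=True)

    # Finding some AI tweets:
    search_words = "artificial+intelligence"
    for tweet in tw.Cursor(api.search, q=search_words, lang="en").items(10):
        print(tweet_to_row(tweet))
```

Note the catch raised below: this requires a Twitter developer account and API keys, which many people here cannot get.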
As twitterscraper is (temporarily) dead, I thought I'd share my favorite alternative scraper. @mattwsutherland your solution requires an API key (which is super hard to get), so unfortunately it's useless to most people here. Simply go for TweetScraper - for me it's working perfectly. Also see the medium article providing a useful pandas snippet to set up a nicely formatted dataframe. Installation for Ubuntu is straightforward as described. For Windows, just do the following:

git clone https://github.com/jonbakerfish/TweetScraper.git
conda create -n tweetscraper python=3.7.7 -y
conda activate tweetscraper
conda install -y -c conda-forge scrapy ipython ipdb
pip install scrapy-selenium

Change to directory "TweetScraper" and test with:

scrapy crawl TweetScraper -a query="foo,#bar"
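On the pandas step mentioned above: a minimal sketch of loading scraped output into a nicely formatted DataFrame, assuming TweetScraper writes one JSON object per file into a data directory (the directory layout and the field names `ID`, `text`, `nbr_retweet` are assumptions for illustration, not confirmed by this thread). The demo uses fake data so it runs without scraping anything.

```python
import json
import pathlib
import tempfile

import pandas as pd

def load_tweets(data_dir):
    """Read every per-tweet JSON file in data_dir into a DataFrame
    (assumed TweetScraper on-disk layout: one JSON object per file)."""
    records = []
    for path in sorted(pathlib.Path(data_dir).iterdir()):
        records.append(json.loads(path.read_text(encoding="utf-8")))
    return pd.DataFrame(records)

# Demo with fake data standing in for real scraper output:
tmp = pathlib.Path(tempfile.mkdtemp())
for i, text in enumerate(["foo tweet", "#bar tweet"]):
    (tmp / f"{i}.json").write_text(
        json.dumps({"ID": str(i), "text": text, "nbr_retweet": 0}),
        encoding="utf-8",
    )

df = load_tweets(tmp)
print(df.shape)  # (2, 3)
```

Once the records are in a DataFrame, the usual pandas tooling (sorting by `nbr_retweet`, text filtering, CSV export) applies directly.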
Thanks I will definitely 👍 try that
The issue is still continuing. Has anyone found a solution or another workaround?
I think this scraper is dead and may not work until somebody makes changes.
@Guolin1996 Yes, unfortunately ...
Where do I set the Author API in this code?
Commenting to watch; same issue here.
I guess it's still dead. Just tried, got 0 tweets.
Still 0 tweets.
Sad. Still 0 tweets. Is this related to Elon Musk?