Program randomly freezes
Opened this issue · 6 comments
Hello... I'm using your library and I don't know why, but my program randomly freezes sometimes. My program is pretty simple and is pretty much just a copy of the code sample you provide at https://twittersearch.readthedocs.org/en/latest/index.html (in fact, your unmodified code sample was also freezing when I tried it).
Could it have to do with the version of python I'm using? (2.7.9)
I installed TwitterSearch through pip. I hope it's not some deadlock issue.
Here's what I've been running:
    from TwitterSearch import *
    from time import sleep

    try:
        tso = TwitterSearchOrder()  # create a TwitterSearchOrder object
        tso.set_keywords(['#vr', '-RT'])  # let's define all words we would like to have a look for
        tso.set_language('en')  # hell no German, I want English!
        tso.set_include_entities(False)  # and don't give us all those entity information

        # it's about time to create a TwitterSearch object with our secret tokens
        ts = TwitterSearch(
            consumer_key='xxxx',
            consumer_secret='xxxx',
            access_token='xxxx',
            access_token_secret='xxxx'
        )

        # open file for writing
        text_file = open("#vrtest.txt", "w")

        # check when to stop
        iterations = 0
        max_tweets = 100000

        # callback function used to check if we need to pause the program
        def my_callback_closure(current_ts_instance):  # accepts ONE argument: an instance of TwitterSearch
            queries, tweets_seen = current_ts_instance.get_statistics()
            if queries > 0 and (queries % 2) == 0:  # trigger delay every other query
                print("\nQueries: " + str(queries) + " - now sleeping 1 minute.\n")
                sleep(60)  # sleep for 60 seconds

        # this is where the fun actually starts :)
        for tweet in ts.search_tweets_iterable(tso, callback=my_callback_closure):
            current_line = "%s" % (tweet['text'])
            iterations = iterations + 1
            print("i: " + str(iterations) + " - " + tweet['user']['screen_name'] + " tweeted: " + current_line)
            text_file.write(current_line.encode('utf-8', 'ignore') + "\n")

            # wait 1 second every 10 tweets
            if iterations % 10 == 0:
                print("\nSleeping 1 second.\n")
                sleep(1)
            if iterations >= max_tweets:
                break
    except TwitterSearchException as e:  # take care of all those ugly errors if there are some
        print(e)
    finally:
        # close file
        text_file.close()
I edited your code to remove your credentials. Remember to renew them immediately, as they are likely to be stored in the Google cache!
Regarding your problem: I'll look into it this weekend. My first guess is that there is either some kind of infinite loop or a problem in the library itself when calling the callback function.
Is this behavior also present when you run it under Python3?
Also, when exactly does the program freeze? Like after how many executions of the callback function?
Thanks,
From the printouts that I can recall, it seemed to happen only after querying (but before the callback function ran). Also, it's after a random number of queries... I'm using the default of 100 tweets per query now, and sometimes it stops at 500, sometimes 300, sometimes 1000+.
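If it hangs after a request has been sent, one plausible culprit is a network read that blocks forever because no timeout is set. As a defensive workaround (an assumption on my part — I haven't checked which HTTP layer the library uses, but Python's standard socket machinery honors the global default timeout when no explicit one is passed), you could make a stuck read raise an exception instead of freezing the process:

```python
import socket

# Assumption: the library's HTTP layer falls back to the global default
# timeout when it doesn't set an explicit one. A stuck read then raises
# socket.timeout after 30 seconds instead of blocking forever.
socket.setdefaulttimeout(30)

# ... run the TwitterSearch loop here, wrapped in
# try/except socket.timeout so a hung request is reported ...
```

If the freeze turns into a `socket.timeout` exception, that would confirm the hang is in the network layer rather than in the callback logic.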
I'll try it on different versions of python.
Tried it on Python 2.7.10 and 3.5, and I'm still getting the same issue.
I'm running Windows 10 by the way.
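To see exactly where it is stuck the next time it freezes, you could arm a watchdog that periodically dumps every thread's stack trace. A minimal sketch using the standard-library `faulthandler` module (Python 3 only; the TwitterSearch calls are placeholders):

```python
import faulthandler
import sys

# Every 60 seconds, dump all thread stacks to stderr. When the program
# freezes, the last dump printed shows the exact line it is blocked on
# (e.g. a socket read inside the HTTP layer, or a sleep in the callback).
faulthandler.dump_traceback_later(60, repeat=True, file=sys.stderr)

# ... run the TwitterSearch loop here ...

faulthandler.cancel_dump_traceback_later()  # disarm the watchdog on a clean exit
```

Comparing the dumped traceback against the library source would tell us whether the hang is inside TwitterSearch or in the code calling it.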
I'm looking into the issue as soon as I have a bit of spare time :)
Looks like a bug in TwitterSearch.
Thanks again for bringing this up here!
Hi, your sample code is not working even on Python 3.5. I isolated it to the for statement:

    for tweet in ts.search_tweets_iterable(tso):
        print('@{} tweeted: {}'.format(tweet['user']['screen_name'], tweet['text']))
So I tried fooling around with your API and realized that neither search_tweets() nor search_tweets_iterable() works as expected. I too installed TwitterSearch from pip. Gentle reminder that this bug still exists.
I'll look into this once I have a bit of spare time. Thanks for bringing this up again!