getting list of all datasets
m-fayer opened this issue · 1 comment
m-fayer commented
When I tried to use the following code to get the list of all datasets:
```python
import csv
import quandl

quandl.ApiConfig.api_key = 'your_key'

ds = quandl.Dataset.all()
npage = ds.meta['total_pages']

csv_file = open('Datasets.csv', 'w', newline='')
csv_writer = csv.writer(csv_file)
for i in range(1, npage + 1):
    t = quandl.Dataset.all(params={'page': i})
    for e in t:
        line = e.to_list()
        csv_writer.writerow(line)
csv_file.close()
```
I got the following error:
```
Traceback (most recent call last):
  File "getlists.py", line 9, in <module>
    t=quandl.Dataset.all(params={ 'page': i })
  File "C:\Program Files\Python37\lib\site-packages\quandl\operations\list.py", line 15, in all
    r = Connection.request('get', path, **options)
  File "C:\Program Files\Python37\lib\site-packages\quandl\connection.py", line 38, in request
    return cls.execute_request(http_verb, abs_url, **options)
  File "C:\Program Files\Python37\lib\site-packages\quandl\connection.py", line 50, in execute_request
    cls.handle_api_error(response)
  File "C:\Program Files\Python37\lib\site-packages\quandl\connection.py", line 114, in handle_api_error
    raise klass(message, resp.status_code, resp.text, resp.headers, code)
quandl.errors.quandl_error.LimitExceededError: (Status 416) (Quandl Error QELx08) This API call returns a maximum of 2000 results. To retrieve metadata for all time-series in this data feed, please use the metadata route. For more information, see https://help.quandl.com/article/92-how-do-i-download-the-quandl-codes-of-all-the-timeseries-in-a-given-data-feed
```
Both the link in the error message and the online Quandl documentation on bulk downloads appear to cover only specific databases/datasets, and neither seems to work in my case.
Is there any way to download such a big list (58781 pages)? Thanks in advance.
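For what it's worth, the metadata route that the error message points at can be fetched directly over HTTP instead of paging through `Dataset.all()`. A minimal sketch, assuming the per-database codes endpoint described in the linked help article (`/api/v3/databases/<code>/codes`, returning a zipped CSV of every dataset code in the feed) and a hypothetical feed code and API key:

```python
import io
import urllib.request
import zipfile

def download_database_codes(database_code, api_key, out_path='codes.csv'):
    """Fetch the full dataset-code list of one data feed in a single request.

    Assumption: the codes route from the help article linked in the error,
    which returns a zip archive containing one CSV, avoiding the paged
    listing and its 2000-result cap.
    """
    url = ('https://www.quandl.com/api/v3/databases/'
           f'{database_code}/codes?api_key={api_key}')
    with urllib.request.urlopen(url) as resp:
        payload = resp.read()
    # The response body is a zip archive with a single CSV member.
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        member = zf.namelist()[0]
        with zf.open(member) as src, open(out_path, 'wb') as dst:
            dst.write(src.read())
    return out_path

# Usage (hypothetical feed code and key):
# download_database_codes('WIKI', 'your_key')
```

Note this lists codes per feed, so getting every dataset across all 58781 pages would still mean looping over database codes (from `quandl.Database.all()`), one request each.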
chaudharyachint08 commented
I am facing the same error when trying to download data for stocks on the "National Stock Exchange of India", which has 2260 listings. Why does the API not support downloading from the last page, limiting requests to page 20 when each page has 100 entries?
I have no interest in the first few pages and only want to look at the last page; why is that not allowed?
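Given the 2000-result cap on the paged listing, one workaround is to pull the feed's full code list once via the codes route and slice the "last page" locally. A sketch, assuming pages of 100 entries and a hypothetical database code ('NSE') and API key:

```python
import csv
import io
import urllib.request
import zipfile

def last_page(rows, page_size=100):
    """Return the final `page_size` entries of a locally held code list."""
    return rows[-page_size:]

def fetch_all_codes(database_code, api_key):
    """Download one feed's full code list as CSV rows.

    Assumption: the per-database codes route from the help article linked
    in the error message, which returns a zipped CSV in one request.
    """
    url = ('https://www.quandl.com/api/v3/databases/'
           f'{database_code}/codes?api_key={api_key}')
    with urllib.request.urlopen(url) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    with archive.open(archive.namelist()[0]) as f:
        return list(csv.reader(io.TextIOWrapper(f, encoding='utf-8')))

# Usage (hypothetical key):
# tail = last_page(fetch_all_codes('NSE', 'your_key'))
```

Since the whole list arrives in one download, any slice of it, last page included, is then just local indexing.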