TypeError: Cannot read property 'itemSectionRenderer' of undefined
Describe the bug
Running v2.0.5, but getting this error:
Failed to extract video data. Please report this issue on GitHub so it can be fixed.
To Reproduce
Steps to reproduce the behavior:
yarn add scrape-youtube
Include it and call using promises:
// inside an async Express route handler, where `youtube` is the imported scrape-youtube instance
let output = await youtube.search(req.query.q, {
    page: req.query.page || 1
});
yarn start
Results: Failed to extract video data. Please report this issue on GitHub so it can be fixed.
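For context, the full route looks roughly like this (simplified; the import style and route path here are illustrative, not my exact code):

const express = require('express');
// Import style may differ depending on the scrape-youtube version
const { youtube } = require('scrape-youtube');

const app = express();

// GET /search?q=Trump&page=1
app.get('/search', async (req, res) => {
    let output = await youtube.search(req.query.q, {
        page: req.query.page || 1
    });
    res.json(output);
});

app.listen(3000);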
For the sake of testing, what are the values of req.query.q and req.query.page?
req.query.q = "Trump"
req.query.page = null, which becomes 1 after the || fallback
The page parameter is not used at the moment (a mistake on my part; I'll add it in a few minutes), but the searches seem to be working fine for both my bot and my local server.
You can set youtube.debug = true to verify the query string is being parsed correctly. Setting debug to true will also print the error; please send that here too.
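For example (the exact import may differ depending on how you load the package):

const { youtube } = require('scrape-youtube');

youtube.debug = true; // prints the parsed query string and any errors
youtube.search('Trump').then(console.log).catch(console.error);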
Thanks. Debugging printed this: TypeError: Cannot read property 'itemSectionRenderer' of undefined
https://www.youtube.com/results?search_query=trump&sp=EgIQAQ%253D%253D
(node:22407) [DEP0066] DeprecationWarning: OutgoingMessage.prototype._headers is deprecated
TypeError: Cannot read property 'itemSectionRenderer' of undefined
at /.../scrape-youtube/lib/index.js:66:39
at new Promise (<anonymous>)
at Youtube.extractRenderData (/home/.../Github/.../node_modules/scrape-youtube/lib/index.js:53:16)
at Youtube.<anonymous> (/home/.../Github/.../node_modules/scrape-youtube/lib/index.js:158:51)
at step (/home/.../Github/.../node_modules/scrape-youtube/lib/index.js:33:23)
at Object.next (/home/.../Github/.../node_modules/scrape-youtube/lib/index.js:14:53)
at fulfilled (/home/.../Github/.../node_modules/scrape-youtube/lib/index.js:5:58)
at processTicksAndRejections (internal/process/task_queues.js:97:5)
Failed to extract video data. Please report this issue on GitHub so it can be fixed.
BTW, it would be great to be able to pass the "sp" property all the way through (my use case is searching uploads within the last X days).
I'll add an option to override the sp parameter. It's worth mentioning that YouTube has a bizarre and slightly confusing system for filters, but I'll definitely look into that.
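Hypothetically, once that lands it could look something like this (the option name is not final):

// Hypothetical usage: pass a raw sp filter string through to the search URL.
// EgIQAQ%253D%253D is the encoded "Videos only" filter seen in the URL above.
youtube.search('Trump', { sp: 'EgIQAQ%253D%253D' });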
Thanks for the error message. I'll look into this and get back to you.
Would you mind if I provide you with an unofficial build that extracts some more data for debugging? It will just dump the page and JSON to two files that you can send to me for further examination.
Here you go: https://github.com/DrKain/scrape-youtube/tree/issue-22
If you set youtube.dump = true, it will create debug.json and debug.html when it encounters an error. Please upload the contents to Dropbox or whatever hosting site you prefer so I can take a look.
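For example (this flag only exists on the issue-22 branch):

const { youtube } = require('scrape-youtube'); // import style may differ

youtube.dump = true;
// When a search fails, debug.json and debug.html should be written
// (presumably to the current working directory)
youtube.search('Trump').catch(console.error);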
Thanks. I'm getting a bunch of TypeScript errors when adding the repo directly to package.json. Any suggestions, or would it be possible to compile and publish to NPM?
My mistake, I accidentally added tsconfig to the dependencies on that one. If you remove it, it should be fine. The lib directory is the one with the compiled code, so you could just replace it with what you already have.
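For reference, installing straight from the branch should work with something like:

yarn add https://github.com/DrKain/scrape-youtube#issue-22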
I was able to get it running without removing tsconfig or using the lib folder. Interestingly, this one outputs:
(node:25179) [DEP0066] DeprecationWarning: OutgoingMessage.prototype._headers is deprecated
Error: socket hang up
at connResetException (internal/errors.js:609:14)
at TLSSocket.socketOnEnd (_http_client.js:459:23)
at TLSSocket.emit (events.js:326:22)
at TLSSocket.EventEmitter.emit (domain.js:483:12)
at endReadableNT (_stream_readable.js:1223:12)
at processTicksAndRejections (internal/process/task_queues.js:84:21) {
code: 'ECONNRESET'
}
No debug.json file was created, and dump.html just says: null
Oh, here we go: using /lib I do get the outputs... uploading files now.
Files: https://we.tl/t-3q4SNt5bK6
I've published scrape-youtube@2.0.6; the issue should now be resolved. Please update and let me know if it's working.
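For example: yarn upgrade scrape-youtube@2.0.6 (or npm install scrape-youtube@2.0.6).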
Yep, works great. Thanks!
No problem. Thanks for the help reporting and resolving the issue.