philbot9/youtube-comment-scraper

URL posting to download the comments

Closed this issue · 8 comments

Hello,

I want to download the comments from multiple videos: pass the URL for each video and then download the respective CSV file.

Is it possible to do?

Hi there,

Sorry, I'm not sure I understand what you're asking. Can you elaborate on what you mean by "pass the URL for each video and then download the respective CSV file"? How is that different from how things work currently?

Thanks

I see. No, unfortunately, there is not. They have to be done one at a time.

Is there any automated way I can pass the URLs for the 200 videos and get the comments in CSV format for each one?

This would be easy to do with any shell scripting language you have access to (e.g. Bash, a Windows batch file, etc.), if you're willing to go that route.

I looked into it. I think I misunderstood this project. I thought it provided an easy command-line interface that would allow for scripting, but it does not.

So developing this feature would be pretty annoying. I can think of ways, but I won't explore this problem myself. I think it would be easier to handle every URL by hand than to solve it programmatically.

Sorry

This project is a web client for scraping comments from a YouTube video.
It's hosted here: http://ytcomments.klostermann.ca/

There is, however, a command line version based on the same idea available here: https://github.com/philbot9/youtube-comment-scraper-cli
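With that CLI installed, the batch job discussed above could be handled by a small wrapper script. A sketch, assuming a `urls.txt` file with one video URL per line, and assuming the CLI accepts `--format` and `--outputFile` flags (check its `--help` for the exact options on your version):

```shell
#!/usr/bin/env bash
# Batch-scrape comments for many videos with youtube-comment-scraper-cli
# (npm install -g youtube-comment-scraper-cli).

# Extract the 11-character video ID from a standard watch URL.
video_id() {
  echo "$1" | sed -n 's/.*[?&]v=\([A-Za-z0-9_-]\{11\}\).*/\1/p'
}

# urls.txt: one YouTube URL per line. Each video's comments are
# written to <video-id>.csv.
if [ -f urls.txt ]; then
  while read -r url; do
    id=$(video_id "$url")
    [ -z "$id" ] && continue
    # Assumed invocation; adjust flags to match the CLI's --help output.
    youtube-comment-scraper --format csv --outputFile "$id.csv" "$id"
  done < urls.txt
fi
```

For 200 videos this runs sequentially, which is probably kinder to YouTube's servers than firing all requests at once.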

@Gautamshahi This might be of interest to you. No idea why I didn't think of this earlier. Sorry about that.