salimk/Rcrawler

ContentScraper: store intermediate results

KnutJaegersberg opened this issue · 0 comments

Hi,
Sometimes when using the ContentScraper function on a long vector of URLs, it fails partway through and everything collected so far is lost. I work around this by wrapping the function with safely from purrr and calling it via lapply, but it would be nice if the function itself stored intermediate results to disk or in memory and were fail-safe, because traversing a predefined path often yields better results.
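For reference, here is a minimal sketch of the workaround I mean. The URL vector, CSS patterns, and checkpoint filename are placeholders, and I'm assuming the usual `ContentScraper(Url = ..., CssPatterns = ...)` call; the point is just that `purrr::safely` keeps one bad URL from killing the run, and `saveRDS` after each iteration keeps partial results on disk:

```r
library(purrr)
library(Rcrawler)

# Hypothetical list of pages; the CSS patterns are placeholders too.
urls <- c("http://example.com/page1", "http://example.com/page2")

# safely() wraps the scraper so a failing URL yields an error object
# in the result list instead of aborting the whole lapply/loop.
safe_scrape <- safely(function(u) {
  ContentScraper(Url = u, CssPatterns = c("h1", "p"))
})

results <- vector("list", length(urls))
for (i in seq_along(urls)) {
  results[[i]] <- safe_scrape(urls[[i]])
  # Checkpoint intermediate results to disk after every URL,
  # so a crash mid-run loses nothing already scraped.
  saveRDS(results, "scrape_checkpoint.rds")
}

# Afterwards, split successes from failures for a retry pass.
ok     <- results[!map_lgl(results, ~ is.null(.x$result))]
failed <- urls[map_lgl(results, ~ !is.null(.x$error))]
```

Something like this checkpoint-and-continue behaviour built into ContentScraper itself (e.g. an optional path to dump partial results) would remove the need for the wrapper entirely.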
sometimes using the contentscraper function on a longer vector of urls, it may fail in between and then everything is lost. working around it with wrapping the function with safely from purrr and then an lapply. but would be nice if the function would store inbetween results to disk or in memory and making it fail save, because often traversing a predefined path yields better results.