steverobbins/magescan

Extra space between "Sitemap:" and (url) in robots.txt causes a crash

Closed this issue · 2 comments

                                                                               
  [GuzzleHttp\Exception\RequestException]
  Error creating resource: [message] fopen(%20http://magentosite/sitemap.xml_): failed to open stream: No such file or directory
  [file] phar:///root/magescan.phar/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php
  [line] 312

  [RuntimeException]
  Error creating resource: [message] fopen(%20http://magentosite/sitemap.xml_): failed to open stream: No such file or directory
  [file] phar:///root/magescan.phar/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php
  [line] 312
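For context, here is a minimal sketch (not magescan's actual code; the URL and client setup are assumptions for illustration) of how a sitemap URL parsed with a leading space trips Guzzle's StreamHandler, producing the trace above:

<?php
// Hypothetical reproduction of the crash above.
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Handler\StreamHandler;
use GuzzleHttp\HandlerStack;

// The leading space comes from the double space after "Sitemap:" in robots.txt.
$sitemapUrl = ' http://magentosite/sitemap.xml';

// Force the stream handler (the one that calls fopen() and appears in the trace).
$client = new Client(['handler' => HandlerStack::create(new StreamHandler())]);

// The leading space is percent-encoded to %20, fopen() cannot open a
// "%20http://..." target, and Guzzle throws the RequestException shown above.
$client->get($sitemapUrl);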

@7anner what does the robots.txt look like?

User-agent: *
Host: magentosite
Sitemap:  http://magentosite/sitemap.xml
...

Note the double space after Sitemap: the extra space is carried into the parsed URL and shows up as the leading %20 in the fopen error above.
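A defensive fix on the scanner side would be to trim the parsed value before requesting it. A minimal sketch follows; the function name and parsing approach are illustrative, not magescan's actual implementation:

<?php
// Illustrative robots.txt parsing that tolerates extra whitespace around the URL.
function extractSitemapUrls(string $robotsTxt): array
{
    $urls = [];
    foreach (preg_split('/\R/', $robotsTxt) as $line) {
        // Match "Sitemap:" case-insensitively and capture everything after the colon.
        if (preg_match('/^\s*Sitemap\s*:\s*(.+)$/i', $line, $matches)) {
            // trim() drops stray leading/trailing spaces that would otherwise
            // be percent-encoded into the request URL.
            $urls[] = trim($matches[1]);
        }
    }
    return $urls;
}

// Example with the robots.txt shown above (double space after "Sitemap:"):
$robots = "User-agent: *\nHost: magentosite\nSitemap:  http://magentosite/sitemap.xml\n";
print_r(extractSitemapUrls($robots));
// Prints: Array ( [0] => http://magentosite/sitemap.xml )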