Creates an XML sitemap by crawling a given site.
Automatically splits into multiple sitemaps for sites with more than 50,000 sitemap entries. This required breaking changes. The syntax changed as follows:
sitemap-generator http://example.com sitemap.xml
Note the missing > redirection operator. A second argument for the file path is now required. The sitemap is no longer printed to the console; it is always written to disk. With more than 50,000 entries, a sitemap index file sitemap.xml and x sitemap_partX.xml files are generated.
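The split described above follows the sitemap protocol's limit of 50,000 URLs per file: entries are chunked into numbered part files and a sitemapindex references each part. The following Python sketch is purely illustrative of that output shape; the helper name and base URL are assumptions, not the tool's actual implementation.

```python
SITEMAP_LIMIT = 50000  # max URLs per sitemap file under the sitemap protocol

def split_into_sitemaps(urls, base="http://example.com/"):
    """Illustrative chunking: one sitemap_partX.xml per 50,000 URLs,
    plus a sitemapindex document that references each part."""
    parts = [urls[i:i + SITEMAP_LIMIT] for i in range(0, len(urls), SITEMAP_LIMIT)]
    part_names = [f"sitemap_part{n + 1}.xml" for n in range(len(parts))]
    index_xml = "\n".join(
        ['<?xml version="1.0" encoding="UTF-8"?>',
         '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        + [f"  <sitemap><loc>{base}{name}</loc></sitemap>" for name in part_names]
        + ["</sitemapindex>"])
    return part_names, index_xml

# 120,000 entries yield three part files referenced from one index.
names, index_xml = split_into_sitemaps([f"page{i}" for i in range(120000)])
```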
Fixed crawling bug in the underlying sitemap-generator lib.
Added support for robots meta noindex.
The API changed quite a bit. The generator no longer saves a file; instead it prints the XML markup directly to the console. This avoids numerous options for file path, file name, and so on: you can now handle that yourself. Since the old behavior is simple to reproduce and this approach is more flexible, that's the way to go. It also means there are no status messages during generation; if you need those checks, enable dry mode and look for problems. Use --silent to omit crawler notifications.
NPM version bump.
Fixed robots.txt requests.
Raised required version.