Create text, HTML, RSS and XML website sitemaps.
Generate text, HTML, RSS and XML sitemaps to help search engines such as Google and Yahoo crawl and index your website. The crawler is feature-rich and supports many site crawling options: configure the number of simultaneous connections to use, crawler filters, robots.txt, custom connection and read timeout values, removal of session IDs, scanning of JavaScript and CSS files, proxy setup, website login, and various other options. Alias paths during a scan, e.g. for sites that serve the same content from multiple domain names. Scan websites from multiple start paths, useful for sites that are not fully crosslinked.

Scan local and online sites on the internet, localhost, LAN, CD-ROM and disk. Scan static and dynamic websites such as portals, online stores, blogs and forums. View reports on broken and redirected links (where to and where from).

Rich template support for HTML sitemaps. Generate sitemap files for ASP.NET controls. Supports splitting and compressing XML sitemaps. Can set and calculate priority, change frequency and last modified dates in XML sitemaps. Change the root path used in generated text, HTML, RSS and XML sitemaps, useful if you have scanned a mirror or localhost copy of a website.

Integrated FTP sitemap upload. Can also ping search engines to notify them of sitemap changes. Command line support includes loading a project, scanning a site, building sitemaps, FTP publishing and pinging search engines.
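For reference, the priority, change frequency and last modified values mentioned above correspond to the <priority>, <changefreq> and <lastmod> fields of the standard sitemaps.org XML protocol. Below is a minimal Python sketch of such a file; the URL and values are hypothetical placeholders, not output from the tool itself:

```python
# Minimal sketch: write a sitemaps.org XML sitemap containing the
# <lastmod>, <changefreq> and <priority> fields described above.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(entries, path):
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod        # W3C date, e.g. 2024-01-31
        ET.SubElement(url, "changefreq").text = changefreq  # always/hourly/daily/weekly/...
        ET.SubElement(url, "priority").text = priority      # 0.0 to 1.0, default 0.5
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Hypothetical single-page example.
write_sitemap(
    [("https://example.com/", "2024-01-31", "weekly", "1.0")],
    "sitemap.xml",
)
```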
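Splitting and compressing follow the same protocol: the standard caps a sitemap file at 50,000 URLs, so larger sites are split into parts that are listed in a sitemap index, and the parts may be gzip-compressed. A sketch under those assumptions (file and domain names are hypothetical):

```python
# Sketch: split a URL list into sitemap parts, gzip each part, and
# write a sitemap index that references the compressed files.
import gzip
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # protocol limit per sitemap file

def write_split_sitemaps(urls, base="https://example.com/"):
    part_names = []
    for i in range(0, len(urls), MAX_URLS):
        name = f"sitemap-{i // MAX_URLS + 1}.xml.gz"
        urlset = ET.Element("urlset", xmlns=NS)
        for loc in urls[i:i + MAX_URLS]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
        with gzip.open(name, "wb") as f:  # gzip-compressed part
            ET.ElementTree(urlset).write(f, encoding="utf-8", xml_declaration=True)
        part_names.append(name)

    # Sitemap index pointing at the compressed parts.
    index = ET.Element("sitemapindex", xmlns=NS)
    for name in part_names:
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = base + name
    ET.ElementTree(index).write("sitemap-index.xml",
                                encoding="utf-8", xml_declaration=True)
```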
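Pinging a search engine is a plain HTTP GET that announces the sitemap URL. A sketch using Google's historical ping endpoint (since retired in 2023; the sitemap URL is a placeholder):

```python
# Sketch: notify a search engine that the sitemap has changed.
from urllib.parse import quote
from urllib.request import urlopen

sitemap_url = "https://example.com/sitemap-index.xml"
ping = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
with urlopen(ping) as resp:
    print(resp.status)  # 200 historically meant the ping was accepted
```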