Flexible settings give you full control over both the audit process and the audit results. Settings can be accessed from any section of the tool.
Under this settings block, you can decide how often you want to audit your website. The following options are available:
1. Manually. If you select this option, you will have to click on the Restart audit button in order to start the audit. For convenience, this button can be found in any section of the tool.
2. Weekly. The audit will run on a weekly basis on the day of the week and time of your choice.
3. Monthly. The audit will run on a monthly basis on the day of the month and time of your choice.
The time can be set only in the GMT time zone.
However, you can always start the audit manually even if you have set up automatic checks.
Under this settings block, you can let the tool know which pages you want to audit.
Under this settings block, you can set rules for crawling site pages.
For example:
1. Take robots.txt directives into account. When this option is enabled, the tool will crawl the site according to the list of valid instructions in the robots.txt file. Otherwise, the tool will ignore robots.txt file instructions.
2. Ignore Noindex. When this option is enabled, the tool will crawl pages ignoring the Noindex directive (in the format of a meta tag or an HTTP response header).
3. Ignore Nofollow. By enabling this option, you instruct the Eden Metrics bot to follow the links on the page ignoring the Nofollow directive (in the format of a meta tag, an HTTP response header, or a link attribute).
4. Allow/disallow URLs. Disallow directives take precedence over allow directives. If you don't want certain pages or resources to appear in the audit at all, add them to the disallow list.
5. Ignore URL parameters. Here you can specify which UTM tags should be ignored in page URLs during the audit. You can either exclude every parameter or manually select the ones to ignore.
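Two of the crawl rules above can be sketched with Python's standard library. The robots.txt contents, bot name, URLs, and parameter list below are made-up examples for illustration, not the tool's actual behavior:

```python
# A sketch of two crawl rules, using only the Python standard library.
# All names and URLs below are placeholders.
from urllib.robotparser import RobotFileParser
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Rule 1: taking robots.txt directives into account. Before fetching a URL,
# a crawler asks the parsed robots.txt whether that URL may be requested.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("EdenMetricsBot", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("EdenMetricsBot", "https://example.com/blog/post"))       # True

# Rule 5: ignoring URL parameters. Dropping tracking tags means URLs that
# differ only by those parameters are treated as the same page.
def strip_params(url, ignored=("utm_source", "utm_medium", "utm_campaign")):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_params("https://example.com/page?utm_source=mail&ref=7"))
# https://example.com/page?ref=7
```

Note that in this sketch the disallow rule wins for anything under /admin/, which mirrors the precedence described above.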
The tool allows you to select the bot that will do the crawling, and it also lets you crawl pages that are closed to search robots.
1. Select a User Agent.
The Eden Metrics bot is selected by default. If the analyzed site can’t be reached by our bot, you can choose another option from the list. No matter which User Agent is selected, all pages of your site will be crawled (even if you select Googlebot-Image).
2. Authorization on restricted pages.
You can crawl closed-off website pages. To do this, give our bot access by specifying your login credentials in the appropriate fields.
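As a rough illustration of what these two settings amount to, the sketch below builds an HTTP request that identifies itself with a custom User-Agent and carries Basic Auth credentials for a restricted page. The bot name, URL, login, and password are placeholders, not the tool's actual values:

```python
# Sketch (assumed details): a request with a custom User-Agent and
# HTTP Basic Auth credentials. All values below are placeholders.
import base64
import urllib.request

url = "https://example.com/private/report"   # placeholder restricted page
request = urllib.request.Request(url)
request.add_header("User-Agent", "EdenMetricsBot/1.0")  # assumed bot identity

# Basic auth: the Authorization header carries base64("login:password").
credentials = base64.b64encode(b"login:password").decode()
request.add_header("Authorization", f"Basic {credentials}")

# urllib normalizes header names, so the key is looked up as "User-agent".
print(request.get_header("User-agent"))      # EdenMetricsBot/1.0
# urllib.request.urlopen(request) would now fetch the page with both headers set.
```

This is the same information the settings form collects: which identity the crawler announces, and which credentials it presents on restricted pages.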
This section allows you to configure crawling limits and restrictions. You can set separate limits and restrictions for each site under your account.
When analyzing website parameters, Search Intelligence relies on current search engine recommendations. Under the Report setup section, you can change the parameters that the tool takes into account when crawling sites and putting together reports.
What’s more, we have introduced a new feature that allows you to create guest links for the audit reports created on the Search Intelligence platform. Thanks to these shareable reports, presenting information to the specialists you work with has become much easier. This feature is available in both the stand-alone tool and project-based audits.