SEO Radar FAQs
Every alert has a default priority. Critical alerts are items you will almost always want to look at, such as a change to a meta-robots tag or a page title. High alerts are still important, and Medium can be considered informational. Low priority alerts are suppressed entirely. However, you may not agree with our defaults.
No two sites are the same, so you can configure priorities on the settings screen. Priority settings can be changed globally or per domain.
The default notification behavior is as follows:
- Critical – text message and email
- High – email
- Medium – UI only
- Low – suppressed
This behavior can also be changed on the settings screen. You can also set up filters for individual alerts on specific URLs from the alert details window.
Internal linking structure is critical for SEO, both for content indexing and page authority. Removal of on-site cross-links can sometimes lead to critical problems and have a dramatic impact on SEO.
For link removals, we have distinct logic for persistent links. These are links that have been present on a page the last four times we have crawled it. If a link has been there a while, it is most likely important to your site's indexing and ranking.
We also independently monitor footer and navigation links. These are generally static and important for indexing, so pay special attention when navigation or footer links are removed.
By default, recently discovered link removals are low priority. If your site is quite static with regard to links, you may want to raise the priority.
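To make the persistent-link logic above concrete, here is a minimal sketch of how such a check could work. It is an illustration only, not SEO Radar's implementation; the function and variable names are hypothetical.

```python
# Illustrative sketch only, not SEO Radar's implementation.
# A link is treated as "persistent" if it appeared in each of the last four crawls.

PERSISTENCE_WINDOW = 4  # matches the "last 4 times we have seen the page" rule

def removed_persistent_links(previous_crawls: list[set[str]], current_links: set[str]) -> set[str]:
    """Return links that were present in every recent crawl but are now missing."""
    recent = previous_crawls[-PERSISTENCE_WINDOW:]
    if len(recent) < PERSISTENCE_WINDOW:
        return set()  # not enough history yet to call anything persistent
    persistent = set.intersection(*recent)
    return persistent - current_links

# Example: "/pricing" survived four crawls, then disappears in the fifth.
history = [{"/about", "/pricing"}] * PERSISTENCE_WINDOW
print(removed_persistent_links(history, {"/about"}))  # {'/pricing'}
```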
We keep track of total page size and generate an alert if the change exceeds the default threshold of 10%. A large change in page size could indicate a problem, such as the page not loading completely or content being dropped from the page. You can configure the threshold on the settings page.
We generate an alert if there is a significant change in the number of links on a page. You may want to review the default priority settings and threshold on the settings screen. By default, only cross-link removals are ‘high’ priority and the threshold is set at 10%.
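Both of the checks above reduce to the same relative-change test. Here is a minimal sketch of that logic using the documented 10% default; it is illustrative only, and the function name is made up for this example.

```python
DEFAULT_THRESHOLD = 0.10  # the documented 10% default

def exceeds_threshold(previous: float, current: float, threshold: float = DEFAULT_THRESHOLD) -> bool:
    """True when a value (page size in bytes, link count, ...) changed by more than the threshold."""
    if previous == 0:
        return current != 0  # any change from zero counts as significant
    return abs(current - previous) / previous > threshold

print(exceeds_threshold(200_000, 150_000))  # True: the page shrank by 25%
print(exceeds_threshold(120, 118))          # False: link count changed by under 2%
```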
You can define focus keywords for your pages on the manage URL screen. These are the most important keywords for a page. We will monitor titles, H1s and H2s and generate an alert if focus keywords are deleted.
We generate an alert for any change to the redirect chain, for example if a 301 changes to a 302 or an intermediate redirect is added. We do these tests for all URLs.
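If you want to see the kind of change this alert describes, a redirect chain can be inspected with a few lines of Python. This is only an illustration of the concept (with a hypothetical URL), not how our checks are implemented.

```python
import requests

# Follow the chain for a URL and print every hop.
response = requests.get("https://example.com/old-page", allow_redirects=True)

for hop in response.history:                # intermediate responses, in order
    print(hop.status_code, hop.url)         # e.g. 301 https://example.com/old-page
print(response.status_code, response.url)   # the final destination

# Comparing this chain between crawls is what reveals a 301 turning into a 302
# or a new intermediate redirect appearing.
```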
Yes, we do these tests with every audit.
We do. These are lightweight tests we can run on a large group of URLs. If you have changed URL structure or migrated to a new site or domain, you can configure the old URLs and we will monitor them to ensure that the redirects stay in place. These URLs do not count against your normal URL limits.
Just click on a URL in domain overview or crawl summary and you will get a complete history of the URL, source code, screenshots and alerts.
Certain sites block Amazon (AWS) servers. Contact us and we will set up your account to use IP proxies. That should solve the issue.
The admin can schedule crawls from the edit domain screen.
Yes, click the crawl icon next to the domain on the dashboard. You also have a button on the domain overview screen to do this.
With our Enterprise edition, you can test your staging site against your live site. This way you catch problems before they get pushed. Contact us if you are interested in an upgrade.
This depends on service level. Bronze is 30 days, Silver is 90 days and Gold is six months.
No problem. You need our Chrome extension. As long as you are behind the firewall, you can kick off the staging crawl from Chrome. That will send the source from staging to our servers and we will run our normal diff process. The results will be available on your dashboard.
This varies by site. If you have a template-driven site, you want a representative sample of each template and any page that has unique logic wrapped around how it is rendered. Any pages that are important for indexing, such as HTML sitemaps, should also be monitored, as should content that is manually updated and optimized. If you need to monitor hundreds or even hundreds of thousands of URLs, contact us for info about our Enterprise plans.
Yes, if any schema gets removed or changes, you will get an alert.
If you have a Gold level account, you can create a test on the edit domain screen. We will search for a specific string and generate an alert if it is removed. For example, you could use this to monitor the Adobe Tag Manager tag.
Yes, with our Enterprise or Agency edition. With this, we will monitor a specific section of a page and let you know about any changes. Contact us for an upgrade.
Connecting your Google Search Console account enables several additional features:
- We will analyze keyword and page performance and generate a weekly winners and losers report that will be available every Monday morning.
- The data is available in the UI for deeper analysis.
- If you have a Gold level of service or higher, we will retain the archived data beyond Google’s 90-day limit.
- You can click to see the impact of your historical changes.
- We will analyze your submitted sitemaps and generate alerts if there is either a drop in submitted URLs or a drop in indexed URLs. (This feature is currently in progress.)
There are action icons next to the URLs on the crawl summary and URL history screens. Select the full-source diff icon and you can step through all changes between crawls.
Yes, there is an option on the source code diff to view only text changes to the page.
Yes, it is on the URL history page. Click a URL on the domain overview or crawl summary page to view the page load data. Render time is broken down into three components:
- Server response
- Resource load
- Render
We capture that data when we render the page using a headless browser (PhantomJS). The data is not yet perfect; you may see an occasional large spike in server response time (one of the components) when our crawlers hit 100% CPU. Once we over-provision our servers, that problem will be resolved and we will start to generate alerts.
Our user agent string is ‘Seoradar’. You can configure a different user agent on the domain settings screen.
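If you want to make sure your own robots.txt does not block us, a standard entry keyed to that default user agent would look like this (example only; adapt it to your own robots.txt policy):

```
# Example robots.txt entry allowing the crawler by its default user agent
User-agent: Seoradar
Allow: /
```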
Yes. If you have a case where you would like us to ignore robots.txt, contact us.
We support the meta-fragment tag and will go out to fetch the HTML snapshot. We will also generate an alert if the meta-fragment tag is deleted.
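For reference, the meta-fragment tag in question is the standard tag from the AJAX crawling scheme; it signals that an HTML snapshot is available:

```html
<!-- Standard meta fragment tag; crawlers that support it fetch the
     HTML snapshot via the ?_escaped_fragment_= URL -->
<meta name="fragment" content="!">
```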
With our Gold version, we monitor hreflang tags for any insertions, deletions or changes.
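For reference, hreflang annotations are the standard alternate link elements in the page head, for example (example.com is a placeholder domain):

```html
<!-- Example hreflang annotations; an insertion, deletion or change here triggers an alert -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/">
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/">
```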
Yes. If you go to the add domain screen and add a mobile site, it will automatically synch the URLs that it tests with your main site configuration.
For the basic plans (Bronze, Silver and Gold), additional users are view only, see all domains and do not get notifications. For Enterprise and Agency plans, we have enhanced multi-user support: admins can assign domains to specific users, and those users will get full alerts for the domains they are assigned to.
Currently, we overlay GA data on top of your change history. You can think of this as an auto-annotation of GA data. You can change segments and metrics and then drill into your change history to research changes to key metrics.
You can assign categories to pages. For example, product page, review page or search result page. Once you do this, you can filter the crawl summary by page category and review alerts specific to that page category.
Competitors are monitored weekly and do not generate email alerts. Only essential alerts are generated for competitors. However, they do not count against your domain count.