Audit Your Website With Deep Crawl


If you want your website to be popular, you need to Deep Crawl your way to higher search engine rankings.

As a website owner, one thing many webmasters are not aware of is that problems lurk in their pages. Many sites are left running with errors, suffering from poor performance, bad SEO, duplicate content, broken links, and a whole lot more. What you should do regularly is give your site a content audit.

This kind of work can be taxing, long, and laborious. If you have a huge website with lots of pages, the bigger it is, the more brutal the audit is going to be. Fortunately, there are automated tools that can do this sort of job. The one I am going to discuss here is a Software as a Service (SaaS) offering called Deep Crawl.

And Away We Go!

In less than five minutes, I created a new project to spider this website using the easy-to-use entry screens, clicked the button to crawl, and had a comprehensive report generated.

All of this was done through Deep Crawl’s impressive dashboard, which is so streamlined and well thought out that I immediately appreciated what I was staring at. Very well done, totally polished, and professional looking.

Now, it was only a few months ago that I tore down this website and started over from scratch. I had been using Drupal 6 for years and got tired of having to back up the site and deal with an upgrade process I didn’t like. So I switched to my favorite, WordPress, picked up a nice theme, and eased my way into what it is today.

Fifty-six posts and eight pages later, I decided that this site needed a crawl report to find out if any problems were lurking. Boy, was I in for a surprise.

After setting up a project, I ran Deep Crawl for the first time. It took less than five minutes to crawl my site and generate an audit report. Deep Crawl presented the final report in a nice layout that looks super on my widescreen desktop monitor. Since the report is way too big to show here at full screen size, I have broken it up into parts to give you a taste of what it offers.

Site Issues

This screen shows the number of problems and categorizes them. You can drill down to find out more.
Deep Crawl Report Issues

Pages Breakdown

A cool pie chart breakdown of the different page types.
Deep Crawl Pages Breakdown

HTTP & HTTPS Pages

A count of how many pages are HTTP-based and how many are HTTPS-based.
Deep Crawl HTTP and HTTPS Pages

Web Crawl Depth

A super chart showing the different categories of pages and their crawl depths.
Deep Crawl crawl depth

Fixing Stuff With Deep Crawl

Now that I had my report in hand, I was off for the next three days cleaning this website up.

Canonical URL Errors

The first thing I had to do was get rid of all the canonical URL errors. I did a little research and even penned an article on the subject just to remember what I learned – Understanding SEO: The Canonical Url. Since this is a WordPress website, I decided to install the Yoast SEO plugin. It turns out canonical URLs are automatically generated for every post and page, so I had to do nothing at all! That effort also got rid of a slew of duplicate content and title errors.
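If you are curious what a crawler is actually looking for here, the check boils down to whether each page carries a canonical link tag in its head. Below is a minimal sketch, not Deep Crawl’s actual logic, that fetches a page and reports its canonical URL. It assumes the requests and beautifulsoup4 packages, and the URL is just a placeholder for a page on your own site.

```python
import requests
from bs4 import BeautifulSoup

def check_canonical(url):
    """Fetch a page and report its canonical URL, if any."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Yoast SEO emits a <link rel="canonical" href="..."> tag in the <head>
    tag = soup.find("link", rel="canonical")
    if tag and tag.get("href"):
        print(f"{url} -> canonical: {tag['href']}")
    else:
        print(f"{url} -> no canonical tag found")

# Placeholder URL; substitute a post or page from your own site
check_canonical("https://example.com/sample-post/")
```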

When you re-run Deep Crawl, it keeps track of the changes between crawls. This is crucial so that you can correlate what you did with the outcome of the audit. If what you did made things worse, you’ll know right away and can roll back those changes.

Deep Crawl canonical url fixes

Meta Descriptions

Meta descriptions help search engines like Google show information about what your web page is about in their search results. To fix these problems, I used Yoast SEO. For posts and pages, I supplied an excerpt and used the %%excerpt_only%% variable in the settings. Categories and tags do not get meta descriptions by default, so I used %%tag_description%% and %%category_description%% in the taxonomies section.

One thing you will find annoying is having your archive subpages crawled and indexed by the search engines. The problem I see with this is that they are not static in nature and are always changing as new posts, pages, or tags are added. They are pretty much worthless to a visitor, and most of the time their contents are outdated. To avoid having these archive pages indexed, I chose to use the Yoast SEO option to noindex subpages of archives.

This also prevents duplicate meta description errors from coming up.
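As a rough illustration of what the audit checks on each page, here is a small sketch, again using requests and beautifulsoup4 with placeholder URLs, that reports whether a meta description is present and whether the page carries a robots noindex directive, which is what the Yoast archive-subpage option adds.

```python
import requests
from bs4 import BeautifulSoup

def check_meta(url):
    """Report the meta description and robots directives for a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    description = soup.find("meta", attrs={"name": "description"})
    robots = soup.find("meta", attrs={"name": "robots"})

    desc_text = description.get("content", "").strip() if description else ""
    robots_text = robots.get("content", "") if robots else ""

    print(url)
    print(f"  meta description: {desc_text or 'MISSING'}")
    print(f"  robots directive: {robots_text or '(none)'}")
    if "noindex" in robots_text.lower():
        print("  -> this page will be kept out of the search index")

# Placeholder URLs; a regular post and an archive subpage from your own site
check_meta("https://example.com/sample-post/")
check_meta("https://example.com/category/news/page/2/")
```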

Duplicate Content Errors

Deep Crawl flagged some of my tag pages as duplicate content based on their body text. After studying the pages, I concluded it was pretty much a false positive. Regardless, I tried updating the description on all of the tags, thinking that would clear it, but it didn’t. There must be some criteria that Deep Crawl uses to decide what counts as duplicate or similar content. It goes to show, though, that this is not in your control but in the crawler’s decision, and sometimes it can be wrong.

I later found the duplication precision setting, which seems to be a sensitivity parameter. I changed it from 3 to 5 and that fixed the problem.

Disallowed JS/CSS Files

Disallowed JS/CSS files were another false positive that came up. Deep Crawl reported that both of my Google Fonts references were being restricted. That wasn’t the case.
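If you want to double-check a “disallowed” report yourself, the question is simply whether the robots.txt on the host serving the resource blocks it. Here is a minimal sketch using Python’s standard urllib.robotparser; the resource URL is a placeholder, and a commercial crawler may interpret robots rules a little differently.

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def is_allowed(resource_url, user_agent="Googlebot"):
    """Check whether the resource host's robots.txt permits crawling it."""
    parts = urlparse(resource_url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches and parses robots.txt
    return parser.can_fetch(user_agent, resource_url)

# Placeholder resource; substitute the CSS or font URL that was flagged
print(is_allowed("https://fonts.googleapis.com/css?family=Open+Sans"))
```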

Max Fetch Time

Deep Crawl keeps track of URLs that take over a certain amount of time to fetch. They call this Max Fetch Time. By default it is set to two seconds, and frankly, that is too low a bar, especially since fetch time depends on what is on the page. For example, throughout my site I have links to Amazon, Google AdSense, and other advertisers that can slow down the page load. Two seconds is asking for a miracle; four seconds is a more practical limit. Fortunately, you can change this option as shown here:

Deep Crawl report setup
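If you want a rough second opinion on fetch times, you can time the response yourself. This sketch uses the requests package and a placeholder URL, and it only measures how long the server takes to respond, not how long the full page with ad scripts takes to render, so treat it as a lower bound. The four-second threshold is just the limit I settled on.

```python
import requests

# Threshold in seconds; Deep Crawl defaults to two, I prefer four
MAX_FETCH_SECONDS = 4.0

def fetch_time(url):
    """Time how long the server takes to respond for a URL."""
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()
    status = "OK" if seconds <= MAX_FETCH_SECONDS else "SLOW"
    print(f"{status}  {seconds:.2f}s  {url}")

# Placeholder URL; substitute pages from your own site
fetch_time("https://example.com/sample-post/")
```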

Max Links

This flag comes up when you have too many anchor tag links on a page. For my site, some of the flagged pages were due to the tag cloud widget placed in the sidebar. It appeared on every post and page, and I decided it was not a good idea to put it there.

Instead, I came up with a better idea, one I had used on a custom WordPress theme I developed many years ago. Not wanting to reinvent the wheel, I searched the WordPress plugin directory and found Multi-column tag map. It’s a perfect solution for my needs: all I did was create a page and put its shortcode in it. This way, I can offer it as part of my main menu navigation with the label “Explore” and let visitors browse by tag name.
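To get a feel for which pages would trip this flag, here is a rough sketch that counts the anchor tags on a page, again with requests and beautifulsoup4. The 250-link threshold is only an assumption for illustration, not Deep Crawl’s actual limit, and the URL is a placeholder.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical threshold; adjust to whatever limit your crawl report uses
MAX_LINKS = 250

def count_links(url):
    """Count the anchor tags with an href attribute on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    flag = "TOO MANY" if len(links) > MAX_LINKS else "OK"
    print(f"{flag}  {len(links)} links  {url}")

# Placeholder URL; a post that carried the tag cloud widget is a good test
count_links("https://example.com/sample-post/")
```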

Summary

Overall, using Deep Crawl to audit your website is something you should do at least once a week. It is better to find problems early, before they become even bigger problems later. If you really want to rank high in Google, you need to use this tool. Highly recommended!