How To Use Index Status in Google Webmaster Tools to Diagnose SEO Problems
Index Status in Google Webmaster Tools
In late July, Google added Index Status to Webmaster Tools to help site owners better understand how many pages are indexed on their websites. In addition, Index Status can help webmasters diagnose indexation problems, which can be caused by redirects, canonicalization issues, duplicate content, or security problems. Until now, many webmasters relied on less-than-optimal methods for determining true indexation, such as running site: commands against a domain, subdomain, or subdirectory. This was a maddening exercise for many SEOs, since the number shown could change radically (and quickly).
So, Google adding Index Status was a welcome addition to Webmaster Tools. That said, I'm getting a lot of questions about what the reports mean, how to analyze the data, and how to diagnose potential indexation problems. That's exactly what I'm going to address in this post. I'll introduce the reports and then explain how to use that data to better understand your site's indexation. Note, it's important to understand that Index Status doesn't necessarily answer questions. Instead, it might raise red flags and prompt more questions. Unfortunately, it won't tell you where the indexation problems reside on your site. That's up to you and your team to figure out.
The Index Status reports are under the "Health" tab in Google Webmaster Tools. The default (or "Basic") report shows a trending graph of total pages indexed over the past year. This report alone can signal potential problems. For most sites, you should see a steady increase in indexation over time. For example, this is a normal indexation graph:
Basic Index Status Report in Webmaster Tools
But what about a trending graph that shows spikes and valleys? If you see something like the graph below, it could very well mean you are experiencing indexation issues. Notice how the indexation graph spikes, then drops, only to spike again. There may be legitimate reasons for this, based on changes you made to your site. But if you have no idea why your indexation is spiking, further site analysis is required to understand what's going on. Once again, this is why SEO audits are so powerful.
Trending Spikes in Index Status Basic Reporting
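If you track the "Total indexed" numbers over time (Webmaster Tools lets you eyeball the graph, but keeping your own record helps), a quick sketch like this can flag the kind of spikes and drops shown above. The counts and the 50% threshold below are made up for illustration; this is not part of Webmaster Tools itself.

```python
# Sketch: flag suspicious week-over-week swings in indexed-page counts.
# The weekly counts and the threshold are illustrative assumptions.

def flag_spikes(counts, threshold=0.5):
    """Return the indices of weeks where the indexed count changed by
    more than `threshold` (as a fraction of the previous week's value)."""
    flagged = []
    for i in range(1, len(counts)):
        prev, curr = counts[i - 1], counts[i]
        if prev and abs(curr - prev) / prev > threshold:
            flagged.append(i)
    return flagged

# A healthy site trends gradually; this made-up series spikes twice.
weekly_indexed = [1200, 1250, 1300, 4800, 1350, 1400, 5200, 1380]
print(flag_spikes(weekly_indexed))  # → [3, 4, 6, 7]
```

Any week this flags is worth lining up against site changes (or the annotations discussed later in this post).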
Now it's time to dig into the advanced report, which definitely provides more data. When you click the "Advanced" tab, you'll see four trending lines in the graph. The data includes:
    Total Indexed
    Ever Crawled
    Not Selected
    Blocked by Robots
"Total indexed" is the same data we saw in the basic report. "Ever crawled" shows the total number of pages ever crawled by Google (the cumulative total). "Not selected" includes the total number of pages that have not been selected for indexing, either because they look extremely similar to other pages or because they redirect to other pages. I'll cover "Not selected" in more detail below. And "Blocked by robots" is just that: pages that you are choosing to block. Note, those are pages you are hopefully choosing to block… More about that below.
Advanced Index Status Report in Google Webmaster Tools
What You Can Learn From Index Status
When you analyze the advanced report, you might notice some strange trending right off the bat. For example, if you see the number of pages blocked by robots.txt spike, then you know someone added new directives. One of my clients had that number jump from 0 to 20,000+ URLs in a short period of time. Again, if you want this to happen, that's totally fine. But if it surprises you, then you should dig deeper.
Depending on how you structure a robots.txt file, you can easily block important URLs from being crawled and indexed. It would be smart to analyze your robots.txt directives to make sure they are accurate. Speak with your developers to better understand the changes that were made, and why. You never know what you are going to find.
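To make that concrete, here is a small, hypothetical robots.txt showing how one overly broad directive can block far more than intended (the paths are illustrative, not from any real site):

```
User-agent: *
# Intent: hide a staging area. But this broad rule blocks EVERY
# URL under /products/, including all live product pages:
Disallow: /products/

# What was probably intended (narrower path):
# Disallow: /products/staging/
```

If the "Blocked by robots" line in the advanced report jumps, a directive like the first one above is exactly the kind of change to look for.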
The Red Flag of "Not Selected"
If you notice a large number of pages falling under "Not selected", that could also signal potential problems. Note, depending on the type of website you have, it might be completely normal to see a larger number of "Not selected" pages than indexed pages. It's natural for Google to run into some redirects and non-canonical URLs while crawling your site, and that's especially the case with ecommerce sites or large publishers.
But that number should not be extreme… For example, if you see the number of pages flagged as "Not selected" suddenly spike to 100K when you only have 1,500 pages indexed, then you might have a new technical issue on your hands. Maybe each page on your site is resolving at multiple URLs based on a coding change. That would yield many "Not selected" pages. Or maybe you implemented thousands of redirects without realizing it. Those would fall under "Not selected" as well.
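One quick way to investigate the "each page resolves at multiple URLs" scenario is to enumerate the usual duplicate-prone variants of a URL and then check which ones return 200 instead of redirecting to a single canonical version. The sketch below just builds the variant list; the domain and path are illustrative.

```python
# Sketch: build the common URL variants that often resolve to the same
# page (http vs https, www vs non-www, trailing slash vs none). If more
# than one of these returns 200 for the same content, Google may file
# the duplicates under "Not selected". Domain and path are illustrative.

def url_variants(host, path):
    """Return the eight usual duplicate-prone variants of one URL."""
    base = path.rstrip("/")
    variants = []
    for scheme in ("http", "https"):
        for h in (host, "www." + host):
            for p in (base, base + "/"):
                variants.append(f"{scheme}://{h}{p}")
    return variants

for url in url_variants("example.com", "/widgets"):
    print(url)
```

Feeding each variant to an HTTP client and recording the status codes (200 vs 301) would show whether your redirects and canonicalization are actually consolidating these URLs.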
Index Status can also flag potential hacking scenarios. If you notice the number of pages indexed spike or drop significantly, it could mean that someone (or some bot) is adding or deleting pages from your site. For example, someone might be adding pages to your site that link out to a number of other websites delivering malware. Or maybe they are inserting rich anchor text links to other risky sites from newly created pages on your site. You get the picture.
Again, these reports don't answer your questions; they prompt you to ask more. Take the data and speak with your developers. Find out what has changed on the site, and why. If you are still baffled, then have an SEO audit completed. As you can guess, these reports would be much more useful if the problematic URLs were listed. That would provide actionable data right within the Index Status reports in Google Webmaster Tools. My hope is that Google adds that data some day.
Bonus Tip: Use Annotations to Document Site Changes
For many websites, change is a constant occurrence. If you are rolling out new changes to your site on a regular basis, then you need a good way to document those changes. One way of doing this is by using annotations in Google Analytics. Using annotations, you can add notes for a specific date that are shared across users of the GA profile. I use them often when changes are made SEO-wise. Then it's easier to identify why certain changes in your reporting are happening. So, if you see strange trending in Index Status, double check your annotations. The answer may be sitting right in Google Analytics. :)
Adding Annotations in Google Analytics
Summary – Analyzing Your Index Status
I think the moral of the story here is that normal trending can indicate strong SEO health. You want to see gradual increases in indexation over time. That said, not every site will show that natural increase. There may be spikes and valleys as technical changes are made to a website. So, it's important to analyze the data to better understand the number of pages that are indexed, how many are being blocked by robots.txt, and how many are not selected based on redirects or canonical issues. What you find might be completely expected, which would be good. But you might uncover a serious issue that's inhibiting important pages from being crawled and indexed. And that can be a killer SEO-wise.