Can poor site structure hurt your rankings? Rick DeJarnette of Bing Webmaster Center says it will. Writing on the Bing Community Blog, he stated:
“You can have great content and a plethora of high quality inbound links from authority sites, but if your site’s structure is flawed or broken, then it will still not achieve the optimal page rank you desire from search engines.”
It shouldn’t come as a surprise. We have reported in the past that poor site structure can be an obstacle to a search engine spider ‘seeing’ all of your pages. If the spider cannot see those pages, it won’t visit them and they won’t be indexed. So what does Bing suggest?
- Use descriptive file and directory names – this has become standard practice for all search engines in recent years; the suggestion is to include relevant keywords in your file and directory names.
- Limit directory depth – search engine spiders are only on your site for a short period of time, which means they are not going to dig too far to find pages.
- Use 301 redirects for moved pages – if you have moved a page, a permanent redirect helps both visitors and spiders find it rather than hitting a 404 Page Not Found error. Your page has already been indexed; why lose that because search engines can no longer find it?
- Use a robots.txt file – there are certain areas that a search engine spider simply cannot reach and read properly, so rather than frustrating the spider, prevent it from even looking. These pages include forms, authentication pages and pages with no useful content.
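To illustrate the 301 tip, here is a minimal sketch of a permanent redirect on an Apache server using the mod_alias `Redirect` directive (the paths shown are hypothetical placeholders; other servers such as nginx use their own syntax):

```apache
# Permanently redirect a moved page to its new URL.
# A 301 tells spiders to update their index to the new address.
Redirect 301 /old-page.html /new-page.html
```

Because the redirect is marked permanent, search engines transfer the indexed page to the new URL instead of dropping it.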
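And a robots.txt blocking the kinds of areas Bing mentions might look like this (a sketch only; the directory names are hypothetical and the file must live at the root of your domain):

```text
# Keep spiders out of areas they cannot read usefully:
# forms, authentication pages and other no-content areas.
User-agent: *
Disallow: /forms/
Disallow: /login/
Disallow: /search-results/
```

Note that `Disallow` asks well-behaved crawlers not to fetch those paths; it is a crawling hint, not an access control mechanism.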
These are useful tips, but hardly anything new. Having said that, poor site structure will hurt your rankings – with that in mind, how tight is your site’s setup?