Search engine ranking is one of those things that website owners are constantly fighting to improve. The worst fate in search engine ranking is being labeled “supplemental.” With Google, once your pages land in the supplemental index, getting them back out is like paddling upstream through the rapids with a spoon.
Recently, an article was published that gave some very worthy tips on doing just that, rescuing your site from the black hole of “supplemental” listing.
The first point the writer made is to make sure that the URLs involved are short and search engine friendly. Making them friendly is fairly easy: take out the special characters. Characters like “%”, “&”, and “?” usually signal encoded values, session IDs, or long query strings, and crawlers get far less to work with from those than from a clean, readable path, so keep them out of your URLs.
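As an illustration, here is a minimal sketch of that cleanup: a small function (the name and rules are my own, not from the article) that turns a messy page title into a short, hyphenated URL slug with no special characters left in it.

```python
import re

def slugify(title):
    # Lowercase the title, collapse every run of characters that is not
    # a letter or digit into a single hyphen, then trim stray hyphens
    # from the ends -- nothing that needs percent-encoding survives.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("50% Off: Spring Sale!"))  # 50-off-spring-sale
```

A clean slug like that can then sit at the end of a short path such as /articles/50-off-spring-sale rather than a query string full of IDs.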
Next, check that 404 error page. If a missing page doesn’t actually return a 404 status code, search engines can treat the error page as real content, see it repeated at every bad URL, and into the black hole the site goes. Fixing the 404 response can rescue the entire site.
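One quick way to check this is to request a page you know doesn’t exist and look at the HTTP status code that comes back. The sketch below spins up a tiny local server as a stand-in for a real site (the handler and URL are purely illustrative) and verifies that a missing page reports a genuine 404 rather than a 200.

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Correct behavior: an unknown path gets a real 404 status,
        # not a friendly error page served with status 200.
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Page not found</h1>")

    def log_message(self, *args):
        pass  # silence request logging

def status_of(url):
    # urllib raises HTTPError for 4xx/5xx; the code is on the exception.
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
code = status_of(f"http://127.0.0.1:{server.server_port}/no-such-page")
print(code)  # a healthy error page reports 404, not 200
server.shutdown()
```

If your own site prints 200 for a made-up URL, you have the “soft 404” problem the article is warning about.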
Take a look at the page content. Making sure that each page has unique keywords, page titles, and descriptions will not only keep you out of the black hole but also make your site more attractive to search engines and human visitors alike.
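In practice that means each page’s head section carries its own title and description instead of site-wide boilerplate; the wording below is purely illustrative:

```html
<!-- Unique to this page: a title and description matching its content -->
<head>
  <title>Escaping the Supplemental Index | Example Site</title>
  <meta name="description"
        content="Clean URLs, proper 404 responses, unique titles, and a
                 robots.txt file can pull pages out of the supplemental index.">
</head>
```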
Finally, keep pages that duplicate content out of the search engines’ reach. Each page on your site should be unique. Some webmasters like to offer printable versions of certain pages, especially things like articles, checklists, or even diagrams. The problem is that the search engines then see the duplicate content and shunt both pages into the supplemental index. A simple solution is to add a “robots.txt” file to your site and tell the search engines to skip over the printable copies. They will not index those pages, and that will allow the original pages to be indexed correctly.
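A robots.txt along these lines (the /print/ directory is a hypothetical place where the printable copies live) keeps crawlers away from the duplicates while leaving the rest of the site open:

```
User-agent: *
Disallow: /print/
```

The file goes at the root of the site, and well-behaved crawlers will skip anything under the disallowed path.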