I think it is fair to say that Internet users love change; in fact, I imagine that many of them embrace the concept wholeheartedly. After all, there are not many media guaranteed to keep you engaged and on your toes, learning new things on a daily basis.
The changes that have occurred on the search engines since I began StepForth in 1997 have been remarkable and very characteristic of the ever-changing environment the Internet embodies. In keeping with such change, it stands to reason that the techniques required to achieve top search engine placements will change as well. The prerogative of the search engines has been, and always will be, to advance their technology in an effort to provide better and more effective search results for their users. Effective search results require the identification, and ultimately the complete eradication, of manipulative SEO tactics that amount to Spam. This quest for the Holy Grail of search results necessarily requires that search engine marketing tactics change and grow with the new technology. Unfortunately, many SEO strategies still being used by webmasters were put to bed a long time ago. In many cases these strategies are not only ineffective but are now considered Spam, which often has dire consequences for rankings. The following is a listing of what I consider to be the Top 10 worst search engine optimization tactics:
1. Doorway Pages (aka Gateway Pages, Leader Pages, etc.)
Definition: Creating multiple web pages that are devoid of useful content but heavily optimized for search engine rankings. Each page is optimized for a variation of a keyword phrase or for a completely different keyword target. The essence of this concept was to fool the search engines into thinking these pages were highly relevant and to earn top rankings for them under their targeted phrases. When a surfer stumbled onto one of these pages they were often shown a "Click Here to Visit Joe's Pizza" link that they had to click on to actually arrive at the legitimate website.
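To illustrate, a bare-bones doorway page (hypothetical business, keywords, and URL) might look something like this:

    <html>
    <head>
    <title>Pizza Delivery Victoria - Victoria Pizza Delivery - Best Pizza Victoria</title>
    </head>
    <body>
    <h1>Pizza Delivery Victoria</h1>
    <p>Pizza delivery Victoria. Victoria pizza delivery. Best pizza delivery in Victoria.</p>
    <!-- No useful content; the page exists only to rank and pass visitors along -->
    <a href="http://www.example.com/">Click Here to Visit Joe's Pizza</a>
    </body>
    </html>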
Once among the most popular methods of attaining multiple search engine placements, doorway pages were widely used by webmasters until about 2000. Since then, doorway pages have become the most obvious form of Spam a search engine can find, and the repercussions are dire if the tactic is employed. Unfortunately, I still see many sites employing this tactic, and occasionally we even get calls from potential customers wondering why they have dropped off the search engines after using it. We also continue to receive requests to perform this tactic on a daily basis.
2. Invisible Text
Definition: Invisible text is implemented in a variety of ways in an effort to increase the frequency of keywords in the body text of a web page. Some of the implementation methods are: making text the same color as the background of the web page, hiding text behind layers, placing text at the very bottom of over-sized pages, etc.
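As a simple illustration (hypothetical keywords), invisible text is often as crude as matching the font color to the page background:

    <body bgcolor="#FFFFFF">
    <!-- Legitimate, visible page content here -->
    <!-- Hidden keyword repetition: white text on a white background -->
    <font color="#FFFFFF">pizza delivery pizza delivery pizza delivery pizza delivery</font>
    <!-- The same trick using CSS -->
    <div style="color: #FFFFFF; background-color: #FFFFFF;">pizza delivery pizza delivery</div>
    </body>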
This tactic is perilously old and obvious to search engine spiders. It constantly amazes me when a web site utilizes these methods for placement; invariably, placements are the last thing a webmaster will get from this tactic. Invisible text had its heyday from 1995 to 1999. This is not to say that invisible text didn't work after 1999, but the majority of web sites had stopped using it by that time as the search engines began implementing automated methods of detection and penalization.
3. Content Misrepresentation
Definition: Misleading search engines into believing your webpage is about topic 'A' when it is in fact about 'B'. This tactic was used primarily for the promotion of adult, gambling, and other extremely competitive search markets.
Unfortunately, this tactic is still in use; you will likely find an example or two every time you search! The fact is that this tactic is the simplest for a search engine to identify, and the result will be swift and complete: indefinite banishment from the search engine index. The worst offense in the realm of the search engines is to try to fool them.
4. Redirects
Definition: Redirects have some innocent uses (practical, legal, etc.), but they are also used nefariously to mislead search engines by making them believe that the page they have indexed is about 'A'. When a surfer visits the page, however, they are redirected to an entirely different site about 'B'.
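Two of the most common mechanisms (shown here with a placeholder URL) are the meta refresh and the JavaScript redirect:

    <!-- Meta refresh: forwards the visitor to another page after 0 seconds -->
    <meta http-equiv="refresh" content="0; url=http://www.example.com/real-page.html">

    <!-- JavaScript redirect: forwards the visitor as soon as the page loads -->
    <script type="text/javascript">
    window.location.href = "http://www.example.com/real-page.html";
    </script>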
In most cases the search engines have advanced enough to see this technique from a mile away. In fact, they usually ignore any page containing a redirect (correctly assuming that its content is useless) and spider the redirect destination instead. Redirects, unless they are blatantly Spam-related, do not directly result in intentional ranking penalties; however, they have no positive effect either.
5. Heading Tag Duplication
Definition: Heading Tags were created to highlight page headings in order of importance; hence the available Heading Tags: H1, H2, H3, and so on. This duplication technique involves placing more than one H1 tag on a webpage in order to enhance a particular keyword or phrase.
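For illustration (hypothetical keywords), the abuse typically looks like this:

    <!-- Heading Tag duplication: several H1 tags stuffed with keyword variations -->
    <h1>Discount Widgets</h1>
    <h1>Cheap Widgets</h1>
    <h1>Buy Widgets Online</h1>
    <!-- Proper structure uses a single H1 followed by H2 and H3 sub-headings -->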
This tactic is still very prevalent and likely still works on some search engines; however, none of the major search engines will respond well to this technique as it has been identified as a common manipulation.
6. Alt Tag Stuffing
Definition: Alt Tag stuffing is the act of adding unnecessary or repetitive keywords to the Alt Tag (or alternative tag - the words that appear when you hover over an image with your mouse pointer).
The Alt Tag is meant to be a textual description of the image it is attached to. There is nothing wrong with tailoring the Alt tag to meet your keyword goals IF the tag is still understandable and if the change still appropriately describes the image. The offense occurs when an Alt tag has obvious keyword repetition/filler which a search engine can key in on as Spam.
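A quick sketch (hypothetical image and keywords) of the difference:

    <!-- Acceptable: a readable description that happens to include a keyword -->
    <img src="storefront.jpg" alt="Joe's Pizza storefront in Victoria, BC">

    <!-- Alt Tag stuffing: obvious repetition a search engine can flag as Spam -->
    <img src="storefront.jpg" alt="pizza pizza pizza cheap pizza best pizza Victoria pizza delivery">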
7. Comment Tag Stuffing
Definition: Comment Tags are used to include useful design comments in the background source code (html) of a webpage. They are supposed to be used only for adding technical instructions or reminders; however, in times past these tags were used to artificially increase the keyword count for targeted phrases.
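For illustration (hypothetical keywords), here is a stuffed Comment Tag compared with a legitimate one:

    <!-- pizza delivery Victoria pizza delivery best pizza delivery cheap pizza delivery -->
    <!-- A legitimate comment simply documents the page for other designers: -->
    <!-- Navigation table begins here - do not edit without checking with the design team -->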
At one time there was some argument that this technique worked, but it has always been a "Black Hat" SEO technique which, even then, could result in placement penalties. Nowadays this technique will not help an SEO campaign; if anything, it will be ignored or produce a negative result.
8. Over Reliance on Meta Tags
Definition: Meta Tags are descriptive tags that appear in the head section of most webpages and are used to give search engines a concept of the page topic. The most common are the description and keyword tags.
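For reference, here is what the two most common Meta Tags look like in a page's head section (hypothetical content):

    <head>
    <title>Joe's Pizza - Pizza Delivery in Victoria, BC</title>
    <meta name="description" content="Joe's Pizza delivers fresh, hand-made pizza throughout Victoria, BC.">
    <meta name="keywords" content="pizza delivery, Victoria pizza, Joe's Pizza">
    </head>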
At one time, now-extinct search engines such as Infoseek relied a great deal on Meta Tags, and many took advantage of this to manipulate rankings with relative ease. In today's far more advanced climate, the search engines place cautious weight on Meta Tags, and when considering rankings Metas play only a fractional role. Some webmasters still consider Meta Tags the 'be-all and end-all' of ranking producers and forget to optimize the rest of their webpage for the search engines. With this line of thinking they miss the fact that the search engines place far more importance on the body text (the visible text) of the webpage. This is a critical error that will ultimately lead to low or insignificant rankings.
Note: An extremely common example of Meta Tag over-reliance is a web site that has been designed entirely graphically and is devoid (or nearly so) of html text that a search engine can read. A webpage such as this has no body text to index, so the Meta Tags alone provide only a small amount of relevance, which ultimately leads to poor rankings.
Over-reliance on Meta Tags does not produce intentional search engine penalties; however, the simple act of ignoring other ranking principles often means a lower ranking.
9. Duplicate Content
Definition: This tactic is blatant Spam that is still very common today. Essentially, the webmaster creates a web site, then creates duplicates of each page and optimizes each copy differently in order to obtain varying placements. Doing this saturates the search engine databases with redundant content that eats valuable bandwidth and hard drive space.
Duplicate content is a dangerous game often played by full-time marketers accustomed to trying to attain placements in aggressive markets. Avoid this tactic like the plague unless you are willing to sustain serious ranking damage if you get caught - which you likely will.
10. Automatic Submission and Page Creation
Definition:
- Automatic Submission is the use of software to submit a website to the search engines automatically, and often repeatedly.
- Automatic Page Creation is the use of software to create pages 'on the fly' from predefined content (body text, keywords, images, etc.) in order to produce "optimized" webpages that target specific keyword rankings on the search engines.
At StepForth the word 'automated' is an abomination when used in reference to SEO. The fact is that automated SEO campaigns are not as effective as manual (by hand) optimization techniques, AND such campaigns often require the use of doorway pages to lead search engine users to polished marketing pages on the true destination site. That makes this a double offense, because it relies on the banned doorway page technique. My strong prejudices aside, let's take a short, logical look at both tactics noted here:
A) Automatic Submission
Search engines make the majority of their profit from surfers like you viewing their advertising. Do you think that allowing automated submission tools to submit a web site (which bypasses the search engines' advertisements) is in the search engines' best interest? No; in fact, the submission companies have had to upgrade their software repeatedly to try to subvert the search engines' latest efforts to stop their programs. There are also concerns about bandwidth, because automated tools have been known to repeatedly submit sites, and sometimes each individual page within a site.
All in all, this leaves the submitter in an unstable position where their submissions may or may not be ignored. That is not even considering the fact that some automated tools claim to submit a website once a day, once a week, or once a month! The cardinal rule of search engine submission: submit ONCE, and although it may take a while (up to 2 or 3 months at most), the site will get spidered at some point. If the site is not listed within a few months, then resubmit. If a site is submitted to a search engine too often, the submissions are treated as Spam and, frankly, the website will not fare well. As for the major engines like Google: be patient, and definitely don't submit more than once if you can help it.
B) Automatic Page Creation
If a page is automatically created, will it have the kind of quality content that search engines require for their index? And if the page is automatically created, will it not be using repetitive content? There may be a few instances where the answers to these questions vary; however, I imagine the answer will be 'No' 99% of the time, which instantly marks this as a poor and dangerous search engine optimization technique.
In summary, I am desperately hoping that this Top 10 listing prompted a "DUH" from you, because variations of this list have been repeated to exhaustion in many introductory SEO forums and articles. Unfortunately, I know there are still many who have fallen into the trap of using such dated and dangerous techniques, and for them I hope this article prompts a rapid change in SEO policy.
Finally, be extremely wary of any search engine optimization company that suggests any of these tactics. Sure, there may be a small chance that a few of them will work in the short term; however, that outcome is not only rare, it is also a great way to get banned from the major search engines completely.
About The Author
Ross Dunn is the founder and CEO of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth has provided professional search engine placement and management services since 1997. Ross is a search engine optimization and placement expert with over 9 years of marketing experience and is a Certified Internet Marketing and Business Strategist (CIMBS). Blending his experience in the art of web design and search engine optimization, Ross offers a unique and informed perspective on obtaining top search engine placements. Ross can be reached at ross@stepforth.com.