Last Updated on September 13, 2024
You cannot learn which SEO techniques work, and how to implement them properly, without making mistakes along the way. Some mistakes are far more common than you might expect, and those are the ones everyone learning on-page search engine optimization should be aware of.
1. Using the Same Title On Many Pages
The page title is one of the most important signals a search engine uses to determine what a webpage is about. Every single page on your site should have a unique title that accurately reflects its purpose and content. Never create pages with the same title: the search engine will treat them as near duplicates, and this will hurt your rankings. Remember, too, that the title should include the main keyword phrase you are optimizing that page for.
Interesting Fact: Google Webmaster Tools can highlight when many or all of your pages are using the same title (yes, the mistake is so common that Google's engineers put development effort into flagging it for webmasters).
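As a minimal sketch (the page and keyword below are made up for illustration), a unique, descriptive title lives in the page's head:

```html
<!-- Hypothetical product page: one unique, keyword-aware title per page -->
<head>
  <title>Handmade Leather Wallets – Acme Leather Goods</title>
</head>
```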
2. Meta Keyword Tag Mistakes
The meta keywords tag is poorly understood by most people. Google, Bing and Yahoo have all officially stated that they do not use the tag to determine what a page is about, and that has been their policy for years. Adding meta keywords is therefore an unnecessary step, and filling them in on every single page is a waste of your time. Worse, you are handing competitors useful information about the keywords you are targeting in your optimization campaigns.
Interesting Fact: As early as 2007, most search industry leaders agreed that the tag is nearly useless. Google declared in late 2009 that it had stopped supporting it, and Yahoo officially declared the same a month later.
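For reference, this is the tag in question; the keyword list below is purely illustrative:

```html
<!-- The meta keywords tag: ignored by the major search engines, and it exposes your keyword list -->
<meta name="keywords" content="leather wallets, handmade wallets, mens wallets">
```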
3. The Use of robots.txt to Block Content
In the past, people used robots.txt files to block content from search engines, either to build up a much bigger content base before the official launch or in the belief that older pages would carry more weight in the rankings. The truth is that blocking content from a search engine is a really bad idea, because you want as much exposure as possible. If people cannot find your pages, they cannot use your service, and a viral effect will never appear. Do not lock out robot visitors with a robots.txt file.
Interesting Fact: A lot of webmasters report that Google has not indexed their site after months, only to find that their default robots.txt was blocking all content. More and more SEOs are opting for a meta robots noindex tag instead of robots.txt when they genuinely need to keep a page out of the index. You can also use robots.txt to specify the location(s) of your sitemap(s).
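If you really do need to keep a single page out of the results, a minimal sketch of the meta robots approach mentioned above looks like this (place it only on the pages you want hidden):

```html
<!-- Keeps this one page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```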
4. Content Keyword Stuffing
This technique worked wonders a few years ago, but search engine results can no longer be manipulated this way. In the past, people simply crammed extra keywords into the content area and the page would rank higher in organic results. Do this today and you are far more likely to receive a penalty: the site will rank poorly or can even be blacklisted. A very high bounce rate is often a clear indicator that your content is keyword-heavy.
Interesting Fact: Older search engines like AltaVista supposedly used an inverted index for each keyword and ranked pages based on the number of times a keyword appeared on a page (along with other factors).
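For illustration only (the copy below is invented), this is the difference between stuffed and natural content:

```html
<!-- Stuffed: reads like a keyword list and invites a penalty -->
<p>Cheap wallets, buy cheap wallets, best cheap wallets, cheap wallets online.</p>

<!-- Natural: the keyword appears once, in a sentence written for people -->
<p>Looking for an affordable wallet? Our handmade leather range starts at $20.</p>
```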
5. Lack of Internal Links
The truth is that linking your site's pages together is really easy, yet this factor is commonly overlooked. Do it well: a good sitemap and proper internal links will raise the quality of your site and the weight of every single link.
Interesting Fact: Google can give searchers the option to jump directly to a section of a page if you have defined named anchors and use those named links internally.
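A minimal sketch of both ideas (the page names and anchor are hypothetical):

```html
<!-- Internal link from one page to another, with descriptive anchor text -->
<a href="/guides/leather-care.html">Read our leather care guide</a>

<!-- Named anchor (the id) plus an internal link that jumps straight to it -->
<h2 id="cleaning">Cleaning your wallet</h2>
<a href="/guides/leather-care.html#cleaning">Jump to the cleaning section</a>
```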
6. Broken Links and Sitemaps
Any blog can gain many benefits from a proper sitemap, because it connects all the pages and gives the search crawler a way to understand which page links to which. The robot that indexes your pages will use the sitemap to decide where to go after indexing the first one. The problem is that the crawler behaves much like a human visitor: when it hits broken links, it does not like them. Make sure you pay close attention to removing links that point to pages that no longer exist.
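A simple HTML sitemap page is just a page of links to every important page on the site; the URLs below are hypothetical:

```html
<!-- sitemap.html: a plain page of links the crawler (and visitors) can follow -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/guides/leather-care.html">Leather care guide</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```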
7. Bad URLs
Most people do not understand what a bad URL is and assume it is the same thing as a broken link. A bad URL is a link that cannot be found, visited, linked to, clicked or submitted to any social networking site. The difference is that a broken link used to work and no longer does, while a bad URL never worked in the first place. As with broken links, make sure there are no bad URLs on your sites.
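As a purely hypothetical illustration, a misspelled scheme or an unencoded space is a typical cause:

```html
<!-- Bad URL: it never worked; note the misspelled protocol and the raw space -->
<a href="htp://example.com/my page.html">Special offer</a>

<!-- Valid URL: correct scheme, space percent-encoded -->
<a href="http://example.com/my%20page.html">Special offer</a>
```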
8. Flash Only Sites
This is a really common mistake, usually made because of a poor understanding of basic search engine optimization principles. A Flash-only website contains essentially no text, and a site with no text is really hard to rank highly in search engines. In most cases, beginners build a site entirely in Flash because it looks great aesthetically. If you really want to use Flash, the best approach is to create two versions of the site: a Flash version and an HTML version.
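A lighter-weight variation on that idea (a sketch, not the two-site approach described above, and the file names are hypothetical) is to embed the Flash movie with real HTML text as fallback content, so crawlers and non-Flash visitors still get something to read:

```html
<!-- Flash embed with indexable HTML fallback inside the object element -->
<object type="application/x-shockwave-flash" data="intro.swf" width="800" height="600">
  <h1>Acme Leather Goods</h1>
  <p>Handmade leather wallets and belts, crafted in small batches.</p>
</object>
```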
9. No Alt Text for Images
This is a mistake most people make simply because they do not realize it. When you insert an image into a page, you can add alt text (and, optionally, a title attribute). This counts as extra text and it really matters, because the alt text adds to the optimization of the entire page. Make sure you use long-tail keywords that relate to the general content of the webpage, so that a really simple piece of HTML helps the page rank higher in search engines.
Interesting Fact: More and more SEOs believe that Google Images search is going to have a big presence in Google's unified search results in 2012.
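A minimal sketch (the file name and alt text are made up):

```html
<!-- Descriptive alt text gives the page extra, relevant text to rank on -->
<img src="brown-leather-wallet.jpg" alt="Handmade brown leather bifold wallet">
```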
10. The Use of Header Tags
Although this is the last point on the list, it is one of the most important things to be aware of in on-page SEO. Header tags such as H1, H2, H3 and so on bring benefits in the SERPs because search engines give extra weight to the words you place inside them when ranking the page.
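A short sketch of a sensible heading hierarchy (the wording is hypothetical):

```html
<!-- One H1 for the page topic, H2s for its sections -->
<h1>Handmade Leather Wallets</h1>
<h2>How our wallets are made</h2>
<h2>Caring for your wallet</h2>
```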
We hope you find this list of mistakes useful and that it saves you from going down the wrong path. For more useful content, please don't forget to subscribe to the RSS feed and follow Inspirationfeed on Twitter + Facebook! If you enjoyed this article, we humbly ask you to comment and help us spread the word!