Anyone who wants their website to stay at the top of the search results on Google and other search engines for the long term should pay attention to a few important things while optimizing their website. If you follow the search engines' on-page and off-page quality guidelines and optimize your website properly, you will be rewarded with better rankings on the SERPs. On the other hand, Google doesn't like over-optimized websites and pages, and punishes such websites by pushing them down the search result rankings, which can be very daunting to recover from.
Instead of going about the optimization haphazardly, you should review and apply the following basics of on-page optimization on your website.
On-page Search Engine Optimization (SEO): common mistakes to avoid
Duplicate or near-duplicate content:
A website is considered to have duplicate or near-duplicate content when two or more different pages on the website have the same or nearly the same content. Duplicate content is poison for good rankings and very often leads to your website getting devalued or filtered out of the search engines. Even if you have used unique content on your website, a duplicate content problem can arise from a few technical aspects of how your web host handles the domain. The most common cause of duplicate content is a site that is accessible both with and without www, which leads to search engines detecting two web pages for your homepage. It can be resolved by making certain redirect changes in the .htaccess file in your hosting panel.
Follow these steps to resolve the issue: enter the following lines in the .htaccess file (replace “domain” with your site’s name; replace “html” with “php” if relevant):
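A typical redirect that forces the www version of the domain looks like the sketch below (this assumes an Apache server with mod_rewrite enabled; “domain.com” is a placeholder for your own domain):

```apacheconf
# Force the www version of the site with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

With this in place, requests to domain.com are permanently redirected to www.domain.com, so search engines see only one version of each page.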
After that, log in to your Google Webmaster Tools account, click on Settings, and set the preferred domain type, with or without “www”.
Webpages with duplicate or missing title tags:
Each page should have its own unique title. The title describes the topic of the page, and it should be less than 55 characters. Web pages with missing or duplicate titles don’t rank well on search engines, as the engines can’t decide what the page is about. To check if your website has duplicate titles, log in to your Google Webmaster Tools, click on “HTML Improvements”, and go to the duplicate meta section; there you can find out whether your website has any duplicate or missing title tags. Resolve it by creating unique and relevant title tags on your website and submitting a new sitemap.
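As a sketch, a unique and descriptive title tag might look like this (the page topic and site name are placeholders):

```html
<head>
  <!-- Unique, descriptive, and under 55 characters -->
  <title>Blue Widget Repair Guide | ExampleSite</title>
</head>
```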
Little or poor content:
A major focus of on-page search engine optimization lies in creating and publishing meaningful and relevant content on the website. High-quality content always wins on the search engines, and it has become the most important factor for better placement of a website. The content of your page should be unique, relevant, and interesting, and it should provide some value to the website’s users. Improve the page’s usability by adding interactive content like images, infographics, or videos. Further, remove junk code from your website to improve the code-to-text ratio.
Hidden Text and content:
One of the most important principles in search engine optimization is that you should not utilize any shoddy tricks to improve your search engine ranking. Hidden content is one of them: content created specifically for Google’s bots to artificially manipulate the ranking of the webpage, while remaining invisible to users. Using this trick will lead to punishment from search engines sooner or later. One of the popular tricks is using drop-down elements whose text content is parsed by search engines from the source code but is shown to users only when they click on an active element. Hiding content by giving the text the same color as the background is also considered hidden content. Never use white text on a white background.
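For illustration, both of the following are patterns that search engines treat as hidden text (hypothetical markup shown only as an example of what to avoid):

```html
<!-- White text on a white background: invisible to users, readable by bots. Do NOT do this. -->
<p style="color:#ffffff; background-color:#ffffff;">cheap widgets buy widgets best widgets</p>

<!-- Text hidden with CSS and never revealed to the user. Do NOT do this either. -->
<div style="display:none;">more stuffed keywords here</div>
```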
Image optimization: missing or duplicate alt tags:
Alternative (ALT) tags for images are the text phrases that visitors see in case an image doesn’t load. These tags tell search engines what the image is about and improve its ranking in the organic and image search results. Webmasters should put relevant terms and other information in the ALT and title attributes of images.
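A sketch of an image tag with descriptive ALT and title attributes (the file name and text are placeholders):

```html
<img src="blue-widget.jpg" alt="Blue widget with a chrome handle" title="Blue widget">
```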
Bad and non-relevant keywords:
Keywords are the lifeblood of any search engine optimization. That’s why proper keyword research is a prerequisite before starting any SEO campaign: select the keywords carefully and make sure they fit the theme of the website or the service you offer. Placing keywords in dominant positions like paragraph headers increases their value; HTML tags like bold, underline, and italics also raise their importance in the eyes of the search engines. A myth about keyword density has been going around for years; however, as confirmed by Matt Cutts, there is no ideal keyword density. The density follows from the quality of the article: if you have written quality content, the keyword density in it will be right. Overusing or underusing keywords for the search engines’ sake might produce negative results. Instead of focusing on keyword density, content writers should focus on user-friendly, informative content for the website.
Missing or non-relevant headers and headlines:
Headers provide useful information to search engines about the topic of a particular paragraph. Webmasters should create headers carefully and use proper keywords to indicate what the paragraph or section of the website is about. There are special HTML tags to denote headers (H1, H2, H3, ...), and webmasters should use these tags according to the importance of the content on the webpage. The most important heading should be tagged with H1, and it should be placed above all other headers.
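A sketch of a heading hierarchy, from most to least important (the topics are placeholders):

```html
<h1>Widget Repair Guide</h1>
<h2>Common Widget Problems</h2>
<h3>Loose Handles</h3>
```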
Missing or incorrect canonical tags and urls:
Setting up canonical URLs specifies the unique URL for the content of your website. It’s primarily used to avoid duplicate content on your website by telling search engines the preferred or representative URL of a webpage. A website can be accessed with or without www, so with a canonical URL you set the representative URL of your website. Google Webmaster Tools has a whole section on canonical URLs, their use, and how to specify them. For more details, visit: https://support.google.com/webmasters/answer/139066?hl=en
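A sketch of a canonical tag placed in a page’s head section (example.com and the page name are placeholders):

```html
<head>
  <!-- Tells search engines this is the preferred URL for the page's content -->
  <link rel="canonical" href="http://www.example.com/page.html">
</head>
```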
Neglecting proper management of webpages:
Just putting a website on the internet and then not managing and updating it on a regular basis is grossly negligent, and can lead to a drop in rankings on the search engines. You must refresh your webpages with interesting and up-to-date content regularly and maintain the technical aspects and structure of the pages. Make sure there are no duplicate titles or meta descriptions, clean up broken links, and create a good-quality 404 error page and 301 redirects. Neglecting the management of the website will increase the clutter and errors and decrease its quality.
Internal link management:
Create a uniform internal link structure and make sure to link to all pages. Google decides on the importance of a page within a website based on the internal links pointing to it and the level of the pages linking to it. Make sure to link pages that are directly related to each other, as this will help search engines decide the relevance of the content. Don’t link to irrelevant pages.
A high number of outbound, dofollow, or nofollow links:
When acquiring or earning external links or backlinks, make sure they are as relevant to your website’s niche as possible. Backlinks from untrusted and non-relevant websites can harm the link authority of your website. You should get both dofollow and nofollow links; getting only dofollow links creates an unnatural link profile and can lead to a Google penalty.
Missing XML sitemap and robots.txt files:
XML sitemaps help Google and the other major search engines understand the structure of your website and index all the pages correctly. Without a sitemap, it may happen that some pages are not found by the search engines or that new pages are indexed very late, so it’s advisable to provide an automatically generated sitemap.xml in your website’s root directory. In general, the map is automatically recognized by the search engines and taken into account. You can also submit it in the webmaster accounts of Google and Bing.
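A minimal sitemap.xml might look like the sketch below (the URLs and date are placeholders; the lastmod element is optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- Optional: date the page was last modified -->
    <lastmod>2015-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```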
The robots.txt file behaves similarly to the sitemap: it is not necessary if you are using index and noindex meta tags on your HTML webpages. If you are not, however, you should add a robots.txt file to tell search engines which pages on your website should not be indexed, for example the admin area, the cart, and other payment pages. Once you have created the robots.txt file, upload it to the root directory of your website, and list in it only the pages you don’t want the search engines to index.
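A sketch of a robots.txt that blocks an admin area and a cart (the paths and domain are placeholders):

```
# Applies to all crawlers
User-agent: *
# Keep private sections out of the index
Disallow: /admin/
Disallow: /cart/
# Optionally point crawlers at the sitemap
Sitemap: http://www.example.com/sitemap.xml
```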
I hope this post helped you with an overview of the most common mistakes in on-page SEO. In the future we will discuss in detail all the problems mentioned in this post. The next post will cover the most common mistakes and basics of off-page SEO and link building.