How To Tackle 5 Major SEO Issues Having An Adverse Effect On Your Site

March 1st, 2014

If your SEO has resorted to techniques which are now having an adverse effect on the site because of the latest algorithm updates, then you have to undo all of this with a systematic and structured approach.
The side effects of the incorrect SEO methods implemented can be rectified and undone most of the time, but a lot of patience and a step-by-step strategy are needed.


Let us discuss a few issues which may be hampering the growth of the site's search presence.

  • Unwanted Links
  • Duplicate Content
  • Scraper Sites
  • Dynamic URLs
  • Page Load Time

Unwanted Links

Make a list of inbound links to your site using a tool like Screaming Frog or any other SEO tool. Also make a list of the inbound links which Google shows in Webmaster Tools. Filter out the links which, in your judgement, are spammy.

Try to remove those links manually if possible, or send a request to the webmaster of the linking site to get the links removed. Use the Disavow tool even if you have not received a manual penalty in Webmaster Tools. Keep monitoring the external links section in Webmaster Tools; if the links you have been successful in removing no longer show up there, you are progressing. Repeat the process until all the unwanted links are removed.
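The workflow above can be sketched in a short script. This is a minimal illustration, not a real audit: the link lists and the spammy-domain blocklist are hypothetical stand-ins for your crawler export and Webmaster Tools export, and the judgement of what is spammy is still yours.

```python
# Sketch of the link-audit workflow: merge link sources, flag spammy
# links, and emit disavow-file lines for links you could not remove.
from urllib.parse import urlparse

# Hypothetical data standing in for a crawler export and a
# Webmaster Tools export.
crawl_links = [
    "http://good-blog.example.com/review",
    "http://spammy-directory.example.net/links.html",
]
gsc_links = [
    "http://spammy-directory.example.net/links.html",
    "http://casino-spam.example.org/page1",
]

# 1. Merge both sources into one de-duplicated list.
all_links = sorted(set(crawl_links) | set(gsc_links))

# 2. Flag the links you consider spammy (here: a hand-made blocklist).
spammy_domains = {"spammy-directory.example.net", "casino-spam.example.org"}
spammy = [u for u in all_links if urlparse(u).netloc in spammy_domains]

# 3. Emit disavow-file lines for the remaining bad links; the
#    "domain:" prefix is the format Google's Disavow tool accepts.
disavow_lines = sorted({"domain:" + urlparse(u).netloc for u in spammy})
print("\n".join(disavow_lines))
```

The resulting lines can be saved to a text file and uploaded through the Disavow tool, while you keep re-checking Webmaster Tools as described above.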

Duplicate Content

I came across an issue with a site which had a lot of legal case studies. Similar case studies were available on many other sites, as these details were provided by the legal registrars in PDFs which could be added to a site as-is, without modifying any content. This content was very helpful for the client, who could send links to these case studies as references when discussing similar issues and inform clients about the legal decisions taken.

The site was a genuine site, but the Panda update had affected it adversely. The action taken was to shift all these pages under a subdomain and position that subdomain as a news site. The subdomain is regularly updated with fresh content and case studies, with additional input from the client. This solved the issue of duplicate content and categorized the content logically.

But if you have an issue of duplicate content within the site itself, then using the canonical tag or a 301 redirect correctly is the only way out.
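For reference, a canonical tag is a single line in the head of the duplicate page pointing search engines at the preferred version. The URLs below are placeholders, not real pages:

```html
<!-- On the duplicate page: tell search engines which URL is the
     preferred (canonical) version. example.com is a placeholder. -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```

Alternatively, a 301 redirect is configured server-side rather than in the page itself; on an Apache server, for example, a single `Redirect 301 /old-page/ http://www.example.com/preferred-page/` line in the `.htaccess` file does the job.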

Scraper Sites

There are many sites which copy content from other sites and post it on their own blogs or sites. Though Google can usually detect a scraper site, as the original content will have an earlier indexed date, there may be times when the scraper site outranks the original site, which can affect the original site's search presence.

Google now has an option for that: the Google Scraper Report form. Webmasters can report scraper sites that have copied their content by providing Google with the source URL the content was taken from, the URL of the scraper page where the content is being republished or repurposed, and the keywords for which the scraper site is ranking.

Dynamic URLs

Dynamic URLs are generated when the content on a page depends on the selections the user makes and is displayed from a database, so the content varies from user to user. Dynamic URLs often contain characters such as ?, &, %, +, =, and $.

A dynamic URL is a page address that results from the search of a database-driven web site or the URL of a web site that runs a script. In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site’s database.
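The point can be seen by parsing a dynamic URL's query string: each parameter corresponds to one of the user's selections that drives the database query. The URL below is a hypothetical real-estate search URL, invented for illustration:

```python
# A dynamic URL carries its database query in the query string.
# The URL below is hypothetical.
from urllib.parse import urlparse, parse_qs

url = "http://www.example.com/search?area=downtown&type=apartment&budget=50000"

parsed = urlparse(url)
params = parse_qs(parsed.query)

# Each parameter maps one user selection to part of the database query.
print(params)
```

Here `parse_qs` returns `{'area': ['downtown'], 'type': ['apartment'], 'budget': ['50000']}`: the URL alone tells both the user and the search engine exactly which slice of the database the page shows.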

SEOs used to create SEO-friendly URLs for such pages which eliminated the parameters. Today this is no longer considered correct: Google recently announced that it wants the necessary parameters in the URL and can index such URLs.

For example, on the basis of the area, type, and budget I select on a real estate site, the URL generated is:

Such a URL is itself a dataset; it conveys that the page contains data in the selected range, which can be very useful for a user who makes a similar search on Google.

Hence, retain the necessary parameters in dynamic URLs and do not point them all to a single SEO-friendly URL. Google may consider that incorrect, because the URL would remain constant for every selection while the content keeps varying each time Google indexes it, which serves no purpose for either the user or the search engine.

Page Load Time

Regarding page load time and speed, Google says: "Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites."

To decrease page load time, the best way is to log in to your Webmaster Tools account, go to 'Other Resources', and select 'PageSpeed Insights'. About 80% of the issues will be tackled if you follow the suggestions on the PageSpeed Insights page.
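Two of the most common PageSpeed Insights suggestions, enabling compression and browser caching, can often be addressed with a short `.htaccess` fragment on an Apache server. This is a sketch assuming the `mod_deflate` and `mod_expires` modules are enabled; adjust the content types and cache lifetimes to your own site:

```apacheconf
# Compress text resources before sending them
# (PageSpeed suggestion: "Enable compression").
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets
# (PageSpeed suggestion: "Leverage browser caching").
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css  "access plus 1 week"
</IfModule>
```

After deploying changes like these, re-run PageSpeed Insights to confirm the corresponding suggestions have cleared.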

But at times, when all the SEO effort put in to undo the harm does not show results, the best decision may be to start afresh by killing the site. Before you take such a drastic decision, though, it is worth giving the recovery effort a fair try before saying quit.

About The Author

Bharati Ahuja is the founder of WebPro Technologies, a web solutions company based in India which focuses on building a quality web presence for businesses. She is an SEO trainer and speaker, web entrepreneur, blog writer, and Internet marketing consultant.

