Rewind To Review Your Past Voice With Your Present Say…. Is Comment Archiving A Good Idea?

I am sure all of us in the web arena have posted comments on blogs, written blog posts, or shared and had conversations on social media sites, either at a personal or a professional level. Lately, I have been in a retro mood since I started working on server logs for SEO analysis.

I think it is a good thing to rewind the tape of life sometimes, get some insight into our past actions and clear our mind to plan for the future. Hence I decided to focus on the footprints created by WebPro Technologies by way of blog commenting in the past. This is not a client report, so there are no metrics to monitor; it is a true self-analysis of what we discussed and quoted on other blogs and social media sites, and how much of it we have actually adhered to. It also gives us a chance to compare the views we held in the past with the present, changing scenario.

Post Panda and Penguin updates there are umpteen posts on natural links and link pruning, but we have always been of the opinion that the term “Link Building” itself is wrong: you do not build natural links, they get built in the process of the quality footprints you make on the sands of the web during your web journey.

One of the ways to judge the knowledge and ideology of an SEO company is to read what they have written in the past by way of comments, blog posts, social media conversations, reviews and forum discussions on various topics and issues, and to see whether they are still consistent in what they say and how those past opinions have shaped up in the real world as of today.

After all, the words written on the canvas of the web on various platforms are not mere words but content in different forms, around which the whole web world revolves. Authentic, valuable content should stand the test of time and add value to the authority factor of the author. I have been commenting on and discussing many topics related to the search industry, and I have also shared links to posts written by me when there was a strong correlation with the blog topic, never bothering whether there was a 'nofollow' attribute on the comment links or how many thumbs down I got for it, as my main purpose was to put forward an opinion which I strongly believed in.

The past content posted by a person can say a lot about the knowledge, beliefs and long-term goals of that person. It can reflect how solid that person's viewpoints are, whether they have adhered to and stood by what they said, and whether there is any coherence between what they say and what they do. This kind of check can also make people think before they post, and becomes a self-check for ensuring that quality content is published on the web.

Our business associate @wasimalrayes suggested that this could become a comment archive which can be added to the site and updated with every comment made on the web. I think it is a good idea which not only adds your web voice to your site but also acts as a self-check tool for responsible content addition to the WWW.

We all have a blog archive, so why not a comment archive too? After all, comments are also mini blog posts posted by us on the web which reflect our opinion and perspective on the relevant topic.

What do you think about a ‘Comment Archive’ section being added to the site?

We would like to share some of the past comments, blog posts and social media conversations we have had on the web regarding various topics. Since the blogosphere is brimming with posts regarding the Penguin and the Panda updates, I’ll start with Link Building:

Topic 1: Link Building
Some Blog Comments Made By Us In The Past Reflecting Our Views On Link Building:

Our Comment 

WebProTechnologies | January 3rd, 2012

All the predictions for this year are spot on. I agree to all the points predicted.

Regarding #3 I think Google might just give us a surprise this year by giving less importance to inbound links . Only the links which will come from trusted and high authority sites and editorial links will matter and will be taken into account. The major focus will be on the social media signals which will reflect the trust and the authority factor. Hence, in what context the links are being shared on social media and the discussions and reactions surrounding it will make a big impact.

Hence, stop the link building nuisance and focus on building quality content (in all forms, images, text, video, audio, etc. and share it on social media) and let the natural links get built...

 

Comment :

Bharati Ahuja

February 20, 2010 at 10:24 am

Totally agree.

In my opinion everybody has just gone too far thinking only about how to get more and more links. I am sure when the PageRank concept must have been framed, the main purpose must have been to judge the true goodwill and popularity of the website in direct proportion to the no. of inbound links it has.

But with all these ethical and unethical methods of gaining more and more links the whole purpose is defeated.

If the site is having good informative content and with ethical SEO practices it ranks high in the search engines then it automatically gets a lot of links from various sources.

As the main purpose of a genuine searcher is to search for what is available globally and locally. Once the searcher finds that it surely gets added and linked by him in various ways.

Instead all the energies and efforts should be concentrated on building the website qualitatively in various ways by adding more varied content.

Don’t run after links. Let them come to your website genuinely.

Our Comment:

WebProTechnologies | May 21st, 2010

Aptly put at the very beginning of this post that link building is a task which is detested by all.

I am of the opinion that the term 'link building' itself is an incorrect term. Links do not have to be built but they should get built naturally in the process as your website starts getting a wider web presence and preference.

As every link is like a vote to your site and goodwill of your company and that has to be earned as part of the web journey of the website.

If we focus on the quality content, have a good site internal linking architecture, have a site which is visitor friendly as well as robot friendly then getting high SERPs is not a difficult task.

Once you have high SERPs, trust me, there will be loads of directories and portals adding your site to their listings even without you knowing about it, as they too are looking for quality listings.

Once upon a time the dmoz listing was something that you always wished for once you submitted your site to dmoz, as that surely was a valuable link. I don't know if it still has that importance, but I still manually add each site to dmoz.

Apart from a good qualitative site in all respects, other genuine methods of gaining natural inbound links as your website goes from one milestone to another are as follows:

  •  Focus all your efforts on making the site informative, qualitative and content rich to get links automatically.
  •  Do not neglect the On-Page Optimization Basics and just go after links. (Very important from the SEO perspective)
  •  Participate in social media networks for discussions and sharing of information, and mention links to the relevant pages of your website. (It need not be the Home Page always)
  •  Have a social bookmarking button on your website.
  •  Make RSS feeds available on your website.
  •  Issue Press Releases periodically.

Our Comment:

WebProTechnologies | May 31st, 2011

Well, despite all the thumbs down my opinion still remains the same. Your quality content on your website and quality web presence on all the search options, blogs, discussions, social media, etc. will always be rewarded in an increasing manner in the long run by any search engine and will result in targeted inbound traffic.

We do a fairly good job on SEO and rankings without focusing on link building; instead, in the process we educate and train our clients to effectively maintain their blogs and social media accounts, and in the bargain they end up getting quality links. It has worked for us.

Our Archived Blog Posts On Link Building:

http://www.webpro.in/what-is-all-this-link-building-about-why-are-all-the-webmasters-and-seos-falling-in-link-love/

Topic 2: Social And Search Integration
Blog Article / Social Media Post

Our Comment:

Yes, initially AltaVista was THE SEARCH ENGINE, and keyword spam was something that Google had to work on to improve the quality of search results, for which they came up with the PageRank technology to add value and quality to search results.

But as every coin has two sides, this innovation also gave birth to link building spam, and despite the improvement in the search results which established Google as the topmost search engine, it polluted the web with unnecessary content clutter.

But as people kept flocking to the social media sites, the search engines thought of using public opinion as the criterion for quality and word of mouth. How well the search engines will integrate the social media signals, only time will tell.

But it is for sure that this will ensure more genuineness, as you cannot manipulate public opinion. SEO is what you say about your company; social media is what others say about your company. When both these messages are in sync, credibility is established. Hence authority, credibility, WOM and an overall presence are the demand of the day for true SEO, which in the long run will ensure natural and quality inbound links on its own.

So first work on content, establish an identity, authority and online credibility, and then the links will follow. If we go to see, that was the main goal of the PageRank technology: to check how many people vouch for a certain page's content, but with link spam it got negated. Now, with social media signals and the focus on quality content via the Panda Update, this will surely be taken care of to a great extent.

The best way to achieve a great online presence will be to have an equally great offline and real-time business presence: http://blog.webpro.in/2011/10/best-way-to-assure-quality-content-is.html

I will not be surprised if in the coming year the blogosphere gets bombarded with blogpost meteors on "THE DEATH OF THE SPAMMY LINK BUILDING INDUSTRY" instead of SEO being dead.

Blog Article / Social Media Post

Our Comment:

Bharati Ahuja 11th January 2012

I think the blending of social results in search is not only the inevitable evolution of search but also a reflection of what took place when civilizations evolved. We can just say that the stone age of search is over, and now search even has the ability to reflect what people in your community are talking about and recommending. It is basic human nature to search for a want, then discuss with peers to get their opinions, and then take a decision. We have been doing this for ages; now we just have to adapt ourselves to the virtual world for this kind of action.

To a certain extent I believe that if Google wants to improve the quality of search results and combat the spam on the web, then yes, it is highly essential that the search engine can access data from a resource it has full control over. But from the search engine perspective, only time will tell how well Google succeeds in integrating the social signals from other social media sites from all over the web; else, with the kind of hold Google has over the search market, it is going to be Google, Google all the way…

But it is surely not the end of SEO. In fact, all these changes are taking SEO to a more qualitative level.

Archived Blog Posts On Our Blog:

http://blog.webpro.in/2011/02/integration-of-social-and-search.html

http://blog.webpro.in/2010/06/search-seo-and-social-media-integration.html

Topic 3: “Not Provided” Keyword Data


Our Guest Post On The Topic:

http://www.seocopywriting.com/content-marketing/why-googles-recent-changes-mean-good-news-for-the-seo-industry/

 Archived Blog Posts On Our Blog: 

http://blog.webpro.in/2011/11/search-queries-googles-encrypted-not.html

Blog Article / Social Media Post


Bharati Ahuja | Jan 9, 2012

I think this piece is a great summary of how Google has been offering support to SEOs right from the start, but it can do much more as it now has all the data, in fact the data about social signals too. The awareness of SEO has also improved over a period of time, and if Google at this stage continues to share more and more information, it will become increasingly difficult for Google to maintain and improve the quality of search results. We saw that by 2010 the content and link spam had grown to a great extent, for which Google had to come up with the Panda Update.

IMHO especially with regard to Keyword Referrer Data:

2011 was a year of changes and I think it is a period of transition to a better web and better search results as SEO is much beyond keywords and rankings.

When businesses are at a loss for the complete keyword data, the focus shifts to the search queries in WMT which have a good CTR, which is a true measure of quality over quantity.

This restriction makes the website owner think from a larger perspective and focus on the correlation of content and keywords rather than rankings. It will take SEO campaigns above the metrics of keywords and rankings, and the focus will be on other quality metrics like CTR, conversions, bounce rate, etc., which will improve the quality of the web overall, as websites, besides being rich in content, will have to focus on good landing pages, a proper call to action, page load speed and good navigation, which will ensure a better UX.

This lack of data will also draw the line of distinction between a PPC campaign and an SEO campaign. The quality metrics will be the CR and the CTR, which again will make the client focus on content and landing page design, another quality step towards a better web world: rather than discussing keywords, the client will be open to discussing content and design.

I have also shared my views on http://blog.webpro.in/2011/11/search-queries-googles-encrypted-not.html

How To Use Server Log Files For SEO

Server Logs (Photo credit: novas0x2a)

Data, Data From Everywhere On The Server And Not A Byte To Benefit From....

All those who have been in the web solutions business since the early 2000s know that prior to Google Analytics, the most trusted analytics data was the log files on the server. In fact, those log files are still the most accurate and raw data available for the actual activity taking place on the server.

Server logs are created automatically, recording the activity on the server, and they are saved as a log file on the server itself. Usually the log file is saved as a standardized text file, but this may vary depending on the server. Log files can be a handy tool for webmasters, SEOs and administrators. They record each activity on the server and offer details about what happened, when, and from where on the server related to that domain. This information can record faults and help diagnose them. It can identify security breaches and other computer misuse. It can be used for auditing and accounting purposes too.

A plain text format minimizes dependency and assists logging at all phases. There are many ways to structure this data for analysis; for example, storing it in a relational database would force the data into a query-able format. However, it would also make it more difficult to retrieve if the computer crashed, and logging would not be available unless the database was available. The W3C maintains a standard format for web server log files, but other proprietary formats exist. Different servers have different log formats; nevertheless, the information available is very much the same. For example, the fields available are as follows (they may not necessarily be recorded in the same order on all servers):

  •  IP address
  •  Remote log name
  •  Authenticated user name: only available when accessing content which is password protected by the web server's authentication system.
  •  Timestamp
  •  Access request, e.g. "GET / HTTP/1.1": the request made. In this case it was a "GET" request (i.e. "show me the page") for the file "/" (the homepage) using the "HTTP/1.1" protocol. Detailed information about the HTTP protocol is available at http://en.wikipedia.org/wiki/HTTP.
  •  Result status code, e.g. "200": tells you whether the request was successful or not; "200" is success. For a list of possible codes, visit http://en.wikipedia.org/wiki/List_of_HTTP_status_codes.
  •  Bytes transferred, e.g. "10801": the number of bytes transferred to the user, i.e. the bandwidth used. In this case the home page file is 10801 bytes, or about 10K.
  •  Referrer URL
  •  User Agent

Following is the example of the data which was exported to Excel from the log file:

Example 1:

180.76.6.233 - - [29/Apr/2012:05:04:56 +0100] "GET /blog/microsoft-windows-vista-ultimate-with-sp2-64bit-oem/ HTTP/1.1" 404 39621 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"

Example 2:

On some servers the fields will be mentioned in the log file before recording the data as follows and then the corresponding data for that date will be displayed:

#Fields Per Record: date time cs-method cs-uri-stem cs-username c-ip cs-version cs(User-Agent) cs(Referer) sc-status sc-bytes

Data Per Record: 2012-05-01 01:19:17 GET /seo-web-design.htm - 207.46.204.233 HTTP/1.1 Mozilla/5.0+(compatible;+bingbot/2.0;++http://www.bing.com/bingbot.htm) - 200 12288

Well, it is not as geeky as it looks; in fact it is very simple. The data from the log files can be retrieved easily by importing the text data into Excel, or by using standard software available for extracting data from log files, like WebLog Expert; a sample report it generates can be viewed at http://www.weblogexpert.com/sample/index.htm
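If you prefer to script it, the parsing itself is straightforward. The following is a minimal Python sketch (not from the original post) that splits a line in the combined log format shown in Example 1 into named fields; the field names and the sample line are taken from the examples above.

```python
import re

# Regex for the Apache/NCSA combined log format shown in Example 1 above.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of named fields for one log line, or None if it does not match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

if __name__ == "__main__":
    sample = ('180.76.6.233 - - [29/Apr/2012:05:04:56 +0100] '
              '"GET /blog/microsoft-windows-vista-ultimate-with-sp2-64bit-oem/ HTTP/1.1" '
              '404 39621 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; '
              '+http://www.baidu.com/search/spider.html)"')
    fields = parse_line(sample)
    print(fields["ip"], fields["status"], fields["user_agent"])
```

Once each line is broken into fields like this, filtering or exporting to a spreadsheet becomes a one-liner per field.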

The analysis of the log files can offer some great insights about the traffic on the server; many times spam on the server or a hacking attack can be detected early, and the harm to the sites can be reduced to a great extent as corrective action can be taken immediately. This can be a real boon to every SEO, as this data is reflected in WMT much later.

The data can be filtered as per the fields which need to be tracked. For example, you can see in the image below how the WebLog Expert software shows the data graphically and numerically for the filtering that we did to trace the Google, Bing and Baidu bot activity on a particular domain.

Keeping track of this data can give us information related to the crawling of the bots, downloads, spam attacks, etc. Of course, it is all raw data, and data in itself is meaningless; how you correlate it and connect the dots to come to correct conclusions and take the right decisions is what makes the difference.
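The same kind of crawler filtering described above can also be scripted. Below is a small sketch along those lines, assuming the log is in the combined format shown earlier and saved as access.log (a placeholder path); it simply counts requests whose user-agent string contains the Googlebot, Bingbot or Baiduspider marker.

```python
from collections import Counter

# Substrings that identify the crawlers we want to trace, similar to the
# Google/Bing/Baidu filtering done in WebLog Expert above.
BOT_MARKERS = {"Googlebot": "Googlebot", "Bingbot": "bingbot", "Baiduspider": "Baiduspider"}

def count_bot_hits(log_path):
    """Count requests per crawler by scanning each raw log line for its marker."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            for name, marker in BOT_MARKERS.items():
                if marker in line:
                    counts[name] += 1
    return counts

if __name__ == "__main__":
    # "access.log" is only a placeholder; point this at your server's log file.
    for bot, hits in count_bot_hits("access.log").items():
        print(f"{bot}: {hits} requests")
```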

For me it is a déjà vu feeling, as when we did not have Google Analytics the server log files and Webalizer were the only resource. Sometimes going retro is the coolest thing to do, because some trends which seem to be new are actually very old.

Social Media Sharing Policy To Put Your Best Foot Forward For Search

Since social media, especially Twitter, Facebook and Google+, is now integrated in a big way in the search engines and has an offshoot benefit on rankings and search presence, it's high time that there is a policy for what should be shared via the company and personal social media accounts.

Common examples of social media include blogs; social networks like Facebook and LinkedIn; micro-blogging sites like Twitter; and media-sharing sites like YouTube and Flickr.

Social media signals can leverage the search presence of a web entity and add to its digital assets for web presence, but at the same time they have an equal and opposite potential of becoming a liability and affecting the search presence adversely.

Image credit: http://goo.gl/iAUDT

Google continuously works on improving the quality of search results, and combating spam on the web is the primary task of the webspam team at Google headed by @mattcutts. Since last year Google has been working continuously on content spam, clearing the web of content clutter through the series of Panda Updates, and has also been working on eradicating the link spam which gives wrong signals to search and manipulates search results.

In real life you are what your thoughts are, online you are what you share and publish.

Similarly, social signals in the long run have the same potential to pollute the web by spam if the companies do not follow a code of conduct for social sharing or do not formulate social media sharing policies.

Many times this task of optimization and social media management is outsourced to a third party, so it is assumed that the third party is responsible for all the actions; but after all, it is the reputation of the company which is at stake. Hence, one can delegate the task but not the entire responsibility and answerability.

As the search and social signals integration is still at an initial stage, and the search engines are themselves exploring and experimenting with the search and social data, it is the right time for companies to have a corporate social media policy which should be followed to avoid spam and maintain a good online reputation.

In online social networks the lines between public and private, personal and professional are blurred, so the need for clarity of identity is vital. It is essential that when one shares on behalf of the company on social media, he or she clearly states how they are connected with the company and owns the content that they share.

IMO, sooner or later Google will see through the social media spam too and start penalizing the accounts which may not be genuine or may be sharing just to boost a certain listing.

Hence, a social media policy will help the company to avoid such disastrous outcomes.

Some of the points which the policy should include to avoid spammy behavior resulting in bad online reputation are:

  •  Follow authority accounts and accounts which have quality posts related to your industry.
  •  Identify the purpose of the social media presence. Read more about this on:
    http://blog.webpro.in/2011/07/setting-purpose-of-your-social-media.html
  •  Define your audience.
  •  Add value with each post and tweet.
  •  Be consistent.
  •  Share only if you like the content and want others to know about it, not just because you know that person.
  •  Do not keep sharing posts from the same few accounts only. Have a varied mix and see that you connect with more and more relevant accounts over a period of time.
  •  The posts should gradually add to the trust and the authority factor of the company’s online presence.
  •  Quantity does not matter, so do not buy likes or +1s; they will end up adversely affecting the social scene just like paid links.
  •  Let the friends, followers and circles increase naturally due to quality postings.
  •  Build the brand along with the online persona of the people who share on social media.
  •  Do not be boisterous by only saying what you want to say; listen, reply, engage.
  •  Accept negative comments with grace and clarify if necessary.
  •  Build blog communities and discuss the posts from all perspectives. Do not end up just creating mutual admiration groups, which in the long run deprive you of valuable UGC.
  •  Do not take anything personally.
  •  Just as you keep your personal and professional lives separate in the real world, following the same thumb rule will surely be the best way to balance both worlds.

All these points are applicable if you want the social presence to have a direct and positive correlation to your search presence. For personal accounts these may not matter, as the rules for personal and professional personas may be different.

This infographic from Seo Smarty provides a handy one-page resource as to how each of the major social networks defines spam, what sort of spam you can expect and their position therein.

social media spam infographic


Is Your Website Markup Enriched To Support The Rich Snippets Updated By Google?

The latest buzz in the SEO world has the surround sound on the topic of “Semantic Web and Semantic Search”. This is the next step of the ever-evolving SEO, and it brings in an era where SEOs have to give importance to microformats and schemas, go beyond word-to-word mapping, focus on the meaning of the content of the website and try to correlate it with the meaning of the query being searched on the search engine.

The search engines have been working on semantic search since the advent of web 2.0. Google also has been supporting and promoting content by displaying the Rich Snippets in the search results.

Some of the types of content given priority by Google for Rich Snippets display in SERPs are:

Reviews
People
Businesses and organizations
Recipes
Events
Video

Yesterday, Google announced Product Rich Snippets too. The Rich Snippets Testing Tool, which verifies structured data markup, has been upgraded to support HTML input as well.

Rich snippets help you to: 

  • Attract potential buyers while they are searching for items to buy on Google.
  • Submit your product listings for free.
  • Control your product information. You can maintain the accuracy and freshness of your product information, so your customers find the relevant, current items they're looking for.

If you're a merchant, you can give Google detailed product information we can use to display rich snippets (for example, price, availability, and review ratings) right on our search results pages. (Right now, shopping rich snippets appear on search results pages only in the US—but we're working hard to make them available everywhere.)

Rich Snippets: Product Search

Product markup allows you to provide Google with specific information about the products on your site. Information about price, availability, reviews and more may be used in search results to help users more quickly and accurately identify relevant content. Visit http://www.schema.org/Product for more details.

More details on the Product Rich Snippets on http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146750

A Product can include an Offer or Offer-aggregate; or an Offer or Offer-aggregate can include one or more Products. Use the structure that works best for your content.

A Product snippet should have details about the following (a short sketch after this list shows how such markup reads as data):

  • Name Of The Product
  • The Image Of The Product
  • The Brand Of The Product
  • The Category Of The Product
  • Review About The Product
  • Identifier Which Includes Brand and Product Identification
  • Offer Details
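To make the idea of the markup "reading as data" concrete, here is a small illustrative Python sketch (not from the original post). The sample HTML inside it is made up, loosely following schema.org/Product, and the script simply pulls out the itemprop values that correspond to the fields listed above.

```python
from html.parser import HTMLParser

# Illustrative markup only, loosely following schema.org/Product;
# it is not taken from any real page.
SAMPLE = """
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Kettle</span>
  <img itemprop="image" src="kettle.jpg" alt="Example Kettle">
  <span itemprop="brand">ExampleBrand</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <span itemprop="priceCurrency">USD</span>
  </div>
</div>
"""

class ItempropCollector(HTMLParser):
    """Collect (itemprop, value) pairs while walking the markup."""
    def __init__(self):
        super().__init__()
        self._pending = None      # itemprop waiting for its text content
        self.properties = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        prop = attrs.get("itemprop")
        if prop is None:
            return
        if tag == "img":          # images carry their value in the src attribute
            self.properties.append((prop, attrs.get("src", "")))
        else:
            self._pending = prop  # the value arrives as the next text node

    def handle_data(self, data):
        if self._pending and data.strip():
            self.properties.append((self._pending, data.strip()))
            self._pending = None

parser = ItempropCollector()
parser.feed(SAMPLE)
for prop, value in parser.properties:
    print(f"{prop}: {value}")
```

This is roughly what a search engine does at a much larger scale: the page stops being a blob of text and becomes a set of labeled properties it can compare, display and rank.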

Rich snippets add logical meaning to the data and support semantic search, but just because the content on the page is expressed in markup which supports rich snippets and has been verified by the snippets tool, it is not certain to rank in the SERPs unless it is inter-linked with various quality signals from the social and the meta web.

Hence SEOs, apart from the other on-page factors, now have to ensure that at least the content related to products, reviews, people, recipes, addresses (using hCard), videos, businesses and organizations is represented in the correct markup, which has the potential to generate a rich snippet in SERPs and search visibility on various search options.

Forget The Keywords And Focus On The Keyness Factor Of The Content

Yes, we all want our websites in the numero uno position on Google, where people can find us easily and reach us. All internet marketing efforts focus on reaching that position on search, social and all over the web, and on being found wherever people search for the relevant products and services.

We keep on saying that an overall presence is necessary to achieve good visibility on the search engines but let us understand why this is necessary by understanding the meaning of each layer of the evolutionary semantic web.

Everyone wants the magical formula to achieve the above-said target. Is there any secret formula? No, there is no single or instant formula to achieve that. The fact is that this kind of presence, and magically being found everywhere on the digital front, is the result of logically building the presence step by step.

Some of the quality factors influencing search presence are:

  • Content
  • Connectivity
  • Relevance
  • Correlation
  • Authority
  • Security
  • Trust

The web is shifting from the web of documents to the web of data. Hence, in such a web the majority of the content is in RDF or in databases.

Websemantic (Photo credit: Wikipedia)

As you can see in the image above, we have URI and Unicode as the basis of the web. I would like to call it the first step, which adds the keyness to the webpage and to the content of that page that you want to promote organically on search engines or on the web.

The second and the third layers have the XML and RDF content. The content that we mostly have in this format is the RSS feeds and sitemaps, which are in XML, while contact details, videos, reviews, events, etc. are now being represented more and more in the RDF format. RDF has an XML syntax.

  •  Every Description element describes a resource.
  •  Every attribute or nested element inside a Description is a property of that resource.
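As a quick illustration of those two rules, here is a small Python sketch (not from the original post). The RDF/XML snippet in it is made up for the example, using the standard RDF and Dublin Core namespaces; the script prints each Description's resource and the properties nested inside it.

```python
import xml.etree.ElementTree as ET

# Made-up RDF/XML for illustration; rdf: and dc: are the standard RDF and
# Dublin Core namespaces, and the described resource is just an example URL.
RDF_SAMPLE = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="http://blog.webpro.in/2011/02/integration-of-social-and-search.html">
    <dc:title>Integration Of Social And Search</dc:title>
    <dc:creator>WebPro Technologies</dc:creator>
  </rdf:Description>
</rdf:RDF>
"""

RDF_NS = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"

root = ET.fromstring(RDF_SAMPLE)
for description in root.findall(f"{RDF_NS}Description"):
    # Each Description element describes one resource...
    print("Resource:", description.get(f"{RDF_NS}about"))
    # ...and each nested element is a property of that resource.
    for prop in description:
        print("  Property:", prop.tag, "=", prop.text)
```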

The URLs, titles and descriptions are of the utmost importance in XML files like RSS feeds and sitemaps. If we focus on the keyness factor and try to correlate with a wide range of keyword data, rather than focusing on the keywords alone in the URLs, titles and descriptions, this gets passed on to the next layer of the emerging web, making the pages carry more correlation and relevance, and it also syncs the on-page signals with the off-page signals which get directly indexed in the search engines in the form of feeds and sitemaps. Titles are not merely summaries, but are interpretative guidelines.

Now, from URI, XML and RDF we move to the ontology layer. The adoption of standards such as XML or RDF allows interoperability in principle, but does not solve the problem of the production, identification and evolution of knowledge. Ontologies reduce language to a nomenclature which describes neither textual structures nor the considerable variation of genres.

In general, an ontology provides a mechanism to capture information about the objects and the relationships that hold between them in some domain of interest.

According to Wikipedia: In computer science and information science, an ontology formally represents knowledge as a set of concepts within a domain, and the relationships between those concepts. It can be used to reason about the entities within that domain and may be used to describe the domain. An ontology renders shared vocabulary and taxonomy which models a domain with the definition of objects and/or concepts and their properties and relations. Ontologies are the structural frameworks for organizing information and are used in artificial intelligence and the Semantic Web

Ontologies help take the direction of the web from information to knowledge. Ontologies are considered as meta language descriptions of documents, conditioning access to them. Metadata are to be found in the document header, while the data themselves form the body of the text, or more precisely the intratext. From the “Web of Documents”, the transition is thus made to the “Web of Data”, and then even to the “Web of Metadata”: this is the conception currently in force with the Semantic Web.

The URI layer adds text, the RDF layer adds data, and the ontology layer adds the relevance, correlation, logic and connections, but the security and encryption pillar is the backbone of such an evolutionary web. If your website has a presence on all these layers, or is built keeping in mind the outcome of each layer, then your site has the potential of having a quality web presence and good visibility, and of gaining the trust factor, which is like the icing on the cake and can be achieved over a period of time, after your web presence has passed the quality test of each layer.

Search is moving from word-to-word mapping to content relevance and correlation. It's high time we moved on from quantitative metrics to qualitative metrics. This can be done by moving our focus from the keywords to the keyness of the content.

For that we have to focus on the following metrics, rather than rankings alone, to judge if our web presence is passing the test of each layer:

  •  Content Keywords – which give an idea of keyword variance in the index and relevance.
  •  Search Queries – which give an idea about correlation.
  •  No. Of Impressions – which give an idea about how well the search engine is able to combine and sync the on-page signals with the off-page signals.
  •  Landing Pages – which give us an idea about how well each URL is getting indexed along with the proper relevance and correlation factor.
  •  Click Thru Rate – which indicates the success factor of the on-page titles and descriptions of the landing page for that relevant search query (a small calculation sketch follows this list).
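For illustration only, a tiny sketch of the CTR arithmetic behind that last metric; the query names and numbers below are invented, standing in for the clicks and impressions you would pull from the Search Queries report in WMT.

```python
# Invented numbers; in practice these come from the Search Queries report
# (impressions and clicks per query) in Webmaster Tools.
queries = [
    {"query": "seo services", "impressions": 1200, "clicks": 96},
    {"query": "server log analysis for seo", "impressions": 450, "clicks": 9},
]

for row in queries:
    ctr = row["clicks"] / row["impressions"] * 100  # click-through rate as a percentage
    print(f'{row["query"]}: CTR {ctr:.1f}%')
```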

Also, increasing visibility in more and more search options, rich snippets being displayed with the landing page, decreasing crawl errors and an increasing number of pages being indexed from the XML sitemaps are all quality indicators.

As SEOs, let us focus on these quality factors, carve a special niche for SEO and establish the true meaning of SEO, rather than ranting about the “Not Provided” data, suggesting a new term for SEO or starting the age-old story about how SEO is dying.

SEO is very much alive and has a quality dimension to it like never before. Thanks to Google for all the changes made in the algorithms, for adding the encryption layer, and for giving more importance to rich snippets and social signals, thereby proving its constant endeavor to make search more qualitative.

Content + Connectivity = Branding. Is Your Content Contagious Enough To Create Or Promote Your Brand?

The digital world has changed the way a brand image is created in the minds of the people. People today are creating an image about your company by going through the company website, reading the buzz on social media and reading what you post on your blog and the company social media profiles.

Many times, by the time your potential customer reaches your online store or real-time shop to buy the product, he already has a lot of information about you, your company, products and services. This information comes not only from the content that you post on the web but also from the content which others post about your company, products and services.

Armed with so much information and a platform to discuss the brand openly, the customer is more confident about taking a decision, but at the same time has very high expectations, depending on the impression of the brand he has created in his mind.

In such a situation it is very important that the sales staff or the online buying procedure on the site clarifies each and every expectation realistically.

So, basically it is the content which is creating the brand: content in all forms, images, text, videos, audio, news, PR releases, etc., and also the content on social media which gets linked because of the connectivity, and all this leads to a brand image.

Content + Connectivity = Brand 

If your content is directly related to the image of the brand, then having a content strategy is a must. Hence, it is not only about adding blog posts, videos and images to your website or going on a posting spree on the social media sites.

Having a content strategy is the first step. The strategy has to come before the technology you select to host and promote it.

A Content Strategy should define:

  •  The audience for whom the content is created.
  •  What type of content is right for which type of message.
  •  It has to be produced on a regular basis, with the quality improving over time.
  •  It should preferably be in a continuous, engaging, interactive story form.
  •  It should encourage and make people add to the existing content via comments, social media posts and discussions.

The ultimate goal of all this content creation should be the conversion of this digital content into a digital asset. As the acquisition and maintenance of any asset requires investment, content creation and promotion also require strategic investment and planning.

Hence, investing in the right team to create, correlate and promote content is one of the important online marketing decisions.

The following video puts forward this idea very creatively.

Coca Cola Content 2020 : We will move from creative excellence to content excellence. The purpose of content excellence is to create ideas so contagious that they cannot be controlled….

Is your content contagious enough to create or promote your brand?

How The Semantic Web, HTML5, Microformats And SEO Are Inter-Linked

The only constant (K) on the World Wide Web and in the SEO world is change. SEO and the web are constantly evolving. Google's constant endeavor to improve the quality of search results is a known fact. Google introduced the PageRank technology to deal with the keyword spam that was prevalent around 2000, but this innovation brought link building spam with it as an offshoot.

It also recently came up with the Panda Update to control content spam and improve the quality of search results further. But this focus on content has already started an exodus of content in all forms onto the web. How Google is going to separate the grain from the chaff is going to be the next big challenge.

The following 4 quadrants explain the information and social connectivity very effectively.

Photo credit: http://www.novaspivack.com/

Created by Sebastian Faubel with Inkscape Vector Illustrator. (Photo credit: Wikipedia)

The next big evolution is semantic search, a much wider and deeper concept based on the science of semantics, which has the potential to improve search accuracy by understanding searcher intent and the contextual meaning of terms as they appear in the web ecosystem, or within a closed system like the search engine index, to generate more relevant results.

The Semantic Web is a set of technologies which are designed to enable a particular vision for the future of the Web – a future in which all knowledge exists on the Web in a format that software applications can understand and reason about. By making knowledge more accessible to software, software will essentially become able to understand knowledge, think about knowledge, and create new knowledge. In other words, software will be able to be more intelligent – not as intelligent as humans perhaps, but more intelligent than say, your word processor is today.

Semantics adds extra information to help you with the meaning of the information.

The Semantic Web does not only exist on Web pages; as discussed below, it works inside applications and databases as well.

Data is the foundation on which such a web and search world can exist. Data in itself is meaningless, but when data gets linked because of its relationships with the various data sets available on the web, it becomes useful and meaningful and serves the purpose for which it was being searched, as data becomes contextual due to the inter-connected relationships. The more it is connected, the more powerful it becomes and the more related information it gives. Google Plus has that secret unlocked potential of correlating, connecting and linking all the data related to a profile and then integrating it with search, with its data about people, places and pages.

Web 1.0 was about static pages linked together.

Web 2.0 can be defined as the emergence of the web since the last boom of the .coms (new web-based sites, applications and technology, e.g. AdSense, Wikipedia, blogging), and the dynamic websites designed and programmed in terms of user interaction and web-based interaction.

Web 2.0 Meme Map

The classic example of the Web 2.0 era is the “mash-up” — for example, connecting a rental-housing Web site with Google Maps to create a new, more useful service that automatically shows the location of each rental listing.

Web 3.0 works inside of applications and databases, not just on Web pages. Calling it a "Web" is a misnomer of sorts — it's not just about the Web, it's about all information, data and applications.

For such a semantic web, HTML5, which is a collection of features, technologies and APIs, brings the power of the desktop and the vibrancy of the multimedia experience to the web and amplifies the web's core strengths of user interaction and connectivity.

HTML5 includes the fifth revision of the HTML markup language, CSS3, and a series of JavaScript APIs. Together, these technologies enable you to create complex applications that previously could be created only for desktop platforms.

This is where Microdata comes in and it’s going to fundamentally change the way we discover and consume content on the web.

Microdata is a component of HTML5 aimed at adding more semantics and contextual information to existing content on a page. By doing so, Microdata provides others, like search engines or browsers, with more information about the contents of a page.

Microdata is an attempt to provide a simpler way of annotating HTML elements with machine readable tags than the similar approaches of using RDFa and Microformats.

Microdata vocabularies provide the semantics, or meaning, of an Item. Web developers can design a custom vocabulary or use vocabularies available on the web. A collection of commonly used (and Google-supported) Microdata vocabularies is located at http://data-vocabulary.org, which includes: Person, Event, Organization, Product, Review, Review-aggregate, Breadcrumb, Offer, Offer-aggregate. Other markup vocabularies are provided by Schema.org schemas. Major search engines rely on this markup to improve search results. Content expressed as microdata on the web page gets correlated easily to the data vocabulary it is giving information about, making it easy for the search engine to find relevance and connectivity.

Google has stated they only support a handful of these Microdata types which include: reviews, people, products, businesses and organizations, recipes, events, music, and video content. If your website has any of these types of content, you’re eligible for a Microdata implementation.

In one of our previous posts we have explained how to use the hCard microformat (http://blog.webpro.in/2010/05/what-is-hcard-integration.html), and it can be seen clearly that if you express the contact address on the web page in the hCard format, it becomes data with specified fields for street address, city, country, etc., which makes the content more correlated and meaningful.

The semantic web adds more meaning to the query for a search rather than just mapping words during the search process.

As an SEO I think it is high time we started incorporating microformats wherever possible. Some of the content which can be represented as microformats or as data is as follows:

Reviews

People

Products

Businesses and organizations

Recipes

Events

Video

To check your markup, use the rich snippets testing tool.

Google rich Snippets Tool

 

More details about microformats on http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146897

Panda or no Panda, content has been king since the first web page was published on the web. It is Google which has become capable of picking quality content in 2012 and hence wants to reward sites having quality content with more visibility. But just adding content is not enough; what has to be focused on is how well the content is inter-connected to establish relevance.

More than the number of inbound links, it is important how well your web page gets inter-linked on the web as a whole. Sharing the links on social media and adding websites to Google Webmaster Tools and Bing Webmaster Tools help us achieve that purpose to a great extent, but representing content in the form of data, which ensures relationships and adds context, will be the main task for an SEO in future.

The on-page SEO factors (used optimally, avoiding over-optimization) + the inter-linked connectivity of the aspects mentioned in the 4 quadrants above will determine the success of SEO campaigns in future.

 

My Views On The WSJ Post “Google Gives Search A Refresh” – Semantic Search, Google And Bing

Lately, the Wall Street Journal post has been dominating the social scene among search marketers and anyone interested in a search presence, or rather a presence on Google.

Search has been the Google domain ever since they started the company. Since then Google has been working on the quality of the overall search experience from the search engine and the user perspective.

First and foremost, let me mention that as a search marketer, or rather an SEO (as there is a lot of confusion nowadays related to this term, I had better be very clear: yes, I am an SEO and I am proud of it), I am very happy about all the changes that Google has made to its algorithm in 2011, and the updates seem to be continuing at a good pace in 2012 too.

I think all these changes are drawing a clear, distinct line between organic search campaigns and paid campaigns. All of these developments will make SEOs and website owners think beyond rankings and keywords, because the true meaning of SEO is to ensure quality search engine presence on maximum search options by focusing on overall quality web presence — enhancing the quality aspects of the website and reaching out to netizens via various modes of social media.

Finally, the search industry has matured and is qualitatively marching ahead. But this has not happened overnight; the search engines (Google and Bing) have been constantly working to improve the quality and display of search results by giving more and more search options over a period of time, with the study of user behavior at the base of all the decisions, as user behavior is also constantly evolving over the years.

The WSJ Post says:

Amit Singhal, a top Google search executive, said in a recent interview that the search engine will better match search queries with a database containing hundreds of millions of "entities"—people, places and things—which the company has quietly amassed in the past two years. Semantic search can help associate different words with one another, such as a company (Google) with its founders (Larry Page and Sergey Brin).

I don't know which interview they are referring to, as there is no link to the interview. But Amit Singhal has been discussing and announcing updates to the Google algorithm for two years, all excited like a kid in a candy store, and Google has been working on Semantic Search for quite some time now; the launch of SPYW is the first step towards it, which just adds the social dimension to the relevance factor. Semantic search is a much wider and deeper concept based on the science of semantics, which has the potential to improve search accuracy by understanding searcher intent and the contextual meaning of terms as they appear in the web ecosystem, or within a closed system like the search engine index, to generate more relevant results.

Currently Google focuses on the keyword-search system. The WSJ article clearly mentions that “Google isn't replacing its current keyword-search system, which determines the importance of a website based on the words it contains, how often other sites link to it, and dozens of other measures.”

But it is also mentioned in the post that “Google is aiming to provide more relevant results by incorporating technology called "semantic search," which refers to the process of understanding the actual meaning of words.”

Of course, the real impact and implementation of this kind of update on SERPs can be judged only when Google blogs about it on its official blog and starts rolling out such semantic search results.

According to me, if your site has a lot of relevant content, the content is kept fresh by updating it with the latest happenings and developments, and the links are widely shared on social media with a good amplification factor, along with the possibility and potential of that content being inter-linked on the web, then one need not worry about the SERPs or the search traffic for that site being affected adversely.

If at all Google starts rolling out semantic search in the near future, the relevance to the logic of the intent, the keyword mapping in the query and the freshness of the content will all matter collectively. The search engines have been working on semantic search since the advent of web 2.0, and the following PPT, which was created for the first SEO training conducted by me in 2007, where I spoke about web 3.0 and Semantic Search, has an example.

(Slides 90 to 93) explain an example of Semantic Search which seems to be becoming a reality in 2012. The slides explain Semantic Search as follows:

  •  The Semantic Web is a set of technologies which are designed to enable a particular vision for the future of the Web – a future in which all knowledge exists on the Web in a format that software applications can understand and reason about. By making knowledge more accessible to software, software will essentially become able to understand knowledge, think about knowledge, and create new knowledge. In other words, software will be able to be more intelligent – not as intelligent as humans perhaps, but more intelligent than say, your word processor is today.

  •  The classic example of the Web 2.0 era is the “mash-up” — for example, connecting a rental-housing Web site with Google Maps to create a new, more useful service that automatically shows the location of each rental listing.

  •  In contrast, the challenge for developers of the semantic Web is to build a system that can give a reasonable and complete response to a simple question like: “I’m looking for a warm place to vacation and I have a budget of $6,000. Oh, and I have an 11-year-old child.”

  •  The Semantic Web does not only exist on Web pages. Web 3.0 works inside of applications and databases, not just on Web pages. Calling it a "Web" is a misnomer of sorts — it's not just about the Web, it's about all information, data and applications.

Bing too has been working on semantic search for a long time. The following 2009 video from Webpronews.com says it all, where Javed Panjwani, Business Development Executive at Wolfram Alpha, explained that a traditional search engine creates a web of documents and essentially generalizes what a user is looking for, whereas a computation engine personalizes the search results, the reference being to semantic search results.

As an SEO I feel more than happy and look forward to all the coming updates (if any in the near future, hopefully), because basing search on the relevance of content, social mentions, outreach and amplification, and the logic of the query, rather than only word-by-word mapping of the query, adds quality to the whole search process.

With this kind of basis for search, SEO will gain more meaning and the SEO industry will gain more respect and authority, as the prime edifice on which SEO in this kind of search model will be based is content, intent, the social graph and the trust factor rather than the link graph.

Matt Cutts has made it very clear in all his SEO videos that SEO is all about putting your best foot forward and that SEO is not spam. One of our past guest posts also points out the sentiment behind this: that SEO is not magic but pure logic and hard work.

I have not had a chance to see the SPYW results as they have not yet been rolled out universally, but going by my experience and what I have read on the search blogs, I think SEOs should not rant about it but indulge in research and rejoice at the fact that this is the true growth of the search industry: finally search has come of age and matured. Of course, only Google can tell us when this will become a reality. Waiting for the next Google blog post to shed some more light on this and let us know when the changes mentioned in the WSJ post will roll out.

What Is Content Strategy And Why Do We Need One?

If you have a website and you are serious about its web presence then I am sure you must have heard the clichéd line “Content Is King”.

After the Panda update we see many people claiming to be content writers and offering content writing services. I say this because the number of emails I used to receive for link requests, which were nothing but requests for spammy link exchanges, is now decreasing, while the number of emails from companies introducing themselves as content writers is on the rise.

Content has been king since the day the first web page was published, but now the importance Google gives it as a ranking factor has made content the most discussed aspect of web presence.

In Information Architecture for the World Wide Web, Lou Rosenfeld and Peter Morville write, "We define content broadly as 'the stuff in your Web site.' This may include documents, data, applications, e-services, images, audio and video files, personal Web pages, archived e-mail messages, and more. And we include future stuff as well as present stuff."

In today’s context the definition of content has a much broader connotation: along with the stuff on your website, it also includes the opinions that you share across the social media platforms, the comments that you post on other blogs, the guest posts that you write, etc.

For such a broad spectrum for content creation, definitely one needs planning and a strategy to implement it successfully.

Usually, people start by publishing blog posts on their blog and think that the content creation bit is taken care of, but in fact that is just the first step; the content then has a long way to go to reach the targeted audience and invite their opinions, which add to the content and make it more qualitative.

The basic questions to ask before creating content are:

  •  What content do we want to put on the web?
  •  Why do we want to publish that content?
  •  People ask questions on search engines. Which questions can we provide answers for?
  •  How can we give more detailed information regarding the products and services we offer?
  •  What will be the best medium for that content (text, video, infographic, etc.)?

The content on your site and associated with your persona on the web gets inter-linked via a ripple effect, which establishes your expertise. If there is consistency in the content created and shared, if the content is contextual, and if your blog has developed a good community along with a good readership list, then an authority is established. Hence, quality content contributes to the trust and authority factor.

One of our previous blog posts, http://blog.webpro.in/2011/10/after-link-building-spammers-it-is.html, answers the following questions related to content creation and content strategy:

Q1) Why do we need more quality content?
Q2) What is quality content?
Q3) Can borrowed content or content written just for the sake of adding words to your site work in your favour?
Q4) How to come up with quality content regularly?

Along with the regular content on the blog, posting comments on authority blogs, discussing relevant topics on social media and uploading related videos are some other ways of adding to the content correlated to your web presence.

In recent times the methodology of creating content has become very simple but to come up with the right strategy to accrue the maximum benefit from the quality content created is the major challenge.

Last but not least, creating fresh content is a constant and continuous endeavor which does not reap immediate results. It takes years to establish an authority and an identity on the web, hence keep going and do not quit.

I’ll share a poem I had read long ago but do not know the name of the author.

This shows that the unique content you create may be remembered by generations and may outlive you, so please make use of the Authorship markup by Google so that your content does not get published as “Anonymous” on the web in future, like the poem below:

"Don't Quit," Author Unknown 

When things go wrong as they sometimes will;
When the road you're trudging seems all uphill;
When the funds are low, and the debts are high
And you want to smile, but have to sigh;
When care is pressing you down a bit-
Rest if you must, but do not quit.

Success is failure turned inside out;
The silver tint of the clouds of doubt;
And you can never tell how close you are
It may be near when it seems so far;
So stick to the fight when you're hardest hit-
It's when things go wrong that you must not quit.
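On the Authorship markup mentioned before the poem: at the time of writing, Google's Authorship program relied on a rel="author" link pointing from the page to the writer's Google+ profile. The snippet below is a minimal sketch, not WebPro's code, of checking whether a page exposes such a link; the URL used in the example run is purely illustrative.

```python
# Minimal sketch: look for rel="author" links (the markup Google Authorship used,
# e.g. <link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID"/>) in a page.
import re
import urllib.request

def find_author_links(page_url: str) -> list[str]:
    """Return the href of every rel="author" <link> or <a> element found in the page."""
    with urllib.request.urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    links = []
    # Inspect every <link ...> and <a ...> tag for rel="author" plus an href.
    for tag in re.findall(r"<(?:link|a)\b[^>]*>", html, flags=re.IGNORECASE):
        if re.search(r"""rel\s*=\s*["']author["']""", tag, flags=re.IGNORECASE):
            href = re.search(r"""href\s*=\s*["']([^"']+)["']""", tag, flags=re.IGNORECASE)
            if href:
                links.append(href.group(1))
    return links

if __name__ == "__main__":
    # Example run against our blog's home page (illustrative only).
    for href in find_author_links("http://blog.webpro.in/"):
        print('rel="author" link found:', href)
```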

 

SEO Or SEO Not - Do We Really Need To Redefine Or Rename SEO?

Every year the SEO blogosphere is bombarded with posts about “SEO IS DEAD”, but from Jan. 2012, and in fact from the last quarter of 2011, there has been a new trend of acknowledging that “SEO IS VERY MUCH ALIVE”, packaged with a message about:

  •  How is it different?
  •  Why is it called the NEW SEO?
  •  How is SEO facing an identity crisis?
  •  What are the new awesome metrics to measure SEO success?
  •  Why does it need to be renamed as something else?
  •  Why is Content Marketing the main mantra?
 
Image: Puzzle (via Wikipedia)

I think this is because people who have invested in SEO campaigns are expecting results which accrue from an all-round web presence over a period of time. Without a doubt, an overall quality web presence, if planned and executed well, has the potential in the long run to reward the business with the exponential online opportunities available on the web.

An overall web presence can be attained by an overall online marketing campaign. What is an overall online marketing campaign? Assuming that you have a reasonably well designed website ensuring a good UX, then:

An overall online marketing campaign = SEO + Blogging + Social Media + PPC (at the right stages of the website's journey and purpose) + Content Creation + A/B Testing for effective landing pages + the study of Analytics for regular analysis and improvement

Every website is different and has to be analyzed and assessed as per its purpose and the potential it has to be successful. Usually people hire SEO services and think that they will win the online market by just ranking high for the targeted terms. But the reality is that your SEO may be working on the many ranking factors and may succeed in achieving high rankings for your site, and yet ranking alone is not enough. Once you rank high, the next step is to ensure that the ranking gets the click; once it gets the click, the conversion, or the execution of the call to action on that page, is the metric that needs to be looked into; and only once the conversion takes place is the calculation of the ROI possible.
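To make that ranking-to-click-to-conversion-to-ROI chain concrete, here is a minimal sketch with made-up numbers; the function name and all figures below are purely illustrative, not client data:

```python
# Hypothetical funnel: a ranking only turns into ROI after clicks and conversions.
def campaign_roi(impressions: int, ctr: float, conversion_rate: float,
                 avg_order_value: float, campaign_cost: float) -> float:
    """Return ROI as a fraction: (revenue - cost) / cost."""
    clicks = impressions * ctr              # ranking alone only earns impressions
    conversions = clicks * conversion_rate  # the page's call to action must then convert the click
    revenue = conversions * avg_order_value
    return (revenue - campaign_cost) / campaign_cost

# Illustrative numbers: 10,000 monthly impressions, 3% CTR, 2% conversion rate,
# Rs. 5,000 average order value, Rs. 25,000 monthly campaign cost.
print(f"ROI: {campaign_roi(10_000, 0.03, 0.02, 5_000.0, 25_000.0):.0%}")  # -> ROI: 20%
```

Each stage multiplies the previous one, which is why a weak link anywhere in the chain (a page that ranks but gets no clicks, or clicks that never convert) pulls the whole return down.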

If we analyze the whole cycle above, each and every step is inter-related, and only when all the people involved at the various steps work in coordination, understand each other's scope of work, and communicate effectively can true success be achieved.

Every journey starts with a destination in mind and with selecting the mode of transport to reach it. Once you decide to create an online presence for your business and tread on the virtual sands of the web world, focus on what you want to achieve with the website you already have. Hence, putting the right team in place to achieve the goals of your web presence is the first and foremost task.

Business owners usually appoint an SEO company or a PPC company and wait for the results to accrue. But, it is very important to have a strategy and a basic plan in place with the scope of work for each party clearly chalked out.

If your agency has a proper team of web designers, developers, social media experts, content writers, landing page A/B testing experts, analytics experts, etc., then it may be able to do justice to the job. But with such a high-end team, the charges for such a campaign will also be sky-high, which only companies with very large budgets can afford.

SMEs with small and limited budgets usually appoint individual SEOs, who charge comparatively less because of their low cost of operation. But they need a lot of support from clients for any all-round campaign.

Panda or no Panda, CONTENT HAS BEEN KING since the day the first web page was published on the WWW. It is Google that has become capable of picking out quality content in 2012 and hence wants to reward sites with quality content with more visibility.

Today public opinion is also woven into the social fabric, which adds to the quality factor Google is giving importance to. For social media signals and content marketing to accrue SEO benefit to the site, an overall online presence is necessary; it is this correlation which has set up the inter-connectivity between SEO, social media, and content marketing.

Yes, as we said above, “SEO IS VERY MUCH ALIVE”, and I think it cannot die as long as people are using search engines to find information on the web. SEO does not need a new name and is definitely not facing an identity crisis. Just like the on-page and off-page factors which have been responsible for ranking, social media signals and freshness of content (on-site and UGC) have been added to the list of ranking factors, and the SEO needs a strategy for ensuring that the right, strong signals are generated for the site via these platforms.

Content marketing and social media presence are not the new SEO, nor are they replacing search; they are supplementing SEO along with all the other on-page and off-page factors we have been working on till date. Every marketing campaign is about results and returns, but SEO is one part of the whole online marketing puzzle, and to complete the full picture every piece needs to be placed in the correct place and the correct order.


Editorial Policy: Human Expertise, Enhanced by AI

At WebPro Technologies, our content reflects over two decades of experience in SEO and digital strategy. We believe that valuable content is built on accuracy, clarity, and insight—and that requires human judgment at every step.

From 2024 onwards, we have been using AI tools selectively to brainstorm ideas, explore perspectives, and refine language, but AI is never the final author. Every article is researched, fact-checked, and edited by our team, ensuring relevance, accuracy, and originality. AI supports our workflow, but the responsibility for quality and credibility remains entirely human.

This hybrid approach allows us to combine the efficiency of technology with the depth of human expertise, so our readers get content that is both informative and trustworthy.

At WebPro, we see AI not as a replacement for human creativity, but as a tool that helps us raise the standard of excellence in the content we share.
