Thursday 1 November 2012

Google Panda Update vs. Google Penguin Updates

The SEO community has been abuzz this past week with the latest update from Google, named Penguin. Penguin came down the pipeline last week, right on the tail of the latest Panda update. Since most of the big updates in the past year have been focused on Panda, many site owners are left wondering what the real differences between Panda and Penguin are. Here is a breakdown:

Google Panda Update Overview:

According to Google’s official blog post when Panda launched,

This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

Basically, Panda updates are designed to target pages that aren’t necessarily spam but aren’t great quality. This was the first penalty to go after “thin content,” and the sites hit hardest by the first Panda update were content farms (which is why it was originally called the Farmer update), where users could publish dozens of low-quality, keyword-stuffed articles that offered little to no real value for the reader. Many publishers would submit the same article to a bunch of these content farms just to get extra links.

Panda is a site-wide penalty, which means that if “enough” pages of your site (there is no specific number) were flagged for having thin content, your entire site could be penalized. Panda was also intended to stop scrapers (sites that republish other companies’ content) from outranking the original author’s content.

Here is a breakdown of all the Panda updates and their release dates. If your site’s traffic took a major hit around one of these times, there is a good chance it was flagged by Panda (a quick script for checking an analytics export against these dates is sketched after the list):

1. Panda 1.0 (aka the Farmer Update) on February 24th, 2011
2. Panda 2.0 on April 11th, 2011 (Panda impacts all English-speaking countries)
3. Panda 2.1 on or around May 9th, 2011
4. Panda 2.2 on or around June 18th, 2011
5. Panda 2.3 on or around July 22nd, 2011
6. Panda 2.4 in August 2011 (Panda goes international)
7. Panda 2.5 on September 28th, 2011
8. Panda 2.5.1 on October 9th, 2011
9. Panda 2.5.2 on October 13th, 2011
10. Panda 2.5.3 on October 19th/20th, 2011
11. Panda 3.1 on November 18th, 2011
12. Panda 3.2 on or around January 15th, 2012
13. Panda 3.3 on or around February 26th, 2012
14. Panda 3.4 on March 23rd, 2012
15. Panda 3.5 on April 19th, 2012
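For a rough self-check, here is a minimal Python sketch that compares weekly visit totals from an analytics CSV export against approximate Panda dates and flags big week-over-week drops. The CSV layout (date and visits columns, ISO dates), the shortened date list and the 25% threshold are all assumptions for illustration, not anything Google publishes.

```python
# Rough illustration: flag large week-over-week traffic drops near a Panda rollout.
# Assumes a CSV export with "date" (YYYY-MM-DD) and "visits" columns.
import csv
from datetime import date, timedelta

# Approximate rollout dates taken from the list above (extend with the rest as needed).
PANDA_DATES = [date(2011, 2, 24), date(2011, 4, 11), date(2011, 9, 28),
               date(2011, 11, 18), date(2012, 2, 26), date(2012, 4, 19)]

def weekly_visits(csv_path):
    """Sum daily visits into ISO weeks, keyed by (year, week)."""
    weeks = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            key = date.fromisoformat(row["date"]).isocalendar()[:2]
            weeks[key] = weeks.get(key, 0) + int(row["visits"])
    return weeks

def flag_panda_drops(csv_path, threshold=0.25):
    weeks = weekly_visits(csv_path)
    for update in PANDA_DATES:
        this_week = update.isocalendar()[:2]
        prev_week = (update - timedelta(days=7)).isocalendar()[:2]
        before, after = weeks.get(prev_week), weeks.get(this_week)
        if before and after and (before - after) / before >= threshold:
            print(f"Visits fell {(before - after) / before:.0%} around the {update} update")

# flag_panda_drops("organic_visits.csv")
```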

Search Engine Land recently created this great Google Panda update infographic to help walk site owners through the many versions of the Google Panda updates.

Many site owners complained that even after they made changes to their sites in order to be more “Panda friendly,” their sites didn’t automatically recover. Panda updates do not happen at regular intervals, and Google doesn’t re-index every site each time, so some site owners were forced to deal with low traffic for several months until Google got around to re-crawling their website and taking note of any positive changes.

Google Penguin Update Overview:

The Google Penguin Update launched on April 24th, 2012. According to the Google blog, Penguin is an “important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Google mentions that typical black hat SEO tactics like keyword stuffing (long considered webspam) would get a site in trouble, but less obvious tactics (like incorporating irrelevant outgoing links into a page of content) would also cause Penguin to flag your site. Says Google,

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

Site owners should be sure to check their Google Webmaster Tools accounts for any messages from Google warning about past spam activity and a potential penalty. Google says that Penguin has impacted about 3.1% of queries (compared to Panda 1.0’s 12%). If you saw major traffic losses between April 24th and April 25th, chances are Penguin is the culprit, even though Panda 3.5 came out around the same time.

Unfortunately, Google has yet to outline exactly what signals Penguin is picking up on, so many site owners that were negatively impacted are in the dark as to where they went wrong with their onsite SEO. Many in the SEO community have speculated that contributing factors to Penguin might include things like the following (a rough anchor-text check is sketched after the list):

1. Aggressive exact-match anchor text
2. Overuse of exact-match domains
3. Low-quality article marketing & blog spam
4. Keyword stuffing in internal/outbound links
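As one illustration of the first point, the sketch below estimates how much of a backlink profile uses a single exact-match anchor, based on a hypothetical CSV export from whatever backlink tool you use; the column name, file layout and any “acceptable” percentage are assumptions, not Google guidance.

```python
# Illustrative only: what share of backlinks use the exact money keyword as anchor text?
# Assumes a CSV export with an "anchor_text" column.
import csv
from collections import Counter

def anchor_counts(csv_path):
    """Count how often each anchor text appears in the backlink export."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["anchor_text"].strip().lower()] += 1
    return counts

def exact_match_ratio(csv_path, keyword):
    counts = anchor_counts(csv_path)
    total = sum(counts.values())
    return counts[keyword.lower()] / total if total else 0.0

# e.g. exact_match_ratio("backlinks.csv", "cheap blue widgets")
# A profile where most anchors are the literal target phrase looks unnatural.
```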

It’s important to remember that Panda and Penguin are algorithm updates, not manual penalties. A reconsideration request to Google won’t make much of a difference; you’ll have to repair your site and wait for a refresh before your site will recover. As always, do not panic if you are seeing a downturn in traffic; after major Google updates like these, things have often rebounded. If you do think you have some sort of SEO penalty as a result of either the Google Panda or Google Penguin updates, contact your SEO service provider for help or start troubleshooting.

Free SEO Tools for Website Analysis


SEO professionals often need to send a number of SEO proposals to new clients, and they usually want a quick analysis of each website before sending the proposal. The following are the top 3 free online tools that will help you with website analysis.
        1. Woorank
        2. Site Trail
        3. HubSpot's Marketing Grader

1. Woorank:
Woorank is one of the best free tools for basic SEO analysis, and the Woorank team keeps working to improve it. SEO professionals who want a quick read on any site in a single click can use it with confidence. One good thing is that it is always free, with a clean, ad-free interface. Below are the factors this tool analyzes about a website (a stripped-down example of this kind of check is sketched after the list):
  • SEO Suggestions
  • Visitor’s information
  • Social Share data
  • SEO Tech Info
  • SEO Tags Data
  • Link Evaluation
  • Keyword Analysis
  • Website Authority & Back links
  • Usability & Security
  • Tracking & Technologies
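To give a rough idea of the kind of on-page checks these tools automate, here is a minimal sketch that pulls a page’s title, meta description and H1 count. It assumes the third-party requests and beautifulsoup4 packages are installed; the URL is a placeholder.

```python
# Minimal on-page snapshot: title, meta description and H1 count for one URL.
import requests                      # third-party: pip install requests
from bs4 import BeautifulSoup        # third-party: pip install beautifulsoup4

def basic_onpage_report(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    return {"title": title, "meta_description": description, "h1_count": len(h1s)}

# print(basic_onpage_report("https://example.com/"))
```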


2. Site Trail:
Site Trail is another free SEO tool for website analysis. It is a little newer than Woorank and offers much the same kind of features, but the Site Trail team has added a few extras: hosting, server, DNS record and HTTP header information (a small standard-library sketch of these checks follows the list). The features it offers include:
  • Social Media Stats
  • Basic SEO Data
  • Visitors Analysis
  • Traffic Rank Details
  • Content & Linking Analysis
  • Hosting & Server Info
  • DNS & HTTP header Records
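As a sketch of what the hosting/DNS/header checks boil down to, the snippet below looks up a domain’s A records and reads a few response headers using only Python’s standard library; the domain is just an example and error handling is omitted for brevity.

```python
# Basic hosting/DNS/header lookup for a domain (standard library only).
import socket
from urllib.request import urlopen

def server_report(domain):
    hostname, aliases, addresses = socket.gethostbyname_ex(domain)   # A records
    with urlopen(f"http://{domain}/", timeout=10) as resp:
        headers = dict(resp.getheaders())
    return {
        "ip_addresses": addresses,
        "server": headers.get("Server", "unknown"),
        "content_type": headers.get("Content-Type", "unknown"),
    }

# print(server_report("example.com"))
```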


3. HubSpot's Marketing Grader:
Marketing Grader gives a somewhat more in-depth analysis of any website and can recognize whether or not your site is a blog. It is worth testing on one of your own sites to get a better idea of what it reports, especially if you are a marketing professional. Its major features are listed below:
  • Blogging analysis
  • SEO Info
  • Mobile Compatibility check
  • Social Stats
  • Analytics

5 Google Algorithm Changes I’d Like to See

Recently, Google killed off (or at least substantially reduced) the SEO benefit associated with building websites on exact match domains (EMDs). And while I think that that’s a great step forward, it’s clear that Google still has a ways to go in terms of wiping away low-quality listings from the SERPs.

And since you all know how much I love algorithm changes, here are a few additional steps that I hope Google will consider as it rolls out future changes to its search results:

Change #1: More Accurate Measurement of On-page Quality

It’s not exactly a secret that Google struggles with measuring on-page quality objectively. The whole reason we have search engine algorithms in the first place is because the Googlebot can’t just go out, read a couple of pages, and decide which one is qualitatively the best.

And because of that, I realize that saying, “Google should do a better job of measuring page quality” is somewhat of a moot point. Obviously, this is a priority for the Web’s largest search engine, and obviously, it’s something that Google is constantly addressing.

That said, is anyone else disappointed by the pace at which these changes are occurring? Perhaps one of the engine’s most obvious missteps was a situation that occurred immediately after the roll out of the Penguin algorithm update (a change that was implemented explicitly to weed Web spam out of the results) in which a Blogger site with no content earned a temporary first page ranking for the competitive keyword phrase, “make money online.”

Mistakes like these limit the public’s confidence in Google’s ability to provide a good search experience, not to mention the impact on SEOs whose sites are either demoted in favor of unworthy results like these or caught in the crossfire of updates whose effects on the SERPs weren’t fully anticipated.

Given this diminished confidence, I believe that it will be important for Google to speed up the pace of algorithm updates designed to provide high-quality SERP listings, while also minimizing the impact that these future changes have on sites that are unintentionally penalized. Doing so will be critical to the engine’s ability to maintain public trust and its competitive advantage.

Change #2: Penalization for Spun Content

In particular, one type of low-quality content I want to call out is “spun content”—pages in which words, phrases, and sentences have been automatically replaced using spinning tools in order to generate “unique” content. Sure, the content’s unique, but that doesn’t mean that it’s fit for human consumption!

But although Google has made it its mission to weed out low-value results from the SERPs, I still encounter plenty of instances of spun content while I’m making my rounds on the Internet. While I recognize that the text produced by this process may look unique to the search engine bots, surely Google’s language processing algorithms make it possible to uncover phrasing patterns that indicate content has been spun (at least in the most egregious of examples)?
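To make the idea concrete, here is a toy heuristic (my own speculation, not anything Google has described): spun copies of an article tend to keep the original’s function-word “skeleton” even when the content words are swapped, so comparing skeleton n-grams between two texts can flag likely spins.

```python
# Toy spun-content check: compare the function-word "skeleton" of two texts.
# Spinning tools usually swap nouns/adjectives but leave the connective tissue intact.
FUNCTION_WORDS = {"the", "a", "an", "of", "to", "in", "and", "or", "is", "are",
                  "that", "for", "on", "with", "as", "by", "it", "this", "be", "was"}

def skeleton_ngrams(text, n=4):
    words = [w.strip(".,!?;:\"'") for w in text.lower().split()]
    skeleton = [w for w in words if w in FUNCTION_WORDS]
    return {tuple(skeleton[i:i + n]) for i in range(len(skeleton) - n + 1)}

def skeleton_similarity(text_a, text_b, n=4):
    """Jaccard overlap of skeleton n-grams; values near 1.0 suggest a spun copy."""
    a, b = skeleton_ngrams(text_a, n), skeleton_ngrams(text_b, n)
    return len(a & b) / len(a | b) if a and b else 0.0
```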

Really, I’m sure that this is something that Google is working on, as it does the company a disservice to have this type of junk content appear alongside legitimate results in the natural SERPs. I’m just hoping that they’re able to speed the process up and get this update rolled out ASAP!

Change #3: Better Integration of Facebook and Twitter Data

Sure, we all know that Google has a social integration problem, given that Bing has the contracts with Facebook and Twitter, and Google is left to rely primarily on social data generated by the fledgling Google+ network.

However, I, for one, would love to see the engine take one for the team and make the compromises necessary to bring this data set to the Google results (wishful thinking, I know…). Or hell, if Google even bothered to use the public data made available by these two primary social networks, I bet that it’d be able to seriously improve upon the quality of its existing personalized results.

Here’s the deal: If Google wants to claim that its results are the best, it simply can’t do that without the social access granted to Bing by Facebook and Twitter. In this case, relying on information from the Google+ network is like a coach claiming that his minor league baseball team is the best, even compared to heavy hitters like the Cardinals or the Nationals. You’ve got to know something’s wrong when the Google SERPs initially list Mark Zuckerberg’s unused Google+ profile over his own Facebook page…

Do I think this is going to happen? No, probably not. But since this is my wish list, I get to say that I believe the Google SERPs would be stronger with either the integration of readily available Facebook and Twitter public data or with the types of contracts these networks currently enjoy with Bing.

Change #4: An End to the Benefit of Profile Links

Honestly, this one’s bugged me for a long time. And despite all that Google has done to diminish the benefit earned by low-value linking schemes, it seems that the value given to profile links is still alive and well.

In theory, assigning value to links that originate from within an author’s website profile account makes sense. In many cases, these accounts are tied to participation on prestigious, high PageRank sites, so it’s natural to assume that users who add their own website links to their profiles are the ones who are actively contributing to their communities.

Unfortunately, once word got out that these profile links conferred an SEO benefit, the digital marketing community saw a surge in automated tools that would create profiles on as many of these high-ranking websites as possible. The result wasn’t a certain amount of ranking value given to active, contributing site participants. It was a link-structuring scheme that wasted website bandwidth and storage space by pumping otherwise well-intentioned sites full of spam profiles that gave nothing back to the community.

I have to assume that Google’s on top of this one, as it’s a clear violation of the engine’s Webmaster Guidelines, which prohibit link schemes that intentionally manipulate the search engines without providing any type of value in return. However, if they aren’t, consider this my impassioned plea that this loophole be closed as quickly as possible!

Change #5: Less Reliance on About.com, Wikihow.com, and Wikipedia

One last thing I want to mention is that I’m sick to death of typing in a search query and being inundated with results from About.com, Wikihow.com, and Wikipedia filling up the top spots. As far as I’m concerned, with the possible exception of Wikipedia (in some cases), the articles that come from these sites tend to be barely higher in quality than the articles found in the mass content directories that Google slapped down last year with the Panda update.

Now, if I were an SEO conspiracy theorist, I might suggest that Google preferentially lists these websites over other pages that offer better information, but without any existing Google monetization (as in the case of industry experts who write exceptional blog posts in order to sell their own products).

But whether that’s the reality or not doesn’t actually matter. What matters, from a search engine perspective, is that the people using a given engine find its results to be as useful as possible. I don’t think that I’m alone in growing increasingly frustrated with being served up poorly-written, error-riddled About.com articles, so I’m crossing my fingers that future algorithm changes include a greater diversity in voices and sources of information.

Tuesday 11 September 2012

Off-Page Work (Promotion Work)


For Organic Ranking:
  • Promotion/Off-Page Work
    • Link Building Campaign
      • Directory/search engine submission (directory submission may include paid directory submission as well, e.g. the Yahoo Directory)
      • Blog creation
      • Article creation/submission
      • Link exchange (if needed)
      • RSS feed creation and submission (a minimal feed-generation sketch follows this section)
  • Submission to Social Media Sites
    • SMO – Web 2.0 (Social Media Optimization)
      • Tagging
      • Bookmarking
      • Promotion through social media sites like Facebook, Twitter, Swicki, Digg, Flickr, Delicious, StumbleUpon and Technorati
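To show what the RSS item above actually involves, here is a minimal sketch that writes an RSS 2.0 feed for a handful of pages; the site name, URLs and post titles are placeholders.

```python
# Generate a minimal RSS 2.0 feed for a few pages (placeholder data).
from xml.sax.saxutils import escape

def build_rss(site_title, site_url, items):
    entries = "".join(
        f"<item><title>{escape(title)}</title><link>{escape(link)}</link></item>"
        for title, link in items
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<rss version="2.0"><channel>'
        f"<title>{escape(site_title)}</title><link>{escape(site_url)}</link>"
        f"{entries}</channel></rss>"
    )

# with open("feed.xml", "w", encoding="utf-8") as f:
#     f.write(build_rss("Example Blog", "https://example.com/",
#                       [("First post", "https://example.com/first-post")]))
```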
For Pay Per Click Marketing (if needed, as per client requirement and budget):
  • PPC campaign creation in major search engines including Google, Yahoo, MSN and MIVA
  • Ad group/ad creation
  • Tracking (CTR, CPM, CPC, CPA, conversions; a worked example follows this list)
  • Banner advertisement
  • Paid listings
  • E-mail marketing
  • Paid submissions
  • Press releases
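Here is a small worked example of the tracking metrics listed above; the campaign numbers are invented purely to show the arithmetic.

```python
# Worked example of common PPC tracking metrics (all inputs are made up).
def ppc_metrics(impressions, clicks, cost, conversions):
    return {
        "CTR": clicks / impressions,        # click-through rate
        "CPC": cost / clicks,               # cost per click
        "CPM": cost / impressions * 1000,   # cost per thousand impressions
        "CPA": cost / conversions,          # cost per acquisition
        "conversion_rate": conversions / clicks,
    }

# 10,000 impressions, 250 clicks, $125 spend, 10 conversions gives:
# CTR 2.5%, CPC $0.50, CPM $12.50, CPA $12.50, conversion rate 4%
print(ppc_metrics(10_000, 250, 125.0, 10))
```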
Maintenance and Follow-up:
We will create a series of descriptions and titles based on the client's key-phrase list to ensure solid placement within these locations once established. After submission to the search engines, we will engage in a process of inclusion verification and follow-up submissions where needed (not to exceed 3 total submissions for any search engine or directory).

Site Meta Tags Creation (On-Page)


Create page-wise meta tags, with a view to rewriting page-wise content to incorporate them, without allowing the body text to lose its ‘marketing appeal’.
This will include:
  • Appropriate meta tag creation for each and every page (a small generation sketch follows).
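As a hedged sketch of what page-wise meta tag creation can look like in practice, the snippet below builds a title/description/keywords block per page from a simple page list; the file names, titles and descriptions are hypothetical.

```python
# Generate per-page <title> and meta tags from a simple page list (hypothetical data).
from html import escape

PAGES = {
    "index.html": {"title": "Acme Widgets | Handmade Blue Widgets",
                   "description": "Handmade blue widgets shipped worldwide.",
                   "keywords": ["blue widgets", "handmade widgets"]},
}

def meta_block(page):
    return "\n".join([
        f"<title>{escape(page['title'])}</title>",
        f'<meta name="description" content="{escape(page["description"])}">',
        f'<meta name="keywords" content="{escape(", ".join(page["keywords"]))}">',
    ])

for filename, page in PAGES.items():
    print(f"<!-- {filename} -->\n{meta_block(page)}\n")
```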


Apart from the above-mentioned on-page work, we will be looking at the overall presentation and search-engine-friendly format of the website. There are lots of factors we take care of while doing on-page work; some of them are listed below (a small checking sketch appears after the list):

  • No folder should contain more than 30 files
  • All images should be in a folder called “images”
  • Each page should be named after the top keyword being targeted for that page
  • The targeted keyword should occur 3-4 times in the body; one image should be renamed after that keyword, with the same keyword in its alt attribute
  • Make appropriate use of H1 to H6 tags, and the web design must be verified with the W3C validator
  • Keep the header clean so that keywords can be found as early as possible in the body
  • Use a unique title tag (8-10 words), meta description (less than 270 characters) and meta keywords (make sure you include a comma and space between each keyword) for each page
  • The only meta tags that you MUST have are the "description" and "keywords" tags
  • Create a Sitemap.html and a Sitemap.xml
  • Use of a sitemap is important, and it should be linked from each page of the site
  • Use a robots.txt file

And lots more…
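For illustration, here is a crude checker for a few of the rules above (title word count, meta description length, keyword count in the page text, H1 presence and keyword-bearing alt text). The regex-based parsing is deliberately simple and the thresholds just mirror the checklist; it is a sketch, not a production audit tool.

```python
# Crude on-page checker for a few of the checklist rules above.
import re

def check_page(html, keyword):
    text = re.sub(r"<[^>]+>", " ", html).lower()   # very rough tag stripping
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
                     html, re.I | re.S)
    issues = []
    if not title or not (8 <= len(title.group(1).split()) <= 10):
        issues.append("title should be roughly 8-10 words")
    if not desc or len(desc.group(1)) > 270:
        issues.append("meta description missing or longer than 270 characters")
    hits = text.count(keyword.lower())
    if not (3 <= hits <= 4):
        issues.append(f"keyword appears {hits} times; the checklist suggests 3-4")
    if "<h1" not in html.lower():
        issues.append("no H1 tag found")
    if not re.search(r'alt=["\'][^"\']*' + re.escape(keyword.lower()), html.lower()):
        issues.append("no image alt text containing the keyword")
    return issues

# e.g. check_page(open("index.html").read(), "blue widgets")
```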
Remember that each search engine has a different ranking algorithm. This means that one may consider a particular factor (backlinks, site content, PageRank, Alexa ranking) to be important, whereas another search engine may consider the same factor of no importance whatsoever. Therefore we have listed more general analysis and recommendations above, which work across a wide variety of search engines. We cannot provide search-engine-specific information for your site at the proposal stage.

Sunday 9 September 2012

On-Page & Website Analysis and Recommendations

Search Engine Friendly Content (On-Page)
The web pages can offer far more Search Engine (SE) optimized content specifically related to their category. Extensive keyword-rich, SE-friendly content (“spider food”) will need to be added to individual pages to improve rankings. Search-engine-friendly, keyword-rich content, in line with the meta tags, is to be implemented as appropriate on the site to achieve higher traffic and rankings. Text has to be added on the home page as well as inner pages for better indexing.
This will include:
  • Search-engine-friendly content creation
  • Keyword placement (without resorting to keyword stuffing)
  • Keyword density (a short calculation sketch follows this list)
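The density point above is easy to measure yourself; here is a short sketch that counts how often a phrase occurs relative to the total word count. There is no official “correct” density figure, so the function only reports the number.

```python
# Keyword density: percentage of the page's words that belong to occurrences of a phrase.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

# keyword_density(page_text, "blue widgets") returns a percentage of the total word count
```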
 
Keyword Analysis (On-Page)
Our recommendation is to use two- or three-word phrases instead of single words and to pair general keywords with more specific ones. Also use combination keywords that are distantly related, commonly misspelled related terms, service-region-specific keywords and longer variations of keywords.

This will include
full-fledged keyword research (a hypothetical shortlisting sketch follows this list) to find out:
  • Primary keywords
  • Secondary keywords
  • Main keywords to be targeted for SEO
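Purely as a hypothetical illustration of how a researched list might be split into primary and secondary keywords, the sketch below ranks phrases by an invented search-volume-times-relevance score; the keywords, numbers and cut-off are all made up.

```python
# Hypothetical split of a keyword list into primary and secondary sets.
KEYWORDS = [
    {"phrase": "blue widgets", "monthly_searches": 4400, "relevance": 0.9},
    {"phrase": "buy blue widgets online", "monthly_searches": 720, "relevance": 1.0},
    {"phrase": "widget accessories", "monthly_searches": 1300, "relevance": 0.6},
]

def shortlist(keywords, primary_count=2):
    ranked = sorted(keywords, key=lambda k: k["monthly_searches"] * k["relevance"],
                    reverse=True)
    return ranked[:primary_count], ranked[primary_count:]

primary, secondary = shortlist(KEYWORDS)
print("Primary:", [k["phrase"] for k in primary])
print("Secondary:", [k["phrase"] for k in secondary])
```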