Because of the Panda update, I got somewhat disillusioned and almost gave up on my ventures into earning money online. I felt like I had come to a dead end with no idea how to proceed, which is probably not unlike what HubPages CEO Paul Edmondson felt when the Panda first struck. But unlike him, I have neither employees nor millions at my disposal to do the dirty work of searching for remedies to the predicament. I stopped working on my sites for fear that what I do, especially backlinking, might bring them down further. Even my postings on this blog nearly slowed to a halt.
The latest news is a ‘freshness’ update purportedly made to encourage webmasters to produce fresh content on a regular basis. It is actually an offshoot of the Caffeine update, but ever since the Panda came onto the scene, I see every update as a Panda update.
Here are some facts (and speculations) I’ve gathered about the Panda update.
What Google Really Said
“This update is designed to reduce rankings for low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
“you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”
“Google’s new Freshness Update affects 35 percent of queries. It prioritizes recent and timely results, and it’s based off their Caffeine infrastructure.”
What SEO Analysts are Saying
Since Google never reveals anything specific about its updates, SEO analysts are falling over each other to offer their own interpretations and suggestions on what the new SEO game should be.
Panda is a content update, not a backlink update.
Google is basing its ranking restructuring on the likes and dislikes of the ‘quality raters’ it employed (human evaluators who were given questionnaires about user experience). And Google is incorporating the logic and process these quality raters use into its algorithm.
Do they now know the equations and code needed to duplicate the human raters’ system of judgment? Aesthetics-wise, can they program a robot to pick its favorite color? Would every robot pick the same color, or would each pick a random one?
Low-quality pages within a site can bring down the overall ranking of the whole site.
Thus, it seems safe to assume that the above premise, that low-quality pages within a site drag down the high-quality ones, is true. This is presumably why HubPages moved each author’s content onto its own subdomain, isolating weak pages from strong ones. But with regard to this subdomain technique, there is talk implying that Google is not happy with the loophole.
One suggestion involves using a robots.txt file to block search engines from crawling your low-quality pages. Another is to remove any low-quality pages from your site entirely.
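As a sketch, a robots.txt rule that keeps crawlers out of a hypothetical /thin-content/ directory (the directory name is my own example) looks like this. Note that robots.txt only blocks crawling; a blocked page can still end up indexed via external links, so a `noindex` robots meta tag on the pages themselves is the surer way to keep them out of the index.

```
# robots.txt placed at the site root; /thin-content/ is a hypothetical
# directory holding the low-quality pages you want crawlers to skip
User-agent: *
Disallow: /thin-content/
```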
Panda penalizes content that has obviously been made for SEO purposes. It wants content that makes everyone who sees it want to share it and say, wow. It wants content that makes readers want to stay longer on the page.
In the case of scientific sites, what might scientific users see in a site’s presentation of tabular data that would hold their attention and make them want to share it with their peers? Engaging writing? If so, hiring writers who can translate a scientist’s knowledge into something engaging would be a requisite for every scientist who wants to share their knowledge with the world through a blog. And how would the quality raters rate, say, 35 different pages (each from a different domain) displaying the same table of elements? This, I suppose, is where considering the site as a whole comes in.
Bounce rate. A low bounce rate means readers stay longer on a page and explore other pages within the same site. Here again, user-experience quality is given emphasis: people want to share quality pages with their friends. Of course, this has been in Google’s algorithm since the beginning of Google time.
Some SEO practices that were effective then are now busted by Google.
Contextual links. So, how do you get contextual links? Off the top of my head, I’d say you can get them by:
- Guest blogging.
- Link baiting. Simply posting great articles on your blog can prompt others to point everybody to your pages through links from their own blogs and sites.
- Buying through Pay Per Post and other related sites.
- Ezinearticles and other article directories. The problem here is that many article directories were hit and marked down by Google in the Panda update, which means links from them may not carry much weight anymore, unless of course these directories manage to recover.
Everyone knows about these forms of backlinking and how they draw traffic and Google love.
Google wants everyone to side with them and not the competition. It wants us to use only YouTube, which Google owns as its primary video platform, for publishing our videos. For free publishing platforms, it wants you to use Blogger/Blogspot.
Google is taking social media more seriously this time.
Hint: Google +1
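For reference, putting a +1 button on a page was just a matter of dropping in Google’s published embed snippet. At the time it looked roughly like the following; I am reproducing it from memory, so treat the exact markup as an approximation rather than Google’s official code:

```
<!-- Loads Google's +1 script once per page (approximate historical snippet) -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<!-- Renders a +1 button wherever this tag appears -->
<g:plusone></g:plusone>
```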
Personally, I’ve decided to try most of the above suggestions. Quality content? I’ll do my best, then. Otherwise, the next best option would be to outsource. Let the games begin.
What do you think the Panda is really up to?