Tuesday, January 7, 2014

SEO in 2014: Getting The Hang of AuthorRank


As 2014 approaches, many are left wondering what is in store for SEO. No one wants to be caught napping when Google makes its next move, and webmasters are keeping their eyes trained on Google more keenly than ever. One of the biggest game-changing future additions to Google's algorithm is AuthorRank. Industry insiders predict it will change the foundations of online marketing forever.
About PageRank
PageRank has been Google's standard high-level metric for judging the quality of a web page. The system has been used for many years to classify web pages according to their authority: a site with a higher PageRank (PR) is considered more authoritative, which also meant it enjoyed a higher crawl rate. It would seem that the use of PageRank is coming to an end, as its effectiveness as a quality metric has dwindled over the years. While it may still have some relevance when used alongside other quality metrics such as domain authority and page authority, it can no longer be relied on as the sole quality metric.
About AuthorRank
Out with the old, in with the new? The year 2013 will see the introduction of a new quality metric to measure and classify pages. AuthorRank seems set to take its place as the new master of page quality and SEO ranking, though it will inform rather than replace PageRank. It would seem Google has had this card up its sleeve for a few years, just waiting for the right time to play it: the company filed for a patent for "Agent Rank" in 2005, and this Agent Rank is one and the same as AuthorRank.
In the patent, Google's David Minogue made references to using the patented instrument to rank those he referred to as agents. The content received from these agents, along with their interactions (which we would guess refers to web activity), was to be used to determine their rank. Agents who were better received and more popular would see their associated content gain a higher ranking than agents with unsigned content. We take the agents in the patent to mean website owners and everyone who submits content to Google.
Agent Rank, it would seem, was put aside for several years because without a system to identify agents there was no way the ranking system could be implemented. Google evidently never lost sight of its mission to rank agents, as the ultimate goal is to improve search quality. Google and other search engines have moved away from using just the authority of a domain or web page to rank the quality of content; the focus is now on ranking based on the reputation and authority of the content's author. But how does Google know who the author is?
The Role of Google Plus in Identifying Authors
In 2011 Google’s Eric Schmidt indicated that Google still had a desire to be able to identify agents as a means of improving search quality and getting rid of spammers. Google determined they needed a strong identity for agents if it was going to achieve that goal. Shortly after Schmidt’s proclamation Google Plus came on stream. Google Plus has so far been the most effective means of capturing the digital signature of those who supply content to the web. Once there is a Google profile it is much easier to get those signatures.
And Now What of Google Authorship?
We take this newly rolled-out feature to be a way of tying a Google signature, or rather a Google profile, to the content produced by those who have one. Google has been working smartly and quietly behind the scenes on a method to finally identify agents and the content they produce. With that accomplished, it can now roll out the years-old Agent Rank to carry out the ranking it was first meant to do.
Also read: How To Set Up Google Authorship Markup With rel=author
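For reference, the markup that guide covers boils down to pointing a rel="author" link at a Google Plus profile. A minimal sketch (the profile URL below is a placeholder, not a real account):

```html
<!-- In the page head: tie this page to its author's Google Plus profile. -->
<link rel="author" href="https://plus.google.com/112345678901234567890"/>

<!-- Or inline, in the article byline: -->
<a rel="author" href="https://plus.google.com/112345678901234567890">Jane Author</a>
```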
AuthorRank’s Ranking Process
Google credits authors who submit content to the web by tying the pages they write to their Google Plus profiles. Rich snippets are used to enable the process, and the engagement the author's content attracts is used to produce the ranking. Engagement factors include activity from social platforms, such as shares and likes from Facebook, tweets from Twitter and +1's from Google Plus. The influence and relevance of the endorsements and comments, along with the influence and relevance of those who endorse and comment on the content, will be taken into consideration for the ranking process. The inbound links connected to the content will also come in for scrutiny, based on their quality and relevance. Authors will be ranked higher on a topic based on how much content they provide on it: the more content you produce on a particular topic and the more engagement that content gets, the higher you will be ranked. The basic idea seems to be giving those who are experts on a topic the edge over those who are not.
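The engagement factors described above can be pictured as inputs to a weighted score. The sketch below is purely illustrative: the signal names and weights are invented for the example, since Google has never published AuthorRank's actual formula.

```python
# Purely illustrative: an invented weighted combination of the kinds of
# per-topic engagement signals the article describes. These weights are
# made up; Google's real formula is not public.

def author_topic_score(signals: dict) -> float:
    """Combine per-topic engagement signals into a single toy score."""
    weights = {
        "shares": 1.0,        # Facebook shares / likes
        "tweets": 0.8,        # Twitter mentions
        "plus_ones": 1.2,     # Google Plus +1's
        "quality_links": 2.0, # relevant, quality inbound links
        "articles": 0.5,      # volume of content on the topic
    }
    return sum(weights[k] * signals.get(k, 0) for k in weights)

prolific = {"shares": 40, "tweets": 25, "plus_ones": 30,
            "quality_links": 10, "articles": 12}
casual = {"shares": 5, "tweets": 2, "articles": 1}

# The expert who writes and engages more on the topic outscores the casual author.
assert author_topic_score(prolific) > author_topic_score(casual)
```

The point of the toy model is only that many weak signals (shares, tweets, links, volume) accumulate into one topical reputation per author, which is what the article argues AuthorRank measures.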
As Google prepares to roll out the new tool to help clean up the web, it would seem that those who are serious about providing quality content will be given high priority.

view the original article here

The “Google Memory” Explored: Fact or Fiction


The Real Effect Link and Content Velocity Have on Your Website
Just like the nuns who ran the Catholic school I attended, Google remembers everything. By the 6th grade, I was a model student; I earned good grades, played sports, and was in several clubs. But do you think those accomplishments erased my 3rd grade shenanigans from their memories? Nope, those crafty nuns don’t forget… anything. And when it comes to your company’s site, neither does Google.
Unbeknownst to many, even some SEO “pros” who shall go unnamed, past activity on your site is remembered by Google, and what’s more, good ol’ G will call on these “memories” when determining your search engine result positions (SERPs).
That said, before you start analyzing every link you’ve ever built or word you’ve ever published, let’s discuss what Google will remember:
Link Velocity
Haven’t heard of link velocity? It’s ok, not many of us have. Link velocity refers to the rate at which your site adds new in-bound links.
Easy now, we're not saying that high velocity is boss here. On the contrary, consistent, exponential growth is king. Maintaining a natural pace in your link building efforts is what Google rewards.

Inbound linking must appear natural, and to appear natural, growth must be exponential. We know that popularity increases exponentially. If a site has 10 new inbound links the first day, it makes sense that they have 100 on the second day, 1000 on the third day and so on. If, however, a site goes from 10 links to 50,000 overnight, Google is most definitely going to raise an eyebrow (or two).
Like my adored Catholic school teachers of yore, Google is not opposed to corporal punishment. When it notices this type of unnatural growth, it imposes a "Google Slap," which lowers your SERP position in increments of -30, -60, or -90 places.
In the most extreme cases, Google has been known to remove websites from the search results entirely. This isn’t as rare as you might think. According to Matt Cutts, Google receives about 5,000 reconsideration requests per day. Yikes!

The implication here is that new inbound links should be added at a natural rate that does not raise any Google red flags.
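To make the "natural rate" idea concrete, here is a toy check in the spirit of the examples above. The 10x threshold is invented for illustration; Google's real tolerances are not public.

```python
# Toy link-velocity check: flag a day whose new-link count jumps far beyond
# the previous day's count. The 10x ratio is an invented threshold for
# illustration only; Google does not publish its actual thresholds.

def looks_unnatural(daily_new_links: list, max_ratio: float = 10.0) -> bool:
    """Return True if the latest day's growth wildly outpaces the prior day."""
    if len(daily_new_links) < 2:
        return False  # not enough history to judge
    prev, latest = daily_new_links[-2], daily_new_links[-1]
    return prev > 0 and latest / prev > max_ratio

# 10 -> 100 -> 1000 is steep but steady exponential growth (a constant 10x ratio)
print(looks_unnatural([10, 100, 1000]))   # False
# 10 -> 50,000 overnight is the kind of spike the article warns about
print(looks_unnatural([10, 50000]))       # True
```

The same shape of check would apply to content velocity: it is the sudden break from the established trend, not the absolute number, that raises the red flag.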
Content Velocity
Content, like links, should be added at a steady pace, growing exponentially over time. It makes sense that your company added 3 pages to your site yesterday, but it seems pretty unnatural if 300 pages appeared overnight. Conversely, removing massive amounts of content can result in a "slap" as well.
What's more, Google places a high value on an attribute called "site weight" or "site depth." Simply put, the more quality content you've got for your audience to consume, the better.
* Caution – Sites that binge on content by using content spinners like Article Builder or The Best Spinner, which automatically generate articles on given topics, will get slapped! A content spinner may sound like a magic bullet solution, but they typically do more harm than good. The content these programs create is unintelligible at best and easily detected as “thin content”.
By following a few simple guidelines, you can be sure everything you do will push you one step closer to SERP royalty.
Tips and Tricks:

1. Keep It Natural.

Make sure your link and content building is natural. Ultimately, Google operates as a democracy where users access high quality sites that build authentic popularity through highly related link building and quality content.
Google’s punitive measures have been put in place to ensure we don’t have to wade through hundreds of spammy sites to find what we really want. They do the dirty work for us. Use authentic links and content to show Google that your site is legit and relevant.

2. Google Discriminates…in a Good Way

Not all content and links are considered equal. More value is placed on links that originate from reputable and related sites. For example, my former teacher (the nun) runs a blog dedicated to St. Anthony that has 30 readers. If she links to my site, that doesn't get me much traction: her site has nothing to do with mine, and anyway, nobody reads it. On the other hand, if you write about the insurance industry, say about pricing of related products, then a link from Pinnacle Life Insurance quotes would make a lot more sense. Get the picture?
Similarly, bad content won't get you ranked. Set your content apart by being original, insightful, entertaining and useful. Be sure to give actionable steps that readers can act on right away. Use a writing style that is engaging, entertaining, and (our fav) controversial. Google knows the difference between man-made content and the drivel spun out by content spinners.

3. Shout it From the Rooftops
Imagine Google as a big mother spider sending her millions of babies on reconnaissance missions to the far-reaching corners of the web. Their job is to bring back intel on the millions of sites they crawl. Mama Spider sorts through this information and rates it.

2.5 quintillion bytes of data are uploaded to the web every day. This fact alone makes it increasingly hard for Google's spiders to reach every corner of the web. Just because you've added new links or content doesn't mean Google knows about it. To get your additions seen, you have to get them indexed. There are many ways to do this: social networking, social bookmarking, and creating links to your links are just a few.
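One concrete way to build such an on-ramp for the spiders is an XML sitemap following the standard sitemaps.org protocol, which you can then submit through webmaster tools. A minimal sketch (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org protocol; example.com is a placeholder. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-article/</loc>
    <lastmod>2014-01-07</lastmod>
  </url>
</urlset>
```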
Links are the spiders’ highway. In order for them to find you, you have to build on-ramps and signs that help them travel to your site. No one sees you back there in your log cabin down an unmarked dirt road, no matter what kind of great (or sketchy) stuff you’re doing down there.
You can depend on the fact that they're always watching and tracking your site's activity, and they will use that activity to determine your SERPs. Add authentic, quality, natural content and links on a consistent basis, and Google will give you a gold star: you get to sit back and watch your SERP rise and your conversion rates soar.

View the original article here

Bing SEO And Webmaster Tools


We can go over all of this again and say it out loud: "there are many different ways to obtain traffic." The one thing we should all know is that you need to make sure you are exploring all of them. Relying solely on specific search engines, for example, can be a recipe for disaster, as they constantly change or update their algorithms to keep up with changing trends, quality requirements, and so on. I'm sure you have seen people saying "my rankings dropped because of this or that." Needless to say, the more diverse your SEO efforts, the less risk you run of not being found.

Take article marketing, for example. It is a good promotional technique, and it is often forgotten that articles you submit to high-PageRank article directories should also be promoted by strengthening them with backlinks. This is proof that doing something and then leaving it alone, waiting for miracles to happen, simply does not work.

This is also true for search engine optimization. While I understand why many people would prefer optimizing only for Google, Bing is a growing and increasingly important search engine, and one should not neglect the possibilities that optimizing for Bing can offer. Just think about having 100% of nothing versus 10% of something. What I mean is that it is obviously great to rank on Google's front page (not easy) for certain keywords or keyphrases, but would it not be nice if you also had the same rankings on Bing? Of course.
Earlier, I posted about using Bing's webmaster tools as part of your SEO campaign. Microsoft has come a long way since then and has provided many free SEO tools that you might still not be aware of. Two such tools are the AdIntelligence Keyword Tool and the Bing SEO Toolkit. The third in Bing's arsenal is, obviously, Bing Webmaster Tools.

Welcome to the new Bing Webmaster Tools

Revamped, or better yet, completely overhauled: Bing's webmaster tools have recently been redeployed. The site has been completely redesigned to be easier to use, with an intuitive design that focuses on three main areas: crawl, index and traffic. New features, such as the Index Explorer and charting functionality, now provide a more comprehensive view into how Bing crawls and indexes your sites. Similar to Google's webmaster tools, you can now see interesting stats such as queries, clicks, impressions, and CTR.
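The CTR stat in those reports is simply clicks divided by impressions, usually shown as a percentage. A quick sketch (the sample numbers are made up):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage, as shown in webmaster tools reports."""
    if impressions == 0:
        return 0.0  # no impressions means no meaningful CTR
    return 100.0 * clicks / impressions

# e.g. a query with 45 clicks on 1,500 impressions
print(round(ctr(45, 1500), 1))  # 3.0
```

Watching this ratio per query is a cheap way to spot pages that rank (high impressions) but fail to attract clicks (low CTR), in Bing's reports as much as in Google's.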

While there is still a long way to go, it is clear that Bing is committed to further enhancing its presence in the search engine arena, as shown by the recent announcement of its search alliance with Yahoo (see below).

Organic search update and tips

A key aspect of the Yahoo! and Microsoft Search Alliance is the transition of Yahoo! organic search listings (those found on the main body of the page). Assuming our testing continues to yield high quality results, we anticipate that our organic search results will be powered by Bing beginning in the August/September time frame.
As Bing Webmaster Tools continues to add new features, it is always a good idea to submit your site and analyze how Bing reads your data. Nothing to lose and (maybe) a lot to gain. Recent buzz indicates that the top SEO ranking factors for Google and Bing are different, but it is important to note that they do not conflict: in theory, getting top rankings on Bing and Google will not compromise each other.
How about you? Have you already submitted your site to Bing's webmaster tools, or are you still neglecting the potential for ranking well on Bing? Just do it; it's easy and worth your while, considering the benefit it could bring.

View the original article here

Microsoft disavows workaround for SkyDrive problems on Windows 8.1


Microsoft has backtracked on a suggested fix it offered repeatedly on its support forum for over two months for users having problems with SkyDrive on Windows 8.1.

In a permanent message on the SkyDrive section of its support site, Microsoft now offers detailed troubleshooting steps for affected users.

At the end of the post, it also disavows a workaround its officials had been dispensing that involved backing up SkyDrive, deleting its root folder and files and restarting the application. "If you've previously seen the instructions below on other posts, please DO NOT use them as they are no longer valid for SkyDrive on Windows 8.1," the post reads.

Microsoft published the note on Dec. 20, two days after IDG News Service first reported that the support site by then had more than 120 active discussion threads about SkyDrive malfunctions following upgrades to Windows 8.1.

Until then, Microsoft hadn't addressed the topic with a permanently "pinned" message on the site, opting instead to dispatch its forum moderators to the individual discussion threads.

Asked at the time for comment, Microsoft said via email that it was "aware of a small number of people discussing these issues on forums" and that it was working with them individually, often by phone, to solve their issues. "Most people using Windows 8.1 and SkyDrive, however, are having a good experience," a spokesman wrote at the time.

Aside from the now-disavowed workaround, forum moderators didn't seem able to pinpoint a cause or prescribe a solution for the problems people were reporting, including nagging and persistent error messages, slow performance, difficulty uploading files, lost and corrupted folders and documents, and sync troubles, including duplicate files and processes caught in a loop.

A scan of the SkyDrive forum shows that affected users continue to file complaints about SkyDrive on Windows 8.1.

The problems for the affected users began after installing Windows 8.1, the update to Windows 8 that started shipping in mid-October.

The number of discussion threads dealing with the issues and the number of people participating in them seems very high for a problem that Microsoft so far maintains isn't generalized.

However, the fact that Microsoft has now pinned a permanent message about this issue in the support site indicates that the company recognizes it is a matter that deserves more prominent acknowledgement.

SkyDrive and Windows 8.1 were deeply meshed to make it easier for people to use SkyDrive than it had been on Windows 8, the company said in a blog post on Oct. 15. However, an early sign that trouble might be afoot was that most of the comments readers made to that post in the following weeks were negative.

View the original article here