PageRank gets its name from Google cofounder Larry Page. You can read the original paper describing the ranking system used to calculate PageRank here, if you want. Check out the original paper about how Google worked here, while you’re at it. But for dissecting how Google works today, these documents from 1998 and 2000 won’t help you much. Still, they’ve been pored over, analyzed, and unfortunately sometimes treated as gospel for how Google operates now.
I think it is important that you distinguish your advice about no-following INTERNAL links from your advice about no-following EXTERNAL links in user-generated content. Most popular UGC-heavy sites no-follow links because they can’t possibly police them editorially and want to signal to the search engines that the links haven’t been editorially approved, even though they may still provide some user benefit.
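For the external, user-generated case, the usual mechanism is the `rel` attribute on the link itself. This is a generic illustration; the URL and anchor text are made up:

```html
<!-- A user-submitted link in a comment section: rel="nofollow" tells search
     engines the link was not editorially vouched for.
     (URL and anchor text are hypothetical.) -->
<a href="https://example.com/some-user-site" rel="nofollow">a commenter's site</a>
```

Internal links, by contrast, are ones the site owner controls editorially, which is why the same blanket treatment generally doesn't make sense for them.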
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Because of this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors: an algorithm change can shift a website's placement and cause a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
The original Random Surfer PageRank patent from Stanford has expired. The Reasonable Surfer version of PageRank (assigned to Google) is newer than that one, and has been updated via a continuation patent at least once. The version of PageRank based upon a trusted seed set of sites (assigned to Google) has also been updated via a continuation patent and differs in many ways from the Stanford version of PageRank. It is likely that Google is using one of the versions of PageRank that it controls (its exclusive license to use Stanford’s version of PageRank expired along with that patent). The updated versions of PageRank (the Reasonable Surfer and Trusted Seeds approaches) are both protected under present-day patents assigned to Google, and both have been updated to reflect modern processes in how they are implemented. Because of their existence, and the expiration of the original, I would suggest that it is unlikely that the random surfer model-based PageRank is still being used.
Hey Brian, this is an absolutely fabulous post! It caused me to come out of lurking mode on the Warrior Forum and post a response there as well. Only my second post in 4 years, it was that kickass… I’ve signed up to your newsletter on the strength of this. You have a new follower on Twitter as well! I mean what I said on the Warrior Forum… Since 2001 I’ve worked in SEO commercially, freelance and now from the comfort of my own home – I have bought IM ebooks with less useful information in them than covered by any one of your 17. You might not please everyone in our industry by giving some of those secrets away for free though! All power to you my friend, you deserve success and lots of it!
This is what happens to the numbers after 15 iterations… Look at how the values for the 5 nodes are each stabilizing toward a fixed number. If we had started with every page at 1, by the way, which is what most people tell you to do, this would have taken many more iterations to reach a stable set of numbers (and in fact – in this model – would not have stabilized at all).
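The iterative process described above can be sketched in a few lines of Python. The 5-page link graph below is a hypothetical stand-in (the article's actual graph isn't reproduced here), and the damping factor of 0.85 and the 1/N starting value are the conventional choices from the original paper:

```python
# Hedged sketch: power-iteration PageRank on a hypothetical 5-page graph.
# The link structure below is an illustrative assumption, not the article's graph.
damping = 0.85

links = {            # page -> pages it links out to (hypothetical)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C", "E"],
    "E": ["A", "D"],
}

pages = list(links)
n = len(pages)
pr = {p: 1.0 / n for p in pages}   # start every page at 1/N, not 1

for _ in range(15):                # the 15 iterations mentioned above
    new_pr = {}
    for p in pages:
        # each page q passes its rank, split evenly, to the pages it links to
        incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        new_pr[p] = (1 - damping) / n + damping * incoming
    pr = new_pr

for p in pages:
    print(p, round(pr[p], 4))
```

Because every page here has at least one outgoing link, the total rank mass stays at 1.0 each iteration, and after 15 passes the per-page values have largely settled, which is the stabilization the article is pointing at.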
A decent article which encourages discussion and healthy debate. Reading some of the comments, I see it also highlights some of the misunderstandings some people (including some SEOs) have about Google PageRank. Toolbar PageRank is not the same thing as PageRank. The little green bar (Toolbar PageRank) was never a very accurate metric and told you very little about the value of any particular web page. It may have been officially killed off earlier this year, but the truth is it’s been dead for many years. Real PageRank, on the other hand, is at the core of Google’s algorithm and remains very important.
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
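As a hypothetical illustration of such a tag (the site and wording are invented for this example, not taken from the guidance above):

```html
<!-- Hypothetical description meta tag; the shop and wording are illustrative.
     Goes in the page's <head>. -->
<meta name="description" content="Compare our handmade widget models by size,
  material, and price, with shipping times and care instructions for each.">
```

The point is that the text reads as a self-contained answer to "is this page what I'm looking for?", rather than a keyword list.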
“What does relevancy mean?”, you may ask. Let’s imagine that you have a blog about website-building tips, but you have found an authoritative site about makeup trends. According to Google, this source won’t be a good one for you, because high-authority sites linking to yours should be closely related to your topic; otherwise, the link won’t carry much weight. The same goes for the content surrounding your link.
PageRank is often considered to be a number between 0 and 10 (with 0 being the lowest and 10 being the highest), though that is also probably incorrect. Most SEOs believe that internally the number is not an integer, but carries a number of decimal places. The belief largely comes from the Google Toolbar, which would display a page's PageRank as a number between 0 and 10. Even this was a rough approximation, as Google does not release its most up-to-date PageRank figures, as a way of protecting the algorithm's details.
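The widely held (but never confirmed) belief is that the toolbar's 0–10 number was a logarithmic bucketing of that finer-grained internal float. Google never published the mapping, so the base, threshold, and function below are pure assumptions for illustration:

```python
import math

# Hedged sketch of the common SEO belief that Toolbar PageRank was a
# logarithmic bucket of an internal floating-point score. The base (6) and
# floor value are assumptions; the real mapping was never disclosed.
def toolbar_score(internal_pr, base=6.0, floor=1e-6):
    """Map a hypothetical internal PageRank float onto the 0-10 toolbar scale."""
    if internal_pr <= floor:
        return 0
    return min(10, int(math.log(internal_pr / floor, base)))
```

On a log scale like this, each extra toolbar point would require roughly base-times more internal score, which is one reason the little green bar told you so little about a page's actual value.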
For search engines, backlinks help to determine a page’s importance and value (i.e. authority). Historically, the sheer quantity of backlinks was an indicator of a page’s popularity. Today, because backlinks are evaluated against a range of industry-related ranking factors, the emphasis has shifted from quantity to the quality of the sites the links come from.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.