Just a related note in passing: on October 6, 2013, Matt Cutts (Google’s head of search spam) said the Google Toolbar PageRank won’t see an update before 2014. He also published this helpful video that talks in more depth about how he (and Google) define PageRank, and how your site’s internal linking structure (i.e., your siloing structure) can directly affect PageRank transfer. Here’s a link to the video: http://youtu.be/M7glS_ehpGY.
Smartphone - In this document, “mobile” or “mobile devices” refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
The numbers didn’t quite sit right with me because there didn’t seem to be enough juicy inbound links to the winning page. Then I noticed that two key links were missing from the 10-node chart with the PageRank metrics on it when compared to the previous chart without the metrics. The two missing links are the two coming from node 2 to node 1. Suddenly it all made sense again, and it was obvious why that page won.
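For anyone who wants to sanity-check the numbers on charts like these, a short power-iteration sketch follows. It is a minimal illustration, not the article’s actual 10-node chart: the 3-node example graph and the damping factor d=0.85 are assumptions on my part.

# Minimal PageRank power-iteration sketch (Python), handy for
# checking the metrics printed on small node charts by hand.
# The example graph and damping factor d=0.85 are assumptions.
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    pr = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_pr = {node: (1.0 - d) / n for node in nodes}
        for node, outlinks in links.items():
            if not outlinks:  # dangling node: spread its rank evenly
                for target in nodes:
                    new_pr[target] += d * pr[node] / n
            else:
                share = d * pr[node] / len(outlinks)
                for target in outlinks:
                    new_pr[target] += share
        pr = new_pr
    return pr

# Node 2 links to node 1 twice; counting (or dropping) duplicate links
# is exactly the kind of detail that changes which page wins.
graph = {1: [2], 2: [1, 1, 3], 3: [1]}
print(pagerank(graph))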

Well, to make things worse, website owners quickly realized they could exploit this weakness by resorting to “keyword stuffing,” a practice that simply involved creating pages packed with massive lists of keywords and making money off the ad revenue they generated. This made search engines largely worthless and weakened the usefulness of the Internet as a whole. How could this problem be fixed?
I personally nofollow links to my privacy policy and contact form. Even though these are excluded in robots.txt, I prefer that extra layer of protection so that the pages are not indexed. Anyone who has ever had their contact form blasted continuously by spammers knows what I mean. And yes, one could add the noindex meta tag. But let’s face it, not everyone is a skilled PHP programmer. On dynamic sites it’s not as simple as adding a meta tag…
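For what it’s worth, on a dynamic site you can often skip the templates entirely and send the noindex directive as an HTTP response header instead of a meta tag. Here is a minimal sketch assuming a Python/Flask app; the framework choice and the /contact route are illustrative, not from the comment above.

# Sketch: sending "noindex, nofollow" via the X-Robots-Tag HTTP header,
# which search engines treat like the equivalent robots meta tag.
# Flask and the /contact route are illustrative assumptions.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/contact")
def contact():
    response = make_response("Contact form page")
    # Equivalent to <meta name="robots" content="noindex, nofollow">
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

if __name__ == "__main__":
    app.run()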
I think Matt Grenville’s comment is a very valid one. If your site, for whatever reason, cannot attract links naturally, and all of your competitors are outranking you by employing tactics that might breach Google’s TOS, what other options do you have? On top of this, people will now only link to a few trusted sites (since your post clarifies that this is part of Google’s algorithm), putting a limit on linking out to the smaller guys.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and to track the index status of their web pages.

Matteo Pasquinelli[44] reckons the basis for the belief that PageRank has a social component lies in the idea of the attention economy. In an attention economy, value is placed on products that receive a greater amount of human attention, and the results at the top of the PageRank ordering garner more focus than those on subsequent pages. The outcomes with the higher PageRank will therefore enter human consciousness to a larger extent. These ideas can influence decision-making, and the actions of the viewer have a direct relation to PageRank. Highly ranked pages possess a greater potential to attract a user's attention, as their location increases the attention economy attached to the site. With this location they can receive more traffic, and their online marketplace will see more purchases. The PageRank of these sites allows them to be trusted, and they are able to parlay this trust into increased business.


I agree that the more facts you provide (and certainly if you were to provide the complete algorithm), the more people would abuse it. But if it were available to everyone, wouldn't it almost force people to implement better site-building and navigation policies and white-hat SEO, simply because everyone would have the same tools to work with and an absolute standard to adhere to?
A Web crawler may use PageRank as one of a number of importance metrics it uses to determine which URL to visit during a crawl of the web. One of the early working papers[56] that was used in the creation of Google is "Efficient Crawling Through URL Ordering",[57] which discusses the use of a number of different importance metrics to determine how deeply, and how much of a site, Google will crawl. PageRank is presented as one of these importance metrics, though there are others, such as the number of inbound and outbound links for a URL and the distance from the root directory of a site to the URL.
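To make the idea concrete, here is a small sketch of importance-ordered crawling in Python. The scoring weights and the frontier data are illustrative assumptions; the paper itself evaluates several orderings rather than prescribing one formula.

# Sketch of importance-ordered crawling: pull URLs from a priority
# queue scored by a PageRank estimate, inbound-link count, and
# distance from the site root. Weights are illustrative assumptions.
import heapq
from urllib.parse import urlparse

def importance(url, inbound_links, pagerank_estimate):
    # Shallower URLs (closer to the root directory) score higher.
    depth = len([part for part in urlparse(url).path.split("/") if part])
    return 0.6 * pagerank_estimate + 0.3 * inbound_links - 0.1 * depth

def crawl_order(frontier):
    """frontier: iterable of (url, inbound_link_count, pagerank_estimate)."""
    heap = [(-importance(u, links, pr), u) for u, links, pr in frontier]
    heapq.heapify(heap)
    while heap:
        _, url = heapq.heappop(heap)
        yield url  # a real crawler would fetch here and push new URLs

frontier = [
    ("https://example.com/", 120, 0.9),
    ("https://example.com/a/b/c/page.html", 3, 0.1),
    ("https://example.com/popular-post", 45, 0.5),
]
print(list(crawl_order(frontier)))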
As they noted in their paper, pages stuffed full of useless keywords “often wash out any results that a user is interested in.” While we often complain when we run into spammy pages today, the issue was far worse then. In their paper they state that, “as of November 1997, only one of the top four commercial search engines finds itself (returns its own search page in response to its name in the top ten results).” That’s incredibly difficult to imagine happening now. Imagine searching for the word “Google” in a search engine and not having it pull up www.google.com on the first page of results. And yet, that’s how bad it was 20 years ago.
PageRank is only a score that represents the importance of a page, as Google estimates it. (By the way, that estimate of importance is considered to be Google’s opinion and protected in the US by the First Amendment. When Google was once sued over altering PageRank scores for some sites, a US court ruled: “PageRanks are opinions — opinions of the significance of particular Web sites as they correspond to a search query….the court concludes Google’s PageRanks are entitled to full constitutional protection.”)
PageRank was influenced by citation analysis, developed early on by Eugene Garfield in the 1950s at the University of Pennsylvania, and by Hyper Search, developed by Massimo Marchiori at the University of Padua. In the same year PageRank was introduced (1998), Jon Kleinberg published his work on HITS. Google's founders cite Garfield, Marchiori, and Kleinberg in their original papers.[5][18]
The SEO industry changes at an extreme pace: every year marketers evolve their strategies and shift their focus. However, backlinks remain just as crucial a strategy as when they were first used. Today, “backlinks” is a very common phrase in the world of SEO, and if you are involved in the industry, you know backlinks are vital to a website’s performance.
This PageRank theme is getting understood in simplistic ways; people (SEOs, that is) are still worrying about PageRank all the time. I just use common sense: if I were the designer of a search engine, besides using the regular structural analysis, I would use artificial intelligence to determine many factors of the analysis. I think this is not just a matter of dividing by 10; it is far more complex. I might be wrong, but I believe the use of the nofollow attribute is no longer a final decision of the website owner; it is more like an option given to the bot, which can either accept or reject the link as a valid vote. Perhaps regular links are not the final decision of the webmaster either. I think Google is seeing websites the way a human would; the pages are not analyzed the way a parser would analyze them. I believe it is more like a neural network, a bit more complex. I believe this change makes little difference. People should stop worrying about PageRank and start building good content; the algorithm is far too complex to determine what the next step is to reach the top ten at Google. However, nothing is impossible.

Thanks Matt for the informative post. However, I do have some questions regarding blog comments. Let’s say a blog post of mine has PR 10 and the page has 10 links: 3 of them are internal links to my other related posts, and the other 7 are external links from blog comments. Based on your explanation, even if the 7 external links are nofollow, my 3 internal links will only get 1 PR each, which is the same as if the 7 external links were dofollow. Therefore there is no point in adding nofollow for the sake of keeping the PR flowing within your own links. Is this correct?
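For what it’s worth, the arithmetic in that scenario can be written out directly. This sketch assumes the post-2009 behavior described in the post: PageRank is divided by the total number of links on the page, and the shares pointed at nofollow links simply evaporate rather than being redistributed.

# The commenter's scenario, assuming the post-2009 behavior: PR is
# split across ALL 10 links, and nofollow shares evaporate.
page_pr = 10
total_links = 10
internal_followed = 3
external_nofollow = 7

pr_per_link = page_pr / total_links              # 1.0 per link
internal_flow = internal_followed * pr_per_link  # 3.0 kept internally
evaporated = external_nofollow * pr_per_link     # 7.0 lost outright

print(pr_per_link, internal_flow, evaporated)

# Under the pre-2009 behavior, nofollow links were excluded from the
# divisor, so each internal link would have received 10 / 3 ≈ 3.33.

So, under the post-2009 rules as described, nofollowing the comment links does not increase what the three internal links receive.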

By using the Facebook tracking pixel or the AdWords pixel, you can help to define your audience and work to entice them to come back to your site. Let's say they didn't finish their purchase, or they simply showed up and left after adding something to their shopping cart, or they filled out a lead form and disappeared: you can re-target those individuals.
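To make the mechanism less abstract, here is a minimal first-party sketch of what a tracking pixel is under the hood: a tiny image whose request logs the visitor and event. The Flask endpoint and log format are illustrative assumptions; the real Facebook and AdWords pixels are JavaScript snippets that report to their own servers and build the audience for you.

# Sketch of a tracking-pixel endpoint: a 1x1 transparent GIF whose
# request records the visitor and event for later re-targeting.
# Flask, the route, and the log format are illustrative assumptions.
import base64
import time
from flask import Flask, request, make_response

app = Flask(__name__)

# The classic 1x1 transparent GIF
PIXEL = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

@app.route("/pixel.gif")
def pixel():
    # e.g. embedded as <img src="/pixel.gif?event=add_to_cart">
    print(time.time(), request.args.get("event"), request.remote_addr)
    response = make_response(PIXEL)
    response.headers["Content-Type"] = "image/gif"
    return response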
Positioning of a webpage on Google SERPs for a keyword depends on relevance and reputation, also known as authority and popularity. PageRank is Google's indication of its assessment of the reputation of a webpage: it is not keyword-specific. Google uses a combination of webpage and website authority to determine the overall authority of a webpage competing for a keyword.[36] The PageRank of the home page of a website is the best indication Google offers for website authority.[37]
The SEO starter guide describes much of what your SEO will do for you. Although you don't need to know this guide well yourself if you're hiring a professional to do the work for you, it is useful to be familiar with these techniques, so that you can be aware if an SEO wants to use a technique that is not recommended or, worse, strongly discouraged.