Why do so many people spend so much time researching SEO and page rank? It's really not that hard to figure out (I am speaking in a nice tone, by the way =) – all you should need to focus on is advertising and building your website in a manner that is ethical, operational, and practical for the content and industry that your website is in/about. If you are up to something, Google will know it, and they will rank you accordingly. If you spend so much time trying to figure out how to get to the top, I bet Google spends triple that time figuring out how you're trying to get to the top. So on and so forth… and you're not going to win. Have good content, not copied; stay away from too many outbound links, especially affiliates; post your backlinks at places that have something to do with your site; etc., etc. Is it an American thing? I don't seem to see it as bad in other places of the world – that is, "always trying to figure out an easy way, a quick fix, a way to not have to put in the effort…" Anyway… thanks for letting me vent. Please, no nasty replies. Keep it to yourself = )
As mentioned above, the two versions of the algorithm do not differ fundamentally from each other. A PageRank calculated using the second version of the algorithm has to be multiplied by the total number of web pages to get the corresponding PageRank that would have been calculated using the first version. Even Page and Brin mixed up the two algorithm versions in their most popular paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they claim that the first version of the algorithm forms a probability distribution over web pages, with the sum of all pages' PageRanks being one.
Digital marketing is probably the fastest-changing marketing field out there: new tools are being built, more platforms emerge, and more channels need to be included in your marketing plan. How do you stay on top of the latest marketing trends without getting overwhelmed? Here are a few tools that help you scale and automate parts of your marketing routine, making you a more productive and empowered marketer:

Tools to Semi-Automate Marketing Tasks

1.
The name "PageRank" plays off of the name of developer Larry Page, as well as of the concept of a web page.[15] The word is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; it sold the shares in 2005 for $336 million.[16][17]
Suppose instead that page B had a link to pages C and A, page C had a link to page A, and page D had links to all three pages. Thus, upon the first iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Page C would transfer all of its existing value, 0.25, to the only page it links to, A. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A. At the completion of this iteration, page A will have a PageRank of approximately 0.458.
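The arithmetic of that iteration can be checked with a short Python sketch (the page names and the 0.25 starting values are taken from the example above; no damping factor is applied at this step):

```python
# One PageRank iteration for the example above.
# Links: B -> {A, C}, C -> {A}, D -> {A, B, C}; every page starts at 0.25.
pr = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}

# Each page splits its current value evenly among its outbound links,
# so A receives half of B's value, all of C's, and a third of D's.
pr_a = pr["B"] / 2 + pr["C"] / 1 + pr["D"] / 3
print(round(pr_a, 3))  # approximately 0.458
```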
Two weeks ago I changed a few internal anchor-text links to an HTML SELECT element in order to save some space in the menu bar. Today, when I viewed the Google cache (text version) of my site, I realized that none of the links in the HTML SELECT element can be followed. So I understand that Googlebot doesn't follow these links, and obviously there's no inbound 'link juice'. Is that so?
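For context, a menu built from a SELECT element keeps its targets in option values rather than anchor tags, so navigation happens via JavaScript and there is no href for a crawler to treat as a link. A generic illustration (not the commenter's actual menu):

```html
<!-- URLs in option values are only reachable via JavaScript: -->
<select onchange="window.location = this.value;">
  <option value="/about.html">About</option>
  <option value="/contact.html">Contact</option>
</select>

<!-- Plain anchor tags, by contrast, are ordinary crawlable links: -->
<a href="/about.html">About</a>
<a href="/contact.html">Contact</a>
```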
Also, given that the original reason for implementing the 'nofollow' tag was to reduce comment spam (something it really hasn't had a great effect in combating), the real question I have is: why did they ever take any notice of nofollow on internal links in the first place? It seems to me that in this case they made a rod for their own back.
Brian, just wanted to start off by saying great, informative article – you had a lot of great insight. I see it was mentioned a bit in the comments above, about the infographic, but I think it is a great idea to include a textbox under the infographic with the code that could be copied and pasted onto blogs (thus earning additional backlinks from other websites). I've also noticed many infographics that have "resources" or "references" included in the image. My understanding is that this is currently not recognized by Google because of the image format, but I foresee that one day Google may update its algorithm to recognize written text inside an image, thus potentially adding value to the written text in the image. What are your thoughts on that idea?

The green ratings bars are a measure of Google’s assessment of the importance of a web page, as determined by Google’s patented PageRank technology and other factors. These PageRank bars tell you at a glance whether other people on the web consider a page to be a high-quality site worth checking out. Google itself does not evaluate or endorse websites. Rather, we measure what others on the web feel is important enough to deserve a link. And because Google does not accept payment for placement within our results, the information you see when you conduct a search is based on totally objective criteria.
All major crawler-based search engines leverage links from across the web, but none of them report a static “importance” score in the way Google does via its Google Toolbar. That score, while a great resource for surfers, has also provided one of the few windows into how Google ranks web pages. Some webmasters, desperate to get inside Google, keep flying into that window like confused birds, smacking their heads and losing their orientation…
On a blog, the page rank should go to the main article pages. Now it just gets “evaporated” if you use “nofollow”, or scattered to all the far-flung nooks and crannies, which means Google will not be able to see the wood for the trees. The vast majority of a site’s overall page rank will now reside in the long tail of useless pages, such as commenters’ profile pages. This can only make it harder for Google to serve up the most relevant pages.
TrustRank takes a website’s foundational backlinks into consideration. Search engines find reliable, trustworthy sites more quickly and place them at the top of the SERP. All doubtful websites end up somewhere at the end of the ranking, if you ever decide to look at what is there. As a rule, people take information from the first links and stop searching if they have found nothing in the first 20 top sites. Surely, your website may have the required information, service, or goods, but because of a lack of authority, Internet users will not find it unless you have good foundational backlinks. What are the backlinks we call foundational? These are all branded and non-optimized backlinks on authority websites.
I am not worried by this; I do agree with Danny Sullivan (great comment, Danny – best comment I have read in a long time). I will not be changing much on my site re: linking, but it is interesting to see that Google took over a year to tell us about the change, yet was really happy to tell us about rel=”nofollow” in the first place and advised us all to use it.
The formula uses a model of a random surfer who gets bored after several clicks and switches to a random page. The PageRank value of a page reflects the chance that the random surfer will land on that page by clicking on a link. It can be understood as a Markov chain in which the states are pages, and the transitions, which are all equally probable, are the links between pages.
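A minimal sketch of that random-surfer computation by power iteration; the three-page link graph and the damping factor of 0.85 (the value commonly cited in the PageRank literature) are illustrative assumptions, not taken from the text:

```python
# Random-surfer PageRank by power iteration (probability version:
# the values form a distribution summing to 1).
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.85                         # probability the surfer follows a link
n = len(links)
pr = {p: 1.0 / n for p in links}  # start from the uniform distribution

for _ in range(100):
    new = {}
    for page in links:
        # Sum the shares passed by every page that links to `page`.
        incoming = sum(pr[q] / len(links[q]) for q in links if page in links[q])
        # With probability (1 - d) the surfer jumps to a random page.
        new[page] = (1 - d) / n + d * incoming
    pr = new

print({p: round(v, 3) for p, v in pr.items()})
```

Page C, with two inbound links, ends up with the largest share, matching the intuition that more in-links mean a higher chance of the surfer landing there.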
Baseline ranking assessment. You need to understand where you are now in order to accurately assess your future rankings. Keep a simple Excel sheet to start the process. Check weekly to begin. As you get more comfortable, check every 30 to 45 days. You should see improvements in website traffic, a key indicator of progress for your keywords. Some optimizers will say that rankings are dead. Yes, traffic and conversions are more important, but we use rankings as an indicator.
nofollow is beyond a joke now. There is so much confusion (especially when other engines’ treatment is factored in), I don’t know how you expect a regular publisher to keep up. The expectation seems to have shifted from “Do it for humans and all else will follow” to “Hang on our every word, do what we say, and if we change our minds then change everything” – and nofollow led the way. I could give other examples of this attitude (e.g. “We don’t follow JavaScript links so it’s ‘safe’ to use those for paid links”), but nofollow is surely the worst.
Great article and writing in general. My company just published a 5,000 word Keyword targeting best practices guide for PPC and SEO, and we linked to your article “10 Reasons You Should Use Google Trends for More Than Just Keyword Research”. http://vabulous.com/keyword-research-targeting-for-ppc-and-seo-guide/ I would love if you checked it out and possibly shared it if you like it.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
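For illustration, a minimal robots.txt of the kind such a generator produces (the disallowed path is a placeholder):

```
# Served at https://example.com/robots.txt
User-agent: *
Disallow: /private/
```

As the passage notes, a subdomain such as blog.example.com would need its own copy of the file at the root of that subdomain; the file above only applies to example.com.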

Back in the ’90s, two students at Stanford named Larry Page and Sergey Brin started pondering how they could make a better search engine that didn’t get fooled by keyword stuffing. They realized that if you could measure each website’s popularity (and then cross index that with what the website was about), you could build a much more useful search engine. In 1998, they published a scientific paper in which they introduced the concept of “PageRank.” This topic was further explored in another paper that Brin and Page contributed to, “PageRank Citation Ranking: Bringing Order to the Web.”
You should fix all errors that can hurt users’ experience. By hurting user experience, you endanger the organic growth of your traffic, because Google will surely limit it. Do this task thoroughly and don’t rush; otherwise, you might find that your backlinks don’t work. Be responsible for each decision and action. Search engine optimization (SEO) works better when the technical optimization of your site meets the standards.
This is the argument that quickly emerged about blog comments recently. Say I have an article on a blog with 5 links in the editorial copy — some of those links leading back to other content within the blog that I hope to do well. Then I get 35 comments on the article, with each comment having a link back to the commenters’ sites. That’s 40 links in all. Let’s say this particular page has $20 in PageRank to spend. Each link gets 50 cents.
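The division in that analogy, and what happens under the old behavior if the 35 comment links are nofollowed (the pre-change assumption being that nofollowed links drop out of the count entirely), can be spelled out:

```python
page_rank_budget = 20.0  # dollars of "PageRank to spend", per the analogy
editorial_links, comment_links = 5, 35

# All 40 links counted: each gets an equal share.
share_all = page_rank_budget / (editorial_links + comment_links)

# Old behavior with comments nofollowed: only the 5 editorial links count.
share_editorial_only = page_rank_budget / editorial_links

print(share_all, share_editorial_only)  # 0.5 and 4.0
```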
Great post. I’m posting a link back to this article from our blog along with some comments. I do have a question. In your article, you post “The only place I deliberately add a nofollow is on the link to my feed, because it’s not super-helpful to have RSS/Atom feeds in web search results.” Yet when I look at this article, I noticed that the comment links are “external, nofollow”. Is there a reason for that?
Google's core algorithms, and its propensity to shroud its data in layers of obscurity, are nothing new. However, they are critical to any understanding of marketing on the internet, simply because this visibility is at the heart of everything else that you do. Forget about social media and other forms of marketing for the time being. Search engine optimization (SEO) offers the proverbial key to near-limitless amounts of traffic on the web.
I’ve never been particularly enamoured with nofollow, mainly because it breaks the “do it for humans” rule in a way that other robots standards do not. With other standards (e.g. robots.txt, robots meta tag), the emphasis has been on crawling and indexing; not ranking. And those other standards also strike a balance between what’s good for the publisher and what’s good for the search engine; whereas with nofollow, the effort has been placed on the publisher with most of the benefit enjoyed by the search engine.
Thanks for sharing this, Matt. I’m happy that you took the time to do so, considering that you don’t have to.

What I mean is: in an ideal world, there would be no such thing as SEO. It is the search engine’s job to bring the right users to the right sites, and it is the job of webmasters to cater to the needs of the users brought to their sites by search engines. Webmasters should not be concerned with bringing the users in themselves (aside from offsite or sponsored marketing campaigns). The moment they are, things start to get ugly, because search engines now have to implement counter-measures to most SEO tactics. This becomes an unending spiral.

If people only stuck to their part of the equation, search engines would have more time to develop algorithms for making sure webmasters get relevant users, rather than algorithms for combating SEOs to ensure search users get relevant results. Just do your best in providing valuable content, and Google will try their best in matching you with your users. Don’t waste time trying to second-guess how Google does it so that you can present yourself to Google as having better value than you really have. They have great engineers and they have the code – you only have a guess. At most, the SEO anyone should be doing is to follow the webmaster guidelines. It will benefit all.
Online networking, when executed correctly, allows you to build valuable relationships in online forums and groups that can help you advance your business. You could meet peers and fellow experts with whom you could collaborate or partner up with for a project, or you could provide value to your target audience by sharing your knowledge and winning over some customers as a result. No matter what, though, the goal with this type of marketing is purely relationship building and not selling outright.

NOTE: You may be curious what your site’s or your competitor’s PR score is. But Google no longer reveals the PageRank score for websites. It used to display at the top of web browsers right in the Google Toolbar, but no more. And PR data is no longer available to developers through APIs, either. Even though it’s now hidden from public view, however, PageRank remains an important ingredient in Google’s secret ranking algorithms.


5. Link building. In some respects, guest posting – one popular tactic to build links, among many other benefits – is just content marketing applied to external publishers. The goal is to create content on external websites, building your personal brand and company brand at the same time, and creating opportunities to link back to your site. There are only a handful of strategies to build quality links, which you should learn and understand as well.
“With 150 million pages, the Web had 1.7 billion edges (links).” Kevin Heisler, that ratio holds true pretty well as the web gets bigger. A good rule of thumb is that the number of links is about 10x the number of pages. I agree that it’s pretty tragic that Rajeev Motwani was a co-author of many of those early papers. I got to talk to Rajeev a little bit at Google, and he was a truly decent and generous man. What has heartened me is to see all the people that he helped, and to see those people pay their respects online. No worries on the Consumer WebWatch–I’m a big fan of Consumer WebWatch, and somehow I just missed their blog. I just want to reiterate that even though this feels like a huge change to a certain segment of SEOs, in practical terms this change really doesn’t affect rankings very much at all.
If you build a new site and only used Domain Authority to create links, you could EASILY have got linked from the worst page possible, even if it was from the best domain, because of the INTERNAL LINKS of the other web pages! How on earth are you going to be able to see the strength of a link if that strength depends on the internal links on an entirely different website?!
where N is the total number of all pages on the web. The second version of the algorithm, indeed, does not differ fundamentally from the first one. Regarding the Random Surfer Model, the second version's PageRank of a page is the actual probability for a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.
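The relationship between the two versions can be illustrated with a toy check (N and the values below are invented for illustration): second-version PageRanks sum to 1, and multiplying each by N recovers first-version values, which sum to N.

```python
# Relation between the two PageRank versions described above.
N = 4  # total number of pages (toy value)

# Probability version: PageRanks form a distribution summing to 1.
pr_v2 = [0.4, 0.3, 0.2, 0.1]

# First version: multiply each value by N; the sum becomes N.
pr_v1 = [p * N for p in pr_v2]
print(round(sum(pr_v2), 6), round(sum(pr_v1), 6))  # ~1.0 and ~4.0
```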
Deliver value no matter what: Regardless of who you are and what you're trying to promote, always deliver value, first and foremost. Go out of your way to help others by carefully curating information that will assist them in their journey. The more you focus on delivering value, the quicker you'll reach that proverbial tipping point when it comes to exploding your fans or followers.
Google's core algorithms and its propensity to shroud its data in layers of obscurity is not something new. However, it is critical to any understanding of marketing on the internet simply because this visibility is at the heart of everything else that you do. Forget about social media and other forms of marketing for the time being. Search engine optimization (SEO) offers up the proverbial key to near-limitless amounts of traffic on the web.
A backlink’s value doesn’t come only from the authority of the website itself. There are other factors to consider as well. You’ll sometimes hear those in the industry refer to “dofollow” and “nofollow” links. This goes back to the unethical link-building tactics of the early days of SEO. One practice involved commenting on blogs and leaving a link. It was an easy method, and back then, search engines couldn’t tell the difference between a blog post and other site content.

If you’re Matt Cutts and a billion people link to you because you’re the Spam guy at Google, writing great content is enough. For the rest of us in hypercompetitive markets, good content alone is not enough. There was nothing wrong with sculpting page rank to pages on your site that make you money as a means of boosting traffic to those pages. It’s not manipulating Google, there’s more than enough of that going on in the first page of results for most competitive keywords. Geez Matt, give the little guy a break!


Before I start this, I am using the term ‘PageRank’ as a general term, fully knowing that this is not a simple issue and that ‘PageRank’ and the way it is calculated (and the other numerous methods Google uses) are multidimensional and complex. However, if you use PageRank to imply ‘weight’, it makes things a lot simpler. Also, ‘PageRank sculpting’ (in my view) is meant to mean ‘passing weight you can control’. Now… on with the comment!

Submit website to directories (limited use). Professional search marketers don’t submit the URL to the major search engines, but it’s possible to do so. A better and faster way is to get links back to your site naturally. Links get your site indexed by the search engines. However, you should submit your URL to directories such as Yahoo! (paid), Business.com (paid) and DMOZ (free). Some may choose to include AdSense (google.com/adsense) scripts on a new site to get the Google Media bot to visit. It will likely get your pages indexed quickly.
Heading tags. Always use H tags to optimize your content layout. Try and use variations on your keyphrases in some headings, too. Don’t repeat keyphrases in headings unless it’s absolutely necessary. (This doesn’t stop you from needing to repeat the keyphrase in the body of your content). H tags are HTML codes – you can find a link to HTML codes and how to use them at the end of this section.
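A sketch of that heading layout (the topic and keyphrases are invented for illustration):

```html
<h1>Organic Coffee Beans: A Buyer's Guide</h1>
<h2>How Organic Coffee Is Certified</h2>
<h2>Choosing an Organic Roast</h2>
<h3>Light vs. Dark Organic Roasts</h3>
```

Note how the keyphrase varies across headings ("organic coffee beans", "organic coffee", "organic roast") rather than repeating verbatim.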
1. Apparently, external linking of any kind bleeds PR from the page. Following or nofollowing becomes a function of whether you want that lost PR to benefit the other site. Since nofollow has ceased to provide the benefit of retaining pagerank, the only reason to use it at all is Google Might Think This Link Is Paid. Conclusion: Google is disincentivizing external links of any kind.
We can’t know the exact details of the scale because, as we’ll see later, the maximum PR of all pages on the web changes every month when Google does its re-indexing! If we presume the scale is logarithmic (although there is only anecdotal evidence for this at the time of writing) then Google could simply give the highest actual PR page a toolbar PR of 10 and scale the rest appropriately.
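If the scale really is logarithmic, as the passage speculates, the mapping could be sketched like this; the base of 10 and the anchoring of the top page at toolbar PR 10 are assumptions matching the text's own caveat, not known details:

```python
import math

def toolbar_pr(pr, max_pr):
    """Map an actual PageRank onto the 0-10 toolbar scale, assuming a
    base-10 logarithmic scale anchored so the top page shows 10."""
    if pr <= 0:
        return 0
    orders_below = math.log10(max_pr / pr)  # factors of 10 below the top page
    return max(0, 10 - math.floor(orders_below))

print(toolbar_pr(1_000_000, 1_000_000))  # top page -> 10
print(toolbar_pr(1_000, 1_000_000))      # three orders of magnitude down -> 7
```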

You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
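In markup, that comes down to a single rel attribute on the anchor (the URL here is a placeholder):

```html
<!-- Link to the offending site for reference, but tell search
     engines not to pass any reputation through the link: -->
<p>This comment spam came from
  <a href="http://example.com/spam-site" rel="nofollow">this site</a>.
</p>
```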


Backlinks occur across the Internet when one website mentions another website and links to it. Also referred to as “incoming links,” backlinks make their connection through external websites. These links from outside domains point to pages on your own domain. Whenever a backlink occurs, it is like receiving a vote for a webpage. The more votes you get from authoritative sites, the more positive the effect on a site’s ranking and search visibility.
But if you do it properly, it can be worth your money. Also, press releases can be much more than just a block of text. In December 2018, we ran a press release through Business Wire that had multiple backlinks, stylized callouts, and even a video! If you put effort into them, press releases can serve not just as a source of backlinks but also as a great marketing piece.