For example, what are the quality and quantity of the links that have been created over time? Are they natural and organic links stemming from relevant and high quality content, or are they spammy links, unnatural links or coming from bad link neighborhoods? Are all the links coming from the same few websites over time or is there a healthy amount of global IP diversification in the links?
Matt, you don’t mention the use of disallow pages via robots.txt. I’ve read that PageRank can be better utilised by disallowing pages that probably don’t add value to users searching on engines. For example, Privacy Policy and Terms of Use pages. These often appear in the footer of a website and are required by EU law on every page of the site. Will it boost the other pages of the site if these pages are added to robots.txt like so?
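For illustration, a minimal robots.txt along the lines the commenter describes might look like the following. The paths here are hypothetical, and note that Disallow blocks crawling rather than guaranteeing de-indexing; a page blocked this way also cannot pass PageRank onward.

```
# hypothetical robots.txt blocking boilerplate pages from crawling
User-agent: *
Disallow: /privacy-policy
Disallow: /terms-of-use
```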
Do you regularly publish helpful, useful articles, videos or other types of media that are popular and well produced? Do you write for actual human beings rather than for the search engine itself? Well, you should. The latest research from Searchmetrics on ranking factors indicates that Google is moving further toward longer-form content that addresses a visitor's intent as a whole, instead of content built around keywords from popular search queries.
i.e., the PageRank value for a page u is dependent on the PageRank values for each page v contained in the set Bu (the set containing all pages linking to page u), divided by the number L(v) of links from page v. The algorithm also involves a damping factor in the calculation of the PageRank. It is like the income tax which the government extracts from you despite paying you itself.
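That relationship translates directly into code. A minimal sketch of the single-page update, using the original paper's non-normalized form with damping factor d (the names here are illustrative, not from any particular library):

```python
def pagerank_of(u, links_to, out_degree, pr, d=0.85):
    """One PageRank update for page u.

    links_to[u] is B_u, the set of pages linking to u;
    out_degree[v] is L(v), the number of links from page v;
    pr holds the current PageRank values. Computes:
        PR(u) = (1 - d) + d * sum(PR(v) / L(v) for v in B_u)
    """
    return (1 - d) + d * sum(pr[v] / out_degree[v] for v in links_to[u])
```

For example, a page linked by two pages that each have PR 1.0 and a single outbound link would receive 0.15 + 0.85 * 2 = 1.85 under this update.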

SEO is a marketing discipline focused on growing visibility in organic (non-paid) search engine results. SEO encompasses both the technical and creative elements required to improve rankings, drive traffic, and increase awareness in search engines. There are many aspects to SEO, from the words on your page to the way other sites link to you on the web. Sometimes SEO is simply a matter of making sure your site is structured in a way that search engines understand.
The PageRank algorithm outputs a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called "iterations", through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.
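The whole process described above can be sketched in a few lines: start every page at 1/N, then repeat the update until the values stop changing. This is a simplified illustration that assumes every page has at least one outbound link (real implementations must also handle dangling pages):

```python
def pagerank(links, d=0.85, tol=1e-10):
    """Iterative PageRank over a dict {page: [pages it links to]}.

    Starts from the uniform distribution 1/N and repeats the
    update ("iterations") until values converge. Uses the
    probability-normalized form, so the results sum to 1.
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}   # even initial split
    while True:
        new = {}
        for p in pages:
            incoming = sum(pr[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * incoming
        if max(abs(new[p] - pr[p]) for p in pages) < tol:
            return new
        pr = new
```

On a toy web where A and B link to each other and C links only to A, the result sums to 1 and ranks A above B, with the unlinked C far behind.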
Check your robots.txt file. Make sure you learn how to hide content you don’t want indexed from search engines, and that search engines can find the content you do want indexed, too. (You will want to hide things such as duplicate content, which can be penalized by search engines but may still be necessary on your site.) You’ll find a link to how to modify the robots.txt at the end of this article.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam39, for example by using CAPTCHAs and turning on comment moderation.
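In practice, "nofollowing" a user-submitted link just means adding the rel attribute to the anchor tag; the URL below is a placeholder:

```html
<!-- a commenter's link, marked so it passes no editorial endorsement -->
<a href="http://example.com/" rel="nofollow">commenter's site</a>
```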
This year, for the first time, Google stated that user experience would be a core part of gaining rankings for mobile websites. A poorer user experience would send your site hurtling down the rankings. This appeared to come as a shock to many in the SEO community and despite assurances that content was still king – many seemed to feel that this ...

The green ratings bars are a measure of the importance of a web page, as determined by Google’s patented PageRank technology and other factors. These PageRank bars tell you at a glance whether other people on the web consider a page to be worth checking out. Google itself does not evaluate or endorse websites. Rather, we measure what others on the web feel is important enough to deserve a link. And because Google does not accept payment for placement within our results, the information you see when you conduct a search is based on totally objective criteria.
I think that removing the link to the sitemap shouldn’t be a big problem for the navigation, but I wonder what happens with the disclaimer and the contact page? If nofollow doesn’t sink the linked page, how can we tell the search engine that these are not content pages? For some websites these are among the most linked pages. And yes, for some sites the contact page is worth gaining rank, but for my website it is not.
I think it is important you distinguish your advice about no-following INTERNAL links and no-following EXTERNAL links for user-generated content. Most popular UGC-heavy sites have no-followed links as they can’t possibly police them editorially & want to give some indication to the search engines that the links haven’t been editorially approved, but still might provide some user benefit.
It's clear that online marketing is no simple task. And the reason why we've landed in this world of "expert" internet marketers who are constantly cheerleading their offers to help us reach visibility and penetrate the masses is the layer of obscurity that's been afforded to us in part thanks to one key player: Google. Google's shrouded algorithms, which cloud 200+ ranking factors behind a simple and easy-to-use interface, have confounded businesses for well over a decade now.
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
When Site A links to your web page, Google sees this as Site A endorsing, or casting a vote for, your page. Google takes into consideration all of these link votes (i.e., the website’s link profile) to draw conclusions about the relevance and significance of individual webpages and your website as a whole. This is the basic concept behind PageRank.
Fortunately, Google never gave up on the idea of backlinks; it just got better at qualifying them and utilizing other online signals to distinguish quality links from disreputable tactics. Unethical methods can not only hurt your rankings, but can cause your domain to incur penalties from Google. Yes, your domain can be penalized and can even be removed from Google’s index if the offense is serious enough.

This pagerank theme is getting understood in simplistic ways; people are still fretting about pagerank all the time (I'm talking about SEOs). I just use common sense: if I were the designer of a search engine, besides using the regular structure of analysis, I would use artificial intelligence to determine many factors of the analysis. I think this is not just a matter of dividing by 10; it is far more complex. I might be wrong, but I believe the use of the nofollow attribute is no longer a final decision of the website owner; it is more like an option given to the bot, which can either accept or reject the link as a valid vote. Perhaps regular links are not the final decision of the webmaster either. I think Google is viewing websites the way a human would; the pages are not analyzed the way a parser would analyze them, I believe it is more like a neural network, a bit more complex. I believe this change makes little difference. People should stop worrying about pagerank and start building good content; the algorithm is far too complex to determine the next step to reach the top ten at Google. However, nothing is impossible.


Most online marketers mistakenly attribute 100% of a sale or lead to the last-clicked source. The main reason for this is that analytics solutions only provide last-click analysis. 93% to 95% of marketing touchpoints are ignored when you attribute success only to the last click. That is why multi-touch attribution is required to properly source sales or leads.
Just think about any relationship for a moment. How long you've known a person is incredibly important. It's not the be-all-end-all, but it is fundamental to trust. If you've known someone for years and years and other people that you know who you already trust can vouch for that person, then you're far more likely to trust them, right? But if you've just met someone, and haven't really vetted them so to speak, how can you possibly trust them?

One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.


One of the consequences of the PageRank algorithm and its manipulation has been the situation where backlinks (as well as link-building) have usually been considered black-hat SEO. Thus, not only has Google been combating the consequences of its own child's tricks, but so have mega-sites like Wikipedia, The Next Web, Forbes, and many others, which automatically nofollow all outgoing links. That means fewer and fewer PageRank votes. What, then, is going to help search engines rank pages in terms of their safety and relevance?
The Google algorithm's most important feature is arguably the PageRank system, a patented automated process that determines where each search result appears on Google's search engine return page. Most users tend to concentrate on the first few search results, so getting a spot at the top of the list usually means more user traffic. So how does Google determine search results standings? Many people have taken a stab at figuring out the exact formula, but Google keeps the official algorithm a secret. What we do know is this:

Email marketing - Email marketing, in comparison to other forms of digital marketing, is considered cheap; it is also a way to rapidly communicate a message, such as a value proposition, to existing or potential customers. Yet this channel of communication may be perceived by recipients as bothersome and irritating, especially to new or potential customers, so the success of email marketing relies on the language and visual appeal applied. In terms of visual appeal, there are indications that using graphics/visuals relevant to the message being sent, but fewer visuals in initial emails, is more effective, creating a relatively personal feel to the email. In terms of language, style is the main factor in determining how captivating the email is; a casual tone gives the email a warmer, gentler and more inviting feel than a formal style. As for combinations, it is suggested that, to maximize effectiveness, no graphics/visuals should be paired with casual language; in contrast, no visual appeal combined with a formal language style is seen as the least effective method.[48]
This must be one of the most controversial attributes ever. I participate in photographic communities. The textual content there is quite sparse, as it is a visual medium, with only basic descriptions. However, the community is very active and the participants leave a lot of meaningful comments. Now, with “nofollow” used everywhere, the photographic community is punishing itself for being active and interactive without knowing it. WordPress and Pixelpost now have “nofollow” built in on almost any list of links (blog-roll, comments etc). The plug-in and theme developers for these platforms followed suit and yes, you’ve guessed it – added “nofollow” to almost every link. So, every time I leave a comment without being an anonymous coward, or if someone likes my blog and links to it in their blog-roll, then I, or they, are diluting the rank of my blog? Does it mean that for my own good I should stop participating in the community? Should I visit the hundreds of blogs I visited in the last three years and ask the owners to remove my comments and remove my site from their blog-rolls to stop my PageRank from free-falling?
We have a saying that “good data” is better than “big data.” Big data is a term being thrown around a lot these days because brands and agencies alike now have the technology to collect more data and intelligence than ever before. But what does that mean for growing a business? Data is worthless without the data scientists analyzing it and creating actionable insights. We help our client partners sift through the data to glean what matters most and what will aid them in attaining their goals.
While there are several platforms for video marketing, YouTube is clearly the most popular. Video marketing is also a great form of both content marketing and SEO in its own right. It can help provide visibility for several different ventures, and if the video’s message and content are valuable enough, it will be shared and liked by droves, pushing the authority of that video through the roof.
Here’s my take on the whole pagerank sculpting situation. As I understand it, the basic idea is that you can increase your rankings in Google by channeling the PageRank of your pages to the pages you want ranked. This used to be done with the ‘nofollow’ attribute. That said, things have changed, and Google has come out and said that the way ‘nofollow’ used to work has changed. In short, using ‘nofollow’ to channel that PageRank juice is no longer as effective as it once was.
If the algorithm really works as Matt suggests, no one should use nofollow links internally. I’ll use the example that Matt gave. Suppose you have a home page with ten PR “points.” You have links to five “searchable” pages that people would like to find (and you’d like to get found!), and links to five dull pages with disclaimers, warranty info, log-in information, etc. But, typically, all of the pages will have links in headers and footers back to the home page and other “searchable” pages. So, by using “nofollow” you lose some of the reflected PR points that you’d get if you didn’t use “nofollow.” I understand that there’s a decay factor, but it still seems that you could be leaking points internally by using “nofollow.”
In the past, the PageRank shown in the Toolbar was easily manipulated. Redirection from one page to another, either via an HTTP 302 response or a "Refresh" meta tag, caused the source page to acquire the PageRank of the destination page. Hence, a new page with PR 0 and no incoming links could have acquired PR 10 by redirecting to the Google home page. This spoofing technique was a known vulnerability. Spoofing can generally be detected by performing a Google search for a source URL; if the URL of an entirely different site is displayed in the results, the latter URL may represent the destination of a redirection.

Of course, it’s possible that the algorithm has some method of discounting internally reflected (and/or directly reciprocal) links (particularly those in identical headers or footers) to such an extent that this isn’t important. Evidence to support this is the fact that many boring pages that are linked to by every page in a good site can have very low PR.
Content is king. It always has been and it always will be. Creating insightful, engaging and unique content should be at the heart of any online marketing strategy. Too often, people simply don't obey this rule. The problem? This takes an extraordinary amount of work. However, anyone that tells you that content isn't important, is not being fully transparent with you. You cannot excel in marketing anything on the internet without having quality content.
Bob Dole (interesting name), you’re certainly welcome to use Bing if you prefer, but before you switch, you might check whether they do similar things. I know that Nate Buggia has strongly recommended not to bother with PageRank sculpting in the past, for example, or at least that was my perception from his comments at the last couple SMX Advanced conferences.
Thanks Matt for the informative post. However, I do have some questions regarding blog comments. Let’s say a blog post of mine has PR 10 and the page has 10 links: 3 of them are internal links to my other related posts, and the other 7 are external links from blog comments. Based on your explanation, even if the 7 external links are nofollow, my 3 internal links will only get 1 PR point each, which is the same as if the 7 external links were dofollow. Therefore there is no point in adding nofollow for the sake of keeping the PR flow within your own links. Is this correct?
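Under the post-2009 behavior discussed here (PageRank is divided across ALL of a page's links before the shares assigned to nofollow links are discarded), the commenter's arithmetic can be checked directly. A rough sketch, treating PR as simple "points" and ignoring damping, as the comment itself does:

```python
def share_per_link(page_pr, total_links):
    """PR share assigned to each outgoing link when a page's PR is
    split across ALL of its links before the nofollow shares are
    dropped (damping ignored, as in the comment's own arithmetic)."""
    return page_pr / total_links

# PR-10 page with 10 total links: each internal link receives
# 1 point whether or not the 7 comment links carry nofollow,
# because the split happens before the nofollow shares evaporate.
internal_share = share_per_link(10, 10)   # 1.0 either way
```

So the commenter's reading matches the described behavior: nofollowing the comment links no longer concentrates PR into the internal links.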
Thanks for the info on nofollow and pagerank. It makes sense that this will always be a moving target, lest everyone eventually game the system until it’s worthless, but at the same time it’s worth it to know a few tricks. I still have open concerns about how freshness of content factors in; the only time I’m ever annoyed by search results these days is when the only links available (on the first page, at least) are articles from four years ago.
Pagerank has recently been used to quantify the scientific impact of researchers. The underlying citation and collaboration networks are used in conjunction with the pagerank algorithm to come up with a ranking system for individual publications, which propagates to individual authors. The new index, known as pagerank-index (Pi), is demonstrated to be fairer than the h-index, given the many drawbacks the h-index exhibits.[63]
Conversion rate optimization is still possibly one of the most underutilized but critical functions of digital marketing. Every element of digital marketing is useless without considering conversion rates. This goes for SEO, SEM, social media, email, and display. The power of your SEO rankings is only as good as your click-through rates, and your traffic is only as valuable as your website and landing pages’ ability to foster some type of “action.” Why spend all the time and energy driving traffic through multiple different channels if you are not willing to spend the time and energy on conversion optimization? Yet many brands and agencies still put less emphasis on this crucial piece of the puzzle.
I don’t know if Google gets its kicks out of keeping search engine marketers and webmasters jumping through hoops, or if they are in cahoots with the big SEM firms so that those firms get this news and these updates before the average guy on the street. Either way, they are seriously getting a bit too big and powerful, and the time is RIPE for a new search engine to step in and level the playing field.
Now, how much weight does PageRank carry? Like almost every other part of the algorithm, it’s questionable. If we listed all the ranking factors, I don’t suspect it would be in the top 5, but it’s important to remember that the key to ranking well is to be LESS IMPERFECT than your competition, i.e., to have more of the right things sending the right signals in the right places, so that Google sees you as a better, more relevant candidate for the top three on page one. If you and your competitor have both optimized (on-page and technically) for the same keyword phrase perfectly, PR could be the deal breaker that pushes your blue link an inch up.

So, the probability for the random surfer reaching one page is the sum of probabilities for the random surfer following links to this page. Now, this probability is reduced by the damping factor d. The justification within the Random Surfer Model, therefore, is that the surfer does not click on an infinite number of links, but gets bored sometimes and jumps to another page at random.
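The random-surfer justification can be made concrete with a small simulation, sketched here under the usual assumptions (every page has at least one outlink; the graph, step count and seed are illustrative): with probability d the surfer follows a random link from the current page, and otherwise she gets bored and jumps to a random page. Visit frequencies then approximate the PageRank distribution.

```python
import random

def surf(links, d=0.85, steps=200_000, seed=0):
    """Estimate PageRank by simulating one bored random surfer
    over a dict {page: [pages it links to]}."""
    rng = random.Random(seed)
    pages = list(links)
    visits = {p: 0 for p in pages}
    page = rng.choice(pages)
    for _ in range(steps):
        visits[page] += 1
        if rng.random() < d:
            page = rng.choice(links[page])   # follow a random link
        else:
            page = rng.choice(pages)         # bored: jump anywhere
    return {p: visits[p] / steps for p in pages}
```

On a toy web where A and B link to each other and C links only to A, the surfer spends far more time on A than on the never-linked C, mirroring the iterative computation.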


If you’re not getting the clicks, you may need to invest more money per click. As you might expect, there are algorithms in play for SEM. Also, the more you pay, the more likely you are to be served with high-value clicks (in terms of potential spending with your business). Or you may just need to re-evaluate your keyphrase – maybe it’s not as popular as the figures provided by Google AdWords suggest?
PageRank is a link analysis algorithm and it assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by PR(E). Other factors like Author Rank can contribute to the importance of an entity.
Search engines are smart, but they still need help. The major engines are always working to improve their technology to crawl the web more deeply and return better results to users. However, there is a limit to how search engines can operate. Whereas the right SEO can net you thousands of visitors and increased attention, the wrong moves can hide or bury your site deep in the search results where visibility is minimal.