1. Now that we know that weight/PageRank/whatever will disappear (outside of the intrinsic wastage method that Google applies) when we use a ‘nofollow’ link, what do you think this will do to linking patterns? This is really a can of worms from an outbound and internal linking perspective. Will people still link to their ‘legals’ page from every page on their site? Turning comments ‘off’ will also be pretty tempting. I know this will devalue sites in general, but we are not always dealing with logic here, are we? (If we were, you (as head of the web spam team) wouldn’t have had to change many things in the past. Changing the PageRank sculpting behaviour is just one of them.)
A decent article which encourages discussion and healthy debate. Reading some of the comments, I see it also highlights the misunderstandings some people (including some SEOs) have of Google PageRank. Toolbar PageRank is not the same thing as PageRank. The little green bar (Toolbar PageRank) was never a very accurate metric and told you very little about the value of any particular web page. It may have been officially killed off earlier this year, but the truth is it’s been dead for many years. Real PageRank, on the other hand, is at the core of Google’s algorithm and remains very important.
As for using nofollow as a way to keep pages that shouldn’t be indexed out of Google (as with your feed example): that’s terrible advice. Your use of it on your feed link does nothing. If anyone links to your feed without nofollow, then it’s going to get indexed. Things that shouldn’t be indexed need to be blocked with either robots.txt or a meta robots tag. Nofollow on links to those items isn’t a solution.
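For reference, a minimal sketch of the meta robots approach the commenter mentions, applied to an ordinary HTML page you don’t want indexed (for a non-HTML resource such as a feed, the equivalent is the X-Robots-Tag HTTP header rather than a meta tag):

```html
<!-- Sketch: ask search engines not to index this page while still following its links.
     Where exactly you place this is up to you; the page itself is hypothetical. -->
<meta name="robots" content="noindex, follow">
```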
We help clients increase their organic search traffic by using the latest best practices and the most ethical, fully-integrated search engine optimization (SEO) techniques. Since 1999, we've partnered with many brands and executed campaigns for over 1,000 websites, helping them dominate even highly competitive industries by capturing placements that maximize impressions and traffic.
I work on a site that allows users to find what they are looking for by clicking links that take them deeper and deeper into the site hierarchy. Content can be categorised in lots of different ways. After about three steps the difference between the results pages shown is of significance to a user but not to a search engine. I was about to add nofollow to links that took the browser deeper than 3 levels but after this announcement I won’t be…
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable examples are China, Japan, South Korea, Russia and the Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex and Seznam, respectively, are the market leaders.

Backlinks can be time-consuming to earn. New sites, or those expanding their keyword footprint, may find it difficult to know where to start when it comes to link building. That's where competitive backlink research comes in: by examining the backlink profile (the collection of pages and domains linking to a website) of a competitor that's already ranking well for your target keywords, you can gain insight into the link building that may have helped them. A tool like Link Explorer can help uncover these links so you can target those domains in your own link building campaigns.


Our SEO professionals are all well-respected thought leaders in the space, with decades of combined experience and credentials that include Search Engine Workshop Certification, Google Analytics and Yahoo Certifications, PMP Certification, UNIX Certification, Computer Engineering degrees and MBAs. Our SEO team members are acclaimed SEO speakers and bloggers; IMI’s team members have been keynote presenters at Pubcon, SMX, SEMCon, Etail, and many other influential conferences.
Email marketing - Compared to other forms of digital marketing, email marketing is considered cheap; it is also a way to rapidly communicate a message, such as a value proposition, to existing or potential customers. Yet this channel of communication may be perceived by recipients as bothersome and irritating, especially to new or potential customers, so the success of email marketing relies on the language and visual appeal applied. In terms of visual appeal, there are indications that graphics/visuals relevant to the message being sent, used sparingly in initial emails, are more effective and create a relatively personal feel to the email. In terms of language, style is the main factor in determining how captivating the email is: a casual tone gives the email a warmer, gentler and more inviting feel than a formal style. As for combinations, it is suggested that using no graphics/visuals alongside casual language maximizes effectiveness, while no visual appeal combined with a formal language style is seen as the least effective approach.[48]

Targeting, viewability, brand safety and invalid traffic: these are all aspects marketers use to help advocate digital advertising. Cookies are a tracking tool for digital advertising on desktop devices, and they have shortcomings: deletion by web browsers, the inability to distinguish between multiple users of a device, inaccurate estimates for unique visitors, overstated reach, problems understanding frequency, and ad servers that cannot tell whether cookies have been deleted or whether consumers have simply not previously been exposed to an ad. Due to the inaccuracies introduced by cookies, demographics in the target market are low and vary (Whiteside, 2016).[42] Another element affected within digital marketing is ‘viewability’, or whether the ad was actually seen by the consumer. Many ads are never seen by a consumer and may never reach the right demographic segment. Brand safety is the issue of whether the ad appeared in an unethical context or alongside offensive content. Recognizing fraud when an ad is exposed is another challenge marketers face, and it relates to invalid traffic: premium sites are more effective at detecting fraudulent traffic, while non-premium sites are more of the problem (Whiteside, 2016).[42]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and lets them track the web pages' index status.
Deliver value no matter what: Regardless of who you are and what you're trying to promote, always deliver value, first and foremost. Go out of your way to help others by carefully curating information that will assist them in their journey. The more you focus on delivering value, the quicker you'll reach that proverbial tipping point when it comes to exploding your fans or followers.
There are also many keyword research tools (some free and some paid) that claim to take the effort out of this process. A popular tool for first timers is Traffic Travis, which can also analyse your competitors’ sites for their keyword optimization strategies and, as a bonus, it can deliver detailed analysis on their back-linking strategy, too. You can also use Moz.com’s incredibly useful keyword research tools – they’re the industry leader, but they come at a somewhat higher price.
There are simple and fast random walk-based distributed algorithms for computing the PageRank of nodes in a network.[33] They present a simple algorithm that takes O(log n / ε) rounds with high probability on any graph (directed or undirected), where n is the network size and ε is the reset probability (1 − ε is also called the damping factor) used in the PageRank computation. They also present a faster algorithm that takes O(√(log n) / ε) rounds on undirected graphs. Both of the above algorithms are scalable, as each node processes and sends only a small (polylogarithmic in n, the network size) number of bits per round.
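As a rough illustration of the random-walk idea only (this is a single-machine Monte Carlo sketch, not the distributed algorithm from the cited paper), visit frequencies of short restarting walks approximate PageRank; the example graph and parameters below are chosen purely for demonstration:

```python
import random
from collections import Counter

def montecarlo_pagerank(graph, epsilon=0.15, walks_per_node=200, max_steps=100):
    """Estimate PageRank from random walks.

    graph: dict mapping node -> list of outgoing neighbours.
    epsilon: reset probability; 1 - epsilon plays the role of the damping factor.
    """
    visits = Counter()
    for start in graph:
        for _ in range(walks_per_node):
            node = start
            for _ in range(max_steps):
                visits[node] += 1
                # With probability epsilon the walk resets; dangling nodes also end it.
                if random.random() < epsilon or not graph[node]:
                    break
                node = random.choice(graph[node])
    total = sum(visits.values())
    return {n: visits[n] / total for n in graph}

# Tiny example graph: A -> B, A -> C, B -> C, C -> A
print(montecarlo_pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
```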
Content is king. It always has been and it always will be. Creating insightful, engaging and unique content should be at the heart of any online marketing strategy. Too often, people simply don't obey this rule. The problem? This takes an extraordinary amount of work. However, anyone who tells you that content isn't important is not being fully transparent with you. You cannot excel in marketing anything on the internet without having quality content.
However, the biggest contributing factor to a backlink’s effect on your rank is the website it’s coming from, measured by the acronym ART: authority, a measure of a site’s prestige/reliability (.edu and .gov sites are particularly high-authority); relevance, a measure of how related the site hosting the link is to the content; and trust, which is not an official Google metric but relates to how much a site plays by the rules of search (i.e. not selling links) and provides good content.

2. Domain authority and page authority. Next, you should learn about domain authority and page authority, and how they predict your site’s search rankings. Here’s the basic idea: your site’s domain authority is a proprietary score, provided by Moz, of how “trustworthy” your domain is. It’s calculated based on the quantity and quality of inbound links to your website. The higher it is, the higher all the pages across your domain are likely to rank in organic search results. Page authority is very similar, but page-specific, and you can use it to engineer a link architecture that strategically favors some of your pages over others. Both scores depend on the authority and volume of inbound links.


Matt, you don’t mention the use of disallow pages via robots.txt. I’ve read that PageRank can be better utilised by disallowing pages that probably don’t add value to users searching on engines. For example, Privacy Policy and Terms of Use pages. These often appear in the footer of a website and are required by EU law on every page of the site. Will it boost the other pages of the site if these pages are added to robots.txt like so?
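For illustration only (the paths below are hypothetical, not the commenter’s actual URLs), the kind of robots.txt entry being described looks like this:

```
# Hypothetical example: stop crawlers from fetching boilerplate legal pages
User-agent: *
Disallow: /privacy-policy
Disallow: /terms-of-use
```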

For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”
Yes, the links we have are found elsewhere, but our focus is saving our users and clients time, so we consolidated the links because it takes hours and hours of searching to find them, and some searchers are not very savvy when it comes to looking for, and finding, good-quality information. I look at the links like a library: my library has these books, and so do a bunch of other libraries. I think it is a shame that I have to hide my books from Google because I have too many really good ones and that is seen as a BAD thing in Google’s eyes. Darned if you don’t create a good site, and darned if you do.

For example, take this page. My program found almost 400 nofollow links on this page (each comment has 3), and then there are almost 60 navigation links. My real question is: what percentage of the PageRank on this page gets distributed to the 9 real links in the article? If PageRank is divided by 469, as some SEO experts are now claiming, that is really disturbing. You won’t earn much from those links, if you follow what I am saying.
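To make the commenter’s arithmetic concrete, under the assumption that PageRank splits evenly across every link on the page (which is what the “division of 469” claim implies):

```python
# Back-of-the-envelope check of the commenter's scenario, assuming an even split
# of passable PageRank across all links on the page.
total_links = 400 + 60 + 9      # nofollow comment links + navigation links + article links
share_per_link = 1 / total_links
article_share = 9 * share_per_link
print(f"{total_links} links -> each gets {share_per_link:.4%}, "
      f"the 9 article links together get {article_share:.2%}")
# 469 links -> each gets 0.2132%, the 9 article links together get 1.92%
```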
Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site. Rather, you need to have a large number of QUALITY inbound links. This tool searches for websites that have a related theme to your website which are likely to add your link to their website. You specify a particular keyword or keyword phrase, and then the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and making the job easier in the process.

Great post, it is very valuable info indeed. I just want to ask: I am trying to market a business-to-business website, and I find it quite hard to market it in the appropriate categories as it is specialized. Many of the websites I am designing are FCA regulated, so when it comes to advertising or giving advice, I am limited as to what I can and can’t do/say. What business-to-business websites do you recommend for backlinking specialist websites? I find that I am also limited in the social media area and it’s just LinkedIn that helps.
Influencer marketing: Important nodes are identified within related communities, known as influencers. This is becoming an important concept in digital targeting. It is possible to reach influencers via paid advertising, such as Facebook Advertising or Google Adwords campaigns, or through sophisticated sCRM (social customer relationship management) software, such as SAP C4C, Microsoft Dynamics, Sage CRM and Salesforce CRM. Many universities now focus, at Masters level, on engagement strategies for influencers.
Retargeting is another way that we can close the conversion loop and capitalize on the traffic gained from the overall marketing campaign. Retargeting is a very powerful display advertising tool for keeping your brand top of mind and keeping past visitors coming back. We track every single touch point up to the ultimate conversions and use that data to make actionable recommendations for further campaign optimization.
Start Value (in this case) is the number of actual links to each “node”. Most people actually set this to 1 to start, but there are two good reasons for using link counts. First, it is a better approximation to start with than giving everything the same value, so the algorithm stabilizes in fewer iterations; second, it makes it easy to sanity-check my spreadsheet in a moment… so node A has one link in (from page C).
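A small sketch of the kind of iteration the spreadsheet performs; the three-page graph and the 0.85 damping factor are assumptions for illustration, chosen so that node A’s only inbound link comes from page C, as in the comment:

```python
# Minimal PageRank power iteration, seeded with in-link counts as start values.
damping = 0.85
links = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}   # page -> pages it links to

# Start value: number of links pointing *in* to each page (as the comment suggests),
# rather than the usual uniform 1.0.
pr = {p: sum(p in outs for outs in links.values()) for p in links}

for _ in range(20):                                  # iterate until values settle
    new = {}
    for page in links:
        inbound = sum(pr[src] / len(outs) for src, outs in links.items() if page in outs)
        new[page] = (1 - damping) + damping * inbound
    pr = new

print({p: round(v, 3) for p, v in pr.items()})
```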

Totally agree — more does not always equal better. Google takes a sort of ‘Birds of a Feather’ approach when analyzing inbound links, so it’s really all about associating yourself (via inbound links) with websites Google deems high quality and trustworthy so that Google deems YOUR web page high quality and trustworthy. As you mentioned, trying to cut corners, buy links, do one-for-one trades, or otherwise game/manipulate the system never works. The algorithm is too smart.
From a customer experience perspective, we currently have three duplicate links to the same URL, i.e. ????.com/abcde. These links are helpful for the visitor to locate relevant pages on our website. However, my question is: does Google count all three of these links and pass all the value, or does Google only transfer the weight from one of them? If it only transfers value from one of these links, does the link juice from the two other links to the same page disappear, or were those links never given any value in the first place?
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
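A toy sketch of the spider/indexer split described above, fetching one page, recording its words, and queueing its links for later crawling; the seed URL is a placeholder, and a real crawler would also need robots.txt handling, politeness delays, deduplication, and error handling:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkAndTextParser(HTMLParser):
    """Collects the words and outgoing links found in a page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl_once(url):
    html = urlopen(url).read().decode("utf-8", errors="ignore")   # the "spider" fetches the page
    parser = LinkAndTextParser()
    parser.feed(html)
    index_entry = {"url": url, "words": parser.words}             # what the "indexer" stores
    schedule = [urljoin(url, link) for link in parser.links]      # what the scheduler crawls later
    return index_entry, schedule

entry, queue = crawl_once("https://example.com/")
print(len(entry["words"]), "words indexed,", len(queue), "links scheduled")
```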
“Google itself solely decides how much PageRank will flow to each and every link on a particular page. The number of links doesn’t matter. Google might decide some links don’t deserve credit and give them no PageRank. The use of nofollow doesn’t “conserve” PageRank for other links; it simply prevents those links from getting any PageRank that Google otherwise might have given them.”
Why do so many people spend so much time researching SEO and page rank? It’s really not that hard to figure out (I am speaking in a nice tone, by the way =) – all you should need to focus on is advertising and building your website in a manner that is ethical, operational and practical for the content and industry that your website is in/about. If you are not up to something, then Google will know it, and they will rank you accordingly. If you spend so much time trying to figure out how to get to the top, I bet Google spends triple that time figuring out how you’re trying to get to the top. So on and so forth… and you’re not going to win. Have good content that isn’t copied, stay away from too many outbound links (especially affiliates), post your backlinks at places that have something to do with your site, etc. Is it an American thing? I don’t seem to see it as bad in other parts of the world, that mindset of “always trying to figure out an easy way, a quick fix, a way to not have to put in the effort…” Anyway, thanks for letting me vent. Please, no nasty replies. Keep it to yourself = )
PageRank is often considered to be a number between 0 and 10 (with 0 being the lowest and 10 being the highest), though that is also probably incorrect. Most SEOs believe that internally the number is not an integer, but goes to a number of decimal places. The belief largely comes from the Google Toolbar, which displays a page's PageRank as a number between 0 and 10. Even this is a rough approximation, as Google does not release its most up-to-date PageRank values, as a way of protecting the algorithm's details.
Backlinks take place across the Internet when one website mentions another website and links to it. Also referred to as “incoming links,” backlinks make their connection through external websites. These links from outside domains point to pages on your own domain. Whenever a backlink occurs, it is like receiving a vote for a webpage. The more votes you get from authoritative sites, the more positive the effect on a site’s ranking and search visibility.
While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye by using overly aggressive marketing efforts and attempting to manipulate search engine results in unfair ways. Practices that violate our guidelines may result in a negative adjustment of your site's presence in Google, or even the removal of your site from our index.
In my experience this means (the key words are “not the most effective way”) that a page not scored by Google (e.g. my private link – password protected, disallowed via robots.txt and/or noindex meta robots) is not factored into anything, whether or not the links to it use the rel=”nofollow” attribute… because Google can’t factor in something it isn’t allowed to see.
Backlinks are also important for the end user. They connect searchers with information that is similar to what is being written on other resources. An example of this happens when an end user is reading a page that discusses “how child care expenses are driving women out of the workforce.” As they scroll down, they might see another link to a study on “how the rise in child care costs over the last 25 years affected women’s employment.” In this case, the backlink establishes a connection point to information the searcher may be interested in clicking. This external link creates a solid experience because it takes the user directly to additional desirable information if needed.
Search engine optimization is a key part of online marketing because search is one of the primary ways that users navigate the web. In 2014, over 2.5 trillion searches were conducted worldwide across search engines such as Google, Bing, Yahoo, Baidu, and Yandex. For most websites, traffic that comes from search engines (known as "natural" or "organic" traffic) accounts for a large portion of their total traffic.
The combination of charisma, charm and intellect has helped catapult Sharpe to the top of the heap. In a recent conversation with him, I wanted to learn what it truly took to become an expert digital marketer. And one of the most important takeaways from that phone call was that if he could do it, anyone could do it. For someone who failed so devastatingly very early on in life, to rise from the ashes like a phoenix was no easy feat.

It is no secret that getting high-quality backlinks is your website’s way to better rankings in Google. But how do you tell a good link from a bad one? Carefully choosing backlinks is a delicate and important task for everyone who wants to optimize their site. There are a lot of different tools which can help you check whether your backlinks are trustworthy and can bring your website value.
Consumers today are driven by the experience. This shift from selling products to selling an experience requires a connection with customers on a deeper level, at every digital touch point. TheeDigital’s internet marketing professionals work to enhance the customer experience, grow your online presence, generate high-quality leads, and solve your business-level challenges through innovative, creative, and tactful internet marketing.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
nofollow is beyond a joke now. There is so much confusion (especially when other engines’ treatment is factored in) that I don’t know how you expect a regular publisher to keep up. The expectation seems to have shifted from “Do it for humans and all else will follow” to “Hang on our every word, do what we say, and if we change our minds then change everything”, and nofollow led the way. I could give other examples of this attitude (e.g. “We don’t follow JavaScript links so it’s ’safe’ to use those for paid links”), but nofollow is surely the worst.
Replicating competitors’ backlinks is one of the smartest ways to find new link building opportunities and improve SEO. Get started by choosing your primary competitors: the websites that are ranking in the top 5 positions for your main keywords. If they’re ranking above you, it means they have a better link profile and backlinks of higher quality. Once you’ve decided which competitors to spy on, you’ll have to analyze their backlinks.
Matt Cutts, it’s Shawn Hill from Longview, Texas, and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s Webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blog spam should be the least of Google’s worries. What’s your take?