Something a lot of people seem to have overlooked was hinted at in Greg Boser’s comment above. Greg identified a major (and unfair) disparity in how authority sites such as Wikipedia disrupt the linkscape with run-of-site nofollows. Once Wikipedia implemented nofollow, previously high-value links from Wikipedia were rendered worthless, making the site less of a target for spammers. Increasingly, large sites are following suit in order to cleanse their own pages of spam.
Most people need to take a step back and understand where money is even coming from on the web. Sharpe says that, when asked, most individuals don't actually know how money is being made at a high level. How does Facebook generate its revenues? How about Google? How do highly trafficked blogs become so popular, and how do they generate money from all of that traffic? Is there one way or many?
You’ll want to capture users’ emails regularly, both when they purchase…and even before they become a customer. You can use lead magnets or discounts to incentivize email sign-ups, and an email management service like MailChimp lets you create triggered autoresponders that automatically send out pre-made welcome email campaigns when users subscribe.

Internet marketing, or online marketing, refers to advertising and marketing efforts that use the Web and email to drive direct sales via electronic commerce, in addition to sales leads from websites or emails. Internet marketing and online advertising efforts are typically used in conjunction with traditional types of advertising such as radio, television, newspapers and magazines.
I was thinking exactly the same thing Danny Sullivan said. If comments (even with nofollow) directly affect the outgoing PR distribution, people will tend to allow fewer comments (maybe even resort to iframes). Is he right? Maybe Google should develop a new tag as well, something like rel=”commented”, to inform spiders about it and give it less value, and WordPress should ship with this attribute by default 🙂
To create an effective DMP, a business first needs to review the marketplace and set 'SMART' (Specific, Measurable, Actionable, Relevant and Time-Bound) objectives.[60] They can set SMART objectives by reviewing the current benchmarks and key performance indicators (KPIs) of the company and competitors. It is pertinent that the analytics used for the KPIs be customised to the type, objectives, mission and vision of the company.[61][62]
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27]
I agree that if you provided more facts, or the complete algorithm, people would abuse it. But if it were available to everyone, wouldn't it almost force people to implement better site building, better navigation policies, and white-hat SEO, simply because everyone would have the same tools to work with and an absolute standard to adhere to?
There’s obviously a huge number of reasons why a website might link to another and not all of them fit into the categories above. A good rule of thumb on whether a link is valuable is to consider the quality of referral traffic (visitors that might click on the link to visit your website). If the site won’t send any visitors, or the audience is completely unrelated and irrelevant, then it might not really be a link that’s worth pursuing.
If (a) is correct that looks like bad news for webmasters, BUT if (b) is also correct then – because PR is ultimately calculated over the whole of the web – every page loses out relative to every other page. In other words, there is less PR on the web as a whole and, after a sufficient number of iterations in the PR calculation, normality is restored. Is this correct?
SEO should be a core tactic in any marketing strategy. While it might seem difficult to understand at first, as long as you find the right course, book or audiobook, and devote your time to learning, you'll be in good shape. Considering that there are more than 200 ranking factors in Google's current algorithms, learning, digesting and successfully implementing good SEO tactics is essential to the success of your website or blog.
What an article… thank you so much for the priceless information. We will be changing our pages around to make sure we get the highest PageRank available to us, and we are trying to get high-PageRank sites to link to us. Hopefully there is more information out there to gather, as we want to compete within our market and gain as much market share as possible.

Backlinks are important for a number of reasons. The quality and quantity of pages backlinking to your website are some of the criteria used by search engines like Google to determine your ranking on their search engine results pages (SERP). The higher you rank on a SERP, the better for your business as people tend to click on the first few search results Google, Bing or other search engines return for them.
Brian, just wanted to start off by saying great informative article; you had a lot of great insight. I see it was mentioned a bit in the above comments, about the infographic, but I thought it is a great idea to include a textbox under the infographic with the coding that could be copied and pasted on blogs (thus earning additional backlinks from other websites). I’ve also noticed many infographics that have “resources” or “references” included in the image. My understanding is that currently this is not recognized by Google, because of the image format, but I foresee that one day Google may be able to update their algorithm to recognize written text inside of an image, thus potentially adding value to the written text in the image. What are your thoughts on that idea?

Instead of relying on a group of editors or solely on the frequency with which certain terms appear, Google ranks every web page using a breakthrough technique called PageRank™. PageRank evaluates all of the sites linking to a web page and assigns them a value, based in part on the sites linking to them. By analyzing the full structure of the web, Google is able to determine which sites have been “voted” the best sources of information by those most interested in the information they offer.


We combine our sophisticated Search Engine Optimization skills with our ORM tools such as social media, social bookmarking, PR, video optimization, and content marketing to decrease the visibility of potentially damaging content. We also work with our clients to create rebuttal pages, micro-sites, positive reviews, social media profiles, and blogs in order to increase the volume of positive content that can be optimized for great search results. 

Backlinks are a major ranking factor for most search engines, including Google. If you want to do SEO for your website and get relevant organic traffic, building backlinks is something you should be doing. The more backlinks your website has from authoritative domains, the higher reputation you’ll have in Google’s eyes. And you’ll dominate the SERPs.

A decent article which encourages discussion and healthy debate. Reading some of the comments, I see it also highlights some of the misunderstandings some people (including some SEOs) have of Google PageRank. Toolbar PageRank is not the same thing as PageRank. The little green bar (Toolbar PageRank) was never a very accurate metric and told you very little about the value of any particular web page. It may have been officially killed off earlier this year, but the truth is it's been dead for many years. Real PageRank, on the other hand, is at the core of Google’s algorithm and remains very important.
By focus I mean making sure that each page focuses on the same keyword throughout, that your site focuses on the same high-level keywords, and that sections of your site focus on their own high-level keywords (though not as high-level as the ones you want your home page to rank for). Focus is something few people really understand, while the interesting thing is that you get it almost automatically right if you do your site architecture right and understand your customers.
A search engine considers the content of the sites to determine the QUALITY of a link. When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.

When calculating PageRank, pages with no outbound links are assumed to link out to all other pages in the collection. Their PageRank scores are therefore divided evenly among all other pages. In other words, to be fair to pages that are not sinks, these random transitions are added to all nodes in the Web. The damping factor, d, is usually set to 0.85, estimated from the frequency with which an average surfer uses his or her browser's bookmark feature. So, the equation is as follows:
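For reference, the standard formula as given in the original PageRank papers, where M(p_i) is the set of pages linking to p_i, L(p_j) is the number of outbound links on page p_j, and N is the total number of pages:

```latex
PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}
```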
The green ratings bars are a measure of Google’s assessment of the importance of a web page, as determined by Google’s patented PageRank technology and other factors. These PageRank bars tell you at a glance whether other people on the web consider a page to be a high-quality site worth checking out. Google itself does not evaluate or endorse websites. Rather, we measure what others on the web feel is important enough to deserve a link. And because Google does not accept payment for placement within our results, the information you see when you conduct a search is based on totally objective criteria.
While there are several platforms for video marketing, YouTube is clearly the most popular. Video marketing is also a great form of both content marketing and SEO in its own right. It can help to provide visibility for several different ventures, and if the video is valuable enough in its message and content, it will be shared and liked by droves, pushing the authority of that video through the roof.

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3] In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]
Excellent post! I’m reasonably savvy up to a certain point and have managed to get some of my health content organically ranking higher than WebMD. It’s taken a long time building strong backlinks from very powerful sites (HuffingtonPost being one of them), but I am going to take some time, plow through a few beers, and then get stuck into implementing some of these suggestions. Keep up the great work amigo. Cheers, Bill
The name "PageRank" plays off of the name of developer Larry Page, as well as of the concept of a web page.[15] The word is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; it sold the shares in 2005 for $336 million.[16][17]
1. Now that we know that weight/PageRank/whatever will disappear (outside of the intrinsic wastage method that Google applies) when we use a ‘nofollow’ link, what do you think this will do to linking patterns? This is really a can of worms from an outbound linking and internal linking perspective. Will people still link to their ‘legals’ page from every page on their site? Turning comments ‘off’ will also be pretty tempting. I know this will devalue the sites in general, but we are not always dealing with logic here, are we? (If we were, you (as head of the web spam team) wouldn’t have had to change many things in the past, the PageRank sculpting change just being one of them.)
One final note is that if the links are not directly related to the subject, or you have no control over them, such as commenters’ website links, maybe you should consider putting them on another page, which links to your main content. That way you don’t leak PageRank, and you still gain hits from search results on the content of the comments. I may be missing something, but this seems to mean that you can have your cake and eat it, and I don’t even think it is gaming the system or against the spirit of it. You might even gain a small sprinkling of PageRank if the comment page accumulates any of its own.
The nofollow tag is being used for PageRank sculpting and to stop blog spamming. In my mind this is tantamount to manipulating PageRank, and thus possibly ranking position in certain cases. I do regularly post to blogs and forums regarding web design, and this improved my search ranking as a side effect. What’s wrong with making an active contribution to industry blogs and being passed some PageRank? Google needs to determine whether the post entry is relevant and then decide to pass PageRank after that analysis, or just decide that the blog should not pass PR in any event. What’s gone wrong with the Internet when legitimate content pages do not pass PR?
Nathan: The comment by Mansi Rana helps answer your question. The fact is, the PageRank scores that were visible in the Google Toolbar hadn’t been updated in a long time (2+ YEARS), so they were probably getting more and more out-of-date anyway. The main reason Google would make them disappear, though, is that Google wants website owners to focus on the user and on quality content, not on trying to game the system with links.
Our backgrounds are as diverse as they come, bringing knowledge and expertise in business, finance, search marketing, analytics, PR, content creation, creative, and more. Our leadership team is comprised of successful entrepreneurs, business executives, athletes, military combat veterans, and marketing experts. The Executives, Directors, and Managers at IMI are all well-respected thought leaders in the space and are the driving force behind the company’s ongoing success and growth.

Personally, I wanted a bit more of the math, so I went back and read the full-length version of “The Anatomy of a Large-Scale Hypertextual Web Search Engine” (a natural first step). This was the paper written by Larry Page and Sergey Brin in 1997. Aka the paper in which they presented Google, published in the Stanford Computer Science Department. (Yes, it is long and I will be working a bit late tonight. All in good fun!)
A lot of the problem lies in the name “PageRank” itself. The term “PageRank” implies that a higher value automatically equates to better search engine ranking. That’s not necessarily the case, and hasn’t been for some time, but it sounds like it is. As stupid as it sounds, a semantic name change may solve a lot of this all by itself. Some of the old-school crowd will still interpret it as PageRank, but most of the new-school crowd will have a better understanding of what it actually is, why the present SEO crowd blows its importance way out of proportion, and how silly the industry gets when something like this is posted.
Quality content is more likely to get shared. By staying away from creating "thin" content and focusing more on content that cites sources, is lengthy, and reaches unique insights, you'll be able to gain Google's trust over time. Remember, this happens as a function of time. Google knows you can't just go out there and create massive amounts of content in a few days. If you try to spin content or duplicate it in any fashion, you'll suffer a Google penalty and your visibility will be stifled.
After your site has been built out, creating a social media presence is the best second step for most businesses. All businesses should have a Facebook Page that’s fully fleshed out with plenty of information about your business. Depending on your audience, you can also start a Twitter, Instagram, and/or Pinterest account. Social media is a long-term commitment that requires frequently updating and monitoring, but it’s one of the best ways to build an online community around your business.

Online interviews are hot right now, and a great and easy way to earn backlinks to your website. Once you become the authority in your niche, you'll get lots of interview invitations, but until then, to get started, you have to make the first step. Look for websites that are running interviews and tell them you would like to participate and what knowledge you can contribute.


Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.

Matt Cutts, it’s Shawn Hill from Longview, Texas and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s Webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blogspam should be the least of Google’s worries. What’s your take?
Cause if I do that, if I write good content whilst my 100+ competitors link build, article market, forum comment, social bookmark, release viral videos, and buy links, I’ll end up at the very bottom of the pile, great content or not. Really, I am just as well taking my chances pulling off every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don’t, what do I have to lose?”
It is important for a firm to reach out to consumers and create a two-way communication model, as digital marketing allows consumers to give feedback to the firm on a community-based site or directly to the firm via email.[24] Firms should seek this long-term communication relationship by using multiple channels and by using promotional strategies related to their target consumer, as well as word-of-mouth marketing.[24]

TrustRank takes into consideration a website's foundational backlinks. Search engines find reliable, trustworthy sites more quickly and place them at the top of the SERP. Doubtful websites end up near the bottom of the rankings, if you ever decide to look at what is down there. As a rule, people take information from the first links and stop searching if they have found nothing in the first 20 top results. Your website may well have the required information, service, or goods, but because it lacks authority, Internet users will not find it unless you have good foundational backlinks. What are the backlinks we call foundational? They are branded and non-optimized backlinks on authority websites.

If you’re a blogger (or a blog reader), you’re painfully familiar with people who try to raise their own websites’ search engine rankings by submitting linked blog comments like “Visit my discount pharmaceuticals site.” This is called comment spam, we don’t like it either, and we’ve been testing a new tag that blocks it. From now on, when Google sees the attribute (rel=“nofollow”) on hyperlinks, those links won’t get any credit when we rank websites in our search results.


Cross-platform measurement: The number of marketing channels continues to expand, and measurement practices are growing in complexity. A cross-platform view must be used to unify audience measurement and media planning. Market researchers need to understand how the omni-channel affects consumers' behaviour, although when advertisements are on a consumer's device this does not get measured. Significant aspects of cross-platform measurement involve de-duplication and understanding that you have reached an incremental level with another platform, rather than delivering more impressions against people that have previously been reached (Whiteside, 2016).[42] An example is ‘ESPN and comScore partnered on Project Blueprint discovering the sports broadcaster achieved a 21% increase in unduplicated daily reach thanks to digital advertising’ (Whiteside, 2016).[42] The television and radio industries are the electronic media, which compete with digital and other technological advertising. Yet television advertising is not directly competing with online digital advertising, due to being able to cross platforms with digital technology. Radio also gains power through cross-platforms, in online streaming content. Television and radio continue to persuade and affect the audience across multiple platforms (Fill, Hughes, & De Franceso, 2013).[45]
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
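As a rough illustration of that advice (not any official Google tooling), here is a minimal Python sketch using the standard library's html.parser to flag comment-area links that are missing rel="nofollow"; the HTML snippet and URLs are made up for the example:

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Collect the hrefs of <a> tags that do not carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, e.g. "ugc nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel:
            self.missing.append(attrs.get("href"))

comment_html = (
    '<p>Nice post! <a href="http://spam.example/pills">cheap pills</a></p>'
    '<p><a rel="nofollow" href="http://ok.example/">my site</a></p>'
)
auditor = NofollowAuditor()
auditor.feed(comment_html)
print(auditor.missing)  # links that would pass reputation if left as-is
```

Running such a check over a blog's comment templates is one way to confirm that user-added links aren't vouching for pages you don't control.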
Ian Rogers first used the Internet in 1986 sending email on a University VAX machine! He first installed a webserver in 1990, taught himself HTML and perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years and has also been known to do a little Jive dancing.
By now, you've likely seen all the "gurus" in your Facebook feed. Some of them are more popular than others. What you'll notice is that the ads you see that have the highest views and engagement are normally the most successful. Use a site like Similar Web to study those ads and see what they're doing. Join their lists and embed yourself in their funnels. That's an important part of the process so that you can replicate and reverse engineer what the most successful marketers are doing.

PageRank is a link analysis algorithm and it assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by PR(E). Other factors like Author Rank can contribute to the importance of an entity.
Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. In the presence of damping, Page A effectively links to all pages in the web, even though it has no outgoing links of its own.
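The random-surfer model described above can be sketched in a few lines of power iteration. This is an illustrative toy only: the graph below is invented (it is not the A–E network from the figure), and dangling pages are treated as linking to every page, as described earlier:

```python
def pagerank(links, d=0.85, iterations=50):
    """Power-iteration PageRank.
    links: dict mapping each page to the list of pages it links to.
    d: damping factor (probability the surfer follows a link)."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}            # start with a uniform distribution
    for _ in range(iterations):
        # Rank held by dangling pages (no outlinks) is spread evenly over all pages.
        dangling = sum(pr[p] for p in pages if not links[p])
        new = {}
        for p in pages:
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * (incoming + dangling / n)
        pr = new
    return pr

# Toy graph: B and C both link to A; A links only to C; D is a dangling page.
graph = {"A": ["C"], "B": ["A"], "C": ["A"], "D": []}
ranks = pagerank(graph)
print({p: round(r, 3) for p, r in ranks.items()})
```

Note how page A, which collects the links from B and C, ends up with the highest score, while the ranks still sum to 1 because the dangling page's rank is redistributed rather than lost.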