Just because some people have been turning their pages way too pink (with the Firefox ‘nofollow’ indicator plug-in installed), that is not a reason to devalue something that is OK to do. It would not have been that hard to plug in a change that would pick that up as spam and therefore put a ‘trust’ question mark against sites that have been ‘nofollowing’ everything.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
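The crawl-then-index pipeline described above can be sketched in a few lines. This is a minimal illustration only, assuming a toy HTML snippet (the URL and content are hypothetical, not from any real crawler): fetch a page's HTML, extract its links and word positions the way an indexer would, and queue the links for a later crawl.

```python
# Minimal sketch of the spider/indexer pipeline: parse one page's HTML,
# record each word's position (the indexer's job), collect outgoing links,
# and put those links on a schedule for crawling at a later date.
from html.parser import HTMLParser
from collections import deque

class LinkAndTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []   # href targets found on the page
        self.words = []   # visible text tokens, in order

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

def index_page(html):
    """Return (links to schedule for crawling, word -> positions index)."""
    parser = LinkAndTextParser()
    parser.feed(html)
    postings = {}
    for position, word in enumerate(parser.words):
        postings.setdefault(word.lower(), []).append(position)
    return parser.links, postings

links, postings = index_page('<p>cheap batteries</p><a href="/aa">AA cells</a>')
schedule = deque(links)  # crawl these URLs later
```

A real spider would also store the raw page on the engine's own servers and record per-word weights (title, headings, anchor text); this sketch keeps only the two steps the paragraph names.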

Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people who do not run blog sites (the vast majority of sites!), for example an e-commerce site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon, which have millions to spend on marketing, because they can’t attract links naturally?

I was thinking exactly the same thing Danny Sullivan said. If comments (even with nofollow) directly affect the outgoing PageRank distribution, people will tend to allow fewer comments (maybe even resort to iframes). Is he right? Maybe Google should develop a new attribute as well, something like rel=”commented”, to inform spiders to give those links less value, and WordPress should ship with this attribute by default 🙂
2. Domain authority and page authority. Next, you should learn about domain authority and page authority, and how they predict your site’s search rankings. Here’s the basic idea: your site’s domain authority is a proprietary score, provided by Moz, of how “trustworthy” your domain is. It’s calculated based on the quantity and quality of inbound links to your website. The higher it is, the higher all the pages across your domain are likely to rank in organic search results. Page authority is very similar, but page-specific, and you can use it to engineer a link architecture that strategically favors some of your pages over others. In both cases, authority depends on the authority and volume of inbound links.
He is the co-founder of Neil Patel Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations.
These are ‘tit-for-tat’ links. For instance, you make a deal with your friend who has a business website to have him place a link to your website, and in exchange your website links back to his. In the dark ages of SEO, this used to be somewhat effective. But these days, Google considers such 'link exchanges' to be link schemes, and you may get hit with a penalty if you're excessive and obvious about it. This isn't to say that swapping links is always bad, but if your only motive is SEO, then odds are that you shouldn't do it.
Brian, this is the web page that everybody over the entire Internet was searching for. This page answers the million dollar question! I was particularly interested in the food blogs untapped market, who doesn’t love food. I have been recently sent backwards in the SERP and this page will help immensely. I will subscribe to comments and will be back again for more reference.

Google's strategy works well. By focusing on the links going to and from a Web page, the search engine can organize results in a useful way. While there are a few tricks webmasters can use to improve Google standings, the best way to get a top spot is to consistently provide top quality content, which gives other people the incentive to link back to their pages.

Backlinks are also important for the end user. They connect searchers with information similar to what is being written about on other resources. For example, an end user might be reading a page that discusses “how child care expenses are driving women out of the workforce.” As they scroll down, they might see another link to a study on “how the rise in child care costs over the last 25 years affected women’s employment.” In this case, a backlink establishes connection points for information that a searcher may be interested in clicking. This external link creates a solid experience because it takes the user directly to additional information they may want.
Here’s my take on the whole PageRank sculpting situation. As I understand it, the basic idea is that you can increase your rankings in Google by channeling the PageRank of your pages to the pages you want ranked. This used to be done with the ‘nofollow’ attribute. That said, things have changed, and Google has come out and said that the way ‘nofollow’ used to work has changed. In short, using ‘nofollow’ to channel that PageRank juice is no longer as effective as it once was.

Because if I do that, if I write good content while my 100+ competitors link build, article market, comment on forums, social bookmark, release viral videos, and buy links, I’ll end up at the very bottom of the pile, great content or not. Really, I’m just as well off taking my chances pulling off every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don’t, what do I have to lose?”
The PageRank formula also contains a damping factor (d). According to the PageRank theory, there is an imaginary surfer who randomly clicks on links and at some point gets bored and stops clicking. The damping factor is the probability that the surfer will continue clicking at any given step. This factor is introduced to stop some pages from having too much influence; as a result, each page’s total vote is damped down by multiplying it by 0.85 (the generally assumed value).
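The formula above can be made concrete with a short power-iteration sketch. This is a minimal illustration on a made-up three-page link graph (the page names and link structure are hypothetical), not Google's actual implementation:

```python
# Toy PageRank power iteration with damping factor d = 0.85.
# links maps each page to the list of pages it links to.

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with an even split
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # (1 - d)/n models the bored surfer jumping to a random page;
            # the d * inbound term is the damped vote from linking pages,
            # each of which splits its rank evenly across its outlinks.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - d) / n + d * inbound
        rank = new_rank
    return rank

ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Here C, which receives links from both A and B, ends up with more rank than B, which receives only half of A's vote; the ranks still sum to 1 because every page's vote is redistributed rather than destroyed.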
But I also don’t wanna lose PageRank on every comment with a link… If I can give PageRank and lose none, I’ll leave the comment there, even without nofollow. But if I lose PageRank on every link, even inside the original post, and EVEN MORE if nofollow also takes PageRank away from me, I may just start using JavaScript or plain text without anchors for links… I definitely don’t like this idea, but I dislike even more losing PageRank on each outlink on my site. I’d just link to top-quality sites that I actively wanna vote for in the search engines.
btw, all those SEOs out there probably made some money off clients selling the sculpting thing to them. I know some are still insisting it worked, etc., but would they say in public that it didn’t work after they already took a site’s money to sculpt? How would anyone judge definitively whether it worked or not? The funny thing is, the real issues of that site could have been fixed for the long term instead of applying a band-aid. Of course, knowing the state of this industry right now, band-aids are the in thing anyway.
Affiliate marketing - Affiliate marketing is not always perceived as a safe, reliable, and easy means of marketing through online platforms. This is due to a lack of reliability among affiliates, who may not produce the promised number of new customers. This risk of bad affiliates leaves the brand prone to exploitation in the form of commission claims that aren't honestly earned. Legal means may offer some protection, yet there are limitations in recovering any losses or investment. Despite this, affiliate marketing allows the brand to market toward smaller publishers and websites with smaller traffic. Brands that choose this form of marketing should be aware of the risks involved and look to associate with affiliates under rules laid down between the parties to assure and minimize those risks.[47]

1. The big picture. Before you get started with individual tricks and tactics, take a step back and learn about the “big picture” of SEO. The goal of SEO is to optimize your site so that it ranks higher in searches relevant to your industry; there are many ways to do this, but almost everything boils down to improving your relevance and authority. Your relevance is a measure of how appropriate your content is for an incoming query (and can be tweaked with keyword selection and content creation), and your authority is a measure of how trustworthy Google views your site to be (which can be improved with inbound links, brand mentions, high-quality content, and solid UI metrics).
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.

To prevent some users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from the non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical" link element if you cannot redirect.
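As a sketch of the non-redirect option, a page reachable at several URLs can declare its preferred version in the document head. The domain and path here are hypothetical, chosen only for illustration:

```html
<!-- Placed in the <head> of every duplicate/variant URL: tells search
     engines that https://example.com/green-dresses is the preferred
     (canonical) version whose reputation should be consolidated. -->
<link rel="canonical" href="https://example.com/green-dresses" />
```

A server-side 301 redirect remains the stronger signal when you control the URLs, since it moves users as well as crawlers to the dominant URL.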

Influencer marketing: Important nodes are identified within related communities, known as influencers. This is becoming an important concept in digital targeting. It is possible to reach influencers via paid advertising, such as Facebook Advertising or Google Adwords campaigns, or through sophisticated sCRM (social customer relationship management) software, such as SAP C4C, Microsoft Dynamics, Sage CRM and Salesforce CRM. Many universities now focus, at Masters level, on engagement strategies for influencers.


The combination of charisma, charm and intellect has helped catapult Sharpe to the top of the heap. In a recent conversation with him, I wanted to learn what it truly took to become an expert digital marketer. And one of the most important takeaways from that phone call was that if he could do it, anyone could do it. For someone who failed so devastatingly very early on in life, to rise from the ashes like a phoenix was no easy feat.


The criteria and metrics can be classified according to their type and time span. By type, these campaigns can be evaluated either quantitatively or qualitatively. Quantitative metrics may include sales volume and revenue increase/decrease, while qualitative metrics may include enhanced brand awareness, image, and health, as well as the relationship with customers.
Probably the most creative thing I’ve ever done was write a hilarious review of a restaurant (The Heart Attack Grill) and email it to the owner. He loved it so much he posted it on FB and even put it on his homepage for a while. I got thousands of visitors from this stupid article: https://www.insuranceblogbychris.com/buy-life-insurance-before-eating-at-heart-attack-grill/
That sort of solidifies my thought that Google has always liked, and still likes, the sites that are most natural – so to me it seems best not to stress over nofollow and dofollow, regarding both on-site and off-site links, and just link to sites you really think are cool, and likewise comment on blogs you really like (and leave something useful)… if nothing else, if things change with nofollow again, you’ll have all those comments floating around out there, so it can’t hurt. And besides, you may get some visitors from them if the comments are half-decent.
For instance, you might use Facebook’s Lookalike Audiences to get your message in front of an audience similar to your core demographic. Or, you could pay a social media influencer to share images of your products to her already well-established community. Paid social media can attract new customers to your brand or product, but you’ll want to conduct market research and A/B testing before investing too much in one social media channel.
Can I just remind Google that not all “great content” is going to “attract links”, this is something I think they forget. I have great content on my site about plumbers in Birmingham and accountants in London, very valuable, detailed, non-spammy, hand-crafted copy on these businesses, highly valuable to anyone looking for their services. But no-one is ever going to want to link to it; it’s not topical or quirky, is very locally-focussed, and has no video of cats playing pianos.
Say I have an article on a blog with 5 links in the editorial copy — some of those links leading back to other content within the blog that I hope to do well. Then I get 35 comments on the article, with each comment having a link back to the commenters’ sites. That’s 40 links in all. Let’s say this particular page has $20 in PageRank to spend. Each link gets 50 cents.
“With 150 million pages, the Web had 1.7 billion edges (links).” Kevin Heisler, that ratio holds true pretty well as the web gets bigger. A good rule of thumb is that the number of links is about 10x the number of pages. I agree that it’s pretty tragic about Rajeev Motwani, who was a co-author of many of those early papers. I got to talk to Rajeev a little bit at Google, and he was a truly decent and generous man. What has heartened me is to see all the people that he helped, and to see those people pay their respects online. No worries on the Consumer WebWatch – I’m a big fan of Consumer WebWatch, and somehow I just missed their blog. I just want to reiterate that even though this feels like a huge change to a certain segment of SEOs, in practical terms this change really doesn’t affect rankings very much at all.
When writing this guide, we reached out to the marketer community to collect case studies and learnings about creative marketing strategies. Most of these examples are included throughout the guide, but some didn’t quite fit. So we included those loose ends here, from the perspective of four awesome marketers. What better way to wrap up this guide than with you, our community?
I think that removing the link to the sitemap shouldn’t be a big problem for navigation, but I wonder what happens with the disclaimer and the contact page? If nofollow doesn’t sink the linked page, how can we tell the search engine that these are not content pages? For some websites these are among the most-linked pages. And yes, for some sites the contact page is worth gaining rank, but for my website it is not.

How does this all relate to disallows in robots.txt? My e-commerce site has 12,661 pages disallowed because we got nailed for duplicate content. We sell batteries, so revisions to each battery were coming up as duplicate content. Is PageRank being sent (and ignored) to these internal disallowed links as well? One of our category levels has hundreds of links to different series found under models, and the majority of these series are disallowed. If PageRank acts the same with disallows as it does with nofollows, are these disallowed links hurting our rankings?
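For readers unfamiliar with the mechanism the commenter is describing, a robots.txt disallow looks like this (the path is hypothetical, invented to match the battery-revision example):

```
User-agent: *
# Hypothetical path: blocks crawling of per-revision battery pages
Disallow: /batteries/revisions/
```

Note the distinction the question turns on: a disallowed URL is never crawled, but search engines still see the links pointing at it, so PageRank can still flow to (and accumulate on) a blocked URL even though its content is never indexed.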

Finally, it’s critical you spend time and resources on your business’s website design. When these aforementioned customers find your website, they’ll likely feel deterred from trusting your brand and purchasing your product if they find your site confusing or unhelpful. For this reason, it’s important you take the time to create a user-friendly (and mobile-friendly) website.
Yep, please change things to stop keyword stuffing. Change them to stop cloaking. Definitely change them to stop buying links that try to game Google. But telling search engines not to give weight (that I control) to pages that are not what my site is about or are not really relevant? No way. This is logical stuff here. Maybe too logical. I think deep down you know this too, Matt.

Some business owners will think of a website. Others may think of social media, or blogging. In reality, all of these avenues of advertising fall into the category of internet marketing, and each is like a puzzle piece in a much bigger marketing picture. Unfortunately, for new business owners trying to establish their web presence, there are a lot of puzzle pieces to manage.
Two weeks ago I changed a few internal anchor-text links to an HTML SELECT element in order to save some space in the menu bar. Today, when I looked at Google’s cached (text-only) version of my site, I realized that none of the links in the HTML SELECT element can be followed. So I understand that Googlebot doesn’t follow these links and obviously there’s no inbound ‘link juice’. Is that so?
Peter made a very good point in all of this, and Michael Martinez did in a backhanded way as well. Talking about a concept related PageRank sounds cool. It doesn’t actually have to be useful or practical, and it usually isn’t; but as long as the impression of something productive is given off, then that can be all that matters in the eyes of those who lack sufficient knowledge.
Search engines often use the number of backlinks that a website has as one of the most important factors for determining that website's search engine ranking, popularity and importance. Google's description of its PageRank system, for instance, notes that "Google interprets a link from page A to page B as a vote, by page A, for page B."[6] Knowledge of this form of search engine ranking has fueled a portion of the SEO industry commonly termed linkspam, where a company attempts to place as many inbound links as possible to its site regardless of the context of the originating site. Search engine rankings are regarded as a crucial parameter for online business and for the conversion rate of visitors to a website, particularly in online shopping. Blog commenting, guest blogging, article submission, press release distribution, social media engagement, and forum posting can be used to increase backlinks.
Well – maybe for a few of you. But this algorithm is fundamental in understanding links and in particular, understanding why most links count for nothing or almost nothing. When you get to grips with Google’s algorithm, you will be light years ahead of other SEOs… but I never really see it properly explained. I guarantee that even if you know this algorithm inside out, you’ll see some unexpected results from this math by the end of this post and you will also never use the phrase “Domain Authority” in front of a customer again (at least in relation to links).
If I’m writing a page about the use of the vCard microformat, it absolutely makes sense for me to link out to the definition where it was originally published; that improves the user experience as well as lending authority to my arguments. As SEOs we often get obsessed with the little things, claiming that it’s hard to get links on particular subjects, and that is pretty true, but it’s mainly our own selfishness in not linking out to authority content that prevents other people giving us the same courtesy.
Jim Boykin blows my mind every time I talk to him. I have been doing SEO for 15 years and yet I am amazed at the deep stuff Jim comes up with. Simply amazing insights and always on the cutting edge. He cuts through the BS and tells you what really works and what doesn't. After our chat, I grabbed my main SEO guy and took him to lunch and said "you have to help me process all this new info..." I was literally pacing around the room...I have so many new ideas to experiment with that I would never have stumbled onto on my own. He is the Michael Jordan or the Jerry Garcia of links...Hope to go to NY again to Jim's amazing SEO classes. Thanks Jim! Michael G.
However, if you're like the hundreds of millions of other individuals looking to become the next David Sharpe, there are some steps that you need to take. In my call with this renowned online marketer, I dove deep into a conversation submerged in the field of internet marketing and worked to really understand what it takes to be a top earner. We're not just talking about making a few hundred or thousand dollars to squeak by here; we're talking about building an automated cash machine. It's not easy by any means.
Search engines are smart, but they still need help. The major engines are always working to improve their technology to crawl the web more deeply and return better results to users. However, there is a limit to how search engines can operate. Whereas the right SEO can net you thousands of visitors and increased attention, the wrong moves can hide or bury your site deep in the search results where visibility is minimal.