Wow Brian… I’ve been making and promoting websites full-time since 2006, and just when I thought I’d seen it all, here you are introducing me to all these innovative ways of getting backlinks that I wasn’t aware of before. I never subscribe to newsletters, but yours is just too good to say no to! Thanks very much for this information. Off to read your other posts now…
I work on a site that allows users to find what they are looking for by clicking links that take them deeper and deeper into the site hierarchy. Content can be categorised in lots of different ways. After about three steps the difference between the results pages shown is of significance to a user but not to a search engine. I was about to add nofollow to links that took the browser deeper than 3 levels but after this announcement I won’t be…

The internet is full of business potential, but it is also rife with competition. In this situation, it becomes really tough to sell your products or services. Affiliate marketing can help you effectively promote your product on the web. By helping you reach out to a large potential customer base, affiliate programs help you to connect with millions of customers across the globe.
Google will index this link and see that ESPN has high authority and a lot of trust, but the relevancy is fairly low. After all, you are a local plumber and they are the biggest sports news website in the world. Once Google has indexed both sites, it can see that they do not have much in common. Google will still give you credit for the link, but there is no telling how much.
Something a lot of people seem to have overlooked was hinted at in Greg Boser’s comment above. Greg identified a major (and unfair) disparity in how authority sites such as Wikipedia disrupt the linkscape with run-of-site nofollows. Once Wikipedia implemented nofollow, previously high-value links from Wikipedia were rendered worthless, making the site less of a target for spammers. Increasingly, large sites are following suit in order to cleanse their own pages of spam.
@Ronny – At SMX Advanced it was noted by Google that they can, and do follow JavaScript links. They also said that there is a way to provide a nofollow to a JavaScript link but they didn’t go into much detail about it. Vanessa Fox recently wrote a lengthy article about it over on Search Engine Land which will likely address any questions you might have: http://searchengineland.com/google-io-new-advances-in-the-searchability-of-javascript-and-flash-but-is-it-enough-19881
It helps to improve your ranking for certain keywords. If we want this article to rank for the term ‘SEO basics’, then we can begin linking to it from other posts using variations of similar anchor text. This tells Google that this post is relevant to people searching for ‘SEO basics’. Some experts recommend varying the anchor text pointing to the same page, as Google may see multiple identical uses as ‘suspicious’.
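The "vary your anchor text" advice can be checked mechanically. Here is a minimal Python sketch that tallies the anchor texts pointing at one page and flags any single phrase that dominates; the 50% threshold is an arbitrary assumption for illustration, not a documented Google limit:

```python
from collections import Counter

def anchor_text_report(anchors, threshold=0.5):
    """Summarize the distribution of anchor texts pointing at one page.

    If a single phrase accounts for more than `threshold` of all
    inbound anchors, flag it as potentially over-optimized.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    flagged = [text for text, n in counts.items() if n / total > threshold]
    return counts, flagged

# Example: four inbound links, three using the exact same phrase.
anchors = ["SEO basics", "SEO basics", "SEO basics",
           "beginner's guide to SEO"]
counts, flagged = anchor_text_report(anchors)
print(flagged)  # ['seo basics'] -- 3 of 4 anchors use one phrase
```

A more varied profile (exact phrase, partial match, brand name, bare URL) would pass the same check without a flag.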
Example: A blogger, John Doe, writes a very interesting article about a sports event. Another blogger, Samantha Smith, doesn’t agree with John’s article and responds to it in an article for an online magazine. She links to John’s article so that her readers can understand both points of view. John’s blog gets a valuable backlink. Samantha’s article, meanwhile, becomes popular, and many other websites link to it, so her website gains many new backlinks. Even though John only got one backlink for his article, the value of that backlink is increased by the backlinks Samantha’s article generated.
3) Some people don’t believe things have changed. In fact, if things really changed substantially a year ago, you’d think a few of the advanced SEOs out there would have noticed this and talked about it. But nada. There are lots of reasons why the change could have happened and not been spotted. Sculpting might really have been a second or third order factor, as Matt calls it — not helping things as much as some have assumed. SEOs that spotted it might have stayed quiet. Or, it didn’t change — and still hasn’t changed — and sculpting does work even better than Google thought, so it wants to put out a message that it doesn’t, in hopes of putting the genie back in the bottle. That’s probably the major conspiracy theory out there.
Google's strategy works well. By focusing on the links going to and from a Web page, the search engine can organize results in a useful way. While there are a few tricks webmasters can use to improve Google standings, the best way to get a top spot is to consistently provide top quality content, which gives other people the incentive to link back to their pages.

Ian Rogers first used the Internet in 1986, sending email on a University VAX machine! He first installed a web server in 1990 and taught himself HTML and Perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database-Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years, and has also been known to do a little Jive dancing.
Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of them on a single page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and too many links to sites with the same IP address is referred to as backlink bombing.
Steve, sometimes good information to users is a consolidation of very high quality links. We have over 3000 links to small business sites within the SBA, as well as links to the Harvard and Yale libraries, academic journals, etc. But because we have the understanding that there should be no more than a hundred links on a page (more now, from what Matt said), we have used nofollow on all of them out of fear that Google will penalize our site because of the number of links.
Moreover, the PageRank mechanism is entirely general, so it can be applied to any graph or network in any field. Currently, the PageRank formula is used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It is even used for systems analysis of road networks, as well as in biology, chemistry, neuroscience, and physics.
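That generality is easy to see in code: the core computation needs nothing but a directed graph. A minimal power-iteration sketch in Python (the 0.85 damping factor follows the original paper's convention; the toy graph is invented for illustration):

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict {node: [outbound links]}."""
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node starts each round with the "random jump" share.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, out_links in graph.items():
            if out_links:
                # A node divides its rank evenly among its outbound links.
                share = damping * rank[node] / len(out_links)
                for target in out_links:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# A tiny graph: three pages link to "hub", which links back to "a".
graph = {"a": ["hub"], "b": ["hub"], "c": ["hub"], "hub": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "hub" accumulates the most rank
```

Nothing here mentions web pages; swap in paper citations, road intersections, or protein interactions and the same function ranks them, which is exactly why the formula travels so well across fields.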

The better you understand SEO, and the more strides you take to learn this seemingly confusing and complex discipline, the more likely you'll be to appear organically in search results. And let's face it, organic search is important to marketing online. Considering that most people don't have massive advertising budgets and don't know the first thing about lead magnets, squeeze pages and sales funnels, being visible is critical to long-term success.
There’s a need for a skilled SEO to assess the link structure of a site with an eye to crawling and PageRank flow, but I think it’s also important to look at where people are actually surfing. Indiana University did a great paper called Ranking Web Sites with Real User Traffic (PDF). If you take the classic PageRank formula and blend it with real traffic, you come out with some interesting ideas…
I think Google will always be working to discern and deliver “quality, trustworthy” content, and I think analyzing inbound links as endorsements is a solid tool the SE won’t be sunsetting anytime soon. Why would they? If the president of the United States links to your page, that is undoubtedly an endorsement that tells Google you’re a legitimate trusted source. I know that is an extreme example, but I think it illustrates the principles of a linking-as-endorsement model well.
He is the co-founder of Neil Patel Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations.

Matt, you don’t mention the use of disallow pages via robots.txt. I’ve read that PageRank can be better utilised by disallowing pages that probably don’t add value to users searching on engines. For example, Privacy Policy and Terms of Use pages. These often appear in the footer of a website and are required by EU law on every page of the site. Will it boost the other pages of the site if these pages are added to robots.txt like so?
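A disallow rule of the kind the comment describes might look like the following sketch (the /privacy-policy and /terms-of-use paths are hypothetical; note that Disallow only prevents crawling, and a disallowed URL can still appear in the index if other pages link to it):

```
User-agent: *
Disallow: /privacy-policy
Disallow: /terms-of-use
```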
So enough of these scary stories. Google actually likes backlinks and relies upon them. The whole idea behind them is that they help to tell Google what is good and useful out there. Remember, it is still an algorithm. It doesn’t know that your page describing the best technique for restoring a 1965 Ford Mustang bumper is all that great. But if enough people are talking about how great it is, and thereby referencing that page on other websites, Google will actually know.
Hey Brian, this is extremely fantastic stuff; I can’t find the words to do your work justice. Brilliant. No one dares to share their business secrets with others, but you are awesome, and thank you so much. I am a beginner in digital marketing, learning consistently by following your posts, tips and tricks. Eventually I became intermediate; thanks for your help.
9. Troubleshooting and adjustment. In your first few years as a search optimizer, you’ll almost certainly run into the same problems and challenges everyone else does; your rankings will plateau, you’ll find duplicate content on your site, and you’ll probably see significant ranking volatility. You’ll need to know how to diagnose and address these problems if you don’t want them to bring down the effectiveness of your campaign.
