Concerning broken link building, it can also sometimes be relevant to scan the whole domain for broken external links (e.g. if the website is a blog within a specific niche, as these often feature multiple articles closely related to the same topics) using e.g. XENU, A1 Website Analyzer or similar. (Just be sure to enable checking of external links before crawling the website.)
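For readers who would rather script such a scan than run a desktop crawler, a minimal sketch in Python is below. It assumes you already have the site's page URLs collected in a list (the domain and pages shown are hypothetical) and relies on the requests and BeautifulSoup libraries; a real crawler would also want rate limiting and retries.

```python
# Minimal broken-external-link scan; domain and page list are hypothetical.
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

page_urls = ["https://example-blog.com/post-1", "https://example-blog.com/post-2"]
own_host = "example-blog.com"

broken = []
for page in page_urls:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        url = urljoin(page, a["href"])
        # Only external http(s) links matter for broken link building.
        if urlparse(url).netloc == own_host or not url.startswith("http"):
            continue
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # connection error, timeout, DNS failure, etc.
        if status is None or status >= 400:
            broken.append((page, url, status))

for page, url, status in broken:
    print(f"{page} -> {url} ({status})")
```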
Despite this many people seem to get it wrong! In particular “Chris Ridings of www.searchenginesystems.net” has written a paper entitled “PageRank Explained: Everything you’ve always wanted to know about PageRank”, pointed to by many people, that contains a fundamental mistake early on in the explanation! Unfortunately this means some of the recommendations in the paper are not quite accurate.
Yep, please change things to stop keyword stuffing. Change them to stop cloaking. Definitely change them to stop buying links that try to game Google. But telling search engines not to give weight (that I control) to pages that are not what my site is about or are not really relevant? No way. This is logical stuff here. Maybe too logical. I think deep down you know this Matt too.

An essential part of any Internet marketing campaign is the analysis of data gathered from not just the campaign as a whole, but each piece of it as well. An analyst can chart how many people have visited the product website since its launch, how people are interacting with the campaign's social networking pages, and whether sales have been affected by the campaign. This information will not only indicate whether the marketing campaign is working, but it is also valuable data to determine what to keep and what to avoid in the next campaign.


Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] Google achieves a similarly dominant market share in a number of other countries.
There are numerous repositories to source affiliate products and services from. However, some of the biggest are sites like Clickbank, Commission Junction, LinkShare and JVZoo. You'll need to go through an application process, for the most part, to get approved to sell certain products, services or digital information products. Once approved, be prepared to hustle.
In order to do all that, you will need to acquire and apply knowledge in human psychology. If you understand how your customers think, you can design for their needs. This course is based on tried and tested psychological techniques that bring together content and design so as to deliver hands-on advice for how to improve your web design and increase your customer engagement.
Using nofollow as a way to keep pages that shouldn’t be indexed out of Google (as with your feed example) is terrible advice. Nofollow on your feed link does nothing: if anyone links to your feed without nofollow, it’s going to get indexed. Things that shouldn’t be indexed need to be blocked with either robots.txt or meta robots. Nofollow on links to those items isn’t a solution.
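To make that concrete, here is a small sketch using Python's standard library to confirm what robots.txt actually blocks before relying on it (the site and feed path are hypothetical); the alternative is a meta robots "noindex" tag on the page itself.

```python
# Check whether a URL is disallowed by robots.txt. Pages that must stay
# out of the index need robots.txt or meta robots blocking, not nofollow
# on the links pointing at them. Site and path are hypothetical.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# False means compliant crawlers will not fetch the feed at all.
print(rp.can_fetch("*", "https://example.com/feed/"))
```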
Cause if I do that, if I write good content while my 100+ competitors link build, article market, forum comment, social bookmark, release viral videos, and buy links, I’ll end up at the very bottom of the pile, great content or not. Really, I am just as well taking my chances pulling off every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don’t, what do I have to lose?”
Targeting, viewability, brand safety and invalid traffic: these are all aspects marketers use to evaluate digital advertising. Cookies, the tracking tools used within desktop devices, cause difficulty: their shortcomings include deletion by web browsers, the inability to sort between multiple users of a device, inaccurate estimates for unique visitors, overstated reach, understated frequency, and problems with ad servers, which cannot distinguish between when cookies have been deleted and when consumers have simply not been exposed to an ad before. Due to the inaccuracies introduced by cookies, demographic estimates for the target market are unreliable and vary (Whiteside, 2016).[42] Another element affected within digital marketing is ‘viewability’, or whether the ad was actually seen by the consumer. Many ads are never seen by a consumer and may never reach the right demographic segment. Brand safety is another issue: whether the ad was shown in an unethical context or alongside offensive content. Recognizing fraud when an ad is exposed is another challenge marketers face. This relates to invalid traffic, as premium sites are more effective at detecting fraudulent traffic, while non-premium sites are more of the problem (Whiteside, 2016).[42]
Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people that do not run blog sites (the vast majority of sites!), for example an E-Com site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing because they can’t attract links naturally?
Having a ‘keyword rich’ domain name may lead to closer scrutiny from Google. According to Moz, Google has “de-prioritized sites with keyword-rich domains that aren’t otherwise high-quality. Having a keyword in your domain can still be beneficial, but it can also lead to closer scrutiny and a possible negative ranking effect from search engines—so tread carefully.”
As Rogers pointed out in his classic paper on PageRank, the biggest takeaway for us about the eigenvector piece is that it’s a type of math that lets you work with multiple moving parts. “We can go ahead and calculate a page’s PageRank without knowing the final value of the PR of the other pages. That seems strange but, basically, each time we run the calculation we’re getting a closer estimate of the final value. So all we need to do is remember each value we calculate and repeat the calculations lots of times until the numbers stop changing much.”
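To see what that repetition looks like in practice, here is a minimal sketch over a hypothetical three-page link graph, using the original formula PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over the pages T linking to A:

```python
# Iterative PageRank: start every page at 1.0 and repeat the
# calculation until the values stop changing much.
d = 0.85  # damping factor
links = {  # page -> pages it links out to (hypothetical graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

pr = {page: 1.0 for page in links}
for _ in range(50):  # "repeat the calculations lots of times"
    pr = {
        page: (1 - d) + d * sum(
            pr[t] / len(links[t]) for t in links if page in links[t]
        )
        for page in links
    }

print(pr)  # the values settle, and their average is 1.0
```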
Backlinks are a major ranking factor for most search engines, including Google. If you want to do SEO for your website and get relevant organic traffic, building backlinks is something you should be doing. The more backlinks your website has from authoritative domains, the higher your reputation in Google’s eyes, and the better you’ll perform in the SERPs.
First of all, it’s necessary to sort out what a backlink is. There is no need to explain everything in detail; the main thing is to understand what it is for and how it works. A backlink is simply a link placed on an external website that points back to your site. In other words, when visitors to those external sites follow the link, it leads them to your site.
It is clear that something new should emerge to fill the gap that nofollow left. It is sometimes suggested that search engines may use so-called implied links to rank a page. Implied links are, for example, mentions of your brand. They usually carry a tone: positive, neutral, or negative. That tone shapes the reputation of your site, and this reputation can serve as a ranking signal to search engines.
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.

If you're serious about finding your voice and discovering the secrets to success in business, one of the best people to follow is Gary Vaynerchuk, CEO of Vayner Media and an early-stage investor in Twitter, Uber and Facebook. He has arbitraged his way onto the most popular social media platforms, built up massive followings, and often spills out the secrets to success in a highly motivating and inspiring way.


There’s a lot of frustration being vented in this comments section. It is one thing to be opaque – which Google seems to be masterly at – but quite another to misdirect, which is what No Follow has turned out to be. All of us who produce content always put our readers first, but we also have to be sensible as far as on page SEO is concerned. All Google are doing with this kind of thing is to progressively direct webmasters towards optimizing for other, more reliable and transparent, ways of generating traffic (and no, that doesn’t necessarily mean Adwords, although that may be part of the intent).
Ian Rogers first used the Internet in 1986 sending email on a University VAX machine! He first installed a webserver in 1990, taught himself HTML and perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years and has also been known to do a little Jive dancing.
Also, by means of the iterative calculation, the sum of all pages' PageRanks still converges to the total number of web pages, so the average PageRank of a web page is 1. The minimum PageRank of a page is given by (1-d). There is also a maximum PageRank for a page, given by dN+(1-d), where N is the total number of web pages. This maximum can theoretically occur if all web pages solely link to one page, and this page also solely links to itself.
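Both bounds drop straight out of the formula (C(T_i) is the number of outbound links on page T_i):

```latex
\[
PR(A) = (1 - d) + d \sum_{i} \frac{PR(T_i)}{C(T_i)}
\]
% A page with no inbound links has an empty sum, so the minimum is
% PR = 1 - d. If all N pages link solely to page A, and A links only
% to itself, every page contributes its entire PageRank to the sum,
% so \sum_i PR(T_i)/C(T_i) = N and the maximum is PR = (1 - d) + dN.
```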

For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”
While ordinary users were not that interested in pages' scores, SEOs saw a great opportunity to make a difference for their customers. This obsession with PageRank made everyone feel that it was more or less the only ranking signal that mattered, even though pages with a lower PR score can beat those with a higher one. What did we get as a result?
It's key to understand that nobody really knows what goes into PageRank. Many believe that there are dozens if not hundreds of factors, but that the roots go back to the original concept of linking. It's not just the volume of links either. Thousands of links from unauthoritative sites might be worth a handful of links from sites ranked as authoritative.

PageRank is a link analysis algorithm and it assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by PR(E). Other factors like Author Rank can contribute to the importance of an entity.


In the beginning, it was rough for Sharpe. No one out there should think that it's going to be easy whatsoever. His journey took years and years to go from an absolute beginner to a fluid and seasoned professional, able to clearly visualize and achieve his dreams, conveying his vast knowledge expertly to those hungry-minded individuals out there looking to learn how to generate a respectable income online.
In regards to link sculpting, I think the pros of having the “no follow” attribute outweigh the few who might use it to link sculpt. Those crafty enough to link sculpt don’t actually need this attribute, but it does make life easier and is a benefit. Without this attribute I would simply change the hierarchy of the internal linking structure of my site and yield the same results I would if the “no follow” attribute didn’t exist.
If your anchor text is aggressive and you distribute it the wrong way, your site will be deprived of rankings, and you may get a penalty. Most of your backlinks should be naked and branded. Be very selective about the anchors you use for your website; you can analyze your anchor list with the help of a free backlink checker, which helps you understand what to improve in your link building strategy.
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[49] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[50]
Likewise, ‘nofollowing’ your archive pages on your blog. Is this really a bad thing? You can get to the pages from the ‘tag’ index or the ‘category’ index, why put weight to a page that is truly navigational. At least the tag and category pages are themed. Giving weight to a page that is only themed by the date is crazy and does not really help search engines deliver ‘good’ results (totally leaving aside the duplicate content issues for now).
Backlinks are important for a number of reasons. The quality and quantity of pages backlinking to your website are some of the criteria used by search engines like Google to determine your ranking on their search engine results pages (SERP). The higher you rank on a SERP, the better for your business as people tend to click on the first few search results Google, Bing or other search engines return for them.
The new digital era has enabled brands to selectively target customers that may potentially be interested in their brand or based on previous browsing interests. Businesses can now use social media to select the age range, location, gender and interests of whom they would like their targeted post to be seen by. Furthermore, based on a customer's recent search history, they can be ‘followed’ on the internet so they see advertisements from similar brands, products and services.[38] This allows businesses to target the specific customers that they know and feel will most benefit from their product or service, something that had limited capabilities up until the digital era.
Before I start this, I am using the term ‘PageRank’ as a general term, fully knowing that this is not a simple issue and that ‘PageRank’ and the way it is calculated (and the other numerous methods Google uses) are multidimensional and complex. However, if you use PageRank to mean ‘weight’, it makes things a lot simpler. Also, ‘PageRank sculpting’ (in my view) is meant to mean ‘passing weight you can control’. Now… on with the comment!
This year, for the first time, Google stated that user experience would be a core part of gaining rankings for mobile websites. A poorer user experience would send your site hurtling down the rankings. This appeared to come as a shock to many in the SEO community and despite assurances that content was still king – many seemed to feel that this ...
Sharpe says that you shouldn't dive into internet marketing until you decide on a niche and figure out what you're passionate about. Do you want to join the make-money-online (MMO) niche? Or do you want to engage in another niche? For example, you could sell products or online courses about blogging or search engine optimization or anything else for that matter. Keep in mind that whatever you're selling, whatever niche you're in, that you need to embed yourself there deeply.
Excellent post! I’m reasonably savvy up to a certain point and have managed to get some of my health content organically ranking higher than WebMD. It’s taken a long time building strong backlinks from very powerful sites (HuffingtonPost being one of them), but I am going to take some time, plow through a few beers, and then get stuck into implementing some of these suggestions. Keep up the great work amigo. Cheers, Bill
PageRank is one of many, many factors used to produce search rankings. Highlighting PageRank in search results doesn’t help the searcher. That’s because Google uses another system to show the most important pages for a particular search you do. It lists them in order of importance for what you searched on. Adding PageRank scores to search results would just confuse people. They’d wonder why pages with lower scores were outranking higher scored pages.
Moreover, the PageRank mechanism is entirely general, so it can be applied to any graph or network in any field. Currently, the PR formula is used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It's even used for system analysis of road networks, as well as in biology, chemistry, neuroscience, and physics.
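As an illustration of that generality, the same calculation runs unchanged on a non-web graph. Here is a small sketch using the networkx library on a hypothetical toy road network; note that networkx's implementation normalizes the scores to sum to 1 rather than to the number of nodes:

```python
# PageRank over a directed graph of road segments; the node names
# are hypothetical. nx.pagerank works on any directed graph.
import networkx as nx

roads = nx.DiGraph()
roads.add_edges_from([
    ("Main St", "Bridge"), ("Bridge", "Market Sq"),
    ("Market Sq", "Main St"), ("Station", "Market Sq"),
    ("Main St", "Station"),
])

scores = nx.pagerank(roads, alpha=0.85)  # alpha is the damping factor d
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```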
SEO is an acronym for "search engine optimization" or "search engine optimizer." Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your site and reputation. Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site. Many SEOs and other agencies and consultants provide useful services for website owners, including:
Thanks for sharing this, Matt. I’m happy that you took the time to do so considering that you don’t have to. What I mean is, in an ideal world, there should be no such thing as SEO. It is the SE’s job to bring the right users to the right sites and it is the job of webmasters to cater to the needs of the users brought into their sites by SEs. Webmasters should not be concerned with bringing the users in themselves (aside from offsite or sponsored marketing campaigns). The moment they do, things start to get ugly, because SEs now have to implement counter-measures to most SEO tactics. This becomes an unending spiral. If people only stick to their part of the equation, SEs will have more time to develop algorithms for making sure webmasters get relevant users rather than algorithms for combating SEOs to ensure search users get relevant results. Just do your best in providing valuable content and Google will try their best in matching you with your users. Don’t waste time trying to second-guess how Google does it so that you can present yourself to Google as having more value than you really have. They have great engineers and they have the code—you only have a guess. At most, the SEO anyone should be doing is to follow the webmaster guidelines. It will benefit all.

One of the earliest adopters of Internet marketing in the world of Fortune 500 companies was the Coca-Cola Corporation. Today, this huge purveyor of soft drinks has one of the strongest online portfolios in the world. More than 12,000 websites link to the Coca-Cola homepage, which itself is a stunning display of Internet savvy. Their homepage alone sports an auto-updating social network column, an embedded video, a unique piece of advertising art, frequently rotating copy, an opt-in user registration tab, tie-in branding with pop culture properties, and even a link to the company's career opportunities page. Despite how busy that sounds, the Coca-Cola homepage is clean and easy to read. It is a triumph of Internet marketing for its confidence, personality, and professionalism.
Thanks for the clarification, Matt. We were just wondering today when we would hear from you on the matter since it had been a couple of weeks since SMX. I think we’d all be interested to know the extent to which linking to “trusted sites,” helps PageRank. Does it really mitigate the losses incurred by increasing the number of links? I ask because it seems pretty conclusive that the total number of outbound links is now the deciding metric for passing PageRank and not the number of DoFollow links. Any thoughts from you or others?

Can I just remind Google that not all “great content” is going to “attract links”, this is something I think they forget. I have great content on my site about plumbers in Birmingham and accountants in London, very valuable, detailed, non-spammy, hand-crafted copy on these businesses, highly valuable to anyone looking for their services. But no-one is ever going to want to link to it; it’s not topical or quirky, is very locally-focussed, and has no video of cats playing pianos.
From a customer experience perspective, we currently have three duplicate links to the same URL, i.e. ????.com/abcde. These links are helpful for the visitor to locate relevant pages on our website. However, my question is: does Google count all three of these links and pass all the value, or does Google only transfer the weight from one of these links? If it only transfers value from one of these links, does the link juice disappear from the two other links to the same page, or have these links never been given any value?

I compare the latest Google search results to this: McDonald's is the most popular and is #1 in hamburgers… they don't taste that great but people still go there. BUT I bet you know a good burger joint down the road from Google that makes awesome burgers, 10X better than McDonald's, but “we” cannot find that place because he does not have the resources or budget to market his burgers effectively.


nofollow is beyond a joke now. There is so much confusion (especially when other engines’ treatment is factored in), I don’t know how you expect a regular publisher to keep up. The expectation seems to have shifted from “Do it for humans and all else will follow” to “Hang on our every word, do what we say, if we change our minds then change everything” and nofollow lead the way. I could give other examples of this attitude (e.g. “We don’t follow JavaScript links so it’s ‘safe’ to use those for paid links”), but nofollow is surely the worst.
Word of mouth communications and peer-to-peer dialogue often have a greater effect on customers, since they are not sent directly from the company and are therefore not planned. Customers are more likely to trust other customers’ experiences.[22] Examples can be that social media users share food products and meal experiences highlighting certain brands and franchises. This was noted in a study on Instagram, where researchers observed that adolescent Instagram users' posted images of food-related experiences within their social networks, providing free advertising for the products.[26]
I have not at all seen the results I would expect in terms of page rank throughout my site. I have almost everything pointing at my home page, with a variety of anchor text, but my rank is 1. There is a page on my site with 3, though, and a couple with 2, so it certainly is not all about links; I do try to have somewhat unique and interesting content, but some of my strong pages are default page content. I will explore the help forum. (I guess these comments are nofollow :P) I would not mind a piece of this page rank …
Search engines are smart, but they still need help. The major engines are always working to improve their technology to crawl the web more deeply and return better results to users. However, there is a limit to how search engines can operate. Whereas the right SEO can net you thousands of visitors and increased attention, the wrong moves can hide or bury your site deep in the search results where visibility is minimal.