What an article! Thank you so much for the priceless information. We will be changing our pages around to make sure we get the highest PageRank available to us, and we are trying to get high-PageRank sites to link to us. Hopefully there is more information out there to gather, as we want to compete within our market and gain as much market share as possible.
We have other ways to consider relevance. Topical Trust Flow is one; page titles and anchor texts are others. If you put a search term into our system (instead of a URL), you actually get back a search engine! We don’t profess to be a Google (yet), but we can show our customers WHY one page is more relevant in our algorithm than another page. This could prove useful for SEOs. We actually launched that in 2013, but the world maybe never noticed 🙂

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
So, as you build a link, ask yourself, "am I doing this for the sake of my customer or as a normal marketing function?" If not, and you're buying a link, spamming blog comments, posting low-quality articles and whatnot, you risk Google penalizing you for your behavior. This could be as subtle as a drop in search ranking, or as harsh as a manual action, getting you removed from the search results altogether! 

Me, I didn’t like the sculpting idea from the start. I linked to what I thought should get links and figured that was pretty natural, to have navigational links, external links and so on — and natural has long been the thing Google’s rewarded the most. So I didn’t sculpt, even after Matt helped put it out there, because it just made no long-term sense to me.

Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine's unpaid results—often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher ranked on the search results page), and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers.[1] SEO may target different kinds of search, including image search, video search, academic search,[2] news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business' online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former instead is more focused on national or international searches.

One more important thing to keep in mind is that this factor is just part of the story of what helps pages rank high in SERPs. Yes, it was the first one used by Google, but now there are lots of ranking factors; they all matter, and they are all taken into account for ranking. The most essential is deemed to be content. You know this: content is king, and there is no way around it. User experience is the new black (and with the new Speed Update, it will become even more important).

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognised term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[38] With regard to the changes made to search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on 'trusted' authors.

Mega-sites, like http://news.bbc.co.uk, have tens or hundreds of editors writing new content – i.e. new pages – all day long! Each one of those pages has rich, worthwhile content of its own and a link back to its parent or the home page! That’s why the home page Toolbar PR of these sites is 9/10 and the rest of us just get pushed lower and lower by comparison…
Ian Rogers first used the Internet in 1986, sending email on a university VAX machine! He first installed a web server in 1990 and taught himself HTML and Perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database-Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years, and has also been known to do a little Jive dancing.
If you are serious about improving web traffic to your website, we recommend you read Google Webmasters and Webmaster Guidelines. These contain the best practices to help Google (and other search engines) find, crawl, and index your website. After you have read them, you MUST try our Search Engine Optimization Tools to help you with Keyword Research, Link Building, Technical Optimization, Usability, Social Media Strategy and more.
Writing blog posts is especially effective for providing different opportunities to land on page one of search engines -- for instance, maybe your eyeglass store’s website is on page three of Google for “eyeglasses,” but your “Best Sunglasses of 2018” blog post is on page one, pulling in an impressive amount of traffic (over time, that blog post could also boost your overall website to page one).
I really appreciate that you keep us updated as soon as you can, but in some cases, e.g. WRT rel-nofollow, the most appreciated update would be the removal of this very much hated and pretty useless microformat. I mean, when you’ve introduced it because the Google (as well as M$, Yahoo and Ask) algos were flawed at this time, why not take the chance and dump it now when it’s no longer needed?
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmasters website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
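In markup terms, the protection described above is a single attribute on each user-submitted link. A minimal sketch (the comment text and URL are hypothetical):

```html
<!-- A user-added link in a blog comment, marked so that search engines
     do not count it as an endorsement from the host page -->
<div class="comment">
  Great post! Check out
  <a href="http://example.com/some-page" rel="nofollow">my site</a>.
</div>
```

Most blogging platforms apply `rel="nofollow"` to comment links automatically, but it is worth verifying in the rendered HTML rather than assuming.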
If you decide to go into affiliate marketing, understand that you will need a lot of very targeted traffic if you want to make any real money. Those affiliate offers also need to provide a high commission amount to you on each sale. You also need to ensure that the returns or chargebacks for those products or services are low. The last thing you want to do is to sell a product or service that provides very little value and gets returned often.

What is a useful place in search results? Ideally, you need to be in the top three search results returned. More than 70% of searches are resolved in these three results, while 90% are resolved on the first page of results. So, if you’re not in the top three, you’re going to find you’re missing out on the majority of potential business—and if you’re not on the first page, you’re going to miss out on nearly all potential business.
I agree that there is no point in trying to over analyze how the PageRank is flowing through your site. Just focus on great content. Link out when it actually helps the reader. This is what Google wants – for you to give good quality content to their users. So if you are doing that, they will reward you in the long run. No need to worry yourself with these types of link strategies.
SEM, on the other hand, costs money but can deliver very rapid results. Your website must be optimized to make sales or at least drive a customer to get in touch (GIT – in marketing terms) so you can make a sale. You should approach SEM with care and make sure you completely understand how much money you have exposed at any one time. Start slow and evaluate your results.
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
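As a sketch of the metadata the paragraph above refers to, the mobile version of a page would carry the same head elements as the desktop version (all URLs here are hypothetical):

```html
<head>
  <title>Example Product – Example Store</title>
  <meta name="description" content="Short summary shown in search result snippets.">
  <!-- Point search engines at the preferred version of this page -->
  <link rel="canonical" href="https://www.example.com/product">
  <!-- If a separate mobile URL exists, declare it on the desktop page -->
  <link rel="alternate" media="only screen and (max-width: 640px)"
        href="https://m.example.com/product">
</head>
```

The key point is parity: whatever titles, descriptions, and structured data the desktop pages expose should appear on every version of the page.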

The combination of charisma, charm and intellect has helped catapult Sharpe to the top of the heap. In a recent conversation with him, I wanted to learn what it truly took to become an expert digital marketer. And one of the most important takeaways from that phone call was that if he could do it, anyone could do it. For someone who failed so devastatingly very early on in life, to rise from the ashes like a phoenix was no easy feat.
2. Was there really a need to make this change? I know all sites should be equally capable of being listed in search engines without esoteric methods playing a part. But does this really happen anyway (in search engines or life in general)? If you hire the best accountant you will probably pay less tax than the other guy. Is that really fair? Also, if nobody noticed the change for a year (I did have an inkling, but was totally and completely in denial) then does that mean the change didn’t have to be made in the first place? As said, we now have a situation where people will probably make bigger and more damaging changes to their site and structure, rather than add a little ‘nofollow’ to a few links.
Game advertising - In-game advertising is defined as the "inclusion of products or brands within a digital game."[49] The game allows brands or products to place ads within their game, either in a subtle manner or in the form of an advertisement banner. Many factors determine whether brands are successful in advertising their brand/product, these being: type of game, technical platform, 3-D and 4-D technology, game genre, congruity of brand and game, and prominence of advertising within the game. Individual factors consist of attitudes towards placement advertisements, game involvement, product involvement, and flow or entertainment. The attitude towards the advertising also takes into account not only the message shown but also the attitude towards the game. How enjoyable the game is will determine how the brand is perceived, meaning that if the game isn't very enjoyable the consumer may subconsciously develop a negative attitude towards the brand/product being advertised. In terms of Integrated Marketing Communication, the "integration of advertising in digital games into the general advertising, communication, and marketing strategy of the firm"[49] is important, as it results in more clarity about the brand/product and creates a larger overall effect.
Secondly, nofollow is also essential on links to off-topic pages, whether they’re internal or external to your site. You want to prevent search engines from misunderstanding what your pages are about. Linking relevant pages together reinforces your topic relevance. So to keep your topic silos clear, strategic use of the nofollow attribute can be applied when linking off-topic pages together.
Finally, it’s critical you spend time and resources on your business’s website design. When these aforementioned customers find your website, they’ll likely feel deterred from trusting your brand and purchasing your product if they find your site confusing or unhelpful. For this reason, it’s important you take the time to create a user-friendly (and mobile-friendly) website.

where N is the total number of all pages on the web. The second version of the algorithm, indeed, does not differ fundamentally from the first one. Regarding the Random Surfer Model, the second version's PageRank of a page is the actual probability for a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.
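The "second version" described above can be sketched in a few lines of Python. This is a toy model on an assumed four-page graph (the pages and links are hypothetical), using the standard formulation PR(p) = (1 − d)/N + d · Σ PR(q)/C(q), where C(q) is the number of outlinks on page q:

```python
# Toy PageRank by power iteration on a hypothetical 4-page web.
d = 0.85  # damping factor

# Adjacency: page -> pages it links to (every page here has outlinks,
# so the total PageRank is conserved at each iteration)
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
N = len(links)
pr = {p: 1.0 / N for p in links}  # start from a uniform distribution

for _ in range(50):  # iterate until (approximately) converged
    new = {}
    for p in links:
        inbound = sum(pr[q] / len(links[q]) for q in links if p in links[q])
        new[p] = (1 - d) / N + d * inbound
    pr = new

# The PageRanks form a probability distribution: they sum to 1.
print(round(sum(pr.values()), 6))
```

Note how page D, which nothing links to, bottoms out at (1 − d)/N: the probability of a random surfer teleporting straight to it.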
This is also about expectations. Anyone that tries to sell you a get-rich-quick scheme is selling you short. There is no such thing. You have to put in the time and do the work, adding enormous amounts of value along the way. That's the truth of the matter and that's precisely what it takes. Once you understand that it's all about delivering sincere value, you need to understand where the money comes from.
Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people that do not run blog sites (the vast majority of sites!), for example an e-com site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing, because they can’t attract links naturally?
PageRank is often considered to be a number between 0 and 10 (with 0 being the lowest and 10 being the highest) though that is also probably incorrect. Most SEOs believe that internally the number is not an integer, but goes to a number of decimals. The belief largely comes from the Google Toolbar, which will display a page's PageRank as a number between 0 and 10. Even this is a rough approximation, as Google does not release its most up to date PageRank as a way of protecting the algorithm's details.

Bad idea. Many, many websites nowadays use nofollow on every single external link to sustain the value of their PageRank, not only to prevent comment spam, which is in my opinion defeating the original purpose of nofollow. Google should’ve “marketed” nofollow as nospam, without going into much detail about how it relates to PageRank, to keep webmasters away from the idea that nofollow means better PageRank.
Influencer marketing: Important nodes are identified within related communities, known as influencers. This is becoming an important concept in digital targeting. It is possible to reach influencers via paid advertising, such as Facebook Advertising or Google Adwords campaigns, or through sophisticated sCRM (social customer relationship management) software, such as SAP C4C, Microsoft Dynamics, Sage CRM and Salesforce CRM. Many universities now focus, at Masters level, on engagement strategies for influencers.
It also seems that the underlying message is that Google is constantly trying to find ways to identify the value of a page to its users; as it does so, it will promote those pages more strongly in its search results and demote those that offer less real value, and it does not care how much you invest in trying to game the system by following ‘the rules’. As a small website operator with no SEO budget and little time to apply the tricks and best practice, I think this is probably a good thing.
So, when you find a relevant forum, be sure that you have written a proper profile description and toss in your main concept or keyword of great significance. Then study the forum, its rules, and the way it operates. Examine the forum to learn whether its members share links in threads. Become a reliable person, making more and more friends and writing posts that interest the forum participants. Thanks to that you may get more internal linkage to your profile and gain authority. And, of course, threads will build your credibility. Why do you need all that?
Hi Brian, as usual solid and helpful content, so thank you. I have a question which the internet doesn’t seem to be able to answer; I thought perhaps you could. I have worked hard on building backlinks, and with success. However, they are just not showing up regardless of what tool I use to check (Ahrefs, etc.). It has been about 60 days and there are 10 quality backlinks not showing. Any ideas? Thanks!
Muratos – I’ve never nofollowed Amazon affiliate links on the theory that search engines probably recognize them for what they are anyway. I have a blog, though, that gets organic traffic from those Amazon products simply because people are looking for “Copenhagen ring DVD” and I hard-code the product names, musicians’ names, etc. on the page rather than use Amazon’s sexier links in iframes, etc.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[60] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[61] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[62] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
There’s a lot of frustration being vented in this comments section. It is one thing to be opaque – which Google seems to be masterly at – but quite another to misdirect, which is what No Follow has turned out to be. All of us who produce content always put our readers first, but we also have to be sensible as far as on page SEO is concerned. All Google are doing with this kind of thing is to progressively direct webmasters towards optimizing for other, more reliable and transparent, ways of generating traffic (and no, that doesn’t necessarily mean Adwords, although that may be part of the intent).
Despite this many people seem to get it wrong! In particular “Chris Ridings of www.searchenginesystems.net” has written a paper entitled “PageRank Explained: Everything you’ve always wanted to know about PageRank”, pointed to by many people, that contains a fundamental mistake early on in the explanation! Unfortunately this means some of the recommendations in the paper are not quite accurate.

Probably the most creative thing I’ve ever done was writing a hilarious review of a restaurant (The Heart Attack Grill) and emailing it to the owner. He loved it so much he posted it on FB and even put it on his homepage for a while. I got thousands of visitors from this stupid article: https://www.insuranceblogbychris.com/buy-life-insurance-before-eating-at-heart-attack-grill/
An aesthetically pleasing and informational website is an excellent anchor that can easily connect to other platforms like social networking pages and app downloads. It's also relatively simple to set up a blog within the website that uses well-written content with “keywords” an Internet user is likely to use when searching for a topic. For example, a company that wants to market its new sugar-free energy drink could create a blog that publishes one article per week that uses terms like “energy drink,” “sugar-free,” and “low-calorie” to attract users to the product website.

Outreach to webmasters should be personalized. Listing reasons why you like their brand, explaining why you think your brand would partner well with them, or citing articles and other content they published are all great ways to make them more receptive. Try to find an actual point of contact on professional sites like LinkedIn. A generic blast of “Dear Webmaster…” emails is really just a spam campaign.

Discoverability is not a new concept for web designers. In fact Search Engine Optimization and various forms of Search Engine Marketing arose from the need to make websites easy to discover by users. In the mobile application space this issue of discoverability is becoming ever more important – with nearly 700 apps a day being released on Apple’...
Internet marketing, or online marketing, refers to advertising and marketing efforts that use the Web and email to drive direct sales via electronic commerce, in addition to sales leads from websites or emails. Internet marketing and online advertising efforts are typically used in conjunction with traditional types of advertising such as radio, television, newspapers and magazines.

However, the biggest contributing factor to a backlink’s effect on your rank is the website it’s coming from, measured by the acronym ART: authority, a measure of a site’s prestige/reliability (.edu and .gov sites are particularly high-authority); relevance, a measure of how related the site hosting the link is to the content; and trust, which is not an official Google metric, but relates to how much a site plays by the rules of search (i.e. not selling links) and provides good content.
I think Matt Grenville’s comment is a very valid one. If your site, for whatever reason, can not attract links naturally and all of your competitors are outranking you by employing tactics that might breach Google’s TOS, what other options do you have? As well as this people will now only link to a few, trusted sites (as this has been clarified in your post as being part of Google’s algorithm) and put a limit on linking out to the smaller guys.
3. General on-site optimization. On-site optimization is a collection of tactics, most of which are simple to implement, geared toward making your website more visible and indexable to search engines. These tactics include things like optimizing your titles and meta descriptions to include some of your target keywords, ensuring your site’s code is clean and minimal, and providing ample, relevant content on every page. I’ve got a huge list of on-site SEO tactics you can check out here.
(spread across a number of pages) which lists something like 1,000 restaurants in a large city with contact details and a web link to each of those restaurant’s home page. Given that the outgoing links are relevant to my content, should I or should I not be using REL=nofollow for each link given the massive quantity of them? How will my ranking for pages containing those links and pages elsewhere on my site be affected if I do or don’t include REL=nofollow for those links? My fear is that if I don’t use REL=nofollow, Google will assume my site is just a generic directory of links (given the large number of them) and will penalize me accordingly.

All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?

Is very telling and an important thing to consider. Taking the model of a university paper on a particular subject as an example, you would expect the paper to cite (link to) other respected papers in the same field in order to demonstrate that it is couched in some authority. As PageRank is based on the citation model used in university work, it makes perfect sense to incorporate a “pages linked to” factor into the equation.
When we talk about ad links, we're not talking about search ads on Google or Bing, or social media ads on Facebook or LinkedIn. We're talking about sites that charge a fee to post a backlink to your site, and which may or may not make it clear that the link is a paid advertisement. Technically, this is a grey- or black-hat area, as it more or less amounts to link farming when it's abused. Google describes such arrangements as "link schemes," and takes a pretty firm stance against them.
[43] Katja Mayer views PageRank as a social network as it connects differing viewpoints and thoughts in a single place. People go to PageRank for information and are flooded with citations of other authors who also have an opinion on the topic. This creates a social aspect where everything can be discussed and collected to provoke thinking. There is a social relationship that exists between PageRank and the people who use it as it is constantly adapting and changing to the shifts in modern society. Viewing the relationship between PageRank and the individual through sociometry allows for an in-depth look at the connection that results.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
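To make the limitation concrete, a robots.txt file is nothing more than a plain-text advisory that anyone can read (the directory name below is hypothetical):

```
# robots.txt — advisory only. Compliant crawlers skip these paths,
# but the files themselves remain publicly fetchable, and this very
# file advertises their location to anyone who requests /robots.txt.
User-agent: *
Disallow: /private-reports/
```

For genuinely sensitive content, use server-side access control (authentication, or at minimum `noindex` headers on pages that must stay public), not robots.txt.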
Before Sharpe ever came into close proximity with the internet marketing field, he was a construction worker. Needing a way to make ends meet, like millions of other people around the world, he turned to a field that could hopefully pay the bills. But try as he might, he was never able to actually get ahead. Until one day, when Sharpe discovered the amount of money being made online by internet marketers, his entire mindset changed.
Yep, please change things to stop keyword stuffing. Change them to stop cloaking. Definitely change them to stop buying links that try to game Google. But telling search engines not to give weight (that I control) to pages that are not what my site is about, or are not really relevant? No way. This is logical stuff here. Maybe too logical. I think deep down you know this, Matt.