5. Link building. In some respects, guest posting – one popular tactic to build links, among many other benefits – is just content marketing applied to external publishers. The goal is to create content on external websites, building your personal brand and company brand at the same time, and creating opportunities to link back to your site. There are only a handful of strategies to build quality links, which you should learn and understand as well.

Baseline ranking assessment. You need to understand where you are now in order to accurately assess your future rankings. Keep a simple Excel sheet to start the process. Check weekly to begin. As you get more comfortable, check every 30 to 45 days. You should see improvements in website traffic, a key indicator of progress for your keywords. Some optimizers will say that rankings are dead. Yes, traffic and conversions are more important, but we use rankings as an indicator.


All in all, PageRank sculpting (or whatever we should call it) didn’t really rule my world. But, I did think that it was a totally legitimate method to use. Now that we know the ‘weight’ leaks, this will put a totally new (and more damaging) spin on things. Could we not have just left the ‘weight’ with the parent page? This is what I thought would happen most of the time anyway.

Thanks for the article (and the lead-off links, as they were good info too), but I did not quite get from the article whether there was a penalisation by Google for sculpting, or whether it was just bad practice. And also to echo what someone else asked: is it WORTH actually undoing this type of work on websites SEOs have worked on, or should we simply change the way we work with new sites?
Matt Cutts, it’s Shawn Hill from Longview, Texas, and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s Webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blog spam should be the least of Google’s worries. What’s your take?

Hi, Norman! PageRank is an indicator of authority and trust, and inbound links are a large factor in PageRank score. That said, it makes sense that you may not be seeing any significant increases in your PageRank after only four months; a four-month-old website is still a wee lad! PageRank is a score you will see slowly increase over time as your website begins to make its mark on the industry and external websites begin to reference (or otherwise link to) your Web pages.
Well, something similar happened with PageRank, a brilliant child of Google founders Larry Page (who gave his name to the child and played off the concept of a web-page) and Sergey Brin. It helped Google to become the search giant that dictates the rules for everybody else, and at the same time it created an array of complicated situations that at some point got out of hand.
An aesthetically pleasing and informational website is an excellent anchor that can easily connect to other platforms like social networking pages and app downloads. It's also relatively simple to set up a blog within the website that uses well-written content with “keywords” an Internet user is likely to use when searching for a topic. For example, a company that wants to market its new sugar-free energy drink could create a blog that publishes one article per week that uses terms like “energy drink,” “sugar-free,” and “low-calorie” to attract users to the product website.

When you comment on a blog post, you are usually allowed to include a link back to your website. This is often abused by spammers and can become a negative link building tool. But if you post genuine comments on high-quality blog posts, there can be some value in sharing links, as it can drive traffic to your site and increase the visibility of your brand.
Can I just remind Google that not all “great content” is going to “attract links”; this is something I think they forget. I have great content on my site about plumbers in Birmingham and accountants in London: very valuable, detailed, non-spammy, hand-crafted copy on these businesses, highly valuable to anyone looking for their services. But no-one is ever going to want to link to it; it’s not topical or quirky, is very locally-focussed, and has no video of cats playing pianos.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
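To make that concrete, here is a minimal illustrative robots.txt; the blocked paths are invented for the example:

    # Served from https://www.example.com/robots.txt
    User-agent: *
    Disallow: /search/   # keep internal search results pages out of crawls
    Disallow: /tmp/

    # blog.example.com would need its own separate robots.txt file.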
Make sure your backlinks appear to be natural. Don’t ask webmasters to link back to your pages with a specific anchor text since this can haphazardly result in a pattern that may get noticed by search engines and cause you to get a linking penalty, a la Penguin. Also, don’t do anything shady or unnatural to create backlinks, like asking a site to put a link in the footer of every page on their site.
PageRank has been used to rank spaces or streets to predict how many people (pedestrians or vehicles) come to the individual spaces or streets.[51][52] In lexical semantics it has been used to perform Word Sense Disambiguation,[53] Semantic similarity,[54] and also to automatically rank WordNet synsets according to how strongly they possess a given semantic property, such as positivity or negativity.[55]
Our digital agency offers both traditional targeted online display advertising and behavioral retargeting. Through an intense discovery process, our team will determine the optimal marketing mix for your online media plan. We will leverage ad network partnerships to plan the ideal media buys and negotiate the best possible pricing.
How many times do we need to repeat the calculation for big networks? That’s a difficult question; for a network as large as the World Wide Web it can be many millions of iterations! The “damping factor” is quite subtle. If it’s too high then it takes ages for the numbers to settle, if it’s too low then you get repeated over-shoot, both above and below the average - the numbers just swing about the average like a pendulum and never settle down.
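As a rough sketch of what the repeated calculation looks like in practice, here is a toy PageRank loop in Python; the three-page graph, the damping factor of 0.85, and the stopping tolerance are illustrative assumptions, not values from any of the articles quoted here:

    # Toy iterative PageRank. Graph, damping factor, and tolerance
    # are all illustrative assumptions.
    links = {
        "A": ["B", "C"],  # page A links out to pages B and C
        "B": ["C"],
        "C": ["A"],
    }
    d = 0.85                             # damping factor
    pr = {page: 1.0 for page in links}   # initial guess for every page

    for iteration in range(1000):
        new_pr = {}
        for page in links:
            # Each page that links here shares its PageRank equally
            # across all of its outbound links.
            incoming = sum(pr[src] / len(outs)
                           for src, outs in links.items() if page in outs)
            new_pr[page] = (1 - d) + d * incoming
        # Stop once the numbers "stop changing much".
        done = all(abs(new_pr[p] - pr[p]) < 1e-6 for p in pr)
        pr = new_pr
        if done:
            break

    print(iteration + 1, "iterations:", pr)

On a three-page graph like this the values settle within a few dozen iterations; as the paragraph above notes, the web-scale version of the same loop needs vastly more.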
Simple question – let’s say I have a blog/site with a lot of outgoing links (avg 10 links per page). All the outgoing links (in the editorial content and the user-generated ones) are nofollowed, while all the internal links are “open”. I might have manually “opened up” some links in the editorial content because I’m so sure of their authority (e.g. Google FAQ pages).
Establishment of customer exclusivity: A list of customers and their details should be kept on a database for follow-up, and selected customers can be sent selected offers and promotions of deals related to the customer's previous buyer behaviour. This is effective in digital marketing as it allows organisations to build up loyalty over email.[22]

Sharpe says that you shouldn't dive into internet marketing until you decide on a niche and figure out what you're passionate about. Do you want to join the make-money-online (MMO) niche? Or do you want to engage in another niche? For example, you could sell products or online courses about blogging or search engine optimization or anything else for that matter. Keep in mind that whatever you're selling and whatever niche you're in, you need to embed yourself there deeply.
Great post. I’m posting a link back to this article from our blog along with some comments. I do have a question. In your article, you post “The only place I deliberately add a nofollow is on the link to my feed, because it’s not super-helpful to have RSS/Atom feeds in web search results.” Yet when I look at this article, I noticed that the comment links are “external, nofollow”. Is there a reason for that?
Cause if I do that, if I write good content whilst my 100+ competitors link build, article market, forum comment, social bookmark, release viral videos, and buy links, I’ll end up at the very bottom of the pile, great content or not. Really, I am just as well taking my chances pulling off every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don’t, what do I have to lose?
This is the argument that quickly emerged about blog comments recently. Say I have an article on a blog with 5 links in the editorial copy — some of those links leading back to other content within the blog that I hope to do well. Then I get 35 comments on the article, with each comment having a link back to the commenters’ sites. That’s 40 links in all. Let’s say this particular page has $20 in PageRank to spend. Each link gets 50 cents.
What that means to us is that we can just go ahead and calculate a page’s PR without knowing the final value of the PR of the other pages. That seems strange but, basically, each time we run the calculation we’re getting a closer estimate of the final value. So all we need to do is remember each value we calculate and repeat the calculations lots of times until the numbers stop changing much.
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you. 

Is there any way “nofollow” links can positively affect your search rankings? Most bloggers have to assign this attribute as they can’t trust every site they link to, nor do they want to decrease their own PR. Is there any potential for search engines to reward a page that accumulates “nofollow” links, even if they accrue 1/10 the weight they previously did? This would reward those who take the time to actively blog.
Quality content is more likely to get shared. By staying away from creating "thin" content and focusing more on content that cites sources, is lengthy, and reaches unique insights, you'll be able to gain Google's trust over time. Remember, this happens as a function of time. Google knows you can't just go out there and create massive amounts of content in a few days. If you try to spin content or duplicate it in any fashion, you'll suffer a Google penalty and your visibility will be stifled.
This isn't about off-the-shelf solutions. You need to really convey something illustrious and beautiful, then fill it with incredible MVP content. Over time, this will become a thriving hotbed of activity for you, where people will come by and check in repeatedly to see what you're talking about and what value you're delivering. Keep in mind that this won't happen quickly. It will take years. Yes, I said years.
I compare the latest Google search results to this: McDonald's is the most popular and is #1 in hamburgers… they don't taste that great but people still go there. BUT I bet you know a good burger joint down the road that makes awesome burgers, 10X better than McDonald's, but “we” cannot find that place because he does not have the resources or budget to market his burgers effectively.
It is clear that something new should emerge to fill that nofollow emptiness. Here and there it is believed that some search engines may use so-called implied links to rank a page. Implied links are, for example, references to your brand. They usually come with a tone: positive, neutral, or negative. The tone defines the reputation of your site, and this reputation serves as a ranking signal to search engines.
A content specialist needs to be a Jack or Jill of all trades, utilizing excellent written and verbal communication skills, above-average computer literacy, and a natural interest in trends. This job is ultimately about translating the key aspects of the product into content the target demographic finds appealing. This is part art, part critical thinking, and 100% attention to detail.
Goals and Objectives. Clearly define your objectives in advance so you can truly measure your ROI from any programs you implement. Start simple, but don’t skip this step. Example: you may decide to increase website traffic from a current baseline of 100 visitors a day to 200 visitors a day over the next 30 days. Or you may want to improve your current conversion rate from one percent to two percent in a specified period. You may begin with top-level, aggregate numbers, but you must drill down into the specific pages that can improve products, services, and business sales.

“With 150 million pages, the Web had 1.7 billion edges (links).” Kevin Heisler, that ratio holds true pretty well as the web gets bigger. A good rule of thumb is that the number of links is about 10x the number of pages. I agree that it’s pretty tragic about Rajeev Motwani, who was a co-author of many of those early papers. I got to talk to Rajeev a little bit at Google, and he was a truly decent and generous man. What has heartened me is to see all the people that he helped, and to see those people pay their respects online. No worries on the Consumer WebWatch – I’m a big fan of Consumer WebWatch, and somehow I just missed their blog. I just want to reiterate that even though this feels like a huge change to a certain segment of SEOs, in practical terms this change really doesn’t affect rankings very much at all.

4. The facets of content marketing. Though content marketing can be treated as a distinct strategy, I see it as a necessary element of the SEO process. Only by developing high-quality content over time will you be able to optimize for your target keywords, build your site’s authority, and curate a loyal recurring audience. You should know the basics, at the very least, before proceeding with other components of SEO.
Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and too many links to sites on the same IP address is referred to as backlink bombing.
SEO is an acronym for "search engine optimization" or "search engine optimizer." Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your site and reputation. Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site. Many SEOs and other agencies and consultants provide useful services for website owners, including:
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
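For illustration only, a description meta tag lives in the page's <head>; the site and wording below are invented:

    <head>
      <title>Acme Cards: Vintage and Modern Baseball Cards</title>
      <meta name="description" content="Acme Cards offers a large
        selection of vintage and modern baseball cards, plus weekly
        news on card pricing and collecting.">
    </head>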
The development of digital marketing is inseparable from technology development. One of the key milestones came in 1971, when Ray Tomlinson sent the very first email; his technology set the platform to allow people to send and receive files through different machines.[8] However, the period more widely recognised as the start of digital marketing is 1990, when the Archie search engine was created as an index for FTP sites. In the 1980s, the storage capacity of computers was already big enough to store huge volumes of customer information, and companies started choosing online techniques, such as database marketing, rather than limited list brokers.[9] These databases allowed companies to track customers' information more effectively, thus transforming the relationship between buyer and seller. However, the manual process was not so efficient.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
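In markup, that warning link would simply carry the nofollow attribute on the anchor; the URL here is a placeholder:

    <p>This spam came from
      <a href="https://spammy-site.example.com/" rel="nofollow">a site
      I do not vouch for</a>.</p>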

Journalists and writers are always on the lookout for experts to contribute quotes for their articles. Some (but not all) will include backlinks to their sources’ websites. Getting quotes in media outlets is a great way to not only get backlinks, but also build credibility within your industry. Even in instances where you don't get backlinks, this profile page for PMM's CEO Josh Rubin is a good example of how you can showcase your media appearances - something which both Google and your clients value when it comes to evaluating your authority.
This year, for the first time, Google stated that user experience would be a core part of gaining rankings for mobile websites. A poorer user experience would send your site hurtling down the rankings. This appeared to come as a shock to many in the SEO community and despite assurances that content was still king – many seemed to feel that this ...

What I like the most about Monitor Backlinks is that we can keep track of every single link, and that we can see the status of those links when they change or become obsolete. The detail and the whole overview of Monitor Backlinks is exactly what I need, and no more; there are a lot of SEO programmes on the market today which promise to do what's necessary, but don't.
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
It is no secret that getting high-quality backlinks is your website's way to better rankings in Google. But how do you tell a good link from a bad one? Carefully choosing backlinks is a painstaking and important task for everyone who wants to optimize their site. There are a lot of different tools which can help you check whether your backlinks are trustworthy and bring your website value.
Google's strategy works well. By focusing on the links going to and from a Web page, the search engine can organize results in a useful way. While there are a few tricks webmasters can use to improve Google standings, the best way to get a top spot is to consistently provide top quality content, which gives other people the incentive to link back to their pages.
Check your robots.txt file. Make sure you learn how to hide content you don’t want indexed from search engines, and that search engines can find the content you do want indexed, too. (You will want to hide things such as duplicate content, which can be penalized by search engines but is still sometimes necessary on your site.) You’ll find a link to how to modify the robots.txt at the end of this article.
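One quick way to sanity-check both halves of that advice is Python's built-in urllib.robotparser; the URLs below are placeholders:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # fetch and parse the live file

    # Pages you want indexed should be fetchable; hidden ones should not.
    print(rp.can_fetch("*", "https://www.example.com/products/"))    # expect True
    print(rp.can_fetch("*", "https://www.example.com/search/?q=x"))  # expect False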
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.

A Web crawler may use PageRank as one of a number of importance metrics it uses to determine which URL to visit during a crawl of the web. One of the early working papers[56] that were used in the creation of Google is Efficient crawling through URL ordering,[57] which discusses the use of a number of different importance metrics to determine how deeply, and how much of a site Google will crawl. PageRank is presented as one of a number of these importance metrics, though there are others listed such as the number of inbound and outbound links for a URL, and the distance from the root directory on a site to the URL.
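As a bare sketch of that idea (not the paper's actual algorithm), a crawler can keep a priority queue of frontier URLs ordered by whatever importance metric it has chosen; the function names here are assumptions for the example:

    import heapq

    def crawl(seed_urls, importance, fetch_links, budget):
        # Visit up to `budget` URLs, always taking the most important
        # frontier URL next. `importance` scores a URL (e.g. by an
        # estimate of its PageRank) and `fetch_links` returns the URLs
        # found on a page; both are supplied by the caller.
        frontier = [(-importance(u), u) for u in seed_urls]
        heapq.heapify(frontier)  # min-heap, so scores are negated
        seen = set(seed_urls)
        visited = []
        while frontier and len(visited) < budget:
            _, url = heapq.heappop(frontier)
            visited.append(url)
            for link in fetch_links(url):
                if link not in seen:
                    seen.add(link)
                    heapq.heappush(frontier, (-importance(link), link))
        return visited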


For example, if a webmaster has a website about how to rescue orphaned kittens and receives a backlink from another website about kittens, then that would be more relevant in a search engine's assessment than, say, a link from a site about car racing. The more relevant the linking site is to your website, the better the quality of the backlink.
The probability that the random surfer visits a page is its PageRank. And, the d damping factor is the probability at each page the “random surfer” will get bored and request another random page. One important variation is to only add the damping factor d to a single page, or a group of pages. This allows for personalization and can make it nearly impossible to deliberately mislead the system in order to get a higher ranking. We have several other extensions to PageRank…
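For reference, the formula this passage accompanies in the original PageRank paper, where T_1 … T_n are the pages linking to page A and C(T) is the number of links going out of page T:

    PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)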

We have other ways to consider relevance. Topical Trust Flow is one, and page titles and anchor texts are others. If you put a search term into our system (instead of a URL), you actually get back a search engine! We don’t profess to be a Google (yet), but we can show our customers WHY one page is more relevant in our algorithm than another page. This could prove useful for SEOs. We actually launched that in 2013, but the world maybe never noticed 🙂
Probably the most creative thing I’ve ever done was write a review of a restaurant (The Heart Attack Grill) that was hilarious, and email it to the owner. He loved it so much he posted it on FB and even put it on his homepage for a while. I got thousands of visitors from this stupid article: https://www.insuranceblogbychris.com/buy-life-insurance-before-eating-at-heart-attack-grill/
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27]
Matt, this is an excellent summary. I finally got around to reading “The Search” by John Battelle and it was very enlightening to understand much of the academia behind what led to the creation of Backrub… er, Google. Looking at how many times the project was almost shut down due to bandwidth consumption (> 50% of what the university could offer at times), as well as webmasters being concerned that their pages would be stolen and recreated, it’s so interesting to see that issues we see today are some of the same ones that Larry and Sergey were dealing with back then. As always, thanks for the great read Matt!

Totally agree — more does not always equal better. Google takes a sort of ‘Birds of a Feather’ approach when analyzing inbound links, so it’s really all about associating yourself (via inbound links) with websites Google deems high quality and trustworthy so that Google deems YOUR web page high quality and trustworthy. As you mentioned, trying to cut corners, buy links, do one-for-one trades, or otherwise game/manipulate the system never works. The algorithm is too smart.
As Google becomes more and more sophisticated, one of the major cores of their algorithm, the one dealing with links (called Penguin) aims to value natural, quality links and devalue those unnatural or spammy ones. As a search engine, if they are to stay viable, they have to make sure their results are as honest and high-quality as possible, and that webmasters can't manipulate those results to their own benefit.
I think Matt Grenville’s comment is a very valid one. If your site, for whatever reason, cannot attract links naturally, and all of your competitors are outranking you by employing tactics that might breach Google’s TOS, what other options do you have? As well as this, people will now only link to a few trusted sites (as your post clarifies this is part of Google’s algorithm), which puts a limit on linking out to the smaller guys.

The Open Directory Project (ODP) is a Web directory maintained by a large staff of volunteers. Each volunteer oversees a category, and together volunteers list and categorize Web sites into a huge, comprehensive directory. Because a real person evaluates and categorizes each page within the directory, search engines like Google use the ODP as a database for search results. Getting a site listed on the ODP often means it will show up on Google.


Despite this, many people seem to get it wrong! In particular, “Chris Ridings of www.searchenginesystems.net” has written a paper entitled “PageRank Explained: Everything you’ve always wanted to know about PageRank”, pointed to by many people, that contains a fundamental mistake early on in the explanation! Unfortunately this means some of the recommendations in the paper are not quite accurate.