Can I just remind Google that not all “great content” is going to “attract links”? This is something I think they forget. I have great content on my site about plumbers in Birmingham and accountants in London: valuable, detailed, non-spammy, hand-crafted copy on these businesses, highly useful to anyone looking for their services. But no-one is ever going to want to link to it; it’s not topical or quirky, it’s very locally focussed, and it has no video of cats playing pianos.
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
This guide is designed to be read cover-to-cover. Each new chapter builds upon the previous one. A core idea we want to reinforce is that marketing should be evaluated holistically: you need to think in terms of growth frameworks and systems, as opposed to one-off campaigns. Reading this guide from start to finish will help you connect the many moving parts of marketing to your big-picture goal: ROI.
Mega-sites like http://news.bbc.co.uk have tens or hundreds of editors writing new content – i.e. new pages – all day long! Each one of those pages has rich, worthwhile content of its own and a link back to its parent or the home page! That’s why the home page Toolbar PR of these sites is 9/10 and the rest of us just get pushed lower and lower by comparison…
It is no secret that getting high-quality backlinks is your website’s route to better rankings in Google. But how do you tell a good link from a bad one? Choosing backlinks carefully is a delicate and important task for everyone who wants to optimize their site. There are a lot of different tools that can help you check whether your backlinks are trustworthy and can bring your website value.
A decent article which encourages discussion and healthy debate. Reading some of the comments, I see it also highlights some of the misunderstandings some people (including some SEOs) have of Google PageRank. Toolbar PageRank is not the same thing as PageRank. The little green bar (Toolbar PageRank) was never a very accurate metric and told you very little about the value of any particular web page. It may have been officially killed off earlier this year, but the truth is it’s been dead for many years. Real PageRank, on the other hand, is at the core of Google’s algorithm and remains very important.
3) Some people don’t believe things have changed. In fact, if things really changed substantially a year ago, you’d think a few of the advanced SEOs out there would have noticed and talked about it. But nada. There are lots of reasons why the change could have happened and not been spotted. Sculpting might really have been a second- or third-order factor, as Matt calls it – not helping things as much as some have assumed. SEOs who spotted it might have stayed quiet. Or it didn’t change – and still hasn’t changed – and sculpting works even better than Google thought, so it wants to put out a message that it doesn’t, in hopes of putting the genie back in the bottle. That’s probably the major conspiracy theory out there.
Hi, Norman! PageRank is an indicator of authority and trust, and inbound links are a large factor in PageRank score. That said, it makes sense that you may not be seeing any significant increases in your PageRank after only four months; a four-month-old website is still a wee lad! PageRank is a score you will see slowly increase over time as your website begins to make its mark on the industry and external websites begin to reference (or otherwise link to) your web pages.
At the time I was strongly advocating PageRank sculpting by putting nofollow on “related product” links. It’s interesting to note that my proposed technique would perhaps have worked for a little while and then lost its effectiveness. Eventually I reached the point where my efforts delivered diminishing returns, which was perhaps unavoidable.
If the assumption here is that webmasters will remove the nofollow attributes in response to this change, then why did it take “more than a year” for someone from Google to present this information to the public? If this logic had anything at all to do with the decision to change the nofollow policy, it seems Google would have announced it immediately in order to “encourage” webmasters to change their linking policies and allow access to their pages with “high-quality information.”
And if you really want to know what are the most important, relevant pages to get links from, forget PageRank. Think search rank. Search for the words you’d like to rank for. See what pages come up tops in Google. Those are the most important and relevant pages you want to seek links from. That’s because Google is explicitly telling you that on the topic you searched for, these are the best.
First, it’s important to know that not all backlinks are created equal. Those published on PR0 sites (“PR” stands for “PageRank”; 0 is the lowest value) carry very little weight in search; those published on PR9 sites (the highest PageRank) carry a great deal of weight (in fact, a single backlink on a PR9 site might be enough to deliver top-three rankings for a keyphrase in some cases). Examples of high-PageRank sites include Wikipedia, the BBC, The New York Times, Mashable, etc.
Internet Marketing Inc. is one of the fastest-growing full-service Internet marketing agencies in the country, with offices in San Diego and Las Vegas. We specialize in providing results-driven, integrated online marketing solutions for medium-sized and enterprise brands across the globe. Companies come to us because our team of well-respected industry experts has the talent and creativity to provide your business with a more sophisticated, data-driven approach to digital marketing strategy. IMI works with some clients through IMI Ventures, and their first product is VitaCup.

Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
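To make that point concrete, here is a minimal audit sketch in Python, assuming the third-party beautifulsoup4 package and a hypothetical local file page.html (both are my assumptions, not anything from the passage above); it flags linked images that are missing the alt text that would otherwise act like anchor text.

```python
# Minimal sketch: find image links whose alt text is missing.
# Assumes `pip install beautifulsoup4` and a local page.html to audit.
from bs4 import BeautifulSoup

with open("page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

for link in soup.find_all("a"):
    for img in link.find_all("img"):
        if not img.get("alt", "").strip():
            print("Image link with no alt text:", link.get("href"), img.get("src"))
```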
Two other practical limitations can be seen in the case of digital marketing. One, digital marketing is useful for specific categories of products, meaning only consumer goods can be propagated through digital channels; industrial goods and pharmaceutical products cannot be marketed through digital channels. Secondly, digital marketing disseminates only information, mostly to prospects who do not have the purchasing authority/power, and hence the translation of digital marketing into real sales volume is doubtful.[citation needed]
Google's founders, in their original paper,[18] reported that the PageRank algorithm for a network consisting of 322 million links (in-edges and out-edges) converges to within a tolerable limit in 52 iterations. The convergence in a network of half the above size took approximately 45 iterations. Through this data, they concluded the algorithm can be scaled very well and that the scaling factor for extremely large networks would be roughly linear in log n, where n is the size of the network.
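For readers who want to see the iteration the paper describes, here is a toy power-iteration sketch in Python (NumPy assumed; the damping factor of 0.85 is the value usually quoted for the original algorithm, and the four-page web is invented for illustration).

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-6, max_iter=100):
    """Toy PageRank power iteration; adj[i][j] = 1 if page i links to page j."""
    n = len(adj)
    M = np.array(adj, dtype=float)
    M[M.sum(axis=1) == 0] = 1.0 / n    # dangling pages spread their score evenly
    M /= M.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    rank = np.full(n, 1.0 / n)
    for i in range(1, max_iter + 1):
        new = (1 - damping) / n + damping * rank @ M
        if np.abs(new - rank).sum() < tol:
            return new, i              # converged after i iterations
        rank = new
    return rank, max_iter

# Invented four-page web: 0 -> 1,2; 1 -> 2; 2 -> 0; 3 -> 2
scores, iters = pagerank([[0, 1, 1, 0],
                          [0, 0, 1, 0],
                          [1, 0, 0, 0],
                          [0, 0, 1, 0]])
print(scores, iters)
```

On a tiny graph like this the loop settles in a few dozen iterations; the paper's point is that the iteration count grows very slowly (roughly with log n) as the network gets larger.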
As for the use of nofollow as a way to keep pages that shouldn’t be indexed out of Google (as with your feed example): that is terrible advice. Your use of it on your feed link does nothing. If anyone links to your feed without nofollow, it’s going to get indexed. Things that shouldn’t be indexed need to be blocked with either robots.txt or meta robots. Nofollow on links to those items isn’t a solution.
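As a concrete illustration of that point, Python's standard-library urllib.robotparser can show whether a URL is actually blocked from crawling; example.com and the /feed path here are placeholders, not anything from the comment above.

```python
# Sketch: nofollow on your own links does not keep a URL out of Google;
# a robots.txt Disallow rule (or a noindex meta robots tag on the page
# itself) is what does the blocking. URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# With a "Disallow: /feed" rule for User-agent: *, this prints False.
print(rp.can_fetch("Googlebot", "https://example.com/feed"))
```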
Well, it seems that what this article says is that the purpose of nofollow is to take away spammers’ motivation to post spam comments for the sake of the link and the associated PageRank flow; the purpose of nofollow was never to provide a means of controlling where a page’s PageRank is directed. That doesn’t seem so shocking to me, folks.
This PageRank theme is getting understood in simplistic ways; people (I’m talking about SEOs) are still fretting about PageRank all the time. I just use common sense: if I were the designer of a search engine, besides the regular structural analysis, I would use artificial intelligence to determine many factors of the analysis. I think this is not just a matter of dividing by 10; it is far more complex. I might be wrong, but I believe the use of the nofollow attribute is no longer a final decision of the website owner; it is more like an option given to the bot, which can accept or reject the link as a valid vote. Perhaps regular links are not the final decision of the webmaster either. I think Google is reading websites the way a human would; the pages are not analyzed the way a parser would analyze them, but more like a neural network would, which is a bit more complex. I believe this change makes little difference. People should stop worrying about PageRank and start building good content; the algorithm is far too complex to say what the next step is to reach Google’s top ten. However, nothing is impossible.

Most schools/universities have just an [email protected]… or [email protected]… email address, which goes to reception. I don’t really know who to address this email to, as I believe a lot of the time the admin person receiving it ignores and deletes it without passing it on to someone relevant, e.g. the school’s or university’s communications manager. Hope you can help me on this one! Thanks so much in advance!
An entrepreneur or freelancer has two main strategies to tap into when marketing online. Search Engine Optimization (SEO), which attempts to rank your website on search engines “organically”, and Search Engine Marketing (SEM), which ranks your website in search results in exchange for money. Both strategies can be used to build a business successfully—but which one is right for you?
2. Was there really a need to make this change? I know all sites should be equally capable of being listed in search engines without esoteric methods playing a part. But does this really happen anyway (in search engines or life in general)? If you hire the best accountant you will probably pay less tax than the other guy. Is that really fair? Also, if nobody noticed the change for a year (I did have an inkling, but was totally and completely in denial) then does that mean the change didn’t have to be made in the first place? As said, we now have a situation where people will probably make bigger and more damaging changes to their site and structure, rather than add a little ‘nofollow’ to a few links.

In my experience this means (the key words are “not the most effective way”) that a page not scored by Google (e.g. my private link – password-protected, disallowed via robots.txt and/or noindex meta robots) is not factored into anything, whether or not the links pointing to it use the rel=”nofollow” attribute… because Google can’t factor in something it isn’t allowed to see.
Brian, you are such an inspiration. I wonder how you find all these hacks and then publish them for all of us. I have been reading your stuff for quite some time now, but I have a problem. Every time I read something you post I feel overwhelmed, but I haven’t really been able to generate any fruitful results on any of my sites. I just don’t know where to start. Imagine I don’t even have an email list.
Ah – well the Reasonable Surfer is a different patent (and therefore a different algorithm) to PageRank. I would imagine that initially only the first link counted – simply because there either IS or IS NOT a relationship between the two nodes, making it a binary choice. However, at Majestic we certainly think about two links between page A and page B with separate anchor texts… in a binary choice, either the data on the second link has to be dropped or the backlink counts start to get bloated. I wrote about this on Moz way back in 2011!
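A small sketch of the trade-off being described, with invented pages and anchor texts: a binary edge set keeps one relationship per pair of nodes and silently drops the second anchor, while a multi-edge map keeps every anchor at the cost of inflating the raw backlink count.

```python
from collections import defaultdict

links = [("A", "B", "cheap flights"),   # first link from page A to page B
         ("A", "B", "click here")]      # second link, different anchor text

binary_edges = set()                    # binary choice: a relationship exists or not
anchored_edges = defaultdict(list)      # multi-edge: keep every anchor

for src, dst, anchor in links:
    binary_edges.add((src, dst))        # second add is a no-op; its anchor is lost
    anchored_edges[(src, dst)].append(anchor)

print(len(binary_edges))                # 1 relationship
print(anchored_edges[("A", "B")])       # ['cheap flights', 'click here']
```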
With this change, I can still get the $4 if I simply don’t allow comments. Or I show comments but use an iframe, so that the comments actually reside on a different page. In either case, I’m encouraged to reduce the number of links rather than let them be on the page at all, nofollowed or not. If I’m worried my page won’t seem “natural” enough to Google without them, maybe I allow 5 comments through and lock them down after that.
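To spell out that incentive with a toy calculation (the dollar figures are stand-ins for PageRank, not anything Google publishes): under the accounting the article describes, nofollowed links still consume their share, so the only way to concentrate the flow is to have fewer links on the page at all.

```python
def passed_per_followed_link(page_value, total_links, nofollowed):
    # Post-change accounting as the article describes it: the page's value
    # is split across ALL links, and the nofollowed shares simply evaporate.
    followed = total_links - nofollowed
    return page_value / total_links if followed else 0.0

print(passed_per_followed_link(10.0, 10, 5))  # 1.0 per followed link; 5.0 lost
print(passed_per_followed_link(10.0, 5, 0))   # 2.0 per link if comments never render
```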
As I was telling Norman above, these days what we’ve come to call content marketing is really a big part of “link building.” You can’t buy links, and “you link to me, I’ll link to you” requests often fall on deaf ears. It’s really all about creating high-quality content (videos, images, written blog posts) that appeals to the needs/wants of your target market, and then naturally earning inbound links from sources that truly find what you have to offer worth referencing.
Start Value (in this case) is the number of actual links to each “node”. Most people actually set this to 1 to start, but there are two great reasons for using link counts. First, it is a better approximation to start with than giving everything the same value, so the algorithm stabilizes in fewer iterations; second, it is useful for checking my spreadsheet in a moment… so node A has one link in (from page C).
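Here is a self-contained sketch of that point (NumPy assumed; the three-page web is built so that node A's one in-link comes from page C, matching the walkthrough, while B's and C's in-link counts are invented): both starting vectors reach the same fixed point, and on many graphs the link-count start lands closer to it and so saves an iteration or two.

```python
import numpy as np

def iterations_to_converge(M, start, damping=0.85, tol=1e-8):
    """Count power-iteration steps until the PageRank vector stops moving."""
    rank = start / start.sum()                # normalise so scores sum to 1
    for i in range(1, 1000):
        new = (1 - damping) / len(rank) + damping * rank @ M
        if np.abs(new - rank).sum() < tol:
            return i
        rank = new

# Three-page toy web: A -> B, C; B -> C; C -> A, B (so A's one in-link is from C)
M = np.array([[0.0, 0.5, 0.5],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])
print(iterations_to_converge(M, np.ones(3)))               # uniform start
print(iterations_to_converge(M, np.array([1., 2., 2.])))   # in-link-count start
```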
SEM, on the other hand, costs money but can deliver very rapid results. Your website must be optimized to make sales or at least drive a customer to get in touch (GIT – in marketing terms) so you can make a sale. You should approach SEM with care and make sure you completely understand how much money you have exposed at any one time. Start slow and evaluate your results.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
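A minimal sketch of that automated fallback in Python (the function name and the 160-character budget are my assumptions, not a Google specification): derive a description from the page's own copy and trim it at a word boundary.

```python
import html
import re

def auto_description(page_text, max_len=160):
    """Hypothetical fallback: build a description meta tag from page copy."""
    text = re.sub(r"\s+", " ", page_text).strip()      # collapse whitespace
    if len(text) > max_len:
        text = text[:max_len].rsplit(" ", 1)[0] + "…"  # cut at a word boundary
    return f'<meta name="description" content="{html.escape(text, quote=True)}">'

sample = ("Example page copy about accountants in London, "
          "their services, fees, and how to choose one. ") * 5
print(auto_description(sample))
```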

There’s a lot of frustration being vented in this comments section. It is one thing to be opaque – which Google seems to be masterly at – but quite another to misdirect, which is what No Follow has turned out to be. All of us who produce content always put our readers first, but we also have to be sensible as far as on page SEO is concerned. All Google are doing with this kind of thing is to progressively direct webmasters towards optimizing for other, more reliable and transparent, ways of generating traffic (and no, that doesn’t necessarily mean Adwords, although that may be part of the intent).
A great number of people who deal with SEO confuse backlink building with backlink earning. These notions are different. What is backlink building? It means creating the conditions for backlinks to point to your site. To earn a backlink means to deserve it. Is that really possible? Yes! If you want your site to be worth earning backlinks, you must do everything possible and impossible to please your guests and users.
9. Troubleshooting and adjustment. In your first few years as a search optimizer, you’ll almost certainly run into the same problems and challenges everyone else does; your rankings will plateau, you’ll find duplicate content on your site, and you’ll probably see significant ranking volatility. You’ll need to know how to diagnose and address these problems if you don’t want them to bring down the effectiveness of your campaign.

One thing that has worked well for me lately (and may help with the infographic promotion) is surveys. Google Forms lets you create a survey for free. Think of questions interesting to your niche and start promoting the survey (ask well-known influencers in your niche to share it with their social followers to help with responses; offer them a link as a contributor once the survey is complete). Once you have a few hundred responses, you can write a commentary about your findings (Google also puts the data into graphs). If you have enough responses and the information is interesting, get in touch with the same bloggers who helped push it out there to see if they would be happy to share the results. The beauty of this method is that if the results are interesting enough, you might end up getting a link back from a huge news site.

He is the co-founder of Neil Patel Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations.
Matt, I’ve been a firm believer in the idea that webmasters shouldn’t really bother too much about the calculations that Google does when spotting external links on a site. Leave that to Google. You write the content and, if you find relevant resources, link to them. Why worry over PR? If you’re sure the linked site is “kinda spammy”, then nofollow it. That’s it.
But I'm not talking about just any kind of link building. I'm talking about organic link building: getting out there and creating insatiable "anchor content" on your website, then linking to that content with equally great content created on authority sites like Medium, Quora, LinkedIn and other publishing platforms. It's not easy by any measure. Google is far more wary of newcomers these days than it once was.
PageRank was once available to verified site maintainers through the Google Webmaster Tools interface. However, on October 15, 2009, a Google employee confirmed that the company had removed PageRank from its Webmaster Tools section, saying that "We've been telling people for a long time that they shouldn't focus on PageRank so much. Many site owners seem to think it's the most important metric for them to track, which is simply not true."[67] In addition, the PageRank indicator is not available in Google's own Chrome browser.

We must be careful with our reciprocal links. There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy the sites you link to from your own website are. This will mean that you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier right now about which sites we exchange links with. By choosing only relevant sites to link with – sites that don't have tons of outbound links on a page and don't practice black-hat SEO techniques – we will have a better chance that our reciprocal links won't be discounted.
Re: Cameron’s comment. Google transparent? Maybe. Great products for users – yes… but they operate from lofty towers. You can’t get hold of them. You can’t contact them. They are the only company in the world with zero customer support for their millions of users. Who really knows what they are doing from one month to the next in regard to ranking sites… etc.
PageRank was developed by Google founders Larry Page and Sergey Brin at Stanford. In fact, the name PageRank is likely a play on Larry Page's name. At the time Page and Brin met, early search engines typically ranked pages with the highest keyword density, which meant people could game the system by repeating the same phrase over and over to climb the search results. Sometimes web designers would even put hidden text on pages to repeat phrases.