The majority of web traffic is driven by the major commercial search engines, Google, Bing, and Yahoo!. Although social media and other types of traffic can generate visits to your website, search engines are the primary method of navigation for most Internet users. This is true whether your site provides content, services, products, information, or just about anything else.
A search engine considers the content of the sites to determine the QUALITY of a link. When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
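If you do go the automated route, the generation can be as simple as summarizing or truncating each page's own copy. Below is a minimal sketch in Python; the helper name and the roughly 155-character snippet length are my assumptions, not a documented Google limit:

```python
import re

def generate_meta_description(body_text: str, max_len: int = 155) -> str:
    """Build a description meta tag value from a page's own body text.

    Naive approach: collapse whitespace and truncate near a typical snippet
    length (~155 characters is an assumed limit, not an official one),
    cutting at a word boundary.
    """
    text = re.sub(r"\s+", " ", body_text).strip()
    if len(text) <= max_len:
        return text
    return text[:max_len].rsplit(" ", 1)[0].rstrip(" ,.;:") + "..."

# Usage: emit the tag into each page's template.
desc = generate_meta_description(
    "Our blue widgets are hand-made in Norway.  Each widget is tested before shipping ..."
)
print(f'<meta name="description" content="{desc}">')
```

In practice you would pull the body text from your CMS or page database, but the principle is the same: every page gets a description derived from its own content rather than a single boilerplate string.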
There’s obviously a huge number of reasons why a website might link to another and not all of them fit into the categories above. A good rule of thumb on whether a link is valuable is to consider the quality of referral traffic (visitors that might click on the link to visit your website). If the site won’t send any visitors, or the audience is completely unrelated and irrelevant, then it might not really be a link that’s worth pursuing.

Brian, you are such an inspiration. I wonder how you come up with all these hacks and then publish them for all of us. I have been reading your stuff for quite some time now, but I have a problem. Every time I read something you post I feel overwhelmed, but I haven't really been able to generate any fruitful results on any of my sites. I just don't know where to start. Imagine, I don't even have an email list.


Search engines use complex mathematical algorithms to guess which websites a user seeks. In this diagram, if each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not. Note: Percentages are rounded.

So, now we're getting to backlinks that have relatively little, or even negative value. The value of web directories has diminished dramatically in recent years. This shouldn't come as a surprise. After all, when was the last time that you used a web directory to find anything, rather than just doing a Google search? Google recognizes that directories don't have any real world worth, and so they don't accord much value to backlinks on them. But there is an exception to this rule. Submitting your website to local, industry-specific and niche directories can net you worthwhile backlinks. But if you can't imagine a circumstance where someone would use a certain directory, then it's probably not worth your time.
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you. 

Keyword analysis. From nomination, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars and misspellings. (If search users commonly misspell a keyword, you should identify and use it.) Please note that Google will try to correct the term when searching, so use this with care.
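One way to make that prioritization concrete is to score each candidate by demand relative to competition. A rough sketch follows; the keywords, search volumes, and competition counts are invented for illustration, and real numbers would come from your keyword research tools:

```python
# Hypothetical data: (keyword, monthly searches, competing pages)
candidates = [
    ("skydiving lessons", 9900, 1_200_000),
    ("sky diving lessons", 1300, 310_000),   # common variant/misspelling
    ("skydiving lesson", 880, 450_000),
]

def priority(volume: int, competition: int) -> float:
    # Simple demand-to-competition ratio; higher is better.
    return volume / max(competition, 1)

for kw, vol, comp in sorted(candidates, key=lambda c: priority(c[1], c[2]), reverse=True):
    print(f"{kw:25s} volume={vol:6d} competition={comp:9d} score={priority(vol, comp):.6f}")
```

The exact scoring formula matters less than doing the exercise consistently, so that plurals, singulars, and misspellings all get compared on the same basis.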


The Open Directory Project (ODP) is a Web directory maintained by a large staff of volunteers. Each volunteer oversees a category, and together volunteers list and categorize Web sites into a huge, comprehensive directory. Because a real person evaluates and categorizes each page within the directory, search engines like Google use the ODP as a database for search results. Getting a site listed on the ODP often means it will show up on Google.


Well, it seems that what this article says is that the purpose of the nofollow link is to take the motivation away from spammers to post spam comments for the sake of the link and the associated PageRank flow; that the purpose of nofollow was never to provide a means to control where a page's PageRank flow is directed. It doesn't seem that shocking to me, folks.
In this version, the PageRank of page A is given by PR(A) = (1 − d)/N + d · (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where N is the total number of all pages on the web, T1…Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is the damping factor. The second version of the algorithm, indeed, does not differ fundamentally from the first one. In terms of the Random Surfer Model, the second version's PageRank of a page is the actual probability of a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.

Muratos – I’ve never nofollowed Amazon affiliate links on the theory that search engines probably recognize them for what they are anyway. I have a blog, though, that gets organic traffic from those Amazon products simply because people are looking for “Copenhagen ring DVD” and I hard-code the product names, musicians’ names, etc. on the page rather than use Amazon’s sexier links in iframes, etc.


On-page SEO is the work you do on your own website to get a high rank in search engines. Your goal is obviously that your website will show on the first page and perhaps even among the first three search results. On-page SEO does not carry as much weight as off-page SEO in the rankings, but if you don’t get the basics right… it’s unlikely that your off-page SEO will deliver results, either.
Suppose instead that page B had a link to pages C and A, page C had a link to page A, and page D had links to all three pages. Thus, upon the first iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Page C would transfer all of its existing value, 0.25, to the only page it links to, A. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A. At the completion of this iteration, page A will have a PageRank of approximately 0.458.
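That arithmetic is easy to verify with a few lines of code; the sketch below simply replays the single iteration described above, using the example's starting value of 0.25 per page:

```python
# Pages start with PageRank 0.25 each; links: B -> {A, C}, C -> {A}, D -> {A, B, C}
pagerank = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}
links = {"B": ["A", "C"], "C": ["A"], "D": ["A", "B", "C"]}

# One iteration of the simplified (undamped) algorithm: each linking page passes
# its current value divided by its number of outbound links.
new_rank_A = sum(pagerank[p] / len(targets)
                 for p, targets in links.items() if "A" in targets)
print(round(new_rank_A, 3))  # 0.125 + 0.25 + 0.083... ≈ 0.458
```

The printed result is approximately 0.458, matching the figure worked out above.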
Our Website Design & Development team has the skill and creativity to take your vision and translate that into an amazing interactive experience. Our designers and usability experts use best practices for combining amazing designs with an effective user experience. Our web developers and SEO developers work together to create websites that have both aesthetic appeal and SEO friendly code structure. Our proprietary methods for SEO website development will ensure that you hit the ground running upon launch!
In my experience this means (the key words are "not the most effective way") that a page not scored by Google (e.g. my private link: password-protected, disallowed via robots.txt, and/or noindex meta robots) is not factored into anything, whether or not rel="nofollow" is used in links to it, because Google can't factor in something it isn't allowed to see.
While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.

Understand that whatever you're going to do, you'll need traffic. If you don't have any money at the outset, your hands will be tied no matter what anyone tells you. The truth is that you need to drive traffic to your offers if you want them to convert. The pages those offers live on are what we call landing pages or squeeze pages. This is where you come into contact with customers, either for the first time or after they've gotten to know you a little bit better.
The PageRank algorithm outputs a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called “iterations”, through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.
The best strategy to get backlinks is to create great content and let other people promote your content. However, to get started, you can create your own links to content on your social media platform, ask your friends to share your content on their websites and social media, and if you can find questions in forums that your content answers, you can always post it there.
We have a saying that "good data" is better than "big data." Big data is a term being thrown around a lot these days because brands and agencies alike now have the technology to collect more data and intelligence than ever before. But what does that mean for growing a business? Data is worthless without data scientists analyzing it and creating actionable insights. We help our client partners sift through the data to glean what matters most and what will aid them in attaining their goals.
If you are serious about improving web traffic to your website, we recommend you read Google Webmasters and Webmaster Guidelines. These contain the best practices to help Google (and other search engines) find, crawl, and index your website. After you have read them, you MUST try our Search Engine Optimization Tools to help you with Keyword Research, Link Building, Technical Optimization, Usability, Social Media Strategy and more.

There are plenty of guides to marketing. From textbooks to online video tutorials, you can really take your pick. But, we felt that there was something missing — a guide that really starts at the beginning to equip already-intelligent professionals with a healthy balance of strategic and tactical advice. The Beginner’s Guide to Online Marketing closes that gap.


As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
A backlink's value doesn't come only from the website's authority itself. There are other factors to consider as well. You'll sometimes hear those in the industry refer to "dofollow" and "nofollow" links. This goes back to the unethical link-building tactics of the early days of SEO. One practice involved commenting on blogs and leaving a link. It was an easy method, and back then, search engines couldn't tell the difference between a blog post and other site content.
Our backgrounds are as diverse as they come, bringing knowledge and expertise in business, finance, search marketing, analytics, PR, content creation, creative, and more. Our leadership team is comprised of successful entrepreneurs, business executives, athletes, military combat veterans, and marketing experts. The Executives, Directors, and Managers at IMI are all well-respected thought leaders in the space and are the driving force behind the company’s ongoing success and growth.
Native on-platform analytics include Facebook's Insights, Twitter's Analytics, and Instagram's Insights. These tools can help you evaluate your on-platform metrics such as likes, shares, retweets, comments, and direct messages. With this information, you can evaluate the effectiveness of your community-building efforts and your audience's interest in your content.
But this leads to a question — if my husband wants to do a roundup of every Wagner Ring Cycle on DVD, that’s about 8 Amazon links on the page, all bleeding PR away from his substantive insights. If he, instead, wants to do a roundup of every Ring Cycle on CD, that’s about two dozen items worth discussing. The page would be very handy for users, and would involve considerably more effort on his part… but no good deed goes unpunished, and in the eyes of Google the page would be devalued by more than two thirds.

Matt, in almost every example you have given about "employing great content" to receive links naturally, you use blogs as an example. What about people that do not run blog sites (the vast majority of sites!), for example an e-commerce site selling stationery? How would you employ "great content" on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing, just because they can't attract links naturally?
This must be one of the most controversial attributes ever. I participate in photographic communities. The textual content there is quite sparse, as it is a visual medium, with only basic descriptions. However, the community is very active and the participants leave a lot of meaningful comments. Now, with "nofollow" used everywhere, the photographic community is punishing itself for being active and interactive without knowing it. WordPress and Pixelpost now have "nofollow" built in on almost any list of links (blogroll, comments, etc.). The plug-in and theme developers for these platforms followed suit and, yes, you've guessed it, added "nofollow" to almost every link. So, every time I leave a comment without being an anonymous coward, or if someone likes my blog and links to it in their blogroll, then I am, or they are, diluting the rank of my blog? Does it mean that for my own good I should stop participating in the community? Should I visit the hundreds of blogs I visited in the last three years and ask the owners to remove my comments and remove my site from their blogrolls to stop my PageRank from free-falling?
Internet marketing, or online marketing, refers to advertising and marketing efforts that use the Web and email to drive direct sales via electronic commerce, in addition to sales leads from websites or emails. Internet marketing and online advertising efforts are typically used in conjunction with traditional types of advertising such as radio, television, newspapers and magazines.
PageRank has recently been used to quantify the scientific impact of researchers. The underlying citation and collaboration networks are used in conjunction with the PageRank algorithm to come up with a ranking system for individual publications, which propagates to individual authors. The new index, known as the pagerank-index (Pi), is demonstrated to be fairer than the h-index, which exhibits many drawbacks in this context.[63]
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27]
Your social media strategy is more than just a Facebook profile or Twitter feed. When executed correctly, social media is a powerful customer engagement engine and web traffic driver. It’s easy to get sucked into the hype and create profiles on every single social site. This is the wrong approach. What you should do instead is to focus on a few key channels where your brand is most likely to reach key customers and prospects. This chapter will teach you how to make that judgment call.

While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye by using overly aggressive marketing efforts and attempting to manipulate search engine results in unfair ways. Practices that violate our guidelines may result in a negative adjustment of your site's presence in Google, or even the removal of your site from our index.

Finally, it’s critical you spend time and resources on your business’s website design. When these aforementioned customers find your website, they’ll likely feel deterred from trusting your brand and purchasing your product if they find your site confusing or unhelpful. For this reason, it’s important you take the time to create a user-friendly (and mobile-friendly) website.
Moreover, the PageRank mechanism is entirely general, so it can be applied to any graph or network in any field. Currently, the PR formula is used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It's even used for systems analysis of road networks, as well as in biology, chemistry, neuroscience, and physics.
Nice words are not enough for this. You show that blogging is like Apple vs. Samsung. You can create lots of posts and drive traffic (the Samsung approach, lots of phones every year), or you can create high-quality posts like Apple (which is you) and force higher-ranking sites to make content like yours, copying content from your blog. Now I will work hard on my already-published posts until they get traffic.
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and to track the web pages' index status.
Two weeks ago I changed a few internal anchor-text links to an HTML SELECT element in order to save some space in the menu bar. Today, when I looked at the Google cache (text-only version) of my site, I realized that none of the links in the HTML SELECT element can be followed. So I understand that Googlebot doesn't follow these links and obviously there's no inbound 'link juice'. Is that so?
Digital marketing planning is a term used in marketing management. It describes the first stage of forming a digital marketing strategy for the wider digital marketing system. The difference between digital and traditional marketing planning is that it uses digitally based communication tools and technology such as Social, Web, Mobile, Scannable Surface.[57][58] Nevertheless, both are aligned with the vision, the mission of the company and the overarching business strategy.[59]
“Even when I joined the company in 2000, Google was doing more sophisticated link computation than you would observe from the classic PageRank papers. If you believe that Google stopped innovating in link analysis, that’s a flawed assumption. Although we still refer to it as PageRank, Google’s ability to compute reputation based on links has advanced considerably over the years.”
Google has a very large team of search quality raters who evaluate the quality of search results; their ratings are fed into a machine learning algorithm. Google's search quality rater guidelines provide plenty of detail and examples of what Google classes as high- or low-quality content and websites, and of their emphasis on wanting to reward sites that clearly show their expertise, authority and trust (E-A-T).
Google might see 10 links on a page that has $10 of PageRank to spend. It might notice that 5 of those links are navigational elements that occur a lot throughout the site and decide they should only get 50 cents each. It might decide 5 of those links are in editorial copy and so are worthy of getting more. Maybe 3 of them get $2 each and 2 others get 75 cents each, because of where they appear in the copy, if they're bolded or any of a number of other factors you don't disclose.
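The dollar analogy can be written down as a tiny weighting function. The weights below are invented purely to mirror the example's numbers; they are not anything Google has disclosed:

```python
# Ten links on a page holding $10 of PageRank to spend. Each link gets a share
# proportional to an assumed editorial weight (boilerplate navigation weighted
# low, prominent editorial links weighted high). Weights are illustrative only.
page_value = 10.0
link_weights = [0.5] * 5 + [2.0, 2.0, 2.0, 0.75, 0.75]   # 5 nav links, 5 editorial links

total_weight = sum(link_weights)
shares = [page_value * w / total_weight for w in link_weights]
for i, share in enumerate(shares, 1):
    print(f"link {i:2d}: ${share:.2f}")
print(f"total passed: ${sum(shares):.2f}")   # always sums to the $10 available
```

However the weights are chosen, the point of the analogy stands: the page's value is divided among its links unevenly, and the total spent never exceeds what the page has.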

Could the nofollow change be interpreted as a form of usability guidance? For instance, I've recently removed drop-down menus from a handful of sites because of internal link and keyword density issues. This wasn't done randomly. Tests were done to measure the usage and value of this form of navigation, which made it easy to make the change, allowing usability and SEO to dovetail nicely.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
People think about PageRank in lots of different ways. People have compared PageRank to a "random surfer" model in which PageRank is the probability that a random surfer clicking on links lands on a page. Other people think of the web as a link matrix in which the value at position (i,j) indicates the presence of links from page i to page j. In that case, PageRank corresponds to the principal eigenvector of that normalized link matrix.
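The eigenvector view is straightforward to demonstrate with numpy: build a column-stochastic link matrix, apply damping, and run power iteration until the rank vector settles on the principal eigenvector. The four-page graph and the 0.85 damping factor below are assumptions for illustration only:

```python
import numpy as np

# Column-stochastic link matrix M for a 4-page web: M[i, j] is the probability of
# moving to page i from page j (each page splits its weight over its outlinks).
# Made-up graph: page 0 -> {1, 2}, page 1 -> {2}, page 2 -> {0}, page 3 -> {0, 2}.
M = np.array([
    [0.0, 0.0, 1.0, 0.5],
    [0.5, 0.0, 0.0, 0.0],
    [0.5, 1.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
])

d, n = 0.85, M.shape[0]
G = d * M + (1 - d) / n * np.ones((n, n))   # "random surfer" damping

# Power iteration: repeatedly apply G; the vector converges to its principal eigenvector.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = G @ rank

print(np.round(rank, 3), "sum =", round(rank.sum(), 3))
```

Because the damped matrix is column-stochastic, the resulting ranks always sum to one, matching the probability-distribution interpretation of PageRank described earlier.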

I would like to know how Google is handling relevancy with so many websites now jumping on the "nofollow" wagon. It seems like just about every major website has nofollow links, so with the Panda updates this year, what's happening to all that lost link power? Seems like this tactic will stagnate the growth of up-and-coming websites on the internet to me. Am I right here?
Danny Sullivan was a journalist and analyst who covered the digital and search marketing space from 1996 through 2017. He was also a cofounder of Third Door Media, which publishes Search Engine Land, Marketing Land, MarTech Today and produces the SMX: Search Marketing Expo and MarTech events. He retired from journalism and Third Door Media in June 2017. You can learn more about him on his personal site & blog. He can also be found on Facebook and Twitter.

nofollow is beyond a joke now. There is so much confusion (especially when other engines' treatment is factored in), I don't know how you expect a regular publisher to keep up. The expectation seems to have shifted from "Do it for humans and all else will follow" to "Hang on our every word, do what we say, if we change our minds then change everything", and nofollow led the way. I could give other examples of this attitude (e.g. "We don't follow JavaScript links so it's 'safe' to use those for paid links"), but nofollow is surely the worst.
Adjusting how Google treats nofollows is clearly a major shift (as the frenzy in the SEO community has demonstrated). So, if Google were to adjust how they treat nofollows they would need to phase it in gradually. I believe this latest (whether in 2008 or 2009) change is simply a move in the direction of greater changes to come regarding nofollow. It is the logical first step. 

Another illicit practice is to place "doorway" pages loaded with keywords on the client's site somewhere. The SEO promises this will make the page more relevant for more queries. This is inherently false since individual pages are rarely relevant for a wide range of keywords. More insidious, however, is that these doorway pages often contain hidden links to the SEO's other clients as well. Such doorway pages drain away the link popularity of a site and route it to the SEO and its other clients, which may include sites with unsavory or illegal content.
Of course, important pages mean nothing to you if they don't match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines dozens of aspects of the page's content (and the content of the pages linking to it) to determine if it's a good match for your query.
Yes, the more links on a page, the smaller the amount of PageRank it can pass on to each, but that was true before as well. With regard to what happens to the 'missing' PageRank, it seems that if this is the case all over the Internet, and it will be, the total amount of PageRank flow is reduced by the same proportion everywhere, so you don't need as much PageRank flowing to your good links to maintain relative position.

An entrepreneur or freelancer has two main strategies to tap into when marketing online. Search Engine Optimization (SEO), which attempts to rank your website on search engines “organically”, and Search Engine Marketing (SEM), which ranks your website in search results in exchange for money. Both strategies can be used to build a business successfully—but which one is right for you?
What are backlinks doing for your SEO strategy? Well, Google considers over 200 SEO ranking factors when calculating where a page should rank, but we know that backlinks are one of the top three (the other two are content and RankBrain, Google’s AI). So while you should always focus on creating high-quality content, link-building is also an important factor in ranking your pages well on Google.
Having a ‘keyword rich’ domain name may lead to closer scrutiny from Google. According to Moz, Google has “de-prioritized sites with keyword-rich domains that aren’t otherwise high-quality. Having a keyword in your domain can still be beneficial, but it can also lead to closer scrutiny and a possible negative ranking effect from search engines—so tread carefully.”
SEO experts have a really bad habit: They like to throw around strange words and industry jargon when they talk to customers without checking to make sure that their clients understand the topic at hand. Some do this intentionally to paper over the fact that they use black hat techniques that will ultimately hurt their customers. But for most, it’s simply a matter of failing to recognize that part of their job is to educate their clients.

Hi Brian, as usual solid and helpful content, so thank you. I have a question which the internet doesn't seem to be able to answer; I thought perhaps you could. I have worked hard on building backlinks, and with success. However, they are just not showing up regardless of what tool I use to check (Ahrefs, etc.). It has been about 60 days and there are 10 quality backlinks not showing. Any ideas? Thanks!
First of all, it's necessary to sort out what a backlink is. There is no need to explain everything in detail; the main thing is to understand what it is for and how it works. A backlink is a kind of Internet connector: it links one particular site with other external websites that contain links to it. In other words, when you visit those external sites, their links will lead you to that particular site.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
If the algorithm really works as Matt suggests, no one should use nofollow links internally. I’ll use the example that Matt gave. Suppose you have a home page with ten PR “points.” You have links to five “searchable” pages that people would like to find (and you’d like to get found!), and links to five dull pages with disclaimers, warranty info, log-in information, etc. But, typically, all of the pages will have links in headers and footers back to the home page and other “searchable” pages. So, by using “nofollow” you lose some of the reflected PR points that you’d get if you didn’t use “nofollow.” I understand that there’s a decay factor, but it still seems that you could be leaking points internally by using “nofollow.”
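To test that intuition, here is a toy simulation of the commenter's scenario: a home page linking to five "searchable" pages and five dull pages, with every page linking back home in its footer. The model is my own simplification, not Google's actual algorithm: each page splits its PageRank evenly over all of its links, and the share assigned to a nofollowed link simply evaporates (the post-2009 treatment discussed in this thread) rather than being redistributed:

```python
def simulate(nofollow_dull_links: bool, iterations: int = 50) -> dict:
    damping = 0.85
    pages = ["home"] + [f"s{i}" for i in range(5)] + [f"d{i}" for i in range(5)]
    # Every non-home page links back to home (header/footer); home links to all ten.
    links = {p: [("home", False)] for p in pages if p != "home"}
    links["home"] = ([(f"s{i}", False) for i in range(5)]
                     + [(f"d{i}", nofollow_dull_links) for i in range(5)])
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        incoming = {p: 0.0 for p in pages}
        for page, outs in links.items():
            share = rank[page] / len(outs)        # split over ALL links, nofollowed or not
            for target, nofollowed in outs:
                if not nofollowed:                # a nofollowed link's share evaporates
                    incoming[target] += share
        rank = {p: (1 - damping) / len(pages) + damping * incoming[p] for p in pages}
    return rank

for flag in (False, True):
    rank = simulate(flag)
    searchable = sum(v for k, v in rank.items() if k.startswith("s"))
    print(f"nofollow on dull links: {flag}  home={rank['home']:.4f}  "
          f"searchable total={searchable:.4f}")
```

Under these toy assumptions, the home page and the searchable pages do end up with somewhat less PageRank when the links to the dull pages are nofollowed, which is the internal "leak" the comment describes.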
And if you really want to know what are the most important, relevant pages to get links from, forget PageRank. Think search rank. Search for the words you’d like to rank for. See what pages come up tops in Google. Those are the most important and relevant pages you want to seek links from. That’s because Google is explicitly telling you that on the topic you searched for, these are the best.
A backlink is a link one website gets from another website. Backlinks make a huge impact on a website's prominence in search engine results, which is why they are considered very useful for improving a website's SEO ranking. Search engines calculate rankings using multiple factors to display search results. No one knows for sure how much weight search engines give to backlinks when listing results; however, what we do know for certain is that they are very important.
The course work of a marketing program will consist of real-world and hands-on components, such as case studies of both successful and failed marketing campaigns, and simulated businesses marketed by students using the concepts they have learned. This will include diving into several computer programs like Adobe InDesign and Dreamweaver, as well as both free and proprietary website analytics software.
It doesn't mean that you have to advertise on these social media platforms. It means that they belong to that pyramid, which will function better thanks to their support. Just secure them and decide which of them suits your goal best. For example, you might choose Instagram because its audience is the best suited to mobile devices.
There’s a lot of frustration being vented in this comments section. It is one thing to be opaque – which Google seems to be masterly at – but quite another to misdirect, which is what No Follow has turned out to be. All of us who produce content always put our readers first, but we also have to be sensible as far as on page SEO is concerned. All Google are doing with this kind of thing is to progressively direct webmasters towards optimizing for other, more reliable and transparent, ways of generating traffic (and no, that doesn’t necessarily mean Adwords, although that may be part of the intent).