There is another way to gain quality backlinks to your site, beyond related site themes: anchor text. When a link incorporates a keyword into the visible text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the most underestimated resources a webmaster has. Instead of generic words like "click here," which probably won't relate in any way to your website, a link reading "Please visit our tips page for how to nurse an orphaned kitten" is a far better use of a hyperlink. A good tool for finding your backlinks and the text being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that another website links to your site but the anchor text is not being used well, request that the site change the anchor text to something incorporating relevant keywords. This will also help boost your quality-backlink score.

Adjusting how Google treats nofollow is clearly a major shift (as the frenzy in the SEO community has demonstrated), so if Google were to change its treatment of nofollow, it would need to phase the change in gradually. I believe this latest change (whether it came in 2008 or 2009) is simply a move in the direction of greater changes to come regarding nofollow. It is the logical first step.
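The equation that the following "where N …" clause refers to is missing from this excerpt. Reconstructed in the standard notation of the well-known probability form (my reconstruction, not part of the original text), it reads:

```latex
PR(p_i) = \frac{1-d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}
```

Here d is the damping factor, M(p_i) is the set of pages linking to p_i, and L(p_j) is the number of outbound links on page p_j.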
where N is the total number of pages on the web. The second version of the algorithm does not, in fact, differ fundamentally from the first. In terms of the Random Surfer Model, a page's PageRank in the second version is the actual probability that a surfer reaches that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks is one.
If you can leave a guest post, do it. Why? Because it can drive relevant referral traffic to the website you own. All you need to do is make your post valuable and free of spam: just solid core information that isn't spoiled by injected backlinks. Contextual linking is better; in other words, the links should merge naturally into your text.
@matt: I noticed a bit of WordPress-related talk early in the comments (sorry, I don't have time to read all of them right now). I was wondering if you'd like to comment on a Trac ticket related to the use of nofollow on the non-JS-fallback comment links which WordPress uses. It links to the current page with a changed form; the content and comments should remain the same, just a different form. I think the original reason nofollow was added there was to prevent search engines from thinking the site was advertising multiple pages with the same content.
I am not worried by this; I do agree with Danny Sullivan (great comment, Danny, best comment I have read in a long time). I will not be changing much on my site re: linking, but it is interesting to see that Google took over a year to tell us about the change, yet was really happy to tell us about rel="nofollow" in the first place and advised us all to use it.
Matt, this is an excellent summary. I finally got around to reading "The Search" by John Battelle, and it was very enlightening to understand the academia behind what led to the creation of Backrub… er, Google. Consider how many times the project was almost shut down due to bandwidth consumption (over 50% of what the university could offer at times), as well as webmasters being concerned that their pages would be stolen and recreated. It's so interesting to see that issues we face today are some of the same ones Larry and Sergey were dealing with back then. As always, thanks for the great read, Matt!
Internet Marketing Inc. provides integrated online marketing strategies that help companies grow. We think of ourselves as a business development consulting firm that uses interactive marketing as a tool to increase revenue and profits. Our management team has decades of combined experience in online marketing as well as graduate level education and experience in business and finance. That is why we focus on creating integrated online marketing campaigns designed to maximize your return on investment.

We must be careful with our reciprocal links. There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy a site is that you link to from your own website. This means you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier right now about which sites we exchange links with. By choosing only relevant sites to link with, sites that don't have tons of outbound links on a page and don't practice black-hat SEO techniques, we will have a better chance that our reciprocal links won't be discounted.
Two weeks ago I changed a few internal anchor-text links into an HTML select element in order to save some space in the menu bar. Today, when I looked at Google's cached (text-only) version of my page, I realized that none of the links in the select element can be followed. So I understand that Googlebot doesn't follow these links and there's obviously no inbound 'link juice'. Is that so?
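This behavior is consistent with how a simple link-harvesting crawler works: it collects `href` attributes from anchor tags, and URLs stashed in `<select>`/`<option>` values (which only become navigation via JavaScript) are invisible to it. A minimal sketch using Python's standard-library HTML parser (an illustration of the general idea, not Google's actual crawler):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects only the URLs found in <a href="..."> attributes."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A menu bar with one real link and a JS-driven <select> jump menu.
html = """
<a href="/articles/">Articles</a>
<select onchange="location = this.value;">
  <option value="/contact/">Contact</option>
  <option value="/about/">About</option>
</select>
"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/articles/'] -- the <option> URLs are never seen
```

The crawler sees only the plain anchor; the two pages reachable through the dropdown get no link at all, which matches the empty cache view described above.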
Finally, start building links on relevant sites such as business directories (especially local directories), relevant niche blogs and forums, and industry publications. Success at link building results from a combination of good PR, smart marketing strategy, and, of course, great content. Google has said that social media doesn't impact rankings, but reaching out to social influencers can give your content traction on other channels, which can be useful.

I really appreciate that you keep us updated as soon as you can, but in some cases, e.g. WRT rel-nofollow, the most appreciated update would be the removal of this much-hated and pretty useless microformat. I mean, when you introduced it because the Google (as well as M$, Yahoo and Ask) algos were flawed at that time, why not take the chance and dump it now that it's no longer needed?

Most online marketers mistakenly attribute 100% of a sale or lead to the last-clicked source. The main reason for this is that most analytics solutions only provide last-click analysis. 93% to 95% of marketing touchpoints are ignored when you attribute success only to the last click. That is why multi-touch attribution is required to properly source sales and leads.
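The difference is easy to see in code. A minimal sketch (the journey below is hypothetical, and linear attribution is just one of several multi-touch models):

```python
def last_click(touchpoints):
    # All credit goes to the final touchpoint before the conversion;
    # every earlier touchpoint is ignored.
    return {touchpoints[-1]: 1.0}

def linear_attribution(touchpoints):
    # A simple multi-touch model: spread the credit evenly across
    # every touchpoint in the journey.
    share = 1.0 / len(touchpoints)
    credit = {}
    for t in touchpoints:
        credit[t] = credit.get(t, 0.0) + share
    return credit

# Hypothetical customer journey ending in one sale.
journey = ["organic search", "email", "display ad", "paid search"]

print(last_click(journey))          # {'paid search': 1.0}
print(linear_attribution(journey))  # each channel gets 0.25
```

Under last-click, three of the four touchpoints (75% here, and 93% to 95% in the article's estimate for real campaigns) receive no credit at all.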
I personally nofollow links to my privacy policy and contact form. Even though these are excluded in robots.txt, I prefer that extra layer of protection so that the pages are not indexed. Anyone who has ever had their contact form blasted continuously by spammers knows what I mean. And yes, one could add the noindex meta tag. But let's face it, not everyone is a skilled PHP programmer. On dynamic sites it's not as simple as adding a meta tag…
7. Keyword research. Specific target keywords aren't as important for SEO success as they used to be, now that Google search is fueled by semantic and contextual understanding, but you should still be able to identify both head keywords (short, high-volume keywords) and long-tail keywords (longer, conversational, low-volume keywords) to guide the direction of your campaign.
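A toy sketch of that head/long-tail split (the word-count threshold is my own heuristic for illustration, not an SEO standard, and real research would also weigh search volume):

```python
def bucket_keywords(keywords, head_max_words=2):
    # Split candidate keywords into "head" (short) and "long-tail"
    # (longer, conversational) buckets by word count.
    head, long_tail = [], []
    for kw in keywords:
        (head if len(kw.split()) <= head_max_words else long_tail).append(kw)
    return head, long_tail

head, long_tail = bucket_keywords([
    "seo",
    "link building",
    "how to nurse an orphaned kitten",
    "best local business directories for plumbers",
])
print(head)       # ['seo', 'link building']
print(long_tail)  # the longer, conversational phrases
```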
The truth? You don't often come across genuine individuals in this space. I could likely count on one hand the genuine-minded marketers out there. Someone like Russell Brunson, who has built a career out of providing true value in the field and helping to educate the uneducated, is one such name. However, while Brunson has built a colossal business, the story of David Sharpe and his journey to becoming an 8-figure earner really hits home for most people.
The internet was the little guy's savior: simple sites could rank well locally. Sadly, your company is in the process of destroying that. In this economy, small businesses with zero PageRank that are listed on page 22 of the results need to be found in order to survive. My customers are really suffering because of the work coming out of Google, and it keeps getting worse. Their conversions are still good coming out of Yahoo and MSN, and now Bing. They do not have the resources to produce blogs, forums, or $5,000 websites, let alone pay for AdWords, when they are just trying to pay rent; not a lot of people can do their own web production.
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may use this page if they are having trouble finding pages on your site. While search engines will also visit this page, gaining good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
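Such a page can be generated mechanically from the site's page tree. A minimal sketch (the page titles, slug scheme, and tree below are hypothetical, for illustration):

```python
def render_sitemap(tree, depth=0):
    # tree maps each page title to a dict of its child pages;
    # emit a nested <ul> hierarchy of links, one list per level.
    pad = "  " * depth
    lines = [pad + "<ul>"]
    for title, children in tree.items():
        slug = title.lower().replace(" ", "-")
        lines.append(f'{pad}  <li><a href="/{slug}/">{title}</a>')
        if children:
            lines.extend(render_sitemap(children, depth + 1))
        lines.append(pad + "  </li>")
    lines.append(pad + "</ul>")
    return lines

# Hypothetical site structure.
site = {
    "About": {},
    "Articles": {"SEO Basics": {}, "Link Building": {}},
    "Contact": {},
}
print("\n".join(render_sitemap(site)))
```

Every page gets a plain crawlable anchor, which is exactly what gives search engines the crawl coverage mentioned above.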

Most people who used the Google Toolbar probably never went through the effort of enabling the PageRank meter, which Google offered as an incentive to web surfers, a way for them to understand the quality of pages encountered when browsing (and a way for Google to understand what people were viewing beyond Google itself). But one group was very inclined to make the effort: SEOs.
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
In a number of recent articles, where I've interviewed some of social media's rising stars such as Jason Stone from Millionaire Mentor, Sean Perelstein, who built StingHD into a global brand and Nathan Chan from Foundr Magazine, amongst several others, it's quite clear that multi-million-dollar businesses can be built on the backs of wildly-popular social media channels and platforms.

Assume a small universe of four web pages: A, B, C and D. Links from a page to itself, or multiple outbound links from one single page to another single page, are ignored. PageRank is initialized to the same value for all pages. In the original form of PageRank, the sum of PageRank over all pages was the total number of pages on the web at that time, so each page in this example would have an initial value of 1. However, later versions of PageRank, and the remainder of this section, assume a probability distribution between 0 and 1. Hence the initial value for each page in this example is 0.25.
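The iteration on this four-page universe is short enough to sketch directly. The link structure below is my own hypothetical choice, since the excerpt doesn't specify one; each page starts at 0.25 as described, and the probability form keeps the ranks summing to one:

```python
# Hypothetical link graph for the four-page universe A, B, C, D.
# (Self-links and duplicate links between the same pair are ignored,
# per the text above.)
links = {
    "A": ["B", "C"],   # A links to B and C
    "B": ["C"],        # B links to C
    "C": ["A"],        # C links to A
    "D": ["A", "C"],   # D links to A and C
}

d = 0.85                           # damping factor
n = len(links)
pr = {p: 1.0 / n for p in links}   # probability version: each page starts at 0.25

for _ in range(50):
    pr = {
        p: (1 - d) / n
           + d * sum(pr[q] / len(out) for q, out in links.items() if p in out)
        for p in links
    }

print({p: round(v, 4) for p, v in sorted(pr.items())})
```

Note that D, which nothing links to, ends at the floor value (1 − d)/N, while C, with three inbound links, ends highest; the ranks still form a probability distribution.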
For search engine optimization purposes, some companies offer to sell high PageRank links to webmasters.[40] As links from higher-PR pages are believed to be more valuable, they tend to be more expensive. It can be an effective and viable marketing strategy to buy link advertisements on content pages of quality and relevant sites to drive traffic and increase a webmaster's link popularity. However, Google has publicly warned webmasters that if they are or were discovered to be selling links for the purpose of conferring PageRank and reputation, their links will be devalued (ignored in the calculation of other pages' PageRanks). The practice of buying and selling links is intensely debated across the Webmaster community. Google advises webmasters to use the nofollow HTML attribute value on sponsored links. According to Matt Cutts, Google is concerned about webmasters who try to game the system, and thereby reduce the quality and relevance of Google search results.[40]

The Open Directory Project (ODP) is a Web directory maintained by a large staff of volunteers. Each volunteer oversees a category, and together volunteers list and categorize Web sites into a huge, comprehensive directory. Because a real person evaluates and categorizes each page within the directory, search engines like Google use the ODP as a database for search results. Getting a site listed on the ODP often means it will show up on Google.
More appropriately, blame Google for ever making the PageRank score visible. When Google first started, PageRank was something it talked about as part of its research papers, press releases and technology pages to promote itself as a smarter search engine than well-established and bigger rivals at the time — players like Yahoo, AltaVista and Lycos, to name a few.

I think it's important to remember that the majority of website owners are not at all technical or savvy about how this whole system works when it comes to SEO, but they still get along and do their best to put something worthwhile on the net. Unfortunately, this same majority can often damage their website rankings without ever knowing it, leading to an under-performing website regardless of the quality of their content. As I've learnt as a first-time website owner, there's a lot more to running a website than just doing the right thing and trying to produce quality, when time and time again the SEO experts win out on the SERPs regardless of quality. One keyword phrase I searched repeatedly over recent years returned the same EMPTY site as the number-one result; truly, it had NO content.
Such an enlightening post! Thanks for revealing those sources, Brian. This has really opened my mind to new ideas. I have read many articles about SEO, especially ones from my country, and most of them don't really explain how to increase your presence in search engines. But today I found this page, which gave me much more valuable insight. Definitely going to try your tips.
