There has been much discussion over the last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
Search engines are a powerful channel for connecting with new audiences. Companies like Google and Bing look to connect their customers with the best user experience possible. Step one of a strong SEO strategy is to make sure that your website content and products are the best they can be. Step two is to communicate that user experience information to search engines so that you rank in the right place. SEO is competitive and has a reputation for being a black art. Here’s how to get started the right way.
Matt, my biggest complaint with Google and this “PageRank” nofollow nightmare is that it seems we need a certain type of site to get ranked well or to make your crawler happy. You say you want a quality site, but what my users deem quality (3,000 links to the best academic information on the planet for business development) is actually looked at by Google as a bad thing, and I get no rank because of it. That makes it hard for my site to be found, and the people who could really use the information can’t find it, even though you yourself would look at the info and think it was fantastic to find it all in one place.
Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
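As a rough illustration of how a well-behaved crawler consults robots.txt before fetching a page, here is a minimal sketch using Python's standard urllib.robotparser; the domain, paths, and bot name are hypothetical placeholders, not anything from the text above.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the shopping cart and internal
# search results, matching the examples in the paragraph above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

robots = RobotFileParser()
robots.parse(ROBOTS_TXT.splitlines())

# A polite crawler checks can_fetch() before requesting each URL.
for url in ("https://example.com/products/widget",
            "https://example.com/cart/checkout",
            "https://example.com/search?q=widgets"):
    allowed = robots.can_fetch("ExampleBot", url)
    print(url, "->", "crawl" if allowed else "skip")
```

Note that, as the paragraph says, this is advisory: a crawler that has cached an older robots.txt (or simply ignores it) may still fetch disallowed pages.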
No PageRank would ever escape from the loop, and as incoming PageRank continued to flow into the loop, eventually the PageRank in that loop would reach infinity. Infinite PageRank isn’t that helpful 🙂 so Larry and Sergey introduced a decay factor–you could think of it as 10-15% of the PageRank on any given page disappearing before the PageRank flows along the outlinks. In the random surfer model, that decay factor is as if the random surfer got bored and decided to head for a completely different page. You can do some neat things with that reset vector, such as personalization, but that’s outside the scope of our discussion.
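For reference, the standard way to write this down (a textbook formulation, not something quoted from this post) uses a damping factor $d$, conventionally around 0.85, which matches the 10–15% decay described above:

$$PR(p) = \frac{1-d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}$$

where $N$ is the total number of pages, $B_p$ is the set of pages linking to $p$, and $L(q)$ is the number of outlinks on page $q$. The $\frac{1-d}{N}$ term is the reset vector mentioned above: the probability that the random surfer gets bored and jumps to an arbitrary page instead of following a link.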
By focus I mean making sure that each page targets the same keyword throughout, that your site as a whole focuses on the same high-level keywords, and that each section of your site focuses on its own high-level keywords (though not as high-level as the ones you want your home page to rank for). Few people really understand focus, but the interesting thing is that you get it right almost automatically if you do your site architecture well and understand your customers.
I was thinking exactly the same thing Danny Sullivan said. If comments (even with nofollow) directly affect the outgoing PR distribution, people will tend to allow fewer comments (maybe even resort to iframes). Is he right? Maybe Google should develop a new attribute, something like rel=”commented”, to tell spiders to give those links less value, and WordPress should ship with this attribute by default 🙂

There’s a lot of frustration being vented in this comments section. It is one thing to be opaque, which Google seems to have mastered, but quite another to misdirect, which is what nofollow has turned out to be. All of us who produce content put our readers first, but we also have to be sensible as far as on-page SEO is concerned. All Google are doing with this kind of thing is progressively directing webmasters towards optimizing for other, more reliable and transparent ways of generating traffic (and no, that doesn’t necessarily mean AdWords, although that may be part of the intent).
Our SEO professionals are all well-respected thought leaders in the space with decades of combined experience. Their credentials include Search Engine Workshop Certification, Google Analytics and Yahoo certifications, PMP Certification, UNIX Certification, Computer Engineering degrees, and MBAs. Our SEO team members are acclaimed SEO speakers and bloggers, and have been keynote presenters at Pubcon, SMX, SEMCon, Etail, and many more influential conferences.
Ask for explanations if something is unclear. If an SEO creates deceptive or misleading content on your behalf, such as doorway pages or "throwaway" domains, your site could be removed entirely from Google's index. Ultimately, you are responsible for the actions of any companies you hire, so it's best to be sure you know exactly how they intend to "help" you. If an SEO has FTP access to your server, they should be willing to explain all the changes they are making to your site.

Try using Dribbble to find designers with good portfolios. Contact them directly by upgrading your account to PRO status, for just $20 a year. Then simply use the search filter and type "infographics." After finding someone you like, click "hire me" and send a message detailing your needs and asking for a price. Fiverr is another place to find great designers willing to create inexpensive infographics.

The PageRank algorithm outputs a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called "iterations", through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.
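To make those "iterations" concrete, here is a minimal power-iteration sketch in Python. It is a toy illustration under stated assumptions, not Google's actual implementation: the four-page link graph is invented, the damping factor of 0.85 is the conventional choice, and every page here has at least one outlink so dangling nodes are ignored for simplicity.

```python
# Toy link graph: keys are pages, values are the pages they link to
# (hypothetical data for illustration only).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

d = 0.85                                # conventional damping factor
n = len(links)
ranks = {p: 1.0 / n for p in links}     # evenly divided starting distribution

for _ in range(50):                     # the "iterations" from the text
    # Every page receives a baseline (1 - d) / n from the reset vector.
    new_ranks = {p: (1 - d) / n for p in links}
    # Each page splits d * its current rank evenly across its outlinks.
    for page, outlinks in links.items():
        share = d * ranks[page] / len(outlinks)
        for target in outlinks:
            new_ranks[target] += share
    ranks = new_ranks

# The scores form a probability distribution (they sum to 1).
for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 4))
```

Running this, the values stop changing meaningfully well before 50 passes, which is the sense in which the iterations "adjust approximate PageRank values to more closely reflect the theoretical true value."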


Ian Rogers first used the Internet in 1986, sending email on a university VAX machine! He installed his first web server in 1990 and taught himself HTML and Perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database-Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years, and has also been known to do a little Jive dancing.

Excellent! I was wondering when Google would finally release information regarding this highly controversial issue. I have always agreed with and followed Matt’s advice in letting PR flow as freely as possible; natural linking is always the best linking in my experience with search engine results. I am very glad that you have addressed the topic of nofollow links having no effect in the Google SERPs; I was getting tired of explaining the same points covered in this article to my clients and other “SEOs”.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] That market share is achieved in a number of countries.
Positioning of a webpage on Google SERPs for a keyword depends on relevance and reputation, also known as authority and popularity. PageRank is Google's indication of its assessment of the reputation of a webpage: it is not keyword-specific. Google uses a combination of webpage and website authority to determine the overall authority of a webpage competing for a keyword.[36] The PageRank of the home page of a website is the best indication Google offers for website authority.[37]
As digital marketing continues to grow and develop, brands take great advantage of using technology and the Internet as a way to communicate with their clients, allowing them to increase the reach of who they can interact with and how they go about doing so.[2] There are, however, disadvantages that are not commonly examined, given how heavily a business relies on these channels. It is important for marketers to weigh both the advantages and disadvantages of digital marketing when planning their marketing strategy and business goals.
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise and easy to understand as possible. There's a printable PDF version for those who'd prefer one, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.
Hi, Norman! PageRank is an indicator of authority and trust, and inbound links are a large factor in PageRank score. That said, it makes sense that you may not be seeing any significant increase in your PageRank after only four months; a four-month-old website is still a wee lad! PageRank is a score you will see slowly increase over time as your website begins to make its mark on the industry and external websites begin to reference (or otherwise link to) your Web pages.

And if you really want to know what are the most important, relevant pages to get links from, forget PageRank. Think search rank. Search for the words you’d like to rank for. See what pages come up tops in Google. Those are the most important and relevant pages you want to seek links from. That’s because Google is explicitly telling you that on the topic you searched for, these are the best.

Yep, please change things to stop keyword stuffing. Change them to stop cloaking. Definitely change them to stop the buying of links that try to game Google. But telling search engines not to give weight (that I control) to pages that are not what my site is about, or are not really relevant? No way. This is logical stuff here. Maybe too logical. I think deep down you know this too, Matt.


Matt Cutts, it’s Shawn Hill from Longview, Texas, and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s Webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blog spam should be the least of Google’s worries. What’s your take?