3. General on-site optimization. On-site optimization is a collection of tactics, most of which are simple to implement, geared toward making your website more visible to, and more easily indexed by, search engines. These tactics include things like optimizing your titles and meta descriptions to include some of your target keywords, keeping your site's code clean and minimal, and providing ample, relevant content on every page. I've got a huge list of on-site SEO tactics you can check out here.
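As a concrete illustration of the title and meta-description tactic, here is a minimal sketch of an optimized page head. The store name, keyword, and copy are all invented for the example:

```html
<!-- Hypothetical product-category page targeting "rechargeable AA batteries" -->
<head>
  <!-- Title: lead with the target keyword, keep it under ~60 characters -->
  <title>Rechargeable AA Batteries | ExampleStore</title>
  <!-- Meta description: a ~150-character summary that can appear as the
       snippet in search results; include the keyword naturally -->
  <meta name="description"
        content="Compare rechargeable AA batteries by capacity, charge cycles,
                 and price. Free shipping on orders over $25.">
</head>
```

The meta description does not directly affect rankings, but a compelling one can improve click-through from the results page.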
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] Google achieves a similarly dominant market share in a number of other countries.
When you comment on a blog post, you are usually allowed to include a link back to your website. This is often abused by spammers, which can turn comment links into a negative link-building signal. But if you post genuine comments on high-quality blog posts, there can be some value in sharing links, as they can drive traffic to your site and increase the visibility of your brand.

How does this all relate to disallows in robots.txt? My ecommerce site has 12,661 pages disallowed because we got nailed for duplicate content. We sell batteries, so revisions to each battery were coming up as duplicate content. Is PageRank being sent (and ignored) to these internal disallowed links as well? One of our category levels has hundreds of links to different series found under models, and the majority of these series are disallowed. If PageRank acts the same with disallows as it does with nofollows, are these disallowed links hurting our rankings?
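For readers unfamiliar with the mechanism the commenter describes, a disallow rule in robots.txt looks something like the sketch below. The paths are hypothetical, invented to match the battery-revision scenario; Google supports the `*` wildcard shown here, though it is not part of the original robots.txt convention:

```
# Hypothetical robots.txt for the battery site described above
User-agent: *
# Block the near-duplicate revision pages...
Disallow: /batteries/*/revisions/
# ...while leaving the canonical model pages crawlable
Allow: /batteries/
```

Note that a disallow only stops crawling; links can still point PageRank at blocked URLs, which is exactly the concern raised in the comment.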
Matt, my biggest complaint with Google and this PageRank/nofollow nightmare is that it seems we need a certain type of site to get ranked well, or to make your crawler happy. You say you want a quality site, but what my users deem quality (3,000 links to the best academic information on the planet for business development) is apparently looked at by Google as a bad thing. I get no rank because of it, which makes my site hard to find, and the people who could really use the information cannot find it, even though you yourself would look at it and think it was fantastic to have it all in one place.

While ordinary users were not that interested in pages' scores, SEOs felt that this was a great opportunity to make a difference for their clients. This obsession with PageRank made everyone feel that it was more or less the only ranking signal that mattered, despite the fact that pages with a lower PR score can outrank those with a higher one. So what did we end up with as a result?
