Bob Dole (interesting name), you’re certainly welcome to use Bing if you prefer, but before you switch, you might check whether they do similar things. Nate Buggia, for example, has strongly recommended against bothering with PageRank sculpting in the past, or at least that was my perception from his comments at the last couple of SMX Advanced conferences.


Of course, important pages mean nothing to you if they don’t match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines dozens of aspects of the page’s content (and the content of the pages linking to it) to determine whether it’s a good match for your query.
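As a rough illustration, here is a minimal sketch of that blend: a hypothetical combined score that weights a link-based importance value against a crude term-overlap relevance check. The 0.5 weight, the relevance heuristic, and the sample pages are assumptions for demonstration only, not Google’s actual formula.

```python
# Illustrative sketch: blend a link-based importance score with a
# simple text-relevance score. The weight and the relevance heuristic
# are assumptions for demonstration, not Google's formula.

def relevance(query: str, page_text: str) -> float:
    """Fraction of query terms that appear in the page text."""
    terms = query.lower().split()
    text = page_text.lower()
    return sum(term in text for term in terms) / len(terms)

def combined_score(pagerank: float, query: str, page_text: str,
                   weight: float = 0.5) -> float:
    """Blend importance (PageRank) with relevance to the query."""
    return weight * pagerank + (1 - weight) * relevance(query, page_text)

# A high-PageRank page that doesn't match the query loses to a
# lower-PageRank page that does.
pages = {
    "https://example.com/a": (0.9, "unrelated page stuffed with keywords"),
    "https://example.com/b": (0.4, "a guide to search engine optimization"),
}
query = "search engine optimization"
ranked = sorted(pages.items(),
                key=lambda kv: combined_score(kv[1][0], query, kv[1][1]),
                reverse=True)
for url, _ in ranked:
    print(url)  # /b outranks /a despite its lower PageRank
```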
I just did a consult and opinion letter for an extremely large 200,000+ page corporate website that had been forced to temporarily remove its HTML sitemap due to some compromised code that overloaded the server and crashed the site. A number of people at the company were concerned about the potential negative SEO implications of removing this page: the loss of PageRank equity transfer to the sitemap’s targets, and a feeling that the page was giving robots important pathways to the many orphan pages unreachable through the menu system. This article was helpful in debunking the notion that a page with 200,000 links off of it was passing any meaningful link juice to the targets. P.S. An XML sitemap is in place.
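The arithmetic bears that out. Under the classic simplified PageRank model, a page’s score is divided evenly among its outbound links; with a standard damping factor of 0.85 and a made-up sitemap score of 1.0, a quick back-of-the-envelope sketch shows how little each of 200,000 targets would receive:

```python
# Back-of-the-envelope check using the classic simplified PageRank
# model, where a page's score is split evenly among its outbound links.
# The sitemap's PageRank value here is a hypothetical placeholder.

damping = 0.85          # standard damping factor from the original paper
sitemap_pagerank = 1.0  # hypothetical score of the HTML sitemap page
outbound_links = 200_000

per_link_equity = damping * sitemap_pagerank / outbound_links
print(f"Equity passed per link: {per_link_equity:.8f}")
# -> 0.00000425, effectively nothing per target page
```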
Because if I do that, if I just write good content while my 100+ competitors build links, market articles, comment on forums, social bookmark, release viral videos, and buy links, I’ll end up at the very bottom of the pile, great content or not. Really, I’m just as well off taking my chances and pulling every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don’t, what do I have to lose?
Given that “only a tiny percentage of links on the Web use nofollow”, why don’t we just get back to focusing on humans and drop nofollow? It has failed, and since it was only ever a tool for manipulating PageRank, it was bound to. Has Google run any tests comparing its search quality with nofollow taken into account versus ignored, I wonder?
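For context, honoring nofollow is mechanically simple on the crawler side: a link carrying the attribute is just left out of the link graph. A minimal sketch using Python’s standard-library HTML parser (the class name and sample markup are mine, purely for illustration):

```python
# Illustrative sketch of how a crawler might honor rel="nofollow":
# links carrying the attribute are excluded from the followed set.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

parser = LinkExtractor()
parser.feed('<a href="/about">About</a> '
            '<a href="http://spam.example" rel="nofollow">spam</a>')
print(parser.followed)    # ['/about']
print(parser.nofollowed)  # ['http://spam.example']
```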
By relying so heavily on factors such as keyword density, which were entirely within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant results could drive users to other search sources. Search engines responded by developing more complex ranking algorithms that took into account additional factors that were more difficult for webmasters to manipulate. In 2005, AIRWeb (Adversarial Information Retrieval on the Web), an annual conference, was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
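To make the keyword-density metric concrete: it is simply a term’s occurrence count divided by the page’s total word count, which is exactly why it was so easy to inflate. A minimal sketch (the function name and sample text are illustrative):

```python
# Keyword density as early search engines might have computed it:
# occurrences of a term divided by total words on the page. Trivial
# for a webmaster to inflate, which is why engines moved beyond it.
import re

def keyword_density(term: str, page_text: str) -> float:
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    return words.count(term.lower()) / len(words)

stuffed = "cheap widgets cheap widgets cheap widgets buy cheap widgets"
print(f"{keyword_density('cheap', stuffed):.0%}")  # 44%
```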

9. Troubleshooting and adjustment. In your first few years as a search optimizer, you’ll almost certainly run into the same problems and challenges everyone else does; your rankings will plateau, you’ll find duplicate content on your site, and you’ll probably see significant ranking volatility. You’ll need to know how to diagnose and address these problems if you don’t want them to bring down the effectiveness of your campaign.
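As one example of such diagnosis, exact-duplicate content can be spotted by fingerprinting normalized page text and grouping URLs by digest. This is only a sketch under simplifying assumptions (it catches exact matches after whitespace normalization only; real audits also need near-duplicate detection such as shingling), and the URLs and helper function are hypothetical:

```python
# Illustrative duplicate-content check: hash the normalized body text
# of each page and group URLs that share a digest.
import hashlib

def content_fingerprint(text: str) -> str:
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

pages = {  # hypothetical URL -> extracted body text
    "/widgets": "Our widgets are the best widgets.",
    "/widgets?ref=home": "Our widgets  are the best widgets.",
    "/about": "We have been making widgets since 1999.",
}

groups: dict[str, list[str]] = {}
for url, text in pages.items():
    groups.setdefault(content_fingerprint(text), []).append(url)

for digest, urls in groups.items():
    if len(urls) > 1:
        print("Duplicate content:", urls)  # ['/widgets', '/widgets?ref=home']
```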
