The original Random Surfer PageRank patent from Stanford has expired. The Reasonable Surfer version of PageRank (assigned to Google) is newer and has been updated via a continuation patent at least once. The version of PageRank based upon a trusted seed set of sites (also assigned to Google) has likewise been updated via a continuation patent and differs in many ways from the Stanford version. It is likely that Google is using one of the versions of PageRank it controls (its exclusive license to Stanford's version expired along with that patent). The updated versions (the Reasonable Surfer and Trusted Seeds approaches) are both protected under present-day patents assigned to Google, and both have been updated to reflect modern implementation processes. Given their existence, and the expiration of the original, I would suggest it is unlikely that the random surfer model-based PageRank is still being used.
Ian Rogers first used the Internet in 1986, sending email on a university VAX machine! He installed his first web server in 1990 and taught himself HTML and Perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years, and has also been known to do a little Jive dancing.
For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”

Totally agree — more does not always equal better. Google takes a sort of ‘Birds of a Feather’ approach when analyzing inbound links, so it’s really all about associating yourself (via inbound links) with websites Google deems high quality and trustworthy so that Google deems YOUR web page high quality and trustworthy. As you mentioned, trying to cut corners, buy links, do one-for-one trades, or otherwise game/manipulate the system never works. The algorithm is too smart.


Such an enlightening post! Thanks for revealing those sources, Brian. This has really opened my mind to new ideas. I have read many articles about SEO, especially ones from my country, and most of them don't really explain how to increase your presence in search engines. But today I found this page, which gave me much more valuable insight. Definitely going to try your tips.
So, for example, a short-tail keyphrase might be “Logo design”. Putting that into Google will get you an awful lot of hits. There’s a lot of competition for that phrase, and it’s not particularly useful for your business, either. There are no buying signals in the phrase – so many people will use this phrase to learn about logo design or to examine other aspects of logo design work.

Understand that whatever you're going to do, you'll need traffic. If you don't have any money at the outset, your hands will be tied no matter what anyone tells you. The truth is that you need to drive traffic to your offers if you want them to convert. Those offers usually live on what we call landing pages or squeeze pages, which is where you come into contact with customers, either for the first time or after they've gotten to know you a little better.
Deliver value no matter what: Regardless of who you are and what you're trying to promote, always deliver value, first and foremost. Go out of your way to help others by carefully curating information that will assist them in their journey. The more you focus on delivering value, the quicker you'll reach that proverbial tipping point when it comes to exploding your fans or followers.
There are also many keyword research tools (some free and some paid) that claim to take the effort out of this process. A popular tool for first timers is Traffic Travis, which can also analyse your competitors’ sites for their keyword optimization strategies and, as a bonus, it can deliver detailed analysis on their back-linking strategy, too. You can also use Moz.com’s incredibly useful keyword research tools – they’re the industry leader, but they come at a somewhat higher price.
The PageRank algorithm has major effects on society, as it carries social influence. Where computer science views PageRank as an algorithm, the humanities instead examine it through a lens focused on its social components. In these analyses it is dissected and reviewed not for its technological advancement in the field of search engines, but for its societal influence.[42] Laura Granka discusses PageRank by describing how pages are not ranked simply by popularity; their ranking also conveys a perceived reliability that lends them a trustworthy quality. This has led to behavior directly linked to PageRank: it is treated as the definitive ranking of products and businesses, and can thus manipulate thinking. The information available to individuals is what shapes thinking and ideology, and PageRank is the device that surfaces this information. Search results are the forum through which information is delivered to the public, and they have a societal impact because they affect how a person thinks and acts.
Start Value (in this case) is the number of actual links to each node. Most people set this to 1 to start, but there are two good reasons for using link counts instead. First, it is a better initial approximation than giving everything the same value, so the algorithm stabilises in fewer iterations. Second, it makes it easy to sanity-check my spreadsheet in a moment: node A, for example, has one link in (from page C).
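The iteration described above can be sketched in a few lines of Python. This is a minimal illustration on a hypothetical three-page graph (the page names and link structure are illustrative, chosen so that node A has one inbound link from page C as in the text; the damping factor 0.85 is the conventional choice, not taken from the spreadsheet):

```python
# Minimal PageRank power iteration, seeding each node's start value
# with its inbound-link count rather than 1.
damping = 0.85
links = {            # page -> pages it links out to (hypothetical graph)
    "A": ["B"],
    "B": ["C"],
    "C": ["A", "B"],
}

# Count inbound links to use as start values.
inbound = {p: 0 for p in links}
for outs in links.values():
    for q in outs:
        inbound[q] += 1

pr = {p: float(inbound[p]) for p in links}   # start value = in-link count
for _ in range(50):                          # iterate until values stabilise
    pr = {p: (1 - damping) + damping * sum(pr[q] / len(links[q])
                                           for q in links if p in links[q])
          for p in links}

print({p: round(v, 3) for p, v in pr.items()})
```

With either start value the iteration converges to the same fixed point; the link-count seed just gets there in fewer passes.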
After finding websites that have good metrics, you have to make sure the website is related to your site. For each competitor backlink, try to understand how your competitor got that link. If it was a guest article, send a request to become a contributor as well. If it was a product review by a blogger, contact the writer and offer them a good deal in exchange for a similar review.
As mentioned above, the two versions of the algorithm do not differ fundamentally from each other. A PageRank calculated with the second version of the algorithm has to be multiplied by the total number of web pages to get the corresponding PageRank that the first version would have produced. Even Page and Brin mixed up the two versions in their most popular paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they claim that the first version of the algorithm forms a probability distribution over web pages, with the sum of all pages' PageRanks being one.
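The scaling relation between the two versions can be checked directly. The sketch below assumes a hypothetical three-page graph and the usual damping factor d = 0.85; version 1 uses (1 - d) as the base term and its values sum to N, while version 2 uses (1 - d)/N and forms a probability distribution summing to 1:

```python
# Demonstrates that the two PageRank variants differ only by a factor
# of N (the number of pages). Graph and parameters are illustrative.
d = 0.85
links = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
N = len(links)

def pagerank(base, start, iters=100):
    """Power iteration with base term `base` and uniform start value."""
    pr = {p: start for p in links}
    for _ in range(iters):
        pr = {p: base + d * sum(pr[q] / len(links[q])
                                for q in links if p in links[q])
              for p in links}
    return pr

v1 = pagerank(base=1 - d, start=1.0)          # version 1: values sum to N
v2 = pagerank(base=(1 - d) / N, start=1 / N)  # version 2: values sum to 1

assert abs(sum(v1.values()) - N) < 1e-6
assert abs(sum(v2.values()) - 1) < 1e-6
assert all(abs(v1[p] - N * v2[p]) < 1e-6 for p in links)
```

Because the update rule is linear, scaling the base term and start values by 1/N scales every iterate, and hence the fixed point, by exactly 1/N.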

Back in the ’90s, two students at Stanford named Larry Page and Sergey Brin started pondering how they could make a better search engine that didn’t get fooled by keyword stuffing. They realized that if you could measure each website’s popularity (and then cross index that with what the website was about), you could build a much more useful search engine. In 1998, they published a scientific paper in which they introduced the concept of “PageRank.” This topic was further explored in another paper that Brin and Page contributed to, “PageRank Citation Ranking: Bringing Order to the Web.”
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[21] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
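The random-surfer reading of PageRank can be illustrated with a short Monte Carlo simulation. This is a sketch on a hypothetical three-page graph, not Google's implementation: with probability d the surfer follows a random outlink from the current page, otherwise she jumps to a random page, and the fraction of visits each page receives approximates its probability-normalised PageRank:

```python
import random

# Monte Carlo sketch of the random-surfer model (illustrative graph).
random.seed(0)
d = 0.85                                   # probability of following a link
links = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
pages = list(links)
visits = {p: 0 for p in pages}

page = random.choice(pages)
steps = 200_000
for _ in range(steps):
    visits[page] += 1
    if random.random() < d:
        page = random.choice(links[page])  # follow a random outlink
    else:
        page = random.choice(pages)        # random jump to any page

estimate = {p: visits[p] / steps for p in pages}
print({p: round(v, 3) for p, v in estimate.items()})
```

Pages with more (and stronger) inbound links are visited more often, which is exactly the sense in which "some links are stronger than others."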
Suggesting that this change is really just the equivalent of "resetting" things to the way they were is absurd. nofollow is still being used on outbound links en masse by the most authoritative/trusted sites on the web. Allowing us peons a slight bit of control over our internal juice flow simply let us recoup a small portion of the overall juice we lost when the top-down flow was so dramatically disrupted.
6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.
The better you learn and understand SEO, and the more strides you take to master this seemingly confusing and complex discipline, the more likely you'll be to appear organically in search results. And let's face it, organic search is important to marketing online. Considering that most people don't have massive advertising budgets and don't know the first thing about lead magnets, squeeze pages and sales funnels, being visible is critical to long-term success.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
