There's a lot to learn when it comes to the internet marketing field in general, and the digital ether of the web is a crowded space filled with one know-it-all after another who wants to sell you the dream. However, what many people fail to do at the start, and something that Sharpe learned along the way, is to actually understand what's going on out there in the digital world and how business and e-commerce work in general, before diving in headfirst.
Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people who do not run blog sites (the vast majority of sites!), for example an e-commerce site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing, just because they can't attract links naturally?
Regarding nofollow on content that you don’t want indexed, you’re absolutely right that nofollow doesn’t prevent that, e.g. if someone else links to that content. In the case of the site that excluded user forums, quite a few high-quality pages on the site happened not to have links from other sites. In the case of my feed, it doesn’t matter much either way, but I chose not to throw any extra PageRank onto my feed url. The services that want to fetch my feed url (e.g. Google Reader or Bloglines) know how to find it just fine.
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[40] in addition to its URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
If I’m writing a page about the use of the vCard microformat, it absolutely makes sense for me to link out to the definition where it was originally published; it improves the user experience as well as lending authority to my arguments. As SEOs we often get obsessed with the little things, claiming that it's hard to get links on particular subjects, and that is pretty true, but it's mainly our own selfish reluctance to link out to authoritative content that stops other people extending us the same courtesy.
In this new world of digital transparency, brands have to be very thoughtful about how they engage with current and potential customers. Consumers have an endless amount of data at their fingertips, especially through social media channels, rating and review sites, blogs, and more. Unless brands actively engage in these conversations, they lose the opportunity to help guide their brand message and address customer concerns.
There are plenty of guides to marketing. From textbooks to online video tutorials, you can really take your pick. But, we felt that there was something missing — a guide that really starts at the beginning to equip already-intelligent professionals with a healthy balance of strategic and tactical advice. The Beginner’s Guide to Online Marketing closes that gap.
Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.
I compare the latest Google search results to this: McDonald's is the most popular and is #1 in hamburgers… they don't taste that great, but people still go there. BUT I bet you know a good burger joint down the road from Google that makes awesome burgers, 10X better than McDonald's, but “we” cannot find that place because the owner does not have the resources or budget to market his burgers effectively.
After that, you need to make a choice about how to construct an online presence that helps you achieve that goal. Maybe you need to set up an e-commerce site. If you’re interested in publishing content to drive awareness and subscribers, look into setting up a blog. A simple website or landing page with a lead capture form can help you start developing your brand and generating traffic. A basic analytics platform (like Google Analytics, which is free) can help you start to measure how you are tracking towards your initial goal.
Understand that whatever you're going to do, you'll need traffic. If you don't have any money at the outset, your hands will be tied no matter what anyone tells you. The truth is that you need to drive traffic to your offers if you want them to convert. The pages those offers live on are what we call landing pages or squeeze pages. This is where you're coming into contact with customers, either for the first time or after they get to know you a little bit better.

At the time I was strongly advocating PageRank sculpting by including nofollow on “related product” links. It's interesting to note that my proposed technique would perhaps have worked for a little while, then lost its effectiveness. Eventually I reached the point where my efforts delivered diminishing returns, which was perhaps unavoidable.

Start Value (in this case) is the number of actual links to each “node”. Most people actually set this to 1 to start, but there are two great reasons for using link counts. First, it is a better approximation to start with than giving everything the same value, so the algorithm stabilizes in fewer iterations. Second, it makes it easy to sanity-check my spreadsheet in a moment… so node A has one link in (from page C)
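As a quick illustrative sketch (the three-page graph, the damping value of 0.85, and the iteration count are assumptions, not taken from the spreadsheet), either start value reaches the same fixed point; the link-count start simply tends to get there in fewer passes:

```python
# Sketch: iterate PageRank from two different start values and compare.
# Graph: A -> B, A -> C, B -> C, C -> A (so node A has one link in, from C).
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.85  # illustrative damping factor

def iterate(pr, passes):
    for _ in range(passes):
        pr = {p: (1 - d) + d * sum(pr[s] / len(outs)
                                   for s, outs in links.items() if p in outs)
              for p in links}
    return pr

ones = {p: 1.0 for p in links}                                 # everything starts at 1
counts = {p: float(sum(p in outs for outs in links.values()))  # inbound-link counts
          for p in links}
print(iterate(ones, 30))
print(iterate(counts, 30))  # both converge to the same values
```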


What I have learnt with comments: only allow them if they add value to your blog. I used this approach on one of my main blogs, bpd and me, and it worked. I used to let comments through which were spammy, and the blog only got a Google PageRank of 2 after a year. Learning by mistakes: Google PageRank is always going to be a mystery, and people will try to beat it. They might succeed for a short period, but after that they get caught out, while the people who write good-quality content will be the winners. So keep writing quality content. A question might be: does Google count how many nofollows there are? I wonder.
Internet marketing is not a singular approach to raising interest and awareness in a product. Because of the vast number of platforms the Internet creates, the field encompasses several disciplines. It involves everything from email, to Search Engine Optimization (SEO), to website design, and much more to reach an ever-evolving, ever-growing audience. (See also Web Marketing)
Consider a small web consisting of three pages A, B and C, where page A links to pages B and C, page B links to page C, and page C links to page A. According to Page and Brin, the damping factor d is usually set to 0.85, but to keep the calculation simple we set it to 0.5. The exact value of the damping factor d admittedly has effects on PageRank, but it does not influence the fundamental principles of PageRank. So, we get the following equations for the PageRank calculation:
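PR(A) = 0.5 + 0.5 PR(C)
PR(B) = 0.5 + 0.5 (PR(A) / 2)
PR(C) = 0.5 + 0.5 (PR(A) / 2 + PR(B))

These equations solve to PR(A) = 14/13 ≈ 1.0769, PR(B) = 10/13 ≈ 0.7692 and PR(C) = 15/13 ≈ 1.1538, so the sum of all pages' PageRanks is 3 and thus equals the total number of pages.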
Likewise, ‘nofollowing’ your archive pages on your blog: is this really a bad thing? You can reach the same posts from the ‘tag’ index or the ‘category’ index, so why give weight to a page that is purely navigational? At least the tag and category pages are themed. Giving weight to a page that is themed only by date is crazy and does not really help search engines deliver ‘good’ results (totally leaving aside the duplicate content issues for now).
This is very telling and an important thing to consider. Taking the model of a university paper on a particular subject as an example, you would expect the paper to cite (link to) other respected papers in the same field in order to demonstrate that it is couched in some authority. As PageRank is based on the citation model used in university work, it makes perfect sense to incorporate a “pages linked to” factor into the equation.
The first component of Google's trust has to do with age. Age is more than a number. But it's not just the age when you first registered your website. The indexed age has to do with two factors: i) the date that Google originally found your website, and; ii) what happened between the time that Google found your website and the present moment in time.
Digital marketing is also referred to as 'online marketing', 'internet marketing' or 'web marketing'. The term digital marketing has grown in popularity over time. In the USA online marketing is still a popular term. In Italy, digital marketing is referred to as web marketing. Worldwide digital marketing has become the most common term, especially after the year 2013.[19]
Having a ‘keyword rich’ domain name may lead to closer scrutiny from Google. According to Moz, Google has “de-prioritized sites with keyword-rich domains that aren’t otherwise high-quality. Having a keyword in your domain can still be beneficial, but it can also lead to closer scrutiny and a possible negative ranking effect from search engines—so tread carefully.”
In both versions of my model, I used the total of my initial estimate to check my math was not going south. After every iteration, the total PageRank remains the same. This means that PageRank doesn't leak! 301 redirects cannot simply bleed PageRank, otherwise the algorithm would not remain stable. On a similar note, pages with zero outbound links can't be “fixed” by dividing by something other than zero. They do need to be fixed, but not by diluting the overall PageRank. I can maybe look at these cases in more depth if there is some demand.
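A quick way to see why the total cannot drift, assuming every page has at least one outbound link: each iteration sets PR(X) = (1 - d) + d × (sum of PR(Y)/C(Y) over the pages Y linking to X). Summing over all N pages, every page's PageRank is split across its outlinks and handed on exactly once, so the inbound terms add back up to the previous total T, and the new total is N(1 - d) + dT. Starting from T = N, the total stays at N forever, which is exactly the constant-total check above.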

For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”
Finally, it’s critical you spend time and resources on your business’s website design. When these aforementioned customers find your website, they’ll likely feel deterred from trusting your brand and purchasing your product if they find your site confusing or unhelpful. For this reason, it’s important you take the time to create a user-friendly (and mobile-friendly) website.

I suppose for those people, including myself, who just keep trying to do our best and succeed, we simply need to keep trusting that Google is doing all it can to weed out irrelevant content and deliver the quality goods with changes such as this. Meanwhile the “uneducated majority” will just have to keep getting educated or get out of the game, I suppose.


Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. In the presence of damping, Page A effectively links to all pages in the web, even though it has no outgoing links of its own.
It’s important to monitor the backlinks your site is accumulating. First, you can verify that your outreach is working. Second, you can catch any shady backlinks you pick up. Domains from Russia and Brazil are notorious origins of spam, so it can be wise to disavow links from sites originating in those regions through Google Search Console as soon as you find them – even if they haven’t impacted your site… yet.
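For reference, the disavow file Google Search Console accepts is plain text with one URL or domain per line; a minimal sketch (the domains here are made up):

```
# Lines starting with "#" are comments.
domain:spammy-directory.example
https://link-farm.example/outbound/page1.html
```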
Because of the size of the actual web, the Google search engine uses an approximate, iterative computation of PageRank values. This means that each page is assigned an initial starting value and the PageRanks of all pages are then calculated in several computation cycles based on the equations determined by the PageRank algorithm. The iterative calculation shall again be illustrated by our three-page example, whereby each page is assigned a starting PageRank value of 1.
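A minimal sketch of that iteration in Python (the three-page graph and d = 0.5 follow the example above; the iteration count is an arbitrary choice):

```python
# Iterative PageRank for the example: A -> B, A -> C, B -> C, C -> A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.5
pr = {page: 1.0 for page in links}  # starting PageRank value of 1 per page

for _ in range(20):  # a handful of iterations already converges here
    pr = {page: (1 - d) + d * sum(pr[src] / len(outs)
                                  for src, outs in links.items() if page in outs)
          for page in links}

print({p: round(v, 4) for p, v in pr.items()})
# Approaches PR(A) = 1.0769, PR(B) = 0.7692, PR(C) = 1.1538;
# the total stays at 3, the number of pages.
```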
Google wasn’t happy with the Pandora’s Box it had opened. It began to fight back, with its most famous action against a network known as SearchKing, penalizing the site and some of those in the network with PageRank score reductions or actual removal from Google. SearchKing sued Google. Google won, a judge ruling that its search results were entitled to First Amendment protection as opinions.
Keeping up with the latest trends is a must for any business, but ignoring technology trends in the digital world puts your very ability to stay in business at risk. Unfortunately, those trends (while easy enough to find mentioned online) are rarely explained well. There seems to be this mistaken idea that anyone who has an interest or need in the practice will just magically pick up the jargon. As we all know, that is one superpower that doesn’t exist in the real world.
PageRank always was and remains only one part of the Google search algorithm, the system that determines how to rank pages. There are many other ranking factors that are also considered. A high PageRank score did NOT mean that a page would rank well for any topic. Pages with lower scores could beat pages with higher scores if they had other factors in their favor.

After finding websites that have good metrics, you have to make sure the website is related to your site. For each competitor backlink, try to understand how your competitor got that link. If it was a guest article, send a request to become a contributor as well. If it was a product review by a blogger, contact the writer and offer them a good deal in exchange for a similar review.

Backlinks are an essential part of the SEO process. They help search bots crawl your site and rank it correctly according to its content. Each backlink is a piece of the ranking puzzle, which is why every website owner wants to earn as many backlinks as possible to improve their site's ranking. A backlink is a type of citation or hyperlink used in text: if a person says “to be or not to be,” he or she is citing Shakespeare’s character, Hamlet.


Great post. I’m posting a link back to this article from our blog along with some comments. I do have a question. In your article, you write: “The only place I deliberately add a nofollow is on the link to my feed, because it’s not super-helpful to have RSS/Atom feeds in web search results.” Yet when I look at this article, I notice that the comment links are “external, nofollow”. Is there a reason for that?
For the most part, the sophistication in this system is simplified here. I still have trouble understanding how link value flows within my pages without creating a loop. For example, pages A, B and C link to each other from all angles, so the link points should be shared. But in this loop formula, page B does not link to A; it just goes to C and loops. How does this affect navigation bars? As you know, they are meant to stay at the top and link to all pages. I’m lost.

The biggest problem that most people have when trying to learn anything to do with driving more traffic to their website or boosting their visibility across a variety of online mediums, is that they try to do the least amount of work for the greatest return. They cut corners and they take shortcuts. Because of that, they fail. Today, if you're serious about marketing anything on the web, you have to gain Google's trust.


Thanks a lot for all of those great tips you handed out here. I immediately went to work applying the strategies that you mentioned. I will keep you posted on my results. I have been offering free SEO services to all of my small business bookkeeping clients as a way of helping them to grow their businesses. Many of them just don’t have the resources required to hire an SEO guru to help them but they need SEO bad. I appreciate the fact that you share your knowledge and don’t try to make it seem like it’s nuclear science in order to pounce on the innocent. All the best to you my friend!
A generalization of PageRank for the case of ranking two interacting groups of objects was described in [32]. In applications it may be necessary to model systems having objects of two kinds, where a weighted relation is defined on object pairs. This leads to considering bipartite graphs. For such graphs, two related positive or nonnegative irreducible matrices corresponding to the vertex partition sets can be defined. One can compute rankings of objects in both groups as eigenvectors corresponding to the maximal positive eigenvalues of these matrices. Normed eigenvectors exist and are unique by the Perron or Perron–Frobenius theorem. Example: consumers and products, where the relation weight is the product consumption rate.
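As a sketch of one natural construction fitting that description (the weight matrix below is invented; rows are consumers, columns are products), the two related matrices are W·Wᵀ and Wᵀ·W, and each group is ranked by its principal eigenvector, found here by power iteration:

```python
import numpy as np

# Invented consumption-rate matrix: 2 consumers x 3 products.
W = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0]])

def principal_eigvec(M, iters=100):
    """Power iteration: converges to the eigenvector of the largest eigenvalue."""
    v = np.ones(M.shape[0])
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    return v

print(principal_eigvec(W @ W.T))  # consumer ranking
print(principal_eigvec(W.T @ W))  # product ranking
```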

Thanks for sharing this, Matt. I’m happy that you took the time to do so considering that you don’t have to. What I mean is, in an ideal world, there should be no such thing as SEO. It is the SE’s job to bring the right users to the right sites and it is the job of webmasters to cater to the needs of the users brought into their sites by SEs. Webmasters should not be concerned of bringing the users in themselves. (aside from offsite or sponsored marketing campaigns) The moment they do, things start to get ugly because SEs would now have to implement counter-measures. (To most SEO tactics) This becomes an unending spiral. If people only stick to their part of the equation, SEs will have more time to develop algorithms for making sure webmasters get relevant users rather than to develop algorithms for combating SEOs to ensure search users get relevant results. Just do your best in providing valuable content and Google will try their best in matching you with your users. Don’t waste time trying to second guess how Google does it so that you can present yourself to Google as having a better value than you really have. They have great engineers and they have the code—you only have a guess. At most, the SEO anyone should be doing is to follow the webmasters guidelines. It will benefit all.
8. Technical SEO. Technical SEO is one of the most intimidating portions of the SEO knowledge base, but it’s an essential one. Don’t let the name scare you; the most technical elements of SEO can be learned even if you don’t have any programming or website development experience. For example, you can easily learn how to update and replace your site’s robots.txt file, and with the help of an online template, you should be able to put together your sitemap efficiently.
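For instance, here is a small sketch using Python's standard library to sanity-check a robots.txt rule set before you upload it (the URLs and the Disallow path are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, including a Sitemap hint for crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://www.example.com/"))           # True
print(parser.can_fetch("*", "https://www.example.com/private/x"))  # False
```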

Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
Conversion rate optimization is all about testing. Many companies get too bogged down in design and what they think looks best and will convert. At the end of the day, you don’t know until you test. At IMI, we have the tools, technology, and expertise to not only build well-optimized web pages but to test them once they go live. Our conversion rate optimization can not only save our clients’ money but generate millions in revenue.
No PageRank would ever escape from the loop, and as incoming PageRank continued to flow into the loop, eventually the PageRank in that loop would reach infinity. Infinite PageRank isn’t that helpful 🙂 so Larry and Sergey introduced a decay factor–you could think of it as 10-15% of the PageRank on any given page disappearing before the PageRank flows along the outlinks. In the random surfer model, that decay factor is as if the random surfer got bored and decided to head for a completely different page. You can do some neat things with that reset vector, such as personalization, but that’s outside the scope of our discussion.
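A small sketch of that reset-vector idea (the graph, damping value, and iteration count are illustrative; this uses the probability-distribution form of PageRank, so each result sums to 1):

```python
# With probability d the surfer follows a link; with probability (1 - d)
# they jump according to the reset distribution. A uniform reset gives
# classic PageRank; skewing it toward one page gives simple personalization.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.85

def pagerank(reset, iters=100):
    pr = {p: 1.0 / len(links) for p in links}
    for _ in range(iters):
        pr = {p: (1 - d) * reset[p]
                 + d * sum(pr[s] / len(outs)
                           for s, outs in links.items() if p in outs)
              for p in links}
    return {p: round(v, 4) for p, v in pr.items()}

print(pagerank({p: 1 / 3 for p in links}))       # uniform reset: classic PageRank
print(pagerank({"A": 1.0, "B": 0.0, "C": 0.0}))  # personalized: mass shifts toward A
```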
Backlinks are important for a number of reasons. The quality and quantity of pages backlinking to your website are some of the criteria used by search engines like Google to determine your ranking on their search engine results pages (SERP). The higher you rank on a SERP, the better for your business as people tend to click on the first few search results Google, Bing or other search engines return for them.
PR(A) = (1 - d) / N + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where N is the total number of all pages on the web, T1…Tn are the pages that link to page A, and C(T) is the number of outbound links on page T. The second version of the algorithm, indeed, does not differ fundamentally from the first one. Regarding the Random Surfer Model, the second version's PageRank of a page is the actual probability for a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.
Some brilliant tips and advice here. I am curious: you mention the directory sites to submit to, and I notice a lot of my competitors are on such sites, but a lot of these sites want links back, which would mean I would need pages of links on my site, and that is something I don’t see on competitors’ sites. How do they manage that? Would it all be paid for, or is there a trick?
Despite this, many people seem to get it wrong! In particular, “Chris Ridings of www.searchenginesystems.net” has written a paper entitled “PageRank Explained: Everything you’ve always wanted to know about PageRank”, pointed to by many people, that contains a fundamental mistake early on in the explanation! Unfortunately this means some of the recommendations in the paper are not quite accurate.
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmasters website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
In my experience this means (the key words are “not the most effective way”) that a page not scored by Google (“e.g. my private link” – password-protected, disallowed via robots.txt and/or a noindex meta robots tag) is not factored into anything, whether or not links to it use the rel=”nofollow” attribute… because Google can’t factor in something it isn’t allowed to see.
Today, with nearly half the world's population wired to the internet, the ever-increasing connectivity has created global shifts in strategic thinking and positioning, disrupting industry after industry, sector after sector. Seemingly, with each passing day, some new technological tool emerges that revolutionizes our lives, further deepening and embedding our dependence on the world wide web. 