My favorite tool to spy on my competitors' backlinks is called Monitor Backlinks. It allows you to add your four most important competitors. From then on, you get a weekly report containing all the new links they have earned. Inside the tool, you get more insights about these links and can sort them by their value and other SEO metrics. A useful feature is that all the links my own website already has are highlighted in green, as in the screenshot below.
I’ve seen so many cases of webmasters nofollowing legitimate external links that it isn’t funny. Any external link on their site is nofollowed, even when quoting text on the other site. IMO, the original purpose of nofollow has long been defeated in certain industries. As more webmasters keep doing everything they can to preserve their PageRank, the effectiveness of nofollow will continue to erode.
In an effort to manually control the flow of PageRank among pages within a website, many webmasters practice what is known as PageRank Sculpting[65]—which is the act of strategically placing the nofollow attribute on certain internal links of a website in order to funnel PageRank towards those pages the webmaster deemed most important. This tactic has been used since the inception of the nofollow attribute, but may no longer be effective since Google announced that blocking PageRank transfer with nofollow does not redirect that PageRank to other links.[66]
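In markup terms, sculpting meant adding rel="nofollow" to the internal links a webmaster wanted to withhold PageRank from. A hypothetical navigation fragment (the URLs are illustrative) might look like this:

```html
<!-- Links the webmaster wants to "funnel" PageRank toward: left as-is -->
<a href="/products/">Products</a>
<a href="/pricing/">Pricing</a>

<!-- Low-priority pages: nofollowed in an attempt to withhold PageRank -->
<a href="/privacy-policy/" rel="nofollow">Privacy policy</a>
<a href="/login/" rel="nofollow">Log in</a>
```

As the paragraph notes, after Google's change the PageRank withheld by the nofollowed links is simply discarded rather than redistributed, so this markup no longer concentrates anything.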
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
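For illustration, a minimal robots.txt along these lines might look like the following (the paths and hostname are made-up examples, not recommendations):

```
# Hypothetical robots.txt served at https://www.example.com/robots.txt
# Applies to all crawlers; blocks sections unlikely to be useful in results
User-agent: *
Disallow: /checkout/
Disallow: /search-results/

# Note: this file does NOT cover subdomains. A subdomain such as
# shop.example.com needs its own file at https://shop.example.com/robots.txt
```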

Unfortunately, SEO is also a slow process. You can make “quick wins” in markets which are ill-established using SEO, but the truth is that the vast majority of useful keyphrases (including long-tail keyphrases) in competitive markets will already have been optimized for. It is likely to take a significant amount of time to get to a useful place in search results for these phrases. In some cases, it may take months or even years of concentrated effort to win the battle for highly competitive keyphrases.
Now, how much weight does PageRank carry? Like almost every other part of the algorithm, it’s questionable. If we listed all the ranking factors, I don’t suspect it would be in the top 5, but it’s important to remember that the key to ranking well is to be LESS IMPERFECT than your competition. I.e., to have more of the right things that send the right signals in the right places, so that Google sees you as a better, more relevant candidate for the top three on page one. If you and your competitor have both optimized (on-page and technically) for the same keyword phrase perfectly, PR could be the deal breaker that pushes your blue link an inch up.
That sort of solidifies my thoughts that Google has always liked and still likes the most natural sites best – so to me it seems like it’s best not to stress over nofollow and dofollow – regarding on-site and off-site links – and just link to sites you really think are cool and likewise comment on blogs you really like (and leave something useful)… if nothing else, if things change with nofollow again, you’ll have all those comments floating around out there, so it can’t hurt. And besides, you may get some visitors from them if the comments are half-decent.
As Google becomes more and more sophisticated, one of the major cores of their algorithm, the one dealing with links (called Penguin) aims to value natural, quality links and devalue those unnatural or spammy ones. As a search engine, if they are to stay viable, they have to make sure their results are as honest and high-quality as possible, and that webmasters can't manipulate those results to their own benefit.
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
Suggesting that this change is really just the equivalent of “resetting” things to the way they were is absurd. nofollow is still being used on outbound links en masse by the most authoritative/trusted sites on the web. Allowing us peons a slight bit of control over our internal juice flow simply let us recoup a small portion of the overall juice we lost when the top-down flow was so dramatically disrupted.

TrustRank takes a website's foundational backlinks into consideration. Search engines surface reliable, trustworthy sites more quickly and place them at the top of the SERP. All the doubtful websites end up somewhere at the bottom of the rankings, where you'll only find them if you go looking. As a rule, people take their information from the first links and stop searching if they have found nothing in the first 20 results. Your website may well have the required information, service, or goods, but for lack of authority, Internet users will not find it unless you have good foundational backlinks. What are the backlinks we call foundational? They are the branded, non-optimized backlinks on authority websites.
Establishment of customer exclusivity: A list of customers and customer details should be kept in a database for follow-up, and selected customers can be sent selected offers and promotions of deals related to the customer's previous buying behaviour. This is effective in digital marketing as it allows organisations to build up loyalty over email.[22]

PageRank always was and remains only one part of the Google search algorithm, the system that determines how to rank pages. There are many other ranking factors that are also considered. A high PageRank score did NOT mean that a page would rank well for any topic. Pages with lower scores could beat pages with higher scores if they had other factors in their favor.
If I’m writing a page about the use of the vCard microformat on a page, it absolutely makes sense for me to link out to the definition where it was originally published; it improves user experience as well as lending authority to my arguments. Often as SEOs we get obsessed with the little things, claiming that it’s hard to get links on particular subjects, and that is pretty true, but it’s mainly our own selfishness in linking out to authority content that prevents other people giving us the same courtesy.

This year, for the first time, Google stated that user experience would be a core part of gaining rankings for mobile websites. A poorer user experience would send your site hurtling down the rankings. This appeared to come as a shock to many in the SEO community and despite assurances that content was still king – many seemed to feel that this ...


Digital marketing is probably the fastest-changing marketing field out there: new tools are being built, more platforms emerge and more channels need to be included in your marketing plan. How not to get overwhelmed while staying on top of the latest marketing trends? Here are a few tools that help you scale and automate some parts of your marketing routine, making you a more productive and empowered marketer.
Digital marketing became more sophisticated in the 2000s and the 2010s, when[13][14] the proliferation of devices capable of accessing digital media led to sudden growth.[15] Statistics produced in 2012 and 2013 showed that digital marketing was still growing.[16][17] With the development of social media in the 2000s, such as LinkedIn, Facebook, YouTube and Twitter, consumers became highly dependent on digital electronics in daily life. Consequently, they came to expect a seamless user experience across different channels when searching for product information. This change in customer behavior drove the diversification of marketing technology.[18]

However, with all of these so-called modern conveniences to life, where technology's ever-pervading presence has improved even the most basic tasks for us such as hailing a ride or ordering food or conducting any sort of commerce instantly and efficiently, many are left in the dark. While all of us have become self-professed experts at consuming content and utilizing a variety of tools freely available to search and seek out information, we're effectively drowning in a sea of digital overload.
“So what happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each.”
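The arithmetic in the quote can be sketched in a few lines (a toy illustration of the before/after behaviour described, not Google's actual code; the decay factor is ignored, as in the quote):

```python
# A page with 10 PageRank points and 10 outgoing links, 5 of them nofollowed.
points = 10.0
total_links = 10
nofollowed = 5
followed = total_links - nofollowed

# Original behaviour: nofollowed links were excluded from the denominator,
# so the 10 points were split among only the 5 followed links.
old_flow_per_link = points / followed        # 2 points per followed link

# Later behaviour: all 10 links count in the denominator, but the share
# assigned to nofollowed links simply evaporates instead of being reassigned.
new_flow_per_link = points / total_links     # 1 point per followed link

print(old_flow_per_link, new_flow_per_link)
```

The net effect: the page passes half as much PageRank through its followed links as it did under the original rule, which is why sculpting stopped paying off.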
Matteo Pasquinelli[44] reckons the basis for the belief that PageRank has a social component lies in the idea of the attention economy. In an attention economy, value is placed on products that receive a greater amount of human attention, and the results at the top of the PageRank garner a larger amount of focus than those on subsequent pages. The outcomes with higher PageRank will therefore enter human consciousness to a larger extent. These ideas can influence decision-making, and the actions of the viewer have a direct relation to the PageRank. They possess a higher potential to attract a user's attention, as their location increases the attention economy attached to the site. With this location they can receive more traffic, and their online marketplace will make more sales. The PageRank of these sites allows them to be trusted, and they are able to parlay this trust into increased business.
DisabledGO, an information provider for people with disabilities in the UK and Ireland, hired Agency51 to implement an SEO migration strategy to move DisabledGO from an old platform to a new one. By applying 301 redirects to old URLs, transferring metadata, setting up Google webmaster tools, and creating a new sitemap, Agency51 was able to successfully transfer DisabledGO to a new platform while keeping their previous SEO power alive. Additionally, they were able to boost visitor numbers by 21% year over year, and the site restructuring allowed DisabledGO to rank higher than competitors. Their case study is available on SingleGrain.com.
Hey Brian, this is an absolutely fabulous post! It caused me to come out of lurking mode on the Warrior Forum and post a response there as well. Only my second post in 4 years, it was that kickass… I’ve signed up to your newsletter on the strength of this. You have a new follower on Twitter as well! I mean what I said on the Warrior Forum… Since 2001 I’ve worked in SEO commercially, freelance and now from the comfort of my own home – I have bought IM ebooks with less useful information in them than covered by any one of your 17. You might not please everyone in our industry giving some of those secrets away for free though! All power to you my friend, you deserve success and lots of it!
Internet Marketing Inc. provides integrated online marketing strategies that help companies grow. We think of ourselves as a business development consulting firm that uses interactive marketing as a tool to increase revenue and profits. Our management team has decades of combined experience in online marketing as well as graduate level education and experience in business and finance. That is why we focus on creating integrated online marketing campaigns designed to maximize your return on investment.
A generalization of PageRank for the case of ranking two interacting groups of objects was described in [32]. In applications, it may be necessary to model systems having objects of two kinds, where a weighted relation is defined on object pairs. This leads to considering bipartite graphs. For such graphs, two related positive or nonnegative irreducible matrices corresponding to the vertex partition sets can be defined. One can compute rankings of objects in both groups as eigenvectors corresponding to the maximal positive eigenvalues of these matrices. Normed eigenvectors exist and are unique by the Perron–Frobenius theorem. Example: consumers and products, where the relation weight is the product consumption rate.
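A minimal sketch of this bipartite ranking: given a weight matrix W on consumer–product pairs, the two partition-set matrices are W·Wᵀ and Wᵀ·W, and their principal eigenvectors rank each group. The matrix values here are made-up consumption rates, and plain power iteration stands in for a proper eigensolver:

```python
import numpy as np

# Hypothetical consumption-rate matrix W: rows = 4 consumers, cols = 3 products.
W = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 4.0],
              [2.0, 0.0, 1.0]])

# The two related nonnegative matrices defined on the vertex partition sets.
A = W @ W.T   # consumer-consumer relation
B = W.T @ W   # product-product relation

def principal_eigenvector(M, iterations=200):
    """Power iteration: converges to the eigenvector of the maximal
    positive eigenvalue for a nonnegative irreducible matrix."""
    v = np.ones(M.shape[0])
    for _ in range(iterations):
        v = M @ v
        v /= np.linalg.norm(v)   # keep the vector normed
    return v

consumer_rank = principal_eigenvector(A)
product_rank = principal_eigenvector(B)
```

By Perron–Frobenius, both vectors come out with strictly positive entries, so every consumer and product gets a usable rank score.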

There are simple and fast random walk-based distributed algorithms for computing the PageRank of nodes in a network.[33] They present a simple algorithm that takes O(log n/ε) rounds with high probability on any graph (directed or undirected), where n is the network size and ε is the reset probability (1 − ε is also called the damping factor) used in the PageRank computation. They also present a faster algorithm that takes O(√(log n)/ε) rounds in undirected graphs. Both algorithms are scalable, as each node processes and sends only a small (polylogarithmic in n, the network size) number of bits per round.
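As a rough single-machine illustration of the underlying random-walk idea (not the distributed algorithm from [33] itself), PageRank can be estimated by running many short walks that reset with probability ε and counting node visits; the graph below is hypothetical:

```python
import random
from collections import Counter

def random_walk_pagerank(graph, epsilon=0.15, walks_per_node=200, max_steps=100):
    """Monte Carlo PageRank estimate: from every node, run several walks
    that terminate with probability epsilon at each step (or at a dead end);
    normalized visit counts approximate the PageRank distribution."""
    visits = Counter()
    nodes = list(graph)
    for start in nodes:
        for _ in range(walks_per_node):
            node = start
            for _ in range(max_steps):
                visits[node] += 1
                if random.random() < epsilon or not graph[node]:
                    break  # walk resets: stop and credit the visits so far
                node = random.choice(graph[node])
    total = sum(visits.values())
    return {n: visits[n] / total for n in nodes}

# Toy adjacency-list graph (made up for illustration).
graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
random.seed(0)  # deterministic for the example
ranks = random_walk_pagerank(graph)
```

The distributed versions cited above get their speed by running these walks in parallel across the network's own nodes, exchanging only small messages per round.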
Question, when it comes to backlinks: would it be useful to guest blog, or to agree to create pages with backlinks with other business owners in your community? Example: our window replacement company used a local photography company for headshots. A backlink to the photographer's website on our “staff” page, with photos of the headshots, for reference. Then the photographer posts examples of company headshots on her website with a backlink to our website for reference. Is this a good way of going about getting more backlinks?
For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”
Wow Brian…I’ve been making and promoting websites full-time since 2006 and just when I thought I’ve seen it all, here you are introducing me to all these innovative ways of getting backlinks that I wasn’t aware of before. I never subscribe to newsletters, but yours is just too good to say no to! Thanks very much for this information. Off to read your other posts now…
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Great post, it is very valuable info indeed. I just want to ask: I am trying to market a business-to-business website, and I find it quite hard to market the website in the appropriate categories as it is specialized. Many of the websites I am designing are FCA regulated, so when it comes to advertising or giving advice, I am limited as to what I can and can’t do/say. What business-to-business websites do you recommend for backlinking specialist websites? I find that I am also limited in the social media area, and it’s just LinkedIn that helps.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
youfoundjake, those would definitely be the high-order bits. The fact that no one noticed this change means (to me) even though it feels like a really big shift, in practice the impact of this change isn’t that huge. By the way, I have no idea why CFC flagged you, but I pulled your comment out of the Akismet bin. Maybe some weird interaction of cookies with WordPress caching? Sorry that happened.

Thanks for the info on nofollow and PageRank. It makes sense that this will always be a moving target, lest everyone eventually game the system until it’s worthless, but at the same time it’s worth it to know a few tricks. I still have open concerns about how freshness of content factors in; the only time I’m ever annoyed by search results these days is when the only links available (on the first page at least) are articles from 4 years ago.
I just wanted to thank you for the awesome email of information. It was so awesome to see the results I have gotten and the results that your company has provided for other companies. Truly remarkable. I feel so blessed to be one of your clients. I do not feel worthy, but I do feel very blessed and appreciative to have been a client for over 5 years now. My business would not be where it is today without you, your company and team. I sure love how you are dedicated to quality. I cannot wait to see what the next 5 years bring with 10 years of Internet Marketing Ninjas as my secret weapon. John B.
Private corporations use Internet marketing techniques to reach new customers by providing easy-to-access information about their products. The most important element is a website that informs the audience about the company and its products, but many corporations also integrate interactive elements like social networking sites and email newsletters.
As mentioned above, the two versions of the algorithm do not differ fundamentally from each other. A PageRank calculated using the second version of the algorithm has to be multiplied by the total number of web pages to get the corresponding PageRank that would have been calculated using the first version. Even Page and Brin mixed up the two algorithm versions in their most popular paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they claim the first version of the algorithm forms a probability distribution over web pages, with the sum of all pages' PageRanks being one.
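The relationship between the two versions can be checked with a toy power-iteration implementation (a sketch under assumed conventions; the function and the three-page example graph are made up, and the graph has no dangling pages):

```python
def pagerank(links, d=0.85, n_iter=50, probabilistic=True):
    """Tiny power-iteration PageRank. `links` maps page -> list of outlinks.
    probabilistic=True uses the (1-d)/N damping term (second version,
    scores sum to 1); probabilistic=False uses (1-d) (first version,
    scores sum to N)."""
    pages = list(links)
    n = len(pages)
    pr = {p: (1.0 / n if probabilistic else 1.0) for p in pages}
    base = (1 - d) / n if probabilistic else (1 - d)
    for _ in range(n_iter):
        pr = {p: base + d * sum(pr[q] / len(links[q])
                                for q in pages if p in links[q])
              for p in pages}
    return pr

# Hypothetical three-page web.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
v2 = pagerank(links, probabilistic=True)    # sums to ~1
v1 = pagerank(links, probabilistic=False)   # sums to ~3
```

Running both confirms the statement above: each page's score in the first version equals its second-version score multiplied by N (here, 3).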
Also, given that the original reason for implementing the ‘nofollow’ tag was to reduce comment spam (something it really hasn’t had a great effect in combatting), the real question I have is why they ever took any notice of nofollow on internal links in the first place. It seems to me that in this case they made a rod for their own back.
Brunson talks about this reverse engineering in his book called, Dot Com Secrets, a homage to the internet marketing industry, and quite possibly one of the best and most transparent books around in the field. Communication is what will bridge the divide between making no money and becoming a massive six or seven-figure earner. Be straight with people and learn to communicate effectively and understand every stage of the process and you'll prosper as an internet marketer.
Ian Rogers first used the Internet in 1986 sending email on a University VAX machine! He first installed a webserver in 1990, taught himself HTML and perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years and has also been known to do a little Jive dancing.