Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.
Thanks for the article (and the lead-off links — they were good info too), but I didn’t quite get from the article whether there is an actual penalty from Google for sculpting, or whether it’s just bad practice. And to echo what someone else asked: is it WORTH actually undoing this type of work on websites SEOs have worked on, or should we simply change the way we work with new sites?
Getting unique and authoritative links is crucial for higher rankings in the SERPs and improving your SEO. Google’s algorithm for evaluating links has evolved in recent years, making it more challenging to get high-quality backlinks. External links still matter and aren’t obsolete, so start working on strategies to get valuable backlinks and improve your search visibility.
For example, it makes a much bigger difference to make sure that people (and bots) can reach the pages on your site by clicking links than it ever did to sculpt PageRank. If you run an e-commerce site, another example of good site architecture would be putting products front-and-center on your web site vs. burying them deep within your site so that visitors and search engines have to click on many links to get to your products.
On a blog, the PageRank should go to the main article pages. Now it just gets “evaporated” if you use nofollow, or scattered to all the far-flung nooks and crannies, which means Google will not be able to see the wood for the trees. The vast majority of a site’s overall PageRank will now reside in the long tail of useless pages, such as commenters’ profile pages. This can only make it harder for Google to serve up the most relevant pages.
There are also many keyword research tools (some free and some paid) that claim to take the effort out of this process. A popular tool for first timers is Traffic Travis, which can also analyse your competitors’ sites for their keyword optimization strategies and, as a bonus, it can deliver detailed analysis on their back-linking strategy, too. You can also use Moz.com’s incredibly useful keyword research tools – they’re the industry leader, but they come at a somewhat higher price.
where N is the total number of all pages on the web. The second version of the algorithm does not, in fact, differ fundamentally from the first one. In terms of the Random Surfer Model, a page’s PageRank in the second version is the actual probability of a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages’ PageRanks is one.
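The probability-distribution property can be checked with a small sketch of the iterative calculation. This is a minimal illustration under assumed inputs — a three-page toy graph and a damping factor of 0.85 — not Google’s actual implementation:

```python
# Minimal iterative PageRank sketch. The graph and damping factor d = 0.85
# are illustrative assumptions for this example.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    n = len(links)
    ranks = {page: 1.0 / n for page in links}  # start from a uniform distribution
    for _ in range(iterations):
        new_ranks = {page: (1.0 - d) / n for page in links}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = d * ranks[page] / len(outlinks)  # rank is split across outlinks
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(round(sum(ranks.values()), 6))  # the ranks sum to 1, a probability distribution
```

Because every page here has at least one outlink, the total rank stays exactly 1 at each iteration, matching the probability-distribution interpretation above.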
If I was able to write a blog post that was popular and got lots of comments, then any links I put in the body text would be devalued with each additional comment — even with nofollow on the commenters’ links. So it would seem that, in some perverse way, the more popular a page is (by comments), the less PageRank it will be passing. I would have to hope that its inbound links grow faster than the comments it receives, a situation that is unlikely to occur.
As Rogers pointed out in his classic paper on PageRank, the biggest takeaway for us about the eigenvector piece is that it’s a type of math that lets you work with multiple moving parts. “We can go ahead and calculate a page’s PageRank without knowing the final value of the PR of the other pages. That seems strange but, basically, each time we run the calculation we’re getting a closer estimate of the final value. So all we need to do is remember each value we calculate and repeat the calculations lots of times until the numbers stop changing much.”
The numbers didn’t quite sit right with me because there didn’t seem to be enough juicy inbound links to the winning page. Then I noticed that two key links were missing from the 10 node chart with the PageRank metrics on it when compared to the previous chart without the metrics. The two missing links are the two coming from node 2 to node 1. Suddenly it all made sense again and it was obvious why that page won.
Yes, the more links on a page, the smaller the amount of PageRank it can pass on to each, but that was the case before as well. As for what happens to the ‘missing’ PageRank: if this is the case all over the Internet — and it will be — the total amount of PageRank flow is reduced equally everywhere, so you don’t need as much PageRank flowing to your good links to maintain your relative position.
Two weeks ago I changed a few internal anchor text links to an HTML SELECT element in order to save some space in the menu bar. Today, when I looked at Google’s cached (text-only) version of my site, I realized that none of the links in the HTML SELECT element can be followed. So I understand that Googlebot doesn’t follow these links and obviously there’s no inbound ‘link juice’. Is that so?
Hi Dean! Thanks for the ideas! They are awesome! However, I have a serious doubt about the scholarship link idea. I’ve checked a few of those .edu sites, and now that so many people have followed your tips, those .edu sites have TONS of links to niche sites. Even if the link comes from a high-DA site, don’t you think it might look weird in the eyes of Google? I don’t know whether it might be dangerous to have a link from the same page as hundreds of low-quality sites (not all of them, but some for sure). What do you think? Thanks!
PageRank as a visible score has been dying a slow death since around 2010, I’d say. Pulling it from the Google Toolbar makes it official and puts the final nail in the visible PageRank score’s coffin. Few people were still viewing it within Internet Explorer, itself a deprecated browser. The real impact of dropping it from the toolbar is that third parties can no longer find ways to pull those scores automatically.
Nashville Grant, here’s the mental model I’d employ: search engines want to return great content. If you make such a fantastic site that all the web has heard of you, search engines should normally reflect that fact and return your site. A lot of bad SEO happens because people say “I’ll force my way to the top of Google first, and then everyone will find out about my site.” Putting rankings before the creation of a great site is in many ways putting the cart before the horse. Often the search rankings follow from the fact that you’re getting to be well-known on the web completely outside the sphere of search. Think about sites like Twitter and Facebook: they succeeded by chasing a vision of what users would want. In chasing after that ideal of user happiness and satisfaction, they became the sort of high-quality sites that search engines want to return, because we also want to return what searchers will find useful and love. By chasing a great user experience above search rankings, many sites turn out to be what search engines would want to return anyway.
Get a link to your pages from a high-PR page and, yes, some of that PageRank importance is transmitted to your page. But that doesn’t take into account the context of the link — the words in the link, the anchor text. If you don’t understand anchor text, my post from last month, Google Now Reporting Anchor Text Phrases, will take you by the hand and explain it further.

There is another way to gain quality backlinks to your site, in addition to related site themes: anchor text. When a link incorporates a keyword into the text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the most underestimated resources a webmaster has. Instead of using words like "click here", which probably won't relate in any way to your website, using the words "Please visit our tips page for how to nurse an orphaned kitten" is a far better way to utilize a hyperlink. A good tool for finding your backlinks and the text being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that your site is being linked to from another website but the anchor text is not being utilized properly, you should request that the website change the anchor text to something incorporating relevant keywords. This will also help boost your quality backlinks score.
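As a rough sketch, anchor text can be pulled out of a page with nothing but Python's standard library. The HTML snippet and the list of "generic" phrases below are illustrative assumptions, not the tool mentioned above:

```python
# Collect (href, anchor text) pairs from HTML and flag generic anchors.
# The sample page and the generic-phrase list are made-up examples.
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collects (href, anchor text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

page = ('<p><a href="https://example.com/kitten-care">click here</a> or read '
        '<a href="https://example.com/kitten-care">how to nurse an orphaned kitten</a></p>')
parser = AnchorTextParser()
parser.feed(page)
for href, text in parser.links:
    generic = text.lower() in {"click here", "here", "read more"}
    print(f"{text!r}: {'generic' if generic else 'descriptive'}")
```

Running this flags "click here" as generic and the keyword-rich anchor as descriptive — the same distinction drawn in the paragraph above.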


All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
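The root page -> related topic listing -> specific topic idea can be sketched as a click-depth check over a toy site map; the page paths here are made-up examples:

```python
# Toy site hierarchy (root -> topic listing -> specific page) with a
# breadth-first click-depth check. The paths are illustrative assumptions.
site = {
    "/": ["/kittens/", "/puppies/"],
    "/kittens/": ["/kittens/nursing", "/kittens/feeding"],
    "/puppies/": ["/puppies/training"],
}

def click_depth(site, start="/"):
    """Breadth-first search from the root, counting clicks to reach each page."""
    depths = {start: 0}
    queue = [start]
    while queue:
        page = queue.pop(0)
        for child in site.get(page, []):
            if child not in depths:
                depths[child] = depths[page] + 1
                queue.append(child)
    return depths

depths = click_depth(site)
print(depths["/kittens/nursing"])  # → 2 clicks from the root
```

Keeping this number small for important pages is exactly the "don't bury products deep in your site" advice: both visitors and crawlers reach them in fewer clicks.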

To be a data-driven agency, we foster a culture of inspired marketing entrepreneurs who collaborate, innovate, and constantly push the boundaries of marketing intelligence. Our analytics team is well versed in mathematics, business analytics, multi-channel attribution modeling, creating custom analytics reporting dashboards, and performing detailed analysis and reporting for each client.

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[40] in addition to their URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
Meanwhile, the link spam began. People chasing higher PageRank scores began dropping links wherever they could, including into blog posts and forums. Eventually, it became such an issue that demands were raised that Google itself should do something about it. Google did in 2005, getting behind the nofollow tag, a way to prevent links from passing along PageRank credit.
Just like the change to JavaScript links, which now carry weight and are crawlable, the PageRank sculpting change is understandable. Google now can and wants to index more of the web (the JavaScript link change). Google wants to reverse a method that can only help people in the know (the PageRank sculpting change). Logically, all of it is very understandable.
My main concern, though, is that Google appears to be becoming reliant on sites doing MANY things for search engines only. It also appears that Google is lowering the bar for YouTube videos in the organic SERPs and forcing their insertion at the cost of relevant pages. It even seems they are now doing the same for pictures, despite BOTH having their own search engines. I fear Google is attempting to increase profits for its shareholders in a rather impatient manner.
To create an effective DMP, a business first needs to review the marketplace and set 'SMART' (Specific, Measurable, Actionable, Relevant and Time-Bound) objectives.[60] They can set SMART objectives by reviewing the current benchmarks and key performance indicators (KPIs) of the company and competitors. It is pertinent that the analytics used for the KPIs be customised to the type, objectives, mission and vision of the company.[61][62]
In the 2000s, with more and more Internet users and the birth of the iPhone, customers started searching for products and making decisions about their needs online first, instead of consulting a salesperson, which created a new problem for the marketing department of a company. In addition, a survey in 2000 in the United Kingdom found that most retailers had not registered their own domain address.[12] These problems pushed marketers to find digital ways of developing the market.

In an effort to manually control the flow of PageRank among pages within a website, many webmasters practice what is known as PageRank Sculpting[65]—which is the act of strategically placing the nofollow attribute on certain internal links of a website in order to funnel PageRank towards those pages the webmaster deemed most important. This tactic has been used since the inception of the nofollow attribute, but may no longer be effective since Google announced that blocking PageRank transfer with nofollow does not redirect that PageRank to other links.[66]
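A rough sketch of the arithmetic behind that announcement, using a hypothetical page with PageRank 1.0 and ten outgoing links, five of them nofollowed (all numbers purely illustrative):

```python
# Illustrative arithmetic for the nofollow change described above.
# The page's PageRank and link counts are hypothetical.
page_rank = 1.0
total_links = 10
nofollow_links = 5
followed_links = total_links - nofollow_links

# Before the change: nofollowed links were ignored, so the followed links
# split the full amount among themselves -- the sculpting incentive.
old_share = page_rank / followed_links   # 0.2 per followed link

# After the change: PageRank is divided across ALL links first, and the
# portion assigned to nofollowed links simply evaporates.
new_share = page_rank / total_links      # 0.1 per followed link
evaporated = new_share * nofollow_links  # 0.5 passes to no one

print(old_share, new_share, evaporated)  # 0.2 0.1 0.5
```

Under this model, adding nofollow no longer redirects PageRank to the remaining links; it only shrinks what each followed link passes, which is why sculpting stopped paying off.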
Black hat SEO is to be avoided. This is basically link spamming. You can pay somebody peanuts to do this on your behalf and, for a very short period, it brings results. Then Google sees what’s happened, and they delist your site permanently from search engine rankings. Now, you need a new website and new content, etc.—so, black hat SEO is a terrible idea.
I first discovered Sharpe years ago online. His story was one of the most sincere and intriguing tales that any one individual could convey. It was real. It was heartfelt. It was passionate. And it was a story of rock-bottom failure. It encompassed a journey that mentally, emotionally and spiritually crippled him in the early years of his life. As someone who left home at the age of 14, had a child at 16, became addicted to heroin at 20 and got clean four long years later, the cards were definitely stacked against him.
Such an enlightening post! Thanks for revealing those sources, Brian. This really has opened up my mind to new ideas. I have read many articles about SEO, especially ones in my country, and most of them don’t really tell you how to increase your presence in search engines. But today I found this page, which gave me much more valuable insight. Definitely going to try your tips.
When Site A links to your web page, Google sees this as Site A endorsing, or casting a vote for, your page. Google takes into consideration all of these link votes (i.e., the website’s link profile) to draw conclusions about the relevance and significance of individual webpages and your website as a whole. This is the basic concept behind PageRank.

Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.


An Internet marketing campaign is not an isolated, one-off proposal. Any company that plans on using it once is certain to continue to use it. An individual who is knowledgeable about all aspects of an Internet marketing campaign and who has strong interpersonal skills is well-suited to maintain an ongoing managerial role on a dedicated marketing team.

Hi Brian, thank you for sharing these awesome backlinking techniques. My site is currently not ranking well. It used to, until sometime mid last year, when it suddenly got de-ranked. I’m not really sure why — I haven’t been using any black-hat techniques at all. I’ll try a few of your tips and hopefully they will help my site get back into shape.


Google PageRank (Google PR) is one of the methods Google uses to determine a page's relevance or importance. Important pages receive a higher PageRank and are more likely to appear at the top of the search results. Google PageRank (PR) is a measure from 0 to 10, based on backlinks: the more quality backlinks, the higher the PageRank. Improving your Google PageRank (by building QUALITY backlinks) is very important if you want to improve your search engine rankings.