I just did a consult and opinion letter for an extremely large 200,000+ page corporate website that had been forced to temporarily remove its HTML sitemap due to compromised code that overloaded the server and crashed the site. A number of people at the company were concerned about the potential negative SEO implications of removing this page: the loss of PageRank equity transferred to the sitemap's targets, and a feeling that the page was providing robots with important pathways to the many orphan pages unreachable through the menu system. This article was helpful in debunking the idea that a page with 200,000 links off of it was passing any link juice to its targets. PS: an XML sitemap is in place.
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at considerably more than the sheer volume of votes, or links a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.” Using these and other factors, Google provides its views on pages’ relative importance.
But if you do it properly, it can be worth your money. Also, press releases can be much more than just a block of text. In December 2018, we ran a press release through Business Wire that had multiple backlinks, stylized callouts, and even a video! If you put effort into them, press releases can serve not just as a source of backlinks but also as a great marketing piece.
Thank you, Brian, for this definitive guide. I have already signed up for HARO and have plans to implement some of your strategies. My blog provides digital marketing tutorials for beginners, so it may be in your niche as well. This is so good. I highly recommend that all my team members at my company read your blog every time you publish new content. 537 comments on this post within a day; you are a master of this. A great influence in the digital marketing space.
One final note: if the links are not directly related to the subject, or you have no control over them, such as commenters' website links, maybe you should consider putting them on another page that links to your main content. That way you don't leak page rank, and you still gain hits from search results for the content of the comments. I may be missing something, but this seems to mean that you can have your cake and eat it, and I don't even think it is gaming the system or against the spirit of it. You might even gain a small sprinkling of page rank if the comment page accumulates any of its own.
PageRank as a visible score has been dying a slow death since around 2010, I'd say. Pulling it from the Google Toolbar makes it official and puts the final nail in the visible PageRank score's coffin. Few people were still viewing it in Internet Explorer, itself a deprecated browser. The real impact of dropping it from the toolbar is that third parties can no longer find ways to pull those scores automatically.
Before I start this, I am using the term 'PageRank' as a general term, fully knowing that this is not a simple issue and that 'PageRank' and the way it is calculated (and the numerous other methods Google uses) are multidimensional and complex. However, if you use PageRank to mean 'weight', it makes things a lot simpler. Also, 'PageRank sculpting' (in my view) means 'passing weight you can control'. Now… on with the comment!
Thanks for sharing this, Matt. I'm happy that you took the time to do so considering that you don't have to. What I mean is, in an ideal world, there should be no such thing as SEO. It is the SE's job to bring the right users to the right sites, and it is the job of webmasters to cater to the needs of the users brought to their sites by SEs. Webmasters should not be concerned with bringing the users in themselves (aside from offsite or sponsored marketing campaigns). The moment they do, things start to get ugly, because SEs now have to implement counter-measures to most SEO tactics. This becomes an unending spiral. If people only stick to their part of the equation, SEs will have more time to develop algorithms for making sure webmasters get relevant users, rather than algorithms for combating SEOs to ensure search users get relevant results. Just do your best at providing valuable content, and Google will try their best at matching you with your users. Don't waste time trying to second-guess how Google does it so that you can present yourself to Google as having better value than you really have. They have great engineers and they have the code; you only have a guess. At most, the SEO anyone should be doing is to follow the webmaster guidelines. It will benefit all.
Start Value (in this case) is the number of actual links into each "node". Most people set this to 1 to start, but there are two good reasons for using link counts. First, it is a better approximation than giving everything the same value, so the algorithm stabilizes in fewer iterations. Second, it makes it easy to check against my spreadsheet in a second… so node A has one link in (from page C).
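The iteration described above can be sketched in Python. The three-page graph here is hypothetical, chosen only so that node A has exactly one inbound link (from page C) as in the comment; the damping factor 0.85 and the non-normalized update PR = (1 - d) + d · Σ PR(T)/C(T) are the values from the original PageRank paper.

```python
# Hypothetical graph: each page maps to the pages it links out to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85  # damping factor from the original PageRank paper

# Start value: the number of inbound links to each node,
# instead of the usual uniform start of 1.0.
in_counts = {page: 0 for page in links}
for targets in links.values():
    for t in targets:
        in_counts[t] += 1
pr = {page: float(in_counts[page]) for page in links}

for _ in range(50):  # iterate until the values stabilize
    new_pr = {}
    for page in links:
        # Rank passed by each page linking here, split evenly
        # across that page's outbound links.
        incoming = sum(
            pr[src] / len(targets)
            for src, targets in links.items()
            if page in targets
        )
        new_pr[page] = (1 - d) + d * incoming
    pr = new_pr

for page, score in sorted(pr.items()):
    print(page, round(score, 3))
```

Note that with this (non-normalized) version of the formula, the scores sum to N, the number of pages, rather than to 1.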
Getting unique and authoritative links is crucial for ranking higher in the SERPs and improving your SEO. Google's algorithm for evaluating links has evolved in recent years, making it more challenging to get high-quality backlinks. External links still matter and aren't obsolete, so start working on strategies to earn valuable backlinks and improve your search visibility.
Thanks for the post, Chelsea! I think Google is starting to move further away from PageRank, but I do agree that a higher number of links doesn't necessarily mean a higher rank. I've seen many try to shortcut the system and end up spending weeks undoing these "shortcuts." I wonder how much weight PageRank still holds today, considering the algorithms Google continues to roll out to provide more relevant search results.
where N is the total number of all pages on the web. The second version of the algorithm does not, in fact, differ fundamentally from the first one. In terms of the Random Surfer Model, the second version's PageRank of a page is the actual probability of a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.
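For reference, this second (normalized) version of the formula from Brin and Page's paper is commonly written, with damping factor d, as:

```latex
PR(A) = \frac{1-d}{N} + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```

where T_1, …, T_n are the pages linking to A, C(T_i) is the number of outbound links on page T_i, and N is the total number of pages. Dividing the (1 − d) term by N is what makes the scores sum to one instead of to N.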
One of the earliest adopters of Internet marketing in the world of Fortune 500 companies was the Coca-Cola Corporation. Today, this huge purveyor of soft drinks has one of the strongest online portfolios in the world. More than 12,000 websites link to the Coca-Cola homepage, which itself is a stunning display of Internet savvy. Their homepage alone sports an auto-updating social network column, an embedded video, a unique piece of advertising art, frequently rotating copy, an opt-in user registration tab, tie-in branding with pop culture properties, and even a link to the company's career opportunities page. Despite how busy that sounds, the Coca-Cola homepage is clean and easy to read. It is a triumph of Internet marketing for its confidence, personality, and professionalism.
Social media is a mixed bag when it comes to backlinks. There is a modicum of value, as social media sites allow you to link to your website in your profile. However, these days Facebook, Twitter, and other social media sites mark links as 'nofollow,' meaning that they don't pass SEO value (sometimes referred to as "link juice") to the linked site. These links won't do anything to boost your site's performance in search results.
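For illustration, a nofollowed link is just an ordinary anchor tag with a rel attribute; the URL below is a placeholder:

```html
<!-- rel="nofollow" asks search engines not to pass ranking credit through this link -->
<a href="https://example.com/" rel="nofollow">My website</a>
```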
What an article… thank you so much for the priceless information. We will be changing our pages around to make sure we get the highest page rank available to us, and we are trying to get high-page-rank sites to link to us. Hopefully there is more information out there to gather, as we want to compete within our market and gain as much market share as possible.
Current search engine optimization focuses on techniques such as making sure that each web page has appropriate title tags and that the content is not "thin" or low-quality. High-quality content is original, authoritative, factual, grammatically correct, and engaging to users. Poorly edited articles with spelling and grammatical errors will be demoted by search engines.
It is no secret that getting high-quality backlinks is your website's way to better rankings in Google. But how do you tell a good link from a bad one? Carefully choosing backlinks is a delicate and important task for anyone who wants to optimize their site. There are a lot of different tools that can help you check whether your backlinks are trustworthy and can bring your website value.
Was reviewing some competitive data and thought this was pretty interesting. I ran a batch analysis of competitors on Ahrefs; see the attached screenshot. With just 603 backlinks, our site is ranking up there with sites that have 2x, 3x, even 10x the number of backlinks/unique IPs. I'm guessing some of this authority is coming from the backlinks program and the general good quality of those links. Hard to speculate, but nice to see. Ben R.
Because if I do that, if I just write good content while my 100+ competitors link build, article market, comment on forums, social bookmark, release viral videos, and buy links, I'll end up at the very bottom of the pile, great content or not. Really, I am just as well taking my chances pulling off every sneaky trick in the book to get my site to the top, because everyone does it anyway, and if I don't, what do I have to lose?”
If you're Matt Cutts and a billion people link to you because you're the spam guy at Google, writing great content is enough. For the rest of us in hypercompetitive markets, good content alone is not enough. There was nothing wrong with sculpting page rank to the pages on your site that make you money as a means of boosting traffic to those pages. It's not manipulating Google; there's more than enough of that going on on the first page of results for most competitive keywords. Geez Matt, give the little guy a break!
There has been much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
Backlinks can be time-consuming to earn. New sites, or those expanding their keyword footprint, may find it difficult to know where to start when it comes to link building. That's where competitive backlink research comes in: by examining the backlink profile (the collection of pages and domains linking to a website) of a competitor that's already ranking well for your target keywords, you can gain insight into the link building that may have helped them. A tool like Link Explorer can help uncover these links so you can target those domains in your own link building campaigns.
I think that removing the link to the sitemap shouldn't be a big problem for navigation, but I wonder what happens with the disclaimer and the contact page? If nofollow doesn't sink the linked page, how can we tell the search engine that these are not content pages? For some websites these are among the most linked pages. And yes, for some the contact page is worth gaining rank, but for my website it is not.