A way to build backlinks by providing value to other sites is through branded badges. A branded badge is an award that a brand creates and gives out to other sites as a status symbol. For example, you could publish a list of the top sites or best brands on your site, and then give a badge to each brand on the list so it can display that status on its own site. Embed a link back to your article in the badge so that displaying the badge creates the backlink.
6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.
I don’t know if Google gets its kicks out of keeping search engine marketers and webmasters jumping through hoops – or if they are in cahoots with the big SEM firms, so that those firms get this news and these updates before the average guy on the street. Either way, they are seriously getting a bit too big and powerful, and the time is RIPE for a new search engine to step in and level the playing field.

I think Google will always be working to discern and deliver “quality, trustworthy” content, and I think analyzing inbound links as endorsements is a solid tool the search engine won’t be sunsetting anytime soon. Why would they? If the president of the United States links to your page, that is undoubtedly an endorsement that tells Google you’re a legitimate, trusted source. I know that is an extreme example, but I think it illustrates the principles of a linking-as-endorsement model well.
The probability that the random surfer visits a page is its PageRank. And the damping factor d is the probability, at each page, that the “random surfer” will get bored and request another random page. One important variation is to only add the damping factor d to a single page, or a group of pages. This allows for personalization and can make it nearly impossible to deliberately mislead the system in order to get a higher ranking. We have several other extensions to PageRank…
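The random-surfer model described above can be sketched in a few lines of Python. This is a minimal, hedged illustration only (the three-page link graph is invented), not Google's actual implementation:

```python
# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

def pagerank(links, d=0.85, iterations=50):
    """Iteratively approximate PageRank with damping factor d.

    Uses the normalized form, so the scores form a probability
    distribution over pages (they sum to 1)."""
    n = len(links)
    ranks = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new_ranks = {}
        for page in links:
            # Each linking page passes on its rank, split evenly
            # across its outgoing links.
            incoming = sum(
                ranks[other] / len(outs)
                for other, outs in links.items()
                if page in outs
            )
            new_ranks[page] = (1 - d) / n + d * incoming
        ranks = new_ranks
    return ranks

ranks = pagerank(links)
```

Here page C, which receives links from both A and B, ends up with the highest score, matching the intuition that more (and better-ranked) inbound links mean a higher PageRank.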
“NOTE: You may be curious what your site’s or your competitor’s PR score is. But Google no longer reveals the PageRank score for websites. It used to display at the top of web browsers right in the Google Toolbar, but no more. And PR data is no longer available to developers through APIs, either. Even though it’s now hidden from public view, however, PageRank remains an important ingredient in Google’s secret ranking algorithms.”

PageRank sculpting came out of the idea that virtually any page will have links that are important for users but not necessarily meaningful enough to receive any of the PageRank that a page can flow. Navigational links are a primary example of this. Go to a place like the LA Times, and you’ve got tons of navigational links on every page. Nofollow those, and you (supposedly, in the past) ensure that the remaining links (say, your major stories) get more of a boost.
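Marking a navigational link nofollow is a one-attribute change. A hypothetical example (the URL and anchor text are invented):

```html
<!-- A navigational link marked nofollow so that, under the old model,
     it would not receive a share of the page's PageRank flow. -->
<a href="/about-us" rel="nofollow">About Us</a>
```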


Having a ‘keyword rich’ domain name may lead to closer scrutiny from Google. According to Moz, Google has “de-prioritized sites with keyword-rich domains that aren’t otherwise high-quality. Having a keyword in your domain can still be beneficial, but it can also lead to closer scrutiny and a possible negative ranking effect from search engines—so tread carefully.”

All in all, PageRank sculpting (or whatever we should call it) didn’t really rule my world. But, I did think that it was a totally legitimate method to use. Now that we know the ‘weight’ leaks, this will put a totally new (and more damaging) spin on things. Could we not have just left the ‘weight’ with the parent page? This is what I thought would happen most of the time anyway.

As mentioned above, the two versions of the algorithm do not differ fundamentally from each other. A PageRank which has been calculated using the second version of the algorithm has to be multiplied by the total number of web pages to get the corresponding PageRank that would have been calculated using the first version. Even Page and Brin mixed up the two algorithm versions in their most popular paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they claim the first version of the algorithm to form a probability distribution over web pages, with the sum of all pages' PageRanks being one.
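In standard notation, with d the damping factor, T_1…T_n the pages linking to A, C(T_i) the number of outbound links on T_i, and N the total number of pages, the two versions are:

```latex
% Version 1 (non-normalized; all PageRanks sum to N):
PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}

% Version 2 (normalized; PageRanks form a probability
% distribution over pages, summing to 1):
PR(A) = \frac{1 - d}{N} + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```

Multiplying every score from version 2 by N recovers the version-1 scores, which is why the two differ only by normalization.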
Chris_D, great question. If you have a single product page that can have multiple URLs with slightly different parameters, that’s a great time to use a rel=canonical meta tag. You can use rel=canonical for pages with session IDs in a similar fashion. What rel=canonical lets you do is say “this page X on my host is kind of ugly or otherwise isn’t the best version of this page. Use URL Y as the preferred version of my page instead.” You can read about rel=canonical at http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394. Bear in mind that if you can make your site work without session IDs, or make it so that you don’t have multiple “aliases” for the same page, that’s even better because it solves the problem at the root.
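In markup, the tag goes in the head of each duplicate URL and points at the preferred version. A hypothetical example (the domain and path are invented):

```html
<!-- Placed in the <head> of the parameterized or session-ID variants,
     telling search engines which URL is the preferred version. -->
<link rel="canonical" href="https://www.example.com/product/widget" />
</head>
```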

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognised term "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the query rather than to a few words.[38] With regard to the changes made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and rely on "trusted" authors.
I don’t get it; it seems Google is constantly making rules and regulations as they see fit. I don’t try to “manipulate” any links we have on our site or any clients we work for. Links take time, period. No way around it. But now this explanation gives more fuel to all the Google bashers out there. I recently read an article about how Guy Kawasaki has been “loaned” one, two, three cars in three years and is still within Google’s guidelines? Makes me wonder how many rules and regulations are broken. My take is: do your job right, and don’t worry what Google is doing. If content is king, then everything will fall into place naturally.
A nice word is not enough for this. You show that blogging is like Apple vs. Samsung: you can create lots of posts and drive traffic (like Samsung releasing many phones every year), or you can create high-quality posts like Apple (which is what you do) and force higher-ranked sites to copy content from your blog. Now I will work hard on already-published posts until they start getting traffic.
Let’s start with what Google says. In a nutshell, it considers links to be like votes. In addition, it considers that some votes are more important than others. PageRank is Google’s system of counting link votes and determining which pages are most important based on them. These scores are then used along with many other things to determine if a page will rank well in a search.
If you don’t want to rebuild an expired domain, just take its backlinks and let the sites linking to it know they’re pointing to a dead resource. You can ask the link builder to replace the broken link with one to your website. If the content is relevant, you can try to restore it; be sure that you can make it better than it was before. Then reach out and tell the link builder about the renewed content.
I have a small service business called Eco Star Painting in Calgary and I do all of my own SEO. I’m having trouble getting good backlinks. How do you suggest a painting company get quality backlinks other than the typical local citation sites and social media platforms? I don’t know what I can offer another high domain site in terms of content. Do you have any suggestions?

Danny, I was on the panel where Matt suggested that, and I point-blank asked on stage what would happen when folks started abusing the tactic and Google changed their mind, if you recall (at the time, I’d seen some of the things being done that I knew Google would clarify as abuse, and I was still a nofollow unenthusiast as a result). And Matt dismissed it. So I think you can take home two important things from that: 1. SEO tactics can always change, regardless of who first endorses them, and 2. Not everything Matt says is etched in stone. <3 ya Matt.


When Site A links to your web page, Google sees this as Site A endorsing, or casting a vote for, your page. Google takes into consideration all of these link votes (i.e., the website’s link profile) to draw conclusions about the relevance and significance of individual webpages and your website as a whole. This is the basic concept behind PageRank.
A generalization of PageRank for the case of ranking two interacting groups of objects was described in [32]. In applications it may be necessary to model systems having objects of two kinds where a weighted relation is defined on object pairs. This leads to considering bipartite graphs. For such graphs, two related positive or nonnegative irreducible matrices corresponding to the vertex partition sets can be defined. One can compute rankings of objects in both groups as eigenvectors corresponding to the maximal positive eigenvalues of these matrices. Normed eigenvectors exist and are unique by the Perron-Frobenius theorem. Example: consumers and products, where the relation weight is the product consumption rate.
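A hedged sketch of the consumers-and-products example, using NumPy. The weight matrix W is invented for illustration; with W holding consumption rates (rows = consumers, columns = products), the two matrices are W·Wᵀ and Wᵀ·W, and power iteration recovers their Perron eigenvectors:

```python
import numpy as np

# Hypothetical consumption-rate matrix: rows = consumers, cols = products.
W = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 3.0, 0.0],
    [0.0, 1.0, 1.0],
])

def principal_eigenvector(M, iterations=100):
    """Power iteration: for a nonnegative irreducible matrix, converges
    to the (unique, normed) Perron eigenvector."""
    v = np.ones(M.shape[0])
    for _ in range(iterations):
        v = M @ v
        v /= np.abs(v).sum()  # normalize so entries sum to 1
    return v

# One ranking per partition set, as the text describes.
consumer_rank = principal_eigenvector(W @ W.T)
product_rank = principal_eigenvector(W.T @ W)
```

Both vectors are nonnegative and sum to one, so each acts as a ranking distribution over its own group.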
Okay, if you're still with me, fantastic. You're one of the few that doesn't mind wading through a little bit of hopeless murkiness to reemerge on the shores of hope. But before we jump too far ahead, it's important to understand what online marketing is and what it isn't. That definition provides a core understanding of what it takes to peddle anything on the web, whether it's a product, service or information.
8. Technical SEO. Technical SEO is one of the most intimidating portions of the SEO knowledge base, but it’s an essential one. Don’t let the name scare you; the most technical elements of SEO can be learned even if you don’t have any programming or website development experience. For example, you can easily learn how to update and replace your site’s robots.txt file, and with the help of an online template, you should be able to put together your sitemap efficiently.
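As one concrete starting point, a robots.txt file is just a short text file at the site root. A minimal sketch (the disallowed path and sitemap URL are hypothetical examples):

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```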
Once the company has identified the target demographic for its Internet marketing campaign, it then decides which online platforms will comprise the campaign. For instance, a company that is seeking customers in the 18-to-33 demographic might develop a mobile application that raises awareness about the product, such as a game, a news feed, or a daily coupon program users can download for free.

Despite this many people seem to get it wrong! In particular “Chris Ridings of www.searchenginesystems.net” has written a paper entitled “PageRank Explained: Everything you’ve always wanted to know about PageRank”, pointed to by many people, that contains a fundamental mistake early on in the explanation! Unfortunately this means some of the recommendations in the paper are not quite accurate.
Hi Brian, thank you for sharing these awesome backlinking techniques. My site is currently not ranking well. It used to, until sometime mid last year, when it suddenly got de-ranked. Not really sure why. I haven’t been participating in any blackhat techniques or anything at all. I’ll try a few of your tips and hopefully that will help my site get back into shape.
After finding websites that have good metrics, you have to make sure the website is related to your site. For each competitor backlink, try to understand how your competitor got that link. If it was a guest article, send a request to become a contributor as well. If it was a product review by a blogger, contact the writer and offer them a good deal in exchange for a similar review.
What is a useful place in search results? Ideally, you need to be in the top three search results returned. More than 70% of searches are resolved in these three results, while 90% are resolved on the first page of results. So, if you’re not in the top three, you’re going to find you’re missing out on the majority of potential business—and if you’re not on the first page, you’re going to miss out on nearly all potential business.

Also hadn’t thought about decreasing the rank value based on the spammyness of the sites a page is linking to. My guess on how to do it would be determining the spammyness of individual pages based on multiple page and site factors, then some type of reverse PageRank calculation starting with those bad scores, then overlaying that on top of the “good” PageRank calculation as a penalty. This is another thing which would be interesting to play around with in the Nutch algorithm.
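That reverse-PageRank idea can be sketched roughly as follows. This is a speculative, hedged illustration of the commenter's suggestion only; the link graph, the seed spam scores, and the decay weight are all invented:

```python
# Hypothetical link graph; "Spam" is a page flagged by page/site factors.
links = {
    "A": ["B", "Spam"],
    "B": ["C"],
    "C": ["A"],
    "Spam": [],
}
seed_spam = {"Spam": 1.0}  # initial spammyness scores

def reverse_spam_rank(links, seed, d=0.5, iterations=20):
    """Flow spam scores *against* the link direction: a page that links
    to spammy pages inherits a fraction d of their average score, which
    could then be overlaid on normal PageRank as a penalty."""
    scores = {page: seed.get(page, 0.0) for page in links}
    for _ in range(iterations):
        new = {}
        for page, outs in links.items():
            inherited = sum(scores[t] for t in outs) / max(len(outs), 1)
            new[page] = seed.get(page, 0.0) + d * inherited
        scores = new
    return scores

spam = reverse_spam_rank(links, seed_spam)
```

Page A, which links directly to the spam page, picks up the largest penalty among the non-seed pages, with smaller amounts bleeding further back through C and B.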
I have not at all seen the results I would expect in terms of page rank throughout my site. I have almost everything pointing at my home page, with a variety of anchor text, but my rank is 1. There is a page on my site with 3, though, and a couple with 2, so it certainly is not all about links; I do try to have somewhat unique and interesting content, but some of my strong pages are default page content. I will explore the help forum. (I guess these comments are nofollow :P) I would not mind a piece of this page rank …

But I also don’t wanna lose PageRank on every comment with a link… If I can give PageRank and lose none, I wanna leave the comment there, even without nofollow. But if I lose PageRank on every link, even inside the original post, and EVEN MORE if nofollow also takes PageRank out of me, I may just start using JavaScript or plain text without anchors for links… I definitely don’t like this idea, but I dislike even more losing PageRank on each outlink on my site. I’d just link to top-quality sites that I actively wanna vote for in search engines.
