This is the argument that quickly emerged about blog comments recently. Say I have an article on a blog with 5 links in the editorial copy, some of them leading back to other content within the blog that I hope will do well. Then I get 35 comments on the article, with each comment having a link back to the commenter’s site. That’s 40 links in all. Let’s say this particular page has $20 in PageRank to spend. Each link gets 50 cents.
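The arithmetic above can be sketched in a few lines. This is only an illustration of the even-split model the argument assumes (the $20 figure and the link counts are the example numbers from the text, not real PageRank units):

```python
# Hypothetical illustration of the even-split assumption behind the
# "sculpting" argument: a page's link value is divided equally across
# every outgoing link, editorial and comment links alike.

def value_per_link(page_value: float, editorial_links: int, comment_links: int) -> float:
    """Evenly divide a page's link value across all outgoing links."""
    total_links = editorial_links + comment_links
    return page_value / total_links

share = value_per_link(20.0, editorial_links=5, comment_links=35)
print(f"${share:.2f} per link")  # $0.50 per link
```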
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
The probability that the random surfer visits a page is its PageRank. The damping factor d is the probability, at each page, that the “random surfer” will get bored and request another random page. One important variation is to add the damping factor d to only a single page, or a group of pages. This allows for personalization and can make it nearly impossible to deliberately mislead the system in order to get a higher ranking. We have several other extensions to PageRank…
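The random-surfer model above can be sketched as a short power-iteration loop. This is a minimal illustration of the standard formulation, not Google's implementation; the four-page graph is made up, and dangling pages (no outgoing links) are simply skipped rather than handled properly:

```python
# Minimal PageRank power iteration: with probability d the surfer follows a
# random outgoing link; with probability 1 - d they jump to a random page.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}   # random-jump share
        for page, outgoing in links.items():
            if not outgoing:
                continue                        # sketch: ignore dangling pages
            share = d * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share       # split rank across out-links
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
# "c" ends up ranked highest: every other page links to it.
```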
In early 2005, Google implemented a new value, "nofollow",[64] for the rel attribute of HTML link and anchor elements, so that website developers and bloggers can make links that Google will not consider for the purposes of PageRank—they are links that no longer constitute a "vote" in the PageRank system. The nofollow relationship was added in an attempt to help combat spamdexing.
btw, all those SEOs out there probably made some money off clients, selling the sculpting thang to them. I know some are still insisting it worked, etc., but would they say in public that it didn’t work after they already took a site’s money to sculpt? How would anyone judge definitively whether it worked or not? The funny thing is, the real issues of that site could have been fixed for the long term instead of applying a band-aid. Of course, knowing the state of this industry right now, band-aids are the in thing anyway.
The paper’s authors noted that AltaVista (on the right) returned a rather random assortment of search results: a rather obscure optical physics department at the University of Oregon, the campus networking group at Carnegie Mellon, Wesleyan’s computer science group, and then a page for one of the campuses of a Japanese university. Interestingly, none of the first six results returns the homepage of a website.
For instance, you might use Facebook’s Lookalike Audiences to get your message in front of an audience similar to your core demographic. Or, you could pay a social media influencer to share images of your products to her already well-established community. Paid social media can attract new customers to your brand or product, but you’ll want to conduct market research and A/B testing before investing too much in one social media channel.
Me, I didn’t like the sculpting idea from the start. I linked to what I thought should get links and figured that was pretty natural, to have navigational links, external links and so on — and natural has long been the thing Google’s rewarded the most. So I didn’t sculpt, even after Matt helped put it out there, because it just made no long-term sense to me.
On a blog, the PageRank should go to the main article pages. Now it just gets “evaporated” if you use “nofollow”, or scattered to all the far-flung nooks and crannies, which means Google will not be able to see the wood for the trees. The vast majority of a site’s overall PageRank will now reside in the long tail of useless pages such as commenters’ profile pages. This can only make it harder for Google to serve up the most relevant pages.
After your site has been built out, creating a social media presence is the best second step for most businesses. All businesses should have a Facebook Page that’s fully fleshed out with plenty of information about your business. Depending on your audience, you can also start a Twitter, Instagram, and/or Pinterest account. Social media is a long-term commitment that requires frequent updating and monitoring, but it’s one of the best ways to build an online community around your business.
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.
Great article and writing in general. My company just published a 5,000-word keyword targeting best practices guide for PPC and SEO, and we linked to your article “10 Reasons You Should Use Google Trends for More Than Just Keyword Research”. http://vabulous.com/keyword-research-targeting-for-ppc-and-seo-guide/ I would love it if you checked it out and possibly shared it if you like it.
There are also many keyword research tools (some free and some paid) that claim to take the effort out of this process. A popular tool for first-timers is Traffic Travis, which can also analyse your competitors’ sites for their keyword optimization strategies and, as a bonus, deliver a detailed analysis of their back-linking strategy, too. You can also use Moz.com’s incredibly useful keyword research tools; they’re the industry leader, but they come at a somewhat higher price.
Finally, it’s critical that you spend time and resources on your business’s website design. When customers find your website, they’ll likely be deterred from trusting your brand and purchasing your product if they find your site confusing or unhelpful. For this reason, it’s important to take the time to create a user-friendly (and mobile-friendly) website.

For the purpose of their second paper, Brin, Page, and their coauthors took PageRank for a spin by incorporating it into an experimental search engine, and then compared its performance to AltaVista, one of the most popular search engines on the Web at that time. Their paper included a screenshot comparing the two engines’ results for the word “university.”
Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine's unpaid results—often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher ranked on the search results page), and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers.[1] SEO may target different kinds of search, including image search, video search, academic search,[2] news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business' online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former instead is more focused on national or international searches.
Brian Dean, an SEO expert and the creator of BackLinko, uses SEO tactics to rank #1 on YouTube for keywords like “on page SEO” and “video SEO”. Initially, Dean admits his YouTube account struggled to get any views. Employing SEO methods like keyword optimization has enabled Dean to rise to #1 on YouTube for search results related to his business. He published his full strategy on Backlinko.
Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
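The crawl-and-extract step described above can be sketched with Python's standard library. This is only a bare-bones illustration of what spidering software does with a fetched page (extract links and resolve them against the page's URL); real crawlers add fetching, robots.txt handling, politeness delays, and deduplication at scale. The example URL and markup are placeholders:

```python
# Minimal link extraction, the core of a web spider: parse a page's HTML,
# collect anchor hrefs, and resolve relative links against the page URL.

from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

parser = LinkExtractor("https://example.com/")
parser.feed('<a href="/about">About</a> <a href="https://other.example/">Other</a>')
# parser.links == ['https://example.com/about', 'https://other.example/']
```

A full crawler would push these links onto a frontier queue and repeat, which is how the index described above gets built.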
PageRank is a link analysis algorithm and it assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted PR(E). Other factors, like Author Rank, can contribute to the importance of an entity.
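In this notation, one common (normalized) statement of the formula is the following, where N is the number of pages, B_u is the set of pages linking to page u, L(v) is the number of outgoing links on page v, and d is the damping factor:

```latex
PR(u) = \frac{1 - d}{N} + d \sum_{v \in B_u} \frac{PR(v)}{L(v)}
```

Each page splits its own rank evenly across its outgoing links, and every page keeps a small baseline share from the random-jump term.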
On the other hand, all of the results from the PageRank engine (aside from a single secondary listing) link to the homepages of major American universities. The results are far more logical and useful. If you search for “university,” are you going to want the homepages of popular universities, or random subpages from a sprinkling of colleges all over the world?

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
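The automatic-generation approach suggested above can be sketched simply: summarize the page's own text and truncate it to a snippet-friendly length. The ~155-character limit is a common rule of thumb, not a documented Google constraint, and the function and sample text are illustrative:

```python
# Rough sketch of auto-generating a description meta tag from page content,
# for sites too large to hand-craft one per page.

import html
import textwrap

def meta_description(page_text: str, max_len: int = 155) -> str:
    """Build a description meta tag from the start of the page's text."""
    summary = " ".join(page_text.split())                         # collapse whitespace
    summary = textwrap.shorten(summary, width=max_len, placeholder="...")
    escaped = html.escape(summary, quote=True)                    # safe inside the attribute
    return f'<meta name="description" content="{escaped}">'

tag = meta_description("Our store sells  hand-made ceramic mugs.\nEach mug is unique.")
# <meta name="description" content="Our store sells hand-made ceramic mugs. Each mug is unique.">
```

In practice you would feed in each page's main content (product description, article lead, etc.) so every page gets a distinct, relevant description.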
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or that involve deception. One black hat technique uses text that is hidden, either as text colored similarly to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid getting the site penalized, but they do not produce the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
The green ratings bars are a measure of the importance of a web page, as determined by Google’s patented PageRank technology and other factors. These PageRank bars tell you at a glance whether other people on the web consider a page to be a high-quality site worth checking out. Google itself does not evaluate or endorse websites. Rather, we measure what others on the web feel is important enough to deserve a link. And because Google does not accept payment for placement within our results, the information you see when you conduct a search is based on totally objective criteria.
Peter made a very good point in all of this, and Michael Martinez did in a backhanded way as well. Talking about a concept related to PageRank sounds cool. It doesn’t actually have to be useful or practical, and it usually isn’t; but as long as the impression of something productive is given off, that can be all that matters in the eyes of those who lack sufficient knowledge.
Thanks for sharing this, Matt. I’m happy that you took the time to do so, considering that you don’t have to.

What I mean is, in an ideal world there should be no such thing as SEO. It is the search engines’ job to bring the right users to the right sites, and it is the job of webmasters to cater to the needs of the users the search engines bring to their sites. Webmasters should not be concerned with bringing the users in themselves (aside from offsite or sponsored marketing campaigns). The moment they do, things start to get ugly, because the search engines then have to implement counter-measures to most SEO tactics. This becomes an unending spiral.

If people only stuck to their part of the equation, search engines would have more time to develop algorithms for making sure webmasters get relevant users, rather than algorithms for combating SEOs to ensure search users get relevant results. Just do your best to provide valuable content, and Google will try its best to match you with your users. Don’t waste time trying to second-guess how Google does it so that you can present yourself to Google as having more value than you really have. They have great engineers and they have the code; you only have a guess. At most, the SEO anyone should be doing is to follow the webmaster guidelines. It will benefit all.
Andy Beard, I was only talking about the nofollow attribute on individual links, not noindex/nofollow as a meta tag. But I’ll check that out. Some parts of Thesis I really like, and then there’s a few pieces that don’t quite give me the granularity I’d like. As far as page size, we can definitely crawl much more than 101KB these days. In my copious spare time I’ll chat with some folks about upping the number of links in that guideline.
Site owners are using the toolbar to find “good” sites that they should get links from, regardless of the fact that link context is also important, not to mention the many, many other factors Google uses to rank a web page. Other site owners, seeing a gray PR0 toolbar for their site, immediately assume the worst: that they’ve been blacklisted.