Re: Cameron’s Comment. Google transparent? Maybe. Great products for users – yes… but they operate from lofty towers. Can’t get a hold of them. Can’t contact them. They are the ONLY company in the world with zero customer support for their millions of users. Who really knows what they are doing from one month to the next in regard to ranking sites… etc.
This will give you an indication of how many times a search is performed in a month (low numbers are not very useful unless there is a very clear buying signal in the keyphrase – working hard for five hits a month is not recommended in most cases) and how much the phrase is “worth” per click to advertisers (e.g., how much someone will pay to use that keyphrase). The more it’s worth, the more likely it is that the phrase is delivering business results for someone.
Thanks a lot for all of those great tips you handed out here. I immediately went to work applying the strategies that you mentioned. I will keep you posted on my results. I have been offering free SEO services to all of my small business bookkeeping clients as a way of helping them to grow their businesses. Many of them just don’t have the resources required to hire an SEO guru to help them but they need SEO bad. I appreciate the fact that you share your knowledge and don’t try to make it seem like it’s nuclear science in order to pounce on the innocent. All the best to you my friend!

Great article and writing in general. My company just published a 5,000 word Keyword targeting best practices guide for PPC and SEO, and we linked to your article “10 Reasons You Should Use Google Trends for More Than Just Keyword Research”. http://vabulous.com/keyword-research-targeting-for-ppc-and-seo-guide/ I would love if you checked it out and possibly shared it if you like it.
Google's core algorithms and its propensity to shroud its data in layers of obscurity are nothing new. However, they are critical to any understanding of marketing on the internet, simply because this visibility is at the heart of everything else that you do. Forget about social media and other forms of marketing for the time being. Search engine optimization (SEO) offers up the proverbial key to near-limitless amounts of traffic on the web.

After your site has been built out, creating a social media presence is the best second step for most businesses. All businesses should have a Facebook Page that’s fully fleshed out with plenty of information about your business. Depending on your audience, you can also start a Twitter, Instagram, and/or Pinterest account. Social media is a long-term commitment that requires frequently updating and monitoring, but it’s one of the best ways to build an online community around your business.
Unfortunately, SEO is also a slow process. You can make “quick wins” in markets which are ill-established using SEO, but the truth is that the vast majority of useful keyphrases (including long-tail keyphrases) in competitive markets will already have been optimized for. It is likely to take a significant amount of time to get to a useful place in search results for these phrases. In some cases, it may take months or even years of concentrated effort to win the battle for highly competitive keyphrases.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[21] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
Page rank is an assessment of the authority and value of a website in Google’s algorithm. You can find plug-in tools for your browser or online tools (see the references section for links to some of these) to check the page rank of any given site. You can also check your own website’s page rank. The higher the page rank, the more likely it is that your SEO efforts are succeeding.

One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.
Hey Brian, this is an absolutely fabulous post! It caused me to come out of lurking mode on the Warrior Forum and post a response there as well. Only my second post in 4 years, it was that kickass… I’ve signed up to your newsletter on the strength of this. You have a new follower on Twitter as well! I mean what I said on the Warrior Forum… Since 2001 I’ve worked in SEO commercially, freelance and now from the comfort of my own home – I have bought IM ebooks with less useful information in them than covered by any one of your 17. You might not please everyone in our industry giving some of those secrets away for free though! All power to you my friend, you deserve success and lots of it!

Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way; placing too many links to sites with the same IP address is referred to as backlink bombing.
The amount of link juice passed depends on two things: the number of PageRank points of the webpage housing the link, and the total number of links on the webpage that are passing PageRank. It’s worth noting here that while Google will give every website a public-facing PageRank score that is between 0 and 10, the “points” each page accumulates from the link juice passed by high-value inbound links can — and do — significantly surpass ten. For instance, webpages on the most powerful and significant websites can pass link juice points in the hundreds or thousands. To keep the rating system concise, Google uses a lot of math to correlate very large (and very small) PageRank values with a neat and clean 0 to 10 rating scale.
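Google never published the exact math behind that 0-to-10 compression, but it is widely believed to be roughly logarithmic: each extra toolbar point takes about an order of magnitude more raw PageRank. A minimal sketch under that assumption (the base-10 log is a guess, not Google's actual formula):

```python
import math

def toolbar_score(points):
    """Map raw 'link juice' points onto a 0-10 toolbar-style scale.

    Assumes a base-10 logarithmic compression -- Google never published
    the real mapping -- so each extra toolbar point needs roughly ten
    times more raw PageRank than the last.
    """
    if points < 1:
        return 0
    return min(10, int(math.log10(points)))

# A page with a thousand raw points still maps to a single-digit score.
print(toolbar_score(5))       # 0
print(toolbar_score(1000))    # 3
print(toolbar_score(10**12))  # 10
```

This is why pages can keep accumulating link juice long after their public score stops moving: the visible scale saturates while the underlying points keep growing.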
Online reviews, then, have become another form of internet marketing that small businesses can't afford to ignore. While many small businesses think that they can't do anything about online reviews, that's not true. Just by actively encouraging customers to post reviews about their experience, small businesses can tip the balance of online reviews in their favor. Sixty-eight percent of consumers left a local business review when asked. So assuming a business's products or services are not subpar, unfair negative reviews will get buried by reviews from happier customers.
The original Random Surfer PageRank patent from Stanford has expired. The Reasonable Surfer version of PageRank (assigned to Google) is newer than that one, and has been updated via a continuation patent at least once. The version of PageRank based upon a trusted seed set of sites (assigned to Google) has also been updated via a continuation patent and differs in many ways from the Stanford version of PageRank. It is likely that Google is using one of the versions of PageRank that they have control over (the exclusive license to use Stanford’s version of PageRank has expired along with that patent). The updated versions of PageRank (the Reasonable Surfer and Trusted Seeds approaches) are both protected under present-day patents assigned to Google, and both have been updated to reflect modern processes in how they are implemented. Because of their existence, and the expiration of the original, I would suggest that it is unlikely that the random surfer model-based PageRank is still being used.
Matt, this is an excellent summary. I finally got around to reading “The Search” by John Battelle and it was very enlightening to understand much of the academia behind what led to the creation of Backrub… er, Google. Looking at how many times the project was almost shut down due to bandwidth consumption (> 50% of what the university could offer at times) as well as webmasters being concerned that their pages would be stolen and recreated, it’s so interesting to see that issues we see today are some of the same ones that Larry and Sergey were dealing with back then. As always, thanks for the great read Matt!

If Google finds two identical pieces of content, whether on your own site, or on another you’re not even aware of, it will only index one of those pages. You should be aware of scraper sites, which steal your content automatically and republish it as their own. Here’s Graham Charlton’s thorough investigation on what to do if your content ends up working better for somebody else.
@matt: I notice a bit of WordPress-related talk early in the comments (sorry, don’t have time to read all of them right now..), I was wondering if you’d like to comment on Trac ticket(http://core.trac.wordpress.org/ticket/10550) – related to the use of nofollow on non-js-fallback comment links which WordPress uses – it’s linking to the current page with a changed form.. the content and comments should remain the same, just a different form.. I think the original reason nofollow was added there was to prevent search engines thinking the site was advertising multiple pages with the same content..
For search engine optimization purposes, some companies offer to sell high PageRank links to webmasters.[40] As links from higher-PR pages are believed to be more valuable, they tend to be more expensive. It can be an effective and viable marketing strategy to buy link advertisements on content pages of quality and relevant sites to drive traffic and increase a webmaster's link popularity. However, Google has publicly warned webmasters that if they are or were discovered to be selling links for the purpose of conferring PageRank and reputation, their links will be devalued (ignored in the calculation of other pages' PageRanks). The practice of buying and selling links is intensely debated across the Webmaster community. Google advises webmasters to use the nofollow HTML attribute value on sponsored links. According to Matt Cutts, Google is concerned about webmasters who try to game the system, and thereby reduce the quality and relevance of Google search results.[40]

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[54]


Search engine optimization (SEO) receives a lot of love from inexperienced marketers. It’s seen as “free marketing” in that you can handle your own SEO work (as long as you follow some rules to do so), and thus all it requires is your time to make things happen. SEO is simply what you do to your website and web pages to make them show up in “organic” (or unpaid) search results on search engines.
Say I have an article on a blog with 5 links in the editorial copy — some of those links leading back to other content within the blog that I hope to do well. Then I get 35 comments on the article, with each comment having a link back to the commenters’ sites. That’s 40 links in all. Let’s say this particular page has $20 in PageRank to spend. Each link gets 50 cents.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.

Meanwhile, the link spam began. People chasing higher PageRank scores began dropping links wherever they could, including into blog posts and forums. Eventually, it became such an issue that demands were raised that Google itself should do something about it. Google did in 2005, getting behind the nofollow tag, a way to prevent links from passing along PageRank credit.
For example, if a webmaster has a website about how to rescue orphaned kittens, and received a backlink from another website about kittens, then that would be more relevant in a search engine's assessment than say a link from a site about car racing. The more relevant the site is that is linking back to your website, the better the quality of the backlink.
“So what happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each.”
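Matt Cutts's quote describes a change in the denominator of that split: originally nofollowed links were excluded when dividing PageRank, but after the change every link counts, so the share that would have flowed through nofollowed links simply evaporates. A small sketch of both behaviors as he describes them:

```python
def flow_per_followed_link(points, total_links, nofollowed, post_change=True):
    """PageRank flowing through each *followed* link on a page.

    Pre-change: nofollowed links were excluded from the denominator,
    so followed links absorbed the whole budget.
    Post-change: every link counts in the denominator, and the share
    assigned to nofollowed links is simply discarded.
    """
    followed = total_links - nofollowed
    denominator = total_links if post_change else followed
    return points / denominator

# Ten PageRank points, ten outgoing links, five of them nofollowed:
print(flow_per_followed_link(10, 10, 5, post_change=False))  # 2.0
print(flow_per_followed_link(10, 10, 5, post_change=True))   # 1.0
```

This is exactly why "PageRank sculpting" with nofollow stopped working: nofollowing half your links no longer concentrates the juice in the other half.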
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27]
Concerning broken link building, it can also sometimes be relevant to scan the whole domain (e.g. if the website is a blog within a specific niche as these often feature multiple articles closely related to the same) for broken external links using e.g. XENU, A1 Website Analyzer or similar. (Just be sure to enable checking of external links before crawling the website.)
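The first step of that scan — collecting every external link on a page before checking its status — can be done with the standard library alone. A minimal sketch (the hostnames are made up for illustration; each collected URL would then be fetched with `urllib.request` or a desktop crawler like XENU to see whether it is dead):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkExtractor(HTMLParser):
    """Collect href targets that point off-site, ready for a status check."""

    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have no netloc; only keep links to other hosts.
        if host and host != self.own_host:
            self.external.append(href)

page = '<a href="/about">About</a> <a href="https://other.example/dead">x</a>'
parser = ExternalLinkExtractor("blog.example")
parser.feed(page)
print(parser.external)  # ['https://other.example/dead']
```

Running this over every post in a niche blog gives you the candidate list of external links to verify for broken-link outreach.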
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
PR(A) = (1 - d) / N + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

where N is the total number of all pages on the web. The second version of the algorithm, indeed, does not differ fundamentally from the first one. Regarding the Random Surfer Model, the second version's PageRank of a page is the actual probability for a surfer reaching that page after clicking on many links. The PageRanks then form a probability distribution over web pages, so the sum of all pages' PageRanks will be one.
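This probability-distribution version of PageRank can be computed by straightforward power iteration. A minimal sketch on a toy three-page graph (it assumes every page has at least one outgoing link, i.e. no dangling nodes):

```python
def pagerank(links, d=0.85, iterations=50):
    """Compute PageRank by power iteration over a link graph.

    `links` maps each page to the list of pages it links to. With the
    (1 - d) / N damping term, the scores form a probability
    distribution: they sum to one, matching the Random Surfer
    interpretation. Assumes no page has zero outgoing links.
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1 / n for p in pages}          # uniform starting distribution
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each inbound page q donates pr[q] split across its outlinks.
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        pr = new
    return pr

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
print(round(sum(scores.values()), 6))  # 1.0 -- a probability distribution
```

Note how "c", which is linked from both "a" and "b", ends up with a higher score than "b", which only "a" links to — the voting behavior the surrounding text describes.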
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[40] in addition to their URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
In a number of recent articles, where I've interviewed some of social media's rising stars such as Jason Stone from Millionaire Mentor, Sean Perelstein, who built StingHD into a global brand and Nathan Chan from Foundr Magazine, amongst several others, it's quite clear that multi-million-dollar businesses can be built on the backs of wildly-popular social media channels and platforms.
Thank you, Brian, for this definitive guide. I have already signed up for HARO and have plans to implement some of your strategies. My blog is related to providing digital marketing tutorials for beginners and hence can be in your niche as well. This is so good. I highly recommend all my team members in my company read your blog every time you publish new content. 537 comments on this post within a day, you are a master of this. A great influence in the digital marketing space.

Do you regularly publish helpful, useful articles, videos or other types of media that are popular and well produced? Do you write for actual human beings rather than the search engine itself? Well, you should. Latest research from Searchmetrics on ranking factors indicates that Google is moving further towards longer-form content that understands a visitor’s intention as a whole, instead of using keywords based on popular search queries to create content.
Get a link to your pages from a high PR page and yes, some of that PageRank importance is transmitted to your page. But that doesn’t take into account the context of the link — the words in the link — the anchor text. If you don’t understand anchor text, Google Now Reporting Anchor Text Phrases from me last month will take you by the hand and explain it more.
I think it is important you distinguish your advice about no-following INTERNAL links and no-following EXTERNAL links for user-generated content. Most popular UGC-heavy sites have no-followed links as they can’t possibly police them editorially & want to give some indication to the search engines that the links haven’t been editorially approved, but still might provide some user benefit.
Being on the cutting edge of website design and development is critical to stay relevant as a leading agency, which is why our expert team uses the latest technology to ensure your websites and landing pages are easily accessed and usable across all devices. We have vast experience in ecommerce design and development, building well-optimized landing pages, conversion rate optimization, mobile websites, and responsive design. Our design team has experience in all things digital and the ability to create amazing websites, landing pages, creative for display advertising, infographics, typographic video, print ads, and much more.

Mega-sites, like http://news.bbc.co.uk, have tens or hundreds of editors writing new content – i.e. new pages – all day long! Each one of those pages has rich, worthwhile content of its own and a link back to its parent or the home page! That’s why the home page Toolbar PR of these sites is 9/10 and the rest of us just get pushed lower and lower by comparison…
You should optimize your site to serve your users' needs. One of those users is a search engine, which helps other users discover your content. Search Engine Optimization is about helping search engines understand and present content. Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we'd love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
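One simple way to auto-generate those descriptions is to take the page's lead text and truncate it at a word boundary near the typical snippet length. A hypothetical sketch (the 155-character limit is a common rule of thumb, not a Google-documented cutoff; real pipelines often template structured data instead):

```python
import re

def meta_description(page_text, limit=155):
    """Generate a description meta tag from a page's opening text.

    Collapses whitespace, then truncates at the last word boundary
    before `limit`, appending an ellipsis when text was cut.
    """
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)
    return text[: cut if cut > 0 else limit].rstrip() + "…"

desc = meta_description(
    "Orphaned kitten care, from bottle feeding to finding forever homes. " * 5
)
print(len(desc) <= 156)  # True -- fits within a typical snippet
```

Even a crude generator like this beats thousands of pages sharing one identical boilerplate description.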
It doesn’t mean that you have to advertise on these social media platforms. It means that they belong to the pyramid of channels that will function better with their support. Just secure your profiles and decide which of them suits your goal best. For example, you might choose Instagram because its audience skews toward mobile devices and responds well to visual content.

When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
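That homepage-versus-path distinction is easy to encode when normalizing URLs for a crawl or an analytics pipeline. A small sketch of the rule as stated above, using only the standard library:

```python
from urllib.parse import urlsplit

def same_url(a, b):
    """Compare two URLs, treating a trailing slash as significant only
    on non-root paths -- the homepage rule described above."""
    pa, pb = urlsplit(a), urlsplit(b)

    def norm_path(parts):
        # An empty path and "/" both mean the homepage.
        return "/" if parts.path in ("", "/") else parts.path

    return (pa.scheme, pa.netloc, norm_path(pa)) == \
           (pb.scheme, pb.netloc, norm_path(pb))

print(same_url("https://example.com", "https://example.com/"))          # True
print(same_url("https://example.com/fish", "https://example.com/fish/"))  # False
```

Canonicalizing URLs this way before deduplication avoids both falsely splitting the homepage into two pages and falsely merging `/fish` with `/fish/`.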
What is Search Engine Optimization (also known as SEO)? A broad definition is that search engine optimization is the art and science of making web pages attractive to search engines. More narrowly, SEO seeks to tweak particular factors known to affect search engine standing to make certain pages more attractive to search engines than other web pages that are vying for the same keywords or keyword phrases.
A PageRank results from a mathematical algorithm based on the webgraph, created by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or usa.gov. The rank value indicates an importance of a particular page. A hyperlink to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank metric of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself.
When an Internet user starts searching for something, he/she is trying to solve some particular problem or achieve something. Your primary aim is to help them find a good solution. Don’t be obsessed with search volume only. Think about the user’s needs. There is no difference between 40,000-word and 1,000-word posts and articles when we speak about their value. Try to create high-quality content and don’t pay any attention to certain stereotypes.
What is a useful place in search results? Ideally, you need to be in the top three search results returned. More than 70% of searches are resolved in these three results, while 90% are resolved on the first page of results. So, if you’re not in the top three, you’re going to find you’re missing out on the majority of potential business—and if you’re not on the first page, you’re going to miss out on nearly all potential business.
One final note is that if the links are not directly related to the subject, or you have no control over them, such as commenters’ website links, maybe you should consider putting them on another page, which links to your main content. That way you don’t leak page rank, and still gain hits from search results from the content of the comments. I may be missing something but this seems to mean that you can have your cake and eat it, and I don’t even think it is gaming the system or against the spirit of it. You might even gain a small sprinkling of page rank if the comment page accumulates any of its own.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]

Journalists and writers are always on the lookout for experts to contribute quotes for their articles. Some (but not all) will include backlinks to their sources’ websites. Getting quotes in media outlets is a great way to not only get backlinks, but also build credibility within your industry. Even in instances where you don't get backlinks, this profile page for PMM's CEO Josh Rubin is a good example of how you can showcase your media appearances - something which both Google and your clients value when it comes to evaluating your authority.
Ah – well the Reasonable Surfer is a different patent (and therefore a different algorithm) to PageRank. I would imagine that initially, only the first link counted – simply because there either IS or IS NOT a relationship between the two nodes. This meant it was a binary choice. However, at Majestic we certainly think about two links between page A and page B with separate anchor texts… in this case, with a binary choice, either the data on the second link would need to be dropped or the number of backlinks can start to get bloated. I wrote about this on Moz way back in 2011!
Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site. Rather, you need to have a large number of QUALITY inbound links. This tool searches for websites that have a related theme to your website which are likely to add your link to their website. You specify a particular keyword or keyword phrase, and then the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and making the job easier in the process.

When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.
Well, to make things worse, website owners quickly realized they could exploit this weakness by resorting to “keyword stuffing,” a practice that simply involved creating websites with massive lists of keywords and making money off of the ad revenue they generated. This made search engines largely worthless, and weakened the usefulness of the Internet as a whole. How could this problem be fixed?
If (a) is correct that looks like bad news for webmasters, BUT if (b) is also correct then – because PR is ultimately calculated over the whole of the web – every page loses out relative to every other page. In other words, there is less PR on the web as a whole and, after a sufficient number of iterations in the PR calculation, normality is restored. Is this correct?
However, before learning any of that, it's important that you get a lay of the land, so to speak. If you truly want to understand the field of internet marketing, Sharpe has some very good points. In essence there are four overall steps to really understanding internet marketing and leveraging the industry to make money online. Depending on where you are with your education, you'll be somewhere along the lines of these four steps.
Andy Beard, I was only talking about the nofollow attribute on individual links, not noindex/nofollow as a meta tag. But I’ll check that out. Some parts of Thesis I really like, and then there’s a few pieces that don’t quite give me the granularity I’d like. As far as page size, we can definitely crawl much more than 101KB these days. In my copious spare time I’ll chat with some folks about upping the number of links in that guideline.
PageRank was once available for verified site maintainers through the Google Webmaster Tools interface. However, on October 15, 2009, a Google employee confirmed that the company had removed PageRank from its Webmaster Tools section, saying that "We've been telling people for a long time that they shouldn't focus on PageRank so much. Many site owners seem to think it's the most important metric for them to track, which is simply not true."[67] In addition, the PageRank indicator is not available in Google's own Chrome browser.
Content is king. Your content needs to be written so that it provides value to your audience. It should be a mix of long and short posts on your blog or website. You should not try to “keyphrase stuff” (mentioning a keyphrase over and over again to try and attract search engines) as this gets penalized by search engines now. However, your text should contain the most important keyphrases at least once and ideally two to three times—ideally, it should appear in your title. However, readability and value are much more important than keyword positioning today.
All of the examples above and more could be used as anchor text for the same backlink. Google will index each differently. Not only that, Google will even examine the few words before and after the anchor text, as well as take into account all of the text on the page. It will also attribute more value to the first backlink on the page and diminish the value of each following link.
I’ve seen so many cases of webmasters nofollowing legitimate external links it is not funny. Any external link on their site is nofollowed, even when quoting text on the other site. IMO, the original purpose of nofollow has long been defeated in specific industries. As more webmasters continue doing everything they can to preserve their pagerank, the effectiveness of nofollow will continue to erode.
In my experience this means (the key words are “not the most effective way”) that a page not scored by Google (“e.g. my private link” – password protected, disallowed via robots.txt and/or noindex meta robots), whether or not it uses the rel=”nofollow” attribute in its links, is not factored into anything… because Google can’t factor in something it isn’t allowed to see.

NOTE: You may be curious what your site’s or your competitor’s PR score is. But Google no longer reveals the PageRank score for websites. It used to display at the top of web browsers right in the Google Toolbar, but no more. And PR data is no longer available to developers through APIs, either. Even though it’s now hidden from public view, however, PageRank remains an important ingredient in Google’s secret ranking algorithms. 

Meanwhile, the link spam began. People chasing higher PageRank scores began dropping links wherever they could, including into blog posts and forums. Eventually, it became such an issue that demands were raised that Google itself should do something about it. Google did in 2005, getting behind the nofollow tag, a way to prevent links from passing along PageRank credit.

I don’t get it; it seems Google is constantly making rules & regulations as they see fit. I don’t try to “manipulate” any links we have on our site or for any clients we work for. Links take time, period. No way around it. But now this explanation gives more fuel to all the Google bashers out there. I recently read an article about how Guy Kawasaki has been “loaned” one, two, three cars in three years & is still within Google’s guidelines? Makes me wonder how many rules and regulations are broken. My take is: do your job right, and don’t worry what Google is doing. If content is king, then everything will fall into place naturally.


Email marketing is the practice of nurturing leads and driving sales through email communications with your customers. Like social media, the goal is to remind users that you’re here and your product is waiting. Unlike social media, however, you can be a lot more aggressive with your sales techniques, as people expect that email marketing will contain offers, product announcements and calls to action.
Bob Dole (interesting name), you’re certainly welcome to use Bing if you prefer, but before you switch, you might check whether they do similar things. I know that Nate Buggia has strongly recommended not to bother with PageRank sculpting in the past, for example, or at least that was my perception from his comments at the last couple SMX Advanced conferences.

Okay, if you're still with me, fantastic. You're one of the few that doesn't mind wading through a little bit of hopeless murkiness to reemerge on the shores of hope. But before we jump too far ahead, it's important to understand what online marketing is and what it isn't. That definition provides a core understanding of what it takes to peddle anything on the web, whether it's a product, service or information.
“Google itself solely decides how much PageRank will flow to each and every link on a particular page. The number of links doesn’t matter. Google might decide some links don’t deserve credit and give them no PageRank. The use of nofollow doesn’t “conserve” PageRank for other links; it simply prevents those links from getting any PageRank that Google otherwise might have given them.”
I think it is important you distinguish your advice about nofollowing INTERNAL links from nofollowing EXTERNAL links in user-generated content. Most popular UGC-heavy sites have nofollowed links because they can’t possibly police them editorially & want to give some indication to the search engines that the links haven’t been editorially approved, but the links still might provide some user benefit.
Also, by means of the iterative calculation, the sum of all pages' PageRanks still converges to the total number of web pages, so the average PageRank of a web page is 1. The minimum PageRank of a page is given by (1-d). There is also a maximum PageRank for a page, given by dN+(1-d), where N is the total number of web pages. This maximum can theoretically occur if all web pages solely link to one page, and this page also solely links to itself.
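These limits are easy to check numerically. Below is a minimal sketch (not Google's actual implementation) of the classic power iteration using the sum-normalized formula PR(p) = (1-d) + d·Σ PR(q)/L(q). The graph is a made-up worst case: ten pages all linking solely to page 0, which links only to itself, which should drive page 0 toward the theoretical maximum dN+(1-d).

```python
# Minimal PageRank power iteration on a hypothetical worst-case graph:
# pages 1..N-1 link only to page 0, and page 0 links only to itself.
# Sum-normalized formula: PR(p) = (1-d) + d * sum(PR(q)/L(q)) over inlinks q.

def pagerank(links, d=0.85, iters=100):
    """links[i] is the list of pages that page i links to."""
    n = len(links)
    pr = [1.0] * n                      # start every page at the average, 1
    for _ in range(iters):
        new = [1.0 - d] * n             # the (1-d) baseline share
        for src, outs in enumerate(links):
            for dst in outs:
                new[dst] += d * pr[src] / len(outs)
        pr = new
    return pr

N = 10
links = [[0] for _ in range(N)]         # every page points solely at page 0

pr = pagerank(links)
print(round(sum(pr), 4))                # total converges to N (here 10.0)
print(round(pr[0], 4))                  # maximum: d*N + (1-d) = 8.65
print(round(pr[1], 4))                  # minimum: 1-d = 0.15
```

With d=0.85 and N=10, page 0 converges to 8.65 and every other page sits at the 0.15 floor, matching the dN+(1-d) and (1-d) bounds above.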
On a blog, the PageRank should go to the main article pages. Now it just gets “evaporated” if you use “nofollow,” or scattered to all the far-flung nooks and crannies, which means Google will not be able to see the wood for the trees. The vast majority of a site’s overall PageRank will now reside in the long tail of useless pages such as commenters’ profile pages. This can only make it harder for Google to serve up the most relevant pages.