An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.
Internet Marketing Inc. provides integrated online marketing strategies that help companies grow. We think of ourselves as a business development consulting firm that uses interactive marketing as a tool to increase revenue and profits. Our management team has decades of combined experience in online marketing as well as graduate level education and experience in business and finance. That is why we focus on creating integrated online marketing campaigns designed to maximize your return on investment.
Do you regularly publish helpful, useful articles, videos or other types of media that are popular and well produced? Do you write for actual human beings rather than the search engine itself? Well, you should. The latest research from Searchmetrics on ranking factors indicates that Google is moving further towards longer-form content that understands a visitor’s intention as a whole, instead of using keywords based on popular search queries to create content.
Well, to make things worse, website owners quickly realized they could exploit this weakness by resorting to “keyword stuffing,” a practice that simply involved creating websites with massive lists of keywords and making money off of the ad revenue they generated. This made search engines largely worthless, and weakened the usefulness of the Internet as a whole. How could this problem be fixed?
nofollow is beyond a joke now. There is so much confusion (especially when other engines’ treatment is factored in), I don’t know how you expect a regular publisher to keep up. The expectation seems to have shifted from “Do it for humans and all else will follow” to “Hang on our every word, do what we say, if we change our minds then change everything,” and nofollow led the way. I could give other examples of this attitude (e.g. “We don’t follow JavaScript links so it’s ‘safe’ to use those for paid links”), but nofollow is surely the worst.
Some business owners will think of a website. Others may think of social media, or blogging. In reality, all of these avenues of advertising fall into the category of internet marketing, and each is like a puzzle piece in a much bigger marketing picture. Unfortunately, for new business owners trying to establish their web presence, there are a lot of puzzle pieces to manage.

“With 150 million pages, the Web had 1.7 billion edges (links).” Kevin Heisler, that ratio holds true pretty well as the web gets bigger. A good rule of thumb is that the number of links is about 10x the number of pages. I agree that it’s pretty tragic. Rajeev Motwani was a co-author of many of those early papers. I got to talk to Rajeev a little bit at Google, and he was a truly decent and generous man. What has heartened me is to see all the people that he helped, and to see those people pay their respects online. No worries on the Consumer WebWatch–I’m a big fan of Consumer WebWatch, and somehow I just missed their blog. I just want to reiterate that even though this feels like a huge change to a certain segment of SEOs, in practical terms this change really doesn’t affect rankings very much at all.
Submit website to directories (limited use). Professional search marketers don’t submit the URL to the major search engines, but it’s possible to do so. A better and faster way is to get links back to your site naturally. Links get your site indexed by the search engines. However, you should submit your URL to directories such as Yahoo! (paid) and DMOZ (free). Some may choose to include AdSense scripts on a new site to get the Google Media bot to visit. It will likely get your pages indexed quickly.
The best strategy to get backlinks is to create great content and let other people promote your content. However, to get started, you can create your own links to content on your social media platform, ask your friends to share your content on their websites and social media, and if you can find questions in forums that your content answers, you can always post it there.
To answer your question, David, take a look at Jim’s comment below. Yes, you can and SHOULD optimize PR by directing link equity at important pages and internally linking within a theme. PageRank is a core part of the Google ranking algo. We don’t get visibility into PageRank as a number or score, but you need to know about the concept in order to direct your internal, strategic linking and navigation.
Danny, I was on the panel where Matt suggested that, and I point-blank asked on stage what would happen when folks started abusing the tactic and Google changed their mind, if you recall (at the time, I’d seen some of the things being done that I knew Google would clarify as abuse, and was still a nofollow unenthusiast as a result). And Matt dismissed it. So, I think you can take home two important things from that – 1. SEO tactics can always change regardless of who first endorses them and 2. Not everything Matt says is etched in stone. <3 ya Matt.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
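As a quick illustration of the first point above — that robots.txt is advisory, not enforcement — here is a minimal sketch using Python's standard `urllib.robotparser`. The robots.txt content and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that tries to "hide" a private directory.
robots_txt = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler checks permission before fetching:
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/index.html"))    # True

# Note: nothing here stops a browser or a rogue bot from requesting
# /private/report.html directly, and the robots.txt file itself
# publicly advertises the path you wanted hidden.
```

The parser only answers "may I fetch this?" for cooperative clients; access control has to happen on the server (authentication, noindex headers, etc.).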
Thanks a lot for all of those great tips you handed out here. I immediately went to work applying the strategies that you mentioned. I will keep you posted on my results. I have been offering free SEO services to all of my small business bookkeeping clients as a way of helping them to grow their businesses. Many of them just don’t have the resources required to hire an SEO guru to help them but they need SEO bad. I appreciate the fact that you share your knowledge and don’t try to make it seem like it’s nuclear science in order to pounce on the innocent. All the best to you my friend!
Adjusting how Google treats nofollows is clearly a major shift (as the frenzy in the SEO community has demonstrated). So, if Google were to adjust how they treat nofollows they would need to phase it in gradually. I believe this latest (whether in 2008 or 2009) change is simply a move in the direction of greater changes to come regarding nofollow. It is the logical first step.

Secondly, nofollow is also essential on links to off-topic pages, whether they’re internal or external to your site. You want to prevent search engines from misunderstanding what your pages are about. Linking relevant pages together reinforces your topic relevance. So to keep your topic silos clear, strategic use of the nofollow attribute can be applied when linking off-topic pages together.

Social media is a mixed bag when it comes to backlinks. There is a modicum of value, as social media sites allow you to link to your website in your profile. However, these days Facebook, Twitter, and other social media sites mark links as 'nofollow,' meaning that they don't pass SEO value (sometimes referred to as "link juice") to the linked site. These links won't do anything to boost your site's performance in search results.
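To see what a nofollow annotation looks like in markup, here is a small sketch using Python's standard `html.parser` that separates followed links from `rel="nofollow"` links; the HTML snippet and the `LinkAudit` class name are hypothetical:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collects hrefs, splitting them by presence of rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rels = (attrs.get("rel") or "").split()
        (self.nofollowed if "nofollow" in rels else self.followed).append(href)

page = '''
<a href="https://example.com/article">editorial link</a>
<a href="https://example.com/profile" rel="nofollow">social profile link</a>
'''

audit = LinkAudit()
audit.feed(page)
print(audit.followed)    # ['https://example.com/article']
print(audit.nofollowed)  # ['https://example.com/profile']
```

This mirrors what social platforms do on their side: the link is still clickable for humans, but the `rel="nofollow"` hint tells crawlers not to pass ranking credit through it.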
In order to do all that, you will need to acquire and apply knowledge in human psychology. If you understand how your customers think, you can design for their needs. This course is based on tried and tested psychological techniques that bring together content and design so as to deliver hands-on advice for how to improve your web design and increase your customer engagement.

Suppose instead that page B had a link to pages C and A, page C had a link to page A, and page D had links to all three pages. Thus, upon the first iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Page C would transfer all of its existing value, 0.25, to the only page it links to, A. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A. At the completion of this iteration, page A will have a PageRank of approximately 0.458.
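The arithmetic above can be checked with a short sketch of one simplified PageRank iteration (no damping factor). The link graph matches the example, and page A's own outbound links are left empty because the example does not use them:

```python
# One simplified PageRank iteration for the four-page example:
# B links to C and A, C links to A, D links to A, B and C.
links = {
    "A": [],              # A's outlinks are not used in this step
    "B": ["C", "A"],
    "C": ["A"],
    "D": ["A", "B", "C"],
}

rank = {page: 0.25 for page in links}  # uniform initial distribution

new_rank = {page: 0.0 for page in links}
for page, outlinks in links.items():
    if outlinks:
        share = rank[page] / len(outlinks)  # split value evenly among outlinks
        for target in outlinks:
            new_rank[target] += share

print(round(new_rank["A"], 3))  # 0.458  (0.125 + 0.25 + 0.083)
```

Page A collects half of B's value, all of C's, and a third of D's, matching the approximately 0.458 computed in the text.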
As Rogers pointed out in his classic paper on PageRank, the biggest takeaway for us about the eigenvector piece is that it’s a type of math that lets you work with multiple moving parts. “We can go ahead and calculate a page’s PageRank without knowing the final value of the PR of the other pages. That seems strange but, basically, each time we run the calculation we’re getting a closer estimate of the final value. So all we need to do is remember each value we calculate and repeat the calculations lots of times until the numbers stop changing much.”
A generalization of PageRank for the case of ranking two interacting groups of objects was described in [32]. In applications it may be necessary to model systems having objects of two kinds, where a weighted relation is defined on object pairs. This leads to considering bipartite graphs. For such graphs, two related positive or nonnegative irreducible matrices corresponding to the vertex partition sets can be defined. One can compute rankings of objects in both groups as eigenvectors corresponding to the maximal positive eigenvalues of these matrices. Normed eigenvectors exist and are unique by the Perron–Frobenius theorem. Example: consumers and products, where the relation weight is the product consumption rate.
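One common way to make this concrete (the notation below is my own sketch, not taken from the cited source): collect the relation weights into a matrix $W$, one row per object in the first group and one column per object in the second.

```latex
% Hypothetical notation: W_{ij} is the relation weight, e.g. the
% consumption rate of product j by consumer i.
% Two related nonnegative matrices, one per vertex partition set:
M_c = W W^{\top}, \qquad M_p = W^{\top} W .
% If these are irreducible, the Perron--Frobenius theorem guarantees a
% unique normalized positive eigenvector for the maximal eigenvalue of each:
M_c \, x = \lambda_{\max}(M_c)\, x , \qquad M_p \, y = \lambda_{\max}(M_p)\, y ,
% and the normed eigenvectors x and y rank the two groups of objects.
```

This is only a sketch of the general construction; the cited work may define the pair of matrices differently (e.g. with normalization or damping), but the eigenvector-based ranking principle is the same.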

Non-profit corporations and political entities use Internet marketing to raise awareness about the issues they address and engage individuals in their campaigns. They strongly favor social networking platforms because they are more personal than websites and they are easy to share, increasing the “viral” word-of-mouth effect that is so prevalent in online media.

I won’t blame MC. Google knows what they’re doing. These are things that webmasters need not worry about. Well, it won’t make much difference as far as I think. I don’t use nofollow tags specifically – I use WP for blogging purposes and it does the rest of the things for me other than writing content, which I do. I think it is the content and the external links that sites point to which should be considered. I mean, if a computer blog owner posts a really fantastic article about something related to computers, and also puts some links to external pages (which are really useful for the readers), then that post should be ranked high in Google – and I think Google does this well. So, webmasters, just concentrate on your websites/blogs etc. and leave the rest to Big G.
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile ready site is critical to your online presence. In fact, starting in late 2016, Google has begun experiments to primarily use the mobile version of a site's content[41] for ranking, parsing structured data, and generating snippets.
Thanks for the post Chelsea! I think Google is starting to move further away from PageRank, but I do agree that a higher amount of links doesn’t necessarily mean a higher rank. I’ve seen many try to shortcut the system and end up spending weeks undoing these “shortcuts.” I wonder how much weight PageRank still holds today, considering the algorithms Google continues to put out there to provide more relevant search results.
I have to take my hat off to your content – not just for the tips you’ve given that have helped me with my websites, but for how clearly you can write. May I ask, what books or resources have inspired and influenced your writing and content creation the most? The two best books I’ve read so far to improve my writing are On Writing Well and Letting Go of the Words.
Nashville Grant, here’s the mental model I’d employ: search engines want to return great content. If you make such a fantastic site that all the web has heard of you, search engines should normally reflect that fact and return your site. A lot of bad SEO happens because people say “I’ll force my way to the top of Google first, and then everyone will find out about my site.” Putting rankings before the creation of a great site is in many ways putting the cart before the horse. Often the search rankings follow from the fact that you’re getting to be well-known on the web completely outside the sphere of search. Think about sites like Twitter and Facebook–they succeed by chasing a vision of what users would want. In chasing after that ideal of user happiness and satisfaction, they became the sort of high-quality sites that search engines want to return, because we also want to return what searchers will find useful and love. By chasing a great user experience above search rankings, many sites turn out to be what search engines would want to return anyway.
The PageRank formula also contains a damping factor, d. According to PageRank theory, an imaginary surfer clicks links at random and at some point gets bored and stops clicking. The probability that the surfer continues clicking at any given step is the damping factor. This factor is introduced to stop some pages from having too much influence; their total vote is damped down by multiplying it by 0.85 (the generally assumed value).
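As a rough sketch of how the damped calculation is iterated, the per-page update below uses the common form PR(p) = (1 − d)/N + d · Σ PR(q)/L(q), summed over pages q linking to p, where L(q) is q's outlink count. The four-page link graph is hypothetical:

```python
# Power-iteration sketch of PageRank with damping factor d = 0.85.
# Hypothetical link graph: every page has at least one outlink,
# so the total rank mass stays at 1 across iterations.
damping = 0.85

links = {
    "A": ["B"],
    "B": ["C", "A"],
    "C": ["A"],
    "D": ["A", "B", "C"],
}

pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}  # uniform start

for _ in range(50):  # repeat until the numbers stop changing much
    new_rank = {}
    for p in pages:
        # sum of PR(q)/L(q) over every page q that links to p
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / n + damping * incoming
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})  # values sum to ~1
```

Note how page D, which nothing links to, bottoms out at (1 − d)/N: the damping term guarantees every page keeps a small baseline score instead of decaying to zero.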

NOTE: You may be curious what your site’s or your competitor’s PR score is. But Google no longer reveals the PageRank score for websites. It used to display at the top of web browsers right in the Google Toolbar, but no more. And PR data is no longer available to developers through APIs, either. Even though it’s now hidden from public view, however, PageRank remains an important ingredient in Google’s secret ranking algorithms.
Outreach to webmasters should be personalized. Listing reasons why you like their brand, explaining why your brand would partner well with theirs, or citing articles and other content they published are all great ways to make them more receptive. Try to find an actual point of contact on professional sites like LinkedIn. A generic blast of “Dear Webmaster…” emails is really just a spam campaign.

In today’s world, QUALITY is more important than quantity. Google penalties have caused many website owners to not only stop link building, but start link pruning instead. Poor quality links (i.e., links from spammy or off-topic sites) are like poison and can kill your search engine rankings. Only links from quality sites, and pages that are relevant to your website, will appear natural and not be subject to penalty. So never try to buy or solicit links — earn them naturally or not at all.
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal your changes depending on the user-agent. If you are using separate URLs, signal the relationship between the two URLs by adding link tags with rel="canonical" and rel="alternate" elements.

Re: Cameron’s Comment. Google transparent? Maybe. Great products for users – yes… but they operate from lofty towers. Can’t get a hold of them. Can’t contact them. They are the ONLY company in the world with zero customer support for their millions of users. Who really knows what they are doing from one month to the next in regards to ranking sites… etc.
I don’t know how you do it without having a strong team of employees building backlinks for you. I love your blog and all the guidance you provide. I have found trying to build backlinks on your own is one of the most time consuming activities there is. Obviously if you have a specific product or service you are wishing to share getting more customers and visitors to your business is essential. You make it look easy. Thanks again for all your guidance.
However, before learning any of that, it's important that you get a lay of the land, so to speak. If you truly want to understand the field of internet marketing, Sharpe has some very good points. In essence there are four overall steps to really understanding internet marketing and leveraging the industry to make money online. Depending on where you are with your education, you'll be somewhere along the lines of these four steps.
All of the examples above and more could be used as anchor text for the same backlink. Google will index each differently. Not only that, Google will even examine the few words before and after the anchor text, and take into account all of the text on the page. It will also attribute more value to whichever backlink appears first on the page, diminishing the value of each subsequent link.
Thanks for the info on nofollow and PageRank. It makes sense that this will always be a moving target, lest everyone eventually game the system until it’s worthless, but at the same time it’s worth it to know a few tricks. I still have open concerns about how freshness of content factors in; the only time I’m ever annoyed by search results these days is when the only links available (on the first page at least) are articles from 4 years ago.
Well, it seems that what this article says is that the purpose of the nofollow link is to take away spammers’ motivation to post spam comments for the sake of the link and the associated PageRank flow; the purpose of nofollow was never to provide a means to control where a page’s PageRank flow is directed. It doesn’t seem that shocking to me, folks.
Today, with nearly half the world's population wired to the internet, the ever-increasing connectivity has created global shifts in strategic thinking and positioning, disrupting industry after industry, sector after sector. Seemingly, with each passing day, some new technological tool emerges that revolutionizes our lives, further deepening and embedding our dependence on the world wide web.
Cross-platform measurement: The number of marketing channels continues to expand, and measurement practices are growing in complexity. A cross-platform view must be used to unify audience measurement and media planning. Market researchers need to understand how the omni-channel affects consumers' behaviour, although when advertisements are on a consumer's device this does not get measured. Significant aspects of cross-platform measurement involve de-duplication and understanding that you have reached an incremental level with another platform, rather than delivering more impressions against people that have previously been reached (Whiteside, 2016).[42] An example is ‘ESPN and comScore partnered on Project Blueprint, discovering the sports broadcaster achieved a 21% increase in unduplicated daily reach thanks to digital advertising’ (Whiteside, 2016).[42] The television and radio industries are electronic media that compete with digital and other technological advertising. Yet television advertising is not directly competing with online digital advertising, since it is able to cross platforms with digital technology. Radio also gains power through cross-platform reach, in online streaming content. Television and radio continue to persuade and affect the audience across multiple platforms (Fill, Hughes, & De Franceso, 2013).[45]
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
Before online marketing channels emerged, the cost to market products or services was often prohibitively expensive, and traditionally difficult to measure. Think of national television ad campaigns, which are measured through consumer focus groups to determine levels of brand awareness. These methods are also not well-suited to controlled experimentation. Today, anyone with an online business (as well as most offline businesses) can participate in online marketing by creating a website and building customer acquisition campaigns at little to no cost. Those marketing products and services also have the ability to experiment with optimization to fine-tune their campaigns’ efficiency and ROI.
Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people that do not run blog sites (the vast majority of sites!), for example an e-commerce site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing, because they can’t attract links naturally?
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmasters website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
Google will like your content if your clients like it. The content should be helpful, avoid rehashing information the reader already knows, and meet their expectations. When users vote for your site, Google starts accepting it as an authority site. That’s why content writing is as important as a speech by a candidate for the Presidency. The better it is, the more visitors you have.

Assume a small universe of four web pages: A, B, C and D. Links from a page to itself are ignored. Multiple outbound links from one page to another page are treated as a single link. PageRank is initialized to the same value for all pages. In the original form of PageRank, the sum of PageRank over all pages was the total number of pages on the web at that time, so each page in this example would have an initial value of 1. However, later versions of PageRank, and the remainder of this section, assume a probability distribution between 0 and 1. Hence the initial value for each page in this example is 0.25.
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.
Hi Brian, thank you for sharing these awesome backlinking techniques. My site is currently not ranking well. It used to be, sometime mid last year, but it suddenly got de-ranked. Not really sure why. I haven’t been participating in any blackhat techniques or anything at all. I’ll try a few of your tips and hopefully it will help my site get back into shape.

As an avid reader of [insert their site name], I love reading anything you write about, such as [insert article on their website], and anything you link out to. Sadly, I couldn’t find the article you were trying to link to, but I did happen to find another good webpage on the same topic: [insert url to webpage that you are building links to]. You should check it out, and if you like it, you probably want to switch the links.

In my view there is nothing wrong with saying ‘hey Google, these pages are not important from a search engine perspective, let me not give them so much weight’. Regardless of how Google now views these type of pages from a weight perspective, doing the above as a webmaster should be logical and encouraged. You have said this yourself at least a few times in the past.
Yep, please change things to stop keyword stuffing. Change them to stop cloaking. Definitely change them to stop buying links that try to game Google. But telling search engines not to give weight (that I control) to pages that are not what my site is about, or are not really relevant? No way. This is logical stuff here. Maybe too logical. I think deep down you know this too, Matt.