This isn't about off-the-shelf solutions. You need to convey something distinctive and beautiful, then fill it with incredible MVP content. Over time, this will become a thriving hub of activity for you, where people will come by and check in repeatedly to see what you're talking about and what value you're delivering. Keep in mind that this won't happen quickly. It will take years. Yes, I said years.
SEO is also about making your search engine result relevant to the user's search query so that more people click the result when it is shown. In this process, text snippets and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, earning a high CTR (click-through rate) from search results.
Okay. Okay. There is a lot to learn. However, everyone has to start somewhere. If you're just being introduced to internet marketing, and you've become bedazzled by the glitz and the glamor of the top online income earners, know that it's not going to be easy to replicate their success. Be sure that you set your expectations the proper way. As long as you stay persistent, you can achieve your goals of generating healthy amounts of money online without becoming the victim of a scam.
For example, if a webmaster has a website about how to rescue orphaned kittens and receives a backlink from another website about kittens, that link would count as more relevant in a search engine's assessment than, say, a link from a site about car racing. The more relevant the linking site is to your website, the better the quality of the backlink.
Ah – well the Reasonable Surfer is a different patent (and therefore a different algorithm) from PageRank. I would imagine that initially, only the first link counted – simply because there either IS or IS NOT a relationship between the two nodes. This meant it was a binary choice. However, at Majestic we certainly think about two links between page A and page B with separate anchor texts… in this case, with a binary choice, either the data on the second link would need to be dropped or the number of backlinks starts to get bloated. I wrote about this on Moz way back in 2011!
Andy Beard, I was only talking about the nofollow attribute on individual links, not noindex/nofollow as a meta tag. But I’ll check that out. Some parts of Thesis I really like, and then there’s a few pieces that don’t quite give me the granularity I’d like. As far as page size, we can definitely crawl much more than 101KB these days. In my copious spare time I’ll chat with some folks about upping the number of links in that guideline.
While there are several platforms for video marketing, YouTube is clearly the most popular. However, video marketing is also a great form of both content marketing and SEO in its own right. It can provide visibility for several different ventures, and if the video is valuable enough in its message and content, it will be shared and liked by droves, pushing the authority of that video through the roof.
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
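As a concrete illustration, a minimal robots.txt covering the cases mentioned above might look like this (the domain and paths are hypothetical; each site chooses its own):

```
# https://example.com/robots.txt — must live in the root directory of the domain.
# "User-agent: *" applies the rules to all crawlers.
User-agent: *

# Keep the shopping cart and internal search results out of the crawl,
# in line with Google's March 2007 guidance on internal search pages.
Disallow: /cart/
Disallow: /search/
```

A page that should stay out of the index entirely can additionally carry <meta name="robots" content="noindex"> in its HTML head, since robots.txt only controls crawling, not indexing of pages discovered through links.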
I think it is important you distinguish your advice about no-following INTERNAL links and no-following EXTERNAL links for user-generated content. Most popular UGC-heavy sites have no-followed links as they can’t possibly police them editorially & want to give some indication to the search engines that the links haven’t been editorially approved, but still might provide some user benefit.

As Rogers pointed out in his classic paper on PageRank, the biggest takeaway for us about the eigenvector piece is that it’s a type of math that lets you work with multiple moving parts. “We can go ahead and calculate a page’s PageRank without knowing the final value of the PR of the other pages. That seems strange but, basically, each time we run the calculation we’re getting a closer estimate of the final value. So all we need to do is remember each value we calculate and repeat the calculations lots of times until the numbers stop changing much.”
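Rogers’ description maps directly to a short program. The sketch below runs that repeated calculation over a small, made-up four-page link graph; the graph itself, the damping factor of 0.85, and the iteration count are illustrative assumptions, not values from the paper:

```python
# Iterative PageRank estimation, as Rogers describes: start with a guess
# for every page, then repeat the update until the numbers stop changing much.

links = {            # page -> pages it links out to (a made-up example graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}   # initial guess: uniform
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Each linking page passes on its current PR, split evenly
            # across its outgoing links.
            incoming = sum(
                pr[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_pr[page] = (1 - damping) / len(pages) + damping * incoming
        pr = new_pr                              # closer estimate each pass
    return pr

ranks = pagerank(links)
# Page "C" receives the most inbound links, so it ends up with the highest score,
# even though its value was computed from estimates of the other pages' values.
```

The point of the eigenvector framing is exactly what this loop shows: each page’s score depends on every other page’s score, yet simple repeated substitution converges to a stable answer.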
If the assumption here is that webmasters will remove the nofollow attributes in response to this change, then why did it take “more than a year” for someone from Google to present this information to the public? It seems that if this logic had anything at all to do with the decision to change the nofollow policy, Google would have announced it immediately in order to “encourage” webmasters to change their linking policies and allow access to their pages with “high-quality information.”
Google will like your content if your clients like it. The content should be helpful and shouldn’t rehash information the reader already knows; it should meet their expectations. When users vote for your site, Google starts accepting it as an authority site. That’s why content writing is as important as a presidential candidate’s speech: the better it is, the more visitors you have.
SEO experts have a really bad habit: They like to throw around strange words and industry jargon when they talk to customers without checking to make sure that their clients understand the topic at hand. Some do this intentionally to paper over the fact that they use black hat techniques that will ultimately hurt their customers. But for most, it’s simply a matter of failing to recognize that part of their job is to educate their clients.
I suppose for those of us who just keep trying to do our best and succeed, we need to keep trusting that Google is doing all it can to weed out irrelevant content and surface the quality goods with changes such as this. Meanwhile the “uneducated majority” will just have to keep getting educated or get out of the game, I suppose.
So, as you build a link, ask yourself, "am I doing this for the sake of my customer or as a normal marketing function?" If not, and you're buying a link, spamming blog comments, posting low-quality articles and whatnot, you risk Google penalizing you for your behavior. This could be as subtle as a drop in search ranking, or as harsh as a manual action, getting you removed from the search results altogether!
For instance, you might use Facebook’s Lookalike Audiences to get your message in front of an audience similar to your core demographic. Or, you could pay a social media influencer to share images of your products to her already well-established community. Paid social media can attract new customers to your brand or product, but you’ll want to conduct market research and A/B testing before investing too much in one social media channel.

I like that you said you let PageRank flow freely throughout your site. I think that’s good, and I’ve steered many friends and clients to WordPress for their websites for this very reason. With WordPress, it seems obvious that each piece of content has an actual home (permalinks), so it would seem logical that Google and other search engines will figure out that structure pretty easily.


Data-driven advertising: users generate a lot of data at every step of the customer journey, and brands can now use that data to activate their known audience with data-driven programmatic media buying. Without exposing customers' privacy, user data can be collected from digital channels (e.g., when a customer visits a website, reads an e-mail, or launches and interacts with a brand's mobile app); brands can also collect data from real-world customer interactions, such as brick-and-mortar store visits, and from CRM and sales-engine datasets. Also known as people-based marketing or addressable media, data-driven advertising empowers brands to find their loyal customers in their audience and deliver, in real time, much more personal communication that is highly relevant to each customer's moment and actions.[37]
1. Now that we know that weight/PageRank/whatever will disappear (outside of the intrinsic wastage method that Google applies) when we use a ‘nofollow’ link, what do you think this will do to linking patterns? This is really a can of worms from an outbound-linking and internal-linking perspective. Will people still link to their ‘legals’ page from every page on their site? Turning comments ‘off’ will also be pretty tempting. I know this will devalue the sites in general, but we are not always dealing with logic here, are we? (If we were, you (as head of the web spam team) wouldn’t have had to change many things in the past. Changing the PageRank sculpting thing just being one of them.)
Now that you know that backlinks are important, how do you acquire links to your site? Link building is still critical to the success of any SEO campaign when it comes to ranking organically. Backlinks today are much different than they were 7-8 years ago. Simply having thousands of backlinks, or having links from only one website, isn’t going to affect your rank position. There are also many ways to manage and understand your backlink profile. Majestic, Buzzstream, and Moz offer tools to help you manage and optimize your link profile. seoClarity offers an integration with Majestic, the largest link index database, that integrates link profile management into your entire SEO lifecycle.

At the time I was strongly advocating PageRank sculpting by including nofollow on “related product” links. It’s interesting to note that my proposed technique would perhaps have worked for a little while and then lost its effectiveness. Eventually I reached the point where my efforts delivered diminishing returns, which was perhaps unavoidable.


“With 150 million pages, the Web had 1.7 billion edges (links).” Kevin Heisler, that ratio holds true pretty well as the web gets bigger. A good rule of thumb is that the number of links is about 10x the number of pages. I agree that it’s pretty tragic that Rajeev Motwani was a co-author of many of those early papers. I got to talk to Rajeev a little bit at Google, and he was a truly decent and generous man. What has heartened me is to see all the people that he helped, and to see those people pay their respects online. No worries on the Consumer WebWatch–I’m a big fan of Consumer WebWatch, and somehow I just missed their blog. I just want to reiterate that even though this feels like a huge change to a certain segment of SEOs, in practical terms this change really doesn’t affect rankings very much at all.
In 2005, in a pilot study in Pakistan, Structural Deep Democracy, SD2,[61][62] was used for leadership selection in a sustainable agriculture group called Contact Youth. SD2 uses PageRank for the processing of the transitive proxy votes, with the additional constraints of mandating at least two initial proxies per voter, and that all voters are proxy candidates. More complex variants can be built on top of SD2, such as adding specialist proxies and direct votes for specific issues, but SD2, as the underlying umbrella system, mandates that generalist proxies always be used.
Well, to make things worse, website owners quickly realized they could exploit this weakness by resorting to “keyword stuffing,” a practice that simply involved creating websites with massive lists of keywords and making money off of the ad revenue they generated. This made search engines largely worthless, and weakened the usefulness of the Internet as a whole. How could this problem be fixed?
I compare the latest Google search results to this: McDonald's is the most popular and is #1 in hamburgers… they don't taste that great, but people still go there. BUT I bet you know a good burger joint down the road from Google that makes awesome burgers, 10X better than McDonald's, but “we” cannot find that place because he doesn't have the resources or budget to market his burgers effectively.
Content marketing is more than just blogging. When executed correctly, content including articles, guides (like this one), webinars, and videos can be powerful growth drivers for your business. Focus on building trust and producing amazing quality. And most of all, make sure that you’re capturing the right metrics. Create content to generate ROI. Measure the right results. This chapter will teach you how.
@matt: I notice a bit of WordPress-related talk early in the comments (sorry, I don’t have time to read all of them right now). I was wondering if you’d like to comment on Trac ticket http://core.trac.wordpress.org/ticket/10550, related to the use of nofollow on the non-JS-fallback comment links which WordPress uses. It’s linking to the current page with a changed form; the content and comments should remain the same, just a different form. I think the original reason nofollow was added there was to prevent search engines from thinking the site was advertising multiple pages with the same content.
We combine our sophisticated Search Engine Optimization skills with our ORM tools such as social media, social bookmarking, PR, video optimization, and content marketing to decrease the visibility of potentially damaging content. We also work with our clients to create rebuttal pages, micro-sites, positive reviews, social media profiles, and blogs in order to increase the volume of positive content that can be optimized for great search results.

Matt Cutts, it’s Shawn Hill from Longview, Texas, and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blogspam should be the least of Google’s worries. What’s your take?