Chris_D, great question. If you have a single product page that can have multiple URLs with slightly different parameters, that’s a great time to use a rel=canonical link element. You can use rel=canonical for pages with session IDs in a similar fashion. What rel=canonical lets you do is say “this page X on my host is kind of ugly or otherwise isn’t the best version of this page. Use URL Y as the preferred version of my page instead.” You can read about rel=canonical at http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394. Bear in mind that if you can make your site work without session IDs or make it so that you don’t have multiple “aliases” for the same page, that’s even better because it solves the problem at the root.
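Purely as an illustration of that idea (not part of the original comment), here is a minimal Python sketch of how a site might strip session and tracking parameters to pick one preferred URL and emit the corresponding canonical tag. The parameter names, the example.com URL, and the helper names are assumptions for the example.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical list of parameters that only create "aliases" of the same page.
IGNORED_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    # Keep only the parameters that actually change the content of the page.
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def canonical_link_tag(url: str) -> str:
    # The tag a template would place in the page <head>.
    return f'<link rel="canonical" href="{canonical_url(url)}" />'

print(canonical_link_tag("http://www.example.com/product?id=12&sessionid=abc123"))
# -> <link rel="canonical" href="http://www.example.com/product?id=12" />
```

Every ugly alias of the product page would then point search engines at the same clean URL, which is the behavior rel=canonical is meant to signal.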
SEO is also about making your search engine result relevant to the user's search query so that more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized so that your snippet is appealing in the context of the search query, earning a high CTR (click-through rate) from search results.

Things are constantly changing; there is even evidence that nofollow links do count on some occasions. It's really a very complex subject: there is a formula behind the algorithm that takes many factors into consideration, so trying to guess which factors come into play is very difficult. I always focus on making the site as useful as possible to as many people as possible, since this is the end goal for search engines as well as webmasters. Webmasters who do this while observing the search engines’ guidelines should not have problems in reaching the top.
For instance, you might use Facebook’s Lookalike Audiences to get your message in front of an audience similar to your core demographic. Or, you could pay a social media influencer to share images of your products to her already well-established community. Paid social media can attract new customers to your brand or product, but you’ll want to conduct market research and A/B testing before investing too much in one social media channel.
Yes, the links we have are found elsewhere, but our focus is saving our users and clients time, so we consolidated the links; it takes hours and hours and hours of searching to find them, and some searchers are not very savvy when it comes to looking for, and finding, good-quality information. I look at the links like a library: my library has these books, and so do a bunch of other libraries. I think it is a shame that I have to hide my books from Google because I have too many really good ones, since that is seen as a BAD thing in Google’s eyes. Darned if you don’t create a good site, and darned if you do.
Hi, Norman! PageRank is an indicator of authority and trust, and inbound links are a large factor in PageRank score. That said, it makes sense that you may not be seeing any significant increases in your PageRank after only four months; a four-month-old website is still a wee lad! PageRank is a score you will see slowly increase over time as your website begins to make its mark on the industry and external websites begin to reference (or otherwise link to) your Web pages.
I would like to know how Google is handling relevancy with so many websites now jumping on the “nofollow” wagon. It seems like just about every major website has nofollow links, so with the Panda updates this year, what’s happening to all that lost link power? It seems to me that this tactic will stagnate the growth of up-and-coming websites on the internet. Am I right here?
There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
Thanks a lot for all of those great tips you handed out here. I immediately went to work applying the strategies that you mentioned. I will keep you posted on my results. I have been offering free SEO services to all of my small business bookkeeping clients as a way of helping them to grow their businesses. Many of them just don’t have the resources required to hire an SEO guru to help them but they need SEO bad. I appreciate the fact that you share your knowledge and don’t try to make it seem like it’s nuclear science in order to pounce on the innocent. All the best to you my friend!

But bear in mind that you can't guest post just anywhere and expect that it'll help. In fact, for years black hatters have perverted the value of guest posts by 'creating private blog networks,' which put out mass quantities of low-quality content for the sole purpose of exchanging backlinks. Google has caught on to this, and penalizes websites accordingly. So, you want to ensure that you only provide guest posts to reputable, respected websites that are relevant to your industry.
Matt, in almost every example you have given about “employing great content” to receive links naturally, you use blogs as an example. What about people who do not run blog sites (the vast majority of sites!), for example an e-commerce site selling stationery? How would you employ “great content” on a site that essentially sells a boring product? Is it fair that companies that sell uninteresting products or services should be outranked by huge sites like Amazon that have millions to spend on marketing, just because they can’t attract links naturally?
Submit website to directories (limited use). Professional search marketers don’t submit the URL to the major search engines, but it’s possible to do so. A better and faster way is to get links back to your site naturally. Links get your site indexed by the search engines. However, you should submit your URL to directories such as Yahoo! (paid), Business.com (paid) and DMOZ (free). Some may choose to include AdSense (google.com/adsense) scripts on a new site to get Google's Mediabot to visit. It will likely get your pages indexed quickly.

Spam is a poison that in different ways (and in different names) affects many things. Matt, you and your guys do a great job in trying to keep it at bay. But, as mentioned before, with that role and power, you set the rules for the web in many ways. As I have said before even though the JavaScript link change is not (in Danny’s words) backward compatible, it is understandable. I will maintain that the PageRank sculpting thing is not the same.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.
I also hadn’t thought about decreasing the rank value based on the spamminess of the sites a page is linking into. My guess on how to do it would be determining the spamminess of individual pages based on multiple page and site factors, then running some type of reverse PageRank calculation starting with those bad scores, then overlaying that on top of the “good” PageRank calculation as a penalty. This is another thing which would be interesting to play around with in the Nutch algorithm.
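Just to make the idea concrete (this is a sketch of the commenter's suggestion, not anything Google or Nutch documents), a "bad rank" could be seeded from a spam classifier and then propagated backwards along links, so that pages linking into spam accumulate a penalty. The seed scores, damping value, and toy link graph below are all made up.

```python
# Sketch of the "reverse PageRank" penalty idea: badness flows from a page
# to the pages that link TO it, then the result can be subtracted from the
# ordinary score as a penalty.
DAMPING = 0.85
ITERATIONS = 20

links = {                      # page -> pages it links to (toy graph)
    "a": ["b", "spammy"],
    "b": ["a"],
    "spammy": ["a"],
}
seed_spam = {"a": 0.0, "b": 0.0, "spammy": 1.0}   # assumed classifier output

bad_rank = dict(seed_spam)
for _ in range(ITERATIONS):
    nxt = {}
    for page, outlinks in links.items():
        # A page inherits part of the badness of every page it links to,
        # split across that target's in-links.
        inherited = sum(
            bad_rank[t] / sum(1 for src in links if t in links[src])
            for t in outlinks
        )
        nxt[page] = seed_spam[page] + DAMPING * inherited
    bad_rank = nxt

print(bad_rank)  # pages linking into spam end up with a higher penalty score
```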
Backlinks are important for a number of reasons. The quality and quantity of pages backlinking to your website are some of the criteria used by search engines like Google to determine your ranking on their search engine results pages (SERP). The higher you rank on a SERP, the better for your business as people tend to click on the first few search results Google, Bing or other search engines return for them.
Donating your time or money to local charities, organizations, and schools is actually a great - yet often overlooked - way of obtaining backlinks. Such organizations often have pages where they promote sponsors and donors, giving you the opportunity to net a backlink from a trusted organization. If such an organization has a donors section on their homepage, that's even better!

We combine our sophisticated Search Engine Optimization skills with our ORM tools such as social media, social bookmarking, PR, video optimization, and content marketing to decrease the visibility of potentially damaging content. We also work with our clients to create rebuttal pages, micro-sites, positive reviews, social media profiles, and blogs in order to increase the volume of positive content that can be optimized for great search results.
Our SEM team has been managing paid search since its inception and is driven solely by analytics and financial data. Our core focus is to expand our clients’ campaigns and drive quality traffic that will foster conversions and increase revenue, while decreasing the cost per acquisition. IMI’s PPC team members are recognized thought leaders, active bloggers, and speakers at major tradeshows, and care deeply about each and every client. We manage our clients’ budgets as if they were our own, tracking every dollar and optimizing towards very specific milestones and metrics.
That is, the PageRank value for a page u depends on the PageRank values of each page v in the set B_u (the set containing all pages linking to page u), divided by the number L(v) of links from page v; with a damping factor d and N total pages, PR(u) = (1 − d)/N + d · Σ_{v ∈ B_u} PR(v) / L(v). The damping factor acts a bit like an income tax: a small share of every page’s rank is withheld and then handed back evenly to all pages.
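As a concrete illustration of that formula, here is a minimal Python sketch of the iteration over a made-up three-page link graph; the graph, damping value, and iteration count are just example choices, not anything from the original text.

```python
# Iterative PageRank: PR(u) = (1 - d)/N + d * sum(PR(v)/L(v) for v in B_u)
DAMPING = 0.85          # the damping factor d
ITERATIONS = 50

links = {               # page -> pages it links to (toy example)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
n = len(pages)
pr = {p: 1.0 / n for p in pages}   # start from a uniform distribution

for _ in range(ITERATIONS):
    new_pr = {}
    for u in pages:
        # B_u: every page v that links to u contributes PR(v) / L(v)
        incoming = sum(pr[v] / len(links[v]) for v in pages if u in links[v])
        new_pr[u] = (1 - DAMPING) / n + DAMPING * incoming
    pr = new_pr

print(pr)   # the scores sum to ~1.0 in this formulation
```

Each iteration just re-applies the formula above until the scores settle, which is the usual way the calculation is described.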
It is no secret that getting high-quality backlinks is your website’s way to better ranking in Google. But how do you tell a good link from a bad one? Carefully choosing backlinks is a delicate and important task for everyone who wants to optimize their site. There are a lot of different tools which can help you check whether your backlinks are trustworthy and can bring your website value.
Collaborative Environment: A collaborative environment can be set up between the organization, the technology service provider, and the digital agencies to optimize effort, resource sharing, reusability and communications.[36] Additionally, organizations are inviting their customers to help them better understand how to serve them. This source of data is called User Generated Content. Much of this is acquired via company websites where the organization invites people to share ideas that are then evaluated by other users of the site. The most popular ideas are evaluated and implemented in some form. Using this method of acquiring data and developing new products can foster the organization's relationship with its customers as well as spawn ideas that would otherwise be overlooked. UGC is low-cost advertising as it comes directly from the consumers and can save advertising costs for the organisation.
I like that you said you let PageRank flow freely throughout your site. I think that’s good, and I’ve steered many friends and clients to using WordPress for their websites for this very reason. With WordPress, it seems obvious that each piece of content has an actual home (permalinks), and so it would seem logical that Google and other search engines will figure out that structure pretty easily.

Likewise, ‘nofollowing’ the archive pages on your blog: is this really a bad thing? You can get to the pages from the ‘tag’ index or the ‘category’ index, so why give weight to a page that is purely navigational? At least the tag and category pages are themed. Giving weight to a page that is themed only by date is crazy and does not really help search engines deliver ‘good’ results (totally leaving aside the duplicate content issues for now).
Consumers today are driven by the experience. This shift from selling products to selling an experience requires a connection with customers on a deeper level, at every digital touch point. TheeDigital’s internet marketing professionals work to enhance the customer experience, grow your online presence, generate high-quality leads, and solve your business-level challenges through innovative, creative, and tactful internet marketing.
Matt Cutts, it’s Shawn Hill from Longview, Texas and I’ve got to say, “you’re a semseo guru”. That’s obviously why Google retained you as they did. Very informative post! As head of Google’s Webspam team, how do you intend to combat Social Networking Spam (SNS)? It’s becoming an increasingly obvious problem in SERPs. I’m thinking blog spam should be the least of Google’s worries. What’s your take?