Chris_D, great question. If you have a single product page that can have multiple URLs with slightly different parameters, that’s a great time to use a rel=canonical meta tag. You can use rel=canonical for pages with session IDs in a similar fashion. What rel=canonical lets you do is say “this page X on my host is kind of ugly or otherwise isn’t the best version of this page. Use URL Y as the preferred version of my page instead.” You can read more about rel=canonical in Google’s documentation. Bear in mind that if you can make your site work without session IDs, or make it so that you don’t have multiple “aliases” for the same page, that’s even better because it solves the problem at the root.
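As a concrete illustration of what a crawler sees, here is a small sketch (using Python’s standard-library HTML parser, with a made-up URL) of reading the preferred URL out of a page that declares a canonical link:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of a <link rel="canonical"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# An "ugly" alias URL (session ID, tracking parameters, etc.) can serve
# this markup to point at the one preferred version of the page.
page = """
<html><head>
  <link rel="canonical" href="https://example.com/product/widget"/>
</head><body>Widget product page</body></html>
"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://example.com/product/widget
```

However many parameter-laden variants of the page exist, they all declare the same clean URL, which is the whole point of the tag.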
Pay per click (PPC) advertising, commonly referred to as Search Engine Marketing, delivers targeted traffic and conversions and will yield results faster than organic search engine optimization. Successful PPC marketing programs offer incredible revenue and brand-building opportunities. However, without a thorough understanding of how PPC works, it is very easy to mismanage valuable advertising budgets. That’s where we come in!
So enough of these scary stories. Google actually likes backlinks and relies upon them. The whole idea behind them is that they help to tell Google what is good and useful out there. Remember, it is still an algorithm. It doesn’t know that your page describing the best technique for restoring a 1965 Ford Mustang bumper is all that great. But if enough people are talking about how great it is, and thereby referencing that page on other websites, Google will actually know.

Matt, I’ve been a firm believer that webmasters shouldn’t really bother too much about the calculations Google does when spotting external links on a site. Leave that to Google. You write the content, and if you find relevant resources, link to them. Why worry over PR? If you’re so sure the linked site is “kinda spammy,” then nofollow it. That’s it.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
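A quick way to audit this in practice is to check each page for a description tag and flag obvious problems. This sketch uses Python’s standard-library parser; the 160-character limit is a common rule of thumb for snippet truncation, not a number Google guarantees:

```python
from html.parser import HTMLParser

class DescriptionFinder(HTMLParser):
    """Grabs the content of <meta name="description">, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "description":
            self.description = a.get("content", "")

def check_description(html):
    """Return (description, list_of_warnings) for one page."""
    finder = DescriptionFinder()
    finder.feed(html)
    desc, warnings = finder.description, []
    if desc is None:
        warnings.append("missing description meta tag")
    elif len(desc) > 160:  # rule-of-thumb snippet length, an assumption
        warnings.append("description may be truncated in snippets")
    return desc, warnings

page = '<html><head><meta name="description" content="Hand-built guides to classic car restoration."></head></html>'
desc, warnings = check_description(page)
print(warnings)  # []
```

Run over a whole site, the same check could also flag duplicate descriptions across pages, since each page ideally gets its own.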
In early 2005, Google implemented a new value, "nofollow",[64] for the rel attribute of HTML link and anchor elements, so that website developers and bloggers can make links that Google will not consider for the purposes of PageRank—they are links that no longer constitute a "vote" in the PageRank system. The nofollow relationship was added in an attempt to help combat spamdexing.
Do you regularly publish helpful, useful articles, videos or other types of media that are popular and well produced? Do you write for actual human beings rather than the search engine itself? Well, you should. The latest research from Searchmetrics on ranking factors indicates that Google is moving further towards longer-form content that addresses a visitor’s intention as a whole, instead of content built around keywords from popular search queries.
I did this post because I wanted people to understand more about PageRank, how it works, and to clarify my answers at SMX Advanced. Yes, I would agree that Google itself solely decides how much PageRank will flow to each and every link on a particular page. But that’s no reason to make PageRank a complete black box; if I can help provide people with a more accurate mental model, overall I think that’s a good thing. For example, from your proposed paragraph I would strike the “The number of links doesn’t matter” sentence because most of the time the number of links does matter, and I’d prefer that people know that. I would agree with the rest of your paragraph explanation, which is why in my mind PageRank and our search result rankings qualify as an opinion and not simply some rote computation. But just throwing out your single paragraph, while accurate (and a whole lot faster to write!), would have been deeply unsatisfying for a number of people who want to know more.
This will help you replicate their best backlinks and better understand what methods they are using to promote their website. If they are getting links through guest blogging, try to become a guest author on the same websites. If most of their links come from blog reviews, get in touch with those bloggers and offer them a trial to test your tool. Eventually, they might write a review about it.
But I also don’t wanna lose PageRank on every comment with a link… If I can give PageRank and lose none, I wanna leave the comment there, even without nofollow. But if I lose PageRank on every link, even inside the original post, and even more so if nofollow also takes PageRank away from me, I may just start using JavaScript or plain text without anchors for links… I definitely don’t like this idea, but I dislike even more losing PageRank on each outlink on my site. I’d just link to top-quality sites that I actively wanna vote for in search engines.
Quality content is more likely to get shared. By staying away from creating "thin" content and focusing more on content that cites sources, is lengthy, and reaches unique insights, you'll be able to gain Google's trust over time. Remember, this happens as a function of time. Google knows you can't just go out there and create massive amounts of content in a few days. If you try to spin content or duplicate it in any fashion, you'll suffer a Google penalty and your visibility will be stifled.
In my view, the Reasonable Surfer model would fundamentally change the matrix values above, so that the same overall PageRank is apportioned out of each node, but each outbound link carries a different value. In this scenario, you can indeed make the case that three links will generate more traffic than one, although the placement of these links might increase OR DECREASE the amount of PageRank that is passed, since (ultimately) the outbound links from page A to page B are dependent on the location of all other outbound links on page A. But this is the subject of another presentation for the future, I think.
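The distinction can be made concrete with a toy power-iteration sketch: a uniform "random surfer" splits a page's PageRank equally across its outbound links, while a "reasonable surfer" gives each link a different weight. The link weights here are invented for illustration; Google's actual weighting is not public.

```python
DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(weighted_links, n_iter=50):
    """Power iteration over a weighted link graph.

    weighted_links: {page: {target: weight, ...}, ...}
    A uniform surfer is just the special case where every weight is equal.
    """
    pages = set(weighted_links)
    for targets in weighted_links.values():
        pages |= set(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(n_iter):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, targets in weighted_links.items():
            total = sum(targets.values())
            for target, weight in targets.items():
                # Each link passes a share proportional to its weight,
                # so the same total flows out of the page either way.
                new[target] += DAMPING * rank[page] * weight / total
        rank = new
    return rank

# Uniform surfer: both links on A are worth the same.
uniform = pagerank({"A": {"B": 1, "C": 1}, "B": {"A": 1}, "C": {"A": 1}})
# Reasonable surfer: the more prominent link to B gets 3x the weight.
weighted = pagerank({"A": {"B": 3, "C": 1}, "B": {"A": 1}, "C": {"A": 1}})
print(weighted["B"] > uniform["B"])  # True
```

Note that the total PageRank flowing out of A is identical in both runs; only its split between B and C changes, which is exactly the point of the model.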
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27] 

Honestly, I’ve read your blog for about 4 or 5 years now, and the more I read, the less I care about creating new content online, because it feels like even following the “Google rules” still isn’t the way to go; unlike standards, there is no standard. You guys can change your mind whenever you feel like it, and I can end up completely screwed. So screw it. I’m done trying to get Google to find my site. With Twitter and other outlets, and 60% of all Google usage not even being about finding sites but spell checking, I don’t care anymore.
From a customer experience perspective, we currently have three duplicate links to the same URL, i.e. ????.com/abcde. These links are helpful for the visitor to locate relevant pages on our website. However, my question is: does Google count all three of these links and pass all the value, or does Google only transfer the weight from one of these links? If it only transfers value from one of these links, does the link juice disappear from the two other links to the same page, or have these links never been given any value?
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you. 

PageRank was developed by Google founders Larry Page and Sergey Brin at Stanford. In fact, the name PageRank is likely a play on Larry Page's name. At the time Page and Brin met, early search engines typically ranked pages with the highest keyword density, which meant people could game the system by repeating the same phrase over and over to attain higher search results. Sometimes web designers would even put hidden text on pages to repeat phrases.