Personally, I wanted a bit more of the math, so I went back and read the full-length version of “The Anatomy of a Large-Scale Hypertextual Web Search Engine” (a natural first step). This is the paper Larry Page and Sergey Brin wrote in 1997 out of the Stanford Computer Science Department, i.e. the one in which they presented Google. (Yes, it is long, and I will be working a bit late tonight. All in good fun!)
But how do you get quoted in news articles? Websites such as HARO and ProfNet can help you to connect with journalists who have specific needs, and there are other tools that allow you to send interesting pitches to writers. Even monitoring Twitter for relevant conversations between journalists can yield opportunities to connect with writers working on pieces involving your industry.
A backlink’s value doesn’t come only from the authority of the linking website; there are other factors to consider as well. You’ll sometimes hear those in the industry refer to “dofollow” and “nofollow” links. The distinction goes back to the unethical link-building tactics of the early days of SEO. One practice involved commenting on blogs and leaving a link. It was an easy method, and back then search engines couldn’t tell the difference between a link left in a blog comment and any other link on the site.
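For reference, the “dofollow”/“nofollow” distinction lives in the link markup itself. A minimal illustration (the URL is just a placeholder):

```html
<!-- A normal ("dofollow") link: search engines treat it as an endorsement
     and let it pass ranking credit to the target page. -->
<a href="https://example.com/">Example site</a>

<!-- A nofollow link: rel="nofollow" asks search engines not to pass credit,
     which is why blog platforms began applying it to links left in comments. -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```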
One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you might want to interlink your sites in the first place is to provide your visitors with extra resources to visit. In that case, it's probably fine to give visitors a link to another of your websites, but keep links to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.

Muratos – I’ve never nofollowed Amazon affiliate links on the theory that search engines probably recognize them for what they are anyway. I have a blog, though, that gets organic traffic from those Amazon products simply because people are looking for “Copenhagen ring DVD” and I hard-code the product names, musicians’ names, etc. on the page rather than use Amazon’s sexier links in iframes, etc.
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, each month the Facebook app had more than 126 million average unique users and YouTube had over 97 million average unique users.[27]
Finally, it’s critical that you spend time and resources on your business’s website design. When potential customers find your website, they’ll be deterred from trusting your brand and purchasing your product if they find the site confusing or unhelpful. For this reason, it’s important to take the time to create a user-friendly (and mobile-friendly) website.
Once you've decided on your niche, you have to actually start a blog or website that's going to be your online hub. This is where your anchor content is going to live; everything else will link back to it. All the ads you run and all the traffic you drive through social media, SEO, or anything else will come here. You need a custom domain and a professional-looking site if you want anyone to take you seriously.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
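As a hedged illustration (the directory paths are made up, not taken from any real site), this is the kind of robots.txt rule that causes the problem, along with the fix:

```
# Hypothetical robots.txt: these Disallow rules keep Googlebot from fetching
# the CSS, JavaScript, and images it needs to render the page.
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/
Disallow: /images/

# Removing those rules (or adding explicit "Allow:" lines for the resource
# directories) lets the crawler see the page the way a mobile browser would.
```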

What an amazing and informative post! One other option you left out was WikiGrabber, and not many people use this option! Google “WikiGrabber”, type in your keywords, and find articles with missing links etc. on Wikipedia. Edit a post with what was missing (make sure it is relevant to the article or post, otherwise it will be removed) and then boom! A quality, powerful backlink!


Internet marketing is a number of things, and true success in the field requires immersion in several skill sets if you really want to succeed at the highest level. That's why I knew I needed to go to the top of the food chain of online marketers to get an understanding of what this actually takes. And it's important to note that while there are many hyped-up gurus out there, there are also genuine individuals who aren't just looking to extract money from you.


Google will like your content if your readers like it. The content should be helpful and shouldn’t just repeat information the reader already knows; it needs to meet their expectations. When users vote for your site, Google starts accepting it as an authority site. That’s why content writing is as important as a presidential candidate’s speech: the better it is, the more visitors you have.
Great article and great writing in general. My company just published a 5,000-word keyword targeting best practices guide for PPC and SEO, and we linked to your article “10 Reasons You Should Use Google Trends for More Than Just Keyword Research”. http://vabulous.com/keyword-research-targeting-for-ppc-and-seo-guide/ I would love it if you checked it out and possibly shared it if you like it.
Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
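To make the idea concrete, here is a minimal crawler sketch in Python. It is only an illustration of the fetch-extract-queue loop described above, with a placeholder start URL; real spiders add politeness delays, robots.txt handling, deduplication at scale, and an indexing pipeline.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip pages that fail to fetch or decode
        parser = LinkExtractor()
        parser.feed(html)
        # A real engine would tokenize the page here and add it to its index.
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

print(crawl("https://example.com/"))  # placeholder start URL
```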
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
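A small sketch of why this is true, assuming a hypothetical URL that robots.txt “disallows”: nothing in the protocol stops a non-compliant client from requesting it anyway.

```python
# robots.txt is purely advisory. A client that never reads it can still fetch
# a "disallowed" URL; only server-side controls (authentication, a noindex
# directive, etc.) actually restrict access. The URL below is hypothetical.
from urllib.request import urlopen

response = urlopen("https://example.com/private/report.html")
print(response.status)  # the server returns the page unless it enforces access control itself
```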
This is what happens to the numbers after 15 iterations. Look at how the 5 nodes are all stabilizing to the same numbers. If we had started with all pages at 1, by the way, which is what most people tell you to do, it would have taken many more iterations to reach a stable set of numbers (and in fact, in this model, it would not have stabilized at all).
The probability that the random surfer does not stop clicking links is given by the damping factor d, which is set between 0 and 1. The higher d is, the more likely the random surfer is to keep clicking links. Since the surfer jumps to another page at random after he stops clicking links, that probability is built into the algorithm as the constant (1-d). Regardless of inbound links, the probability of the random surfer jumping to a given page is always (1-d), so every page always has a minimum PageRank.
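Putting the two paragraphs above together, here is a small Python sketch of the iterative computation. It uses the non-normalized formula PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where T1...Tn are the pages linking to A and C(T) is the number of outbound links on T. The five-node graph is a made-up example, not the one from the article.

```python
def pagerank(links, d=0.85, iterations=15):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # initial guess; the starting value mainly affects convergence speed
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum PR(T)/C(T) over every page T that links to this page.
            inbound = sum(pr[t] / len(links[t]) for t in pages if page in links[t])
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# A made-up five-node link graph.
graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C", "E"],
    "E": ["A", "D"],
}
print(pagerank(graph))
```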

Matt, this is an excellent summary. I finally got around to reading “The Search” by John Battelle, and it was very enlightening to understand much of the academia behind what led to the creation of Backrub… er, Google. The project was almost shut down several times due to bandwidth consumption (over 50% of what the university could offer at times), and webmasters were concerned that their pages would be stolen and recreated. It’s so interesting to see that the issues we see today are some of the same ones Larry and Sergey were dealing with back then. As always, thanks for the great read Matt!
Regarding nofollow on content that you don’t want indexed, you’re absolutely right that nofollow doesn’t prevent that, e.g. if someone else links to that content. In the case of the site that excluded user forums, quite a few high-quality pages on the site happened not to have links from other sites. In the case of my feed, it doesn’t matter much either way, but I chose not to throw any extra PageRank onto my feed url. The services that want to fetch my feed url (e.g. Google Reader or Bloglines) know how to find it just fine.
Online interviews are hot right now, and they're a great and easy way to earn backlinks to your website. Once you become an authority in your niche you'll get lots of interview invitations, but until then you have to take the first step. Look for websites that run interviews, and tell them you would like to participate and what knowledge you can contribute.
Cross-platform measurement: The number of marketing channels continues to expand, and measurement practices are growing in complexity. A cross-platform view must be used to unify audience measurement and media planning. Market researchers need to understand how the omni-channel affects consumers' behaviour, even though advertisements on a consumer's device often go unmeasured. Significant aspects of cross-platform measurement involve de-duplication and understanding that you have reached an incremental audience with another platform, rather than delivering more impressions against people who have already been reached (Whiteside, 2016).[42] An example is ‘ESPN and comScore partnered on Project Blueprint discovering the sports broadcaster achieved a 21% increase in unduplicated daily reach thanks to digital advertising’ (Whiteside, 2016).[42] The television and radio industries are electronic media that compete with digital and other technological advertising. Yet television advertising is not directly competing with online digital advertising, because it is able to cross platforms with digital technology. Radio also gains power through cross-platform reach, in online streaming content. Television and radio continue to persuade and affect the audience across multiple platforms (Fill, Hughes, & De Franceso, 2013).[45]
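As a toy illustration of the de-duplication point (the audience IDs are entirely made up), unduplicated reach counts each person once across platforms, while incremental reach is what a second platform adds on top of the first:

```python
tv_audience = {"anna", "ben", "carla", "dev"}          # hypothetical people reached by TV
digital_audience = {"carla", "dev", "ella", "farid"}   # hypothetical people reached by digital

unduplicated_reach = len(tv_audience | digital_audience)  # 6 people, not 4 + 4 = 8
incremental_reach = len(digital_audience - tv_audience)   # the 2 people digital adds beyond TV

print(unduplicated_reach, incremental_reach)
```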

Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.