Ian Rogers first used the Internet in 1986, sending email on a university VAX machine! He installed his first web server in 1990 and taught himself HTML and Perl CGI scripting. Since then he has been a Senior Research Fellow in User Interface Design and a consultant in Network Security and Database-Backed Websites. He has had an informal interest in topology and the mathematics and behaviour of networks for years, and has also been known to do a little Jive dancing.
Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. In the presence of damping, Page A effectively links to all pages in the web, even though it has no outgoing links of its own.
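For readers who prefer to see the arithmetic, here is a minimal sketch of that damped random-surfer calculation in Python, using the same 85% damping factor on a small made-up link graph (the page names and link structure below are illustrative, not the exact graph from the figure):

```python
# Hedged sketch: iterative PageRank with a 0.85 damping factor,
# on a small invented link graph (A-E are illustrative page names).
links = {
    "A": [],               # A has no outgoing links (a "dangling" page)
    "B": ["C"],
    "C": ["B"],
    "D": ["A", "B"],
    "E": ["B", "D"],
}
pages = list(links)
damping = 0.85
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(100):  # power iteration until the values settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        if not outgoing:
            # A dangling page effectively links to every page in the web.
            for p in pages:
                new_rank[p] += damping * rank[page] / len(pages)
        else:
            for target in outgoing:
                new_rank[target] += damping * rank[page] / len(outgoing)
    rank = new_rank

for page, value in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {value:.1%}")
```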
Google might see 10 links on a page that has $10 of PageRank to spend. It might notice that 5 of those links are navigational elements that occur a lot throughout the site and decide they should only get 50 cents each. It might decide 5 of those links are in editorial copy and so are worthy of getting more. Maybe 3 of them get $2 each and 2 others get $1.50 each, because of where they appear in the copy, if they’re bolded or any of a number of other factors you don’t disclose.
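A rough sketch of the idea behind that quote: rather than splitting a page's PageRank evenly, each link receives a share proportional to a weight reflecting where and how it appears. The weights below are invented purely for illustration – nobody outside Google knows the real ones:

```python
# Hypothetical link weights: higher for editorial links, lower for
# boilerplate navigation. These numbers are illustrative only.
page_rank_to_spend = 10.0
links = [
    {"anchor": "nav-home",    "weight": 0.5},
    {"anchor": "nav-about",   "weight": 0.5},
    {"anchor": "editorial-1", "weight": 2.0},  # e.g. bolded, early in the copy
    {"anchor": "editorial-2", "weight": 1.5},
]
total_weight = sum(link["weight"] for link in links)
for link in links:
    share = page_rank_to_spend * link["weight"] / total_weight
    print(f'{link["anchor"]}: ${share:.2f}')
```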
As mentioned earlier, technology and the internet allow for 24-hours-a-day, 7-days-a-week service for customers, enabling them to shop online at any hour of the day or night, not just when the shops are open, and from anywhere in the world. This is a huge advantage for retailers, who can use it to direct customers from a physical store to its online store. It has also opened up an opportunity for companies to be online-only, rather than maintaining an outlet or store, thanks to the popularity and capabilities of digital marketing.

Search engine optimization (SEO) receives a lot of love from inexperienced marketers. It’s seen as “free marketing” in that you can handle your own SEO work (as long as you follow some rules to do so), and thus all it requires is your time to make things happen. SEO is simply what you do to your website and web pages to make them show up in “organic” (or unpaid) search results on search engines.
I work on a site that allows users to find what they are looking for by clicking links that take them deeper and deeper into the site hierarchy. Content can be categorised in lots of different ways. After about three steps the difference between the results pages shown is of significance to a user but not to a search engine. I was about to add nofollow to links that took the browser deeper than 3 levels but after this announcement I won’t be…
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website; it also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and lets them track the index status of their web pages.
On a blog the PageRank should go to the main article pages. Now it just gets “evaporated” if you use “nofollow”, or scattered to all the far-flung nooks and crannies, which means Google will not be able to see the wood for the trees. The vast majority of a site’s overall PageRank will now reside in the long tail of useless pages such as commenters’ profile pages. This can only make it harder for Google to serve up the most relevant pages.

He is the co-founder of Neil Patel Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations. 

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Well – maybe for a few of you. But this algorithm is fundamental in understanding links and in particular, understanding why most links count for nothing or almost nothing. When you get to grips with Google’s algorithm, you will be light years ahead of other SEOs… but I never really see it properly explained. I guarantee that even if you know this algorithm inside out, you’ll see some unexpected results from this math by the end of this post and you will also never use the phrase “Domain Authority” in front of a customer again (at least in relation to links).
Let’s say that I want to link to some popular search results on my catalog or directory site – you know, to give a new user an alternative way of sampling the site. Of course, following Google’s advice, I have to “avoid allowing search result-like pages to be crawled”. Now, I happen to think that these pages are great for the new user, but I accept Google’s advice and block them using robots.txt.

Less than 2 years ago one could promote a website within a month with the help of a PBN (Private Blog Network). Then Google created “a sandbox”, which made a site owner wait at least 3 months before the effect of PBN backlinks became visible. There are two more negative factors: risk and financial investment. You will realize that neither the time nor the money you spent was worth it. That’s why it’s better to rely on proper backlinks from real sites.
Probably the most creative thing I’ve ever done was writing a review of a restaurant (The Heart Attack Grill) that was hilarious and emailing it to the owner. He loved it so much he posted it on FB and even put it on his homepage for a while. I got thousands of visitors from this stupid article: https://www.insuranceblogbychris.com/buy-life-insurance-before-eating-at-heart-attack-grill/
The Nielsen Global Connected Commerce Survey conducted interviews in 26 countries to observe how consumers are using the Internet to make shopping decisions in stores and online. Online shoppers are increasingly looking to purchase internationally, with over 50% in the study who purchased online in the last six months stating they bought from an overseas retailer.[23]
Digital marketing planning is a term used in marketing management. It describes the first stage of forming a digital marketing strategy for the wider digital marketing system. The difference between digital and traditional marketing planning is that it uses digitally based communication tools and technology such as Social, Web, Mobile, Scannable Surface.[57][58] Nevertheless, both are aligned with the vision, the mission of the company and the overarching business strategy.[59]

Internet Marketing Inc. provides integrated online marketing strategies that help companies grow. We think of ourselves as a business development consulting firm that uses interactive marketing as a tool to increase revenue and profits. Our management team has decades of combined experience in online marketing as well as graduate level education and experience in business and finance. That is why we focus on creating integrated online marketing campaigns designed to maximize your return on investment.
Steve, sometimes good information to users is a consolidation of very high quality links. We have over 3000 links to small business sites within the SBA as well as links to the Harvard and Yale libraries, academic journals, etc. But because we have the understanding that there should be no more than a hundred links in a website (more now, from what Matt said), we have used nofollow on all of them out of fear that Google will penalize our site because of the number of links.
The PageRank algorithm has major effects on society because it carries social influence. Where computer science views PageRank as an algorithm, the humanities examine its social components: it is dissected and reviewed not for its technological advancement in the field of search engines, but for its societal influence.[42] Laura Granka discusses PageRank by describing how pages are not simply ranked by popularity; the ranking is read as conferring reliability and trustworthiness, and this has led to behaviour that is directly linked to PageRank. PageRank is viewed as the definitive rank of products and businesses and can thus shape thinking. The information available to individuals shapes thinking and ideology, and PageRank is the device that surfaces that information. Search results are the forum through which information is delivered to the public, and they have a societal impact because they affect how a person thinks and acts.
Thank you, Brian, for this definitive guide. I have already signed up for HARO and have plans to implement some of your strategies. My blog provides digital marketing tutorials for beginners, so it may be in your niche as well. This is so good. I highly recommend that all my team members in my company read your blog every time you publish new content. 537 comments on this post within a day – you are a master of this. A great influence in the digital marketing space.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[32] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing, Google Caffeine was a change to the way Google updated its index, intended to make content show up on Google more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[33] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[34]
PageRank gets its name from Google cofounder Larry Page. You can read the original paper describing the ranking system used to calculate PageRank here, if you want. Check out the original paper about how Google worked here, while you’re at it. But for dissecting how Google works today, these documents from 1998 and 2000 won’t help you much. Still, they’ve been pored over, analyzed and, unfortunately, sometimes spouted as the gospel of how Google operates now.
Cross-platform measurement: The number of marketing channels continues to expand, and measurement practices are growing in complexity. A cross-platform view must be used to unify audience measurement and media planning. Market researchers need to understand how the omni-channel affects consumers' behaviour, even though advertisements on a consumer's device do not get measured. Significant aspects of cross-platform measurement involve de-duplication and understanding that you have reached an incremental audience on another platform, rather than delivering more impressions to people who have previously been reached (Whiteside, 2016).[42] An example is ‘ESPN and comScore partnered on Project Blueprint discovering the sports broadcaster achieved a 21% increase in unduplicated daily reach thanks to digital advertising’ (Whiteside, 2016).[42] The television and radio industries are the electronic media which compete with digital and other technological advertising. Yet television advertising is not directly competing with online digital advertising, because it is able to cross platforms with digital technology. Radio also gains power through cross platforms, in online streaming content. Television and radio continue to persuade and affect the audience across multiple platforms (Fill, Hughes, & De Franceso, 2013).[45]

There are also many keyword research tools (some free and some paid) that claim to take the effort out of this process. A popular tool for first timers is Traffic Travis, which can also analyse your competitors’ sites for their keyword optimization strategies and, as a bonus, it can deliver detailed analysis on their back-linking strategy, too. You can also use Moz.com’s incredibly useful keyword research tools – they’re the industry leader, but they come at a somewhat higher price.


According to the U.S. Commerce Department, consumers spent $453.46 billion on the web for retail purchases in 2017, a 16.0% increase compared with $390.99 billion in 2016. That’s the highest growth rate since 2011, when online sales grew 17.5% over 2010. Forrester predicts that online sales will account for 17% of all US retail sales by 2022. Digital advertising is also growing strongly: according to Strategy Analytics, digital advertising was up 12% in 2017, accounting for approximately 38% of overall advertising spending, or $207.44 billion.

Page Structure - The third core component of SEO is page structure. Because web pages are written in HTML, how the HTML code is structured can impact a search engine’s ability to evaluate a page. Including relevant keywords in the title, URL, and headers of the page and making sure that a site is crawlable are actions that site owners can take to improve the SEO of their site.
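As a rough illustration of that point, a crawler-style check might look at whether a target keyword appears in the URL, title, and headers of a page. The sample markup, URL, and keyword below are assumptions made up for the example:

```python
from html.parser import HTMLParser

class HeadingTitleParser(HTMLParser):
    """Collects the text of the <title> and <h1>-<h3> elements."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.texts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.texts[self._current] = self.texts.get(self._current, "") + data

# Illustrative page; the URL, keyword and markup are assumptions.
url = "https://example.com/skydiving-lessons"
html = ("<html><head><title>Skydiving Lessons</title></head>"
        "<body><h1>Learn skydiving safely</h1></body></html>")
keyword = "skydiving"

parser = HeadingTitleParser()
parser.feed(html)
print("in URL:   ", keyword in url.lower())
print("in title: ", keyword in parser.texts.get("title", "").lower())
print("in h1:    ", keyword in parser.texts.get("h1", "").lower())
```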
In early 2005, Google implemented a new value, "nofollow",[64] for the rel attribute of HTML link and anchor elements, so that website developers and bloggers can make links that Google will not consider for the purposes of PageRank—they are links that no longer constitute a "vote" in the PageRank system. The nofollow relationship was added in an attempt to help combat spamdexing.
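A small sketch of how a crawler could separate ordinary links from rel="nofollow" links when parsing a page (the sample HTML fragment is invented for illustration):

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Splits anchor elements into followed and nofollow links."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

# Illustrative HTML fragment.
sample = ('<a href="https://example.com/article">editorial link</a>'
          '<a href="https://example.com/comment" rel="nofollow">comment link</a>')
parser = LinkParser()
parser.feed(sample)
print("counted for PageRank:", parser.followed)
print("ignored (nofollow):  ", parser.nofollow)
```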
Google uses a hyperlink based algorithm (known as ‘PageRank’) to calculate the popularity and authority of a page, and while Google is far more sophisticated today, this is still a fundamental signal in ranking. SEO can therefore also include activity to help improve the number and quality of ‘inbound links’ to a website, from other websites. This activity has historically been known as ‘link building’, but is really just marketing a brand with an emphasis online, through content or digital PR for example.
We have to remember that Google’s money model, plus the bots it uses to scour the web, have to toe the same line so Google can optimize its own pocketbook, balancing a free and open resource – i.e. the www – while taking money from the natural competition that arises from its market share. On one side it’s all about appearing fair, and on the other, driving competitive output.

Backlinks take place across the Internet when one website mentions another website and links to it. Also referred to as “incoming links,” backlinks make their connection through external websites. These links from outside domains point to pages on your own domain. Whenever a backlink occurs, it is like receiving a vote for a webpage. The more votes you get from authoritative sites, the more positive the effect on a site’s ranking and search visibility.
Why do so many people spend so much time researching SEO and page rank? It’s really not that hard to figure out (I am speaking in a nice tone by the way =) – all you should need to focus on is advertising and building your website in a manner that is ethical, operational and practical for the content and industry that your website is in/about. If you are not up to something, then Google will know it, and it will rank you accordingly. If you spend so much time trying to figure out how to get to the top, I bet Google spends triple that time figuring out how you’re trying to get to the top. So on and so forth… and you’re not going to win. Have good content that isn’t copied, stay away from too many outbound links (especially affiliates), post your backlinks at places that have something to do with your site, etc. etc… Is it an American thing, which I don’t seem to see as badly in other places of the world, that is “always trying to figure out an easy way, a quick fix, a way to not have to put in the effort…”? Anyway… Thanks for letting me vent. Please, no nasty replies. Keep it to yourself = )

And my vital question is about Amazon affiliate links. I think many people wonder about it as well. I have several blogs where I solely write unique content reviews about various Amazon products, nothing more. As you know, all these links are full of tags, affiliate IDs and so on (bad in SEO terms). Should I nofollow them all or leave them as they are?
“There may be a miniscule number of pages (such as links to a shopping cart or to a login page) that I might add nofollow on, just because those pages are different for every user and they aren’t that helpful to show up in search engines” – it doesn’t make much sense. If a page isn’t helpful and should not show up in search results, the best option is to meta-noindex the page and disallow it in robots.txt.
A: I wouldn’t recommend it, because it isn’t the most effective way to utilize your PageRank. In general, I would let PageRank flow freely within your site. The notion of “PageRank sculpting” has always been a second- or third-order recommendation for us. I would recommend the first-order things to pay attention to are 1) making great content that will attract links in the first place, and 2) choosing a site architecture that makes your site usable/crawlable for humans and search engines alike.
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
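Python's standard library ships a parser for this file format, which makes the crawl rules easy to sanity-check. A short sketch, with hypothetical rules that block internal search results and shopping-cart pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules blocking internal search results and carts.
rules = """
User-agent: *
Disallow: /search
Disallow: /cart
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

for url in ("https://example.com/products/widget",
            "https://example.com/search?q=widget",
            "https://example.com/cart"):
    print(url, "->", "crawl" if parser.can_fetch("*", url) else "skip")
```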
Thanks to Google Search Console, Ahrefs and, of course, Sitechecker, you can easily check your website for 404 errors and reclaim those broken links. It’s a very easy and effective way to boost your site’s authority. We think you can use several of the above-mentioned tools to examine your site, in case one of them misses some 404 links. If you find 404 errors, 301-redirect them to an appropriate webpage or to your homepage.
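A minimal sketch of that kind of check, assuming the requests library and a made-up list of URLs: it flags pages that return 404 so they can be 301-redirected somewhere sensible.

```python
import requests

# Hypothetical URLs pulled from a backlink or crawl report.
urls = [
    "https://example.com/old-guide",
    "https://example.com/blog/seo-basics",
]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if status == 404:
        print(f"{url}: 404 - consider a 301 redirect to a relevant page")
    else:
        print(f"{url}: {status}")
```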
While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye by using overly aggressive marketing efforts and attempting to manipulate search engine results in unfair ways. Practices that violate our guidelines may result in a negative adjustment of your site's presence in Google, or even the removal of your site from our index.
Should have added in my previous comment that our site has been established since 2000 and all our links have always been followable – including comment links (but all are manually edited to weed out spambots). We have never artificially cultivated backlinks but I have noticed that longstanding backlinks from established sites like government and trade organisations are changing to ‘nofollow’ (and our homepage PR has declined from 7 to 4 over the past 5 years). If webmasters of the established sites are converting to systems which automatically change links to ‘nofollow’ then soon the only followable links will be those that are paid for – and the blackhats win again.

Yes, the more links on a page, the smaller the amount of PageRank it can pass on to each – but that was already the case before. With regard to what happens to the ‘missing’ PageRank: if this is the case all over the Internet, and it will be, the total amount of PageRank flow is reduced by the same proportion everywhere, so you don’t need as much PageRank flowing to your good links to maintain relative position.

A backlink’s value doesn’t come only from the authority of the website itself. There are other factors to consider as well. You’ll sometimes hear those in the industry refer to “dofollow” and “nofollow” links. This goes back to the unethical link-building tactics in the early days of SEO. One practice involved commenting on blogs and leaving a link. It was an easy method, and back then search engines couldn’t tell the difference between a blog comment and other site content.
Moreover, the PageRank mechanism is entirely general, so it can be applied to any graph or network in any field. Currently, the PageRank formula is used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It's even used for systems analysis of road networks, as well as in biology, chemistry, neuroscience, and physics.
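Because the mechanism only needs a graph, libraries expose it directly. A quick sketch using the networkx package on an invented citation graph:

```python
import networkx as nx

# Hypothetical citation graph: an edge u -> v means "u cites v".
graph = nx.DiGraph()
graph.add_edges_from([
    ("paper_a", "paper_c"),
    ("paper_b", "paper_c"),
    ("paper_c", "paper_d"),
    ("paper_d", "paper_a"),
])

# Same damping factor traditionally quoted for Google's formulation.
scores = nx.pagerank(graph, alpha=0.85)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```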
Also, given that the original reason for implementing the ‘nofollow’ tag was to reduce comment spam (something it really hasn’t had a great effect in combating), the real question I have is: why did they ever take any notice of nofollow on internal links in the first place? It seems to me that in this case they made a rod for their own back.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
As for the use of nofollow as a way to keep pages that shouldn’t be indexed out of Google (as with your feed example) is terrible advice. Your use of it on your feed link does nothing. If anyone links to your feed without nofollow, then it’s going to get indexed. Things that shouldn’t be indexed need to use either robots.txt or meta robots blocking. Nofollow on links to those items isn’t a solution.
Well, to make things worse, website owners quickly realized they could exploit this weakness by resorting to “keyword stuffing,” a practice that simply involved creating websites with massive lists of keywords and making money off of the ad revenue they generated. This made search engines largely worthless, and weakened the usefulness of the Internet as a whole. How could this problem be fixed?
In this illustration from the “PageRank Citation Ranking” paper, the authors demonstrate how webpages pass value onto other pages. The two pages on the left have a value of 100 and 9, respectively. The page with a value of 100 has two links that point to the pages on the right. That page’s value of 100 is divided between the two links, so that each conveys a value of 50. The other page on the left has three outgoing links, each carrying one-third of the page’s value of 9. One link goes to the top page on the right, which ends up with a total value of 53. The bottom right page has no other backlinks, so its total value is 50.
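That arithmetic can be reproduced in a few lines; the sketch below runs a single propagation step with the link structure as described in the caption (where the 9-value page's other two links point is not stated, so they are assumed to go elsewhere in the graph):

```python
# One step of value passing, as described above: each page splits its
# value evenly across its outgoing links.
values = {"left_100": 100, "left_9": 9}
out_links = {
    "left_100": ["top_right", "bottom_right"],
    # Assumption: the 9-value page's other two links point elsewhere.
    "left_9": ["top_right", "elsewhere_1", "elsewhere_2"],
}

passed = {"top_right": 0.0, "bottom_right": 0.0}
for page, targets in out_links.items():
    share = values[page] / len(targets)
    for target in targets:
        if target in passed:
            passed[target] += share

print(passed)  # {'top_right': 53.0, 'bottom_right': 50.0}
```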
While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.

Try using Dribbble to find designers with good portfolios. Contact them directly by upgrading your account to PRO status, for just $20 a year. Then simply use the search filter and type "infographics." After finding someone you like, click on "hire me" and send a message detailing your needs and requesting a price. Fiverr is another place to find great designers willing to create inexpensive infographics.



Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
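A hedged sketch of that kind of automation: take a page's visible text and trim it to a snippet-sized description. The 155-character limit and the sample content are assumptions, not a Google rule:

```python
import re

def make_meta_description(page_text: str, max_len: int = 155) -> str:
    """Build a description meta tag value from a page's visible text."""
    # Collapse whitespace, then trim to roughly snippet length on a word boundary.
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= max_len:
        return text
    return text[:max_len].rsplit(" ", 1)[0] + "..."

# Illustrative page content.
body = ("Our skydiving school has taught safe tandem and solo jumps "
        "since 1999. Browse courses, pricing, and availability for "
        "first-time jumpers and licensed skydivers.")
print(make_meta_description(body))
```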
When you comment on a blog post, you are usually allowed to include a link back to your website. This is often abused by spammers and can become a negative link building tool. But if you post genuine comments on high-quality blog posts, there can be some value in sharing links, as it can drive traffic to your site and increase the visibility of your brand.
The internet was the little guy savior, simple sites could rank well locally. Sadly your company is in the process of destroying that. In this economy small business with zero page rank that are listed on page 22 of results, need to be found in order to survive. My customers are really suffering because of the work that is coming out of Google, it keeps getting worse. Their conversions are still good coming out of Yahoo and MSN and now Bing. They do not have the resources to produce blogs, forums, or $5,000 websites let alone pay for Adwords when they are just trying to pay rent and not a lot of people can do their own web production.
Site owners are using the toolbar to find “good” sites that they should get links from, regardless of the fact that link context is also important, not to mention many, many other factors that Google uses to rank a web page. Other site owners, getting a gray PR0 toolbar for their site, immediately assume the worst: that they’ve been blacklisted.