@Ronny – At SMX Advanced, Google noted that they can, and do, follow JavaScript links. They also said that there is a way to apply a nofollow to a JavaScript link, but they didn’t go into much detail about it. Vanessa Fox recently wrote a lengthy article about it over on Search Engine Land which will likely address any questions you might have: http://searchengineland.com/google-io-new-advances-in-the-searchability-of-javascript-and-flash-but-is-it-enough-19881
One thing that has worked well for me lately (and may help with the infographic promotion) is surveys. Google Forms lets you create a survey for free. Think of questions that would interest your niche and start promoting the survey: ask well-known influencers in your niche to share it with their social followers to help with responses, and offer them a link as a contributor once the survey is complete. Once you have a few hundred responses, you can write a commentary on your findings (Google also puts the data into graphs). If you have enough responses and the information is interesting, get in touch with the same bloggers who helped push it out there to see if they would be happy to share the results. The beauty of this method is that if the results are interesting enough, you might end up getting a link back from a huge news site.

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
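The point above is easy to demonstrate: a compliant crawler consults robots.txt before fetching, but nothing in the protocol stops a direct request. A minimal sketch using Python's standard urllib.robotparser (the robots.txt body and example.com URLs are made up for illustration):

```python
# robots.txt is advisory: a *well-behaved* crawler checks it before
# fetching, but the server will still serve a "blocked" URL to anyone
# who requests it directly.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks before fetching...
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/index.html"))    # True
# ...but a rogue client can simply ignore this and request /private/ anyway,
# and search engines may still list the blocked URL if other pages link to it.
```

Note that the Disallow rule also hands any curious reader a map of exactly which paths you consider sensitive, which is the last point the paragraph above makes.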
Our agency can provide both offensive and defensive ORM strategies as well as preventive ORM that includes developing new pages and social media profiles combined with consulting on continued content development. Our ORM team consists of experts from our SEO, Social Media, Content Marketing, and PR teams. At the end of the day, ORM is about getting involved in the online “conversations” and proactively addressing any potentially damaging content.
1. Apparently, external linking of any kind bleeds PR from the page. Following or nofollowing becomes a question of whether you want that lost PR to benefit the other site. Since nofollow has ceased to provide the benefit of retaining PageRank, the only reason to use it at all is that Google might think the link is paid. Conclusion: Google is disincentivizing external links of any kind.

The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
When an Internet user starts searching for something, he or she is trying to solve a particular problem or achieve something. Your primary aim is to help them find a good solution. Don’t be obsessed with search volume alone; think about the user’s needs. There is no difference between a 40,000-word post and a 1,000-word post when we speak about their value. Try to create high-quality content and don’t pay attention to such stereotypes.

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
In both versions of my model, I used the total of my initial estimate to check my math was not going south. After every iteration, the total PageRank remains the same. This means that PageRank doesn’t leak! 301 redirects cannot just bleed PageRank, otherwise the algorithm would not remain stable. On a similar note, pages with zero outbound links can’t be “fixed” by dividing by something other than zero. They do need to be fixed, but not by diluting the overall PageRank. I can maybe look at these cases in more depth if there is some demand.
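For anyone who wants to reproduce that conservation check, here is a minimal power-iteration sketch in Python. The toy three-page graph and the 0.85 damping factor are my own assumptions, not the commenter's actual model; the point is that dangling pages redistribute their rank evenly instead of dividing by zero, and the total never drifts:

```python
# Minimal PageRank power iteration on a toy 3-page graph. Page "c" is a
# dangling page (no outlinks); its rank is spread evenly over all pages
# rather than divided by zero. The total rank is conserved every iteration.
D = 0.85  # damping factor (assumed)

links = {"a": ["b", "c"], "b": ["c"], "c": []}  # c is dangling
pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}  # total rank starts at 1.0

for _ in range(50):
    dangling = sum(rank[p] for p in pages if not links[p])
    # base share from teleportation plus the redistributed dangling mass
    new = {p: (1 - D) / n + D * dangling / n for p in pages}
    for p in pages:
        for q in links[p]:
            new[q] += D * rank[p] / len(links[p])
    rank = new

print(round(sum(rank.values()), 6))  # the total never drifts from 1.0
```

Each iteration hands out exactly (1 - D) plus D times the previous total, so the sum is conserved by construction, which is the stability property the comment describes.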

My favorite tool for spying on my competitors' backlinks is called Monitor Backlinks. It allows you to add your four most important competitors. From then on, you get a weekly report containing all the new links they have earned. Inside the tool, you get more insight into these links and can sort them by their value and other SEO metrics. A useful feature is that all the links my own website already has are highlighted in green.
Here’s my take on the whole PageRank sculpting situation. As I understand it, the basic idea is that you can increase your rankings in Google by channeling the PageRank of your pages to the pages you want ranked. This used to be done with the ‘nofollow’ tag. That said, things have changed, and Google has come out and said that the way ‘nofollow’ used to work has changed. In short, using ‘nofollow’ to channel that PageRank juice is no longer as effective as it once was.

Brand awareness has been proven to work more effectively in countries that are high in uncertainty avoidance; in those same countries, social media marketing also works effectively. Yet brands must be careful not to use this type of marketing excessively, or to rely on it alone, as doing so may negatively affect their image. Brands that present themselves in an anthropomorphized manner are more likely to succeed when marketing to this demographic. "Since social media use can enhance the knowledge of the brand and thus decrease the uncertainty, it is possible that people with high uncertainty avoidance, such as the French, will particularly appreciate the high social media interaction with an anthropomorphized brand." Moreover, digital platforms make it easy for a brand and its customers to interact directly and exchange their motives virtually.[33]
In search engine optimization (SEO) terminology, a backlink is a hyperlink that links from a Web page back to your own Web page or Web site. Also called an inbound link (IBL), these links are important in determining the popularity (or importance) of your Web site. Some search engines, including Google, will consider Web sites with more backlinks more relevant in search results pages. It may also be written as two separate words, back link.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[54]
The internet is full of business potential, but it is also rife with competition. In this situation, it becomes really tough to sell your products or services. Affiliate marketing can help you effectively promote your product on the web. By helping you reach out to a large potential customer base, affiliate programs help you to connect with millions of customers across the globe.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] That market share is achieved in a number of countries.
On another note, I would like to express my contempt for Google and its so-called terms of service regarding the legitimate acquisition of links. Why should it care whether links are paid for or not? Thanks to the invention of PageRank, it is Google itself that has cancelled out reciprocal linking and stopped people giving out links for fear of losing PageRank, and blogs and forums are worthless thanks to the nofollow trick. So it is now impossible to get decent links organically without having to pay for them, and those who do give out free links are considered fools. Google has brought this dilemma on itself, and yet it seems to punish us for trying to get links other than freely! Face facts: no one is going to link to someone without getting a link in return! Google invented PageRank, which is like a currency, so people expect to be paid for links; giving out links devalues their PageRank, so compensation is now required. It is forcing people to use underhand methods to get links, mostly the ‘paid’ variety.

Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site. Rather, you need to have a large number of QUALITY inbound links. This tool searches for websites that have a related theme to your website which are likely to add your link to their website. You specify a particular keyword or keyword phrase, and then the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and making the job easier in the process.
However, if you are a seasoned online marketer and you've built a substantial following, then marketing as an affiliate might be the right fit. Jason Stone from Millionaire Mentor has built a seven-figure business with affiliate marketing, while David Sharpe from Legendary Marketer has built up an eight-figure business by creating an army of affiliates that market products in collaboration with his team.
Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is a lot harder to influence a search engine with external backlinks from other websites. This is also a reason why backlinks factor in so highly into a search engine's algorithm. Lately, however, a search engine's criteria for quality inbound links has gotten even tougher, thanks to unscrupulous webmasters trying to achieve these inbound links by deceptive or sneaky techniques, such as with hidden links, or automatically generated pages whose sole purpose is to provide inbound links to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

I agree that the more facts you provide, the more people would abuse them, and that providing the complete algorithm would be abused most of all. But if it were available to everyone, wouldn't it almost force people to adopt better site-building and navigation practices and white-hat SEO, simply because everyone would have the same tools to work with and an absolute standard to adhere to?
What seems to be happening is that the toolbar looks at the URL of the page the browser is displaying and strips off everything down to the last “/” (i.e. it goes to the “parent” page in URL terms). If Google has a Toolbar PR for that parent, it subtracts 1 and shows that as the Toolbar PR for this page. If there’s no PR for the parent, it goes to the parent’s parent page, subtracting 2, and so on all the way up to the root of your site. If it can’t find a Toolbar PR to display in this way, that is, if it doesn’t find a page with a real calculated PR, then the bar is greyed out.
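That heuristic can be sketched in a few lines of Python. To be clear, this is a guess at the observed behavior, not Google's actual toolbar code, and the known_pr table and URLs are invented for illustration:

```python
# A guess at the toolbar's fallback heuristic: walk up the URL path one
# "/" at a time, subtracting 1 per level, until a known PR value is found;
# return None (greyed-out bar) if the root is reached with no hit.
from urllib.parse import urlparse

def toolbar_pr(url, known_pr):
    parts = urlparse(url)
    path = parts.path
    penalty = 0
    while True:
        key = f"{parts.scheme}://{parts.netloc}{path}"
        if key in known_pr:
            return max(known_pr[key] - penalty, 0)
        if path in ("", "/"):
            return None  # no ancestor has a calculated PR: grey bar
        # strip the last path segment, moving to the "parent" URL
        path = path[: path.rstrip("/").rfind("/") + 1]
        penalty += 1

known_pr = {"https://example.com/blog/": 5}  # invented PR table
print(toolbar_pr("https://example.com/blog/2009/06/post.html", known_pr))  # 2
print(toolbar_pr("https://other.com/page.html", known_pr))  # None
```

The deep post is three levels below the known /blog/ value, so it shows 5 minus 3, matching the subtract-one-per-level behavior described above.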
A search engine considers the content of the sites to determine the QUALITY of a link. When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.
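As a toy illustration of content-based relevance (not any search engine's actual scoring), one could compare the word overlap between a linking page and the target page; the scoring function and sample texts below are invented:

```python
# Toy "topical relevance" score between two pages, computed as Jaccard
# overlap of their word sets. Real engines use far richer signals; this
# only illustrates why a related page's link counts for more.
def relevance(text_a, text_b):
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    return len(a & b) / len(a | b)

target = "guide to hiking boots and trail gear"
related = "review of hiking boots for long trail walks"
unrelated = "quarterly earnings report for a software company"

print(relevance(target, related) > relevance(target, unrelated))  # True
```

Under this toy metric a link from the hiking review would be treated as more relevant, and thus higher quality, than one from the earnings page.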
Mani, I could not agree more with your statements. It’s no wonder the SEO industry has such a bad name; it’s 99.9% snake oil. Still, Google, this blog, and other “SEO” sites are partly responsible for the PR hysteria, link spam, and email spam for PR. Google should also put an end to webmasters being screwed by SEOs by placing a big, prominent statement on their Webmaster pages along the lines of;
Larry Page and Sergey Brin developed PageRank at Stanford University in 1996 as part of a research project about a new kind of search engine.[12] Sergey Brin had the idea that information on the web could be ordered in a hierarchy by "link popularity": a page ranks higher as there are more links to it.[13] Rajeev Motwani and Terry Winograd co-authored with Page and Brin the first paper about the project, describing PageRank and the initial prototype of the Google search engine, published in 1998.[5] Shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors that determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web-search tools.[14]
Great article and writing in general. My company just published a 5,000 word Keyword targeting best practices guide for PPC and SEO, and we linked to your article “10 Reasons You Should Use Google Trends for More Than Just Keyword Research”. http://vabulous.com/keyword-research-targeting-for-ppc-and-seo-guide/ I would love if you checked it out and possibly shared it if you like it.
However, some of the world's top-earning blogs gross millions of dollars per month on autopilot. It's a great source of passive income and if you know what you're doing, you could earn a substantial living from it. You don't need millions of visitors per month to rake in the cash, but you do need to connect with your audience and have clarity in your voice.
Also, I hadn’t thought about decreasing the rank value based on the spamminess of the sites a page links into. My guess at how to do it would be to determine the spamminess of individual pages from multiple page and site factors, then run some type of reverse PageRank calculation starting from those bad scores, then overlay that on top of the “good” PageRank calculation as a penalty. This is another thing which would be interesting to play around with in the Nutch algorithm.
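The idea is speculative, but the backward-propagation part can be sketched: seed spam scores on known-bad pages, push them against the link direction so pages linking into spam inherit a fraction, and treat the result as a penalty. Everything below (the graph, the decay factor, the max rule) is invented for illustration, not any real engine's algorithm:

```python
# Toy "reverse spam" propagation: badness flows backward along links, so a
# page inherits a decayed fraction of the worst page it links to. The
# resulting score could be overlaid on ordinary PageRank as a penalty.
DECAY = 0.5  # fraction of a target's spamminess inherited by its linkers

links = {"a": ["b"], "b": ["spam"], "spam": [], "good": ["a"]}
spam = {p: (1.0 if p == "spam" else 0.0) for p in links}  # seed scores

for _ in range(10):  # propagate badness against the link direction
    new = dict(spam)
    for p, outs in links.items():
        if outs:
            new[p] = max(spam[p], DECAY * max(spam[q] for q in outs))
    spam = new

# pages closer to the spam page carry a larger penalty
print({p: round(s, 3) for p, s in sorted(spam.items())})
```

With these assumptions, b (one hop from the spam page) scores 0.5, a scores 0.25, and good only 0.125, so the penalty fades with distance from the bad seed.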
Check your robots.txt file. Make sure you learn how to hide content you don’t want indexed from search engines, and that search engines can find the content you do want indexed, too. (You will want to hide things such as duplicate content, which can be penalized by search engines but is still necessary on your site.) You’ll find a link to how to modify the robots.txt at the end of this article.
Consumers today are driven by the experience. This shift from selling products to selling an experience requires a connection with customers on a deeper level, at every digital touch point. TheeDigital’s internet marketing professionals work to enhance the customer experience, grow your online presence, generate high-quality leads, and solve your business-level challenges through innovative, creative, and tactful internet marketing.