I’ve always thought that if you are serious about ranking, do so with ORIGINAL COPY. It’s clear that search engines reward good content they haven’t found before. Google indexes it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So make sure each of your pages has enough text content that you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.
The self-service keyword research tools we tested all handle pricing similarly: by month, with discounts for annual billing, and with most SMB-focused plans falling in the $50–$200 per month range. Depending on how your business plans to use the tools, the way a particular product delineates pricing might make more sense. KWFinder.com is the cheapest of the bunch, but it's focused squarely on ad hoc keyword and Google SERP queries, which is why the product sets quotas for keyword lookups per 24 hours at different tiers. Moz and Ahrefs price by campaigns or projects, meaning the number of websites you're tracking in the dashboard. Most of the tools also cap the number of keyword reports you can run per day. SpyFu prices a bit differently, providing unlimited data access and results but capping the number of sales leads and domain contacts.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
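To illustrate, here is a hypothetical robots.txt showing the kind of rule that causes this problem, and a fix that keeps page resources crawlable (the directory paths are assumptions for the sake of the example):

```text
# Problematic: hides CSS and JavaScript from all crawlers,
# so Googlebot cannot render the page as a mobile browser would
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# Better: block only genuinely private areas and explicitly
# leave page resources open to crawling
User-agent: *
Disallow: /admin/
Allow: /assets/
```

You can confirm what Googlebot can fetch with the URL Inspection tool in Google Search Console.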

Having a ‘keyword rich’ domain name may lead to closer scrutiny from Google. According to Moz, Google has “de-prioritized sites with keyword-rich domains that aren’t otherwise high-quality. Having a keyword in your domain can still be beneficial, but it can also lead to closer scrutiny and a possible negative ranking effect from search engines—so tread carefully.”
This relationship between rankings and clicks (and traffic) is strongest among the top 3 search results. However, the layout of the search results pages is constantly changing, with the inclusion of Google’s Knowledge Graph data and the integration of Universal Search elements (SERP Features) like videos, maps and Google Shopping ads. These developments can mean that the top 3 organic rankings are no longer the 3 best positions on the SERP, as heatmap and eye-tracking tests have demonstrated.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
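The random-surfer idea above can be sketched in a few lines of code. This is a minimal power-iteration estimate of PageRank, not Google's actual implementation; the graph, damping factor and iteration count are illustrative assumptions:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Estimate PageRank by power iteration over a link graph.

    links maps each page to the list of pages it links to.
    With probability (1 - damping) the random surfer jumps to a
    random page instead of following a link, which is what the
    damping factor models.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # teleport share
        for p, outlinks in links.items():
            if outlinks:
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:
                    new[q] += share  # each outlink passes an equal share
            else:
                # Dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank
```

For example, if pages A and B both link to C and C links back to A, C ends up with the highest rank because it is the page the random surfer reaches most often – exactly the sense in which "some links are stronger than others".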
Linking to a page with actual key-phrases in the link helps a great deal in all search engines when you want to rank for specific key terms. For example, “SEO Scotland” as opposed to https://www.hobo-web.co.uk or “click here“. That said, in 2019 Google punishes manipulative anchor text very aggressively, so be sensible – and stick to brand mentions and plain URL links that build authority with less risk. I rarely optimise for grammatically incorrect terms these days (especially with links).
QUOTE: “Cleaning up these kinds of link issue can take considerable time to be reflected by our algorithms (we don’t have a specific time in mind, but the mentioned 6-12 months is probably on the safe side). In general, you won’t see a jump up in rankings afterwards because our algorithms attempt to ignore the links already, but it makes it easier for us to trust the site later on.” John Mueller, Google, 2018
The reality in 2019 is that if Google classifies your duplicate content as THIN content, MANIPULATIVE BOILERPLATE or NEAR-DUPLICATE ‘SPUN’ content, then you probably DO have a severe problem that violates Google’s website quality recommendations, and this ‘violation’ will need to be cleaned up – if, of course, you intend to rank high in Google.
It’s important to note that Google is responsible for the majority of the search engine traffic in the world. This may vary from one industry to another, but it’s likely that Google is the dominant player in the search results that your business or website would want to show up in. The best practices outlined in this guide will, however, help you position your site and its content to rank in other search engines as well.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] That market share is achieved in a number of countries.

The audit covers all pages, not just one. In the majority of cases, pages and posts have similarities, so you can group them together. For example, the pages of a website may be fine, while the blog post pages are missing titles. It’s a lot of work, especially for a 500-page website, but you can start with the most important pages first and work your way through the rest.
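A first pass of that kind of audit – finding pages with missing titles so you can group and fix them – can be automated. This is a minimal sketch using only the Python standard library; the page URLs and HTML are placeholder examples, and a real audit would fetch pages from a crawl:

```python
from html.parser import HTMLParser

class TitleAuditor(HTMLParser):
    """Collects the <title> text of one HTML page, if any."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_titles(pages):
    """Group page URLs by whether their HTML has a non-empty <title>.

    pages maps a URL to its HTML source.
    """
    report = {"ok": [], "missing_title": []}
    for url, html in pages.items():
        parser = TitleAuditor()
        parser.feed(html)
        key = "ok" if parser.title.strip() else "missing_title"
        report[key].append(url)
    return report
```

Running this over a crawl gives you the groups described above – e.g. site pages in "ok", blog posts in "missing_title" – so you can prioritise the most important pages first.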
Experience teaches you to recognise when a page is high-quality and yet receives no traffic. If the page is thin but is not manipulative, is indeed ‘unique’ and delivers on a purpose, with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just one with very little search demand for it. Ignored content is not the same as ‘toxic’ content.
When it comes to finally choosing the SEO tools that suit your organization's needs, the decision comes back to that concept of gaining tangible ground. It's about discerning which tools provide the most effective combination of keyword-driven SEO investigation capabilities, and then on top of that, the added keyword organization, analysis, recommendations, and other useful functionality to take action on the SEO insights you uncover. If a product is telling you what optimizations need to be made to your website, does it then provide technology to help you make those improvements?
The world is mobile today. Most people search on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
The ranking of your website is partly decided by on-page factors. On-page SEO factors are all those things you can influence from within your actual website. These factors include technical aspects (e.g. the quality of your code and site speed) and content-related aspects, like the structure of your website or the quality of the copy on your website. These are all crucial on-page SEO factors.

Everything for Google these days comes down to “customer experience”. Page speed, tables of contents, the relevancy of content, the length and uniqueness of content, jump links, video in the content, relevant headers, fewer broken links, customised 404 pages, etc. are all indicative of an improved customer experience on the website and hence help you rank better.


Another illicit practice is to place "doorway" pages loaded with keywords on the client's site somewhere. The SEO promises this will make the page more relevant for more queries. This is inherently false since individual pages are rarely relevant for a wide range of keywords. More insidious, however, is that these doorway pages often contain hidden links to the SEO's other clients as well. Such doorway pages drain away the link popularity of a site and route it to the SEO and its other clients, which may include sites with unsavory or illegal content.
Disclaimer: “Whilst I have made every effort to ensure that the information I have provided is correct, it is not advice; I cannot accept any responsibility or liability for any errors or omissions. The author does not vouch for third party sites or any third party service. Visit third party sites at your own risk. I am not directly partnered with Google or any other third party. This website uses cookies only for analytics and basic website functions. This article does not constitute legal advice. The author does not accept any liability that might arise from accessing the data presented on this site. Links to internal pages promote my own content and services.” Shaun Anderson, Hobo

QUOTE: “We are a health services comparison website… so you can imagine that for the majority of those pages the content that will be presented, in terms of the clinics that will be listed, looking fairly similar, right? And the same I think holds true if you look at it from the location… we’re conscious that this causes some kind of content duplication, so the question is: is this type … to worry about?“


The depth of your articles impresses and amazes me. I love all the specific examples and tool recommendations. You discuss the importance of backlinks. How important is it to use a tool to list you on directories (Yext, Moz Local, Synup or JJUMP)? Will Google penalize you for listing on unimportant directories? Is it better to avoid these tools and get backlinks one at a time and avoid all but a few key directories?
We concentrated on the keyword-based aspect of all the SEO tools that included the capabilities, because that's where most business users will primarily focus. Monitoring particular keywords and your existing URL positions in search rankings is important but, once you've set that up, it's largely an automated process. Automated position-monitoring features are a given in most SEO platforms and most will alert you to issues, but they don't actively improve your search position. In tools such as AWR Cloud, Moz Pro, and Searchmetrics, though, position monitoring can become a proactive process that feeds back into your SEO strategy, spurring further keyword research and targeted site and competitor domain crawling.

Hi Noya, all the info suggests that dwell time IS taken into account in search ranking, and we know that Google measures time on page and bounce rate in Analytics, too. Plus the search engine gets smarter all the time. With the machine learning component of RankBrain, we wouldn’t be surprised if Google can tell the difference between sites where visitors stick around, bounces where the visitor gets an answer immediately, and bounces where the visitor keeps searching.
