QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo, 2018

You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
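Mechanically, a nofollow link is just a normal anchor tag with `rel="nofollow"` added. As a minimal sketch (assuming a simple comment-rendering pipeline; real systems usually do this at render time with a proper HTML parser), user-submitted links could be rewritten like this in Python:

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that lacks a rel attribute.

    A simplistic regex-based illustration, not production HTML handling.
    """
    def fix(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave any existing rel attribute alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", fix, html)

print(add_nofollow('<a href="http://spam.example">spam</a>'))
# → <a href="http://spam.example" rel="nofollow">spam</a>
```

The same approach works for the comment-section and widget-link cases discussed elsewhere in this piece: strip the reputation transfer without removing the link itself.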


You pay each time a user clicks on an SEM result. You pay nothing when a user clicks on an SEO result. SEM results are paid placements, and your brand is charged each time a user clicks on the result. Therefore, you must have a budget for continually showing SEM ads and using this form of PPC lead generation. On the flip side, you are never charged when a user clicks on an organic search result.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags18 and better snippets for your users19. We also have a handy Help Center article on how to create good titles and snippets20.
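Because Google only *might* use the description meta tag, it is worth auditing pages for missing or badly sized descriptions. Here is a rough sketch using Python's standard-library HTML parser; the 50–160 character bounds are common industry conventions used here for illustration, not limits Google publishes:

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collect the content of a <meta name="description"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

def audit(html: str, min_len: int = 50, max_len: int = 160) -> str:
    """Return 'missing', 'bad length', or 'ok' for a page's description tag."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    if parser.description is None:
        return "missing"
    if not (min_len <= len(parser.description) <= max_len):
        return "bad length"
    return "ok"

print(audit('<head><meta name="description" '
            'content="A short, accurate summary of the page for searchers to read."></head>'))
# → ok
```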

Are you just launching your first website and creating your initial online footprint to promote your product or service? Then you’ll likely need immediate visibility in search until you build up your organic credibility. With a strategic PPC campaign, you'll be able to achieve this. What you shouldn't do, though, is rely strictly on PPC over the long-term while ignoring organic SEO. You still need to create great content that visitors will want to engage with once they get to your website.


We concentrated on the keyword-based aspect of the SEO tools that include these capabilities, because that's where most business users will primarily focus. Monitoring particular keywords and your existing URL positions in search rankings is important but, once you've set that up, it's largely an automated process. Automated position-monitoring features are a given in most SEO platforms and most will alert you to issues, but they don't actively improve your search position. In tools such as AWR Cloud, Moz Pro, and Searchmetrics, however, position monitoring can become a proactive process that feeds back into your SEO strategy. It can spur further keyword research and targeted site and competitor domain crawling.
Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring that at least one of them is well optimised for your desired key phrase. Always remember Google does not want to rank ‘thin’ pages in results – any page you want to rank should have all the things Google is looking for. That’s a lot these days!
For example, within the HubSpot Blogging App, users will find as-you-type SEO suggestions. This helpful inclusion serves as a checklist for content creators of all skill levels. HubSpot customers also have access to the Page Performance App, Sources Report, and the Keyword App. The HubSpot Marketing Platform will provide you with the tools you need to research keywords, monitor their performance, track organic search growth, and diagnose pages that may not be fully optimized.

QUOTE: “Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high-quality search results that are optimized for their devices”. GOOGLE
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.

This review roundup covers 10 SEO tools: Ahrefs, AWR Cloud, DeepCrawl, KWFinder.com, LinkResearchTools, Majestic, Moz Pro, Searchmetrics Essentials, SEMrush, and SpyFu. The primary function of KWFinder.com, Moz Pro, SEMrush, and SpyFu falls under keyword-focused SEO. When deciding what search topics to target and how best to focus your SEO efforts, treating keyword querying like an investigative tool is where you'll likely get the best results.


Both use keyword research to uncover popular search terms. The first step for both SEM and SEO is performing keyword research to identify the best keywords to target. The research includes looking at keyword popularity to determine the top keywords or buying keywords that your ideal audience searches for. It also includes looking at keyword competition to see what other brands are targeting the same keywords and determining what you will need to do to compete with those other companies.
The Java program is fairly intuitive, with easy-to-navigate tabs. Additionally, you can export any or all of the data into Excel for further analysis. So say you're using Optify, Moz, or RavenSEO to monitor your links or rankings for specific keywords -- you could simply create a .csv file from your spreadsheet, make a few adjustments for the proper formatting, and upload it to those tools.
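The exact column layout you need depends on the tool you are importing into – the header names below are hypothetical, so check the import documentation of Optify, Moz, or RavenSEO for their required formats – but Python's csv module handles the reformatting step cleanly:

```python
import csv
import io

# Hypothetical rows exported from a rank-tracking spreadsheet.
rows = [
    {"keyword": "seo tools", "url": "https://example.com/", "rank": 4},
    {"keyword": "backlink audit", "url": "https://example.com/blog", "rank": 11},
]

# Write the rows out with a fixed header order the target tool expects.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["keyword", "url", "rank"])
writer.writeheader()
writer.writerows(rows)

print(buf.getvalue())
```

In practice you would write to a real file (`open("export.csv", "w", newline="")`) instead of an in-memory buffer, then upload that file to the monitoring tool.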
In addition to on-page SEO factors, there are off-page SEO factors. These factors include links from other websites, social media attention, and other marketing activities outside your own website. These off-page SEO factors can be rather difficult to influence. The most important of these off-page factors is the number and quality of links pointing towards your site. The more quality, relevant sites that link to your website, the higher your position in Google will be.
The reality in 2019 is that if Google classifies your duplicate content as THIN content, or MANIPULATIVE BOILER-PLATE or NEAR DUPLICATE ‘SPUN’ content, then you probably DO have a severe problem that violates Google’s website performance recommendations, and this ‘violation’ will need to be cleaned up – if, of course, you intend to rank high in Google.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
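For example, a robots.txt along these lines (the paths are hypothetical) keeps a private area blocked while explicitly leaving CSS and JavaScript crawlable so Googlebot can render the page:

```
User-agent: Googlebot
Disallow: /private/
Allow: /*.css$
Allow: /*.js$
```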
The emphasis on tools, meaning plural, is important because there's no one magical way to plop your website atop every single search results page, at least not organically, though there are best practices to do so. If you want to buy a paid search ad spot, then Google AdWords will happily take your money. This will certainly put your website at the top of Google's search results but always with an indicator that yours is a paid position. To win the more valuable and customer-trusted organic search spots (meaning those spots that start below all of those marked with an "Ad" icon), you must have a balanced and comprehensive SEO strategy in place.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner34 that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report35.

QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.” Google Search Quality Evaluator Guidelines 2017
Ask for a technical and search audit for your site to learn what they think needs to be done, why, and what the expected outcome should be. You'll probably have to pay for this. You will probably have to give them read-only access to your site on Search Console. (At this stage, don't grant them write access.) Your prospective SEO should be able to give you realistic estimates of improvement, and an estimate of the work involved. If they guarantee you that their changes will give you first place in search results, find someone else.

The impact of SEM is immediate. SEO takes time. Through paid SEM ads, you can start to put your results in front of audiences with just a few clicks. As soon as you launch a campaign, your ads start showing in SERPs. At any time, you can turn ads on to increase visibility or turn them off to stop showing. Conversely, SEO is something that you acquire over time and typically over a long time. It can take months of implementing an SEO strategy before a brand begins to rank on search engines.


The terms SEO experts often start with are page authority (PA) and domain authority (DA). DA, a metric coined by Moz, is a 100-point scale that predicts how well a website will rank on search engines. PA is the modern umbrella term for what started as Google's original PageRank algorithm, developed by co-founders Larry Page and Sergey Brin. Google still uses PageRank internally but has gradually stopped supporting the increasingly irrelevant metric, which it now rarely updates. PA is the custom metric each SEO vendor now calculates independently to gauge and rate (again, on a scale of 100) the link structure and authoritative strength of an individual page on any given domain. There is an SEO industry debate as to the validity of PA and DA, and how much influence the PageRank algorithm still holds in Google results (more on that in a bit), but outside of Google's own analytics, they're the most widely accepted metrics out there.
I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank ad nauseam for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can. Some cannot. Some links are trusted enough to pass ranking signals to another page. Some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.
SEM search results have ad extensions. SEO search results have featured snippets. When comparing SEM vs. SEO, you’ll also find differences in the appearance of the search results. SEM search results may include ad extensions, which can add on additional links, phone numbers, and callouts. On the other hand, SEO results may appear with featured snippets in search.

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
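If the goal is to keep a page out of search results rather than to hide it entirely, the usual advice is the opposite of blocking: let crawlers fetch the page but serve a noindex directive, and put genuinely confidential material behind authentication. The noindex signal can be sent either way (how you set the header depends on your server):

```
# HTTP response header:
X-Robots-Tag: noindex

# or in the page's <head>:
<meta name="robots" content="noindex">
```

Note that a crawler must be able to fetch the page to see either directive, so a URL that is both robots.txt-blocked and noindexed may still appear in results as a bare URL.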


In the enterprise space, one major trend we're seeing lately is data import across the big players. Much of SEO involves working with the data Google gives you and then filling in all of the gaps. Google Search Console (formerly, Webmaster Tools) only gives you a 90-day window of data, so enterprise vendors, such as Conductor and Screaming Frog, are continually adding and importing data sources from other crawling databases (like DeepCrawl's). They're combining that with Google Search Console data for more accurate, ongoing Search Engine Results Page (SERP) monitoring and position tracking on specific keywords. SEMrush and Searchmetrics (in its enterprise Suite packages) offer this level of enterprise SERP monitoring as well, which can give your business a higher-level view of how you're doing against competitors.
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
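Structured data is typically supplied as a JSON-LD block that can be included unchanged on both the mobile and desktop versions of a page. A minimal, hypothetical example using schema.org's Article type:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline shared across desktop and mobile",
  "datePublished": "2019-01-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
```

Embedding the same block in a `<script type="application/ld+json">` tag on every version of the page keeps the metadata consistent for search engines regardless of device.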
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page30 that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors31.
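As a minimal sketch – not any particular framework, just Python's standard-library HTTP server with hypothetical page content – a custom 404 that guides users back to working pages might look like:

```python
from http.server import BaseHTTPRequestHandler

# A friendly 404 body that links back to the root page and popular content.
NOT_FOUND_HTML = (
    b"<h1>Sorry, that page doesn't exist</h1>"
    b'<p><a href="/">Back to the home page</a> or '
    b'<a href="/blog/">browse popular posts</a>.</p>'
)

PAGES = {"/": b"<h1>Home</h1>"}  # hypothetical site content

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            # Return a real 404 status (never a "soft 404" with status 200)
            # so search engines know the URL genuinely doesn't exist.
            self.send_response(404)
            body = NOT_FOUND_HTML
        else:
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)
```

The key details are the genuine 404 status code and the helpful links in the body; the same idea applies whatever server or CMS you actually use.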
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Webmaster Tools if you sign up.

All of this plays into a new way businesses and SEO professionals need to think when approaching what keywords to target and what SERP positions to chase. The enterprise SEO platforms are beginning to do this, but the next step in SEO is full-blown content recommendation engines and predictive analytics. By using all of the data you pull from your various SEO tools, Google Search Console, and keyword and trend data from social listening platforms, you can optimize for a given keyword or query before Google does it first. If your keyword research uncovers a high-value keyword or SERP for which Google has not yet monetized the page with a Quick Answer or a Featured Snippet, then pounce on that opportunity.
The caveat in all of this is that, in one way or another, most of the data and the rules governing what ranks and what doesn't (often on a week-to-week basis) come from Google. If you know where to find and how to use the free and freemium tools Google provides under the surface—AdWords, Google Analytics, and Google Search Console being the big three—you can do all of this manually. Much of the data that the ongoing position monitoring, keyword research, and crawler tools provide is extracted in one form or another from Google itself. Doing it yourself is a disjointed, meticulous process, but you can piece together all the SEO data you need to come up with an optimization strategy should you be so inclined.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
Sometimes, Google turns up the dial on demands on ‘quality’, and if your site falls short, a website traffic crunch is assured. Some sites invite problems ignoring Google’s ‘rules’ and some sites inadvertently introduce technical problems to their site after the date of a major algorithm update and are then impacted negatively by later refreshes of the algorithm.
Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.
Another example of when the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check if it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site which are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.

QUOTE: “In place of a pop-up try a full-screen inline ad. It offers the same amount of screen real estate as pop-ups without covering up any content. Fixing the problem depends on the issue you have for example if it’s a pop-up you’ll need to remove all the pop-up ads from your site but if the issue is high ad density on a page you’ll need to reduce the number of ads” Google, 2017

While Google never sells better ranking in our search results, several other search engines combine pay-per-click or pay-for-inclusion results with their regular web search results. Some SEOs will promise to rank you highly in search engines, but place you in the advertising section rather than in the search results. A few SEOs will even change their bid prices in real time to create the illusion that they "control" other search engines and can place themselves in the slot of their choice. This scam doesn't work with Google because our advertising is clearly labeled and separated from our search results, but be sure to ask any SEO you're considering which fees go toward permanent inclusion and which apply toward temporary advertising.
Additionally, there are many situations where PPC (a component of SEM) makes more sense than SEO. For example, if you are first launching a site and you want immediate visibility, it is a good idea to create a PPC campaign because it takes less time than SEO, but it would be unwise to strictly work with PPC and not even touch search engine optimization.

Use the Lowest rating for websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.


Most small businesses owners and marketers know a little something about SEO (search engine optimization) and the different tactics to help your website rank well in organic search engine results. Another important tactic for any Internet business to know about is SEM (search engine marketing), which includes things such as search engine optimization, paid listings and other search engine related services.
Crawlers are largely a separate product category. There is some overlap with the self-service keyword tools (Ahrefs, for instance, does both), but crawling is another important piece of the puzzle. We tested several tools with these capabilities either as their express purpose or as features within a larger platform. Ahrefs, DeepCrawl, Majestic, and LinkResearchTools are all primarily focused on crawling and backlink tracking, the inbound links coming to your site from another website. Moz Pro, SpyFu, SEMrush, and AWR Cloud all include domain crawling or backlink tracking features as part of their SEO arsenals.