There are many definitions of SEO (spelled "search engine optimisation" in the UK, Australia and New Zealand, and "search engine optimization" in the United States and Canada), but organic SEO in 2019 is still mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK).
Experience will teach you to recognise when a page is high-quality and yet receives no traffic. If the page is thin but not manipulative, is genuinely 'unique' and delivers on a purpose with no obvious reason to mark it down, then you can call it a high-quality page – just one with very little search demand for it. Ignored content is not the same as 'toxic' content.
I've got by by thinking that external links to other sites should probably sit on single pages deeper in your site architecture, with those pages receiving all your 'Google juice' once it has been "soaked up" by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it, and I don't think you need to worry about it too much in 2019.
When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) whether this is a spam site or a quality site – for instance, have you repeated the keyword four times, or only once? I think title tags, like everything else, should be as simple as possible: the keyword once, and perhaps a related term if possible.
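As a rough illustration of that advice, a few lines of Python (a hypothetical helper, not part of any SEO tool) can flag a title that repeats its target keyword more than once:

```python
import re

def keyword_count(title: str, keyword: str) -> int:
    """Count case-insensitive, whole-phrase occurrences of a keyword in a title."""
    return len(re.findall(r"\b" + re.escape(keyword) + r"\b", title, re.IGNORECASE))

def looks_stuffed(title: str, keyword: str, limit: int = 1) -> bool:
    """Flag a title that repeats the target keyword more than `limit` times."""
    return keyword_count(title, keyword) > limit

print(looks_stuffed("Cheap Hotels | Cheap Hotels London | Cheap Hotels UK", "cheap hotels"))  # True
print(looks_stuffed("Cheap Hotels in London - Book Direct", "cheap hotels"))  # False
```

The threshold of one is just the rule of thumb from the paragraph above, not anything Google has published.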
I think anchor text links in internal navigation are still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don't underestimate the value of a clever, keyword-rich internal link architecture – and be sure to understand, for instance, how many words Google counts in a link – but don't overdo it. Too many links on a page can be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
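To get a feel for how many links (and how much anchor text) a template actually exposes, you can tally them with Python's standard-library HTML parser; the markup here is invented for illustration:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Tally <a href> links on a page and collect their anchor text."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.anchors = []
        self._in_anchor = False

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually link somewhere
        if tag == "a" and any(k == "href" for k, _ in attrs):
            self.links += 1
            self._in_anchor = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_anchor = False

    def handle_data(self, data):
        if self._in_anchor and data.strip():
            self.anchors.append(data.strip())

parser = LinkCounter()
parser.feed('<nav><a href="/seo/">SEO tutorial</a> <a href="/blog/">Blog</a></nav>')
print(parser.links, parser.anchors)  # 2 ['SEO tutorial', 'Blog']
```

What counts as "too many" links is a judgment call; this only makes the tally visible.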

"Shareability" – not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful about rolling out large quantities of pages with thin content, you want to consider who would be likely to share and link to new pages before you roll them out. Having large quantities of pages that are unlikely to be shared or linked to does not position those pages to rank well in search results, and does not help create a good picture of your site as a whole for search engines, either.


Evaluating which self-service SEO tools are best suited to your business incorporates a number of factors, features, and SEO metrics. Ultimately, though, when we talk about "optimizing," it all comes down to how easy the tool makes it to get, understand, and take action on the SEO data you need. Particularly when it comes to ad hoc keyword investigation, it's about the ease with which you can zero in on the ground where you can make the most progress. In business terms, that means making sure you're targeting the most opportune and effective keywords available in your industry or space—the words for which your customers are searching.
Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These create a very poor user experience, and users may feel tricked or confused. We will call these "sneaky redirects." Sneaky redirects are deceptive and should be rated Lowest.
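For contrast, an honest server-side redirect declares the new location openly in the HTTP response itself, so both users and crawlers can see where they are going. A minimal WSGI sketch (the paths are placeholders) might look like this:

```python
# Hypothetical mapping of moved pages to their new homes
MOVED = {"/old-page": "/new-page"}

def app(environ, start_response):
    """A tiny WSGI app that issues a transparent 301 for moved pages."""
    path = environ.get("PATH_INFO", "/")
    if path in MOVED:
        # 301 Moved Permanently: the opposite of a "sneaky" redirect
        start_response("301 Moved Permanently", [("Location", MOVED[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Hello</h1>"]

def demo(path):
    captured = {}
    app({"PATH_INFO": path}, lambda s, h: captured.setdefault("status", s))
    return captured["status"]

print(demo("/old-page"))  # 301 Moved Permanently
```

A sneaky redirect, by contrast, typically hides the hop in JavaScript or shows crawlers different content than users – exactly what the rater guidance above penalises.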
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
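An XML Sitemap feed like the one Search Console accepts follows the simple sitemaps.org protocol. Here is a minimal Python sketch that builds one; the URLs are examples only:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) as a string."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # the page's canonical URL
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Real sitemaps often add optional `<lastmod>` elements per URL, but `<loc>` is the only required child.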
Hi Claire, you're welcome. It depends. If the keyword seems like a Featured Snippet would make sense (for example, it's a term that could use a definition, or there's a list of steps or tips), I'd still try snippetbait. One other thing I'd keep in mind is that Featured Snippets tend to float in and out. For example, for the keyword "how to get more subscribers on YouTube", the Featured Snippet tends to appear (with us ranking in it) and disappear on a weekly basis – just Google testing stuff out.

When Google trusts you it’s because you’ve earned its trust to help it satisfy its users in the quickest and most profitable way possible. You’ve helped Google achieve its goals. It trusts you and it will reward you with higher rankings. Google will list “friends” it trusts the most (who it knows to be reputable in a particular topic) at the top of SERPs.

LinkResearchTools makes backlink tracking its core mission and provides a wide swath of backlink analysis tools. LinkResearchTools and Majestic provide the best backlink crawling of this bunch. Aside from these two backlink powerhouses, many of the other tools we tested, such as Ahrefs, Moz Pro, Searchmetrics, SEMrush, and SpyFu, also include solid backlink tracking capabilities.

When your business has an idea about a new search topic for which you think your content has the potential to rank highly, the ability to spin up a query and investigate it right away is key. Even more importantly, the tool should give you enough data points, guidance, and recommendations to confirm whether or not that particular keyword, or a related keyword or search phrase, is an SEO battle worth fighting (and, if so, how to win). We'll get into the factors and metrics to help you make those decisions a little later.
I like the competition analysis tools: they provide paid and organic data, which gives me an idea of how to catch up to and outrank the immediate competition for my clients. They also provide potential traffic data, which helps show clients the potential gains of a campaign. And with the marketing plan, I know what needs to be improved in order to get results for my clients.
SEO is the practice of optimizing websites to make them reach a high position in Google’s – or another search engine’s – search results. At Yoast, we believe that holistic SEO is the best way to rank your website because you focus on making every aspect of your site awesome. Don’t use any black-hat SEO tricks, because eventually, this will have negative consequences for your rankings. Instead, practice sustainable SEO, with your user in mind, and you will benefit in the long run.

QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
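The last point in that quote – returning a real 404 status code – is what separates a proper error page from a "soft 404" (a friendly not-found page that wrongly returns 200 and so may get indexed). A toy heuristic, with a phrase list invented purely for illustration, might check for the mismatch like this:

```python
def is_soft_404(status: int, body: str) -> bool:
    """Heuristic check for a 'soft 404': a not-found page served with a
    200 status code. The phrase list is a guess for illustration only."""
    not_found_phrases = ("page not found", "can't be found", "404")
    looks_missing = any(p in body.lower() for p in not_found_phrases)
    return status == 200 and looks_missing

print(is_soft_404(200, "<h1>Page not found</h1>"))  # True  - wrong status code
print(is_soft_404(404, "<h1>Page not found</h1>"))  # False - correct status code
```

Google's own soft-404 detection is far more sophisticated, but the principle is the same: the body says "missing" while the header says "OK".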


“Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
Yes, you need to build links to your site to acquire more PageRank, or Google 'juice' – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand 'good' or 'quality' content, but it does understand 'popular' content. It can also usually identify poor, or THIN, content – and it penalises your site for that – or at least it takes away the traffic you once had with an algorithm change. Google doesn't like calling the actions it takes a 'penalty' – it doesn't look good. It blames your ranking drops on its engineers getting better at identifying quality content and links, or the inverse – low-quality content and unnatural links. If Google does take action against your site for paid links, it calls this a 'Manual Action', and you will be notified about it in Webmaster Tools (now Search Console) if you sign up.
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index and understand your content.
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2016
Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant – and unless you are careful, you might end up just giving spammers free original text for their site rather than yours, once they scrape your descriptions and put the text in the main content on their pages. I don't worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals.
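Spotting duplicated meta descriptions across a site is easy to script. A small sketch, with made-up example pages:

```python
from collections import Counter

def duplicate_descriptions(pages):
    """Return the URLs whose meta description is shared by another URL.
    `pages` maps URL -> meta description text (example data below)."""
    counts = Counter(desc.strip().lower() for desc in pages.values())
    return {url: d for url, d in pages.items() if counts[d.strip().lower()] > 1}

pages = {
    "/red-widgets": "Buy widgets online.",
    "/blue-widgets": "Buy widgets online.",
    "/about": "Who we are and what we do.",
}
print(sorted(duplicate_descriptions(pages)))  # ['/blue-widgets', '/red-widgets']
```

In practice you would feed this from a crawl export rather than a hand-written dict, but the dedup logic is the same.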
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] That market share is achieved in a number of countries.
A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings, as Google measures these. It will usually be displayed in a web browser's window title bar, in bookmarks, and in the clickable search snippet links used by Google, Bing and other search engines. The title element is the "crown" of a web page, with your important keyword phrase featured at least once within it.
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
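That trailing-slash rule can be expressed as a small comparison helper – a sketch using only the standard library:

```python
from urllib.parse import urlsplit

def same_url(a: str, b: str) -> bool:
    """Treat a trailing slash as optional only on the bare hostname;
    for any deeper path, '/fish' and '/fish/' are distinct URLs."""
    pa, pb = urlsplit(a), urlsplit(b)
    # An empty path and "/" both mean the homepage
    norm = lambda p: "/" if p.path in ("", "/") else p.path
    return (pa.scheme, pa.netloc, norm(pa)) == (pb.scheme, pb.netloc, norm(pb))

print(same_url("https://example.com/", "https://example.com"))            # True
print(same_url("https://example.com/fish", "https://example.com/fish/"))  # False
```

This ignores query strings and fragments for brevity; a production canonicaliser would need to handle those too.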
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Google used to make much of its ad hoc keyword search functionality available as well, but now the Keyword Planner is behind a paywall in AdWords as a premium feature. Difficulty scores are inspired by the way Google calculates its Competition Score metric in AdWords, though most vendors calculate difficulty using PA and DA numbers correlated with search engine positions, without AdWords data blended in at all. Search Volume is a different matter, and is almost always directly lifted from AdWords. The same goes for keyword suggestions and related-keywords data, which in many tools come from Google's Suggest and Autocomplete application programming interfaces (APIs).
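To make the PA/DA idea concrete, here is a deliberately toy difficulty score. The weights and numbers are invented for illustration and bear no relation to any vendor's actual formula:

```python
def toy_difficulty(serp):
    """Illustrative only: average a blend of page authority (PA) and
    domain authority (DA) across the top-ranking results - the rough
    shape of vendor difficulty scores. Weights are made up."""
    return round(sum(0.6 * pa + 0.4 * da for pa, da in serp) / len(serp), 1)

# (PA, DA) pairs for a hypothetical top-5 SERP
print(toy_difficulty([(55, 70), (48, 62), (40, 80), (35, 50), (30, 45)]))
```

The intuition is simply that a SERP full of high-authority pages is harder to break into, so the blended average rises.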
I used to work with tools like Sistrix, Ahrefs or Searchmetrics and did not know about SE Ranking before. But those tools were too cost-intensive for a small and quick start into SEO, so I tried SE Ranking out and I am quite satisfied with it. I like the ability to pay for certain services with credits, as I am not using them very frequently; it gives me greater flexibility to use them only when needed and not pay for them when I am not using them.