For example, let's say the keyword difficulty of a particular term is in the 80s and 90s in the top five spots on a particular search results page. Then, in positions 6-9, the difficulty scores drop down into the 50s and 60s. Using those difficulty scores, a business can begin targeting that range of spots and running competitive analysis on the pages that hold them, to see which ones its website could knock out of their spot.
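As a rough sketch of that filtering step (the scores and the difficulty threshold below are made up for illustration, not taken from any real tool):

```python
# Hypothetical SERP data: (position, keyword difficulty score) pairs,
# mirroring the example above: positions 1-5 in the 80s/90s, 6-9 in the 50s/60s.
serp = [(1, 88), (2, 91), (3, 84), (4, 87), (5, 90),
        (6, 58), (7, 62), (8, 55), (9, 60)]

MAX_DIFFICULTY = 70  # assumed threshold this site could realistically compete at

# Positions whose current occupant looks beatable on difficulty score alone.
targets = [pos for pos, kd in serp if kd <= MAX_DIFFICULTY]
print(targets)  # [6, 7, 8, 9]
```

The shortlisted positions are then the pages worth running a fuller competitive analysis on.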
Onsite, consider linking to your other pages from within the main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto-link plugins, for instance, so I avoid them.
I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank ad nauseam for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can. Some cannot. Some links are trusted enough to pass ranking signals to another page. Some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.
There are a lot of definitions of SEO (spelled search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada) but organic SEO in 2019 is still mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK in 2019):

However, if possible, I would like you to expand a bit on your “zombie pages” tip. We run a site where there are definitely enough pages to delete (no sessions, no links, probably not even relevant to the main theme of the site, not even important for the architecture of the site). Nonetheless, I am not sure what the best technical decision for these pages is: just deleting them from my CMS, redirecting (if there is a relevant alternative), or something else? De-index them in Search Console? What response code should they have?
Your article reaches me at just the perfect time. I’ve been working on getting back to blogging and have been at it for almost a month now. I’ve been fixing SEO-related stuff on my blog and after reading this article (which, by the way, is way too long for one sitting) I’m kind of confused. I’m looking at bloggers like Darren Rowse, Brian Clark, and so many others who use their blogs as a platform to educate their readers more than thinking about search rankings (but I’m sure they do).
We concentrated on the keyword-based capabilities of these SEO tools, because that's where most business users will primarily focus. Monitoring particular keywords and your existing URL positions in search rankings is important but, once you've set that up, it's largely an automated process. Automated position-monitoring features are a given in most SEO platforms and most will alert you to issues, but they don't actively improve your search position. However, in tools such as AWR Cloud, Moz Pro, and Searchmetrics, position monitoring can become a proactive process that feeds back into your SEO strategy. It can spur further keyword research and targeted site and competitor domain crawling.
The last time I looked Google displayed as many characters as it can fit into a block element that’s about 600px wide and doesn’t exceed 1 line of text (on desktop). So – THERE IS NO BEST PRACTICE AMOUNT OF CHARACTERS any SEO could lay down as exact best practice to GUARANTEE a title will display, in full in Google, at least, as the search snippet title, on every device. Ultimately – only the characters and words you use will determine if your entire page title will be seen in a Google search snippet.
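To make the point concrete, here is a deliberately crude heuristic (the average character width and the 600px limit are assumptions for illustration; real rendering depends on the actual font and the specific characters used, which is exactly why no character count can be guaranteed):

```python
# Rough sketch: estimate whether a title is likely to fit in ~600px using an
# assumed average pixel width per character. This is a heuristic only -
# wide letters (W, M) cost more than narrow ones (i, l), so it cannot
# guarantee a title will display in full.
AVG_CHAR_PX = 10   # assumed average width per character
LIMIT_PX = 600     # approximate width of Google's desktop title block

def likely_fits(title: str) -> bool:
    return len(title) * AVG_CHAR_PX <= LIMIT_PX

print(likely_fits("Short, punchy page title"))  # True
```

A title of 70+ characters would fail this check, but a 58-character title in narrow letters might still display in full while a 55-character title in wide capitals gets truncated.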

Additionally, there are many situations where PPC (a component of SEM) makes more sense than SEO. For example, if you are first launching a site and you want immediate visibility, it is a good idea to create a PPC campaign because it takes less time than SEO, but it would be unwise to strictly work with PPC and not even touch search engine optimization.
I do not obsess about site architecture as much as I used to… but I always ensure the pages I want indexed are all available from a crawl from the home page – and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact-match anchor text pointing to the page from internal links – but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.
QUOTE: “I think that’s always an option. Yeah. That’s something that–I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.” John Mueller, Google
Experience can educate you when a page is high-quality and yet receives no traffic. If the page is thin, but is not manipulative, is indeed ‘unique’ and delivers on a purpose with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just with very little search demand for it. Ignored content is not the same as ‘toxic’ content.
Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and importantly in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements – beware relying on it too much to improve your rankings.

These cloud-based, self-service tools have plenty of other unique optimization features, too. Some, such as AWR Cloud and Searchmetrics, also do search position monitoring—which means tracking how your page is doing against popular search queries. Others, such as SpyFu and LinkResearchTools, have more interactive data visualizations, granular and customizable reports, and return on investment (ROI) metrics geared toward online marketing and sales goals. The more powerful platforms can sport deeper analytics on paid advertising and pay-per-click (PPC) SEO as well. Though, at their core, the tools are all rooted in their ability to perform on-demand keyword queries.

Mike Murray has shaped online marketing strategies for hundreds of businesses since 1997, including Fortune 500 companies. A former journalist, he has led SEO studies and spoken at regional and national Internet conferences. Founder of Online Marketing Coach, Mike is passionate about helping clients identify their best opportunities for online marketing success based on their strengths, his advice and industry trends. You can find him at his blog, Online Marketing Matters or on Twitter @mikeonlinecoach.
At the moment, I don’t know you, your business, your website, your resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult because ultimately Google decides on who ranks where in its results – sometimes that’s ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours.
You pay each time a user clicks on an SEM result; you pay nothing when a user clicks on an SEO result. SEM results are paid placements, and your brand is charged each time a user clicks on the result, so you must have a budget for continually showing SEM ads and using this form of PPC lead generation. On the flip side, you are never charged when a user clicks on an organic search result.
At Yoast, we practice what we call ‘holistic SEO’. This means that your primary goal should be to build and maintain the best possible website. Don’t try to fool Google, but use a sustainable long-term strategy. Ranking will come automatically if your website is of extremely high quality. Google wants to get its users to the right place, as its mission is to index all the world’s online information and make it universally accessible and useful.

Google and Bing each use a crawler (Googlebot and Bingbot) that spiders the web looking for new links to follow. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outwith the XML sitemap.
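For reference, a minimal XML sitemap (with placeholder example.com URLs) looks like the fragment below. Listing a URL here invites crawling, but – as above – leaving a page out of the file does not stop Google from crawling and indexing it if it is linked from elsewhere:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/fish/</loc>
  </url>
</urlset>
```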
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
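That equivalence rule can be sketched in a few lines of Python (a simplified illustration of the rule described above, not how Google actually canonicalises URLs):

```python
from urllib.parse import urlsplit

def same_resource(a: str, b: str) -> bool:
    """Homepage with and without a trailing slash is the same resource;
    any other path difference (including a trailing slash) is not."""
    pa, pb = urlsplit(a), urlsplit(b)
    if (pa.scheme, pa.netloc) != (pb.scheme, pb.netloc):
        return False
    # An empty homepage path normalises to "/"
    path_a = pa.path or "/"
    path_b = pb.path or "/"
    return path_a == path_b

print(same_resource("https://example.com/", "https://example.com"))          # True
print(same_resource("https://example.com/fish", "https://example.com/fish/"))  # False
```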
