Good news for web designers, content managers and search engine optimisers! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted,” although it does stipulate, again, that it is horses for courses: if every competing page is just as neglected, you will still fly, but there are not many SERPs like that about these days.
Google expects pages to “be edited, reviewed, and updated on a regular basis”, especially if they cover important issues like medical information. Google states that not all pages are held to such standards, but one can expect that it wants information updated within a reasonable timescale. How reasonable that is depends on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.
Google is looking for a “website that is well cared for and maintained”, so you need to keep your content management system updated and check for broken image links and broken HTML links. If you create a frustrating user experience through sloppy website maintenance, expect that to be reflected in some way in a lower quality rating. The Google Panda update of October 2014 went for e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
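Checking for broken links is easy to automate. Below is a minimal sketch, using only the Python standard library, of the first step: collecting every link and image URL from a page so each can then be checked for a broken target. The sample HTML and paths are invented for the example.

```python
# Collect <a href> and <img src> URLs from a page so they can be
# checked for broken targets (e.g. with a HEAD request per URL).
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.links.append(attrs["src"])

# Invented sample markup for the demonstration.
sample = '<a href="/about">About</a><img src="/img/logo.png" alt="logo">'
collector = LinkCollector()
collector.feed(sample)
print(collector.links)  # ['/about', '/img/logo.png']
```

Run against your own pages, each collected URL would then be requested (for example with `urllib.request`) and any 404 responses reported for fixing.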
Difficulty scores are the SEO market's answer to the patchwork state of all the data out there. All five tools we tested stood out because they offer some version of a difficulty metric: a single, holistic 1-100 score of how difficult it would be for your page to rank organically (without paying Google) for a particular keyword. Difficulty scores are inherently subjective, and each tool calculates them differently. In general, the score incorporates PA, DA, and other factors, including search volume for the keyword, how heavily paid search ads are influencing the results, and how strong the competition is in each spot on the current search results page.
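To make the blending concrete, here is a toy illustration of how such a score might combine those signals. The weights, normalisation caps, and inputs are entirely invented; every real tool uses its own proprietary formula.

```python
# Toy 1-100 "keyword difficulty" blend. The weights and caps are invented
# for illustration only; no real tool publishes its exact formula.
def toy_difficulty(page_authority, domain_authority, monthly_volume, ad_slots):
    authority = (page_authority + domain_authority) / 200  # both scored 0-100
    volume = min(monthly_volume / 100_000, 1.0)            # cap at 100k/month
    ads = min(ad_slots / 4, 1.0)                           # 4 ads = saturated
    score = 0.5 * authority + 0.3 * volume + 0.2 * ads
    return max(1, round(score * 100))

print(toy_difficulty(60, 70, 40_000, 3))
```

The point of the sketch is only that stronger competitor authority, higher search volume, and heavier ad presence all push the score up, which matches how the commercial tools describe their metrics.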
There are a lot of definitions of SEO (spelled search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada), but organic SEO in 2019 is still mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK in 2019).

Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page on your site is well optimised for your desired key phrase. Always remember that Google does not want to rank ‘thin’ pages in results; any page you want to rank should have all the things Google is looking for. That’s a lot these days!



When you write a page title, you have a chance, right at the beginning of the page, to tell Google (and other search engines) whether this is a spam site or a quality site: have you repeated the keyword four times, or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
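A quick way to audit that "keyword once" rule is to count keyword occurrences in each title. The helper and the sample titles below are hypothetical, made up to illustrate the check:

```python
# Hypothetical helper: count how many times a target keyword appears in a
# <title>, to flag titles that stuff the keyword rather than using it once.
import re

def keyword_repeats(title, keyword):
    return len(re.findall(re.escape(keyword), title, flags=re.IGNORECASE))

spammy = "Cheap Shoes | Cheap Shoes Online | Buy Cheap Shoes | Cheap Shoes UK"
clean = "Cheap Shoes Online | Example Store"
print(keyword_repeats(spammy, "cheap shoes"))  # 4
print(keyword_repeats(clean, "cheap shoes"))   # 1
```

A count above one or two on a title is a reasonable prompt to simplify it.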

While that theory is sound (when focused on a single page, where the intent is to deliver useful content to a Google user), using old-school SEO techniques across many pages of a large site seems to amplify site-quality problems after recent algorithm changes. This type of optimisation, without keeping an eye on overall site quality, is self-defeating in the long run.
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
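A mobile-ready page typically starts with a responsive viewport declaration in the document head. This is the standard snippet:

```html
<!-- Standard responsive viewport declaration; without it, mobile browsers
     render the page at a desktop width and scale it down. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

On its own it does not make a layout responsive, but without it even a well-built responsive stylesheet will not display correctly on phones.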
I’ve always thought that if you are serious about ranking, do so with ORIGINAL COPY. It’s clear that search engines reward good content they haven’t found before. They index it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So make sure each of your pages has enough text content that you have written specifically for that page, and you won’t need to jump through hoops to get it ranking.
QUOTE: “As the Googlebot does not see [the text in the] the images directly, we generally concentrate on the information provided in the “alt” attribute. Feel free to supplement the “alt” attribute with “title” and other attributes if they provide value to your users! So for example, if you have an image of a puppy (these seem popular at the moment) playing with a ball, you could use something like “My puppy Betsy playing with a bowling ball” as the alt-attribute for the image. If you also have a link around the image, pointing a large version of the same photo, you could use “View this image in high-resolution” as the title attribute for the link.”
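Since Google reads the alt text rather than the image itself, images with missing or empty alt attributes are worth auditing. Here is a small standard-library sketch; the sample markup and paths are invented for the example:

```python
# Sketch: list image URLs whose <img> tag has no (or empty) alt text, so
# they can be given descriptive alt attributes. Sample markup is invented.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src", "(no src)"))

page = ('<img src="/puppy.jpg" alt="My puppy Betsy playing with a bowling ball">'
        '<img src="/banner.png">')
audit = AltAudit()
audit.feed(page)
print(audit.missing_alt)  # ['/banner.png']
```

Note that purely decorative images are a legitimate exception: an intentionally empty `alt=""` is fine for those, so the flagged list still needs a human pass.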
To avoid throwing link equity away, you might create HIGH-LEVEL, IN-DEPTH TOPIC PAGES on your site and redirect (or canonicalise) any related expired content that HAS INCOMING BACKLINKS to the relevant topic page. Keep that page updated, folding in content from old pages where relevant and where there is a traffic opportunity, to create TOPIC pages that are focused on the customer (e.g. information pages).
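As a sketch of the redirect side, assuming an nginx server and entirely invented URLs, expired pages that still have backlinks can be permanently redirected to the maintained topic page:

```nginx
# Hypothetical example (paths invented): expired product pages that still
# have incoming backlinks are 301-redirected to a maintained topic page.
location = /products/old-widget-2014  { return 301 /guides/widgets; }
location = /products/widget-clearance { return 301 /guides/widgets; }
```

A 301 signals a permanent move, which is what lets the old URLs' link equity consolidate on the topic page rather than evaporate with a 404.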
LinkResearchTools makes backlink tracking its core mission and provides a wide swath of backlink analysis tools. LinkResearchTools and Majestic provide the best backlink crawling of this bunch. Aside from these two backlink powerhouses, many of the other tools we tested, such as Ahrefs, Moz Pro, Searchmetrics, SEMrush, and SpyFu, also include solid backlink tracking capabilities.
Ranking refers to the process search engines use to determine where a particular piece of content should appear on a SERP. Search visibility refers to how prominently a piece of content is displayed in search engine results. Highly visible content (usually the content that ranks highest) may appear right at the top of organic search results or even in a featured snippet, while less-visible content may not appear until searchers click to page two and beyond.