When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) whether this is a spam site or a quality site: for instance, have you repeated the keyword four times, or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
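If you want a quick way to spot titles that overdo it, a minimal Python sketch like the one below will do. The URL and keyword are just examples (I've used the hobo-web.co.uk address mentioned in this article), and the regex-based title extraction is deliberately rough – this is an illustration, not an audit tool.

```python
import re
import urllib.request

def title_keyword_count(url: str, keyword: str) -> int:
    """Fetch a page and count how often a keyword appears in its <title> tag."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not match:
        return 0
    return len(re.findall(re.escape(keyword), match.group(1), re.IGNORECASE))

# Example: flag titles that repeat the keyword more than once
count = title_keyword_count("https://www.hobo-web.co.uk", "SEO")
if count > 1:
    print(f"Keyword appears {count} times in the title - consider simplifying it.")
else:
    print(f"Keyword appears {count} time(s) in the title.")
```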
Linking to a page with actual key phrases in the link text helps a great deal in all search engines when you want to rank for specific key terms. For example, “SEO Scotland” as opposed to https://www.hobo-web.co.uk or “click here”. That said, in 2019, Google is punishing manipulative anchor text very aggressively, so be sensible and stick to brand mentions and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially with links).
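As a rough illustration of the kind of review I mean, the Python sketch below pulls the anchors out of a snippet of HTML and sorts them into generic anchors (“click here”), plain URL or brand anchors, and keyword-rich anchors. The classification rules are my own simplification for the example, not anything Google publishes.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class AnchorAudit(HTMLParser):
    """Collect (anchor text, href) pairs so link text can be reviewed by hand."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            text = " ".join("".join(self._text).split())
            self.anchors.append((text, self._href))
            self._href = None

GENERIC = {"click here", "here", "read more"}

def classify(text: str, href: str) -> str:
    """Very rough buckets: generic, plain URL/brand, or keyword-rich anchor text."""
    if not text or text.lower() in GENERIC:
        return "generic"
    if urlparse(text).scheme or text.lower() == urlparse(href).netloc.lower():
        return "plain URL / brand"
    return "keyword-rich (use sparingly)"

page = '<a href="https://www.hobo-web.co.uk">SEO Scotland</a> <a href="/x">click here</a>'
parser = AnchorAudit()
parser.feed(page)
for text, href in parser.anchors:
    print(f"{text!r:20} -> {classify(text, href)}")
```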
QUOTE: “If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013

TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all of this together into a single topic-centred page that helps a user understand something related to what you sell.
A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, e.g. a lack of reputation or old-school SEO tactics like keyword stuffing a site.
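Keyword stuffing is one of the few of those criteria you can spot-check yourself. The Python sketch below is a crude word-frequency pass over page copy – the sample text and the 3-character cut-off are placeholders for illustration, and a high percentage for one commercial term is only a hint, not a verdict.

```python
import re
from collections import Counter

def keyword_density(text: str, top_n: int = 5):
    """Return the most frequent words and their share of the page copy.

    An unusually high share for a single commercial term is the classic
    old-school keyword-stuffing footprint mentioned above.
    """
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    counts = Counter(w for w in words if len(w) > 3)  # skip very short words
    return [(w, c, round(100 * c / total, 1)) for w, c in counts.most_common(top_n)]

sample = "cheap widgets cheap widgets buy cheap widgets today cheap widgets sale"
for word, count, pct in keyword_density(sample):
    print(f"{word:10} {count:3}  {pct}%")
```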
QUOTE: ‘To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.’ Google, 2016
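One rough way to sanity-check what that means for your own site is to fetch a page as a desktop browser and as a mobile browser and compare how much visible text each version serves. The Python sketch below does exactly that; the user-agent strings, the 70% threshold, and the example URL are placeholders, and stripping tags with regular expressions is a shortcut, not a proper parser.

```python
import re
import urllib.request

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"           # placeholder UA
MOBILE_UA = "Mozilla/5.0 (Linux; Android 10; Pixel 3) Mobile"       # placeholder UA

def visible_word_count(url: str, user_agent: str) -> int:
    """Fetch a page with the given user-agent and count words after stripping tags."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                  flags=re.DOTALL | re.IGNORECASE)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

url = "https://www.hobo-web.co.uk"  # example URL from this article
desktop = visible_word_count(url, DESKTOP_UA)
mobile = visible_word_count(url, MOBILE_UA)
print(f"desktop: {desktop} words, mobile: {mobile} words")
if mobile < 0.7 * desktop:
    print("Mobile page is noticeably thinner - check what the mobile version drops.")
```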

Ask for a technical and search audit for your site to learn what they think needs to be done, why, and what the expected outcome should be. You'll probably have to pay for this. You will probably have to give them read-only access to your site on Search Console. (At this stage, don't grant them write access.) Your prospective SEO should be able to give you realistic estimates of improvement, and an estimate of the work involved. If they guarantee you that their changes will give you first place in search results, find someone else.
Many websites rely on traffic sources other than search engines, such as social media, email, referrals, and direct traffic. For sites like these, SEO errors aren’t as important because search engines aren’t their #1 traffic source. Bear in mind, though, that on a smaller website a couple of errors can have a much bigger negative effect than those same errors would on a larger website.
For example, within the HubSpot Blogging App, users will find as-you-type SEO suggestions. This helpful inclusion serves as a checklist for content creators of all skill levels. HubSpot customers also have access to the Page Performance App, Sources Report, and the Keyword App. The HubSpot Marketing Platform will provide you with the tools you need to research keywords, monitor their performance, track organic search growth, and diagnose pages that may not be fully optimized.
SEO platforms are leaning into this shift by emphasizing mobile-specific analytics. Desktop and mobile now show you different results for the same search: mobile results will often pull key information into mobile-optimized "rich cards," while on desktop you'll see snippets. SEMrush splits its desktop and mobile indexes, providing thumbnails of each page of search results depending on the device, and other vendors, including Moz, are beginning to do the same.
Have you ever received a warning from Google Chrome not to visit a page? It will block the page and prevent you from going there because of some security issue. We begin by ensuring your website passes an SSL Certificate Validity Check. This covers a range of security protocols that should be built into your website’s code or configured on the domain. It shows the world that your site is trustworthy!
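You can run a basic version of that check yourself. The Python sketch below opens a TLS connection to a hostname (the hobo-web.co.uk address from this article is used as the example) and reports how many days the certificate has left; the handshake itself fails if the certificate isn’t valid for the hostname, so a successful call is already a rough validity check. It’s a minimal illustration, not a full security scan.

```python
import socket
import ssl
from datetime import datetime, timezone

def cert_days_remaining(hostname: str, port: int = 443) -> int:
    """Connect over TLS and return the number of days until the certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]),
                                     tz=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

print(cert_days_remaining("www.hobo-web.co.uk"), "days until the certificate expires")
```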