Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console's Performance Report shows you the top search queries your site appears for and the ones that led the most users to your site.

In addition to on-page SEO factors, there are off-page SEO factors. These factors include links from other websites, social media attention, and other marketing activities outside your own website. These off-page SEO factors can be rather difficult to influence. The most important of these off-page factors is the number and quality of links pointing towards your site. The more quality, relevant sites that link to your website, the higher your position in Google will be.
Google and Bing each use a crawler (Googlebot and Bingbot, respectively) that spiders the web looking for new links. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site, provided all your pages are linked together. If your website has an XML sitemap, Google will use it to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE: Google will crawl and index every page it can find on your site, even pages that are not listed in the XML sitemap.
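As a concrete illustration of the sitemap format, here is a minimal sketch that builds a sitemap.xml document with the standard library. The URLs and lastmod dates are hypothetical examples, not anything from your actual site:

```python
# Minimal XML sitemap generator using only the standard library.
# The URLs and lastmod dates below are made-up examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string for a list of (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", "2019-01-01"),
    ("https://example.com/about", "2019-01-15"),
])
print(sitemap)
```

Remember that listing a page here does not stop Google crawling pages you left out; to exclude a page you would need robots.txt or a noindex directive instead.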

QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo, 2018
QUOTE: “7.4.3 Automatically Generated Main Content. Entire websites may be created by designing a basic template from which hundreds or thousands of pages are created, sometimes using content from freely available sources (such as an RSS feed or API). These pages are created with no or very little time, effort, or expertise, and also have no editing or manual curation. Pages and websites made up of auto-generated content with no editing or manual curation, and no original content or value added for users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
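One practical way to keep those four variants from competing with each other is to redirect them all to a single canonical form. The sketch below normalizes URLs to one preferred variant; the choice of https plus non-www, and the host "example.com", are assumptions for illustration, so pick whatever your site actually serves:

```python
# Sketch: collapse the four common variants (http/https, www/non-www)
# into one canonical form. Preferring https + non-www is an assumption;
# the important thing is picking ONE variant and sticking to it.
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_https=True, strip_www=True):
    scheme, netloc, path, query, fragment = urlsplit(url)
    if prefer_https and scheme == "http":
        scheme = "https"
    if strip_www and netloc.startswith("www."):
        netloc = netloc[len("www."):]
    return urlunsplit((scheme, netloc, path, query, fragment))

for variant in ("http://example.com/page",
                "http://www.example.com/page",
                "https://www.example.com/page"):
    print(canonicalize(variant))
```

In production this logic would live in your server or CDN configuration as a 301 redirect rather than in application code, but the mapping is the same.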
The goal of successful SEO is to obtain a high-ranking placement in the search results page of a search engine (e.g. Google, Bing or Yahoo). Internet users rarely click through pages and pages of search results, so where a site ranks in a search results page is essential for directing more traffic toward the site. The higher a website naturally ranks in the organic results of a search, the greater the chance that the site will be visited by a user.

Searching Google.com in an incognito window will bring up that familiar list of autofill options, many of which can help guide your keyword research. Incognito mode ensures that any customized search data Google stores while you’re signed in gets left out. Incognito may also be helpful for seeing where you truly rank on a results page for a certain term.
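Those autofill options can also be gathered programmatically. The sketch below only constructs a request URL for Google's autocomplete endpoint; note that "suggestqueries.google.com" is an unofficial, undocumented endpoint that may change or be blocked at any time, so treat this as an exploratory assumption rather than a supported API (no request is actually sent here):

```python
# Sketch: build a query URL for Google's (unofficial, undocumented)
# autocomplete endpoint. This only constructs the URL; fetching it and
# parsing the response is left out, since the endpoint is unsupported.
from urllib.parse import urlencode

def suggest_url(query, client="firefox"):
    params = urlencode({"client": client, "q": query})
    return "https://suggestqueries.google.com/complete/search?" + params

print(suggest_url("football playoffs"))
```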
The basics of GOOD SEO haven’t changed for years – though the effectiveness of particular elements has certainly narrowed or changed in usefulness – you should still focus on building a simple site using VERY simple SEO best practices. Don’t sweat the small stuff, but pay attention all the while to the important stuff: add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT. Understand how Google SEES your website. CRAWL it, like Google does, with (for example) the Screaming Frog SEO Spider, and fix malformed links or anything that results in server errors (5xx), broken links (4xx) and unnecessary redirects (3xx). Each page you want in Google should serve a 200 OK header message.
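The status-code triage above can be sketched in a few lines: bucket each crawled URL by its HTTP status family, the way you would when working through a crawl export. The sample URLs and codes below are made up for illustration:

```python
# Sketch: group crawl results by HTTP status family, as a crawl-audit
# tool would. The (url, status) pairs below are hypothetical examples.
def triage(pages):
    """pages: list of (url, status_code) pairs -> dict of buckets."""
    buckets = {"ok": [], "redirect": [], "broken": [], "server_error": []}
    for url, status in pages:
        if 200 <= status < 300:
            buckets["ok"].append(url)          # fine: keep serving 200
        elif 300 <= status < 400:
            buckets["redirect"].append(url)    # check for redirect chains
        elif 400 <= status < 500:
            buckets["broken"].append(url)      # fix or remove these links
        else:
            buckets["server_error"].append(url)  # server-side problems
    return buckets

report = triage([("/", 200), ("/old", 301), ("/gone", 404), ("/api", 500)])
print(report)
```

Anything outside the "ok" bucket is a candidate for fixing before you worry about finer-grained optimization.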
At Yoast, we practice what we call ‘holistic SEO‘. This means that your primary goal should be to build and maintain the best possible website. Don’t try to fool Google, but use a sustainable long-term strategy. Ranking will come automatically if your website is of extremely high quality. Google wants to get its users to the right place, as its mission is to index all the world’s online information and make it universally accessible and useful.
When it comes down to it, you want to choose a platform or invest in complementary tools that provide a single unified SEO workflow. It begins with keyword research to target optimal keywords and SERP positions for your business, along with SEO recommendations to help you rank. Those recommendations feed naturally into crawling tools, which should give you insight into your website and competitors' sites to then optimize for those targeted opportunities. Once you're ranking on those keywords, vigilant monitoring and rank tracking should help maintain your positions and grow your lead on competitors when it comes to the search positions that matter to your organization's bottom line. Finally, the best tools also tie those key search positions directly to ROI with easy-to-understand metrics, and feed your SEO deliverables and goals right back into your digital marketing strategy.
Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google in 2019.

QUOTE: “Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive “referrer” data that reveals what those people searched for, except in the case of ads.”

Mike Levin is the Senior SEO Director at Ziff Davis, PCMag's parent company. His career goes back 25 years to the halls of Commodore Computers, as an original Amiga fanboy to be beamed up by the mothership just as it imploded. Over his past 10 years in NYC, Mike's highlights have included leading the Apple Store, Kraft and JCPenney SEO accounts.
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
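To make the internal/external distinction concrete, here is a small sketch that extracts anchor text from a page and classifies each link, using only the standard library. The host "example.com" and the sample HTML are stand-ins for your own site:

```python
# Sketch: collect (href, anchor text) pairs from HTML and tell internal
# links from external ones. "example.com" and the sample markup are
# hypothetical stand-ins.
from html.parser import HTMLParser
from urllib.parse import urlparse

class AnchorCollector(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.links = []      # collected (href, anchor_text) pairs
        self._href = None    # href of the <a> currently open, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

    def is_internal(self, href):
        host = urlparse(href).netloc
        return host in ("", self.own_host)   # relative links are internal

parser = AnchorCollector("example.com")
parser.feed('<a href="/pricing">Pricing</a> <a href="https://other.site/">a partner</a>')
print(parser.links)
```

An audit like this quickly surfaces vague anchor text ("click here") that tells neither users nor Google what the target page is about.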
“I think it makes sense to have unique content as much as possible on these pages, but it’s not going to sink the whole website if you don’t do that. We don’t penalize a website for having this kind of deep duplicate content. Going back to the first thing, though, with regards to doorway pages: that is something I’d definitely look into, to make sure that you’re not running into that. In particular, if this is all going to the same clinic and you’re creating all of these different landing pages that are essentially just funneling everyone to the same clinic, then that could be seen as a doorway page, or a set of doorway pages, on our side. It could happen that the web spam team looks at that and says: this is not okay, you’re just trying to rank for all of these different variations of the keywords, and the pages themselves are essentially all the same. They might go there and say we need to take a manual action and remove all these pages from search. So that’s one thing to watch out for, in the sense that if they are all going to the same clinic, then it probably makes sense to create some kind of a summary page instead; whereas if these are going to two different businesses, then of course that’s a different situation and it’s not a doorway page situation.”
However, if possible, I would like you to expand a bit on your “zombie pages” tip. We run a site where there are definitely enough pages to delete (no sessions, no links, probably not even relevant to the main theme of the site, not even important for the site's architecture). Nonetheless, I am not very sure what the best technical decision for these pages is: just deleting them from my CMS, redirecting (if there is a relevant alternative), or something else? De-indexing them in Search Console? What response code should they have?