QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations

A website or URL’s ranking for keywords or keyword combinations varies from search engine to search engine. A domain may rank in the top 3 on Bing for a certain keyword, yet not even appear on the first page of Google’s results for the same term. The same is true across the board: Bing, Google, Yahoo and every other search engine each use their own method for calculating rankings, and therefore rank websites differently.

A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page and gain good crawl coverage of the pages on your site, it is mainly aimed at human visitors.


Searching Google.com in an incognito window will bring up that all-familiar list of autofill options, many of which can help guide your keyword research. Incognito mode ensures that any customized search data Google stores while you’re signed in is left out. It can also help you see where you truly rank on a results page for a certain term.
QUOTE: “The score is determined from quantities indicating user actions of seeking out and preferring particular sites and the resources found in particular sites. A site quality score for a particular site can be determined by computing a ratio of a numerator that represents user interest in the site as reflected in user queries directed to the site and a denominator that represents user interest in the resources found in the site as responses to queries of all kinds. The site quality score for a site can be used as a signal to rank resources, or to rank search results that identify resources, that are found in one site relative to resources found in another site.” Navneet Panda, Google Patent
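The ratio the patent describes can be sketched in a few lines. Everything below is illustrative: the function name, the example counts and the zero-denominator handling are assumptions, since the patent does not disclose the actual quantities or smoothing Google uses.

```python
# Toy illustration of the site quality ratio described in the Panda patent.
# The numbers and the guard for an empty denominator are assumptions for
# illustration only; the patent does not specify exact formulas.

def site_quality_score(queries_for_site: float, interest_in_resources: float) -> float:
    """Ratio of direct user interest in a site (numerator) to user interest
    in its resources as responses to queries of all kinds (denominator)."""
    if interest_in_resources <= 0:
        return 0.0
    return queries_for_site / interest_in_resources

# A site that many users seek out by name scores higher than one whose
# pages only surface incidentally in generic result sets.
print(site_quality_score(1200, 4000))  # 0.3
print(site_quality_score(150, 4000))   # 0.0375
```

The intuition matches the surrounding text: sites users actively seek out earn a higher quality signal than sites users only stumble into.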
All of this plays into a new way businesses and SEO professionals need to think when approaching what keywords to target and what SERP positions to chase. The enterprise SEO platforms are beginning to do this, but the next step in SEO is full-blown content recommendation engines and predictive analytics. By using all of the data you pull from your various SEO tools, Google Search Console, and keyword and trend data from social listening platforms, you can optimize for a given keyword or query before Google monetizes the corresponding SERP. If your keyword research uncovers a high-value keyword or SERP for which Google has not yet monetized the page with a Quick Answer or a Featured Snippet, pounce on that opportunity.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
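The random-surfer idea above can be demonstrated with a minimal power-iteration sketch. The damping factor of 0.85 is the value from the original PageRank paper; the three-page link graph is made up for illustration.

```python
# Minimal PageRank power iteration on a tiny, made-up link graph,
# illustrating the "random surfer" model described above.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # random-jump share
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share        # each outlink passes equal rank
            else:
                # Dangling page: redistribute its rank evenly.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
# Page "c", with two inbound links, ends up with the highest rank,
# so a link *from* "c" is worth more than a link from "b".
```

This makes the closing claim of the paragraph concrete: links from high-PageRank pages pass more weight, because the random surfer is more likely to be on those pages in the first place.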
Brian, I have a burning question regarding keyword placement and frequency. You wrote: “Use the key in the first 100 words … “. What else? I use Yoast and a WDF*IDF semantic analysis tool to check the content of the top 10 rankings. Pretty often I have the feeling I overdo it, although Yoast and WDF*IDF tell me I don’t use the focus keyword often enough.
When using the Keyword Explorer, Ahrefs will also produce the "parent topic" of the keyword you looked up, as you can see in the screenshot above, underneath the Keyword Difficulty meter. A keyword's parent topic is a broader keyword with higher search volume than your intended keyword, but it likely has the same audience and ranking potential -- giving you a more valuable SEO opportunity when optimizing a particular blog post or webpage.

What does it mean for a site to be SEO friendly? It goes beyond just posting quality content (though that’s a very important part!). There are all kinds of ways big and small that can prevent your site from being seen by search engines and thus by users. Our free audit tool begins by looking at some of the most important facets of your website you might not even be aware of.
Ask for a technical and search audit for your site to learn what they think needs to be done, why, and what the expected outcome should be. You'll probably have to pay for this. You will probably have to give them read-only access to your site on Search Console. (At this stage, don't grant them write access.) Your prospective SEO should be able to give you realistic estimates of improvement, and an estimate of the work involved. If they guarantee you that their changes will give you first place in search results, find someone else.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
What about other search engines that use them? Hang on while I submit my site to those 75,000 engines first [sarcasm!]. Yes, ten years ago early search engines liked looking at your meta-keywords. I’ve seen OPs in forums ponder which is the best way to write these tags – with commas, with spaces, limiting to how many characters. Forget about meta-keyword tags – they are a pointless waste of time and bandwidth.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.

I do not obsess about site architecture as much as I used to, but I always ensure the pages I want indexed are all reachable from a crawl from the home page – and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact match anchor text pointing to the page from internal links – but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.
Google’s bots crawl your site to determine its quality, and correct technical on-page optimization is one of the main signals used. When you optimize your pages based on the recommendations of the website analyzer, you can increase your organic traffic, improve your ranking positions, and stay competitive against other sites ranking for your target keywords.
QUOTE: “I’ve got a slide here where I show I think 8 different URLs you know every single one of these URLs could return completely different content in practice we as humans whenever we look at ‘www.example.com’ or just regular ‘example.com’ or example.com/index or example.com/home.asp we think of it as the same page and in practice it usually is the same page so technically it doesn’t have to be but almost always web servers will return the same content for like these 8 different versions of the URL so that can cause a lot of problems in search engines if rather than having your backlinks all go to one page instead it’s split between (the versions) and it’s a really big headache….how do people fix this well …. the canonical link element” Matt Cutts, Google
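The problem Matt Cutts describes can be made concrete with a rough URL-normalization sketch. The rules below (lowercasing the host, dropping "www.", collapsing default document names) are illustrative assumptions, not what Google actually does internally; the canonical link element remains the proper fix for splitting backlinks across URL variants.

```python
# Illustrative sketch: several URL variants that humans treat as "the same
# page" collapse to one key. The normalization rules and the DEFAULT_DOCS
# list are assumptions for demonstration, not Google's actual logic.
from urllib.parse import urlsplit

DEFAULT_DOCS = {"index", "index.html", "index.php", "home.asp", "default.asp"}

def normalize(url: str) -> str:
    parts = urlsplit(url if "://" in url else "http://" + url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[len("www."):]          # treat www and non-www alike
    path = parts.path.rstrip("/")
    if path.split("/")[-1].lower() in DEFAULT_DOCS:
        path = path.rsplit("/", 1)[0]      # /index, /home.asp -> homepage
    return host + (path or "/")

variants = [
    "http://www.example.com",
    "http://example.com/",
    "http://example.com/index",
    "http://www.example.com/home.asp",
]
assert len({normalize(u) for u in variants}) == 1  # all one logical page
```

If backlinks are spread across these variants, link equity is split between them, which is exactly the "really big headache" the quote refers to and why rel="canonical" exists.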
QUOTE: “Shopping or financial transaction pages: webpages which allow users to make purchases, transfer money, pay bills, etc. online (such as online stores and online banking pages)…..We have very high Page Quality rating standards for YMYL pages because low-quality YMYL pages could potentially negatively impact users’ happiness, health, or wealth.”
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
One common scam is the creation of "shadow" domains that funnel users to a site by using deceptive redirects. These shadow domains often will be owned by the SEO who claims to be working on a client's behalf. However, if the relationship sours, the SEO may point the domain to a different site, or even to a competitor's domain. If that happens, the client has paid to develop a competing site owned entirely by the SEO.

Expertise and authoritativeness of a site increase its quality. Make sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users gauge an article’s expertise. Representing well-established consensus on scientific topics is good practice where such consensus exists.


QUOTE: “Content which is copied, but changed slightly from the original. This type of copying makes it difficult to find the exact matching original source. Sometimes just a few words are changed, or whole sentences are changed, or a “find and replace” modification is made, where one word is replaced with another throughout the text. These types of changes are deliberately done to make it difficult to find the original source of the content. We call this kind of content “copied with minimal alteration.” Google Search Quality Evaluator Guidelines March 2017



Mike Levin is the Senior SEO Director at Ziff Davis, PCMag's parent company. His career goes back 25 years to the halls of Commodore Computers, as an original Amiga fanboy beamed up by the mothership just as it imploded. Over his past 10 years in NYC, Mike's highlights have included leading the Apple Store, Kraft and JCPenney SEO accounts.

While Google is on record as stating these quality raters do not directly influence where you rank (without more senior analysts making a call on the quality of your website, I presume?), there are some things in this document, mostly of a user experience (UX) nature, that all search engine optimisers and Webmasters of any kind should note going forward.

QUOTE: “I think that’s always an option. Yeah. That’s something that–I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.” John Mueller, Google
This relationship between rankings and clicks (and traffic) is strongest amongst the top 3 search results. However, the layout of the search results pages is constantly changing, with the inclusion of Google’s Knowledge Graph data and the integration of Universal Search elements (SERP features) like videos, maps and Google Shopping ads. These developments can mean that the top 3 organic rankings are no longer the 3 best positions on the SERP, as heatmap and eye-tracking tests have demonstrated.