QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
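That last point – serving a real 404 status rather than a "soft 404" error page that returns 200 – can be sanity-checked in a few lines. Below is a minimal sketch using only Python's standard library; the tiny in-process server is a stand-in for your own site, and the handler, routes, and page copy are invented for illustration.

```python
# Sketch: verify a "not found" URL returns a real 404 status code,
# not a friendly error page served with a 200 ("soft 404").
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen
from urllib.error import HTTPError

class SiteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            body, status = b"<h1>Home</h1>", 200
        else:
            # Friendly custom 404 page, but with the correct status code.
            body, status = b"<h1>Sorry, we can't find that page.</h1>", 404
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), SiteHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

def status_of(url):
    try:
        return urlopen(url).status
    except HTTPError as e:
        return e.code

home_status = status_of(base + "/")         # expect 200
missing_status = status_of(base + "/missing")  # expect 404
server.shutdown()
```

The same check, pointed at your live site's deliberately-broken URLs, quickly reveals soft-404 misconfigurations.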

So you have a new site. You fill your home page meta tags with the 20 keywords you want to rank for – hey, that’s what optimisation is all about, isn’t it? By the third line of text you’ve told Google exactly what to filter you for. The meta name="keywords" tag was originally intended for words that weren’t actually on the page but would help classify the document.
Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – ultimately help determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority, it sometimes seems that the most relevant page on your site that Google HAS NO ISSUE with will rank.

SE Ranking is the best SEO platform our company has used so far. The interface of the platform is great and user-friendly. The available options are many. From tracking rankings, monitoring backlinks and keyword research to competitor analysis and website audits, everything we need to optimize our sites is just one click away. Also, for any questions or anything else we needed, the live support team replied and helped us straight away.

The self-service keyword research tools we tested all handle pricing similarly: by month, with discounts for annual billing, and with most SMB-focused plans ranging from $50 to $200 per month. Depending on how your business plans to use the tools, the way particular products delineate pricing might make more sense. KWFinder.com is the cheapest of the bunch, but it's focused squarely on ad hoc keyword and Google SERP queries, which is why the product sets quotas for keyword lookups per 24 hours at different tiers. Moz and Ahrefs price by campaigns or projects, meaning the number of websites you're tracking in the dashboard. Most of the tools also cap the number of keyword reports you can run per day. SpyFu prices a bit differently, providing unlimited data access and results but capping the number of sales leads and domain contacts.
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs with rel="canonical" and rel="alternate" link elements.
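As a rough illustration of how you might audit a page for the signals above, here is a sketch using Python's standard-library HTML parser; the sample markup and URL are made-up examples, not taken from any real site.

```python
# Sketch: detect a viewport meta tag and a rel="canonical" link in page HTML.
from html.parser import HTMLParser

class SignalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "viewport":
            self.has_viewport = True
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

sample = """
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="canonical" href="https://example.com/page">
</head>
"""
p = SignalParser()
p.feed(sample)
```

Run the same parser over each fetched template of your site to confirm the tags are actually present where you think they are.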
QUOTE: “So if you have different parts of your website and they’re on different subdomains, that’s perfectly fine – that’s totally up to you, and the way people link across these different subdomains is really up to you. I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website. Sometimes things on separate subdomains are like a single website, and sometimes they’re more like separate websites. For example, on Blogger all of the subdomains are essentially completely separate websites – they’re not related to each other. On the other hand, other websites might have different subdomains and they just use them for different parts of the same thing – maybe for different country versions, maybe for different language versions. All of that is completely normal.” John Mueller, 2017

The Java program is fairly intuitive, with easy-to-navigate tabs. Additionally, you can export any or all of the data into Excel for further analysis. So say you're using Optify, Moz, or RavenSEO to monitor your links or rankings for specific keywords -- you could simply create a .csv file from your spreadsheet, make a few adjustments for the proper formatting, and upload it to those tools.

Mike Murray has shaped online marketing strategies for hundreds of businesses since 1997, including Fortune 500 companies. A former journalist, he has led SEO studies and spoken at regional and national Internet conferences. Founder of Online Marketing Coach, Mike is passionate about helping clients identify their best opportunities for online marketing success based on their strengths, his advice and industry trends. You can find him at his blog, Online Marketing Matters or on Twitter @mikeonlinecoach.


In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.
Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
If you are interested in having the SEO Audit Tool on your web platform, you can have a free seven-day trial of it. By embedding this tool directly on your page, you can generate great leads from your users by seeing their websites or the websites they are interested in. From here, you can target a more specific audience and see great improvements in your conversion rates!

The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one way that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.
Critics will point out that the higher the cost of expert SEO, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering, to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.
“Sharability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful of not rolling out large quantities of pages that have thin content, you want to consider who would be likely to share and link to new pages you’re creating on your site before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.
The last piece of the complicated SEO tool ecosystem is the enterprise tier. This roundup is geared toward SEO for small to midsize businesses (SMBs), for which these platforms are likely priced out of reach. But there's a handful of enterprise SEO software providers out there that essentially roll all of the self-service tools into one comprehensive platform. These platforms combine ongoing position monitoring, deep keyword research, and crawling with customizable reports and analytics.
To avoid throwing link equity away, you might create HIGH-LEVEL, IN-DEPTH TOPIC PAGES on your site and redirect (or use canonical redirects for) any related expired content that HAS INCOMING BACKLINKS to this topic page. Keep it updated, folding in content from old pages where relevant and where there is a traffic opportunity, to create TOPIC pages that are focused on the customer (e.g. information pages).
Don’t be a website Google won’t rank – what Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about, whether Google determines this algorithmically or, eventually, manually. That is – whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same… How can you make yours different? Better.

I added one keyword to the page in plain text because adding the actual ‘keyword phrase’ itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add.


I think the anchor text links in internal navigation is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever internal link keyword-rich architecture and be sure to understand for instance how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.

TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together into a single topic-centred page that helps a user understand something related to what you sell.
The actual content of your page itself is, of course, very important. Different types of pages will have different “jobs” – your cornerstone content asset that you want lots of folks to link to needs to be very different than your support content that you want to make sure your users find and get an answer from quickly. That said, Google has been increasingly favoring certain types of content, and as you build out any of the pages on your site, there are a few things to keep in mind:
Some page titles do better with a call to action – a call to action which reflects exactly a searcher’s intent (e.g. to learn something, or buy something, or hire something). THINK CAREFULLY before auto-generating keyword phrase footprints across a site using boiler-plating and article spinning techniques. Remember this is your hook in search engines, if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2019.

When it comes down to it, you want to choose a platform or invest in complementary tools that provide a single unified SEO workflow. It begins with keyword research to target optimal keywords and SERP positions for your business, along with SEO recommendations to help your rank. Those recommendations feed naturally into crawling tools, which should give you insight into your website and competitors' sites to then optimize for those targeted opportunities. Once you're ranking on those keywords, vigilant monitoring and rank tracking should help maintain your positions and grow your lead on competitors when it comes to the search positions that matter to your organization's bottom line. Finally, the best tools also tie those key search positions directly to ROI with easy-to-understand metrics, and feed your SEO deliverables and goals right back into your digital marketing strategy.
The caveat in all of this is that, in one way or another, most of the data and the rules governing what ranks and what doesn't (often on a week-to-week basis) comes from Google. If you know where to find and how to use the free and freemium tools Google provides under the surface—AdWords, Google Analytics, and Google Search Console being the big three—you can do all of this manually. Much of the data that the ongoing position monitoring, keyword research, and crawler tools provide is extracted in one form or another from Google itself. Doing it yourself is a disjointed, meticulous process, but you can piece together all the SEO data you need to come up with an optimization strategy should you be so inclined.

The SEO tools in this roundup provide tremendous digital marketing value for businesses, but it's important not to forget that we're living in Google's world under Google's constantly evolving rules. Oh and don't forget to check the tracking data on Bing now and again, either. Google's the king with over 90 percent of worldwide internet search, according to StatCounter, but the latest ComScore numbers have Bing market share sitting at 23 percent. Navigable news and more useful results pages make Bing a viable choice in the search space as well.
QUOTE: “I think that’s always an option. Yeah. That’s something that – I’ve seen sites do that across the board, not specifically for blogs, but for content in general, where they would regularly go through all of their content and see, well, this content doesn’t get any clicks, or everyone who goes there kind of runs off screaming.” John Mueller, Google
If Google finds two identical pieces of content, whether on your own site, or on another you’re not even aware of, it will only index one of those pages. You should be aware of scraper sites, stealing your content automatically and republishing it as your own. Here’s Graham Charlton’s thorough investigation on what to do if your content ends up working better for somebody else.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
However, if possible, I would like you to expand a bit on your “zombie pages” tip. We run a site where there are definitely enough pages to delete (no sessions, no links, probably not even relevant to the main theme of the site, not even important for the architecture of the site). Nonetheless, I am not very sure what is the best technical decision for these pages… just deleting them from my CMS, redirecting (if there is a relevant alternative), or something else? De-index them in Search Console? What response code should they have?

I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank ad nauseam for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can. Some cannot. Some links are trusted enough to pass ranking signals to another page. Some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.

Disclaimer: “Whilst I have made every effort to ensure that the information I have provided is correct, it is not advice; I cannot accept any responsibility or liability for any errors or omissions. The author does not vouch for third party sites or any third party service. Visit third party sites at your own risk. I am not directly partnered with Google or any other third party. This website uses cookies only for analytics and basic website functions. This article does not constitute legal advice. The author does not accept any liability that might arise from accessing the data presented on this site. Links to internal pages promote my own content and services.” Shaun Anderson, Hobo
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
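Since alt text on a linked image effectively stands in for anchor text, it is worth auditing templates for images that lack it. The sketch below flags `<img>` tags with missing or empty alt attributes using the standard-library parser; the sample markup and file names are invented.

```python
# Sketch: list image sources that have no usable alt text.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):  # absent or empty alt attribute
                self.missing.append(a.get("src", "(no src)"))

audit = AltAudit()
audit.feed('<a href="/sale"><img src="banner.png"></a>'
           '<img src="logo.png" alt="Acme logo">')
# audit.missing now holds the images that need alt text
```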

After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
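As a small sketch of the idea above, the hypothetical helper below builds an anchor tag and adds rel="nofollow" only when you don't want to endorse the destination; the function name and URLs are made up for illustration.

```python
# Sketch: build links that either pass or withhold endorsement via nofollow.
from html import escape

def link(href, text, endorse=True):
    rel = "" if endorse else ' rel="nofollow"'
    return f'<a href="{escape(href, quote=True)}"{rel}>{escape(text)}</a>'

trusted = link("https://example.com/guide", "a useful guide")
spammy = link("https://spam.example/offer", "the site I'm calling out",
              endorse=False)
```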

Ensure redirected domains redirect through a canonical redirect, and that any redirect chains are minimised. BE SURE to audit the backlink profile of any redirects you point at a page, as with reward comes punishment if those backlinks are toxic (another example of Google opening up the war that is technical SEO on a front that isn’t about, and is in fact converse to, building backlinks to your site).
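The "minimise chains" advice can be sketched as a small resolver: given a map of old URLs to new ones, collapse every chain so each retired URL redirects straight to its final destination in a single hop. The mapping below is invented for illustration.

```python
# Sketch: flatten redirect chains so each source points at its final target.
def flatten(redirects):
    def final(url, seen=()):
        if url in seen:
            raise ValueError(f"redirect loop at {url}")
        return final(redirects[url], seen + (url,)) if url in redirects else url
    return {src: final(dst) for src, dst in redirects.items()}

chained = {
    "/old-page": "/interim-page",
    "/interim-page": "/topic-page",
    "/retired": "/topic-page",
}
flat = flatten(chained)  # every source now points straight at /topic-page
```

The flattened map can then be emitted as one-hop 301 rules in whatever format your server uses.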
Your article reaches me at just the perfect time. I’ve been working on getting back to blogging and have been at it for almost a month now. I’ve been fixing SEO-related stuff on my blog, and after reading this article (which, by the way, is way too long for one sitting) I’m kind of confused. I’m looking at bloggers like Darren Rowse, Brian Clark, and so many other bloggers who use blogging or their blogs as a platform to educate their readers more than thinking about search rankings (but I’m sure they do).
QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations
Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines like Google and Yahoo.[citation needed] Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3]
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'Conversational Search', where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query rather than a few words.[39] With regards to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
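A quick way to act on this is to audit descriptions across pages for the common failure modes: missing, duplicated, or far outside a sensible length band. The 50–160 character range below is a common rule of thumb, not a Google-published limit, and the page data is made up for illustration.

```python
# Sketch: flag missing, duplicate, or oddly-sized meta descriptions.
from collections import Counter

def audit_descriptions(pages, lo=50, hi=160):
    counts = Counter(d for d in pages.values() if d)
    problems = {}
    for url, desc in pages.items():
        if not desc:
            problems[url] = "missing"
        elif counts[desc] > 1:
            problems[url] = "duplicate"
        elif not (lo <= len(desc) <= hi):
            problems[url] = "length"
    return problems

pages = {
    "/a": "A detailed, unique summary of what this page covers, written for searchers.",
    "/b": "Too short.",
    "/c": "",
    "/d": "Same description reused on two pages, which defeats the point of the tag.",
    "/e": "Same description reused on two pages, which defeats the point of the tag.",
}
issues = audit_descriptions(pages)
```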
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
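The "well-behaved crawlers only" point is easy to see with the standard library's robots.txt parser: a polite bot consults the rules before fetching, but nothing stops a rogue one from ignoring them. The rules and URLs below are an invented example.

```python
# Sketch: what a compliant crawler does with robots.txt — advisory, not access control.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

allowed = rp.can_fetch("*", "https://example.com/public/page")    # polite bot proceeds
blocked = rp.can_fetch("*", "https://example.com/private/report")  # polite bot skips
```

A non-compliant crawler simply never calls anything like `can_fetch`, which is why sensitive content needs real authentication, not robots.txt.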
The tools we tested in this round of reviews were judged on which do the best job of giving you the research-driven investigation tools to identify SEO opportunities ripe for growth, along with offering enterprise-grade functionality at a reasonable price. Whether one of these optimization tools is an ideal fit for your business, or you end up combining more than one for a potent SEO tool suite, this roundup will help you decide what makes the most sense for you. There's a wealth of data out there to give your business an edge and boost pages higher and higher in key search results. Make sure you've got the right SEO tools in place to seize the opportunities.

QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” Google

The audit is for all pages, not only one. What happens in the majority of cases is that pages/posts have similarities, so you can group them together. For example, the pages of a website may be OK, but the blog post pages may be missing titles. It’s a lot of work, especially for a 500-page website, but you can start from the most important pages first and work your way to the rest.