Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
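To make the point concrete, here is a minimal Python sketch (the URLs are placeholders) using the standard library's robotparser: it shows that a robots.txt rule is only a question a polite crawler asks itself, not something the server enforces.

    # robots.txt is advisory: a well-behaved crawler asks before fetching,
    # but nothing here stops a client from requesting the URL anyway.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")   # placeholder site
    rp.read()

    url = "https://www.example.com/private/report.html"
    print("A compliant crawler may fetch this:", rp.can_fetch("Googlebot", url))

    # A rogue client can simply ignore the answer above and request the page.
    # Only server-side protection (authentication, a 401/403 response, or a
    # noindex directive for pages that must stay public but unindexed)
    # actually restricts access or indexing.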
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
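If you want to verify the behaviour yourself, a short sketch like the following (it assumes the third-party requests package and uses a placeholder domain) confirms that a made-up path really returns an HTTP 404 status rather than a "soft 404" that answers 200 with an error-looking page:

    # Check that a non-existent path returns a real 404 status code.
    import requests

    resp = requests.get("https://www.example.com/this-page-should-not-exist",
                        allow_redirects=False, timeout=10)
    if resp.status_code == 404:
        print("Good: missing pages return a real 404.")
    else:
        print(f"Check this: the missing page returned {resp.status_code} instead of 404.")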

QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo, 2018
SEM is a broader term than SEO, and is used to encompass different options available to use a search engine’s technology, including paid ads. SEM is often used to describe acts associated with researching, submitting and positioning a website within search engines.  It includes things such as search engine optimization, paid listings and other search-engine related services and functions that will increase exposure and traffic to your Web site. 
Consider your competition. Look at what your competitors are doing and how they are performing in their search marketing before you decide how you can best compete with them. Research what search terms they rank organically for. Consider if you can execute a plan to top their SERP placements. Also, look at what paid terms they are using to drive traffic to their own sites. As you perform this research, look for gaps that you can fill and areas where you will be unable to compete in both paid and organic search.
In the enterprise space, one major trend we're seeing lately is data import across the big players. Much of SEO involves working with the data Google gives you and then filling in all of the gaps. Google Search Console (formerly, Webmaster Tools) only gives you a 90-day window of data, so enterprise vendors, such as Conductor and Screaming Frog, are continually adding and importing data sources from other crawling databases (like DeepCrawl's). They're combining that with Google Search Console data for more accurate, ongoing Search Engine Results Page (SERP) monitoring and position tracking on specific keywords. SEMrush and Searchmetrics (in its enterprise Suite packages) offer this level of enterprise SERP monitoring as well, which can give your business a higher-level view of how you're doing against competitors.

The existing content may speak to core audiences, but it isn’t producing many strong organic results. For example, the content header Capitalizing on the Right Skills at the Right Time With Business Agility may seem OK, but it doesn’t include a keyword phrase within striking distance. The lengthy URL doesn’t help matters. Extraneous words prevent any focus and the URL is bogged down by “business” and “agility” duplication:
Today, however, SEM is used to refer exclusively to paid search. According to Search Engine Land, Search Engine Marketing is “the process of gaining website traffic by purchasing ads on search engines.” Search Engine Optimization, on the other hand, is defined as “the process of getting traffic from free, organic, editorial or natural search results.”
Google is a link-based search engine. Google doesn’t need content to rank pages but it needs content to give to users. Google needs to find content and it finds content by following links just like you do when clicking on a link. So you need first to make sure you tell the world about your site so other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority – which is better to have than ranking for just a few narrow key terms.
Website-specific crawlers, or software that crawls one particular website at a time, are great for analyzing your own website's SEO strengths and weaknesses; they're arguably even more useful for scoping out the competition's. Website crawlers analyze a website's URL, link structure, images, CSS scripting, associated apps, and third-party services to evaluate SEO. Not unlike how a website monitoring tool scans for a webpage's overall "health," website crawlers can identify factors such as broken links and errors, website lag, and content or metadata with low keyword density and SEO value, while mapping a website's architecture. Website crawlers can help your business improve website user experience (UX) while identifying key areas of improvement to help pages rank better. DeepCrawl is, by far, the most granular and detailed website crawler in this roundup, although Ahrefs and Majestic also provide comprehensive domain crawling and website optimization recommendations. Another major crawler we didn't test is Screaming Frog, which we'll soon discuss in the section called "The Enterprise Tier."
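As a rough illustration of what these crawlers do under the hood, the sketch below (requests assumed installed, URL a placeholder) collects the links on a single page and flags any that respond with an error status; commercial crawlers do the same thing site-wide, with politeness rules, robots.txt handling and far richer reporting:

    # Minimal single-page link check: gather <a href> targets and flag errors.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    import requests

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    start_url = "https://www.example.com/"   # placeholder starting page
    collector = LinkCollector()
    collector.feed(requests.get(start_url, timeout=10).text)

    for href in collector.links:
        url = urljoin(start_url, href)
        if not url.startswith("http"):
            continue                          # skip mailto:, tel:, fragments, etc.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status >= 400:
            print(f"Broken link: {url} -> {status}")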
Google asks quality raters to investigate your reputation by searching for it, giving the example [“ibm.com” reviews –site:ibm.com]: “A search on Google for reviews of ‘ibm.com’ which excludes pages on ibm.com.” So do that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly that it considers “very low ratings on the BBB site to be evidence for a negative reputation”. Other sites mentioned to review your business include Yelp and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs. I noted you can get ‘stars in SERPs’ within two days after I added the code (March 2014).
Most small businesses owners and marketers know a little something about SEO (search engine optimization) and the different tactics to help your website rank well in organic search engine results. Another important tactic for any Internet business to know about is SEM (search engine marketing), which includes things such as search engine optimization, paid listings and other search engine related services.
QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations
But essentially the idea there is that this is a good representative of the content from your website, and that’s all that we would show to users. On the other hand, if someone is specifically looking for, let’s say, dental bridges in Dublin, then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better. So we’d know dental bridges is something that you have a lot of on your website, and Dublin is something that’s unique to this specific page, so we’d be able to pull that out and show that to the user. So from a pure content duplication point of view, that’s not really something I totally worry about.
Crawlers are largely a separate product category. There is some overlap with the self-service keyword tools (Ahrefs, for instance, does both), but crawling is another important piece of the puzzle. We tested several tools with these capabilities either as their express purpose or as features within a larger platform. Ahrefs, DeepCrawl, Majestic, and LinkResearchTools are all primarily focused on crawling and backlink tracking, the inbound links coming to your site from another website. Moz Pro, SpyFu, SEMrush, and AWR Cloud all include domain crawling or backlink tracking features as part of their SEO arsenals.
John Lincoln (MBA) is CEO of Ignite Visibility (a 2017, 2018 & 2019 Inc. 5000 company), a highly sought-after digital marketing strategist, industry speaker and winner of the coveted Search Engine Land "Search Marketer of the Year" award. With 16+ years of demanding experience, Lincoln has worked with over 1,000 online businesses including amazing clients such as Office Depot, Tony Robbins, Morgan Stanley, Fox, USA Today, COX and The Knot World Wide.
After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build these sites using simple techniques – HTML & CSS, for instance. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine for TV – but only cause problems for website visitors.
If you link out to irrelevant sites, Google may ignore the page, too – but again, it depends on the site in question. Who you link to, or HOW you link to, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don’t do well in Google these days without some good quality backlinks and higher quality pages.

Google used to make much of its ad hoc keyword search functionality available as well, but now the Keyword Planner is behind a paywall in AdWords as a premium feature. Difficulty scores are inspired by the way Google calculates its Competition Score metric in AdWords, though most vendors calculate difficulty using PA and DA numbers correlated with search engine positions, without AdWords data blended in at all. Search Volume is a different matter, and is almost always directly lifted from AdWords. Not to mention keyword suggestions and related keywords data, which in many tools come from Google's Suggest and Autocomplete application programming interfaces (APIs).
Where the free Google tools can provide complementary value is in fact-checking. If you're checking out more than one of these SEO tools, you'll quickly realize this isn't an exact science. If you were to look at the PA, DA, and keyword difficulty scores across KWFinder.com, Moz, SpyFu, SEMrush, Ahrefs, AWR Cloud, and Searchmetrics for the same set of keywords, you might get different numbers across each metric separated by anywhere from a few points to dozens. If your business is unsure about an optimization campaign on a particular keyword, you can cross-check with data straight from a free AdWords account and Search Console. Another trick: Enable Incognito mode in your browser along with an extension like the free Moz Toolbar and you can run case-by-case searches on specific keywords to get an organic look at your target search results page.
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Webmaster Tools if you sign up.
I think the anchor text links in internal navigation is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever internal link keyword-rich architecture and be sure to understand for instance how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
The above information does not need to feature on every page; it just needs to appear on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
When it comes down to it, you want to choose a platform or invest in complementary tools that provide a single unified SEO workflow. It begins with keyword research to target optimal keywords and SERP positions for your business, along with SEO recommendations to help you rank. Those recommendations feed naturally into crawling tools, which should give you insight into your website and competitors' sites to then optimize for those targeted opportunities. Once you're ranking on those keywords, vigilant monitoring and rank tracking should help maintain your positions and grow your lead on competitors when it comes to the search positions that matter to your organization's bottom line. Finally, the best tools also tie those key search positions directly to ROI with easy-to-understand metrics, and feed your SEO deliverables and goals right back into your digital marketing strategy.
A lot of optimisation techniques that are effective in the short term at boosting a site’s position in Google are against Google’s guidelines. For example, many links that may have once promoted you to the top of Google may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won’t have too much trouble with in the FUTURE, because they will punish you for it eventually.
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
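As an illustration only (the business details are invented), structured data is usually expressed as schema.org JSON-LD; a snippet like this builds the JSON that would then sit in a script tag of type application/ld+json in the page's HTML:

    # Build a schema.org LocalBusiness description as JSON-LD.
    import json

    structured_data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Dental Clinic",            # invented example values
        "url": "https://www.example.com/",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Dublin",
            "addressCountry": "IE",
        },
    }

    print(json.dumps(structured_data, indent=2))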
“Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.”
In a few cases, see what happens if you make more risky changes. I’m working with a website that wasn’t even in the top 100 positions for many of its 20 strategic keywords. Based on some data, it looked like the client’s sweet spot for keywords may be in the 10 to 30 range for average search value. We targeted one phrase with 700 searches a month. It’s now ranking No. 12 on Google after making two sets of SEO changes on one page. Ultimately, the client may need a new page to grab a spot among the top 10 positions.
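One way to hunt for this kind of opportunity is to pull your query data out of Google Search Console and look for terms already ranking just off the first page. The sketch below assumes you have exported a CSV with "Query" and "Position" columns (your export's column names may differ) and simply lists queries sitting between positions 10 and 30, often called striking-distance keywords:

    # List 'striking distance' queries from a Search Console CSV export.
    import csv

    with open("search_console_queries.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            position = float(row["Position"])
            if 10 <= position <= 30:
                print(f'{row["Query"]}: average position {position:.1f}')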

At first glance, the Ads or SC appear to be MC. Some users may interact with Ads or SC, believing that the Ads or SC is the MC. Ads appear to be SC (links) where the user would expect that clicking the link will take them to another page within the same website, but they actually take them to a different website. Some users may feel surprised or confused when clicking SC or links that go to a page on a completely different website.
Some page titles do better with a call to action – a call to action which reflects exactly a searcher’s intent (e.g. to learn something, or buy something, or hire something). THINK CAREFULLY before auto-generating keyword phrase footprints across a site using boiler-plating and article spinning techniques. Remember this is your hook in search engines, if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2019.
When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site – such as – have you repeated the keyword four times or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
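If you generate or review many titles, a tiny check like this one (the 65-character limit is only a rough rule of thumb, and the example title is invented) can flag the two problems discussed above: an over-long title and a keyword repeated more than once.

    # Rough audit of a proposed <title> string; thresholds are rules of thumb.
    def audit_title(title: str, keyword: str, max_length: int = 65) -> list[str]:
        warnings = []
        if len(title) > max_length:
            warnings.append(f"Title is {len(title)} characters; it may be truncated in SERPs.")
        occurrences = title.lower().count(keyword.lower())
        if occurrences == 0:
            warnings.append("Keyword does not appear in the title.")
        elif occurrences > 1:
            warnings.append(f"Keyword appears {occurrences} times; once is usually enough.")
        return warnings

    print(audit_title("Dental Bridges in Dublin | Example Clinic", "dental bridges"))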
SEO is an acronym for "search engine optimization" or "search engine optimizer." Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your site and reputation. Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site. Many SEOs and other agencies and consultants provide useful services for website owners, including:
All of this plays into a new way businesses and SEO professionals need to think when approaching what keywords to target and what SERP positions to chase. The enterprise SEO platforms are beginning to do this, but the next step in SEO is full-blown content recommendation engines and predictive analytics. By using all of the data you pull from your various SEO tools, Google Search Console, and keyword and trend data from social listening platforms, you can optimize for a given keyword or query before Google does it first. If your keyword research uncovers a high-value keyword or SERP for which Google has not yet monetized the page with a Quick Answer or a Featured Snippet, then pounce on that opportunity.

Catalant struggles with SEO – at least with its apparent target keyword phrases. For starters, it is centered on a narrow topic – focusing on agility to connect with the right talent. The company has accolades from Forbes, Entrepreneur, and Financial Times. However, Catalant’s site can’t seem to rank well for words it cares about. If you can’t rank in your niche, you have a problem.

Sometimes, Google turns up the dial on its ‘quality’ demands, and if your site falls short, a website traffic crunch is assured. Some sites invite problems by ignoring Google’s ‘rules’, and some sites inadvertently introduce technical problems after the date of a major algorithm update and are then impacted negatively by later refreshes of the algorithm.


Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
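A quick way to see what your redirects actually do is to request all four protocol/hostname variants and note where each ends up; this sketch assumes the third-party requests package and uses example.com as a stand-in for your own domain:

    # Check which single version the http/https and www/non-www variants resolve to.
    import requests

    for variant in ("http://example.com/", "http://www.example.com/",
                    "https://example.com/", "https://www.example.com/"):
        final = requests.get(variant, allow_redirects=True, timeout=10).url
        print(f"{variant} -> {final}")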
Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.
We expect advertisements to be visible. However, you should not let the advertisements distract users or prevent them from consuming the site content. For example, advertisements, supplementary content or interstitial pages (pages displayed before or after the content you are expecting) should not make it difficult to use the website.
The self-service keyword research tools we tested all handle pricing relatively similarly: billing by month, with discounts for annual billing, and most SMB-focused plans falling in the $50 to $200 per month range. Depending on how your business plans to use the tools, the way particular products delineate pricing might make more sense. KWFinder.com is the cheapest of the bunch, but it's focused squarely on ad hoc keyword and Google SERP queries, which is why the product sets quotas for keyword lookups per 24 hours at different tiers. Moz and Ahrefs price by campaigns or projects, meaning the number of websites you're tracking in the dashboard. Most of the tools also cap the number of keyword reports you can run per day. SpyFu prices a bit differently, providing unlimited data access and results but capping the number of sales leads and domain contacts.
QUOTE: “Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted (also, you probably don’t want your site to rank well for the search query …)” GOOGLE

When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
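The distinction is easy to check programmatically; a small sketch with the standard library shows that the homepage with and without the slash normalises to the same thing, while /fish and /fish/ do not:

    # Compare URLs, treating an empty homepage path and "/" as equivalent.
    from urllib.parse import urlsplit

    def same_url(a: str, b: str) -> bool:
        pa, pb = urlsplit(a), urlsplit(b)
        return (pa.scheme, pa.netloc, pa.path or "/") == (pb.scheme, pb.netloc, pb.path or "/")

    print(same_url("https://example.com", "https://example.com/"))           # True
    print(same_url("https://example.com/fish", "https://example.com/fish/")) # False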


QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” GOOGLE
While Google never sells better ranking in our search results, several other search engines combine pay-per-click or pay-for-inclusion results with their regular web search results. Some SEOs will promise to rank you highly in search engines, but place you in the advertising section rather than in the search results. A few SEOs will even change their bid prices in real time to create the illusion that they "control" other search engines and can place themselves in the slot of their choice. This scam doesn't work with Google because our advertising is clearly labeled and separated from our search results, but be sure to ask any SEO you're considering which fees go toward permanent inclusion and which apply toward temporary advertising.

At Yoast, we practice what we call ‘holistic SEO‘. This means that your primary goal should be to build and maintain the best possible website. Don’t try to fool Google, but use a sustainable long-term strategy. Ranking will come automatically if your website is of extremely high quality. Google wants to get its users to the right place, as its mission is to index all the world’s online information and make it universally accessible and useful.
Onsite, consider linking to your other pages by linking to pages within main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.

The last piece of the complicated SEO tool ecosystem is the enterprise tier. This roundup is geared toward SEO for small to midsize businesses (SMBs), for which these platforms are likely priced out of reach. But there's a handful of enterprise SEO software providers out there that essentially roll all of the self-service tools into one comprehensive platform. These platforms combine ongoing position monitoring, deep keyword research, and crawling with customizable reports and analytics.