An SEO expert could probably use a combination of AdWords for the initial data, Google Search Console for website monitoring, and Google Analytics for internal website data. Then the SEO expert can transform and analyze the data using a BI tool. The problem for most business users is that this simply isn't an effective use of time and resources. These tools exist to take the manual data gathering and granular, piecemeal detective work out of SEO. It's about making a process that's core to modern business success more easily accessible to someone who isn't an SEO consultant or expert.
The actual content of your page itself is, of course, very important. Different types of pages will have different “jobs” – your cornerstone content asset that you want lots of folks to link to needs to be very different than your support content that you want to make sure your users find and get an answer from quickly. That said, Google has been increasingly favoring certain types of content, and as you build out any of the pages on your site, there are a few things to keep in mind:
Thick & Unique Content – There is no magic number for word count, and having a few pages on your site with anywhere from a handful to a couple hundred words won’t drop you from Google’s good graces. In general, though, recent Panda updates in particular favor longer, unique content. If you have a large number (think thousands) of extremely short pages (50-200 words of content), or lots of duplicated content where nothing changes but the page’s title tag and, say, a line of text, that could get you in trouble. Look at the entirety of your site: are a large percentage of your pages thin, duplicated, and low value? If so, try to identify a way to “thicken” those pages. Alternatively, check your analytics to see how much traffic they’re getting and simply exclude them from search results (using a noindex meta tag), so that you don’t appear, in Google’s eyes, to be flooding its index with lots of low-value pages in an attempt to have them rank.
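The review step above can be roughly automated. Below is a minimal Python sketch (the page data, the 200-word threshold, and the function name are illustrative assumptions, not a Google rule) that flags pages thin enough to review, thicken, or exclude with a `<meta name="robots" content="noindex">` tag:

```python
# Hypothetical sketch: flag "thin" pages by word count so they can be
# reviewed, thickened, or excluded with a noindex meta tag, e.g.
#   <meta name="robots" content="noindex">
# The 200-word threshold is an illustrative assumption, not a Google rule.

def find_thin_pages(pages, min_words=200):
    """Return URLs whose body text falls below min_words."""
    thin = []
    for url, text in pages.items():
        if len(text.split()) < min_words:
            thin.append(url)
    return thin

pages = {
    "/guide": "word " * 900,           # long-form cornerstone content
    "/tag/widget-blue": "word " * 40,  # near-empty tag page
}
print(find_thin_pages(pages))  # -> ['/tag/widget-blue']
```

In practice the word counts would come from a crawl rather than a hard-coded dict, and the threshold should be tuned to the site rather than treated as a rule.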
Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. Wherever relevance is ‘hidden’ in on-page elements, beware relying on it too much to improve your rankings.
Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and unless you are careful you might end up just giving spammers free original text: once they scrape your descriptions and put the text in the main content on their site, that copy is original to their site and not yours. I don’t worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals.
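To spot the duplication problem described above, a crawl's page titles (or meta descriptions) can be grouped to find values shared by more than one URL. A minimal sketch, assuming hypothetical page data:

```python
# Illustrative sketch: given a mapping of URL -> title (or meta description)
# from a site crawl, report values shared by more than one URL.
# The URLs and titles below are made up.
from collections import defaultdict

def find_duplicates(field_by_url):
    seen = defaultdict(list)
    for url, value in field_by_url.items():
        seen[value.strip().lower()].append(url)
    return {value: urls for value, urls in seen.items() if len(urls) > 1}

titles = {
    "/red-widgets": "Widgets | Example Shop",
    "/blue-widgets": "Widgets | Example Shop",
    "/about": "About Us | Example Shop",
}
print(find_duplicates(titles))
# -> {'widgets | example shop': ['/red-widgets', '/blue-widgets']}
```

The same function works for meta descriptions; any group it returns is a candidate for rewriting into unique copy.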
I’m really grateful for your generous post, Brian. I’m definitely going to implement TOC on some of my over 4k words posts, where I’m trying to become the source. 😉 And I will also use the stats on some new posts. Thanks to you, I also researched big keywords, which I’d stayed away from, and found that many of the high CPC and ranking articles are from 2014. Hoping some of my fresh new content helps rank me higher. Love what you do, sir!
When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) – and take a more long-tail approach. Note that too many page titles and not enough actual page text per page could lead to doorway page type situations. A highly relevant unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
While Google is on record as stating that these quality raters do not directly influence where you rank (without more senior analysts making a call on the quality of your website, I presume?), there are some things in this document, mostly of a user experience (UX) nature, that all search engine optimisers and webmasters of any kind should note going forward.
"Organic search" pertains to how vistors arrive at a website from running a search query (most notably Google, who has 90 percent of the search market according to StatCounter. Whatever your products or services are, appearing as close to the top of search results for your specific business has become a critical objective for most businesses. Google continously refines, and to the chagrin of search engine optimization (SEO) managers, revises its search algorithms. They employ new techniques and technologies including artificial intelligence (AI) to weed out low value, poorly created pages. This brings about monumental challenges in maintaining an effective SEO strategy and good search results. We've looked at the best tools to ket you optimize your website's placement within search rankings.
I added one keyword to the page in plain text because adding the actual ‘keyword phrase’ itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add.
Another example where the “nofollow” attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check if it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site which are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
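As a rough illustration of disabling such links, the sketch below uses a simple regex to add rel="nofollow" to anchor tags that don't already declare a rel attribute. This is workable for a small, known widget snippet but is no substitute for a real HTML parser; the widget markup is made up.

```python
import re

# Rough, regex-based sketch (adequate for a simple widget snippet, NOT a
# general HTML rewriter): add rel="nofollow" to <a> tags that do not
# already carry a rel attribute.

def nofollow_links(html):
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)

widget = '<div><a href="https://example.com/sponsor">Powered by Example</a></div>'
print(nofollow_links(widget))
# -> <div><a rel="nofollow" href="https://example.com/sponsor">Powered by Example</a></div>
```

Links that already declare a rel attribute (e.g. rel="sponsored") are left untouched by the negative lookahead.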
The caveat in all of this is that, in one way or another, most of the data and the rules governing what ranks and what doesn't (often on a week-to-week basis) comes from Google. If you know where to find and how to use the free and freemium tools Google provides under the surface—AdWords, Google Analytics, and Google Search Console being the big three—you can do all of this manually. Much of the data that the ongoing position monitoring, keyword research, and crawler tools provide is extracted in one form or another from Google itself. Doing it yourself is a disjointed, meticulous process, but you can piece together all the SEO data you need to come up with an optimization strategy should you be so inclined.
A poor 404 page, and user interaction with it, can only lead to a ‘poor user experience’ signal at Google’s end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don’t know if Google looks at your site that way to rate it – e.g. algorithmically determines whether you have a good 404 page – or if it is a UX factor, something to be taken into consideration further down the line, or purely a way to get you thinking about 404 pages (in general) to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think, rather, that any rating would be a second-order scoring that includes data from user activity on the SERPs – stuff we as SEOs can’t see.
People are searching for any manner of things directly related to your business. Beyond that, your prospects are also searching for all kinds of things that are only loosely related to your business. These represent even more opportunities to connect with those folks and help answer their questions, solve their problems, and become a trusted resource for them.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
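The random-surfer model described above can be sketched as a short power-iteration loop. This is a simplified illustration, not Google's actual implementation: the three-page link graph is invented, every page is assumed to have at least one outbound link, and 0.85 is the commonly cited damping factor.

```python
# Simplified PageRank via power iteration over the "random surfer" model:
# each iteration, a page keeps a small baseline score and distributes the
# rest of its score evenly across its outbound links.
# Assumes every page has at least one outbound link (no dangling nodes).

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            for target in outbound:
                new[target] += damping * rank[page] / len(outbound)
        rank = new
    return rank

# Toy graph: "a" links to "b" and "c", "b" links to "c", "c" links to "a".
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # -> c  (it collects links from both "a" and "b")
```

In this toy graph, "c" ends up with the highest score because it receives links from both other pages, matching the intuition that a page reachable through more (and stronger) links is more likely to be visited by the random surfer.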
After the audit has been completed, your team will be invited to a presentation in which your SEO specialist will talk through the findings and recommendations. The Three Deep SEO team will walk you and your team through the roadmap to completion so you know what to expect and when. In addition, you will receive a comprehensive analysis of your site’s health. All of these are customized to you and your specific situation.
QUOTE: “Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high-quality search results that are optimized for their devices”. GOOGLE
Your site’s URL structure can be important both from a tracking perspective (you can more easily segment data in reports using a segmented, logical URL structure), and a shareability standpoint (shorter, descriptive URLs are easier to copy and paste and tend to get mistakenly cut off less frequently). Again: don’t work to cram in as many keywords as possible; create a short, descriptive URL.
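As a toy illustration of “short and descriptive”, a slug helper might strip punctuation and cap the number of words rather than cramming in keywords. The function name and the six-word cap are arbitrary assumptions:

```python
import re

# Hypothetical helper: build a short, descriptive URL slug from a page
# title. The six-word cap is an arbitrary illustrative choice.

def slugify(title, max_words=6):
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("10 Ways to Improve Your Site's URL Structure!"))
# -> 10-ways-to-improve-your-sites
```

The resulting slugs are easy to copy and paste, segment cleanly in analytics reports, and are less likely to be cut off when shared.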
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.
If you want to *ENSURE* your FULL title tag shows in the desktop UK version of Google SERPs, stick to a shorter title of between 55 and 65 characters – but that does not mean your title tag MUST end at 55 characters, and remember that your mobile visitors see a longer title (in the UK, in January 2018). What you see displayed in SERPs depends on the characters you use. In 2019 I just expect what Google displays to change – so I don’t obsess about what Google is doing in terms of display. See the tests later on in this article.
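A character count is only a rough proxy, since Google actually truncates titles by pixel width, but the 55-65 character guideline above can be turned into a quick check. The sample titles and the 65-character limit are illustrative assumptions:

```python
# Rough heuristic: flag titles likely to be truncated in desktop SERPs.
# Google truncates by pixel width, not characters, so this is approximate;
# the 65-character limit follows the guideline discussed above.

def title_length_report(titles, limit=65):
    return {t: ("ok" if len(t) <= limit else "may be truncated") for t in titles}

titles = [
    "Short, Descriptive Title | Brand",
    "An Extremely Long Page Title Stuffed With Every Keyword Variation We Could Think Of",
]
for title, status in title_length_report(titles).items():
    print(f"{len(title):3d}  {status}  {title}")
```

Anything flagged is worth eyeballing in an actual SERP preview tool rather than trimming blindly.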
Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while, then disappear for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.
If Google finds two identical pieces of content, whether on your own site, or on another you’re not even aware of, it will only index one of those pages. You should be aware of scraper sites that steal your content automatically and republish it as their own. Here’s Graham Charlton’s thorough investigation on what to do if your content ends up working better for somebody else.
This can be broken down into three primary categories: ad hoc keyword research, ongoing search position monitoring, and crawling, which is when Google bots search through sites to determine which pages to index. In this roundup, we'll explain what each of those categories means for your business, the types of platforms and tools you can use to cover all of your SEO bases, and what to look for when investing in those tools.
Alt text (alternative text), also known as the "alt attribute," describes the appearance and function of an image on a page. Alt text uses: 1. Adding alternative text to photos is first and foremost a principle of web accessibility. Visually impaired users using screen readers will be read an alt attribute to better understand an on-page image. 2. Alt text will be displayed in place of an image if the image file cannot be loaded. 3. Alt text provides better image context/descriptions to search engine crawlers, helping them to index an image properly.
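The accessibility point above is easy to audit. Below is a minimal sketch using Python's standard-library HTML parser to list images that are missing an alt attribute (or have an empty one); the sample markup is made up:

```python
from html.parser import HTMLParser

# Illustrative accessibility check: collect the src of every <img> tag
# that has no alt attribute or an empty one. Sample HTML is invented.

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

html = '<img src="logo.png" alt="Example Co. logo"><img src="chart.png">'
checker = AltChecker()
checker.feed(html)
print(checker.missing)  # -> ['chart.png']
```

Running a check like this over a full crawl gives a worklist of images to describe, helping screen-reader users and image indexing at the same time.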