Don’t be a website Google won’t rank. What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is, whether it sees your site as a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now, or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same. How can you make yours different? Better.

In my experience, about 65% of my traffic comes from search engines, and the rest comes from social sites, referrals and direct traffic. Communicating with bloggers in a similar niche is a great way to get traffic: reaching out to relevant sites within your micro niche ultimately brings direct, quality traffic back to you, which in turn helps keyword rankings and PageRank within Google's guidelines. To get higher search rankings, you need to focus not only on SEO but also on other factors that draw readers' attention online. Thanks for this page – it will help me a lot, and other newbies too…

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
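For illustration, a description meta tag is a single element in the page's head. A minimal sketch, with a hypothetical site name and wording:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Brandon's Baseball Cards - Buy Cards, Baseball News, Card Prices</title>
    <!-- A hand-written, page-specific summary that Google may use as the snippet -->
    <meta name="description" content="Brandon's Baseball Cards provides a large selection of vintage and modern baseball cards for sale. We also offer daily baseball news and events.">
  </head>
  <body>
    ...
  </body>
</html>
```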


For the purposes of our testing, we standardized keyword queries across the five tools. To test the primary ad hoc keyword search capability with each tool, we ran queries on an identical set of keywords. From there we tested not only the kinds of data and metrics the tool gave, but how it handled keyword management and organization, and what kind of optimization recommendations and suggestions the tool provided.

It’s important to note that Google is responsible for the majority of the search engine traffic in the world. This may vary from one industry to another, but it’s likely that Google is the dominant player in the search results that your business or website would want to show up in. That said, the best practices outlined in this guide will help you to position your site and its content to rank in other search engines, as well.
Everyone knows intent behind the search matters. In e-commerce, intent is somewhat easy to see. B2B or, better yet, healthcare, isn't quite as easy. Matching persona intent to keywords requires a bit more thought. In this video, we'll cover how to find intent modifiers during keyword research, how to organize those modifiers into the search funnel, and how to quickly find unique universal results to target at different levels of the search funnel.

In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the usage of nofollow leads to evaporation of PageRank. In order to avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash and JavaScript.[32]
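For reference, the nofollow attribute discussed above is just a rel value on an anchor element; the URLs below are placeholders:

```html
<!-- A normal link: eligible to pass PageRank to the target page -->
<a href="https://example.com/partner-page">Partner page</a>

<!-- A nofollowed link: after the 2009 change, the PageRank it would have
     passed is no longer redistributed to other links on the page -->
<a href="https://example.com/sponsored-page" rel="nofollow">Sponsored link</a>
```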
Capturing and keeping attention is one of the hardest parts of our job today. Fact: It's just going to get harder with the advent of new technology and conversational interfaces. In the brave new world we're stepping into, the key questions are: How do we get discovered? How can we delight our audiences? And how can we grow revenue for our clients? Watch this session to learn how to make your marketing and advertising efforts something people are going to want to consume.
Google expects pages to “be edited, reviewed, and updated on a regular basis”, especially if they cover important issues like medical information. Not all pages are held to such standards, but one can expect that Google wants information updated within a reasonable timescale. How reasonable this is depends on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.
Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements, beware of relying on it too much to improve your rankings.
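A minimal sketch of where those keyword signals typically sit on a page; the phrase "blue widgets" and the URLs are invented for illustration:

```html
<!-- Keyword in the URL: https://example.com/blue-widgets/ -->
<head>
  <!-- Keyword in the title element -->
  <title>Blue Widgets - Prices, Reviews and Buying Guide</title>
</head>
<body>
  <!-- Keyword in headings and repeated naturally in visible text -->
  <h1>Blue Widgets</h1>
  <p>Compare blue widgets from leading manufacturers before you buy.</p>

  <!-- Keyword in image alt text -->
  <img src="blue-widget.jpg" alt="Blue widget, front view">
</body>

<!-- And, on another page, keyword-rich anchor text linking to the page in question -->
<a href="https://example.com/blue-widgets/">blue widgets</a>
```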
I’m really grateful for your generous post, Brian. I’m definitely going to implement TOC on some of my 4,000+ word posts, where I’m trying to become the source. 😉 And I will also use the stats on some new posts. Thanks to you, I also researched big keywords, which I’d stayed away from, and found that many of the high-CPC and high-ranking articles are from 2014. Hoping some of my fresh new content helps rank me higher. Love what you do, sir!
On the voice and natural language side, it's all about FAQs (frequently asked questions). Virtual assistants and smart home devices have made voice recognition and natural language processing (NLP) not only desirable but an expected search vector. To predict how to surface a business's results in a voice search, SEO professionals now need to concentrate on ranking for the common NL queries around target keywords. Google's Quick Answers exist to give its traditional text-based search results an easy NL component to pull from when Google Assistant is answering questions.
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
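As a rough illustration, a robots.txt along these lines would block Googlebot from the CSS and JavaScript used to render pages, which is exactly the configuration the advice above warns against (the paths are hypothetical):

```
# Problematic: Googlebot cannot fetch the assets needed to render pages
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# Safer: leave rendering assets crawlable, or explicitly allow them
# Allow: /assets/css/
# Allow: /assets/js/
```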

I prefer simple SEO techniques and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you’ll more than likely find success in organic listings and you might not ever need to get into the technical side of things, like redirects and search engine friendly URLs.
If you want to *ENSURE* your FULL title tag shows in the desktop UK version of Google SERPs, stick to a shorter title of between 55 and 65 characters – but that does not mean your title tag MUST end at 55 characters, and remember your mobile visitors see a longer title (in the UK, in January 2018). What you see displayed in SERPs depends on the characters you use. In 2019, I just expect what Google displays to change, so I don’t obsess about what Google is doing in terms of display. See the tests later on in this article.
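For example, a title element in that range might look like the sketch below; the business name is invented, and the character count is guidance on what tends to display in full, not a guarantee of how Google will show it:

```html
<!-- 57 characters including spaces: likely to display in full on desktop -->
<title>Hand-Made Oak Dining Tables | Free UK Delivery | Oakworks</title>
```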

Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
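A small illustration of an image used as a link, with a descriptive filename and alt text; the URL and filename are placeholders:

```html
<!-- When the image is the link, its alt text is treated much like anchor text -->
<a href="https://example.com/puppies/">
  <img src="golden-retriever-puppy.jpg" alt="Golden retriever puppy playing fetch">
</a>
```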


The Java program is fairly intuitive, with easy-to-navigate tabs. Additionally, you can export any or all of the data into Excel for further analysis. So say you're using Optify, Moz, or RavenSEO to monitor your links or rankings for specific keywords -- you could simply create a .csv file from your spreadsheet, make a few adjustments for the proper formatting, and upload it to those tools.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
Site. Migration. No two words elicit more fear, joy, or excitement from a digital marketer. When the idea was shared three years ago, the company was excited. They dreamed of new features and efficiency. But as SEOs we knew better. We knew there would be midnight strategy sessions with IT. More UAT environments than we could track. Deadlines, requirements, and compromises forged through hallway chats. ... The result was a stable transition with minimal dips in traffic. What we didn't know, however, was the amount of cross-functional coordination that was required to pull it off. Learn more in this video!