Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
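As a concrete illustration, related content can be surfaced with a plain, crawlable list of anchor links rather than links that only appear after a user searches. This is a minimal sketch; all page titles and URLs below are hypothetical examples:

```html
<!-- Related-content block: plain anchor tags are reachable by crawlers,
     unlike results that only exist behind an internal search box.
     All URLs and titles here are hypothetical. -->
<nav aria-label="Related articles">
  <ul>
    <li><a href="/guides/keyword-research">Keyword Research Basics</a></li>
    <li><a href="/guides/on-page-seo">On-Page SEO Checklist</a></li>
    <li><a href="/guides/site-structure">Planning Your Site Structure</a></li>
  </ul>
</nav>
```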
When it comes to finally choosing the SEO tools that suit your organization's needs, the decision comes back to gaining tangible ground. It's about discerning which tools provide the most effective combination of keyword-driven investigation capabilities plus the keyword organization, analysis, recommendations, and other functionality you need to act on the SEO insights you uncover. If a product tells you what optimizations your website needs, does it also provide the technology to help you make those improvements?

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. For one thing, search engines may still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs exist somewhere on the Internet (in referrer logs, for example). Also, non-compliant or rogue crawlers that don't acknowledge the Robots Exclusion Standard can simply ignore your robots.txt. Finally, a curious user can read the directories or subdirectories listed in your robots.txt file and guess the URL of the content you don't want seen.
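To make the distinction concrete, here is a minimal robots.txt sketch (the directory name is a hypothetical example). It asks crawlers not to fetch the path, but it does nothing to stop a browser or a rogue bot from requesting the URL directly:

```
# robots.txt - a polite request, not access control.
# "/private-reports/" is a hypothetical path; note that listing it
# here actually advertises its existence to anyone reading this file.
User-agent: *
Disallow: /private-reports/
```

Genuinely confidential material belongs behind server-side authentication, or at minimum behind a noindex response, rather than in a robots.txt entry.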
I think anchor text in internal navigation links is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don't underestimate the value of a clever, keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts in a link – but don't overdo it. Too many links on a page can be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
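For example, a descriptive anchor tells Google more about the target page than a generic one. A quick sketch (the URL is hypothetical):

```html
<!-- Generic anchor: tells Google almost nothing about the target page -->
<a href="/services/link-building">Click here</a>

<!-- Descriptive, keyword-relevant anchor: natural phrasing, not stuffed -->
<a href="/services/link-building">our link building services</a>
```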

Assume that one day your website will have to pass a manual review by Google – the better your rankings, or the more traffic you get, the more likely you are to be reviewed. Know that Google can class even useful sites as spammy, according to leaked documents. If you want a site to rank high in Google, it had better 'do' something other than exist only to link to another site for a paid commission. Know that to succeed, your website needs to be USEFUL to the visitor Google sends you – and a useful website is not one with the sole commercial intent of passing a visitor from Google on to another site, or a 'thin affiliate' as Google CLASSIFIES it.


QUOTE: “Medium pages achieve their purpose and have neither high nor low expertise, authoritativeness, and trustworthiness. However, Medium pages lack the characteristics that would support a higher quality rating. Occasionally, you will find a page with a mix of high and low quality characteristics. In those cases, the best page quality rating may be Medium.” Google Quality Evaluator Guidelines, 2017

Hi, Brian. Thank you for the great article. I have a question about the part about the 4 website addresses. Ours is currently set to https://www, and we would like to change it to just https:// as the main address. Will this hurt our current link profile, or will everything stay the same? This might be a foolish question, but we are a bit worried. Thank you.
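As a point of reference: moving from https://www to the bare https:// version is generally safe for a link profile as long as every old URL permanently (301) redirects to its new counterpart, since a 301 passes link equity. A minimal sketch of such a rule, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder domain):

```
# .htaccess - redirect all www traffic to the bare HTTPS host.
# %1 captures the hostname after "www.", $1 the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
```

Equivalent rules exist for Nginx and most hosting control panels; the key points are that the redirect is site-wide and permanent.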
Your article reaches me at just the right time. I've been working on getting back to blogging and have been at it for almost a month now. I've been fixing SEO-related stuff on my blog, and after reading this article (which, by the way, is way too long for one sitting) I'm kind of confused. I'm looking at bloggers like Darren Rowse, Brian Clark, and so many others who use their blogs as a platform to educate their readers more than to chase search rankings (though I'm sure they think about those too).

AWR Cloud, our third Editors' Choice, is rated slightly lower than Moz Pro and SpyFu as an all-in-one SEO platform. However, AWR Cloud leads the pack in ongoing position monitoring and proactive search rank tracking on top of solid overall functionality. On the ad hoc keyword research front, the KWFinder.com tool excels. DeepCrawl's laser focus on comprehensive domain scanning is unmatched for site crawling, while Ahrefs and Majestic can duke it out for the best internet-wide crawling index. When it comes to backlink tracking, LinkResearchTools and Majestic are the top choices. SEMrush and Searchmetrics do a bit of everything.
The tools we tested in this round of reviews were judged on which do the best job of giving you research-driven investigation tools to identify SEO opportunities ripe for growth, along with enterprise-grade functionality at a reasonable price. Whether one of these optimization tools is an ideal fit for your business, or you end up combining more than one into a potent SEO tool suite, this roundup will help you decide what makes the most sense for you. There's a wealth of data out there to give your business an edge and push your pages higher in key search results. Make sure you've got the right SEO tools in place to seize the opportunities.
If you are just starting out, don't think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So it's best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out, you may as well learn how to do it within Google's Webmaster Guidelines first. Decide early whether you are going to follow Google's guidelines or not, and stick to that decision. Don't get caught in the middle on an important project. Do not always follow the herd.
There are other parts of SEO you should pay attention to after your audit to make sure you stay competitive. After all, the technical foundation isn't the end of the road for SEO success. It's important to watch your competition's SEO activity, keep an eye on the newest search engine best practices, and maintain local SEO best practices if your business depends on customers visiting a physical address. All of these are elements of a successful SEO strategy and should complement your audit and ongoing SEO maintenance.