Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
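To illustrate why robots.txt is advisory rather than protective, here is a minimal Python sketch (standard library only) that contrasts a polite crawler, which consults robots.txt before fetching, with a client that simply requests the URL anyway. The example.com URLs and the "MyBot" user agent are placeholders, not real endpoints.

```python
# Sketch: robots.txt only works if the client chooses to honour it.
import urllib.robotparser
import urllib.request

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()  # fetch and parse the site's robots.txt

url = "https://example.com/private/report.html"  # hypothetical "disallowed" page

# A well-behaved crawler checks robots.txt before fetching...
print("Polite crawler may fetch:", rp.can_fetch("MyBot", url))

# ...but a rogue client can skip that check entirely and request the page;
# the server will still serve it unless real access control (authentication,
# authorisation) is in place.
response = urllib.request.urlopen(url)
print("Server responded with status:", response.status)
```

If content genuinely must stay private, put it behind authentication rather than relying on robots.txt.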
On-site, consider linking to your other pages from within the main content text. I usually only do this where it is relevant – often, I’ll link two pages together when the keyword appears in the title elements of both. I don’t go in for auto-generated links at all. Google has penalised sites for using particular auto-link plugins, for instance, so I avoid them.
QUOTE: “If you want to stop spam, the most straightforward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013
That's why PA and DA metrics often vary from tool to tool. Each ad hoc keyword tool we tested came up with slightly different numbers, depending on what it pulls from Google and other sources and how it performs the calculations. The shortcoming of PA and DA is that, even though they give you a sense of how authoritative a page might be in the eyes of Google, they don't tell you how easy or difficult it will be to rank it for a particular keyword. That gap is why a third, newer metric is beginning to emerge among the self-service SEO players: difficulty scores.
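As an illustration of the general idea only (each vendor's actual formula is proprietary), the sketch below computes a hypothetical difficulty score by blending the PA and DA of the pages already ranking for a keyword; the weights and scaling are assumptions for illustration, not any tool's real method.

```python
# Illustrative sketch of a keyword difficulty score: aggregate the authority
# of the pages currently ranking on page one for the term. Weights are
# hypothetical.

def keyword_difficulty(top_results):
    """top_results: list of (page_authority, domain_authority) tuples,
    one per page currently ranking on page one for the keyword."""
    if not top_results:
        return 0.0
    # Blend page-level and domain-level authority for each ranking result.
    blended = [0.6 * pa + 0.4 * da for pa, da in top_results]
    # The stronger the average competitor, the harder the keyword.
    return round(sum(blended) / len(blended), 1)

# Example: a keyword where page one is dominated by strong pages on strong domains.
print(keyword_difficulty([(72, 91), (65, 88), (58, 80)]))  # higher = harder to rank
```

The point is not the exact numbers but the shift in question: from "how authoritative is this page?" to "how strong is the competition I would have to displace?"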
After the audit has been completed, your team will be invited to a presentation in which your SEO specialist will talk through the findings and recommendations. The Three Deep SEO team will walk you and your team through the roadmap to completion so you know what to expect and when. In addition, you will receive a comprehensive analysis of your site’s health. All of these are customized to you and your specific situation.