Many think that Google won’t allow a new website to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while and then disappear for months – a “honeymoon period” to give you a taste of Google traffic, perhaps, or a period that lets Google better gauge your website’s quality from an actual user perspective.

SEM is better for testing than SEO. Because you can turn paid ads off and on immediately, SEM is a great strategy for testing: you can quickly revise your ad copy, target new audiences, and change landing page content, then see the differences in results right away. You cannot accomplish this through SEO, where changes take too long to make and the differences in results take too long to monitor.
The ranking of your website is partly decided by on-page factors. On-page SEO factors are all those things you can influence from within your actual website. These factors include technical aspects (e.g. the quality of your code and site speed) and content-related aspects, like the structure of your website or the quality of the copy on your website. These are all crucial on-page SEO factors.
In the enterprise space, one major trend we're seeing lately is data import across the big players. Much of SEO involves working with the data Google gives you and then filling in all of the gaps. Google Search Console (formerly Webmaster Tools) only gives you a 90-day window of data, so enterprise vendors, such as Conductor and Screaming Frog, are continually adding and importing data sources from other crawling databases (like DeepCrawl's). They're combining that with Google Search Console data for more accurate, ongoing Search Engine Results Page (SERP) monitoring and position tracking on specific keywords. SEMrush and Searchmetrics (in its enterprise Suite packages) offer this level of enterprise SERP monitoring as well, which can give your business a higher-level view of how you're doing against competitors.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
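One way to auto-generate descriptions, as suggested above, is to build them from each page's own copy. Here is a minimal sketch in Python; the `make_meta_description` function and the 155-character limit are my assumptions (a common truncation length in SERP snippets), not an official Google rule, and a real site would likely pull from a CMS field or the page's lead paragraph instead.

```python
import re

def make_meta_description(page_text, max_len=155):
    # Hypothetical helper: derive a meta description from page copy.
    # Strip leftover HTML tags and collapse whitespace.
    text = re.sub(r"<[^>]+>", " ", page_text)
    text = re.sub(r"\s+", " ", text).strip()
    if len(text) <= max_len:
        return text
    # Cut at the last word boundary that fits, then add an ellipsis.
    cut = text.rfind(" ", 0, max_len - 1)
    return text[: cut if cut > 0 else max_len - 1].rstrip() + "…"

print(make_meta_description("<p>Fresh coffee beans, roasted weekly.</p> " * 10))
```

The point is that each page gets a description reflecting its own content, rather than one boilerplate tag duplicated site-wide.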
Don’t be a website Google won’t rank. What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is: does Google see your site as a MERCHANT, an AFFILIATE, a RESOURCE, a DOORWAY PAGE, SPAM, or VITAL to a particular search? What do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now, or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same. How can you make yours different? Better.
Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just as you do when clicking on one. So you first need to tell the world about your site so that other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority, which is better to have than a ranking for just a few narrow key terms.
Google is all about ‘user experience’ and ‘visitor satisfaction’ in 2019, so it’s worth remembering that usability studies have shown that a good page title is about seven or eight words long and fewer than 64 total characters. Longer titles are less scannable in bookmark lists, might not display correctly in many browsers, and will probably be truncated in SERPs.
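That guideline is easy to check mechanically. A quick sketch, using the seven-to-eight-word and 64-character figures quoted above (the `title_fits` helper name and word-splitting rule are my own assumptions):

```python
def title_fits(title, max_words=8, max_chars=64):
    # Check a page title against the usability guideline above:
    # roughly seven or eight words, fewer than 64 characters.
    return len(title.split()) <= max_words and len(title) < max_chars

# A 7-word, 41-character title passes the check.
print(title_fits("SEO Basics: How Search Engines Rank Pages"))  # True
```

A check like this could run in a CMS hook or a crawl report to flag titles likely to be truncated.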
While that theory is sound (when focused on a single page whose intent is to deliver useful content to a Google user), using old-school SEO techniques across many pages of an especially large site seems to amplify site-quality problems after recent algorithm changes. This type of optimisation, without keeping an eye on overall site quality, is self-defeating in the long run.
SEM results are shown to a select target audience; SEO results are not. While successful SEO and SEM strategies are both driven by a plan to connect with a chosen audience, only SEM lets you specify that audience. Through SEM, you can (depending on the publisher) select which audiences see your results by applying filters based on age, location, income, habits, and more. Through SEO, you cannot specifically choose who will see your search results.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
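For user-generated areas like comment sections, this is often automated rather than done by hand. Below is a minimal sketch that stamps `rel="nofollow"` onto anchor tags in user-submitted HTML; the regex approach and the `nofollow_links` name are my assumptions for illustration – production code should use a real HTML parser and handle anchors that already carry a `rel` attribute.

```python
import re

def nofollow_links(html):
    # Hypothetical sketch: add rel="nofollow" to every <a> tag in
    # user-submitted HTML so links confer none of your site's reputation.
    # Assumes anchors do not already have a rel attribute.
    return re.sub(r"<a\s", '<a rel="nofollow" ', html, flags=re.IGNORECASE)

print(nofollow_links('Great post! <a href="http://spam.example">cheap pills</a>'))
```

Many blog platforms apply exactly this kind of transformation to comment markup by default.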
SEO stands for ‘Search Engine Optimization’. It’s the practice of optimizing your web pages to help them reach a high position in the search results of Google or other search engines. SEO focuses on improving rankings in the organic – aka non-paid – search results. If you have a website and you want to get more traffic, it should be part of your marketing efforts. Here, I’ll explain what SEO is and how we approach it at Yoast.
However, if possible, I would like you to expand a bit on your “zombie pages” tip. We run a site where there are definitely enough pages to delete (no sessions, no links, probably not even relevant to the main theme of the site, not even important for the site architecture). Nonetheless, I am not very sure what the best technical decision for these pages is: just deleting them from my CMS, redirecting (if there is a relevant alternative), or something else? De-index them in Search Console? What response code should they have?
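The choices raised in that comment can be summarised as a small decision rule. The sketch below reflects common practice rather than an official Google recommendation, and the `zombie_page_action` helper is hypothetical: 301-redirect when a close substitute exists, serve 410 Gone (or 404) when a page is simply deleted, and noindex pages you keep for users but not for search.

```python
def zombie_page_action(has_relevant_alternative, keep_for_users=False):
    # Sketch of common practice for "zombie" pages, not an official rule.
    # Returns the HTTP status code to serve and a short note on the action.
    if keep_for_users:
        return (200, 'keep the page, add <meta name="robots" content="noindex">')
    if has_relevant_alternative:
        return (301, "redirect to the closest matching page")
    return (410, "page intentionally gone; 410 tends to drop from the index faster than 404")

print(zombie_page_action(has_relevant_alternative=True))
```

Whichever branch applies, removing the page from the sitemap and letting the crawler re-fetch it usually completes the cleanup.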