Keeping your SEO the best it can be is an ongoing task, with many hours spent on ensuring everything is just right. However, as a recent article on Search Engine Watch highlighted, there are plenty of jobs that can be done in a matter of minutes, many of them just as important to keep on top of.
Something that surprises many SEOs, experienced or otherwise, is the number of pages on their sites that are not fully accessible to search engine robots. This can easily happen for many reasons.
To see if you have any such problems, visit /robots.txt on your site and check whether anything is being withheld from crawlers unintentionally.
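A quick way to check this programmatically is with Python's standard-library robots.txt parser. This is a minimal sketch: the robots.txt content, the domain example.com and the paths tested are all hypothetical stand-ins; in practice you would fetch your own site's /robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch
# https://yoursite.com/robots.txt instead.
robots_txt = """User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether key pages are crawlable by search engine robots.
for path in ["/", "/private/report.html", "/products/"]:
    allowed = parser.can_fetch("*", "https://example.com" + path)
    print(path, "crawlable" if allowed else "blocked")
```

Running a check like this over your key landing pages quickly surfaces anything a stray Disallow rule is hiding from the engines.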
Reviewing a site for duplicate title elements using Google Webmaster Tools can quickly reveal duplicate pages, poor element structure and keyword cannibalisation. It is all too easy for content to fall into such a state, and search engines will treat it poorly when they crawl the site.
Any issues flagged here allow you to concentrate your efforts where you know they will have an effect, whether that means rewriting title elements, setting up redirects or deciding which pages should target the keywords identified.
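The same duplicate-title check can be scripted over a crawl of your own pages. The sketch below assumes you already have each URL's HTML in hand (the pages dictionary here is purely illustrative) and groups URLs that share a title:

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Pulls the <title> text out of an HTML document."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical crawl results: URL -> raw HTML.
pages = {
    "/red-widgets": "<html><head><title>Widgets</title></head></html>",
    "/blue-widgets": "<html><head><title>Widgets</title></head></html>",
    "/about": "<html><head><title>About Us</title></head></html>",
}

# Group URLs by title, then keep only titles used more than once.
titles = defaultdict(list)
for url, html in pages.items():
    extractor = TitleExtractor()
    extractor.feed(html)
    titles[extractor.title.strip()].append(url)

duplicates = {title: urls for title, urls in titles.items() if len(urls) > 1}
print(duplicates)
```

Each group of URLs sharing one title is a candidate for a rewrite, a redirect, or a decision about which page should own that keyword.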
There are many other easy adjustments that improve SEO too, such as analysing your most authoritative inbound links and asking the linking sites to modify their anchor text. Similarly, addressing other link deficiencies, such as revising links to default pages so they target absolute URLs, can help significantly.
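The default-page fix can be automated when auditing your internal links. This is a sketch under assumed conventions: the canonicalise helper and the list of default filenames are hypothetical, and the idea is simply to point links at the directory root rather than at index.html and friends.

```python
# Hypothetical list of "default page" filenames that servers
# typically serve for the bare directory URL.
DEFAULT_PAGES = ("index.html", "index.htm", "default.aspx")

def canonicalise(href: str) -> str:
    """Rewrite links to default pages so they point at the directory root."""
    for page in DEFAULT_PAGES:
        if href.endswith("/" + page):
            return href[: -len(page)]
    return href

print(canonicalise("https://example.com/blog/index.html"))
print(canonicalise("https://example.com/about"))
```

Consolidating both link forms onto a single absolute URL stops link equity being split between /blog/ and /blog/index.html.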
Addressing such simple problems can deliver substantial payoffs, but the concerted effort of a long-term strategy is still needed.