Across the internet, many analysts and researchers who follow such things are predicting another update to Google’s Panda algorithm. With the update reputed to affect 1% of all sites on the internet today, many SEOs could be caught out.
Recent years have seen the US search engine giant make huge leaps forward in preventing spam sites from ranking highly on its SERPs. However, with more than 30 million sites likely to be affected by this secret change, many bona fide sites could find themselves having issues.
The introduction of Panda in 2011 saw many so-called black hat sites knocked down the rankings, which is a good thing. However, it also negatively affected highly regarded sites such as Forbes.com.
Keyword density problems have been the major issue here, as has duplicate content. Though this can be hard to control, it is imperative that online marketers keep tabs on where their content is displayed and used by third parties.
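As a rough illustration only (Google has never published a target figure), one way to sanity-check copy before publishing is to measure how often a target phrase appears relative to the total word count. The short Python sketch below does exactly that; the sample text and the phrase being checked are hypothetical.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough keyword density: words belonging to the keyword phrase as a share of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    # Count phrase occurrences by sliding a window over the word list.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == keyword_words)
    return hits * n / len(words)

page_copy = "Panda rewards useful copy. Stuffing the same keyword into every sentence reads badly."
print(f"{keyword_density(page_copy, 'keyword') * 100:.1f}%")
```

The same word-window approach can be reused to spot near-verbatim passages lifted by third parties, though in practice most marketers lean on dedicated duplicate content tools for that.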
A search engine optimisation campaign also requires the coherent management of links and meta tags, ensuring these stay relevant to the page and the site. It is a process of cultivation and delivery.
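On the meta tag side, a minimal audit sketch is shown below. It assumes the widely used requests and BeautifulSoup libraries and a hypothetical example.com URL, and simply reports the page title and meta description so a marketer can judge whether they still match the page.

```python
import requests
from bs4 import BeautifulSoup

def audit_meta(url: str) -> dict:
    """Fetch a page and report the basics an on-page audit looks at: title and meta description."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    description_tag = soup.find("meta", attrs={"name": "description"})
    description = description_tag.get("content", "") if description_tag else ""

    return {
        "title": title,
        "title_length": len(title),
        "description": description,
        "description_length": len(description),
        "has_description": bool(description),
    }

# Hypothetical usage:
# print(audit_meta("https://example.com"))
```

Run periodically across key landing pages, a report like this makes it easy to spot titles and descriptions that have drifted away from the content they describe.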
The fact that the Panda changes are not to be fleshed out will rankle some in the search marketing industry, particularly given Google’s claims that it would be more transparent.
However, employing white hat techniques, keeping keywords above the fold and not being too reliant on third-party ads will always be favoured.
Monitoring sites found to be affected by the changes will also provide useful information for better targeted campaigns.