Geoff Paddock’s Blog


How good web governance can improve your Google ranking

More and more organisations are seeking good rankings on search engines such as Google and Microsoft’s Bing, and each new change in how the search engines work is pored over endlessly.

It’s possible to worry too much about the finer details of Google’s famous algorithm, as well as recent updates codenamed Penguin, Panda and Hummingbird, while neglecting some basics in the art of making sure online users find your site.

Of course, a good ranking in Google will affect the bottom line of your organisation, especially if you sell your products online.

Good web governance, the application of standards that reduce risk and ensure your digital presence is functioning properly, can help with search engine optimisation.

Google and the other search engines hate broken links, missing images, email addresses that don’t work and dozens of other apparently minor problems, and will mark a site down in the rankings when they find them.

If a site does not work well from a technical point of view, less web traffic will find it via search engines.
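As a very rough illustration of the kind of check involved, the sketch below fetches a single page and reports links that return an error status. It is only a minimal example, assuming the third-party requests and beautifulsoup4 packages are installed and using a placeholder URL; a full governance service runs far more extensive tests than this.

```python
# Minimal sketch of a broken-link check for a single page.
# Assumes the third-party packages `requests` and `beautifulsoup4` are installed;
# the start URL is a placeholder, not a real site.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder URL

page = requests.get(START_URL, timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

for anchor in soup.find_all("a", href=True):
    link = urljoin(START_URL, anchor["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript: and similar links
    try:
        response = requests.head(link, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"Broken link ({response.status_code}): {link}")
    except requests.RequestException as error:
        print(f"Could not reach {link}: {error}")
```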

Google can also penalise websites that are unknowingly hosting malware or “phishing” links. When such links are flagged up in search results, would-be visitors can be driven away in large numbers.

Web owners, especially those with a large online presence, can use web services like the Sitemorse Web Management Toolkit to promote a good standard of web governance on their sites and avoid such problems.

Our service can save you time as well as ensuring consistency, quality and website performance, using more than 1,000 tests, checks and measures to make sure your site is delivering a good experience for visitors.


How you can manage and track web improvements

If you manage a large web presence, you will know that one of the most difficult jobs is keeping track of everything you have online and seeing whether your sites are improving against your key targets – or actually going the other way.

Most organisations are thinking about – and are somewhere on the road to achieving – web governance.

Put simply, web governance involves establishing a framework that contains web policies and a set of standards that need to be enforced across a web presence. All organisations are different, so details will change, but the important thing is to have a policy and stick to it. Large organisations can ensure this happens by monitoring their web presence, perhaps using outside assistance, and targeting priority fixes.

Governisation from Sitemorse allows its users to track their own key priorities simply, effectively and without wasting time. The pre-agreed targets and priorities are constantly monitored, and the user can access the results through a personalised dashboard.

You can get some idea of how this works by looking at this 40-second video on the Governisation website.

After logging in, the user is alerted to the number of pages in their web presence achieving the pre-agreed standards. In the fictional example used, 87 per cent of pages met those standards. An easy-to-understand management summary then points out the areas that need fixing first.
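The arithmetic behind a summary figure of that kind is straightforward. The short sketch below, using invented page data purely for illustration, shows how a percentage of compliant pages might be derived from individual test results.

```python
# Sketch of how a compliance summary figure could be derived.
# The page results below are invented purely for illustration.
page_results = {
    "/": True,          # page met the pre-agreed standards
    "/about": True,
    "/contact": False,  # page failed at least one check
    "/news": True,
}

compliant = sum(1 for passed in page_results.values() if passed)
percentage = 100 * compliant / len(page_results)
print(f"{percentage:.0f}% of pages met the pre-agreed standards")
```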

If you would like to find out more about Governisation, and how it could help you keep track of your web presence, contact us and we will be happy to arrange a full demo. The Governisation website explains more about the system and how it works.


If your company’s website is slapdash, what does that say about your business?

More than fifteen years after the internet became a mass-market experience, there are no longer any excuses for links that don’t work or pages that do not have titles.

Yet in a recent survey of the top 500 FTSE companies, Sitemorse still found well over two per cent of web pages that did not have a title, and well over three per cent failing basic functional tests.
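Checking for a missing title is one of the simpler tests an automated tool can run. The sketch below, which again assumes the requests and beautifulsoup4 packages and uses a placeholder URL, flags a page whose title element is absent or empty.

```python
# Sketch of a missing-title check for a single page.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/products"  # placeholder URL

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
if not title:
    print(f"Missing or empty <title> on {PAGE_URL}")
else:
    print(f"Title found: {title}")
```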

Since a company’s website is the first port of call for virtually all users nowadays, missing images and broken links can give a poor initial impression. After all, if an organisation’s website is put together in a slapdash fashion, what does that say about the business itself?

Google and other search engines may not properly catalogue or index a site that contains HTML errors, and that can mean fewer users finding what they are looking for – and in the case of e-commerce sites, perhaps lost sales and the consequent hit to the company’s bottom line.

Around a quarter of users, according to recent research, will abandon an online sale because of technical issues. A massive 82 per cent of consumers said that if a business’s website performed badly it would dissuade them from buying goods from that organisation on the web – or even in-store.

Yet recent Sitemorse benchmarks show many online retailers either do not know this or choose to ignore it, with some of the best-known high street names performing very badly on quality issues.


Make sure your website content is up to date – or face the wrath of Google

Changes at Google mean that stale website content will be penalised – and that means organisations wanting a high ranking on Google may need to change their ways.

The latest update is designed to analyse whether a person wants up-to-date results or historical data. Google estimates the alterations to its core algorithm will make a difference to about 35% of searches.

The update to improve the “freshness” of results builds on Caffeine, a large-scale update to the underlying infrastructure of Google’s core indexing system made in August 2010, which made it easier for Google to keep its index up to date and to add new sources of information.

Web analysts have described the changes as “huge”. The last big update to the Google algorithm, known as Panda, affected only 12% of searches.

Google Panda was built through an algorithm update that used artificial intelligence in a more sophisticated and scalable way than previously possible. Human quality testers rated thousands of websites based on measures of quality, including design, trustworthiness, speed and whether or not they would return to the website.

Google’s new Panda machine-learning algorithm, made possible by and named after engineer Navneet Panda, was then used to look for similarities between the websites people found to be high quality and those found to be low quality.

Many new ranking factors have been introduced to the Google algorithm as a result, while older ranking factors like PageRank have been downgraded in importance.

One thing is clear: websites featuring good, regularly updated copy will be rated more highly by Google, and those that do not will be less visible on the search engine.
